Andy and Dave discuss the latest in AI news and research, including the new DARPA FENCE program (Fast Event-based Neuromorphic Camera and Electronics), which seeks to create event-based cameras that focus only on pixels that have changed in a scene. NIST proposed an approach for reducing the risk of bias in AI and has invited the public to comment and help improve it. Researchers from the University of Colorado, Boulder use a machine learning model to learn the physical properties of electronics building blocks (such as clumps of silicon and germanium atoms), as a way to predict how larger electronics components will work or fail. Researchers in South Korea create an artificial skin that mimics human tactile recognition and couple it with a deep learning algorithm to classify surface structures (with an accuracy of 99.1%). A survey from IE University shows, among other things, that 75% of people surveyed in China support replacing parliamentarians with AI, while in the US, 60% were opposed to the idea. A scientist uses machine learning to learn Rembrandt’s style and then recreate missing pieces of the painter’s “The Night Watch.” Researchers at Harvard, San Diego, Fujitsu, and MIT present methodical research demonstrating how classification neural networks are susceptible to small 2D transformations and shifts, image crops, and changes in object colors. The GAO releases a report on Facial Recognition Technology, surveying 42 federal agencies, and finds a general lack of accountability in the use of the technology. The WHO releases a report on Ethics and Governance of AI for Health. In rebuttal to DeepMind’s “Reward is enough” paper, Roitblat and Byrnes pen separate essays on why “reward is not enough.” An open-access book by Wang and Barabasi looks at the Science of Science. Julia Schneider and Lena Ziyal join forces to provide a comical essay on AI: We Need to Talk, AI.
And the National Security Commission on AI holds an all-day summit on Global Emerging Technology.
Follow the link below to visit our website and explore the links mentioned in the episode.
https://www.cna.org/CAAI/audio-video