The Need for New Deep Learning Architectures – Intel on AI Season 3, Episode 3

Intel on AI - A podcast by Intel Corporation


In this episode of Intel on AI, host Amir Khosrowshahi and Yoshua Bengio talk about structuring future computers on the underlying physics and biology of human intelligence. Yoshua is a professor in the Department of Computer Science and Operations Research at the Université de Montréal and scientific director of the Montreal Institute for Learning Algorithms (Mila). In 2018 Yoshua received the ACM A.M. Turing Award with Geoffrey Hinton and Yann LeCun.

In the episode, Yoshua and Amir discuss causal representation learning and out-of-distribution generalization, the limitations of modern hardware, and why current models require exponentially increasing amounts of data and compute only to achieve slight improvements. Yoshua also goes into detail about equilibrium propagation, a learning algorithm that bridges machine learning and neuroscience by computing gradients that closely match those of backpropagation (a minimal code sketch follows the paper list below). Yoshua and Amir close the episode by talking about academic publishing, sharing information, and the responsibility to make sure artificial intelligence (AI) will not be misused in society, before touching briefly on some of the projects Intel and Mila are collaborating on, such as using parallel computing for the discovery of synthesizable molecules.

Academic research discussed in the podcast episode:

- Computing machinery and intelligence
- A quantitative description of membrane current and its application to conduction and excitation in nerve
- From System 1 Deep Learning to System 2 Deep Learning
- The Consciousness Prior
- BabyAI: A Platform to Study the Sample Efficiency of Grounded Language Learning
- Equilibrium Propagation: Bridging the Gap between Energy-Based Models and Backpropagation
- A deep learning theory for neural networks grounded in physics
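For listeners unfamiliar with equilibrium propagation, here is a minimal, illustrative Python sketch of the idea (not code from the episode or from Mila): a small Hopfield-style network is relaxed twice, once freely and once with its outputs weakly nudged toward the target, and the difference between the two settled states drives a weight update that approximates the backpropagation gradient. The layer sizes, hard-sigmoid-style state clipping, and all hyperparameters below are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 8, 2                      # illustrative sizes (assumption)

W1 = rng.normal(0.0, 0.1, (n_in, n_hid))          # input  <-> hidden (symmetric coupling)
W2 = rng.normal(0.0, 0.1, (n_hid, n_out))         # hidden <-> output (symmetric coupling)

def relax(x, y, beta, steps=100, dt=0.2):
    """Settle the hidden and output states toward a minimum of the total energy.
    beta = 0 is the free phase; beta > 0 weakly nudges the outputs toward y."""
    h = np.zeros(n_hid)
    o = np.zeros(n_out)
    for _ in range(steps):
        dh = -h + x @ W1 + o @ W2.T               # gradient descent on the energy
        do = -o + h @ W2 + beta * (y - o)         # plus the beta-weighted output cost
        h = np.clip(h + dt * dh, 0.0, 1.0)        # hard-sigmoid-style state clipping
        o = np.clip(o + dt * do, 0.0, 1.0)
    return h, o

def eqprop_step(x, y, beta=0.5, lr=0.05):
    """One equilibrium-propagation update: contrast the nudged and free fixed points."""
    global W1, W2
    h0, o0 = relax(x, y, beta=0.0)                # free phase
    hb, ob = relax(x, y, beta=beta)               # weakly clamped (nudged) phase
    # The contrastive correlations approximate the backpropagation gradient as beta -> 0.
    W1 += lr / beta * (np.outer(x, hb) - np.outer(x, h0))
    W2 += lr / beta * (np.outer(hb, ob) - np.outer(h0, o0))

# Toy usage: drive the output toward a fixed target for a single input pattern.
x = np.array([1.0, 0.0, 1.0, 0.0])
y = np.array([1.0, 0.0])
for _ in range(200):
    eqprop_step(x, y)
print("free-phase output after training:", relax(x, y, beta=0.0)[1])
```

The appeal of the two-phase scheme, as discussed in the episode, is that the same local dynamics are used for inference and for learning, which makes it a natural fit for physical or neuromorphic hardware.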
