What Kind of Computation Is Cognition?

Recent successes in artificial intelligence have been largely driven by neural networks and other sophisticated machine learning tools for pattern recognition and function approximation. But human intelligence is much more than finding patterns or approximating functions. And no machine system yet built has anything like the flexible, general-purpose, common-sense grasp of the world that every human being has. This talk will address prospects for capturing human common sense in computational terms. I will briefly illustrate recent work modeling intuitive physics and intuitive psychology—that is, computer-implemented formal models of people's intuitive mental models of physical objects, intentional agents, and their causal interactions; how these systems guide our perception, prediction, and planning; and the learning mechanisms that extend our thinking into new domains. I will introduce in non-technical terms the computational ideas underlying these models, such as probabilistic programming, inference over video-game-style simulations, and probabilistic program synthesis. And I will close with some speculative thoughts for discussion about the metaphysics of computation: In what sense can we say that these models capture the real workings of the human mind, or brain?

Josh Tenenbaum is Professor of Computational Cognitive Science at MIT. His research aims to understand in computational terms how intelligence arises in the human mind and brain, and to use these insights to build more human-like artificial intelligence. Together with colleagues in computer science, he is exploring novel paradigms that integrate our best computational thinking from multiple eras and traditions of AI and cognitive science—including probabilistic (Bayesian), symbolic, and neural (differentiable) modeling—in search of a unifying science, engineering, and mathematics of intelligence. He is the recipient of the Distinguished Scientific Award for Early Career Contributions in Psychology from the American Psychological Association, the Troland Research Award from the National Academy of Sciences, the Howard Crosby Warren Medal from the Society of Experimental Psychologists, the R&D Magazine Innovator of the Year award, and a MacArthur Fellowship.

The spring 2022 Shulman Lectures have been organized in conjunction with the Yale College seminar “Metaphysics Meets Cognitive Science” taught by Brian Scholl (Psychology) and L. A. Paul (Philosophy).
