How to grow a mind: Statistics, structure, and abstraction

Joshua B. Tenenbaum, Charles Kemp, Thomas L. Griffiths, Noah D. Goodman

Research output: Contribution to journal › Review article › peer-review

1146 Scopus citations

Abstract

In coming to understand the world - in learning concepts, acquiring language, and grasping causal relations - our minds make inferences that appear to go far beyond the data available. How do we do it? This review describes recent approaches to reverse-engineering human learning and cognitive development and, in parallel, engineering more humanlike machine learning systems. Computational models that perform probabilistic inference over hierarchies of flexibly structured representations can address some of the deepest questions about the nature and origins of human thought: How does abstract knowledge guide learning and reasoning from sparse data? What forms does our knowledge take, across different domains and tasks? And how is that abstract knowledge itself acquired?
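The review's central theme is that probabilistic inference over hierarchically structured representations lets abstract knowledge, itself learned from data, guide generalization from very sparse evidence. The sketch below (not code from the paper) illustrates that idea with a small hierarchical Beta-Bernoulli model, loosely in the spirit of the "bags of marbles" overhypothesis example the authors use to explain learning to learn; the bag data, grid ranges, and the helper bag_marginal are illustrative assumptions, not anything specified in the review.

    # A minimal sketch, assuming a "bags of marbles" setup: each previously seen
    # bag is nearly uniform in color, so the learner can form the overhypothesis
    # "bags tend to be uniform" and generalize from a single draw out of a new bag.
    import numpy as np
    from scipy.special import betaln, gammaln

    # Observed draws from previously seen bags: 1 = black marble, 0 = white marble.
    # Each bag is internally uniform, but bags differ in which color they favor.
    old_bags = [np.ones(20), np.zeros(20), np.ones(20), np.zeros(20)]

    # A single draw from a brand-new bag.
    new_bag = np.array([1])

    # Grid over hyperparameters of a Beta(alpha, beta) prior on each bag's
    # probability of producing a black marble. Small alpha + beta means
    # "individual bags are near-uniform in color".
    alphas = np.linspace(0.1, 10, 40)
    betas = np.linspace(0.1, 10, 40)

    def bag_marginal(k, n, a, b):
        """Marginal likelihood of k black marbles in n draws under a Beta(a, b) prior."""
        log_choose = gammaln(n + 1) - gammaln(k + 1) - gammaln(n - k + 1)
        return np.exp(log_choose + betaln(a + k, b + n - k) - betaln(a, b))

    # Posterior over the shared hyperparameters given the old bags
    # (uniform prior over the grid).
    post = np.zeros((len(alphas), len(betas)))
    for i, a in enumerate(alphas):
        for j, b in enumerate(betas):
            like = 1.0
            for bag in old_bags:
                like *= bag_marginal(bag.sum(), len(bag), a, b)
            post[i, j] = like
    post /= post.sum()

    # Predictive probability that the next marble from the new bag is black,
    # averaging over the hyperparameter posterior. (For simplicity the posterior
    # is not re-updated on the new bag's single draw.)
    k, n = new_bag.sum(), len(new_bag)
    pred = 0.0
    for i, a in enumerate(alphas):
        for j, b in enumerate(betas):
            # Posterior mean of the new bag's color probability: Beta(a + k, b + n - k).
            pred += post[i, j] * (a + k) / (a + b + n)
    print(f"P(next marble from the new bag is black | one black draw) = {pred:.2f}")

Because the old bags teach the learner that bags tend to be near-uniform in color (small alpha + beta), a single black draw from the new bag yields a predictive probability well above 0.5, the kind of one-shot generalization from sparse data that the abstract describes.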

Original language: English (US)
Pages (from-to): 1279-1285
Number of pages: 7
Journal: Science
Volume: 331
Issue number: 6022
DOIs
State: Published - Mar 11 2011
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • General
