Intuitive Theories as Grammars for Causal Inference

Joshua B. Tenenbaum, Thomas L. Griffiths, Sourabh Niyogi

Research output: Chapter in Book/Report/Conference proceeding › Chapter

2 Scopus citations

Abstract

This chapter presents a framework for understanding the structure, function, and acquisition of causal theories from a rational computational perspective. Taking a "reverse engineering" approach, it considers the computational problems that intuitive theories help to solve, focusing on their role in learning and reasoning about causal systems, and then uses Bayesian statistics to describe the ideal solutions to these problems. The resulting framework highlights an analogy between causal theories and linguistic grammars: just as grammars generate sentences and guide inferences about their interpretation, causal theories specify a generative process for events and guide causal inference.
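To make the Bayesian reading of the abstract concrete, the sketch below is a minimal, illustrative example (not taken from the chapter): two candidate causal structures, "C causes E" versus "no causal link", each specify a generative process for contingency data via a noisy-OR model, and Bayes' rule scores them against observed trials. The parameter values (background rate `b`, causal strength `w`) and the uniform prior are assumptions chosen only for illustration.

```python
# Minimal sketch of Bayesian inference over two candidate causal structures.
# Each structure defines a generative process for (cause, effect) trial data;
# Bayes' rule inverts that process to infer which structure generated the data.

def p_effect(cause_present, structure, b=0.2, w=0.8):
    """Probability of the effect under a noisy-OR generative process.
    b = background rate, w = strength of the candidate causal link (illustrative values)."""
    if structure == "C->E" and cause_present:
        return 1 - (1 - b) * (1 - w)   # background cause OR causal link produces the effect
    return b                            # background cause only

def likelihood(data, structure):
    """P(data | structure): product over independent (cause, effect) trials."""
    lik = 1.0
    for cause, effect in data:
        p = p_effect(cause, structure)
        lik *= p if effect else (1 - p)
    return lik

def posterior(data, prior={"C->E": 0.5, "null": 0.5}):
    """P(structure | data) via Bayes' rule."""
    joint = {s: prior[s] * likelihood(data, s) for s in prior}
    z = sum(joint.values())
    return {s: j / z for s, j in joint.items()}

# Contingency data: (cause present, effect present) for each trial.
data = [(1, 1), (1, 1), (1, 1), (1, 0), (0, 0), (0, 0), (0, 1), (0, 0)]
print(posterior(data))   # most posterior mass falls on "C->E" for these observations
```

In the chapter's grammar analogy, an intuitive theory would play the role of a higher-level prior that generates candidate structures like these, just as a grammar generates candidate sentences; the sketch only shows the lowest level of that inference.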

Original language: English (US)
Title of host publication: Causal Learning
Subtitle of host publication: Psychology, Philosophy, and Computation
Publisher: Oxford University Press
ISBN (Electronic): 9780199958511
ISBN (Print): 9780195176803
DOIs
State: Published - Apr 1 2010
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • General Psychology

Keywords

  • Bayesian inference
  • Causal learning
  • Causal reasoning
  • Generative grammar
  • Intuitive theories
  • Probabilistic models
