Action-Based Learning of Multistate Objects in the Medial Temporal Lobe

Nicholas C. Hindy, Nicholas B. Turk-Browne

Research output: Contribution to journal › Review article › peer-review

12 Scopus citations

Abstract

Actions constrain perception by changing the appearance of objects in the environment. As such, they provide an interactive basis for learning the structure of visual input. If an action systematically transforms one stimulus into another, then these stimuli are more likely to reflect different states of the same persisting object over time. Here we show that such multistate objects are represented in the human medial temporal lobe: the result of a mechanism in which actions influence associative learning of how objects transition between states. We further demonstrate that greater recruitment of these action-based representations during object perception is accompanied by attenuated activity in stimulus-selective visual cortex. In this way, our interactions with the environment help build visual knowledge that predictively facilitates perceptual processing.

Original language: English (US)
Pages (from-to): 1853-1865
Number of pages: 13
Journal: Cerebral Cortex
Volume: 26
Issue number: 5
DOIs
State: Published - May 1 2016

All Science Journal Classification (ASJC) codes

  • Cognitive Neuroscience
  • Cellular and Molecular Neuroscience

Keywords

  • action
  • medial temporal lobe
  • object perception
  • predictive coding
  • representational similarity

