Actions constrain perception by changing the appearance of objects in the environment. As such, they provide an interactive basis for learning the structure of visual input. If an action systematically transforms one stimulus into another, then these stimuli are more likely to reflect different states of the same persisting object over time. Here we show that such multistate objects are represented in the human medial temporal lobe, the result of a mechanism by which actions influence associative learning of how objects transition between states. We further demonstrate that greater recruitment of these action-based representations during object perception is accompanied by attenuated activity in stimulus-selective visual cortex. In this way, our interactions with the environment help build visual knowledge that predictively facilitates perceptual processing.
Keywords: medial temporal lobe, object perception, predictive coding, representational similarity