Abstract Coding of Audiovisual Speech: Beyond Sensory Representation

Uri Hasson, Jeremy I. Skipper, Howard C. Nusbaum, Steven L. Small

Research output: Contribution to journal › Article › peer-review

72 Scopus citations


Is there a neural representation of speech that transcends its sensory properties? Using fMRI, we investigated whether there are brain areas where neural activity during observation of sublexical audiovisual input corresponds to a listener's speech percept (what is "heard") independent of the sensory properties of the input. A target audiovisual stimulus was preceded by stimuli that (1) shared the target's auditory features (auditory overlap), (2) shared the target's visual features (visual overlap), or (3) shared neither the target's auditory nor visual features but were perceived as the target (perceptual overlap). In two left-hemisphere regions (pars opercularis, planum polare), the target evoked less activity when it was preceded by the perceptually overlapping stimulus than when preceded by stimuli that shared one of its sensory components. This pattern of neural facilitation indicates that these regions code sublexical speech at an abstract level corresponding to that of the speech percept.

Original language: English (US)
Pages (from-to): 1116-1126
Number of pages: 11
Issue number: 6
State: Published - Dec 20 2007
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • General Neuroscience



