TY - JOUR
T1 - Multisensory integration of dynamic faces and voices in rhesus monkey auditory cortex
AU - Ghazanfar, Asif A.
AU - Maier, Joost X.
AU - Hoffman, Kari L.
AU - Logothetis, Nikos K.
PY - 2005/5/18
Y1 - 2005/5/18
N2 - In the social world, multiple sensory channels are used concurrently to facilitate communication. Among human and nonhuman primates, faces and voices are the primary means of transmitting social signals (Adolphs, 2003; Ghazanfar and Santos, 2004). Primates recognize the correspondence between species-specific facial and vocal expressions (Massaro, 1998; Ghazanfar and Logothetis, 2003; Izumi and Kojima, 2004), and these visual and auditory channels can be integrated into unified percepts to enhance detection and discrimination. Where and how such communication signals are integrated at the neural level are poorly understood. In particular, it is unclear what role "unimodal" sensory areas, such as the auditory cortex, may play. We recorded local field potential activity, the signal that best correlates with human imaging and event-related potential signals, in both the core and lateral belt regions of the auditory cortex in awake behaving rhesus monkeys while they viewed vocalizing conspecifics. We demonstrate unequivocally that the primate auditory cortex integrates facial and vocal signals through enhancement and suppression of field potentials in both the core and lateral belt regions. The majority of these multisensory responses were specific to face/voice integration, and the lateral belt region shows a greater frequency of multisensory integration than the core region. These multisensory processes in the auditory cortex likely occur via reciprocal interactions with the superior temporal sulcus.
KW - Bimodal
KW - Crossmodal
KW - Speech
KW - Superior temporal sulcus
KW - Temporal lobe
KW - Vocalization
UR - http://www.scopus.com/inward/record.url?scp=19044387911&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=19044387911&partnerID=8YFLogxK
U2 - 10.1523/JNEUROSCI.0799-05.2005
DO - 10.1523/JNEUROSCI.0799-05.2005
M3 - Article
C2 - 15901781
AN - SCOPUS:19044387911
SN - 0270-6474
VL - 25
SP - 5004
EP - 5012
JO - Journal of Neuroscience
JF - Journal of Neuroscience
IS - 20
ER -