Differences in emotion recognition from body and face cues between deaf and hearing individuals

Chiara Ferrari, Costanza Papagno, Alexander Todorov, Zaira Cattaneo

Research output: Contribution to journal › Article › peer-review

10 Scopus citations

Abstract

Deaf individuals may compensate for the lack of auditory input by showing enhanced capacities in certain visual tasks. Here we assessed whether this also applies to the recognition of emotions expressed by bodily and facial cues. In Experiment 1, we compared deaf participants and hearing controls on a task measuring recognition of the six basic emotions expressed by actors in a series of video clips in which the face, the body, or both the face and body were visible. In Experiment 2, we measured the relative weight of body and face cues in conveying emotional information when intense genuine emotions are expressed, a situation in which facial expressions alone may have ambiguous valence. We found that deaf individuals were better than hearing controls at identifying disgust and fear from body cues (Experiment 1) and at integrating face and body cues in the case of intense genuine negative emotions (Experiment 2). Our findings support the view that deaf individuals compensate for the lack of auditory input by enhancing perceptual and attentional capacities in the spared modalities, and show that this capacity extends to the affective domain.

Original language: English (US)
Pages (from-to): 499-519
Number of pages: 21
Journal: Multisensory Research
Volume: 32
Issue number: 6
DOIs
State: Published - 2019

All Science Journal Classification (ASJC) codes

  • Experimental and Cognitive Psychology
  • Sensory Systems
  • Cognitive Neuroscience
  • Ophthalmology
  • Computer Vision and Pattern Recognition

Keywords

  • Auditory deprivation
  • Bodies
  • Deafness
  • Emotion
  • Facial expressions
  • Sign language
