Abstract
The social brain hypothesis implies that humans and other primates evolved "modules" for representing social knowledge. Alternatively, no such cognitive specializations are needed because social knowledge is already present in the world: we can simply monitor the dynamics of social interactions. Given the latter idea, what mechanism could account for coalition formation? We propose that statistical learning can provide a mechanism for fast and implicit learning of social signals. Using human participants, we compared learning of social signals with arbitrary signals. We found that learning of social signals was no better than learning of arbitrary signals. While coupling faces and voices led to parallel learning, the same was true for arbitrary shapes and sounds. However, coupling versus uncoupling social signals with arbitrary signals revealed that faces and voices are treated with perceptual priority. Overall, our data suggest that statistical learning is a viable domain-general mechanism for learning social group structure.
| Original language | English (US) |
|---|---|
| Pages (from-to) | 397-417 |
| Number of pages | 21 |
| Journal | Interaction Studies |
| Volume | 12 |
| Issue number | 3 |
| State | Published - 2011 |
All Science Journal Classification (ASJC) codes
- Communication
- Language and Linguistics
- Animal Science and Zoology
- Linguistics and Language
- Human-Computer Interaction
Keywords
- Audiovisual speech
- Crossmodal
- Distributed cognition
- Embodied cognition
- Multimodal
- Multisensory
- Situated cognition
- Social brain