Readers' trait-based models of characters in narrative comprehension

David N. Rapp, Richard J. Gerrig, Deborah A. Prentice

Research output: Contribution to journal › Article › peer-review

99 Scopus citations

Abstract

Our experiments explore readers' application of trait-based situation models for narrative characters. In the first episode of each of our experimental stories, characters performed behaviors that allowed readers to construct trait inferences (e.g., Albert's shoes were "buried under old candy wrappers, crumpled magazines, and some dirty laundry."). Control stories omitted trait-relevant information. The second episode of each story gave readers an opportunity to apply the trait inference to generate expectations about story outcomes. In Experiment 1, participants agreed more readily to explicit outcomes that were consistent with their trait-based models. Experiment 2 demonstrated that readers' expectations were narrowly defined by specific traits (e.g., Albert is sloppy) rather than by more general inferences (e.g., Albert is not a good person). Experiments 3 and 4 suggested that trait-based models have an impact on moment-by-moment reading: Participants were slowed in their reading when story completions were inconsistent with specific trait-based models. Our results have implications both for theories of situation models and readers' causal analyses of narrative texts.

Original language: English (US)
Pages (from-to): 737-750
Number of pages: 14
Journal: Journal of Memory and Language
Volume: 45
Issue number: 4
State: Published - 2001

All Science Journal Classification (ASJC) codes

  • Neuropsychology and Physiological Psychology
  • Language and Linguistics
  • Experimental and Cognitive Psychology
  • Linguistics and Language
  • Artificial Intelligence

Keywords

  • Character traits
  • Inference
  • Narrative causality
  • Situation models
