Demand effects in survey experiments: An empirical assessment

Jonathan Mummolo, Erik Peterson

Research output: Contribution to journal › Article › peer-review

262 Scopus citations

Abstract

Survey experiments are ubiquitous in social science. A frequent critique is that positive results in these studies stem from experimenter demand effects (EDEs): bias that occurs when participants infer the purpose of an experiment and respond so as to help confirm a researcher's hypothesis. We argue that online survey experiments have several features that make them robust to EDEs, and test for their presence in studies that involve over 12,000 participants and replicate five experimental designs touching on all empirical political science subfields. We randomly assign participants information about experimenter intent and show that providing this information does not alter the treatment effects in these experiments. Even financial incentives to respond in line with researcher expectations fail to consistently induce demand effects. Research participants exhibit a limited ability to adjust their behavior to align with researcher expectations, a finding with important implications for the design and interpretation of survey experiments.

Original language: English (US)
Pages (from-to): 517-529
Number of pages: 13
Journal: American Political Science Review
Volume: 113
Issue number: 2
State: Published - May 1 2019
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Sociology and Political Science
  • Political Science and International Relations
