Abstract
Survey experiments are ubiquitous in social science. A frequent critique is that positive results in these studies stem from experimenter demand effects (EDEs): bias that occurs when participants infer the purpose of an experiment and respond so as to help confirm a researcher's hypothesis. We argue that online survey experiments have several features that make them robust to EDEs, and test for their presence in studies that involve over 12,000 participants and replicate five experimental designs touching on all empirical political science subfields. We randomly assign participants information about experimenter intent and show that providing this information does not alter the treatment effects in these experiments. Even financial incentives to respond in line with researcher expectations fail to consistently induce demand effects. Research participants exhibit a limited ability to adjust their behavior to align with researcher expectations, a finding with important implications for the design and interpretation of survey experiments.
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 517-529 |
| Number of pages | 13 |
| Journal | American Political Science Review |
| Volume | 113 |
| Issue number | 2 |
| DOIs | |
| State | Published - May 1 2019 |
| Externally published | Yes |
All Science Journal Classification (ASJC) codes
- Sociology and Political Science
- Political Science and International Relations