TY - JOUR
T1 - Short-term exposure to filter-bubble recommendation systems has limited polarization effects
T2 - Naturalistic experiments on YouTube
AU - Liu, Naijia
AU - Hu, Xinlan Emily
AU - Savas, Yasemin
AU - Baum, Matthew A.
AU - Berinsky, Adam J.
AU - Chaney, Allison J.B.
AU - Lucas, Christopher
AU - Mariman, Rei
AU - de Benedictis-Kessner, Justin
AU - Guess, Andrew M.
AU - Knox, Dean
AU - Stewart, Brandon M.
N1 - Publisher Copyright:
Copyright © 2025 the Author(s).
PY - 2025/2/25
Y1 - 2025/2/25
AB - An enormous body of literature argues that recommendation algorithms drive political polarization by creating “filter bubbles” and “rabbit holes.” Using four experiments with nearly 9,000 participants, we show that manipulating algorithmic recommendations to create these conditions has limited effects on opinions. Our experiments employ a custom-built video platform with a naturalistic, YouTube-like interface presenting real YouTube videos and recommendations. We experimentally manipulate YouTube’s actual recommendation algorithm to simulate filter bubbles and rabbit holes by presenting ideologically balanced and slanted choices. Our design allows us to intervene in a feedback loop that has confounded the study of algorithmic polarization—the complex interplay between supply of recommendations and user demand for content—to examine downstream effects on policy attitudes. We use over 130,000 experimentally manipulated recommendations and 31,000 platform interactions to estimate how recommendation algorithms alter users’ media consumption decisions and, indirectly, their political attitudes. Our results cast doubt on widely circulating theories of algorithmic polarization by showing that even heavy-handed (although short-term) perturbations of real-world recommendations have limited causal effects on policy attitudes. Given our inability to detect consistent evidence for algorithmic effects, we argue the burden of proof for claims about algorithm-induced polarization has shifted. Our methodology, which captures and modifies the output of real-world recommendation algorithms, offers a path forward for future investigations of black-box artificial intelligence systems. Our findings reveal practical limits to effect sizes that are feasibly detectable in academic experiments.
KW - experiment
KW - political polarization
KW - recommendation systems
UR - http://www.scopus.com/inward/record.url?scp=85218961851&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85218961851&partnerID=8YFLogxK
U2 - 10.1073/pnas.2318127122
DO - 10.1073/pnas.2318127122
M3 - Article
C2 - 39964709
AN - SCOPUS:85218961851
SN - 0027-8424
VL - 122
JO - Proceedings of the National Academy of Sciences of the United States of America
JF - Proceedings of the National Academy of Sciences of the United States of America
IS - 8
M1 - e2318127122
ER -