TY - GEN
T1 - Removal of Anti-Vaccine Content Impacts Social Media Discourse
AU - Mitts, Tamar
AU - Pisharody, Nilima
AU - Shapiro, Jacob
N1 - Publisher Copyright:
© 2022 ACM.
PY - 2022/6/26
Y1 - 2022/6/26
N2 - Over the past several years, a growing number of social media platforms have begun taking an active role in content moderation and online speech regulation. While enforcement actions have been shown to improve outcomes within moderating platforms, less is known about possible spillover effects across platforms. We study the impact of removing Facebook groups that promoted anti-vaccine content on engagement with similar content on Twitter. We followed 160 Facebook groups discussing COVID-19 vaccines and prospectively tracked their removal from the platform between April and September 2021. We then identified users who cited these groups on Twitter and examined their online behavior over time. Our findings from a stacked difference-in-differences analysis show that users citing removed Facebook groups promoted more anti-vaccine content on Twitter in the month after the removals. In particular, users citing the removed groups used 10-33% more anti-vaccine keywords on Twitter compared to accounts citing groups that were not removed. Our results suggest that taking down anti-vaccine content on one platform can result in increased production of similar content on other platforms, raising questions about the overall effectiveness of these measures.
AB - Over the past several years, a growing number of social media platforms have begun taking an active role in content moderation and online speech regulation. While enforcement actions have been shown to improve outcomes within moderating platforms, less is known about possible spillover effects across platforms. We study the impact of removing Facebook groups that promoted anti-vaccine content on engagement with similar content on Twitter. We followed 160 Facebook groups discussing COVID-19 vaccines and prospectively tracked their removal from the platform between April and September 2021. We then identified users who cited these groups on Twitter and examined their online behavior over time. Our findings from a stacked difference-in-differences analysis show that users citing removed Facebook groups promoted more anti-vaccine content on Twitter in the month after the removals. In particular, users citing the removed groups used 10-33% more anti-vaccine keywords on Twitter compared to accounts citing groups that were not removed. Our results suggest that taking down anti-vaccine content on one platform can result in increased production of similar content on other platforms, raising questions about the overall effectiveness of these measures.
KW - mis/disinformation
KW - online behavior
KW - social media
UR - http://www.scopus.com/inward/record.url?scp=85133700021&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85133700021&partnerID=8YFLogxK
U2 - 10.1145/3501247.3531548
DO - 10.1145/3501247.3531548
M3 - Conference contribution
AN - SCOPUS:85133700021
T3 - ACM International Conference Proceeding Series
SP - 319
EP - 326
BT - WebSci 2022 - Proceedings of the 14th ACM Web Science Conference
PB - Association for Computing Machinery
T2 - 14th ACM Web Science Conference, WebSci 2022
Y2 - 26 June 2022 through 29 June 2022
ER -