SoK: Content Moderation in Social Media, from Guidelines to Enforcement, and Research to Practice

Published in 8th IEEE European Symposium on Security and Privacy (EuroS&P), 2023

Social media platforms have been establishing content moderation guidelines and employing various moderation policies to counter hate speech and misinformation. The goal of this paper is to study these community guidelines and moderation practices, as well as the relevant research publications, to identify research gaps, differences in moderation techniques, and challenges that should be tackled by social media platforms and the research community. To this end, we study, analyze, and consolidate the content moderation guidelines and practices of the fourteen most popular social media platforms. We then introduce three taxonomies drawn from this analysis and from a review of over two hundred interdisciplinary research papers on moderation strategies. We identify differences between the content moderation employed on mainstream and fringe social media platforms. Finally, we provide in-depth discussions of both research and practical challenges, along with potential solutions.

Recommended citation: M. Singhal, C. Ling, P. Paudel, P. Thota, N. Kumarswamy, G. Stringhini, and S. Nilizadeh, "SoK: Content Moderation in Social Media, from Guidelines to Enforcement, and Research to Practice," in 2023 IEEE 8th European Symposium on Security and Privacy (EuroS&P), 2023. https://ieeexplore.ieee.org/abstract/document/10190527/