Content moderation—the process of deciding what stays online and what gets taken down—is an indispensable aspect of the social media industry. Without it, online platforms would be inundated not just by spam, but by personal bullying, neo-Nazi screeds, terrorist beheadings, and child sexual abuse.
Despite the centrality of content moderation, however, major social media companies have marginalized the people who do this work, outsourcing the vast majority of it to third-party vendors. Who Moderates the Social Media Giants?, a new report from the NYU Stern Center for Business and Human Rights, examines the vital function of content moderation and identifies troubling consequences that flow from its being outsourced. These consequences include inadequate moderation of hateful content in certain developing countries and insufficient mental health care for individual reviewers who suffer psychological side effects from being continually exposed to the worst the internet has to offer.
Download the complete study (PDF): Who Moderates the Social Media Giants?
Supported by the John S. and James L. Knight Foundation and Craig Newmark Philanthropies, the research by the Center for Business and Human Rights focuses primarily on Facebook as a case study that also provides lessons for the self-governance of other platform companies. The report, released on June 8, 2020, concludes with a series of recommendations, including bringing content moderation in-house, expanding the workforce doing this important task, and, in particular, increasing the level of content review in at-risk countries in Asia, Africa, and elsewhere.
Paul M. Barrett is the deputy director of the Center for Business and Human Rights, a Knight-supported center that advances a pro-business, high-standards model. The Center believes that companies should be profitable and competitive while benefiting the people who make their success possible.