A new Gallup/Knight report looks at how Americans weigh free expression online against the threats posed by harmful content. Gallup and Knight invited several experts to weigh in on these findings and to place them within the broader context of public debates about online media and free expression. Their views are offered in a personal capacity and do not reflect the views of Gallup, Knight Foundation, or the organizations with which they are affiliated.
The Free Expression, Harmful Speech and Censorship in a Digital World report could not be timelier, particularly given recent steps to stand up the Oversight Board, which will review certain content decisions on Facebook and Instagram and of which I am a member.
Many findings in the report resonated with me personally. For example, the polling shows that the “vast majority [of Americans] have little or no trust in social media companies making the right decisions about what content appears on their sites or apps.” For some time now, I have voiced my concerns about the concentration of power in corporate actors over the online discourse of billions, especially when such private sector decision-making is untethered from First Amendment or international human rights law principles.
The report highlights that Americans are very wary of governments making content decisions. Given the worldwide trend of problematic governmental restrictions on speech, such concerns are well-founded. The report also highlights that most Americans think oversight boards, which would review content moderation by companies, are a “good” or “very good” idea. In particular, Americans value transparency and diversity with respect to such boards, closely followed by independence and then a board’s ability to make binding decisions.
These types of factors played a role in my decision to serve on the Oversight Board. The Board’s decisions on disputed content will be posted on our website, and the Board will write an annual report summarizing its findings as well as the platforms’ reactions to its work, which will provide an opportunity for public scrutiny. Facebook must also respond to all Board decisions and recommendations publicly.
The Board has members from all over the world with different professions, areas of expertise and cultural backgrounds. While the Board can never approximate a level of diversity that encompasses the experience of billions, it does have the ability to solicit outside expertise in reaching its decisions, which will be helpful in, among other things, understanding the local context relating to content decisions.
In terms of independence, it was important to me that Board members not be employees of the platforms they oversee and that their tenure on the Board could not be revoked because of their decisions. While the initial four Board chairs were selected by Facebook, the rest of us were selected by both the chairs and Facebook. Once we reach forty members on the Board, future members will be selected solely by the Board.
I put more weight on the Board’s ability to render binding decisions than is reflected in the polling results. The Board has the power to render binding decisions with respect to specific pieces of content that it accepts for review, as well as the ability to make broader recommendations to which Facebook must respond publicly. To me, it is important that the Board have certain binding powers for its mission to be impactful.
Although not part of the polling questions, it was also important to me that international human rights law principles play a role in the Board’s work. For some time now, I have argued in my scholarship that social media companies should respect international human rights standards in running their platforms. With regard to freedom of expression, that would mean, among other things, that companies refrain from imposing vague speech codes or banning speech when less intrusive means of achieving public interest objectives exist. I am pleased that the Board is committed to “upholding freedom of expression within the framework of international norms of human rights.”
All that said, the Board is a bold new approach to content moderation on a global scale. We will be both building a new institution and resolving matters involving disputed content. There will no doubt be important lessons that we learn along the way. The Board will not solve all the problems of social media, nor will it displace appropriate governmental regulation.
The work and responsibilities facing the Board are humbling, if not daunting, but important and worthy of our utmost commitment to protect the future of human rights online, including freedom of expression.
Evelyn Aswad is professor of law and the Herman G. Kaiser Chair in International Law at the University of Oklahoma College of Law, where she is the director of the Center for International Business and Human Rights. She is a member of the Oversight Board and the former director of the Office of Human Rights and Refugees at the U.S. Department of State Legal Bureau.