When It Comes to Managing Online Content, Americans Want It All. Can They Get It?
Prussian General Carl von Clausewitz famously called war “the continuation of politics by other means.” President Donald Trump’s escalation of his feud with Twitter in May, via an executive order demanding changes to a landmark law governing social media platforms, might be characterized as “policy as the continuation of Twitter by other means.”
But even if the president’s tiff with social media feels more personal than substantive, he is hardly unique among Americans in raising questions about the right way to manage the digital public sphere. A new poll we’ve conducted with Gallup shows that 8 in 10 Americans do not trust social media companies to make the right call on what content to leave up or take down. Yet a majority still favor letting companies make the call rather than letting government decide.
The president may not trust Twitter, but he clearly still loves the service. Apparently, most of us feel the same way. So what is the path forward in a world where social media platforms are likely to dominate even more as our primary channels of communication?
The answer may have less to do with who holds these companies accountable than with the perceived independence and trustworthiness of how they are held accountable.
Our current regime relies heavily on private standards of conduct. The landmark law governing social media platforms, Section 230 of the Communications Decency Act, actually predates the founding of Facebook by nearly a decade. Broadly construed, the law protects digital platforms from legal liability for third-party content posted to their platforms, with important carve-outs for illegal content, such as child pornography or content related to human trafficking.
The impetus for the law, which made sense at the time, was to protect the internet as an open and free space and to enable innovation without fear of costly litigation. Today, many are asking whether the hopes for a free and open internet should be tempered by the reality of misinformation, hate speech and harassment, which are all too prevalent online.
Predictably, there are sharp disagreements about how to move forward. Some argue that, as private platforms, social media companies should be free to exercise their own prerogative over how they manage content. Many who share this perspective are also wary that government intervention would endanger the very spirit of openness and free expression that makes the internet great.
Others believe that many of the harms perpetrated online are not speech in the First Amendment sense at all, but rather forms of conduct, such as certain kinds of harassment. These critics see a need for government to narrow the scope of platforms' legal immunity as a way to force them to take responsibility for the substantial and instantaneous distribution of content they enable.
Our Gallup/Knight poll suggests the important question may not be whether government or private actors take action, but the character of that action. We asked respondents about Facebook's new Oversight Board, an independent group of scholars and advocates appointed to review some of the company's content decisions. The board will have enforceable authority to review certain content decisions and to hear “appeals” from users.
While many details remain to be determined and tested as the board proceeds, what is notable is that respondents' perceptions were shaped positively by three values in particular: how transparent the board would be about its process, how diverse its membership would be, and whether it would be independent.
This suggests that what we need for social media is an accountability system that embodies the virtues of the best public institutions: transparency about how procedures are applied, trust that decisions will not be made, consciously or unconsciously, by individuals who reflect only a partial point of view, and confidence that the system is independent of partial interests (such as commercial motivations).
At its finest, our democracy comprises public and private institutions that seek to live up to those ideals. Now we need the same for the internet. Whether it takes the form of governmental authority or a private oversight body, what we require is accountability people can believe in.
Sam Gill is SVP/chief program officer at Knight Foundation. Email him at [email protected] and follow him on Twitter @thesamgill.