Ellen Pao
Knowing what you know now about the internet and how your venture turned out, what do you wish you had done differently from the beginning?
We have learned so much, and as we continue to share these lessons, I wish leaders would listen and do the work.
Below are two lessons that so many of us have learned the hard way, and that so many continue to ignore:
- Sunlight is not the best disinfectant when it comes to hate. It just reinforces power structures that already exist and favors young, cisgender white men. The 2019 New Zealand mosque killings resulted in hate-filled praise on social media platforms and widely shared manifestos, which likely encouraged followers and inspired further hate crimes. The free speech argument is used as a cover for inaction, yet it often falls by the wayside when speech that makes leaders unhappy gets posted. Allowing all speech reinforces the uneven playing field and makes platforms unsafe for people who are less powerful, who have less of a voice and less of a platform on which to speak up. And the real-life harm has been much greater than we ever imagined: people are being harassed to the point where they fear for their safety. An actual insurrection that killed five people took place at the United States Capitol in January 2021.
The founders and leaders of Twitter, 4chan and 8kun (formerly 8chan) have all expressed regret at not understanding the impact of unlimited speech policies on their platforms.
- The fear that platform employees have of their users drives a lot of decision making. My guess is that there is widespread fear of the people who are hateful, harassing and trolling on social media platforms. They are threatening not only other users but also employees. That fear fosters inertia in changing rules, dealing with harassment and enforcing unpopular policies. The result is that rules aren't clear. And when a platform doesn't follow its own rules, even more people stop following them. If your platform doesn't hold people accountable, your rules have no real meaning. And when people don't know what the rules are, you end up with more and more vitriol. Unfortunately, the angry mob is a harmful mob: the false identification of the Boston Marathon bomber in 2013 was a disaster.
The solutions are clear and have been proven to work:
- Enforce anti-harassment policies. Because most platforms have shown real reluctance to do so, resulting in real harm, regulators need to step in. We need to treat harassment as a real cost of doing business by imposing liability for harm. Executives, companies and board members should be held accountable for their actions and inactions in preventing the harm we see caused by lax policies or poor implementation and enforcement. Cross-platform harassment is real, and should be considered when trying to address harm.
- We need leaders with empathy for people who are experiencing harassment. We need people from the groups that keep getting pushed off platforms: Black, Latinx, Indigenous, and Asian users; women and nonbinary users; transgender users; and disabled users. Shared experiences make a difference when leaders have the power and will to drive change.
Lessons From the First Internet Ages
What is the future of the internet? Thirty years after the creation of the first web page, what have we learned about the impact of the internet on communication, connection, and democracy? Join the Knight Foundation for Lessons from the First Internet Ages, a virtual symposium that will explore and evaluate what key figures in the development […]