Modern American democracy is inextricably intertwined with both the benefits and harms of globalized capitalism. Meanwhile, globalized capitalism has allowed the rise of new centers of power with little or no accountability. Two such power centers, the tech industry and contemporary philanthropy, will be key actors in the next decade of American democracy. We argue that American philanthropy must use its position and power to help design and put in place new forms of hard accountability for both sectors.
Our challenge to American philanthropy is twofold. First, it must use its power, influence, and money to dismantle “techno-solutionism”—the idea that technological solutions are the key to strengthening American (or any) democracy. Technology is not and never will be the solution to social and political problems, but the fetishism of technology in the public sector has been extremely lucrative for many private companies. All too often we have seen how technology, adopted in the name of progress, actually exacerbates and entrenches society’s problems. Modern capitalism and the technologies arising from it are strangling the very ideals of democracy in the United States. Combined with a rise in nationalism and nativism, this version of techno-solutionism is reinforcing structural inequities, systemic racism, and economic injustice.
Second, we must acknowledge and reform philanthropy’s own dependence on techno-solutionism. The tech sector’s runaway financial success—enabled by a deafening regulatory silence—has propped up the economy, and philanthropy’s endowments, for the last two decades. This has birthed a new class of philanthropists informed directly by the techno-solutionist logic of the tech industry. During this time, philanthropy has spent uncounted millions advancing and funding a narrative that posits new technologies as the solution to our most pressing social problems. Completing the cycle of globalization, American tech companies and philanthropic organizations have come together to export that narrative to much of the rest of the world, driving the agenda of both humanitarian aid agencies and development organizations.
Today, philanthropy and the tech industry are balanced on the same precipice. Most who participate in these sectors do so with considerable expertise in their fields and the best of intentions; they intend to use their money and power to do good. Yet, without structures in place to serve as checks on capitalism and tech power, both sectors are bound to reinforce persistent structural inequities and thus to undermine democracy. Capitalism and tech both need to be reined in by mechanisms demanding much stricter accountability if democracy is to flourish. Philanthropic organizations have a choice. They can either commit to democracy by helping to build those guardrails, fully cognizant that doing so will limit their own power, or they can become increasingly complicit in the maintenance of structural inequality.
The Good, the Bad, and the Ugly
Today, in the United States, techno-solutionism is thriving as never before. Given the economic and cultural power of the technology industry in the United States, technologists, politicians, theorists, and philanthropists are all eager to propose new technologies as the solution to structural, systemic, and political problems. But here’s the dilemma: technology consistently mirrors and magnifies the good, the bad, and the ugly of society.
Consider that, over the last decade, we’ve seen the widespread adoption of data-centric technologies in our criminal justice system, from policing to the courts. The rule of law and the desire to enact equal justice for all are at the heart of the American experiment in democracy. And yet we know this goal has not been realized, that both legal remedies and just outcomes continue to be denied to many. The protests against police brutality and violence—including, most notably, the uprisings in defense of Black lives that have swept the country in 2020—have, correctly, brought these urgent concerns to the forefront of American life. These protests have occurred against a backdrop of rising technological solutionism in our justice system. Over the past ten years, we’ve seen the rollout of police body cameras to record interactions with members of already over-surveilled Black and brown communities as a mechanism for police accountability; they were deployed with little research backing the claim that they would, in fact, increase police accountability and with little clarity about the policies needed to produce that outcome. We’ve seen the inclusion of algorithmic risk-assessment systems in courtrooms that provide guidance to judges—using biased and unverified data and data models—on sentencing and bail, landing individuals in jail with no legal recourse to challenge the tool that sent them there. And the highly anticipated use of facial recognition technologies by law enforcement, again with no recognition of the ways in which those technologies abrogate civil liberties and basic principles of democratic practice, has been paused only recently because of multiyear, concentrated efforts by a coalition of rights groups and scholars of color who understood the potential harm from the outset.
Each of these technologies was originally developed either as a neutral effort outside of the context of criminal justice or as a solution to efficiency needs that also purported to solve long-standing biases within the field. Yet good intentions do not prevent technologies from doing harm. And the harm keeps accumulating. Fundamentally, these technologies entrench existing practices of surveillance and over-policing of marginalized populations and deepen inequalities through the falsely perceived neutrality of data. Moreover, these technologies cost significant taxpayer money or are paid for through abusive fees and fines levied disproportionately on Black and brown communities. This inequity adds up. It sows distrust of the institutions of democracy among the many unfairly targeted communities and further undermines our shared future.
Democratic values are challenged by techno-solutionism in every sphere of public life. Social media platforms, for instance, promise to give all users the ability to broadcast their voices but have been shown to reproduce long-standing inequities in whose voices have power and authority. Platforms have, at times, interpreted the First Amendment to mean that more speech is good speech, a position that fails to account for how individuals and whole communities are systemically silenced when others exercise their right to speak. Social media have enabled many people to come together but have also helped produce a public square easily captured by harassment and hate. This cannot be fixed by creating yet another social media platform.
To be sure, technology can be used to challenge the status quo: activists mobilize through online platforms, witnesses livestream abuses of power, and advocates share information faster to broader publics thanks to the internet. Yet there is nothing inherently democratic about these technologies. These same tools are often used to harass people, spread disinformation, and amplify hate. It is because these technologies give us both what we want and what is most destructive to the fabric of democracy that nuanced and socially grounded responses are needed to mitigate the worst harm and protect communities from the re-entrenchment of long-standing violence and inequity. Unfortunately, this is not what powerful and well-resourced voices in the tech industry are calling for as an antidote to our current woes. Theirs is, rather, an apolitical, unreflective “build” mentality, with a focus on speed and techno-solutionism and no answer to the question, “For whom?”
We need philanthropy to focus on supporting the social research, translation, and advocacy that will enable data-centric technologies to play the role of advancing justice and equity in our society that so many imagined they would. Researchers working in areas such as fairness and accountability in machine learning, sociotechnical security, and media manipulation are pushing the conversation forward through new findings that expand how we understand the role of technology in society. This research allows us to understand where good intentions can go terribly awry and gives us pathways to strategically mitigate that harm. This work showcases the importance of focusing on structural changes rather than on ad hoc responses to signs of democratic distress.
Over the next ten years, philanthropy should not be concerned with building new technologies; the concentration of capital and power in Big Tech ensures that this will happen commercially, and the drive for adoption within the public sector is unlikely to subside. Rather, philanthropy should act as a bulwark and counterweight against technological solutionism. Philanthropy can clear and hold space for the multiyear, expensive, and radically necessary work to ensure that technology serves just societal outcomes. All too often, technology will feel like a “safe” investment with tangible returns that please boards of directors and living donors. But apps do not and will never produce the hard-to-demonstrate long-term societal change that is needed to ensure a democratic future.
The Role of Philanthropy in Rebuilding Democracy
Philanthropy and democracy are uneasy bedfellows. Modern American philanthropy was born from the spoils of the last Gilded Age and benefits hugely from the wealth generated by the current one. A handful of living donors and institutions with large endowments determine what is deserving of funding. In assuming this role and taking advantage of our charitable giving laws, they become only nominally accountable, unelected centers of power. Philanthropy needs to grapple with its own position of power and privilege; it cannot become a check on capitalist power unless it chooses to fundamentally reform its sources of power and authority. The sector must reckon with the ways in which American legal structures and tax codes supporting charitable giving reinforce and deepen inequality. Moreover, it is critical for philanthropy to grapple with how these structures centralize unchecked power in private entities rather than in public institutions supported directly by taxpayer dollars.
Our current version of globalized capitalism has led to the financial and political dominance of the tech industry, the invasions of mass data collection and profiling, and the concentration of power in a tiny number of investors and companies. Like the previous Gilded Age, this wave of lightly regulated capitalism has produced widespread inequity even as wealthy individuals, stock markets, and endowments flourish. This is what requires American philanthropy to support the reform of both the norms and the laws that anchor techno-capitalism.
This is not rapid-response, reactive work. Rather, philanthropy must do more to center its attention and influence on what it means to uphold justice at the intersection of technology and democracy. First and foremost, this means a long-term commitment to building and sustaining an array of organizations that are doing that work. The past five years have seen a flowering of new, powerhouse, tech-focused organizations combining research and advocacy with an eye toward racial justice in the United States. There are also a growing number of scholars and university centers producing an evidence base, hosting workshops and fellowships, and supporting a new generation to think about socially informed governance of data-centric technologies and to treat these issues as their problem. Many civil rights groups are fighting for new checks on tech power. And yet each of these organizations, scholars, and networks must precariously balance short-term funding realities against the need for longer-term strategy, and must try to build a durable network of actors when the norm is a one-year fellowship or a two-year funded role.
Second, philanthropy must commit to supporting whole organizations. The ongoing shift in philanthropy from project support toward general support is very positive. That shift must continue and expand, and it must be matched by multiyear runways that allow receiving organizations to thrive. To strengthen and retain new talent, organizations need sustainable structures enabling the development of nonprofit career paths that don’t require staff to move to other industries for stability and advancement. Building a strong ecosystem of organizations that can work together strategically over the long term also means strengthening leadership training. Talented and dynamic leaders start organizations and persuade funders of the value of their ideas. Yet often funders aren’t willing to support the infrastructure costs of running a healthy and sustainable organization. Many foundations, if they cover indirect costs or overhead at all, will cap these contributions at 10–15 percent of direct programmatic costs. The MacArthur Foundation, in a study released last year, found that the real cost of administering a stable and healthy organization is 29 percent of direct costs—nearly double its previous rate. Philanthropy must commit to supporting the internal equity and justice processes—from training in equitable management to support for building ethical fundraising structures to ensuring inclusive and robust hiring practices—at the organizations it supports. This work is crucial, expensive, and time-consuming and can be done only with full commitment from organizational leaders and empathetic funders with a realistic understanding of what organizational sustainability really looks like.
The third commitment that philanthropy can make is to proactively identify areas in which technological solutionism is driving major investment from corporations or government—and then robustly support a field-level response: counterweight work that clearly brings to light the social impacts of technology. Philanthropy can support the crucial chain of work that needs to be done: basic research to identify new frames of understanding, applied research to build an evidence base of specific instances of harm, advocacy, movement building, policy-making, enforcement, and, wherever possible, lobbying to shift those investments in the first place. At the time of this writing, a bill before Congress, the Endless Frontier Act, proposes a $100 billion investment in the National Science Foundation, creating a technology directorate and designating ten key topics for research attention over the next decade. This is a tremendous opportunity, but, unfortunately, not one of those ten topics addresses the social impacts of technology.
A fourth commitment that philanthropy can make is to prioritize inclusion in both funding decisions and internal staffing. On the funding side, philanthropic organizations should build programs that are explicitly intended to support historically underrepresented leaders, activists, and scholars in the field, including structural support for platforms, training, and long-term financial stability. Building truly inclusive programs inevitably means making hard choices, such as withdrawing long-standing support from organizations or individuals who have done great work but whose very structure and leadership reinforce structural inequality. It means internal foundation accountability mechanisms designed to review portfolios and programs before commitments are made and to assess core questions: not only, “Where is the money being directed?,” but also, “Who has defined the core assumptions and hypotheses driving the work?” If the answer does not include broad gender representation, people of color, and those with a range of expertise that goes beyond recognized credentialing, then those commitments need to be revisited.
These commitments are about ensuring a robust and healthy array of organizations to carry these fights forward over the coming decade to challenge the core precept of techno-solutionism: that technology alone will solve intractable social problems. What these organizations will need to do is continue the work of determining what real accountability to society and particularly to vulnerable communities looks like for these centers of technological power and then putting those mechanisms into place with robust enforcement. This in itself is a tremendous challenge. But that leaves open the question of philanthropy’s own complicity through the very structure of American charitable giving laws.
We believe that for American democracy to flourish in the next decade, American philanthropy needs to lead a public conversation about reining in its own power and shifting that power to other, more democratic, venues. To do this, philanthropy will need to acknowledge three core points: its power to set and enact broad social and political agendas, the lack of accountability that is a hallmark of philanthropic giving under the current system, and the reality that both of those conditions are upheld by the immense financial power that has accrued to technology companies—and their shareholders—through US adherence to a system of globalized capitalism.
Unchecked concentrations of power undermine democratic practice. We see those concentrations of power both in the tech industry and in philanthropic organizations. We also see that most individuals in both sectors undertake their work with the best of intentions. What needs to be challenged and remade are the structures—tax codes, regulatory frameworks, and legacy legislation—that allow persistent inequities to become further entrenched. Even benevolent dictators are dictators. Globalized capitalism and data-centric technologies both need stronger public controls if democracy is to flourish in the coming decades. Philanthropy is in a position to help reframe and amplify this stance and then drive the design and adoption of those controls. If successful, this effort will limit philanthropy’s own power—and we believe that it will contribute to a more robust democracy in the decades to come.
Janet Haven is the executive director of Data & Society. She previously worked for the Open Society Foundations, where she oversaw funding strategies and grant-making related to technology’s role in supporting and advancing civil society, particularly in governance, human rights, and transparency and accountability. She began her career in technology start-ups in Central Europe, participating in several successful acquisitions. She sits on the board of the Public Lab for Open Technology and Science and serves as an adviser to a range of nonprofit organizations.
danah boyd is a partner/researcher at Microsoft Research and the founder/president of Data & Society. Her research is focused on addressing social and cultural inequities by understanding the relationship between technology and society. She is a director of both Crisis Text Line and the Social Science Research Council, and a trustee of the National Museum of the American Indian. She received a BA in computer science from Brown University, an MA from the MIT Media Lab, and a PhD in information from the University of California, Berkeley.
 Mark Latonero, “Opinion: AI for Good Is Often Bad,” Wired, November 18, 2019, https://www.wired.com/story/opinion-ai-for-good-is-often-bad/.
 See Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code (Cambridge, UK: Polity Press, 2019).
 See Alex Rosenblat, Kate Wikelius, danah boyd, Seeta Peña Gangadharan, and Corrine Yu, “Data & Civil Rights: Criminal Justice Primer,” primer, Data & Civil Rights: Why “Big Data” Is a Civil Rights Issue, Washington, DC, October 30, 2014, http://www.datacivilrights.org/pubs/2014-1030/CriminalJustice.pdf; Angèle Christin, Alex Rosenblat, and danah boyd, “Courts and Predictive Algorithms,” primer, Data & Civil Rights: A New Era of Policing and Justice, Washington, DC, October 27, 2015, http://www.datacivilrights.org/pubs/2015-1027/Courts_and_Predictive_Algorithms.pdf; and Cathy O’Neil, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (New York: Crown, 2016).
 The Leadership Conference on Civil and Human Rights and Upturn, “Police Body Worn Cameras: A Policy Scorecard,” https://www.bwcscorecard.org/.
 Jay Stanley and Peter Bibring, “ACLU to Justice Department: Don’t Give Money to LAPD for Body Cameras,” American Civil Liberties Union (blog), September 3, 2015, https://www.aclu.org/blog/privacy-technology/surveillance-technologies/aclu-justice-department-dont-give-money-lapd-body.
 Rebecca Wexler, “Life, Liberty, and Trade Secrets: Intellectual Property in the Criminal Justice System,” Stanford Law Review 70 (2018): 1343–1429, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2920883; David G. Robinson and Logan Koepke, “Civil Rights and Pretrial Risk Assessment Instruments,” Upturn, December 2019, https://www.upturn.org/static/files/Robinson-Koepke-Civil-Rights-Critical-Issue-Brief.pdf; and Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner, “Machine Bias: There’s Software Used Across the Country to Predict Future Criminals. And It’s Biased Against Blacks,” ProPublica, May 23, 2016, https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.
 Joy Buolamwini and Timnit Gebru, “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification,” Proceedings of Machine Learning Research 81 (Conference on Fairness, Accountability, and Transparency, New York University, February 23–24, 2018), 77–91.
 Malkia Devich-Cyril, “Defund Facial Recognition,” Atlantic, July 5, 2020, https://www.theatlantic.com/technology/archive/2020/07/defund-facial-recognition/613771/; and Robert Williams, “I Was Wrongfully Arrested Because of Facial Recognition. Why Are Police Allowed to Use It?,” Washington Post, June 24, 2020, https://www.washingtonpost.com/opinions/2020/06/24/i-was-wrongfully-arrested-because-facial-recognition-why-are-police-allowed-use-this-technology/.
 Virginia Eubanks, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (New York: St. Martin’s Press, 2018).
 See Michele Gilman, “AI Algorithms Intended to Root Out Welfare Fraud Often End Up Punishing the Poor Instead,” The Conversation, February 14, 2020, http://theconversation.com/ai-algorithms-intended-to-root-out-welfare-fraud-often-end-up-punishing-the-poor-instead-131625; Caroline Haskins, “How Ring Transmits Fear to American Suburbs,” Motherboard: Tech by Vice, December 6, 2019, https://www.vice.com/en_us/article/ywaa57/how-ring-transmits-fear-to-american-suburbs; Lauren Kirchner and Matthew Goldstein, “Access Denied: Faulty Automated Background Checks Freeze Out Renters,” The Markup, May 28, 2020, https://themarkup.org/locked-out/2020/05/28/access-denied-faulty-automated-background-checks-freeze-out-renters; and Mary Madden, “The Devastating Consequences of Being Poor in the Digital Age,” New York Times, April 25, 2019, https://www.nytimes.com/2019/04/25/opinion/privacy-poverty.html.
 Alexandra Mateescu, Alex Rosenblat, and danah boyd, “Dreams of Accountability, Guaranteed Surveillance: The Promises and Costs of Body-Worn Cameras,” Surveillance & Society 14, no. 1 (May 2016): 122–27, https://doi.org/10.24908/ss.v14i1.6282; and Ava Kofman, “Digital Jail: How Electronic Monitoring Drives Defendants into Debt,” ProPublica, July 3, 2019, https://www.propublica.org/article/digital-jail-how-electronic-monitoring-drives-defendants-into-debt.
 Robert Richards and Clay Calvert, “Counterspeech 2000: A New Look at the Old Remedy for ‘Bad’ Speech,” BYU Law Review 2000, no. 2 (May 2000): 553–86; and Nabiha Syed, “Real Talk about Fake News: Towards a Better Theory for Platform Governance,” Yale Law Journal (October 2017): 337–57, https://www.yalelawjournal.org/forum/real-talk-about-fake-news.
 Mary Anne Franks, The Cult of the Constitution (Redwood City, CA: Stanford University Press, 2019); André Brock, “From the Blackhand Side: Twitter as a Cultural Conversation,” Journal of Broadcasting and Electronic Media 56, no. 4 (October 2012): 529–49; Alexandra Siegel, Evgenii Nikitin, Pablo Barberá, Joanna Sterling, Bethany Pullen, Richard Bonneau, Jonathan Nagler, and Joshua A. Tucker, “Trumping Hate on Twitter? Online Hate in the 2016 US Election and Its Aftermath,” Social Media and Political Participation, New York University, March 6, 2019.
 Amanda Lenhart, Michele Ybarra, Kathryn Zickhur, and Myeshia Price-Feeney, “Online Harassment, Digital Abuse, and Cyberstalking in America” (New York: Data & Society Research Institute, November 21, 2016), https://datasociety.net/wp-content/uploads/2016/11/Online_Harassment_2016.pdf; Nick Lowles, Nick Ryan, and Jemma Levene, eds., “State of Hate 2020: Far Right Terror Goes Global,” Hope Not Hate (February 2020), https://www.hopenothate.org.uk/wp-content/uploads/2020/02/state-of-hate-2020-final.pdf; and Brandi Collins-Dexter, “Canaries in the Coalmine: COVID-19 Misinformation and Black Communities,” Harvard Kennedy School, Shorenstein Center on Media, Politics and Public Policy, June 9, 2020, https://shorensteincenter.org/wp-content/uploads/2020/06/Canaries-in-the-Coal-Mine-Shorenstein-Center-June-2020.pdf.
 See the ACM Conference on Fairness, Accountability, and Transparency (ACM FAccT): https://facctconference.org/.
 Matt Goerzen, Elizabeth Anne Watkins, and Gabrielle Lim, “Entanglements and Exploits: Sociotechnical Security as an Analytic Framework,” 9th USENIX Workshop on Free and Open Communications on the Internet, Santa Clara, CA, August 13, 2019, https://www.usenix.org/conference/foci19/presentation/goerzen.
 Alice Marwick and Rebecca Lewis, “Media Manipulation and Disinformation Online,” Data & Society Research Institute, 2017.
 Rob Reich, Just Giving: Why Philanthropy Is Failing Democracy and How It Can Do Better (Princeton, NJ: Princeton University Press, 2018).
 Tim Wu, The Curse of Bigness: Antitrust in the New Gilded Age (New York: Columbia Global Reports, 2018).
 See Algorithmic Justice League (founded 2016), Black in AI (founded 2017), Data for Black Lives (founded 2017).
 “Changing How We Support Indirect Costs,” MacArthur Foundation, December 16, 2019, https://www.macfound.org/press/perspectives/changing-how-we-support-indirect-costs/.
 Jeffrey Mervis, “Bill Would Supersize NSF’s Budget—and Role,” Science 368, no. 6495 (June 2020): 1045.