Gathering: A Prerequisite for Democracy

Shall we gather at the river?

Traditional Christian hymn, Robert Lowry, 1864

Organize the hood under I Ching banners,

Red, black, and green instead of gang bandannas

FBI spying on us through the radio antennas

And them hidden cameras in the streetlight watching society

With no respect for the people’s right to privacy

“Police State,” dead prez, 2000

Mourning is a collective act. Refugees and diasporic communities know the pain and sorrow of mourning from afar; now this truth of geography has become a pandemic truth. Dying alone, mourning alone compounds the sorrow. We are social beings. We gather to mourn.

And we gather to build societies. Coming together for friendship, worship, play, learning, commerce, protest, governing, mourning, or celebration is fundamental. The desire to be with others is innate; the right to do so is declared universal, and the design of spaces—both physical and digital—to encourage or prevent gathering defines countless professions. At the root of civil society is the need to gather, the right to assemble, the ability to voluntarily associate.

The pandemic of 2020, shelter-in-place orders, and protests against endemic racism have brought new attention to our desire and right to assemble. The object of this attention is legitimate: our ability to gather is threatened, though many protesters raising concerns in this context have misdiagnosed the cause. Public health directives to manage viral spread by means of sheltering and mask-wearing are not the problem. Street protests are an expression of our right to gather, not a threat to that right. On the contrary, our ability to assemble and gather is threatened by the same combination of forces that have so damaged our experience of free expression: the private, digitized control of the spaces where gathering happens and state limitations on the right to assemble.[1] This essay focuses on those digital threats.

Profit-maximizing corporate control of online communications channels has changed who dictates the rules for speech as well as how and where it takes place. Public governance is changing in response, and policymakers, scholars, and activists are retheorizing everything from antitrust to employment law and elections to protect public interests.[2] Efforts over the last three decades to define and protect our rights of expression in digital spaces reveal the magnitude of what’s needed to protect our rights of assembly.

The challenges we face in maintaining our rights to assemble and to speak freely are similar in magnitude, but their location is different. Assembly occurs in person, online, and in the spaces that connect the two. To protect our ability to gather, we first need to reconceptualize time and space.

Living in the Liminality

The places and ways in which we gather are changing. The consequences of this can be seen in how we take collective action as well as in the role of philanthropy. To follow this logic, we must first consider how we understand “digital spaces,” physical spaces, and the liminal area between them. We are hampered by our own language and metaphors. The phrase “going online” persists, as if the internet is a place that is separate from “real life.” This distinction no longer holds.

There are more objects than people connected to the internet today.[3] By embedding digital controls in doorbells and traffic lights, we have “turned on” our physical spaces. The internet serves as the unseen infrastructure tying all these devices together and serving as a globally networked on/off switch for our physical spaces. Everything that happens in such connected spaces—and every institutional function connected to the sensors—becomes dependent on the digital network. By installing audio, visual, and data sensors throughout our built environment and public places, we have effectively digitized our physical spaces.

People with mobile phones, functioning electrical grids, and affordable internet access spend their days phase-switching between active, passive, and remote digital data generation. The active phase involves sending text messages or emails, storing documents in the cloud, shopping online, or videoconferencing with colleagues, congregations, or community groups.

The passive experience may begin when you wake up, if your only digital device is a smartphone that you grasp upon rising. If you have connected thermostats, voice assistants, or other devices in your home, you were online even while sleeping. These devices emit trails of data beyond our sightlines, making us visible to external service providers and announcing where we are and with whom.

The remote phase of our digital existence begins when we leave our homes, as street cameras, license plate readers, and employer-owned software all track our actions.[4] Wherever we go in most US towns or cities, we are “seen” by our built environment.[5] When we leave those spaces for the rural outdoors or our domestic indoors, the majority of us bring along our own location-aware, data-generating devices.

Being “online”—in the sense that we are generating digital data that is collected by third parties—is now our default state. We switch invisibly across these phases of active, passive, and remote, weaving trails of digital data that tie us to time, place, resources, and other people. For many of us, now, it takes effort to go offline and become untrackable since our normal state is to emanate digital signals.

Most of us move through these phases, but the burdens of doing so fall unjustly on already marginalized communities. The liminal is no less racialized and discriminatory than the physical and virtual spaces it sits between. Black people and their communities are not only over-surveilled; they are used as deliberate testing grounds for technologies such as facial recognition, license plate readers, and Stingray devices that feed cell phone IDs to police.[6] Despite pervasive digital monitors, these same communities are also the ones most likely to lack reliable, affordable internet access of their own.[7]

Being watched changes how we gather. Leaders trying to build trust in their communities may seek to gather in places away from external gaze—a feat that becomes harder to do as we digitize ever more of our physical spaces. Activists gather and disperse quickly, design obfuscating coverings to mask themselves from sensors, dispose of their technology, and rely on coded language and encrypted software. Savvy communities must now incorporate a deep and adaptive understanding of the digital environment into their work, regardless of whether their mission focuses on education, health, poverty alleviation, environmental justice, or political participation.

Those who know what it is to be watched also know how to evade, obfuscate, and confuse the watchers. Black and indigenous leaders, environmental and human rights activists, and journalists tend to be on the cutting edge of the learning curve for digital monitoring. They are the first to be subjected to “innovations” such as gait recognition or geofencing. They have been watched for so long they have adapted their gathering practices accordingly. They interrogate every new technology through the lens of discrimination and information asymmetry. Some people within these communities also build adjacent systems—from cooperative banks to mutual insurance companies to digital technologies—to provide the services they need under rules acceptable to them.

People who work in fields, hospitals, factories, and transportation systems are monitored, and their assignments are mediated by algorithms. These dynamics are coursing upward through the professions. People working from home, as well as on site, are monitored by management through shared public calendars, software configured on company laptops, videoconferences, and collaboration software logs. Productivity software installed by employers, fitness monitors provided by insurance companies, metadata embedded in every email and memo, and proctoring software used for test-taking all watch professionals and students as they go about their daily work.

Public health directives that closed doctors’ offices and other health care facilities have revealed the breadth and depth of our digital dependencies, though, once again, they did not create them. Artists and yoga teachers have shifted their practices to streaming apps and payment platforms. Community theaters and dance instructors, seminar leaders and therapists, medical doctors and kindergarten teachers, boards of directors and volunteer coordinators now interact via video services and shared documents. We are adapting the norms of offices, examination rooms, courts and government facilities, board rooms, theaters, cafes, and classrooms into the design constraints and regulatory preferences of commercial software providers. We have been extending our dependencies on digital systems for years. They are newly visible and rapidly expanding, but they are not newly created.

Seeing this phenomenon as clearly as we can, now, might be just the spark we need to take action. As we embed every element of daily life into these digital systems, we also transplant the power relationships that shape online discourse into our parks, streets, schools, community centers, and the halls of government. These dynamics include the challenges of instituting public oversight of privatized infrastructure, opaque product designs that determine who sees what and whom, and “spaces” designed to maximize profit instead of participation, equitable access, personal safety, or collective deliberation.

For millennia, we have sorted and clustered ourselves. For centuries, rulers and religions have categorized people using the technologies of their times, from counting to advanced statistics. Today, massive sets of data collected from our activities in virtual, physical, and liminal spaces power the corporate and state algorithms that tag, divide, and cluster us. Sorting people into groups is something we do and is done to us. However, today this is being done with an ever more powerful set of tools, ruled by an amalgam of global companies and nation states, using increasingly inscrutable methodologies. Scholars and advocates have made some progress in raising awareness and policy activity about these phenomena when it comes to online speech. Our challenge now is to bring the same intention to protect our ability to assemble online, offline, and everywhere in between.

Today’s policy battles about online discourse, political advertising, and hate speech are informed by decades of scholarship that position the internet as communications infrastructure. Now that digital infrastructure supports physical interactions, we need to consider the implications for assembly. As the lines between physical and digital spaces blur, these challenges are being repotted into the soil of public life. Unfortunately, we do not have decades for study before these forces irreparably harm our ability to voluntarily gather, plan, mobilize, and take collective action.

What We Need Now

The first step is to recognize that our associational lives now operate in both physical and digital spaces. From the 1990s’ enthusiasm for online communities to the subsequent proliferation of social media “groups” and “circles,” the internet has long promised bigger, easier, and more diverse associational options and spaces for assembly. But the reality is more complicated, as marginalized communities have long experienced. We have no window into how the sorting and clustering designed to serve digital ads bounds our online experiences. What variables do the platforms use to define you or those with whom they think you might share common interests? How do the machines see each of us, and how does that categorizing shape with whom we associate?

We have studied trolls, bots, misinformation, and platform governance, and we need to explicate how they shape the human relationships that contribute to and result from them. Most analysis views these phenomena through the lens of either violence or speech, but we also need to examine them as examples of manipulated assembly. In a similar vein, driving women and queer people out of online spaces through harassment, targeting Black citizens with fake information about elections or coronavirus, or livestreaming armed attacks within houses of worship all need to be understood as forms of associational suppression and threats to assembly.

Moderators online do more than shape speech; they shape relationships. We don’t know whether those relationships align with how we see ourselves or what we’re looking for. How can we exert our own agency and define our own communities in an environment of inscrutable, profit-intermediated choices? We don’t know what rules, variables, or personal judgments are behind digital decisions to promote or obscure protest information, community announcements, or even meet-ups. We are repeating the mistake we made with online speech, assuming, for decades, that giving access to more and more voices meant that everyone would be heard and all would be well. We’ve learned the fallacy of this assumption the hard way; we must avoid repeating the mistake with assembly and association.

People now need adaptive expertise about product design and platform priorities in order to organize, communicate, and mobilize with others both online and in physical spaces. This expertise involves reverse-engineering social media and search priorities, evolving security concerns, and situational awareness about state and corporate boundary-setting on associational spaces via regulation, subpoena, or product design. It requires ongoing attention to unacceptable consequences, such as repurposing health data for economic gain or geofencing certain groups for political messaging.[8] The social effects of massive data collection, concentration, and analysis are seen in the outsize power of a small number of corporations, filter bubbles, and the feeling that we’ve lost control of online speech. Unless we intervene now, we are on a similar trajectory of corporate enclosure of our choices for physical and virtual gathering.

The second step is to recognize and support the expertise that already exists. Leaders at the Detroit Community Technology Project and MediaJustice in Oakland repeatedly demonstrate the digital expertise of community organizers. Native American communities organize horizontally and ecosystemically, flowing like water away from hierarchical watched spaces. In their 2020 book, Design Justice: Community-Led Practices to Build the Worlds We Need, Sasha Costanza-Chock reminds us that communities know what they need. Philanthropists need to respect and support these communities, help them share their knowledge, and aid them in imagining, teaching, and building alternative technological futures.[9]

On June 20 and 21, 2020, more than one million people participated in the Mass Poor People’s Assembly and Moral March on Washington, a digital gathering supported by videoconferencing, radio, social media, and telephone dial-in services. Religious organizations, labor unions, Black fraternities, veterans, environmental advocates, and digital rights activists organized their members and built the digital scaffolding for the event. The aspiration had been for a physical gathering reminiscent of the 1963 March on Washington. In some ways, the digital version was more inclusive, but the arrangements for participants and organizers changed as the event shifted from streets to screens. Instead of the masks, water, and medics they would have brought to the National Mall, the organizers brought passwords to video lines, protected the chat rooms, and used redundant servers to prevent being taken offline by opponents. Bringing people together now requires constantly learning and updating a mix of physical and digital safety measures for individuals, their online presences, and entire communities. Supporting diverse alliances of community groups and digital advocates is fundamental to civic and political engagement today.[10]

Hierarchical and mostly white nonprofits and foundations, on the other hand, scrambled to find this kind of expertise when shelter-in-place orders required them to disperse overnight. Smart managers, of course, will work to move forward under these new conditions, engaging their staff and board members in the kinds of ongoing digital safety practices that their critical missions deserve. Doing so will mean encouraging the distribution of expertise throughout the organizations they work with, a small step toward helping nonprofits adapt to their dependence on digital systems.

Third, Big Philanthropy needs to expand its investments beyond the instrumental nature of digital technologies. Simply helping nonprofits expand their use of digital technology without questioning the effects of our digital dependencies will do more harm than good. The move to public interest technology is a positive step, but it must widen its focus to take into account the digital controls shaping every domain. This vision must expand to address the effects of digitized meeting spaces on assembly, to weld community-based expertise about safety and vibrancy to decisions about public digital infrastructure, and to create the networks of expertise that can identify, critique, prevent, and provide alternatives to a digital takeover of public physical spaces. Starting places include Catherine Sandoval’s framing of net neutrality as a public safety issue and efforts to articulate critical digital infrastructure for democratic participation.[11]

Finally, institutional philanthropy needs to shift its policy focus. For fifty years, the policy agenda of the nonprofit sector has been tax and corporate law.[12] From an equity standpoint, this agenda is misguided—it prioritizes institutional self-interest over tax provisions that would mitigate extreme wealth inequities. If foundations aspire to any legitimacy in struggles for justice, equity, or sustainability, they need to support policies that expand people’s ability to take collective action, to associate with whom they choose, and to give time, money, and data safely and with agency. All these actions are now digitally dependent, and so philanthropy’s policy agenda must follow.

The policy domains that matter to the existence and functioning of all nonprofits and philanthropic organizations are those that directly implicate the core values upon which civil society exists in democracies: access to information, participation, pluralism, and freedom of assembly and speech. Our lived experience of these values now depends on an intersection of public policy and corporate product choices.

Undergirding all of civil society are digital transmission systems to which we need affordable, reliable access and assurances that our information will be treated fairly. The policy concerns of all philanthropic enterprises should be those that protect the public’s access to information, the people’s ability to participate, the freedom of expression and assembly, and the existence of digital, physical, and liminal spaces that encourage pluralistic participation. These are the requirements for gathering and for taking collective action; they are the underpinnings of civic space. Philanthropy, especially the legally privileged, institutional form embodied by foundations, is a subset of civic space; it exists within the broader frames of assembly and association. Big foundations exist only because laws allow them to—laws that are just barely more than a century old. Those laws are negotiated through the mechanics of our democracy. They are grounded in a societal commitment to allowing people to come together to use their private resources for public benefit. The changing nature of how and where we assemble, and how we protect our ability to do so, is an existential threat to civil society and its most familiar US institutions: nonprofits and foundations. Protecting the space for civil society should be fundamental to institutional philanthropy, for the latter can’t exist without the former.

In the decade ahead, we will be obliged to rebuild all our public systems, from health care to housing, education to food systems, and transportation to employment. Each is broken in unique ways, but rebuilding them requires beginning at the root level. Foundations prefer to silo these domains and approach them independently. There is neither time nor capital for that. Real change will require massive public investment in these pillars of democracy. Philanthropy’s rightful role will be to support and sustain the infrastructure for broad, inclusive community leadership; the space for assembly and associational life; and a commitment to a thriving, independent digital civil society. We have before us the opportunity to reimagine it all.

Otherwise, we will mourn alone for the democracy that we let die.

Lucy Bernholz

Lucy Bernholz is a senior research scholar at Stanford University’s Center on Philanthropy and Civil Society and the director of the Digital Civil Society Lab.


[1] Several US states have considered or passed legislation limiting the rights of people to protest or to assemble in public parks. Strategies range from burdensome permit requirements and raised fees to regulatory investigations of get-out-the-vote organizations. See “Anti-Protest Bills around the Country,” American Civil Liberties Union (2017); “Reforms Introduced to Protect the Freedom of Assembly,” International Center for Not-for-Profit Law (2020); and Tiffany D. Cross, Say It Louder!: Black Lives, White Narratives, and Saving Our Democracy (New York: Amistad Press, 2020), 131–42.

[2] See Lina M. Khan, “Amazon’s Antitrust Paradox,” Yale Law Journal 126, no. 3 (January 2017): 564–907; Karen E. C. Levy, “The Contexts of Control: Information, Power, and Truck-Driving Work,” Information Society 31, no. 2 (March 2015): 160–74; and Anna G. Eshoo, “Rep. Eshoo Introduces Bill to Ban Microtargeted Political Ads,” press release, May 26, 2020.

[3] See Laura DeNardis, The Internet in Everything: Freedom and Security in a World with No Off Switch (New Haven, CT: Yale University Press, 2020).

[4] Zack Whittaker, “CBP Says It’s ‘Unrealistic’ for Americans to Avoid Its License Plate Surveillance,” TechCrunch, July 10, 2020.

[5] The next frontier for remote trackers includes olfactory sensing. See Kyle Wiggers, “Aryballe Raises $7.9 Million for Odor-Detecting AI Sensors,” VentureBeat, July 10, 2020.

[6] Simone Browne, Dark Matters: On the Surveillance of Blackness (Durham, NC: Duke University Press, 2015).

[7] Sumit Chandra et al., Closing the K–12 Digital Divide in the Age of Distance Learning (San Francisco: Common Sense Media; Boston: Boston Consulting Group, 2020).

[8] Heidi Schlumpf, “Pro-Trump Group Targets Catholic Voters through Cell Phone Technology,” National Catholic Reporter, January 2, 2020.

[9] Sasha Costanza-Chock, Design Justice: Community-Led Practices to Build the Worlds We Need (Cambridge, MA: MIT Press, 2020).

[10] Lucy Bernholz, Nicole Ozer, Kip Wainscott, and Wren Elhai, Integrated Advocacy: Paths Forward for Digital Civil Society (Stanford, CA: Stanford Center on Philanthropy and Civil Society, 2020).

[11] Catherine Sandoval, “Cybersecurity Paradigm Shift: The Risks of Net Neutrality Repeal to Energy Reliability, Public Safety, and Climate Change Solutions,” San Diego Journal 10, no. 1 (2019): 91. See also Argyri Panezi, Jessica Feldman, and Lucy Bernholz, Critical Digital Infrastructure Research, Ford Foundation; and Ethan Zuckerman, “The Case for Digital Public Infrastructure” (New York: Knight First Amendment Institute at Columbia University, 2020).

[12] Here, I am speaking specifically of nonprofit and philanthropic trade associations that advocate on behalf of nonprofits and foundations, not the actions of individual foundations per se.