Fifty years ago, Congress passed the Public Broadcasting Act of 1967. The Act institutionalized nonprofit public media, creating the Corporation for Public Broadcasting, which itself helped create and continues to fund the two largest and most influential elements of public media: the Public Broadcasting Service, a national network of television stations; and National Public Radio, a national network of radio stations. On the occasion of this half-century mark, the Knight Foundation commissioned this paper to sketch out a map ahead for public media. The paper looks to the future first by addressing the question of whether we still need a public media system, and particularly a federally organized and funded one. It then identifies changes in technology and the market that are transforming the media ecosystem, impacting both public and private enterprises. In the concluding section, the paper sets forth ideas for how public media can adjust its strategies and tactics to take advantage of these changes, and build a digital platform that serves the still critical role of what we should consider a community information commons, a resource the visionaries who acted five decades ago saw as the animating reason for its creation.
Part I. Fifty years later, the continuing case for the information commons
The media landscape has changed dramatically since Congress passed the 1967 Act, so much so that many have raised the question of whether we still need a public media system.
Reviewing the Act’s original rationale is a good starting point to explore that question. While the legislation was clearly a response to the growing importance of the medium of broadcast television, the key documents indicate that the foundation underlying its genesis was an expression of deeper, longer-term purpose.
The seminal Carnegie Commission that led the movement for a new television system emphasized community. Its call highlighted the need to “deepen a sense of community,” suggesting the value of a television network was to “show us our community as it really is,” and provide a medium “where people of the community express their hopes, their protests, their enthusiasms and their will.”
Further, the Commission envisioned that this resource should be thought of as a commons of information, not subject to pure market forces. The concept of a commons, a resource available to all and not owned privately, has been used as a framework to create and safeguard grazing areas, parks, air and water, among others. As the Commission made clear, this idea was at the heart of its recommendations. As it noted in proposing a nonprofit media organization, “the nonprofit sector – in education, public service and the arts – has a different bottom line from the business community. In an ultimate sense, its contributions to human betterment constitute its ‘profit.’ This is a unique form of social dividend that Western society has devised as a counterweight to the implacable economic laws of the marketplace.”
While the commons and private activity inevitably overlap in some ways – there are public and private lakes for example – the commons has a different purpose. As it responds to different incentives than similar resources that are privately held, it follows that its evolution will also be different. Private institutions tend to migrate to new profit opportunities, while nonprofit institutions tend to migrate to new opportunities to serve their mission.
The Public Broadcasting Act of 1967 closely followed the Carnegie Commission recommendations in clarifying that the purpose of creating these institutions was fostering community, and not just creating a broadcast signal. While the Act identifies the importance of using television and radio for “instructional, educational and cultural purposes,” it also seeks to encourage “the growth and development of nonbroadcast telecommunications technologies for the delivery of public telecommunications services.” The Act states that “it furthers the general welfare to encourage public telecommunications services which will be responsive to the interests of people both in particular localities and throughout the United States, which will constitute an expression of diversity and excellence, and which will constitute a source of alternative telecommunications services for all the citizens of the nation.”
The Act’s mandate extends beyond the medium of broadcast to include other potential ways to serve the information needs of communities. Even when it identifies broadcast as the medium of the moment, as it does when it states that “public television and radio stations and public telecommunications services constitute valuable local community resources for utilizing electronic media to address national concerns and solve local problems through community programs and outreach programs,” the gravamen of the concern is creating community resources to address both national and local issues.
President Johnson reiterated the importance of creating a commons to address community concerns in his statement introducing the Act, suggesting that, “at its best, public television would help make our nation a replica of the old Greek marketplace, where public affairs took place in view of all the citizens.” He further expanded the reach beyond television, saying, “I believe the time has come to enlist the computer and the satellite, as well as television and radio, and to enlist them in the cause of education … I think we must consider new ways to build a great network for knowledge – not just a broadcast system, but one that employs every means of sending and of storing information that the individual can use.”
With Johnson’s grander vision now manifest in many ways, some argue that public media’s role is obsolete. The principal claim is that multichannel video distribution platforms, such as cable and direct broadcast satellite, as well as the internet, have created so many choices that the scarcity of distribution options and content that once justified public expenditures no longer exists. George Will summarized this line of thinking in a recent editorial, noting that while public television was created in an era of three television networks, eliminating public television today “would reduce viewers’ approximately 500 choices to approximately 499. Listeners to public radio might have to make do with America’s 4,666 AM and 6,754 FM commercial stations, 437 satellite radio channels, perhaps 70,000 podcasts, and other internet and streaming services.” Will continues by suggesting the trend of cord cutting represents the public’s desire for individualized choices, rather than big bundles, and that “compelling taxpayers to finance government-subsidized broadcasting is discordant with today’s a la carte impulse and raises a point: If it has a loyal constituency, those viewers and listeners, who are disproportionately financially upscale, can afford voluntary contributions to replace the government money. And advertisers would pay handsomely to address this constituency.”
These arguments are not without merit; new options are relevant to how public media can best accomplish its mission. Certainly, some channels now offer programming that, in the earlier era of scarcity, would have fallen squarely within public media’s mission. Further, if private resources duplicate the outcomes of public investment, most would agree public investment should be phased out.
But this argument positing the obsolescence of PBS and NPR misses the insight that founders of public media understood: private sector enterprises will never produce the types and quality of programming that a noncommercial enterprise can, and when done properly, does. The argument relies on a single metric: the elimination of scarcity in the options for distribution and content. The argument starts to fall apart when other metrics are examined, including market related ones, that demonstrate the continuing need for, and the benefits of, having a commons of information – one whose motive, as noted by Carnegie, is not for private profit but for public service.
Indeed, if the theory that market forces have, on their own, produced products that replace PBS and NPR is valid, then these public media would be suffering the fate of others who have lost significant market share. To the contrary, however, data shows that the public sees PBS and NPR as still providing a valuable public service that market forces on their own do not provide. Their popularity remains while commercial entities fragment. In fact, PBS has climbed from the 15th most popular network to its current position as 6th; and despite all kinds of new audio competition, NPR ratings are at an all-time high and the network is the leading producer of podcasts.
These ratings suggest that market forces alone do not have the incentive to address all community and public interest needs in programming. There are many examples of PBS programming that, while they may overlap with private channels, demonstrate that a public mission, rather than a profit mission, results in superior content offerings, including:
- children’s educational programming.
- science and documentary programming.
- investigative journalism.
- programs directed at minority and low income populations.
- experimental programming.
Ratings, however, are not the only measure of the continuing value of public media. While private channels occasionally produce programming that is used for educational and community purposes, public media does so in a way that has a greater reach and impact, making its resources available in ways unlike any commercial entities.
For example, in 2009 PBS created the PBS KIDS Video Player, which in its first month delivered more than 87 million streams of educational content. PBS station WGBH created Teachers’ Domain, which offers more than 1,000 digital, classroom-ready resources for students and teachers and has more than 333,000 registered users. PBS station KQED created KQED Teach, a professional learning platform offering a lineup of free online courses for educators. NPR has created an Application Programming Interface that gives computer applications access to an archive of over 250,000 NPR stories grouped into more than 5,000 aggregations.
It is not surprising that commercial broadcasters do not make such resources available; they are unlikely to earn a return. As the commons of information, however, public media routinely makes such information available.
Perhaps most critically in today’s society, public media has been and remains the most trusted news source among the widest spectrum of the population. In an annual survey that includes media, news organizations and other institutions like the courts, PBS has been rated as the most trustworthy institution among nationally known organizations for 14 consecutive years, from 2003 through 2017.
This continuing level of trust and commitment to innovative ways of providing news and information is critical at all times, but perhaps is even more important now. Market forces are making the economics of traditional journalism harder, not easier. Newspapers – traditionally the major source of investigative journalism – are shrinking or shutting down at alarming rates. A decade ago, a Pew study suggested the business climate for newspapers was “chilling,” and since then the metrics have deteriorated even further. While digital advertising, including for newspapers, is growing, Pew recently found that “journalism organizations have not been the primary beneficiaries.” Total 2016 newspaper revenues, combining advertising and circulation, were estimated at $28 billion, a drop of more than 50 percent from a decade earlier, when the combined revenues were nearly $60 billion. Newspapers, as an economic force, are so weak that while they are largely local monopolies, they have petitioned the Department of Justice to seek an antitrust exemption allowing them to negotiate collectively with Google and Facebook.
Another critical trend demonstrating the value of a trusted source of information is how technology is making the task of journalism both more important and more difficult. While online sources are gathering market share in terms of eyeballs, they generally depend on traditional journalism sources for actual reporting and data. Further, the rise of what we might think of as actual “fake news,” such as the assault of fabricated stories that circulated near the 2016 election, is likely to continue and get worse. In such an environment, a source of information trusted by a broad spectrum of the population increases in importance to civic and social progress.
Fragmentation in media is also making the job of a commons, a trusted source for information, more essential. As many have noted, news sources are increasingly motivated to provide information that serves confirmation bias, not information to challenge the audience’s pre-existing views. The traditional broadcast medium is unlikely to stem this tide; indeed it is more likely that it will exacerbate it. The Fairness Doctrine is gone and the Public Interest Standard is essentially meaningless. As discussed further below, the private television broadcasting market is about to go through another round of consolidation that will likely result in less local news and more reporting that imitates the tactics of click-bait journalism.
In light of these trends, public media stands apart from its commercial competitors. It ranks high in trust among all institutions, not just media institutions. It has stayed true to the vision of serving community interests laid out by Carnegie and others. PBS reaches over 80 percent of the American public, with demographics that match the race, ethnicity, education and income characteristics of the general population. That reach makes PBS unique among television networks: the public still sees it as a commons where all Americans have opportunities to engage and benefit.
Withdrawing federal funding for CPB would not simply result in the loss, as George Will says, of one of 500 television channels and one of thousands of radio channels. It would be walking away from a hard-won and essential community commons of information. The task ahead is to continue to create value in the new media ecosystem, which, as discussed below, will involve a different distribution architecture, different foundation stones for content, and a different market structure.
Part II. Big bandwidth, big data and big media: Technology and market changes that inform the future of public media
While the simplistic analysis of the value of public media based on the sole metric of abundant distribution and content options is wrong, it is vital to understand the technology and market changes – most importantly the confluent forces, discussed below, of big bandwidth, big data and big media – that are transforming the environment for media enterprises. These developments are already disrupting, restructuring, and in some cases, destroying parts of the media ecosystem.
Prerequisites for a healthy information community
As public media considers how to respond, it should begin by asking what facilities a community needs, and what tasks must be undertaken, to have an information-healthy community. Just as a physically healthy community must ensure it has certain resources, such as clean water and sewer systems, and an economically healthy community must have access to other resources, like electricity, transportation and an effective local government, so too does an information-healthy community depend on certain resources being available and tasks being performed.
There are four fundamental resources and related tasks required for an information-healthy community:
- creating and maintaining a high-performing information exchange infrastructure that enables the universal distribution of information, with all having the ability to afford and use that infrastructure.
- having a critical mass of people and enterprises engaged in information creation.
- producing a mechanism for information collection.
- ensuring a critical mass, again of people and enterprises, capable of information curation.
As discussed below, because the technology and economics of these elements have changed profoundly since Congress established public media fifty years ago, it is essential to rethink how best to allocate resources to serve these information needs.
Information exchange infrastructure: The internet has replaced broadcast networks as the fundamental distribution infrastructure. The vast majority of information flowing to individuals and enterprises now flows over the internet, which itself has become the commons for collaboration. The amount of time spent by adults on digital media, for example, has more than doubled in the last decade, from 2.7 hours per day in 2008 to 5.6 in 2016.
Increasingly, this occurs on mobile devices, with a significant increase in news being delivered by news aggregators like Facebook and Apple News. In 2014, nearly 75 percent of the population used a computer to access news while less than 25 percent used a smartphone for that purpose. By this year (2017), the two were equal, at about 45 percent. If one includes tablets, mobile devices are now significantly ahead of fixed devices for news distribution.
This shift to the internet platform also includes watching videos: as early as 2014, more videos were viewed over the internet than on broadcast television. A recent Wall Street analyst report estimated that 31 million Americans would cut the cord of multichannel video packages in the next decade, while at the same time, Over-the-Top video bundles would gain 17 million new subscribers. On the supply side, more video content is uploaded to the internet in 30 days than the major television networks have created in 30 years.
This does not mean that the need for broadcast facilities and assets has disappeared. On the contrary, broadcast retains its position as a significant method of distribution. It is universal, reaching nearly all American homes, while the internet is still below 75 percent in terms of home adoption. Broadcast is free, providing a safety net service that the internet does not. It is also easier to use; it does not require the kind of digital readiness that characterizes internet users and serves as a significant barrier to internet adoption. The internet still presents accessibility hurdles for persons with disabilities. Further, the internet architecture, while increasing the options for content, creates new challenges regarding how to break through the cacophony of noise and voices to find trusted and insightful content.
Radio did not destroy the market for theater, nor television the market for radio. It is unlikely, even projecting several decades ahead, that the internet will destroy the market for broadcast. But while the need for broadcast facilities remains, a healthy information community also needs affordable and abundant broadband networks, and content creators with cloud-related facilities and capacities. Some public media stations, understanding the need to restructure their asset base, participated in the recent incentive auction, effectively trading spectrum assets for financial assets that can now be used in a variety of ways. That trade, however, will not be a one-time event, but rather, a long-term process with constant evaluation of when and how to realign assets.
Information creation: The information creation task has become much easier and more broadly distributed. The cost of gathering data, researching and producing high-quality audio and video has been revolutionized by services available on the internet. These tools drive down costs, but the challenge identified in the Carnegie study remains: creating local content in a cost-effective manner even when such content has a small market. On the other hand, it is now much more economical to create content for smaller ethnic and language minorities, as the worldwide distribution made possible by the internet creates a number of long-tail markets for specific language and interest groups.
Again, this does not mean public media is not needed to help create and sponsor such big budget and seminal works as Ken Burns’ documentaries, or that shows like Hoop Dreams or The Thin Blue Line will produce themselves. If public media wishes to retain its brand as creating the highest quality programming in categories like documentaries and children’s educational programming, it will have to allocate resources to the creation function. But the surfeit of programming and the opportunity to produce certain kinds of programming at lower costs while distributing them more broadly suggests, as with the infrastructure function, a need to consider a reallocation of funds.
Information collection, storage and analysis: The third task – collecting, storing and analyzing data – has also completely changed, requiring a different set of physical assets, and in terms of staff, a different skill set. As an initial matter, the economics of collecting and storing data has seen steady and large decreases: in the last several decades computing costs have been declining annually by 33 percent; storage costs by 38 percent.
These tasks, which, for the most part, were not on the radar when PBS was created, have now become critical for public media. While collection and storage costs have dropped and the activities themselves are relatively straightforward, they nonetheless require attention and funds that they did not in decades past.
The third point here – information analysis – is more complicated on two fronts: analyzing data to make it more meaningful to the community and analyzing the interests of the community. As to the first, Ken Burns did not invent any technology or even an art form, but he reinvented the documentary in a way that has had a profound effect on public media and all documentary filmmaking. His genius was in finding and producing a compelling narrative thread that ties together photos, video clips and music that in lesser hands would not have attracted an audience. There is not yet – but certainly will be – a Ken Burns of the data era who ties together the new information we can now access into a compelling narrative that enlightens us about where we come from, where we are and where we are going.
The second front is analyzing the interests of the community, much as Netflix and others do with their customer data, to devise a strategy to produce content that will be of greater interest to those who look to, or could look to, public media as a primary source of content. This is a task that goes far beyond the kind of analysis of Nielsen ratings and other audience measures of the first decades of public media. Public media is already working on this, but the importance of the task will no doubt grow.
Success on these two fronts will require public media to allocate more resources than it has in the past, but as discussed in the third section of this paper, these fronts are fundamental to increasing the value of public media in the future.
Information curation: As the content universe expands, the need for fair and expert curation remains and in fact, grows. The problem is that the economic incentives for fair and thoughtful guides in information ecosystems are sorely lacking. Instead, we see changes in how audiences obtain their news that undercut the dollars flowing to such curation. As the most recent Reuters Institute Digital News Report concluded, more people are discovering news through algorithms than editors: “We can also add up preferences for content that is selected by an algorithm (search, social and many aggregators) and compare with that selected by an editor (direct, email and mobile notifications). More than half of us (54 percent) prefer paths that use algorithms to select stories rather than editors or journalists (44 percent). This effect is even more apparent for those who mainly use smartphones (58 percent) and for younger users (64 percent).” Another change in the landscape is suggested by the rise in what is known as click-bait – a description of the way web sites offer headlines or short-form content designed to attract immediate attention.
As noted above, the universe of content is rapidly expanding, as are the tools for collecting, storing and analyzing data. This changes the nature and importance of the curation process. If before, the task could be compared to finding a needle in a haystack, now it can be compared to finding the needle in a thousand haystacks. If the analysis of data can lead public media to better understand its audience, it will also make the curation activity more effective.
In short, it is clear from how the media environment has evolved from 1967 to today that public media needs to adjust its strategies to focus on where it can have the most impact in creating healthy information communities: changing its asset mix on infrastructure; focusing on content creation where it has a comparative advantage; and finding efficient ways to invest in the tasks of data analysis and curation. But we should not consider today’s market an end state; we should also ask what future trends might affect the environment. The three biggest – big bandwidth, big data and big media – will shape the environment in which public media will address its mission in the next several decades.
Providers of both wireline and mobile services are currently preparing for an upgrade of their networks that will result in approximately a tenfold increase in bandwidth in the next five years.
Wireline entering the “Game of Gigs” – On the wireline side, the upgrade represents a change in strategy. In the five years from 2007 to 2012, no internet service provider announced large investments in next generation networks, largely because the one company that had done so, Verizon, with its FiOS project begun in 2005, was seen by Wall Street as a failed effort. Instead, the two major protagonists, cable and telephone companies, were content with a harvest strategy in which both focused on generating profits from their existing networks, with occasional increases in speed, rather than investing in next generation capacity. While the two competed at some level, it was competition in the way that Nordstrom competes with Wal-Mart; one looks to customers who want and can afford the highest level of quality, while the other looks for customers interested in everyday low prices.
Into this environment Google Fiber began what might be thought of as the “Game of Gigs,” an effort to drive orders-of-magnitude better performance at mass market prices. As Google Fiber expanded its reach, telcos started to respond with their own gigabit efforts, and the cable companies then responded with upgrades as well. While the slowdown, if not the halting, of Google’s efforts will no doubt remove some of the urgency of the telco and cable investments, the upgrades are continuing, driven by a variety of factors, including the anticipated next generation mobile networks discussed below. Another factor is a view that there are markets on the horizon, such as augmented reality, immersive reality and 4K video, which will require much greater bandwidth. Indeed, the importance of superior bandwidth can be seen in cable’s continuing to gain market share due to its superior network and cost structure. According to the most recent report from Leichtman Research, cable broadband providers collectively added 2.7 million net additional high-speed internet subscribers in 2016 – the most net additions since 2010. In contrast, phone companies collectively lost about 600,000 broadband customers.
Wireless to join game with 5G – The two largest mobile companies, AT&T and Verizon, are also fixed broadband companies and understand the threat cable’s superior bandwidth represents. Moreover, cable is also moving forward to enter the mobile market with an offering largely based on its Wi-Fi networks. In response to that and general trends in technology, the mobile carriers have their own plans to offer far more abundant bandwidth with a new generation of services, generally called 5G, as it represents the fifth generation of mobile services. 5G will provide a massive increase in performance and throughput, a reduction in latency, and the ability to handle the exponential increase in the number of connections that will occur due to the emerging internet of things, discussed further below. 5G holds great promise for basic communication needs, advanced new communication services like two-way 4K video, advanced security, connected wearables, augmented reality, immersive gaming, privacy controls and many other services.
While the promise of 5G is great, so too are its challenges, perhaps the greatest of which is cost, which is likely to be significantly higher than any previous wireless network deployment. Indeed, it may be cost prohibitive for the major carriers or any new carriers to overbuild end-to-end wireline networks. Nonetheless, it is likely that mobile networks sometime early in the next decade will have significantly greater bandwidth than they do today, and the fixed and mobile networks will start to compete more directly.
All content and services become web content and services – There are many unknowns about the future of big bandwidth. Will it be available everywhere or just in areas of greater density and wealth? Will it be affordable to all? Will it be available on both a fixed and mobile basis?
Questions notwithstanding, from a public media perspective, the big takeaway is already clear. Video and audio will have more than sufficient bandwidth so that the current move to spend time with Over-the-Top video, instead of traditional broadcast or cable channels, will accelerate, and social media will increasingly have video elements. Further, the big bandwidth future will open the door to certain kinds of content that we are only just now seeing, like the merger of virtual reality with science, geography and history documentaries. The ability to put individuals in a virtual reality setting requires significant bandwidth, as well as creating the potential for new forms of narratives, an area where public media excels.
But this is only the beginning. A world of abundant bandwidth changes not just the world of content but the world of services as well. The broadband platform holds the promise for low-cost distribution, mass customization and high-performance knowledge exchange, transforming how services are delivered. We have already seen in many fields, from banking to law to retail, enterprises providing customers access to more personalized and powerful information, at all times, from anywhere, and in a manner in which the quality of the information and solutions are constantly improving. Public services, such as education, public safety, job training, health care and general government services are generally behind the private sector in moving completely from the analog to the digital platform. Government, for many reasons including its obligation to serve everyone, will be the last major enterprise to operate on both an analog and digital platform. Many of those services, including education, health care and job training, are within the ambit of public media’s mission.
Public media cannot replace government services, but the movement of all services to broadband should cause it to consider how best to use its own platform and expertise in building community information exchanges that improve public services.
Big data is a phrase that is often used to describe different things. As the Obama White House noted in its discussion of the topic, “There are many definitions of ‘big data’ which may differ depending on whether you are a computer scientist, a financial analyst or an entrepreneur pitching an idea to a venture capitalist. Most definitions reflect the growing technological ability to capture, aggregate and process an ever-greater volume, velocity and variety of data.” In other words, “data is now available faster, has greater coverage and scope, and includes new types of observations and measurements that previously were not available.” More precisely, big datasets are “large, diverse, complex, longitudinal, and/or distributed datasets generated from instruments, sensors, internet transactions, email, video, click streams, and/or all other digital sources available today and in the future.”
For purposes of public media, there are five big data developments that bear particular importance: social media, mass customization, big data journalism, artificial intelligence platforms, and the civic internet of things.
Social media – Social media is a big data platform in which the users create content by providing information about their own lives, and the analytic tools underlying the platform create new ways to provide information to discrete groups. This has had a profound impact on many aspects of our lives, not least the way we deliver and receive news. For example, with respect to news delivery, a recent Reuters survey reported, “Over half (51 percent) of our U.S. sample now get news via social media – up five percentage points on last year and twice as many as accessed in 2013.” While Reuters goes on to point out that only two percent of that majority get their news exclusively from social media sources, there is no doubt that social media currently plays, and will continue to play, a major role in the distribution of information, and that the ability to target information creates new opportunities, as well as threats, for all content providers.
Mass customization – Mass customization involves the ability to shape a product or service for individuals on a mass scale at a very low incremental cost. For example, Amazon and Netflix take in data from millions of customers and use it to design personalized recommendations. PBS has similar large datasets, with nearly 28 million unique visitors to its websites each month, creating data on over 330 million sessions, 800 million page views and 17.5 million episode plays. The challenge for PBS, as it is for all big data platforms, is turning that data into insight that can improve the individual’s relationship with the content, and improve the ability of public media to produce and curate content that will both attract greater audiences and serve its mission.
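The mechanics behind such personalized recommendations can be sketched in a few lines. The program names and viewing data below are invented for illustration and do not describe PBS’s actual systems; the sketch shows the item-to-item co-occurrence approach that underlies many commercial recommenders.

```python
from collections import defaultdict

def cooccurrence_recommendations(viewing_history, target_item, top_n=3):
    """Recommend items most often watched by the same viewers as target_item.

    viewing_history maps a viewer id to the set of items that viewer watched.
    This is a toy version of item-to-item collaborative filtering.
    """
    counts = defaultdict(int)
    for items in viewing_history.values():
        if target_item in items:
            for item in items:
                if item != target_item:
                    counts[item] += 1
    # Sort by co-occurrence count (descending), then alphabetically for ties.
    ranked = sorted(counts.items(), key=lambda kv: (-kv[1], kv[0]))
    return [item for item, _ in ranked[:top_n]]

# Hypothetical viewing data, for illustration only.
history = {
    "viewer1": {"Nova", "Nature", "Frontline"},
    "viewer2": {"Nova", "Nature"},
    "viewer3": {"Nova", "Nature"},
    "viewer4": {"Frontline"},
}
print(cooccurrence_recommendations(history, "Nova"))  # ['Nature', 'Frontline']
```

At production scale the same idea is applied to millions of viewers and refined with more sophisticated models, but the core step is identical: turn raw interaction data into a ranking tailored to the individual.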
Big data journalism – Journalism is fundamentally about turning a number of different data points into a story that is both compelling and insightful. As noted above, the world of big data changes the volume and variety of information available, and the velocity of data analysis, as the world tries to respond more quickly to new information. In so doing, it elevates the value of some information while diminishing that of other information.
It has also led to changes in how news stories develop, and provides tools to help journalists find previously overlooked stories. For example, trending topics on Twitter, or Google Analytics data, serve as an early warning system for breaking stories, and as a longer-term measurement tool for issues of importance. Another example is the Texas Tribune, a news startup in Austin, which created a tool that lets readers sort through data about Texas policy makers, public employees, hospitals and educational institutions, among other topics, and, based on their interests, create new kinds of news stories. Given public media’s mission of covering both national and local concerns, big data provides new opportunities to follow both national and local trends, and to provide insight at a much lower cost than in the past.
AI platforms – Artificial intelligence involves the ability of computing devices and platforms to learn and continually improve their performance. It is likely to affect every business, much the same way that electricity, broadband and other general-purpose technologies have transformed the way every product and service is delivered. For purposes of the content platform, it is important to note that the fastest-growing consumer product category is intelligent personal assistants that understand spoken language and can respond in that language, relying heavily on developments with artificial intelligence. The most popular of these is Amazon’s Echo, sometimes referred to as Alexa. Google has a similar product, Google Home, and Samsung, Apple and Microsoft all have plans for comparable devices.
This product category is important to public media for three reasons. First, these devices are quickly becoming like radios, in that they provide high-quality audio services. A number of news organizations, including CNN, the BBC and Der Spiegel are already providing audio content such as news and weather summaries specifically for these devices. The number of radio listeners using these devices to connect to their favorite programming and find new programming is certain to grow.
Second, they are the beginning of a transition from a touch to a voice interface. This has profound implications for new kinds of services and tools for populations that find the current typing or touch interface problematic, such as the visually impaired, children or those with limited literacy. Given the mission of public media to serve these audiences, a voice interface with the internet creates new opportunities for both content and marketing.
Third, unlike radios, these platforms learn. A radio would never recommend listening to a particular station. Alexa, however, might say, “I noticed you like listening to ‘Morning Edition.’ It is about to begin. Would you like me to play it?” While that example might sound like a positive development for NPR, the platforms have commercial interests in promoting certain kinds of products and content, so the ability to remember content preferences and make recommendations can be either a positive or a negative.
The civic internet of things – The internet of things refers to the ability of devices, equipped with far greater computing power and connected to the cloud and to each other through far greater bandwidth, to provide greater situational awareness and to act to improve outcomes. It includes devices embedded in clothes, for medical or other purposes, and autonomous vehicles, among many other examples. Gartner predicts that the average family home will have more than 500 smart devices by 2022.
A key use of this technology will be to add intelligent devices to a number of infrastructure systems generally run by local governments, including but not limited to water, sewer, power and transportation. This phenomenon, which can be thought of as the “civic internet of things,” creates new opportunities to improve the data on which decisions are made in areas such as public safety, public health and social services. McKinsey estimates that the global economic value of state and local governments’ use of the civic internet of things will grow to between $930 billion and $1.7 trillion by 2025. Some cities are already using such technology to better inform residents and improve public dialogue.
Most of this information, however, will only be used by city officials seeking to improve the performance of city operations. Public media will be in an advantageous position to monitor the information, make it available to the community, and host the dialogue that will emerge from the availability of such data.
The big opportunity for public media in big data – All of these big data trends create real opportunities for public media, both in the way news and other content is publicized and distributed, and in the nature of the information available for a community. Public media should consider how the big data platforms have all succeeded by capturing a segment of data from hundreds of millions of people: Google captures intention data, Amazon captures consumption data, Facebook captures data about how individuals present themselves to others, LinkedIn captures work-related data, while Netflix captures data about their customers’ interests and passions. No platform, as yet, has captured and built a platform based on people’s relationship with their communities. As discussed in the third section of this paper, that may be the most important task ahead for public media.
The third major trend is consolidation to create bigger media enterprises. There has been a significant desire within the industry to consolidate, one that over the last several years was suppressed by fear of the Obama Department of Justice, uncertainty about the election outcome, uncertainty about the populist leanings of the Trump administration, the prohibited-communications provisions related to the broadcast incentive auction, and Obama-era broadcast regulations.
Those constraints are now gone. Moreover, Wall Street believes the Trump administration will be much more lenient on mergers than Democrats would have been and might be in the future. Thus, companies are likely to take advantage of the window to do one (or potentially two or even three) deals while the Republicans control the Department of Justice.
Of particular importance for public media, there will be a wave of mergers involving broadcast facilities. With the Federal Communications Commission bringing back the UHF discount, deals that once would have violated the 39 percent national cap will now be allowed. In addition, Congress and the FCC are considering rule changes allowing greater consolidation between platforms, such as newspapers and broadcasters, and greater consolidation in local markets.
Further, there are likely to be deals that result in content consolidation. The general view of the market is that in light of increasing use of streaming to deliver content, the number of viable cable channels will shrink. To survive, some believe that cable channels need to be part of larger entities. 
One cannot know for certain how this will play out. The Telecommunications Act of 1996 allowed significant radio consolidation, and not long after passage, the industry became dominated by a few large national players. (Before the 1996 Act, the national limit for a single enterprise was 40 stations; the largest radio enterprise now has more than 1,200.) The emerging competition of streaming music and podcasts, however, has undercut the viability of the radio giants.
What is clear, however, is the difference between the path of private and public media. As noted in the discussion of Carnegie’s insight into how a nonprofit entity would differ from private media, private media has incentives to continually adjust its content to obtain the greatest financial return. This has not led private media to provide more vibrant local news and information. Rather, as one analyst said, consolidation represents an understanding that future financial returns require “deep cost cutting and increased scale” to achieve increased leverage in the negotiations for content or distribution.
Given its nonprofit status, public media will not play the consolidation game. It has to achieve scale benefits and sustain or improve its leverage in other ways. As others get bigger, there is a danger that media giants may seek to disadvantage public media to provide greater advantages to their own content. While public media cannot respond with a merger strategy, it can develop a strategy, as discussed further below, that increases its impact by building on the foundations constructed over the last 50 years to become the community information commons for the digital broadband era.
Part III. A path forward for public media: The information commons for the broadband era
The 1967 Act passed at a time when, according to Washington lore, the FCC Chairman only needed to know the letters of the alphabet (ABC, CBS, NBC and AT&T), and he didn’t even need to know all 26. Every one of these companies is now part of a bigger entity, they all face competition from companies whose founders were not yet born when the Act passed, and all are proceeding with strategies unimaginable five decades ago. So how should public media adjust its strategy to serve the information needs of communities and thrive in a world of big bandwidth, big data and big media?
As with any strategic plan, the starting place is an understanding of what strengths one brings to the situation. Public media has significant assets and comparative advantages, ones that no other enterprise is likely to be able to duplicate. These include:
- a programming and spectrum footprint that is both national and local.
- a large group of institutions and individuals at both the national and local level that voluntarily contribute, financially and otherwise, to its work.
- a customer base of several hundred million.
- leadership in certain niche but important content segments, including children’s educational programming, and science and history documentaries.
- a trusted brand.
- a large number of partnerships with both commercial and other nonprofit enterprises.
These are significant achievements, but none are, by themselves, a guarantor of success in the future. Public media must build on this foundation to succeed in an era where the asset mix for providing community information needs new tools and skills.
Given its public, noncommercial roots and mission, public media should always consider what private markets are unlikely to do. As noted in the first section of this report, the content gaps identified in the initial studies that facilitated the creation of PBS have changed, although market forces have not duplicated or surpassed the ability of PBS to produce this kind of high-quality content in various categories, like children’s programming.
One gap is funding for journalism. It would take somewhere between $265 million and $1.6 billion each year to fill the current gaps in local reporting. Some have suggested government fill the gap. That is probably unwise. Government-funded journalism always carries the potential for conflict. It is also unrealistic to expect new government funding at a time when most policy makers are looking for ways to shrink government budgets. Rather, to the extent that government takes notice of and wishes to address the funding gaps in journalism, the main focus of government policy is more likely to be helping create conditions under which nonprofit news operations can improve their own abilities to thrive. In that light, a principal role for CPB (and its affiliates), as the most trusted and well-known nonprofit news source, ought to be as a thought leader and advocate for how government policy can improve the environment for nonprofit journalism in general.
A second gap is in long-term capacity building. Newspapers and broadcast news divisions used to do this, but disappearing margins have shortened their investment time frame. Therefore, they no longer make the investments they once did in such initiatives as basic research and development into tools for more effective journalism or in seeding next generation talent. Further, market forces have never driven, and are unlikely ever to drive, significant investment into content and applications for smaller audiences, such as local, low-income or low-consuming markets, as those markets are unlikely to be attractive to those who depend on subscription and advertising revenue. As discussed below, public media should undertake a number of efforts, for itself and others, to build long-term capacity for its mission.
A third gap in the private market is likely to be building a digital platform focusing on news, information and community that generates trust across a broad cross section of audiences. As markets fragment, commercial information services have an incentive to narrow their focus to a passionate niche, rather than take on the more difficult and less lucrative mission of appealing to a broad cultural and political spectrum.
When one looks at the strengths of public media, the gaps in the private market, and the trends in technology and markets generally, one can see a direction in which, with proper execution, public media can be true to its mission and increase its impact.
Vision: The public media digital big data platform
In 1967, the largest companies in the United States were the monopoly phone company (AT&T), followed by car (GM and Ford), gas (Exxon, Mobil and Texaco) and manufacturing (General Electric and U.S. Steel) companies. Fifty years later, it is a very different list. Based on market capitalization, Apple, Alphabet (Google), Microsoft, Facebook and Amazon constitute the top five. Each, in its own way, is a big data platform company, although the platforms differ (mobile, search, social and retail).
What this suggests is that the path to value creation in today’s economy and society runs through platforms that collect, analyze and use data to address the specific needs of an individual or group. Of course, public media does not seek to create value in the same way; its purpose is not pricing power or market dominance. The value created by the market leaders, however, reflects not just a business strategy but also how people and enterprises spend time and what they value. The dominance of these companies suggests that immediate access to certain kinds of information is the key value creator in today’s economy.
In this light, the direction for public media seems clear – it should seek to be the digital big data platform for community news and information. As discussed at the beginning of this paper, such a platform aligns with why Congress passed the Act: it wanted to institutionalize public media on a national and local level, with the data of the time largely expressed through video and audio. Today’s world involves a convergence of video and audio as well as data, all with the capability of personalization.
Creating a platform also takes advantage of the market trends discussed above. With bandwidth abundance growing and data storage costs shrinking, the principal issue with creating a platform is not the capital costs of the facilities but the cost of creating and keeping an audience. More than any national or local broadcaster, public media already has that. Creating the platform for noncommercial content, with both a national and local element, is also something private market forces are not likely to do. Uniquely among information providers today, public media has a national and local footprint, a critical mass of users, and a brand that reaches across interests, demographics and opinions. As noted above, when one wants to shop, search, find entertainment, or tell the world what one is up to, there is a clear, dominant and highly effective platform for doing so. But when one wants access to news and information, from the community level up to the international level, there is enormous fragmentation and no platform with anywhere close to the efficiency and consumer friendliness of the dominant platforms. Similarly, on the supply side, if someone wants to sell a product, optimize content for search, or offer an app, one knows where to go. But for those who wish to create news and information, the market is far more fragmented.
To a certain extent, PBS is already moving in this direction with PBS.org and its work with Google Analytics. Further, the individual stations all have websites that are attempting to obtain similar results. None of these, however, has achieved the network effects, or the stickiness, that the platforms of the market capitalization leaders noted above have attained.
These efforts also fail to capture the great insight underlying the dominant digital platforms: the mission of the platform is to match the individual with a universe of content far greater than the content of the platform’s owner. Amazon’s genius was in becoming a platform for all sellers, not just Amazon’s own products. Google (and YouTube) enable search for all manner of web content, not just their own. Facebook has been brilliant in providing access not just to friends but also to multiple ways of interaction, some owned by Facebook and some not. What made the iPhone platform powerful was the app store that contained the content of others.
In a way, the right model for public media is Amazon, and specifically Amazon Web Services, which provides computing (and other) infrastructure that is so advanced and so inexpensive that even Amazon’s retail competitors believe it is in their interest to use it. No one else in the space has the economies of scale to offer such services.
When it comes to public media, and other nonprofit institutions such as museums, cultural centers and the like, no one matches the current scale, scope and reach of CPB and its affiliates. No one is better positioned to become the “Amazon of community news and information.”
Moving from building a better website to building a platform is no easy task. Many have tried – and failed – to attract and keep the critical mass necessary for the network effects to kick in. Nonetheless, public media already has the foundation stones in place for creating a successful platform. Combining content with the functions of the digital platform is clearly the direction for creating value, both economic and social, in the decades ahead. Pushing forward with this effort involves a number of different initiatives in three fundamental areas:
- initiatives that increase the functionality of the platform not just for public media but for larger community efforts.
- partnerships that bring more content and customers to the platform.
- policy changes that improve the prospects for the platform.
Improve functionality for public media and all sources of community information – It is beyond the scope of this paper to describe in detail how to create that platform. But at the highest level, the task involves having unique content and functionality that attracts and keeps large groups of people, and using the data created by these interactions to consistently improve attractiveness and ease of use, thus reinforcing the likelihood of return. By having data that improves the content, search and recommendations functions, the platform creates network effects that constantly improve its impact.
Public media should aspire to create this platform not just for itself, but for all sources of community information. As noted above, the model is Amazon, which hosts products it sells but also competing products sold by others who find it simply the best platform and who, by coming onboard, improve it. Throughout the country, there are thousands of community institutions that cannot afford the kinds of analytic tools such a platform could provide. But if public media provides them in a way that attracts community institutions to join, the platform improves for all. Below are eight suggestions for doing so.
Public media and community information analytics. Public media needs an analytic foundation on par with the most sophisticated digital big data platforms. As part of its mission to create the commons of information, it needs to make those tools available to other community groups, such as museums, cultural centers, nonprofit educational centers and others, so that the tools of big data can easily be utilized to connect all with community information relevant to them.
Public media tech talent. Public media, as a collection of nonprofits, is unlikely to attract long-term technology talent sufficient to build, operate and improve the platform. It should, however, take a page from what the federal government did during the Obama administration to assure that it could bring such talent to bear on its information technology issues. First, it created a group, now called the Digital Services Corps, which among other activities, recruited top technologists from leading big data platform companies for term-limited tours of duty with the federal government. The Corps, formed after the Obamacare website debacle, parachutes into different agencies to both prevent such problems, and to identify and implement shared tools and services to address common technical issues and usability challenges across the government. Second, it created a group of White House Innovation Fellows to serve with various agencies in an effort to open up traditional processes to innovation. While public media cannot duplicate these exact White House efforts, it can use similar tools to recruit talent to work on problems for both public media and the broader community information platform.
The civic video hub. Public media should create a national digital video archive for public interest digital content. This would include, for example, historic footage in the public media archive, as well as the federal government’s current and historic video. Public media should collaborate with the Library of Congress to produce the archive, creating the possibility of a one-time appropriation for this purpose. The hub would constitute an eternal treasure for generations of students, teachers, journalists, researchers and other interested citizens. And usage data would provide public media with keen insight into the interests of individuals and large groups. In this regard, it should seek to be “the Netflix for civic-related video.”
Open data warehouse. The federal government has created a website for its open data sets called data.gov. The site currently hosts nearly 200,000 open data sets from federal, state, county and local governments, as well as universities. The site not only offers access to the data, but also provides a set of tools that allow visitors to conduct research, develop web and mobile applications, design data visualizations, and interact with the data in other ways to create new insights, products and services. These data sets will continue to grow.
Most local communities cannot afford the kind of site that hosts data.gov. Public media, however, because of its national and local reach, could aggregate a number of local communities and create a similar kind of warehouse that allows a similar sharing of data, but with an added feature of a more targeted approach to local communities. In addition, it should be built to enable the eventual incorporation of data from the civic internet of things.
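One concrete model is the software that already powers data.gov: the open-source CKAN catalog, whose standard package_search endpoint shows how such a warehouse exposes its data sets programmatically. The sketch below only constructs a query URL rather than fetching results; the endpoint path is CKAN’s documented one, while the search terms are hypothetical.

```python
from urllib.parse import urlencode

def catalog_search_url(base, query, rows=10):
    """Build a CKAN-style package_search URL, as used by catalog.data.gov.

    A community warehouse built on the same open-source software could
    expose the identical interface for local data sets.
    """
    params = urlencode({"q": query, "rows": rows})
    return f"{base}/api/3/action/package_search?{params}"

# Hypothetical query against the federal catalog, for illustration only.
url = catalog_search_url("https://catalog.data.gov", "transit ridership", rows=5)
print(url)  # https://catalog.data.gov/api/3/action/package_search?q=transit+ridership&rows=5
```

Because the interface is an open standard, an application written against one community’s catalog would work against any other community’s catalog, which is precisely the network effect a shared warehouse would depend on.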
Civic cross-community comparison. A related and complementary enterprise would be to create functionalities within the data warehouse that allow communities to compare their performance with other similarly situated ones. While there are outside groups that are always doing “Top Ten” lists for certain attributes, this would allow communities themselves to make comparisons on data they deem most critical.
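A minimal sketch of such a comparison function, with all community figures invented for illustration: normalizing the chosen metric per 1,000 residents lets differently sized communities rank themselves on the measure they deem most critical.

```python
def rank_per_capita(communities, metric):
    """Rank communities by a chosen metric per 1,000 residents.

    communities maps a community name to a dict of raw figures,
    including a "population" entry. Returns (name, rate) pairs,
    highest rate first.
    """
    rates = {
        name: 1000 * data[metric] / data["population"]
        for name, data in communities.items()
    }
    return sorted(rates.items(), key=lambda kv: -kv[1])

# Invented figures, for illustration only.
towns = {
    "Springfield": {"population": 60000, "library_visits": 90000},
    "Riverton": {"population": 25000, "library_visits": 50000},
}
print(rank_per_capita(towns, "library_visits"))
# [('Riverton', 2000.0), ('Springfield', 1500.0)]
```

The same normalization applied to crime reports, transit ridership or school spending would let a community see at a glance where it stands relative to peers of its own choosing.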
In this regard, it is worth exploring a partnership with the effort begun by former Microsoft CEO Steve Ballmer, USA Facts, which has as its mission creating a “common set of facts on which even people with opposing points of view can agree.” It does this with a website that provides public access to data that Mr. Ballmer likens to a government 10-K form, the document by which publicly traded companies disclose their financial situation. The site includes data from state and local governments, as well as the federal government. Public media would be an ideal partner in expanding the site and in helping local groups develop content that produces insights into the performance of local governments, comparisons with others, and agendas for participating communities.
“We the People” petitions. The White House currently hosts “We the People” petitions, which enable individuals to register their views about what the White House should be doing. Under Obama administration rules, if the petitions garnered 100,000 signatures, the administration promised a formal response.
Public media could become the center of similar petitions focused on state, county and local governments. Not only could it host the site, it could also promise that when a certain critical mass of signatures is reached, it would dispatch local reporters to relevant government officials to do a story about potential responses. Again, because of its national and local reach, the public media platform would have the ability to compare both the concerns and the responses across multiple communities.
Contests and challenges. One of the key principles of innovation in Silicon Valley is attributable to Bill Joy, co-founder of Sun Microsystems, who said, “no matter who you are, most of the smartest people work for someone else.” Public media could benefit by inducing those who work for “someone else” to help solve its challenges.
Again, it might be useful to borrow from the Obama administration and others who have used challenges and prizes to persuade top talent to help solve big problems. It was a Defense Advanced Research Projects Agency grand challenge, for example, that stimulated the work now being used to create self-driving cars.
Instead of trying to solve every problem itself, public media could provide incentives for others to join the effort. This could be particularly useful in attempting to reach underrepresented communities, as there are often members of these communities who have the best insight into how platforms should be designed to meet their needs.
Tool kits and self-help guides. As more community information and services go online, there is an increasing need for help navigating information. Unfortunately, most local governments, which provide the bulk of the services that residents depend on, don’t have the motive or scale to invest in providing tool kits that make it easier for people to obtain what they need. If, however, public media becomes a warehouse for all kinds of government data, it might have the scale to produce navigation tools that provide such assistance. Further, by hosting such navigation, it will gain insight – without violating personal privacy rights – into what issues individuals are grappling with and how effective communities are at matching individuals with solutions. One could envision targeted tool kits for different population groups, from students to new parents, from immigrants to individuals with disabilities among others. The platform could also take a page from modern techniques of customer service in which companies offer incentives for customers themselves to help other customers resolve problems.
Partnerships – Another key to a successful digital big data platform will be engaging in partnerships to, among other activities, create content while also growing the size of, and interactions with, the audience of readers and viewers. There are already many examples of public media partnering with traditional media to create content. For example, a partnership between NBC-owned Channel 7 in San Diego and the nonprofit website Voice of San Diego to produce two regular segments on local issues – San Diego Fact Check and San Diego Explained – was so promising that the owner, Comcast, decided to apply it to four other local NBC stations. Another example is The Washington Post’s collaboration with the Pulitzer Center on Crisis Reporting, a nonprofit specializing in foreign reporting, to bolster the Post’s ability to cover foreign events where it did not have reporters. A third is The New York Times hiring the Texas Tribune and Chicago News Cooperative to similarly improve its ability to cover other areas.
While the idea of the commercial and nonprofit sectors working together is not new, this model may become more important over time, particularly if federal resources become scarcer. Commercial ventures will focus on more profitable opportunities but still require labor-intensive, and therefore less profitable, coverage that aims to hold enterprises and government accountable. Partnerships offer the commercial entity opportunities for high-quality and low-cost coverage, while public media can benefit from payments, exposure, and if done correctly, more content on their platform. If nonprofits can begin to reliably count on this additional revenue while the commercial sector has a source of high-quality coverage, both sides benefit. While this has happened organically, public policy may be able to remove some hurdles to accelerate the movement. For instance, IRS changes may ease how nonprofit websites facilitate such partnerships. As the FCC streamlines disclosure forms for broadcasters, it could offer a field for stations to describe their partnerships, including with nonprofit news websites.
But as public media tries to build the digital platform, such partnerships move from “good to have” to “must have.” One can easily imagine, for example, partnerships between public media and museums in every community, so when individuals visit an exhibit, they are notified about relevant public media content, and when they access certain content, they are notified when museums have shows relevant to that content. The platform should be structured to provide information and recommendations similar to what Amazon does – in particular, the way in which the more information it obtains, the better it is at helping the customer. Thus, partnerships are vital to providing greater insight into the community and the individuals who live there. But there is a very big difference with Amazon’s platform. Amazon is trying to sell product. Public media and its partners are trying to provide information.
Policy Changes – Having the right technology tools and partners is critical, but so is having the right policy environment. As noted above, while the prospects for more government funding of journalism are nil, for both policy and budgetary reasons, it is more likely that government could be convinced to create a hospitable environment for nonprofit news and information services. In that light, there are several policy initiatives public media should undertake to improve that environment by enabling public media to strengthen existing efforts and take advantage of new opportunities. None of these are essential to building the platform but all would increase the ability of public media to create a more powerful platform. These policy changes include:
Revise the Public Broadcasting Act to equalize broadcast with digital technologies. Public media, along with its supporters and government officials, should jointly recognize that the governing law has details that reflect a broadcast-centric world. While broadcast remains essential, the law should be reviewed to reflect the opportunity and obligation of public media to engage with the public on multiple nonbroadcast platforms.
Provide greater flexibility for public media and other nonprofit media to create new revenue models and assist others in fundraising. While there is a political divide about funding for public media, there is a political consensus that the nonprofit media sector should have flexibility to use new technologies to create new business models and revenue streams. For example, there is some confusion about whether the FCC restrictions on underwriting or merchandising applicable to on-air programming apply to the websites created by public media. While the better reading is that the restrictions do not apply, it would be helpful if the FCC clarified its position and, in addition, extended the principle that government restrictions will not be applied to any distribution method other than the broadcasting service.
Greater flexibility would also be welcome in allowing the use of broadcast services for charitable fundraising. Noncommercial broadcasters, including the National Religious Broadcasters, have long advocated allowing noncommercial stations to offer a small amount of airtime to charities and other nonprofits for fundraising. Not only does such fundraising help a charitable mission, but having local charities on the air can also help inform residents about issues in their communities, creating the kind of dialogue envisioned when Congress passed the Act. PBS should be part of the coalition urging the FCC to allow public media stations or programmers that are not part of PBS to allocate a set, small percentage of airtime for fundraising efforts by charities and other third-party community nonprofits.
Permit greater flexibility to support multimedia innovations. CPB needs more flexibility to take advantage of new opportunities in a changing marketplace. For instance, it may not make sense to continue the traditional 75/25 funding ratio, where approximately 75 percent of funding goes to TV and 25 percent to radio. In today’s world, every platform has elements of a hybrid; radio stations have video on their websites, TV stations produce audio for theirs, and websites use multimedia for multiple platforms. CPB needs flexibility to award money to multimedia innovators, whether their original platforms were TV, radio or some other form of distribution. While some established recipients might object, greater flexibility would compel all creators to react to changes in the market and make content that appeals to the audience on whatever platform the audience prefers. Such flexibility, for example, could be used by CPB to incentivize public media players to provide more local content and services.
Further, CPB should have more leeway to fund nonprofit media outside of broadcast TV or radio licensees. A nonprofit children’s program that airs on a broadcast station is eligible to receive money, but an equally good nonprofit children’s program that runs only on satellite TV or distributes its programming over the web is not. CPB should consider funding educational programming across a broader range of platforms, and as such programming demonstrates success, find ways to bring it to more platforms.
Amend Copyright Act. As with any content creator, public media requires a copyright framework that balances the rights of content creators and fair use. This balance is particularly important for public media as its mission puts it on both sides of that equation. There are at least three amendments to the Copyright Act that would prove beneficial to public media:
- provide exemptions to public broadcast organizations for online broadcast and distribution of public media.
- enable public media to more easily contribute archival content to national digital archives and grant downstream usage rights.
- encourage copyright holders to grant educational digital rights of use, without prejudging other rights.
As to the first, Section 114(b) of the Copyright Act exempts public broadcasters from having to obtain licenses to use “sound recordings in educational television and radio programs … distributed or transmitted by or through public broadcasting entities,” provided that copies of such programs “are not commercially distributed … to the general public.”  When the law was passed, television and radio were the only two forms of distribution available to public broadcasters. Today, public broadcasters distribute materials over a range of platforms and models, including online streaming, podcasts, DVDs, video-on-demand, and apps on mobile devices, among others. The law could be seen as providing public broadcasters with a broad exemption, but some of the new platforms are not clearly within the scope of the statutory exemption. This causes public media producers to incur the time and cost of seeking permissions and paying license fees to the owners of sound recordings, effectively eliminating the intended benefit of the exemption. Public media should advocate for extending the original exemption to today’s media ecosystem.
As noted earlier, public media should create a civic video hub as part of its effort to create a public media digital platform. This is not a new idea; public television has tried to create such a hub in the past. Unfortunately, it ran into various difficulties obtaining approvals from property rights holders related to materials in the videos. The solution, which public media and others should join in advocating, is having Congress amend the Copyright Act to enable public media to more easily contribute their own archival material to that digital civic video hub. In addition to clearing the way for granting upstream rights for this purpose, Congress should also grant the public reasonable noncommercial downstream rights to all material in the hub, assuring the content is open and accessible to all noncommercial uses.
As for the third suggested amendment, the public media platform could provide a wealth of materials to use for educational purposes. Unfortunately, copyright laws have not kept pace with technology. The current laws have the unintended effect of limiting beneficial uses of copyrighted materials for educational purposes, particularly with respect to digital content and online learning. Because of the nature of historic materials, it is often difficult to identify rights holders and obtain the required authorizations. In one famous example, a powerful film series documenting Martin Luther King Jr.’s efforts to end segregation could not be shown or distributed because of legal complications related to copyright. In another, copyright disputes have raised barriers to lower-cost e-textbooks with a text-to-speech feature that is particularly valuable for the visually impaired. There are a number of fixes to this last set of copyright problems. For example, Congress could provide a new framework that makes it easier to secure permission from copyright holders for use in educational and nonprofit settings. It could direct the Register of Copyrights to create a copyright notice that allows copyright owners to easily and clearly authorize educational or nonprofit uses without impairing other rights, or Congress could update the Technology, Education and Copyright Harmonization Act of 2002 (the TEACH Act) to clearly allow the use of content for educational purposes, including distance learning and online environments, without prejudicing other rights the copyright holder enjoys.
Adjust spectrum policy to facilitate technology upgrades and market flexibility. Public media continues to be a significant owner of spectrum. The recent incentive auction provided an opportunity for public broadcasters to readjust their assets, essentially trading spectrum for financial capital. It is unlikely that this opportunity will arise again in the next few years, although it is certainly possible that early in the next decade, the FCC will initiate another process to determine whether it should hold an additional incentive auction.
There is, however, another spectrum proceeding that will affect television broadcasters. This one, sometimes referred to as ATSC (Advanced Television Systems Committee) 3.0, is designed to create a new broadcast transmission standard that, in addition to improving picture quality, provides broadcasters increased opportunities to utilize and monetize spectrum. Its backers suggest that, among other things, ATSC 3.0 will allow broadcasters to seamlessly combine broadcast programming and broadband content, enable personalization of internet content and targeted advertising, and compete with mobile carriers in offering mobile services. One should be highly skeptical of any assertion that changes in a transmission standard will empower broadcasters to compete with Google and Facebook in advertising, or with AT&T, Verizon, T-Mobile and Sprint in mobile services. From a public media perspective, however, the change may well deliver functionalities that aid its mission of community information and enrichment, and might be particularly beneficial as public media builds its digital platform.
The five decades since the passage of the Public Broadcasting Act of 1967 have undone many institutions and disrupted many previously held beliefs. They have not, however, done anything to diminish the central insight of the Act: that the nation and all its communities would benefit from having a commons of information, built and operated by institutions designed not for profit, but for community service.
This insight alone, however, is not sufficient for public media to thrive in the decades ahead. The broadcast world that was its place of origin is no longer sufficient to fulfill its mission. It must reorient the commons to construct a platform for community collaboration and information exchange. It is well situated to do so. No other enterprise has its set of assets. The world of big bandwidth, big data and big media creates big challenges, but the opportunities to ensure that the vision of 50 years ago thrives for another 50 are far greater.
Blair Levin serves as a nonresident Senior Fellow of the Metropolitan Policy Program at the Brookings Institution. He also serves as the Executive Director of Gig.U: The Next Generation Network Innovation Project, an initiative of three dozen leading research university communities seeking to accelerate the deployment of next generation networks. He also serves as a consultant to the investment community and to numerous small communications enterprises.
From 2009 to 2010, Mr. Levin oversaw the development of the FCC’s National Broadband Plan. FCC Chairman Tom Wheeler has praised Mr. Levin’s work, noting that “no one’s done more to advance broadband expansion and competition through the vision of the National Broadband Plan and Gig.U.” Prior to his work on the National Broadband Plan, Mr. Levin worked as an equity analyst at Legg Mason and Stifel Nicolaus. Barron’s noted that his work “has always been on top of developing trends and policy shifts in media and telecommunications … and has proved visionary in getting out in front of many of today’s headline-making events.” He is the co-author, with Reed Hundt, of “The Politics of Abundance” (2012) and, with Denise Linn, of “The Next Generation Connectivity Handbook: A Guide for Community Leaders Seeking Affordable, Abundant Bandwidth” (2014), as well as numerous articles on telecommunications policy. From 1993 to 1997, Mr. Levin served as Chief of Staff to FCC Chairman Reed Hundt. Previously, Mr. Levin practiced law in North Carolina, where he represented new communications ventures as well as local governments. He is a graduate of Yale College and Yale Law School.
Timothy Carney, Visiting Fellow, American Enterprise Institute
Mike Gonzalez, Senior Fellow, Heritage Foundation
Melody Kramer and Betsy O’Donovan, Wikimedia Foundation and The Daily Tar Heel (respectively)
 Carnegie Commission on Educational Television, Public Television: A Program for Action (1967).
 Carnegie Commission on the Future of Public Broadcasting, A Public Trust: The Report of the Carnegie Commission on the Future of Public Broadcasting (1979) (Carnegie II), p. 297.
 The Public Broadcasting Act of 1967, 47 U.S.C. Section 396.
 Statement of President Lyndon Johnson: Remarks Upon Signing the Public Broadcasting Act of 1967. http://www.presidency.ucsb.edu/ws/?pid=28532
 Nielsen NPower, 9/21/2015-9/18/2016
 Many different data points demonstrate public media’s superiority in these realms. Among the most recent (2017): PBS received 46 Emmy nominations for news and documentary programming, the most earned by any organization; PBS Kids programming is the most trusted children’s educational programming among parents, with a 62% level of trust, nearly six times the second-place programming at 11%; and PBS reaches more kids from low-income and Hispanic families than any other network.
 Pew Project for Excellence in Journalism, Pew Research Center, The State of the News Media 2009: An Annual Report on American Journalism 3 (2009)
 See, for example, this article on how CGI and AI are going to make the problem worse. www.businessinsider.com/cgi-ai-fake-news-videos-real-2017-7
 The Fairness Doctrine was a long-standing FCC policy requiring broadcast licensees to present issues of public policy in an equitable and balanced manner. The FCC eliminated the Doctrine in 1987.
 The Public Interest Standard requires broadcasters to act as a fiduciary for the public and operate the broadcast station to serve public needs. Broadcasters have argued, and the current FCC is likely to agree, that broadcasters themselves should be the judge of what the public interest is, effectively replacing a public interest standard with one in which broadcasters are free to do what is in their corporate interest and offer public programming in the same way that other corporations engage in acts of corporate social responsibility.
 This, of course, was the starting question of the seminal Knight Commission report, “Informing Communities: Sustaining Democracy in the Digital Age.” https://www.aspeninstitute.org/programs/communications-and-society-program/the-knight-commission-on-information-needs-of-communities-in-a-democracy/. The Knight Commission greatly influenced the National Broadband Plan and the Waldman Report on the Information Needs of Communities, both published in subsequent years. All three had a significant influence on the thinking of this paper.
 Mary Meeker, Internet Trends 2017, Slide 9, http://www.kpcb.com/internet-trends
 Reuters Institute Digital News Report, 2017, pp. 12-13.
 Over-the-Top video refers to video delivered over the internet, in contrast to broadcast, cable or satellite. While Netflix is the best known, it also includes such services as YouTube, Amazon Prime, Sling TV, Go90 and many others.
 It should be noted, however, that while television is universal, the actual number of televisions per home is declining, as people, particularly younger people, spend more time with mobile devices. https://www.eia.gov/todayinenergy/detail.php?id=30132&src=%E2%80%B9%20Consumption%20%20%20%20%20%20Residential%20Energy%20Consumption%20Survey%20(RECS)-f2
 While the federal government has a program, called Lifeline, designed to remove financial barriers to internet access for low-income people, it has not yet done so, and it is uncertain whether the program will be funded or structured adequately to achieve that purpose.
 The incentive auction, completed in 2017, allowed broadcasters to turn in all or part of their broadcast spectrum and share in the proceeds of the subsequent sale of the spectrum to a number of wireless broadband providers.
 Long tail markets refer to smaller, niche markets, in contrast to markets dependent on big, mass-market hits. Internet distribution, and the ability of marketers to more finely tune market segments, creates opportunities for long-tail marketing that did not exist in the pre-internet market. The seminal article on long tail markets can be found at https://www.wired.com/2004/10/tail/. While that article has not necessarily been proven accurate as a matter of where the market will invest and make profit (see, for example, http://www.newstatesman.com/2014/01/long-tail-cut-short-economics-blockbuster-capitalism) its insight about the ability to reach new long tail markets has been borne out.
 See, for example, the PBS case study done by Google Analytics that can be found at: https://www.google.com/analytics/partners/img/company/5194739590627328/gacp/5741031244955648/service/5724160613416960/assets/5698497110081536
 See, for example, https://www.theatlantic.com/entertainment/archive/2014/11/clickbait-what-is/382545/
 As of this writing, AT&T has announced its FTTP service (originally called GigaPower) in 51 metro areas, including San Francisco, with plans to reach 67. The reach of the fiber network in each city is not uniform. AT&T regards it as a success, noting that 30% of its subscribers take the highest level of service, where available, and that its market share is 9% higher where it offers a fiber-based service than in communities where it still depends on copper-based digital subscriber line services. CenturyLink followed with its own gigabit fiber service, which is now available in 11 cities. Cable’s upgrade strategy, referred to as DOCSIS 3.1, has the advantage of being cheaper and faster to accomplish. Rather than replace its network conduit, as the telcos must (from copper to fiber), cable does not have to dig up streets or replace wires. Instead, it upgrades its network by reallocating an additional 28 of its 6 MHz channels from video to data and switching out the customers’ set-top boxes. This gives cable the ability to offer an asymmetric service of approximately 1.2 Gbps down and 300 Mbps up. Cable anticipates rolling out this service nationwide over the next several years. A study released in September 2017 found that 57.5 million Americans had access to a gigabit service. http://www.telecompetitor.com/gigabit-report-57-5-million-americans-now-in-gigabit-reach-chicago-and-california-lead/
 PBS Turns to Google Analytics 360 and Google Cloud Platform to Deepen Audience Understanding, found at: https://www.google.com/analytics/partners/img/company/5194739590627328/gacp/5741031244955648/service/5724160613416960/assets/5698497110081536
 The Texas Tribune data page can be found at: https://www.texastribune.org/data/
 For example, while numerous cities already have security cameras and gunshot recognition sensors, developing technologies are enabling such cameras and sensors to automatically detect unusual activities and to develop a rapid response, resulting in a 10% to 30% decrease in crime.
 The fear was not without justification given how the Obama DOJ and FCC rejected the AT&T/T-Mobile and Comcast/Time-Warner Cable mergers and the nascent Sprint/T-Mobile merger.
 Questions have arisen about the fate of the AT&T/Time Warner deal. It is uncertain at the time of this writing whether the reported DOJ objections are grounded in a more aggressive antitrust policy than expected or are really a function of the President’s displeasure with CNN’s coverage of the Administration. Wall Street will recalibrate when the legal situation around that deal resolves itself, but it still expects one or more additional deals involving, among other things, cable consolidation, cable/wireless integration, wireless consolidation, content/ISP consolidation and edge/content/distribution. As an example of the current thinking that perhaps no deal is too large to contemplate, the equity research group at Citi recently suggested that Comcast, the largest cable and broadband provider, should buy Verizon, the largest mobile services provider. https://www.cnbc.com/2017/07/26/comcast-should-buy-verizon-for-215-billion-citi-analyst-says.html
 The current law caps broadcast television ownership: no individual enterprise can own licenses reaching more than 39% of the national population. The UHF discount, however, counts UHF licenses as reaching only 50% of the actual population capable of receiving the signal. It was adopted before more signals were delivered by cable and satellite. The Obama FCC eliminated the discount. The Trump FCC has reinstated it.
 As noted above, no individual entity can own licenses that reach more than 39% of the national population. The FCC is reconsidering all the rules limiting broadcast ownerships. http://www.fiercecable.com/broadcasting/fcc-officially-puts-media-ownership-rules-review-docket
 The first one up will be the Sinclair/Tribune deal, which would create a footprint reaching over 70% of the population but, under the UHF discount rules, would count as just above 40%, thus requiring minimal divestitures. Wall Street expects the deal to be approved and other broadcast deals to follow.
 Already, Discovery has proposed buying Scripps, having rebuffed an offer from Viacom in the process, suggesting that Viacom would also like to get bigger and that consolidation among pure content assets has not yet run its course. With AT&T’s purchase of Time Warner, ISPs without content assets may feel they have to bulk up in similar ways.
 Michael Nathanson, quoted in Communications Daily, August 1, 2017, p. 3.
 “The Information Needs of Communities,” (2011) which can be found at https://apps.fcc.gov/edocs_public/attachmatch/DOC-307406A1.pdf
 A recent poll suggested that 45% of Republicans favor shutting down media outlets that are “biased or inaccurate.” That topic is both scary and beyond the scope of this paper but it suggests that government funding of pure journalism, as opposed to more generic public media, opens a Pandora’s Box of problems. See https://d25d2506sfb94s.cloudfront.net/cumulus_uploads/document/u4wgpax6ng/econTabReport.pdf at p. 98.
 The Act did envision public media as offering more than news and information, such as various cultural offerings, and CPB and its affiliates have successfully done so. This paper does not mean to suggest that it should cease to do so, but it does mean to argue that, given the changes in markets and technologies, as well as public media’s position in the media ecosystem, it is in the public’s and public media’s interest to allocate more resources to serving as the big data platform for community news and information.
 For example, Amazon is so powerful that companies have to develop ad strategies designed explicitly for the Amazon platform. https://www.nytimes.com/2017/07/31/business/media/amazon-advertising.html
 While some who see Amazon as a threat to the economy might object to the metaphor, my point is that Amazon brings tremendous efficiency and convenience to the task of shopping, and that public media has the opportunity to do the same for community news and information, without the cutthroat effect on the product pricing of others.
 Another example of this kind of information can be found on the Missouri Dashboard, a project of the Missouri State Treasurer, which allows comparisons between the economic metrics of the state and the country, as well as comparisons on a county-by-county basis. The dashboard can be found at https://treasurer.mo.gov/economicdashboard/
 While the critical mass for the White House to respond under the Obama administration was 100,000 signatures, a critical mass for public media’s petitions would be a percentage of the registered voters in the governmental jurisdiction being petitioned.
 These examples, and a myriad of others, can be found in a filing that was part of the National Broadband Plan proceeding by Ellen P. Goodman and Anne Chen entitled, “Digital Public Media Networks to Advance Broadband and Enrich Connected Communities,” (November 6, 2009).
 17 U.S.C. Section 114 (b)
 17 U.S.C. Sections 110(2), 112(f). Generally, the TEACH Act updated the copyright laws to enable teachers to use copyrighted material in nonclassroom situations, such as with distance learning.