20 projects will address the spread of misinformation through Knight Prototype Fund

In March, Knight Foundation, along with partners the Democracy Fund and the Rita Allen Foundation, launched an open call for ideas around the question: How might we improve the flow of accurate information? As part of a larger initiative centered on trust in journalism, we sought projects that could be built quickly to test ideas that respond to the challenges affecting the health of our news ecosystem and, ultimately, our democracy.

Today, timed with the Investigative Reporters and Editors conference in Phoenix, we are announcing support for 20 projects aimed at combating the spread of misinformation online and increasing trust in journalism. The winning projects will receive a share of $1 million through the Knight Prototype Fund, a program focused on iterative and human-centered approaches to solving difficult problems. Like the larger applicant pool of more than 800 ideas, the winning projects align with three broad themes:

  • Citizen journalism/News engagement: Ideas in this area explore models for involving the public in the newsgathering process. From experiments that deeply embed journalists in the community to efforts that enlist citizens in gathering facts, these projects aim to increase trust in journalism by demystifying how news is produced.
  • Media/News/Information Literacy: The Internet has made a nearly limitless supply of information available to millions of people; however, many Americans lack the skills to critically analyze that information for indicators of quality and truthfulness. Projects in this area aim to educate people to be discerning news consumers.
  • Fact Checking: From using computers to identify facts and debunk false information, to assigning quality scores to news, to tracking the spread of information across the Internet, these projects aim to ensure accuracy and address the spread of misinformation. In addition to educating news consumers, many projects in this area also target the platforms and advertising networks that aid the spread of misinformation.

The winners of the 2017 Knight Prototype Fund open call completed LUMA human-centered design training in Phoenix this week.

The winning projects:

Breaking filter bubbles in science journalism by the University of California, Santa Cruz (Project lead: Erika Check Hayden | Santa Cruz, California | @Erika_Check, @UCSC_SciCom): Producing visually engaging science journalism around topics such as climate change and genetics, to determine whether content delivered by a trusted messenger in a culturally relevant context has greater reach. The articles will be tested through the digital platform EscapeYourBubble.com, which distributes curated content to users across ideological divides.

Calling Bullshit in the Age of Fake News by the University of Washington (Project lead: Jevin West | Seattle @jevinwest, @UW_iSchool): Developing a curriculum and set of tools to teach students and the public to better assess quantitative information and combat misinformation, with a particular emphasis on data, visualizations and statistics.

ChartCheck by Periscopic (Project lead: Megan Mermis | Portland, Oregon | @periscopic): Addressing the spread of misinformation through charts, graphs and data visualizations by fact-checking these resources and publishing results. The team will also build tools to evaluate the spread of these charts on social media and the Internet.

Crosscheck by Vanderbilt University in collaboration with First Draft (Project leads: Lisa Fazio and Claire Wardle | Nashville, Tennessee | @lkfazio, @cward1e, @firstdraftnews, @crosscheck): Using design features to make accurate news more memorable, so that people can recall it more easily when faced with false information. The project builds on a platform initially developed in France to address misinformation around that country’s election.

Facts Matter by PolitiFact (Project lead: Aaron Sharockman | St. Petersburg, Florida | @asharock, @PolitiFact): Helping to improve trust in fact-checking, particularly among people who identify as conservative, through experiments including in-person events; a mobile game that tracks misconceptions about specific facts; diverse commentators who would assess fact-checking reports; and a study of how the language used in these reports affects perceptions of trustworthiness.

Glorious ContextuBot by Bad Idea Factory (Project lead: Daniel Schultz | Philadelphia | @biffud, @slifty): Helping people become better consumers of online audio and video content through a tool that provides the original source of an individual clip and identifies who else has discussed it on the news.

Hoaxy Bot-o-Meter by Center for Complex Networks and Systems Research (Project leads: Filippo Menczer and Valentin Pentchev | Bloomington, Indiana | @Botometer, @truthyatindiana, @IUNetSci): Developing a tool to uncover attempts to use Internet bots to boost the spread of misinformation and shape public opinion. The tool aims to reveal how this information is generated and broadcast, how it becomes viral, its overall reach and how it competes with accurate information for placement on user feeds.

Immigration Lab by Univision News (Project lead: Ronny Rojas | Miami | @ronnyrojas, @UniNoticias): Engaging undocumented immigrants on issues that affect their lives by creating a reliable news resource to help them access and gather information. The project team will do on-the-ground research in communities with a high percentage of undocumented immigrants and learn about their media literacy skills, news consumption habits and needs, and trusted information sources.

KQED Learn by KQED (Project lead: Randall Depew | San Francisco | @randydepew, @KQEDEdSpace): Encouraging young people to ask critical questions that deepen learning and improve media literacy through KQED Learn, a free online platform for students and teachers that reveals ways to ask good questions, investigate answers and share conclusions.

Media Literacy @ Your Library by American Library Association in collaboration with the Center for News Literacy (Project lead: Samantha Oakley | Chicago | @ALALibrary, @NewsLiteracy): Developing an adult media literacy program in five public libraries, including a series of online learning sessions, resources and an in-person workshop to train library workers to help patrons become more informed media consumers.

News Inequality Project (Project leads: Hamdan Azhar, Cathy Deng, Christian MilNeil, and Leslie Shapiro | Portland, Maine | @HamdanAzhar, @cthydng, @c_milneil, @lmshap, @pressherald): Developing a web-based analytics dashboard to help media organizations and community organizers understand how – and how often – different communities are covered in news outlets over time.

News Quality Score Project (Project lead: Frederic Filloux | Palo Alto, California | @filloux): Creating a tool to surface quality journalism from the web, at scale and in real time, through algorithms and machine learning. The tool will evaluate and score content on criteria ranging from the reputation of authors and publishers to an analysis of various components of the story structure.

NewsTracker.org by PBS NewsHour and Miles O’Brien Productions (Project lead: Cameron Hickey | Washington, D.C. | @cameronhickey, @newshour): Developing a tool that combines online news content with engagement data from social media and other sources to help journalists and others better understand the scale, scope and shape of the misinformation problem. The tool will enable content analysis by gathering data about what is being written, by whom, where it is distributed, and the size of the audience consuming it.

Putting Civic Online Reasoning in Civics Class by Stanford History Education Group, Stanford University (Project lead: Sam Wineburg | Palo Alto, California | @SHEG_Stanford, @samwineburg): Creating professional development resources for teachers to become better consumers of digital content, in addition to classroom-ready materials that they can use to help students find and assess information online.

Social Media Interventions by Boston University (Project leads: Jacob Groshek and Dylan Walker | Boston | @jgroshek, @EMSatBU, @dylanwalker): Testing the effectiveness of real-time online interventions in combating the spread of misinformation, such as direct messages to users who post or share false information.

The Documenters Project by City Bureau (Project lead: Darryl Holliday | Chicago | @d_holli, @city_bureau): Strengthening local media coverage and building trust in journalism by creating an online network of citizen “documenters” who receive training in journalistic ethics and tools, attend public civic events and produce short summaries that are posted online as a public resource.

Veracity.ai (Project lead: Danny Rogers | Baltimore): Helping to curb the financial incentives for creating misleading content with automatically updated lists of “fake news” websites and easy-to-deploy tools that allow ad buyers to block, in bulk, the domains where misinformation is propagated.

Viz Lab (Project leads: Susie Cagle, Caroline Sinders and Francis Tseng | San Francisco | @susie_c, @carolinesinders, @frnsys): Developing a dashboard to track and visualize images and “memes,” which are common sources of fake news, enabling journalists and researchers to more easily understand an image’s origins, its promoters, and where it might have been altered and then redistributed.

Who Said What by Joostware (Project lead: Delip Rao | San Francisco | @deliprao, @joostware): Helping people more easily fact-check audio and video news clips with a search tool that annotates millions of these clips and allows users to explore both what is said and the identity of the speaker.

Technical Schema for Credibility by Meedan in collaboration with Hacks/Hackers (Project lead: An Xiao Mina | San Francisco | @anxiaostudio, @meedan, @hackshackers): Creating a clear, standardized framework to define the credibility of a piece of content, how conclusions about its credibility were reached, and how to communicate that information effectively.