Fact checkers globally offer alternatives to the flood of lies and misinformation
Why write about “reasons for optimism” when so much is going wrong in our politics, economy, and environment? Optimism gives us confidence that we can make things better. As I like to say, it’s another day of opportunity.
Much of what I read about misinformation on the internet depresses me. The volume of misleading content and some people’s acceptance of it are a matter of life and death during this pandemic. At times I feel that journalists producing public-service information might be losing the battle.
In addition, I believe that some of the progressive movement’s attempts to discredit misinformation adopt a condescending air toward people inclined to believe the distortions and lies. The suggestion is that these misguided people need to be educated so that they think more like us. That is no way to change their minds.
Related: Why I avoid reading (some of) the news.
Creative solutions and collaborations
However, my spirits are lifted by what I see as inspired solutions to the problem. They rely mainly on presenting quality alternatives — tools people can use to find trustworthy sources of news and information. Extremists on both ends of the spectrum are probably unreachable. These tools benefit people who have questions and are seeking answers.
One of those initiatives comes from NewsGuard, which made its reliability ratings available to 7 million library patrons last year. NewsGuard, a for-profit company, offers its services free to schools and public libraries. That includes donating media literacy resources to schools at all levels and making its browser extension available free on the computers of more than 800 public libraries globally. The extension indicates the trustworthiness of the site a patron has called up: green means mostly trustworthy, red means mostly not.
NewsGuard’s transparency about how the ratings are calculated models a best practice. Each rating includes names and contact information for everyone involved in writing and editing the rationale behind it. Rated media outlets also get a chance to dispute NewsGuard’s assessment and add their own comments to the rating.
Lies that go viral
The big question facing ethical public-service news outlets is how to respond to post-truth politics, an era in which lies go viral: Russian trolls organize events on social media attended by thousands of unwitting Americans, and Facebook provides a platform for Buddhist ultranationalists in Myanmar to harass Muslims, contributing to the killing of thousands, according to Gabriele Cosentino’s book “Social Media and the Post-Truth World Order: The Global Dynamics of Disinformation.”
We are living with deliberate lies, rumors, and conspiracy theories designed to undermine the credibility of democratic institutions and the media that hold them accountable.
Related: The Truth Sandwich: how to cover falsehoods from official sources.
I am not totally convinced of the effectiveness of debunking the lies of politicians and media manipulators. Rumors and lies are designed to be sexy clickbait that goes viral. Debunking, by contrast, is necessarily slower and unsexy, the opposite of viral.
Media literacy and ‘prebunking’ misinformation
I have more confidence in the effectiveness of media literacy programs, partly because they represent a grass-roots effort that respects the public’s intelligence and ability to identify trustworthy sources. Media literacy training shows people how to analyze sources and judge their credibility for themselves. Once trained, they can help debunk the lies.
— First Draft News describes its mission this way: “We work to empower people with knowledge and tools to build resilience against harmful, false, and misleading information.” First Draft offers training in “prebunking,” one element of which is showing people how to identify the techniques used by media manipulators. Researcher John Cook has developed a set of prebunking techniques (below) that research has shown to be effective against misinformation.
— Agência Lupa, which began as a fact-checking service for big media in Brazil, discovered an appetite for news literacy programs among business people and media development organizations that wanted to show the public how to identify disinformation and misinformation promoted by the right-wing government. When it became clear these organizations were willing to sponsor training, the team pivoted in that direction.
— Research published by the Harvard Kennedy School’s Misinformation Review (and reported on by Laura Hazard Owen at Nieman Lab) suggests that debunking misinformation is far less effective than “inoculating” people with a weak form of misinformation. Here is how inoculation works: “First, it includes an explicit warning about the danger of being misled by misinformation. Second, you need to provide counterarguments explaining the flaws in that misinformation.” This sounds a lot like prebunking.
— The Poynter Institute’s International Fact-Checking Network announced in December that it had expanded with the addition of two staff positions, a program officer and a community and impact manager. Poynter’s announcement stated: “The new roles further Poynter’s commitment as a central hub for accountability journalism with three groundbreaking fact-checking enterprises: the International Fact-Checking Network, the Pulitzer Prize-winning PolitiFact, and a digital-first initiative called MediaWise, which empowers teenagers, first-time voters and seniors to tell fact from fiction online.”
— Trusting News is a nonprofit dedicated to educating journalists on how to build trust with their audiences through engagement and transparency strategies. It offers free courses and training sessions. As the organization puts it: “We firmly believe that responsible, ethical journalists should not succumb to the national rhetoric around ‘the media’ and instead need to invest in telling the story of their own work and why it’s valuable and trustworthy.”
Again, the emphasis is on accountability journalism and educating people.
Bottom-up vs. top-down
It is popular these days to argue for more regulation of internet platforms such as Google, Facebook, and other social media. The idea is to make the platforms responsible for eliminating content that is racist, sexist, exploitative, dangerously misleading, or otherwise objectionable. But this path has its own challenges.
More regulation as a solution supposes that algorithms and armies of censors will be able to police the millions of pieces of content uploaded each minute. But in a free society based on open discussion of differences, putting censorship in the hands of the government or powerful corporations brings its own dangers.
Over the long haul, I put my trust in giving people tools that they can use to identify the manipulations of extremists. The extremists themselves — either left or right, liberal or conservative — are not the target for these programs.
The target is and should be people who are genuinely seeking answers to complex questions. And the perspective should be long term. Facts about complex problems are not usually sensational, and it takes time to understand them. We need patience and commitment. Many organizations are collaborating to restore trust. It’s another day of opportunity for us to join the effort.
Related:
Reasons for optimism #1: Andrew Yang
Reasons for optimism #2: Edwy Plenel of Mediapart
Reasons for optimism #3: Journalists collaborating around the world
Reasons for optimism #4: José Luis Orihuela’s ‘Digital Cultures’
Reasons for optimism #5: SembraMedia’s discoveries
Reasons for optimism #6: Jared Diamond and other cautious optimists
Coming soon:
Reasons for optimism #7: The movement for trustworthy information
Reasons for optimism #8: Marketing trustworthy information