
Writers and Editors

Fake news and media literacy

gathered by Pat McNees, originally published December 2016


Youth Voice, Authorship, & Democracy: Unpacking Media Literacy with Dr. Renee Hobbs (Sara Falluji, Beyond the Classroom, KSTV, The New Edu, 7-19-23) The "fake news crisis and the rise of disinformation have fueled a lot of interest in media literacy, because we've got misinformation being shared on social media and political campaigns. We've got misinformation about vaccines. We've got misinformation about climate change; about the economy. We've got misinformation intentionally spread for political purposes in ways that are harmful to people's understanding of reality.
      "In the context of school and schooling, media literacy comes in through the changing nature of literacy– literacy is not just reading and writing. It's speaking and listening and creating media and analyzing media. More and more English teachers are recognizing that it's as important to critically analyze a film as it is to critically analyze a work of literature. It's as important to critically analyze an article in The New York Times as it is to critically analyze a work of photojournalism."

How social media took us from Tahrir Square to Donald Trump (Zeynep Tufekci, MIT Technology Review, 8-14-18) To understand how digital technologies went from instruments for spreading democracy to weapons for attacking it, you have to look beyond the technologies themselves. “The problem is that when we encounter opposing views in the age and context of social media, it’s not like reading them in a newspaper while sitting alone. It’s like hearing them from the opposing team while sitting with our fellow fans in a football stadium. Online, we’re connected with our communities, and we seek approval from our like-minded peers. We bond with our team by yelling at the fans of the other one.” In her analysis, Tufekci explains five factors. [I urge you to read the full article, and risk exceeding "fair use" here because this long article spells out so clearly weaknesses in our digital world that can and have been exploited.]

     1. The euphoria of discovery: social media as "breaking down what social scientists call 'pluralistic ignorance'—the belief that one is alone in one's views when in reality everyone has been collectively silenced. That, I said, was why social media had fomented so much rebellion: people who were previously isolated in their dissent found and drew strength from one another."

     2. The audacity of hope: "Barack Obama's election in 2008 as the first African-American president of the United States had prefigured the Arab Spring's narrative of technology empowering the underdog....There were laudatory articles about Barack Obama's use of voter profiling and microtargeting....[but] microtargeting, especially on Facebook, could be used to wreak havoc with the public sphere.... It was true that social media let dissidents know they were not alone, but online microtargeting could also create a world in which you wouldn't know what messages your neighbors were getting or how the ones aimed at you were being tailored to your desires and vulnerabilities."

     3. The illusion of immunity: "There doesn't seem to have been a major realization within the US's institutions—its intelligence agencies, its bureaucracy, its electoral machinery—that true digital security required both better technical infrastructure and better public awareness about the risks of hacking, meddling, misinformation, and more. The US's corporate dominance and its technical wizardry in some areas seemed to have blinded the country to the brewing weaknesses in other, more consequential ones."

     4. The power of the platforms: "In that context, the handful of giant US social-media platforms seem to have been left to deal as they saw fit with what problems might emerge. Unsurprisingly, they prioritized their stock prices and profitability. Throughout the years of the Obama administration, these platforms grew boisterously and were essentially unregulated. They spent their time solidifying their technical chops for deeply surveilling their users, so as to make advertising on the platforms ever more efficacious. In less than a decade, Google and Facebook became a virtual duopoly in the digital ad market." Discussion of how digital tools have figured significantly in political upheavals around the world in the past few years, and how a reality TV star came along and took advantage of Twitter.

     5. The lessons of the era

     "First, the weakening of old-style information gatekeepers (such as media, NGOs, and government and academic institutions), while empowering the underdogs, has also, in another way, deeply disempowered underdogs. Dissidents can more easily circumvent censorship, but the public sphere they can now reach is often too noisy and confusing for them to have an impact...The old gatekeepers blocked some truth and dissent, but they blocked many forms of misinformation too

      "Second, the new, algorithmic gatekeepers...make their money by keeping people on their sites and apps...succeed by fueling mistrust and doubt, as long as the clicks keep coming.

      "Third, the loss of gatekeepers has been especially severe in local journalism....The Russian operatives who created fake local media brands across the US either understood the hunger for local news or just lucked into this strategy. Without local checks and balances, local corruption grows and trickles up to feed a global corruption wave playing a major part in many of the current political crises.

       "Fourth, "While algorithms will often feed people some of what they already want to hear, research shows that we probably encounter a wider variety of opinions online than we do offline, or than we did before the advent of digital tools....Online, we're connected with our communities, and we seek approval from our like-minded peers....In sociology terms, we strengthen our feeling of "in-group" belonging by increasing our distance from and tension with the "out-group"—us versus them. Our cognitive universe isn't an echo chamber, but our social one is....This is also how Russian operatives fueled polarization in the United States..."

       Fifth, "Russia exploited the US's weak digital security—its "nobody but us" mind-set—to subvert the public debate around the 2016 election. The hacking and release of e-mails from the Democratic National Committee and the account of Clinton campaign manager John Podesta amounted to a censorship campaign, flooding conventional media channels with mostly irrelevant content."  

‘Belonging Is Stronger Than Facts’: The Age of Misinformation (Max Fisher, The Interpreter, NY Times, 5-7-21) Social and psychological forces are combining to make the sharing and believing of misinformation an endemic problem with no easy solution. Have you heard that President Biden plans to force Americans to eat less meat; that Virginia is eliminating advanced math in schools to advance racial equality; and that border officials are mass-purchasing copies of Vice President Kamala Harris’s book to hand out to refugee children? Researchers see the drivers of today's misinformation as social and psychological forces that make people prone to sharing and believing it in the first place.
      "People become more prone to misinformation when three things happen.

      "First, and perhaps most important, is when conditions in society make people feel a greater need for what social scientists call ingrouping — a belief that their social identity is a source of strength and superiority, and that other groups can be blamed for their problems....In times of perceived conflict or social change, we seek security in groups. And that makes us eager to consume information, true or not, that lets us see the world as a conflict putting our righteous ingroup against a nefarious outgroup."
      The second factor: "the emergence of high-profile political figures who encourage their followers to indulge their desire for identity-affirming misinformation. After all, an atmosphere of all-out political conflict often benefits those leaders, at least in the short term, by rallying people behind them."
      The "third factor — a shift to social media, which is a powerful outlet for composers of disinformation, a pervasive vector for misinformation itself and a multiplier of the other risk factors.... “When you post things, you’re highly aware of the feedback that you get, the social feedback in terms of likes and shares,” Dr. William J. Brady said. Research demonstrates that people who get positive feedback for posting inflammatory or false statements become much more likely to do so again in the future. “You are affected by that."

Most Americans favor restrictions on false information, violent content online (Christopher St. Aubin and Jacob Liedke, Pew Research, 7-20-23) Most Americans say the U.S. government and technology companies should each take steps to restrict false information and extremely violent content online. But there is more support for tech companies moderating these types of content than for the federal government doing so, according to a new Pew Research Center survey. This increase in support comes amid public debates about online content regulation and court cases that look at how tech companies moderate content on their platforms. And yet:

Big Tech rolls back misinformation measures ahead of 2024 (Sara Fischer, Axios, 6-6-23) Ahead of the 2024 election cycle, the world's largest tech companies are walking back policies meant to curb misinformation around COVID-19 and the 2020 election.

    YouTube last week confirmed that it will reverse its election integrity policy to leave up content that says fraud, errors or glitches occurred in the 2020 presidential election.

    Meta reinstated the Instagram account of Robert F. Kennedy Jr., who was removed from the platform in 2021 for posting misinformation about COVID.

    Kathleen Hall Jamieson, director of the Annenberg Public Policy Center and founder of Factcheck.org, argued that with a few exceptions — including health threats and real-time incitement of violence — fact-checking is a stronger antidote to misinformation than blocking speech. The best solution, she argues, is to "flood the zone with the best available information, make sure that when the misinformation gets up there, you've got corrective context with good information up next to it."


Catalogue of all projects working to solve Misinformation and Disinformation (Shane Greenup, MisinfoCon, 6-9-18)

Starts with The Disinformation Index (rating the probability of a source carrying disinformation).
MisinfoCon (a community of people focused on the challenge of #misinformation & what can be done to address it. Events so far at MIT, London, and Kyiv; DC in August)
rbutr (tells you when the webpage you are viewing has been disputed, rebutted or contradicted elsewhere on the internet). Get the plugin.
Credibility Coalition An interdisciplinary community committed to improving our information ecosystems and media literacy through transparent and collaborative exploration. "Tackling the misinformation problem successfully will require a holistic approach, with reputation systems, fact-checking, media literacy, revenue models, and public feedback all helping to address the health of the information ecosystem."
Teaching in the Age of Trump (Andrea Rinard, Medium, 7-13-18) Five tenets for navigating alternative facts and ad hominem attacks in the classroom:

1. Kids need to learn how to be more responsible and canny media consumers.

2. We must create safe spaces and insist on civility. And so on, with stories from the classroom.
How and why to spot and identify fake news (Pat McNees, Writers and Editors)
Faking News: Fraudulent News and the Fight for Truth (PDF, PEN America report, 10-12-17) Invaluable.
How to squash fake news without trampling free speech (Callum Borchers, WashPost, 10-12-17) About the PEN report and its findings and recommendations.
Ten Questions for Fake News Detection (The News Literacy Project, or NLP)
The Best Tools To Help Develop Global Media Literacy (Larry Ferlazzo, 3-12-09)
Blue Feed, Red Feed Liberal and conservative views on the same topic, side by side. Try "Trump," for example.
The Learning Network (New York Times web-based lessons in media literacy)
6 types of misinformation circulated this election season (Claire Wardle, Columbia Journalism Review, 11-18-16) She discusses and gives examples of:
     1. Authentic material used in the wrong context.
     2. Imposter news sites designed to look like brands we already know.
     3. Fake news sites.
     4. Fake information.
     5. Manipulated content.
     6. Parody content.


To Fix Fake News, Look To Yellow Journalism (Alexandra Samuel, JStor Daily, 11-29-16) How the Internet Ruined Everything (Or Did It?). Social media critics have been quick to blame Facebook and the spread of "fake news" for the election upset. But poorly researched and downright dishonest reporting has been undermining the First Amendment since the early days of journalism. Click journalism has plenty of precedents in the history of mass media, and particularly in the history of American journalism. A good starting point on this topic.

Skewed: A Critical Thinker's Guide to Media Bias by Larry Atkins, as reviewed on Philly.com: 'Skewed': How to be your own filter in the Web universe. "Atkins, a longtime adjunct professor of journalism at Temple University, Arcadia University, and Montgomery County Community College, lays out the difference between "clear and balanced" news and advocacy journalism. He highlights the urgency for media consumers to recognize this difference." Not that all advocacy journalism is bad: it can involve solid investigative journalism that then comes down on a side, advocating to expose corruption or harm; Silent Spring by Rachel Carson is an example. Here are Atkins' main points (HuffPost, 12-6-16).

After Comet Ping Pong and Pizzagate, teachers tackle fake news (Moriah Balingit, WaPo, 12-11-16) For conspiracy theorists, "pizzagate" didn't end when a man brought a gun to Comet Ping Pong in Washington in a misguided attempt to rescue child sex slaves. Instead, the shooting fired up further belief in the baseless claims.
A century ago, progressives were the ones shouting ‘fake news’ (Matthew Jordan, The Conversation, 2-1-18) As a rhetorical strategy for eroding trust in the media, the term dates back to the end of the 19th century. Righteous "muckrakers were usually the ones deploying the term. They sought to challenge the growing numbers of powerful newspapers that were concocting fake stories to either sell papers or advance the interests of their corporate benefactors."



House ethics committee warns lawmakers against posting deepfakes (Emily Birnbaum, The Hill, 1-29-20) The House Ethics Committee issued a memo warning lawmakers that they may violate Congress’s Code of Official Conduct if they post “deep fakes,” or distorted videos that operate as a technologically sophisticated form of disinformation. The warning comes soon after Rep. Paul Gosar, of Arizona, re-tweeted an edited photo falsely depicting former President Barack Obama meeting Iranian President Hassan Rouhani.

Journalists can change the way they build stories to create organic news fluency (Tom Rosenstiel and Jane Elizabeth, White Paper for American Press Institute, 5-9-18) "We propose a new way of creating journalism that helps audiences become more fluent and more skilled consumers of news the more they consume it....imagine a format or presentation that, alongside the story, poses some key questions a discriminating or 'fluent' news consumer might ask to decide what to make of the story." They might ask: What is new here? What evidence is there? What sources did you talk to and when? What facts don't we know yet? What, if anything, is still in dispute? "Imagine if more journalists were to raise and answer these questions in an element placed at the top of the narrative."

    Is teaching news literacy a journalist's job? Yes. Here's a way to build stories that can show people the difference between good and bad journalism and outright fakery. The first step is thinking about — and asking — what questions audiences may have about a story and then providing those answers explicitly. That step guides the journalist into a new and important mindset of putting themselves in the audience’s shoes.

      The authors of this white paper present templates for building news fluency for nine news categories, including standard news stories, non-investigative projects, and investigations.
