gathered by Pat McNees, originally published December 2016
• Youth Voice, Authorship, & Democracy: Unpacking Media Literacy with Dr. Renee Hobbs (Sara Falluji, Beyond the Classroom, KSTV, The New Edu, 7-19-23) "The fake news crisis and the rise of disinformation have fueled a lot of interest in media literacy, because we've got misinformation being shared on social media and political campaigns. We've got misinformation about vaccines. We've got misinformation about climate change; about the economy. We've got misinformation intentionally spread for political purposes in ways that are harmful to people's understanding of reality.
"In the context of school and schooling, media literacy comes in through the changing nature of literacy: literacy is not just reading and writing. It's speaking and listening and creating media and analyzing media. More and more English teachers are recognizing that it's as important to critically analyze a film as it is to critically analyze a work of literature. It's as important to critically analyze an article in The New York Times as it is to critically analyze a work of photojournalism."
• How social media took us from Tahrir Square to Donald Trump (Zeynep Tufekci, MIT Technology Review, 8-14-18) To understand how digital technologies went from instruments for spreading democracy to weapons for attacking it, you have to look beyond the technologies themselves. "The problem is that when we encounter opposing views in the age and context of social media, it's not like reading them in a newspaper while sitting alone. It's like hearing them from the opposing team while sitting with our fellow fans in a football stadium. Online, we're connected with our communities, and we seek approval from our like-minded peers. We bond with our team by yelling at the fans of the other one." In her analysis, Tufekci explains five factors. [I urge you to read the full article; I risk exceeding "fair use" here because this long article spells out so clearly weaknesses in our digital world that can be, and have been, exploited.]
1. The euphoria of discovery: social media as "breaking down what social scientists call 'pluralistic ignorance'—the belief that one is alone in one's views when in reality everyone has been collectively silenced. That, I said, was why social media had fomented so much rebellion: people who were previously isolated in their dissent found and drew strength from one another."
2. The audacity of hope: "Barack Obama's election in 2008 as the first African-American president of the United States had prefigured the Arab Spring's narrative of technology empowering the underdog....There were laudatory articles about Barack Obama's use of voter profiling and microtargeting....[but] microtargeting, especially on Facebook, could be used to wreak havoc with the public sphere.... It was true that social media let dissidents know they were not alone, but online microtargeting could also create a world in which you wouldn't know what messages your neighbors were getting or how the ones aimed at you were being tailored to your desires and vulnerabilities."
3. The illusion of immunity: "There doesn't seem to have been a major realization within the US's institutions—its intelligence agencies, its bureaucracy, its electoral machinery—that true digital security required both better technical infrastructure and better public awareness about the risks of hacking, meddling, misinformation, and more. The US's corporate dominance and its technical wizardry in some areas seemed to have blinded the country to the brewing weaknesses in other, more consequential ones."
4. The power of the platforms: "In that context, the handful of giant US social-media platforms seem to have been left to deal as they saw fit with what problems might emerge. Unsurprisingly, they prioritized their stock prices and profitability. Throughout the years of the Obama administration, these platforms grew boisterously and were essentially unregulated. They spent their time solidifying their technical chops for deeply surveilling their users, so as to make advertising on the platforms ever more efficacious. In less than a decade, Google and Facebook became a virtual duopoly in the digital ad market." Discussion of how digital tools have figured significantly in political upheavals around the world in the past few years, and how a reality TV star came along and took advantage of Twitter.
5. The lessons of the era:
"First, the weakening of old-style information gatekeepers (such as media, NGOs, and government and academic institutions), while empowering the underdogs, has also, in another way, deeply disempowered underdogs. Dissidents can more easily circumvent censorship, but the public sphere they can now reach is often too noisy and confusing for them to have an impact...The old gatekeepers blocked some truth and dissent, but they blocked many forms of misinformation too.
"Second, the new, algorithmic gatekeepers...make their money by keeping people on their sites and apps...succeed by fueling mistrust and doubt, as long as the clicks keep coming.
"Third, the loss of gatekeepers has been especially severe in local journalism....The Russian operatives who created fake local media brands across the US either understood the hunger for local news or just lucked into this strategy. Without local checks and balances, local corruption grows and trickles up to feed a global corruption wave playing a major part in many of the current political crises.
"Fourth, while algorithms will often feed people some of what they already want to hear, research shows that we probably encounter a wider variety of opinions online than we do offline, or than we did before the advent of digital tools....Online, we're connected with our communities, and we seek approval from our like-minded peers....In sociology terms, we strengthen our feeling of "in-group" belonging by increasing our distance from and tension with the "out-group"—us versus them. Our cognitive universe isn't an echo chamber, but our social one is....This is also how Russian operatives fueled polarization in the United States..."
"Fifth, Russia exploited the US's weak digital security—its "nobody but us" mind-set—to subvert the public debate around the 2016 election. The hacking and release of e-mails from the Democratic National Committee and the account of Clinton campaign manager John Podesta amounted to a censorship campaign, flooding conventional media channels with mostly irrelevant content."
• ‘Belonging Is Stronger Than Facts’: The Age of Misinformation (Max Fisher, The Interpreter, NY Times, 5-7-21) Social and psychological forces are combining to make the sharing and believing of misinformation an endemic problem with no easy solution. Have you heard that President Biden plans to force Americans to eat less meat; that Virginia is eliminating advanced math in schools to advance racial equality; or that border officials are mass-purchasing copies of Vice President Kamala Harris's book to hand out to refugee children? All false, yet such claims spread because social and psychological forces make people prone to sharing and believing misinformation in the first place.
"People become more prone to misinformation when three things happen.
"First, and perhaps most important, is when conditions in society make people feel a greater need for what social scientists call ingrouping — a belief that their social identity is a source of strength and superiority, and that other groups can be blamed for their problems....In times of perceived conflict or social change, we seek security in groups. And that makes us eager to consume information, true or not, that lets us see the world as a conflict putting our righteous ingroup against a nefarious outgroup."
The second factor: "the emergence of high-profile political figures who encourage their followers to indulge their desire for identity-affirming misinformation. After all, an atmosphere of all-out political conflict often benefits those leaders, at least in the short term, by rallying people behind them."
The third factor: "a shift to social media, which is a powerful outlet for composers of disinformation, a pervasive vector for misinformation itself and a multiplier of the other risk factors." "When you post things, you're highly aware of the feedback that you get, the social feedback in terms of likes and shares," Dr. William J. Brady said. Research demonstrates that people who get positive feedback for posting inflammatory or false statements become much more likely to do so again in the future. "You are affected by that."
• Most Americans favor restrictions on false information, violent content online (Christopher St. Aubin and Jacob Liedke, Pew Research, 7-20-23) Most Americans say the U.S. government and technology companies should each take steps to restrict false information and extremely violent content online. But there is more support for tech companies moderating these types of content than for the federal government doing so, according to a new Pew Research Center survey. This increase in support comes amid public debates about online content regulation and court cases that look at how tech companies moderate content on their platforms. And yet:
• Big Tech rolls back misinformation measures ahead of 2024 (Sara Fischer, Axios, 6-6-23) Ahead of the 2024 election cycle, the world's largest tech companies are walking back policies meant to curb misinformation around COVID-19 and the 2020 election.
YouTube last week confirmed that it will reverse its election integrity policy to leave up content that says fraud, errors or glitches occurred in the 2020 presidential election.
Meta reinstated the Instagram account of Robert F. Kennedy Jr., who was removed from the platform in 2021 for posting misinformation about COVID.
Kathleen Hall Jamieson, director of the Annenberg Public Policy Center and founder of Factcheck.org, argued that with a few exceptions — including health threats and real-time incitement of violence — fact-checking is a stronger antidote to misinformation than blocking speech. The best solution, she argues, is to "flood the zone with the best available information, make sure that when the misinformation gets up there, you've got corrective context with good information up next to it."
• Catalogue of all projects working to solve Misinformation and Disinformation (Shane Greenup, MisinfoCon, 6-9-18)
Starts with The Disinformation Index (rating the probability of a source carrying disinformation).
• MisinfoCon (a community of people focused on the challenge of #misinformation and what can be done to address it; events so far at MIT, London, and Kyiv, with DC in August)
• rbutr (tells you when the webpage you are viewing has been disputed, rebutted or contradicted elsewhere on the internet). Get the plugin.
• Credibility Coalition An interdisciplinary community committed to improving our information ecosystems and media literacy through transparent and collaborative exploration. "Tackling the misinformation problem successfully will require a holistic approach, with reputation systems, fact-checking, media literacy, revenue models, and public feedback all helping to address the health of the information ecosystem."
• Teaching in the Age of Trump (Andrea Rinard, Medium, 7-13-18) Five tenets for navigating alternative facts and ad hominem attacks in the classroom:
1. Kids need to learn how to be more responsible and canny media consumers.
2. We must create safe spaces and insist on civility. And so on, with stories from the classroom.
• How and why to spot and identify fake news (Pat McNees, Writers and Editors)
• Faking News: Fraudulent News and the Fight for Truth (PDF, PEN America report, 10-12-17) Invaluable.
• How to squash fake news without trampling free speech (Callum Borchers, WashPost, 10-12-17) About the PEN report and its findings and recommendations.
• Ten Questions for Fake News Detection (The News Literacy Project, or NLP)
• The Best Tools To Help Develop Global Media Literacy (Larry Ferlazzo, 3-12-09)
• Blue Feed, Red Feed Liberal and conservative views on the same topic, side by side. Try "Trump," for example.
• The Learning Network (New York Times web-based lessons in media literacy)
• 6 types of misinformation circulated this election season (Claire Wardle, Columbia Journalism Review, 11-18-16) She discusses and gives examples of:
1. Authentic material used in the wrong context.
2. Imposter news sites designed to look like brands we already know.
3. Fake news sites.
4. Fake information.
5. Manipulated content.
6. Parody content.
• To Fix Fake News, Look To Yellow Journalism (Alexandra Samuel, JStor Daily, 11-29-16) How The Internet Ruined Everything (Or Did It?). Social media critics have been quick to blame Facebook and the spread of "fake news" for the election upset. But poorly researched and downright dishonest reporting has been undermining the First Amendment since the early days of journalism. Click journalism has plenty of precedents in the history of mass media, and particularly in the history of American journalism. A good starting point on this topic.
• Skewed: A Critical Thinker's Guide to Media Bias by Larry Atkins, as reviewed on Philly.com: 'Skewed': How to be your own filter in the Web universe. "Atkins, a longtime adjunct professor of journalism at Temple University, Arcadia University, and Montgomery County Community College, lays out the difference between "clear and balanced" news and advocacy journalism. He highlights the urgency for media consumers to recognize this difference." Not that all advocacy journalism is bad -- it can involve solid investigative journalism that then comes down on a side: Silent Spring by Rachel Carson is an example of advocating to expose corruption or harm. Here are Atkins' main points (HuffPost, 12-6-16)
• After Comet Ping Pong and Pizzagate, teachers tackle fake news (Moriah Balingit, WaPo, 12-11-16) For conspiracy theorists, "pizzagate" didn't end when a man brought a gun to Comet Ping Pong in Washington in a misguided attempt to rescue child sex slaves. Instead, the shooting fired up further belief in the baseless claims.
• A century ago, progressives were the ones shouting ‘fake news’ (Matthew Jordan, The Conversation, 2-1-18) As a rhetorical strategy for eroding trust in the media, the term dates back to the end of the 19th century. Righteous "muckrakers were usually the ones deploying the term. They sought to challenge the growing numbers of powerful newspapers that were concocting fake stories to either sell papers or advance the interests of their corporate benefactors."
• House ethics committee warns lawmakers against posting deepfakes (Emily Birnbaum, The Hill, 1-29-20) The House Ethics Committee issued a memo warning lawmakers that they may violate Congress’s Code of Official Conduct if they post “deep fakes,” or distorted videos that operate as a technologically sophisticated form of disinformation. The warning comes soon after Rep. Paul Gosar, of Arizona, re-tweeted an edited photo falsely depicting President Obama meeting Iranian President Hassan Rouhani.
• Journalists can change the way they build stories to create organic news fluency (Tom Rosenstiel and Jane Elizabeth, White Paper for American Press Institute, 5-9-18) "We propose a new way of creating journalism that helps audiences become more fluent and more skilled consumers of news the more they consume it....imagine a format or presentation that, alongside the story, poses some key questions a discriminating or "fluent" news consumer might ask to decide what to make of the story." They might ask: What is new here? What evidence is there? What sources did you talk to and when? What facts don't we know yet? What, if anything, is still in dispute? ...Imagine if more journalists were to raise and answer these questions in an element placed at the top of the narrative."
Is teaching news literacy a journalist's job? Yes. Here's a way to build stories that can show people the difference between good and bad journalism and outright fakery. The first step is thinking about — and asking — what questions audiences may have about a story and then providing those answers explicitly. That step guides the journalist into a new and important mindset of putting themselves in the audience’s shoes.
The authors of this white paper present templates for building news fluency for nine news categories — standard news stories, non-investigative projects, investigations, fact-checks, explainers, breaking news (live/unplanned), live events (planned), features, opinion — guides for constructing stories that proactively resolve doubts and questions audiences may have.
• Get Smart About News (News Literacy Project) How news-literate are you? Test and sharpen your news literacy skills with short activities, engaging quizzes and shareable graphics for learners of all ages. See also News Lit Quiz: How news-literate are you? (12 questions to help you test your news literacy knowledge)
• Meet the KGB Spies Who Invented Fake News (Adam B. Ellick, Adam Westbrook and Jonah M. Kessel, video opinion, NY Times, 11-12-18) Episode 1 of Operation InfeKtion, revealing how one of the biggest fake news stories ever concocted — the 1984 AIDS-is-a-biological-weapon hoax — went viral in the pre-Internet era. Meet the KGB cons who invented it, and the "truth squad" that quashed it. See also The Seven Commandments of Fake News (11-13-18) The Pizzagate playbook: Same tactics, new technologies. How the seven rules of Soviet disinformation are being used to create today's fake news stories. Pizza anyone?
• Can News Literacy Be Taught? (John Dyer, Nieman Reports, 4-14-17) At a time when more critical media consumption is sorely needed, news literacy can be a difficult skill to impart. Whether educators can train audiences to unmask fake news, conspiracy theories and propaganda remains an open question.
• “When I was a reporter, we had a saying: ‘Facts are stubborn things.’ Now it seems opinions are more stubborn things than facts.” ~ Alan Miller, News Literacy Project.
• Flagging Fake News (Eryn Carlson, Nieman Report, 4-14-17) A look at some potential tools and strategies for identifying misinformation
• Rick Gates Sought Online Manipulation Plans From Israeli Intelligence Firm for Trump Campaign (Mark Mazzetti, Ronen Bergman, David D. Kirkpatrick and Maggie Haberman, NY Times, 10-8-18) Trump campaign aide Rick Gates expressed interest in an Israeli company’s proposal for a social media manipulation effort, requesting proposals from the company to create fake online identities, use social media manipulation, and gather intelligence to help defeat Republican primary race opponents and Hillary Clinton. The Trump campaign’s interest in the work began as Russians were escalating their effort to aid Donald J. Trump. Though the Israeli company’s pitches were narrower than Moscow’s interference campaign and appear unconnected, the documents show that a senior Trump aide saw the promise of a disruption effort to swing voters in Mr. Trump’s favor.
• Truth, Disrupted (Sinan Aral, Harvard Business Review, July 2018) False news spreads online faster, farther, and deeper than truth does — but it can be contained. Successfully fighting the spread of falsity will require four interrelated approaches — educating the players, changing their incentives, improving technological tools, and (the right amount of) governmental oversight — and the answers to five key questions.
• The enduring appeal of conspiracy theories (Melissa Hogenboom, BBC, 1-24-18) While some conspiracy theories are largely harmless, others have damaging ripple-effects. With new insights, researchers are getting closer to understanding why so many people believe things which are not true.
• Center for a New American Security. See, for example, Behind The Magical Thinking: Lessons from Policymaker Relationships with Drones (7-31-18)
• He Predicted the 2016 Fake News Crisis. Now He's Worried About an Information Apocalypse. (Charlie Warzel, BuzzFeed, 2-11-18) "In mid-2016, Aviv Ovadya realized there was something fundamentally wrong with the internet — so wrong that he abandoned his work and sounded an alarm. A few weeks before the 2016 election, he presented his concerns to technologists in San Francisco’s Bay Area and warned of an impending crisis of misinformation in a presentation he titled 'Infocalypse.' The web and the information ecosystem that had developed around it was wildly unhealthy, Ovadya argued. The incentives that governed its biggest platforms were calibrated to reward information that was often misleading and polarizing, or both. Platforms like Facebook, Twitter, and Google prioritized clicks, shares, ads, and money over quality of information, and Ovadya couldn’t shake the feeling that it was all building toward something bad — a kind of critical threshold of addictive and toxic misinformation."
• How the Left Lost Its Mind (McKay Coppins, The Atlantic, 7-2-17) The Republicans do not have a monopoly on fake news. "The Trump era has given rise to a vast alternative left-wing media infrastructure that operates largely out of the view of casual news consumers, but commands a massive audience and growing influence in liberal America. There are polemical podcasters and partisan click farms; wild-eyed conspiracists and cynical fabulists. Some traffic heavily in rumor and wage campaigns of misinformation; others are merely aggregators and commentators who have carved out a corner of the web for themselves. But taken together, they form a media universe where partisan hysteria is too easily stoked, and fake news can travel at the speed of light."
• We Tracked Down A Fake-News Creator In The Suburbs. Here's What We Learned (Laura Sydell, All Tech Considered, NPR, 11-23-16) During the run-up to the presidential election, fake news really took off. "It was just anybody with a blog can get on there and find a big, huge Facebook group of kind of rabid Trump supporters just waiting to eat up this red meat that they're about to get served," Coler says. "It caused an explosion in the number of sites. I mean, my gosh, the number of just fake accounts on Facebook exploded during the Trump election." Coler says his writers have tried to write fake news for liberals — but they just never take the bait.
• What do ordinary people think fake news is? Poor journalism and political propaganda. (Rasmus Kleis Nielsen and Lucas Graves, Columbia Journalism Review, 10-24-17) "We spoke to ordinary people about fake news. What they qualify as a false story is notable. ...Journalists see news and advertising as completely separate kinds of information. But for users, they are encountered in the same context, and perceptions of one will color perceptions of the other...What kinds of journalism do audiences think qualify as fake news? People associate the term with superficial, inaccurate, and sensationalist reporting, especially in areas like celebrity, health, and sports coverage...people also see potentially misleading advertising as a kind of fake news. ...Like when you scroll down far enough and it is like “look at how these 12 child celebrities turned out,” and they are just ridiculous pictures....The term is used strategically by self-interested elites, amplified by news media, and has become part of the vernacular because it resonates with people’s lived experience of coming across all sorts of superficial, unreliable, and misleading information that they do not trust—much of it from journalists and politicians."
• How Fake News Turned a Small Town Upside Down (Caitlin Dickerson, NY Times Magazine, 9-26-17) At the height of the 2016 election, exaggerated reports of a juvenile sex crime brought a media maelstrom to Twin Falls — one the Idaho city still hasn’t recovered from.
• Media Manipulation and Disinformation Online (Downloadable report, Data&Society, 5-15-17) Samples: "By rebranding 'white nationalism' as the 'alt-right,' these groups played on the media's fascination with novelty to give their ideas mass exposure." and "Despite the existence of multiple men's rights groups, most of them have a reasonably consistent value set and ideological organization. Their central belief is that men and boys in the Western world are at risk or marginalized, and in need of defense." and "Online communities are increasingly turning to conspiracy-driven news sources, whose sensationalist claims are then covered by the mainstream media, which exposes more of the public to these ideas, and so on."
• Trump's War on "Fake News" Is Chillingly Real (Mark Follman, Mother Jones, 4-29-17) Dismissing his petty, hostile attacks on the media could be dangerous. Not too long ago, it was about as interesting and informative to tune into CNN as it was to watch a home shopping channel. But even though the cable network has shamelessly used duplicitous Trump surrogates to juice conflict and thereby ratings for its commentary shows, CNN reporters and anchors have come alive in the face of Trump...But journalists can also keep the faith given what history has shown about this battle. “Four hostile newspapers,” as Napoleon once said, “are more to be feared than a thousand bayonets.”
• Why a divided America has united against the media (Gillian Tett, Financial Times Magazine, 7-14-17) Fake news is seeping into the mainstream press on occasion, partly because it makes such compelling clickbait. (To understand how this works, take a look at a brilliant study from New York's Data & Society project on "alt-right" groups such as 8chan; see previous link.)
• Google and Facebook Can’t Just Make Fake News Disappear (Danah Boyd, Wired, 3-27-17) "That’s the beauty of provocative speech: It makes people think not simply by shoving an idea down their throats, but inviting them to connect the dots."
• How do you deal with a problem like “fake news?” (Robyn Caplan, Points, Data&Society, 1-5-17) "[Fake news] consists of misleading headlines, deceptive edits, consensus-based truthmaking in communities like reddit or 8chan (i.e. pizzagate), or by the absorption of fake news by political figures, like Donald Trump, who have the power to make fake news newsworthy."
• What We Can Learn from Fake News (Angela Cochran, The Scholarly Kitchen, 11-15-16) "Facebook’s fake news problems started before election day when it replaced human curation of their “trending” news list with an algorithm....Facebook is not creating content, Facebook is distributing content. No one knows (or even cares) who created the content anymore."
"Fake news became such an internet issue that Google added a fact-checker function in the Google News results. It doesn’t always work." "It seems pretty clear that anyone who thinks “fake” news is not harming “real” news is sadly misinformed. Part of the problem is that newspapers gave away content in the digital revolution. Gave it all away. Now, no one wants to pay for it and an entire generation of “digital borns” never had to. The result is the shutting of papers and dwindling news staff."
• Fake News (FactCheck.org) "FactCheck.org is one of several organizations working with Facebook to help identify and label viral fake news stories flagged by readers on the social media network. We provide several resources for readers: a guide on how to flag suspicious stories on Facebook and a list of websites that have carried fake or satirical articles, as well as a video and story on how to spot fake news."
• How and why to spot and identify fake news (Great search links, Writers and Editors)
• The President Versus ‘Fake News,’ Again (Bret Stephens, NY Times, 6-29-17) "Yet before dismissing Trump’s rants as evidence of his mental state, it’s worth taking them seriously as proof of political acumen." Trump's confidence rating fell from 36% to 32% at a time when TV news had a 24% confidence level and newspapers 27%. "If nothing else, Trump has the bully’s cunning to pick on a target more unpopular than he is. And like a bully, he knows that his mark suffers the additional weakness of being susceptible to moral reproach."
• How Fake News Goes Viral: A Case Study (Sapna Maheshwari, NY Times, 11-20-16) "While some fake news is produced purposefully by teenagers in the Balkans or entrepreneurs in the United States seeking to make money from advertising, false information can also arise from misinformed social media posts by regular people that are seized on and spread through a hyperpartisan blogosphere. Here, The New York Times deconstructs how Mr. Tucker’s now-deleted declaration on Twitter the night after the election turned into a fake-news phenomenon."
• With 'Fake News,' Trump Moves From Alternative Facts To Alternative Language (Danielle Kurtzleben, NPR, 2-17-17) "Trump casts all unfavorable news coverage as fake news....Technology has been a big aid in Trump's quest to redefine fake news. With the help of Twitter and Facebook, language is, arguably, slipperier than ever."
• A 1964 Lesson in Fake News That Still Applies (Jim Dwyer, About New York, NY Times, 1-10-17) Released in 1964, before the presidential election, the self-published book None Dare Call It Treason by John A. Stormer "claimed disloyalty to the United States was rampant among elites at the highest level of government....Mr. Stormer stuffed his tale of conspiracy with anecdotes and examples of what he said was communist infiltration of school boards, courthouses, the State Department and the White House. Franklin D. Roosevelt was quoted as saying that some of his best friends were communists, with the Congressional Record cited as the source." English teacher Lawrence Schaefer heard boys talking about the book (which still appears to have a big following; see its Amazon page) and decided to use the book as a vehicle for teaching what we might now call "media literacy" (understanding how facts can be manipulated).
• Inside a Fake News Sausage Factory: ‘This Is All About Income’ (Andrew Higgins, Mike McIntire, and Gabriel J.S. Dance, NY Times, 11-25-16) A fake news article posted by Beqa Latsabidze, a Georgian. “My audience likes Trump,” he said. “I don’t want to write bad things about Trump. If I write fake stories about Trump, I lose my audience.” "Some analysts worry that foreign intelligence agencies are meddling in American politics and using fake news to influence elections. But one window into how the meat in fake sausages gets ground can be found in the buccaneering internet economy, where satire produced in Canada can be taken by a recent college graduate in the former Soviet republic of Georgia and presented as real news to attract clicks from credulous readers in the United States."
• A Call for Cooperation Against Fake News (Jeff Jarvis and John Borthwick, Medium.com, 11-18-16) Constructive suggestions for what the platforms—Facebook, Twitter, Google, Instagram, Snapchat, WeChat, Apple News, and others—as well as publishers and users can do now and in the future to grapple with fake news and build better experiences online and more civil and informed discussion in society. In brief (a sampling, and even these make no sense without reading the whole article, so please do!):
1. Make it easier for users to report fake news, hate speech, harassment, and bots.
2. Create a system for media to send metadata about their fact-checking, debunking, confirmation, and reporting on stories and memes to the platforms.
3. Expand systems of verified sources.
7. Recognize the role of autocomplete in search requests to spread impressions without substance.
9. Create reference sites to enable users to investigate memes and dog whistles.
10. Establish the means to subscribe to and distribute corrections and updates.
12. Stop funding fake news. "Google and Facebook have taken steps in the right direction to pull advertising and thus financial support (and motivation) for fake-news sites. Bing, Apple search and programmatic advertising platforms must follow suit."
13. Support white-hat media hacking.
14. Hire editors.
• Students Have 'Dismaying' Inability To Tell Fake News From Real, Study Finds (Camila Domonoske, The Two-Way, NPR, 11-23-16) "More than 80 percent of middle schoolers believed that 'sponsored content' was a real news story." "More than 30 percent of students thought a fake Fox News account was more trustworthy than the real one."
• Fake News Is Not the Only Problem (Gilad Lotan, Points, Data and Society, 11-22-16) Bias, propaganda, and deliberately misleading information are much more prevalent and do more damage.
• Design Solutions for Fake News (Eli Pariser, co-founder of Upworthy, on a hivemind Google Doc). "We're in brainstorm mode."
• Online, Everything Is Alternative Media (John Herrman, NY Times, 11-10-16)
• How We Got to Post-Truth There's never been so much to read and so many readers—and that's part of a much larger problem for politics. "There are lots of folks who are not prepared for this huge onslaught of information and figuring out how to deal with that," says Craig Silverman, who reports on fake news for BuzzFeed and revealed that teenagers in the Balkans were running fake news sites in support of Donald Trump to make money.
"The more obvious hoax news is fairly easy to identify and could be tagged as such on Facebook and Twitter. He also thinks Facebook’s algorithm could be tweaked to lessen the impact of fake news. For instance, if a story is being shared amongst a small pocket of ideologically similar users, perhaps the network could refrain from blasting it out to a wider audience."
• Inside Gab: The New Twitter Alternative Championed by the Alt-Right (Cale Guthrie Weissman, Fast Company, 11-18-16) A dispatch from the new censorship-defying social network in the wake of "the purge" of alt-right users on Facebook and Twitter this week.
• List of Fake News websites (Wikipedia)