This page will be this guide's "home base" for recent updates on misinformation and disinformation, media literacy, fact-checking and conspiracy theories.
A February 2024 study published in the Harvard Kennedy School's Misinformation Review found that debunking misinformation among fringe groups susceptible to it was not effective. Instead, focusing on limiting consumption of misinformation proved far more effective. This strategy works by exposing the unreliability of sources, which leads audiences to reduce their consumption of misinformation so as not to be misled. The research has broad implications for how societies combat misinformation.
And a June 2024 study published in Nature found that exposure to misinformation on social media is not nearly as widespread as has been reported, but tends to concentrate "among a narrow fringe with strong motivations to seek out such information." According to the study, the algorithms that determine what content a person receives in their social media feed actually "tend to push users to more moderate content and to offer extreme content predominantly to those who have sought it out." The study also found that an effective way of deterring misinformation distributed via websites is to compile and release lists of advertisers that purchase advertising on those sites. As with the study above, it concludes that the best way to counteract the effects of misinformation is to limit its consumption among fringe groups.
To be continued...
A selection of recent scholarship on various aspects of misinformation/disinformation, propaganda & conspiracy theories.
Galarza-Molina, R. (2023). Youth in the face of disinformation: A qualitative exploration of Mexican college students’ attitudes, motivations, and abilities around false news. Communication & Society, 97–113.
Gjerazi, B., & Skana, P. (2023). Impact of politically motivated fake news on public opinion: A case study of deliberate dissemination of disinformation. Balkan Social Science Review, 22(22), 365–383.
Petratos, P. N., & Faccia, A. (2023). Fake news, misinformation, disinformation and supply chain risks and disruptions: Risk management and resilience using blockchain. Annals of Operations Research, 327(2), 735–762.
Weaver, R. L. (2024). Remedies for “Disinformation.” University of the Pacific Law Review, 55(2), 185–208.
A selection of recent magazine, newspaper, web and trade journal articles about misinformation/disinformation, propaganda & conspiracy theories.
Arnott, C. (2024, June 30). In a year of global elections, how do we stop the spread of misinformation? 'Prebunking' is part of the solution. The Conversation.
Bucktin, C. (2023, April 13). £1.3bn fake news trial 'could ruin Murdoch': Tycoon admits Fox TV pushed US election conspiracy theories. The Daily Mirror.
DiMolfetta, D. (2024, March 20). US sanctions Kremlin-backed firms for operating network of fake news sites. Nextgov.com.
Morris, R. (2023). The shocking truth about fake news: Your social media feed is an easy place to get news, but you shouldn't believe everything you see. Here are four tips to avoid falling into a misinformation trap. Scholastic Choices, 38(6), 6–11.
Researchers have found that trusted messengers and grassroots organizing are critical to halting the spread of misinformation and fostering community engagement, especially in communities that have limited access to internet, wi-fi and library resources. Marginalized communities in particular can benefit from investments in "human networks and local infrastructures that can offer reliable information and support democratic participation." In addition, new research related to COVID-19 misinformation suggests the power of this type of outreach to counter conspiracy theories.
Sources:
Conference presentation/discussion, INFORMED 2024/The Knight Foundation. Sessions from INFORMED 2024 are available to view online; previous conference sessions are also available.
Lalani, H. S., DiResta, R., Baron, R. J., & Scales, D. (2023). Addressing viral medical rumors and false or misleading information. Annals of Internal Medicine.
Brand new from ContentCredentials.org: a tool that lets you upload an image to inspect its content credentials in detail and see how it has changed over time. It's not foolproof, but it works well alongside reverse image search for determining details about the history and usage of an image.
Research into how to curb the spread of misinformation, disinformation and conspiracy theories is ongoing and plentiful. Sander van der Linden of the University of Cambridge and Stephan Lewandowsky of the University of Bristol are just two of the growing number of behavioral scientists researching this important topic. Below is some of their recent publishing, which includes important new discoveries about the techniques of prebunking and debunking misinformation. According to the latest research, both methods have merit, but prebunking (essentially inoculating people against misinformation before they encounter it) is gaining traction as an important tool. Read more on FirstDraft.
Van der Linden and colleagues have developed a new game called Bad News that highlights the tactic of prebunking. Visit the Tips and Tools page of this guide and check it out!
Karlsson, L. C., Mäki, K. O., Holford, D., Fasce, A., Schmid, P., Lewandowsky, S., & Soveri, A. (2024). Testing psychological inoculation to reduce reactance to vaccine-related communication. Health Communication, 1–9.
Lewandowsky, S., & van der Linden, S. (2021). Countering misinformation and fake news through inoculation and prebunking. European Review of Social Psychology, 32(2), 348–384.
Traberg, C. S., Roozenbeek, J., & van der Linden, S. (2022). Psychological inoculation against misinformation: Current evidence and future directions. The ANNALS of the American Academy of Political and Social Science, 700(1), 136–151.
June 13, 2024 - Bad news out of Palo Alto, CA - the Stanford Internet Observatory (SIO), an interdisciplinary program for the study of abuse in information technologies, is shutting down operations. The groundbreaking project, led by data science expert Renée DiResta and others, has been the target of a "sustained and increasingly successful campaign...to discredit research institutions and discourage academics from investigating political speech and influence campaigns," and has been sued three times by groups claiming the SIO "colluded" with the federal government to censor speech.
UPDATE: The University of Washington Center for an Informed Public (CIP), which has been partnering with the SIO, announced on June 14 that their work on election-related misinformation and disinformation will continue. “Our UW team has been doing research on online rumors and disinformation campaigns for over a decade, and that work will continue. In particular, we are currently conducting and plan to continue our ‘rapid’ research — working to identify and rapidly communicate about emergent rumors — during the 2024 election,” said Kate Starbird, co-founder and faculty director of the CIP.
DiResta, who left Stanford in early June, is the author of the newly-released Invisible Rulers: The People Who Turn Lies into Reality. Please see the Social Media & Conspiracy Theories materials on this guide for more information on DiResta and her work, and stay tuned for updates on this topic.
Source: Casey Newton & Zoe Schiffer, Platformer
A June 2024 study by NewsGuard finds that several major chatbot platforms including ChatGPT have been spreading Russian propaganda. NewsGuard co-CEO Steven Brill recommends that "for now, don't trust answers provided by most of these chatbots to issues related to news, especially controversial issues."
Recently, NewsGuard has come under fire from members of the U.S. House Oversight Committee (see Stanford Internet Observatory box in this guide for more details). This is a developing story - watch this space.
New from PBS.org: lesson plans for the classroom! Each lesson plan comes with an accompanying video that explains the issue and shows students how to take action. The program is geared toward teens and young adults, and it includes modules on how to fact-check and analyze online information to determine its credibility, how to avoid internet scams, how to navigate the internet using media literacy skills, and how to produce a fact-check video. Check it out!
In summer 2023, the University of Rhode Island Media Education Lab hosted a series of events called Courageous Conversations. The goal was to gather data and insight into why and how misinformation and conspiracies are believed and spread, and what we can do to effectively tackle this problem via interpersonal communication. In addition to its professional development opportunities, curriculum and recommended reading list, the Lab has continued offering these conversations, and representatives are always available for in-person and virtual presentations.
The Courageous Conversations events resulted in a fascinating data portal, which hosts interactive tools suitable for researchers, students and concerned citizens. Give it a try!
In June 2024, misinformation watchdog NewsGuard identified more than 800 websites that use A.I. to produce "unreliable news content." The sites often have bland-sounding names modeled after those of actual news outlets, and they distribute articles in multiple languages that might be mistaken for the work of human writers.
In June 2024, the U.S. Surgeon General published an article in the New York Times explaining his conclusion that social media platforms should be required to carry warning labels, just as tobacco products and other regulated goods are.
"Why is it that we have failed to respond to the harms of social media when they are no less urgent or widespread than those posed by unsafe cars, planes or food? These harms are not a failure of willpower and parenting; they are the consequence of unleashing powerful technology without adequate safety measures, transparency or accountability." - U.S. Surgeon General Vivek Murthy
We are sure to hear more about this proposal and other possible ideas on how to reduce the harm that social media platforms may cause to children and young adults. Watch this space...
In July 2023, the Observatory on Social Media at Indiana University published a new tool called Top FIBers, a dashboard of disinformation "superspreaders" on Facebook and Twitter. Users can view data by month, platform, individual account or overall.
According to information scientist Mike Caulfield, "The latest AI language tools are powering a new generation of spammy, low-quality content that threatens to overwhelm the internet unless online platforms and regulators find ways to rein it in." Caulfield believes it's essential for tech platforms to mitigate AI spam before they become completely unusable. Stay tuned.
Hoffman, B., & Ware, J. (2024). God, guns and sedition. Columbia University Press.
Newitz, A. (2024). Stories are weapons: Psychological warfare and the American mind. W.W. Norton & Co.
Thagard, P. (2024). Falsehoods fly: Why disinformation spreads and how to stop it. Columbia University Press.
Weiss, A. (2024). Counterfact: Fake news and misinformation in the digital information age. Rowman & Littlefield.
From our catalog: