In Pursuit of the Justice of Truth
Disinformation is spreading like a virus. The vectors include social-media platforms; the broadcast and print outlets of far-right pundits and pseudo-journalists; the speech and signage of Trump supporters; and Trump campaign rallies, which are covered by all news outlets, whether fact-based, progressive, or conservative. It has infected the American body politic to the point that we cannot have conversations with people in different political camps, because Truth itself has been obscured. And with it, justice.
According to the Washington Post, Donald Trump surpassed 20,000 false or misleading claims in July 2020, and by the end of October 2020 he was averaging fifty a day. “It took President Trump 827 days to top 10,000 false and misleading claims in The Fact Checker’s database, an average of 12 claims a day,” write the authors. There is now a Wikipedia page devoted to the “Veracity of statements by Donald Trump.”
What Are We Talking About When We Talk About Disinformation?
How do we begin to inoculate against disinformation? In a presentation on “Issues in Counter Deception” (January 2020), cybersecurity expert Sami Saydjari suggests we first consider some definitions:
Misinformation: unintentionally incorrect information
Disinformation: intentionally incorrect information as part of deception
Deception: actions (including spreading disinformation) taken to intentionally mislead
Information warfare: use of information and information technology to gain an advantage (includes deception)
Psychological Operations (PsyOps): actions using psychological methods to cause a planned psychological reaction
Influence operations: coordinated efforts to alter a target’s attitudes, decisions, and behaviors toward the influencer’s interests
Saydjari notes that “Uncertainty and antipathy are two favored tools in the disruption warrior’s toolbox.”
TO DOWNLOAD A TWO-PAGE SUMMARY OF GUIDELINES FOR COUNTERING DISINFORMATION, CLICK HERE
How Can We Inoculate Against Disinformation?
Luckily, we can counter disinformation. “Inoculation may work, but it must come from a trusted source,” says Saydjari, who adds, “It may be better to target the credibility of the source of misinformation, plant skepticism of a source.” Additionally, he points out, easier fact-checking may help, but people have grown distrustful of the news media. Saydjari also suggests reading David Lazer et al., who have published extensively on the science of fake news.
According to John Cook and Stephan Lewandowsky, co-authors of The Debunking Handbook, when attempting to debunk misinformation, “First, the refutation must focus on core facts rather than the myth to avoid the misinformation becoming more familiar. Second, any mention of a myth should be preceded by explicit warnings to notify the reader that the upcoming information is false. Finally, the refutation should include an alternative explanation that accounts for important qualities in the original misinformation.”
Thus, say Cook and Lewandowsky, “Bringing all the different threads together, an effective debunking requires:
CORE FACTS: A refutation should emphasize the facts, not the myth. Present only key facts to avoid an Overkill Backfire Effect.
EXPLICIT WARNINGS: Before any mention of a myth, text or visual cues should warn that the upcoming information is false.
ALTERNATIVE EXPLANATION: Any gaps left by the debunking need to be filled. This may be achieved by providing an alternative causal explanation for why the myth is wrong and, optionally, why the misinformers promoted the myth in the first place.
GRAPHICS: Core facts should be displayed graphically if possible.” See the annotated graphic below, an example of a debunking graphic.
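To make the checklist concrete, here is a minimal sketch, in Python, of how a fact-checking team might template its corrections so that every message keeps the handbook’s order: facts first, an explicit warning before the myth, then the alternative explanation, with an optional graphic. This is not drawn from Cook and Lewandowsky themselves; the `Debunk` class, its field names, and the rendering format are all hypothetical.

```python
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Debunk:
    """One correction message, structured per the debunking checklist."""
    core_facts: list[str]           # lead with the facts, not the myth
    myth: str                       # the false claim being corrected
    alternative_explanation: str    # fills the gap the myth would otherwise fill
    graphic_url: str | None = None  # optional chart displaying the core facts

    def render(self) -> str:
        # 1. Core facts come first and stay short (avoid the Overkill Backfire Effect).
        lines = ["FACTS: " + " ".join(self.core_facts[:3])]
        # 2. An explicit warning precedes any mention of the myth.
        lines.append("WARNING: The following claim is FALSE.")
        lines.append(f"MYTH: {self.myth}")
        # 3. The alternative explanation accounts for why the myth is wrong.
        lines.append("WHY IT'S WRONG: " + self.alternative_explanation)
        # 4. Point to a graphic of the core facts if one exists.
        if self.graphic_url:
            lines.append(f"SEE THE CHART: {self.graphic_url}")
        return "\n".join(lines)
```

The value of a template like this is mechanical: the warning and the alternative explanation cannot be skipped, and the myth can never appear before the facts.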
The bottom line: it matters how we word—and depict with images—the messaging used to counter disinformation. Stay tuned to the 3.5% Blog for more on this very important topic.
Some Additional Resources:
Washington Post article from October 28, 2020, “Harvard Teaches How to Detect Misinformation Campaigns”
Center for Humane Technology’s Ledger of Harms
Warped Reality, a TED Radio Hour program on disinformation and technology