Modern media is eroding our ability to think straight.
Last Christmas my daughters gave me the Storyworth app, which sends me weekly prompts to write about different aspects of my life. Recording stories of my life for my kids and grandkids is a great idea, and I am happy to do it.
This week’s question is a difficult one and gave me pause: “How has the country changed during your lifetime?”
There are so many ways I could try to answer that question. But the topic foremost on my mind is the dramatic evolution of media platforms. During my lifetime, media dominance has shifted from radio, to TV, to computers, and on to the Internet, social media, smartphones, and artificial intelligence. I worry the country is worse off because of the way media has evolved and proliferated; it has made us more vulnerable to propaganda and disinformation. I worry that modern media is eroding our ability to think straight.
In the 1960s, when television was king, philosopher Marshall McLuhan observed that “the medium is the message.” The idea is that a media platform is more than a neutral conduit of information. The structure of the platform itself, and the specific way it disseminates information, has just as profound an impact on people as the content does. Over the past century, the proliferation of media platforms has given propagandists a broad range of tools with which to spread their disinformation.
Now, more than ever, propagandists use the nature of the media itself as a mechanism of mind control. In 2018, Steve Bannon, who had served as chief executive of Donald Trump’s 2016 campaign and later as White House chief strategist, told a reporter that “The Democrats don’t matter. The real opposition is the media.” Bannon’s strategy for neutralizing legitimate news coverage was to “flood the zone with shit.”
Modern-day propagandists use the fragmented media landscape to overwhelm the public with a firehose of disinformation. Historian Heather Cox Richardson says that both foreign and domestic propagandists disguise their wrongdoing and malicious intent by overwhelming people with outrageous and often contradictory information. The goal is to cognitively exhaust their audiences and dissuade them from even trying to figure out what is real. People willingly expose themselves to nefarious content simply because it is embedded in an entertaining package and delivered by stimulating media celebrities.
Richardson makes an interesting distinction between misinformation and disinformation. Misinformation is simply information that is incorrect. We all make mistakes or come to conclusions that we later discover to be mistaken. But when our error is realized, we strive to replace the erroneous information with facts and ideas that are more accurate and adaptive. The objective is to improve our understanding of the world.
Disinformation, on the other hand, is “a deliberate lie to convince people of things that are not true.” Authoritarian propagandists use disinformation, the “big lies,” to undermine the ability of individuals to think clearly or to trust the idea that truth even exists. They deploy disinformation to destabilize the social norms and undermine the social institutions that have been constructed to support rational discourse and to protect democratic principles.
This authoritarian playbook was articulated, and deployed demonically, by Adolf Hitler. Richardson reports that the U.S. Office of Strategic Services (OSS), the precursor of the CIA, summarized Hitler’s techniques as follows: “His primary rules were never allow the public to cool off; never admit a fault or wrong; never concede that there may be some good in your enemy; never leave room for alternatives; never accept blame; concentrate on one enemy at a time and blame him for everything that goes wrong; people will believe a big lie sooner than a little one; and if you repeat it frequently enough people will sooner or later believe it.”
This sounds disturbingly familiar.
Modern-day propagandists can choose from a much broader array of media platforms than Hitler could, and they are taking full advantage of the expanded opportunities to spread disinformation. Arizona Senator Mark Kelly (D), who serves on the Senate Intelligence Committee, estimates that Russia, Iran, and China now generate between 20 and 30 percent of the political content and comments on social media. MAGA Republicans are making things worse by willingly echoing foreign propaganda to their constituents.
The largest purveyors of disinformation, however, are homegrown. Elon Musk is a prime example. The billionaire purchased the social media platform X (formerly Twitter) to distribute his own self-serving brand of disinformation without restraint.
Our media landscape will continue its rapid evolution. It is likely to become increasingly fragmented and niche-oriented and, as such, more susceptible to misuse by malevolent propagandists. So, how can we protect ourselves from pervasive and powerful disinformation?
We can become familiar with standard disinformation techniques, recognize when they are being deployed, and continue to express moral outrage at their duplicity and evil intent. Hitler’s rules for authoritarian disinformation are a good starting point. Examples abound in our current political climate and we should all be outraged and act accordingly.
Authoritarian propagandists use disinformation to mess with our minds. They want our minds to degenerate into credulity or cynicism. They use a flood of disinformation to turn our minds into mush, so pliable that we believe whatever nonsense they feed us. If that doesn’t work, they want us to become so overwhelmed and disenchanted that we give up, lose faith in cooperative and collaborative action, and trust no one but the designated authoritarian father figure, the cult leader, the strongman.
We can guard against credulity and cynicism by cultivating compassionate, open-minded skepticism: a mindset that is comfortable with ambiguity and uncertainty, that encourages careful evaluation, intellectual flexibility, and curious inquiry in the service of the common good. We can model compassionate skepticism for our grandchildren and prepare them, as best we can, to recognize and deflect the disinformation that is likely to fill the media landscape of their future.
DISINFORMATION TRICKS
Learn to identify common tricks used to spread disinformation:
AI (artificial intelligence) and Deepfakes: Highly believable AI-generated photos, videos, or audio clips show people saying and doing things they never said or did. Check with PolitiFact or Snopes.com to verify.
Fake News Sites: These sites and reports look like news but are really propaganda. Don’t Google to verify weird stories—you will just find the same or similar propaganda articles. Check credible sources like Snopes.com, the BBC, The AP or PBS.
Astroturfing: Fake comments, blog posts, and news articles repeat lies and conspiracy theories to make them seem true and popular. Ignore and don’t respond.
False Equivalence: These are comparisons that sound plausible but don’t really make sense. [“Taxes are like armed robbery. They both take your money by force.”] They are designed to get us to agree without thinking it through. Take the time to think it through.
Attacking the Person, Not the Issue: This trick deflects attention away from the real issues and triggers emotional responses that can cloud our judgment. Bring your attention, and the conversation, back to the issue.
Rage Farming: Outrageous and offensive statements are designed to infuriate us, capture our attention, and get us to respond, expanding the reach of the offensive lies. IGNORE THEM.
Lying With Science: Science is hard to interpret. Liars use confusing science jargon and bad research to support their false claims. Check with multiple science experts.
The D.Y.O.R. Trap: Liars will bolster their lies by challenging us to D.Y.O.R.—“Do Your Own Research”—knowing most of us won’t. If you do, don’t just Google outrageous statements. Google a conspiracy theory and you will find more conspiracy theories. Fact check with FactCheck.org or Snopes.com.
False Choice: These are misleading either/or constructions such as “Do you want to save the climate, or save the economy?” Reject either/or scenarios. We can have a strong economy and protect the environment.
Cherry Picking: This trick supports a lie by using a carefully chosen bit of data and ignoring the rest of the story. “It snowed in April. So much for climate warming!” Is it data, or just a story (anecdotal)? Put things into context and look at the complete picture.
Michael C. Patterson had an early career in the theater, then worked at PBS, developing programs and systems to support the educational mission of public television. Patterson ran the Staying Sharp brain health program for AARP, then founded MINDRAMP to continue to promote physical well-being and mental flourishing for older adults. He currently explores these topics on his MINDRAMP Podcast and his Synapse newsletter. His website is www.mindramp.org.