War in 2022 does not only involve combat boots on the ground. In the wake of Russia’s invasion of Ukraine, there have been countless cyberattacks all over the globe, even here at Olin. The United States and European Union have used sanctions, financial and travel limits, and other economic levers to put pressure on Vladimir Putin’s government. But one of the most deeply unsettling and dangerous fronts in this war is being fought in the form of (dis)information. In this article, I’ll explain some of the reasons why disinformation has become a virulent social problem in the United States and offer tips on how to be a more mindful consumer of what you read online. This is an incredibly complex issue, so for more reading on the topic, please see the resources linked throughout this piece.
The public spending decisions in the U.S. that have impoverished schools, libraries, institutes of higher education, and more – combined with declining trust in the government and the mainstream media – have created a fertile environment for disinformation to spread. With even trusted organizations like the CDC walking back some of their guidance during the COVID-19 response, it’s legitimately difficult to know who or what to trust. Disinformation also fills a social gap. Former QAnon adherent Lenka Perron told the New York Times in 2021 that, feeling abandoned by politicians, ignored by the media, and lonely in her life, she found emotional support among Q believers. Stories like Perron’s demonstrate that the response to disinformation can’t only be teaching people how to better evaluate the news. People are not seeking the truth so much as they are seeking validation of existing beliefs and community support.
Disinformation researchers and librarians also blame the rise of social media platforms using algorithms that promote the most incendiary and divisive voices. Big Tech dominates the information landscape with billions of users, creates uncontrolled vectors of “fake news,” and undermines everyone’s ability to thoughtfully consume information. Educators are simply not equipped to combat these issues when advertising and social media giants like Facebook and YouTube design their algorithms to encourage maximum engagement rather than accuracy or reliability. While some platforms are finally attempting to squelch disinformation, corporations should not be allowed to serve as the sole arbiters of speech in a democracy.
Worsening economic conditions, widespread fear and loneliness, the engagement-driven algorithms of Big Tech, and defunded educational institutions have created a serious problem that must be fought on multiple fronts. All of this is compounded by a deluge of calculated disinformation tactics used by actors with nationalist interests and a desire for global destabilization. These tactics are the product of decades-old state-sponsored disinformation campaigns in Russia, described by one KGB defector as having a goal of changing “the perception of reality of every American to such an extent that, despite the abundance of information, no one is able to come to sensible conclusions in the interest of defending themselves, their families, their community and their country.”
Disinformation about Ukraine isn’t just coming from Russian intelligence agencies and rogue agents, though. On the one hand, you have a master propagandist in Putin, unabashedly playing the victim even as he orchestrates aggression, using his administration to spin tales of Ukraine’s President Volodymyr Zelenskyy and his “Nazi guys,” as a Russian politician repeatedly said in a recent interview with BBC Newshour. On the other, you have the French media outlet that published a moving video of a Ukrainian girl confronting a Russian soldier…that was actually a Palestinian girl confronting an Israeli soldier in 2012. This is not to establish a false equivalency between the significance of these stories, but to make the point that no matter who we support, our very human confirmation bias diminishes our ability to evaluate information.
Shifting gears to techniques we might use for assessing the information we see online, I want to start by invoking the SIFT method we teach in library instruction sessions at Olin. SIFT is a four-step process for learning to think like a fact checker: Stop, Investigate the source, Find better coverage, and Trace claims back to their original context. We usually teach it in the context of the much slower-moving and deliberate process of research, rather than of assessing stories shared through chat or social media, but the core strategies still apply.
Stop
News is generated and spread in such an overwhelming and lightning-fast manner in 2022 that it is disorienting and tough to keep up with even in “slower” news cycles. The first step of SIFT is to stop and think – what are you even looking at? If you’re tracking stories from many miles away in areas you don’t have much familiarity with, you are going to be inherently limited in your ability to understand what’s going on. You may also be overly emotional while you’re reading or watching, and that can sway your interpretations. It’s okay to recognize you may not be able to follow certain kinds or sources of news. The next step in the process can help you find ways to stay up to date while acknowledging your limitations.
Investigate the Source
There are many kinds of sources visible on the web these days, not just encyclopedias, newspapers, and research articles. Independent writers and freelance journalists can be critical, trustworthy eyewitnesses during events, sharing their firsthand experiences as they happen. Unfortunately, there are also fake accounts, bots, and spammers to watch out for. Mike Caulfield, misinformation researcher and one of the creators of SIFT, prompts us to ask if a source is “‘in the know’ — do they have *significantly* above average knowledge of a situation because of expertise, profession, life experience, or location?” He also asks us to consider a source’s personal and professional incentives, and to wait for better sources or more verification of developing news stories rather than rushing to share them the moment we find them.
Find Better Coverage
This step is a close partner with “investigate the source.” It’s critical to be extra careful when dealing with contexts you may not be familiar with because of your geographic location, upbringing, or other limitations of perspective. Ukraine is a country of over 40 million people; it has numerous mainstream media outlets, and most Americans need to do quite a bit of homework to learn which ones are reliable. It’s also important to recognize when finding better coverage matters more than investigating a single source. “If you get an article that says koalas have just been declared extinct from the Save the Koalas Foundation, your best bet might not be to investigate the source, but to go out and find the best source you can on this topic,” Caulfield suggests, “or, just as importantly, to scan multiple sources and see what the expert consensus seems to be.”
Trace Back to the Original Context
In the final move of SIFT, we acknowledge that the internet strips images and words of their original context. You might see the middle minutes of a video, hear audio edited to change the speaker’s intended meaning, or see an article reference a medical study and describe its conclusions inaccurately. In these cases, try to find the original, undoctored source or the cited study, though it may not always be possible. When it isn’t, try to let it go. Sharing something when you have only 20% of the story, or an interesting but incorrect interpretation of it, is rarely your best option.
SIFT is not the solution to disinformation. Disinformation is a complex and entrenched problem in the U.S. exacerbated not only by slashed education budgets, crumbling public infrastructure, and social media giants with too much power, but also by state-sponsored or independent actors who are deliberately working to destabilize trust in democracy. It’s not something that any one individual can solve. That said, learning how to start thinking like a fact checker is one action we can individually take to help today. This article only begins to unpack small parts of the disinformation ecosystem, but a better understanding of how we got here can inspire us to work on rebuilding the support systems we have lost.