The Misinformation Epidemic

This is the second in a series of blog posts on Social Studies by Dr. Sabrina M. Weiss. You can find previous blog posts in this series linked here:
Part 1: Social Studies and Civics in Crisis

Misinformation has become a serious threat to our society, and experts are finding it difficult to manage and avoid. Misinformation is defined as “false or inaccurate information.” A related concept is disinformation, defined as “false information deliberately intended to mislead.” Everything discussed here involves a mixture of the two, but because the intent behind misleading information can be difficult to determine, “misinformation” will be used here as an umbrella term.

Why is misinformation so dangerous?

Worryingly, the explosion of social media and misinformation threatens our democratic society and processes, something that people in my generation (and probably yours) didn’t have to navigate because we grew up before the era of widespread social networks whose algorithms encourage the spread of misinformation with “clickbait.” It has become very hard to know what information is valid and which sources are trustworthy – a far cry from being able to trust the Nightly News on one of three television networks.

There are four aspects of misinformation that present a big challenge for us as educators and mentors to young people.

First, researchers have often misunderstood how people come to accept and spread misinformation. Previously, it was thought that misinformation took hold because there was not enough accurate information, and that simply educating people with “good information” would block adoption of “bad information.” That is not actually the case: it is now recognized that a combination of cognitive factors (like relying on a “gut feeling”) and social factors (like wanting to belong to an identity group) plays a larger role in the spread of misinformation. People often accept misinformation because it comes from someone they trust, or because it sounds attractive and appeals to an intuitive sense. Challenging a misinformed belief then challenges one’s circle of trusted people, or one’s inner sense of what is right, making it threatening and hard to dispel.

Second, misinformation is not easily removed from our minds and memories. In fact, engaging with misinformation and the people who spread it can actually promote its acceptance because of how our brains tend to work in social settings. Through the “continued influence effect,” our brains may still believe false information even after it has been corrected with accurate information; the false belief is not simply replaced by the true one, but instead the two coexist in our memory and compete to be recognized. Basically, misinformation is “sticky” and hard to remove, even when we accept correct information. This means that tactics we’ve used in the past, like critical thinking or fact checking, often don’t work with today’s misinformation and can even cause it to become more entrenched.

Third, it is very difficult to avoid exposure to misinformation because social media networks and platforms easily become “infected” with it, and it is spread both by bots and by unwitting carriers (sometimes through viral memes or sharing). In fact, social media platforms (like Facebook and Twitter) have algorithms that push controversial content at users to increase engagement (even negative attention is desirable attention). Even when individual users are conscientious about fact-checking before sharing, their positive influence is far outweighed by the negative influence of the platforms themselves.

Fourth, misinformation and radicalization go hand in hand, leveraging the social identity aspect of misinformation adoption to spread extremist ideologies. If someone feels that those around them are rejecting them because of a concern they hold about an issue (one they received misinformation on), they are more likely to seek out people with similar beliefs who are supportive of, rather than challenging to, their ideas. The same pathways that spread misinformation also spread bigoted ideas, sexist and misogynistic rhetoric (see: Red Pill), and nationalistic and xenophobic positions. Young people (especially boys and young men) are being specifically targeted through popular social media platforms and communities and drawn into increasingly dangerous social groups and belief systems involving misogyny, racism, and neo-Nazi beliefs.

What can we do about it?  

This is a question that experts, researchers, and policy makers are actively working on. Just recently, a judge issued a ruling severely restricting the ability of federal agencies to contact and work with social media platforms to reduce misinformation. That kind of cooperation is an important part of fighting misinformation because, as we’ve discussed, individual consumers of social media have limited power to stop it when platform-wide algorithms intentionally spread it.

As an educator, my focus is on finding and sharing tools that can help our learners more effectively engage with the world.  One of these is the S.I.F.T. technique, coined by Mike Caulfield, a researcher at the University of Washington’s Center for an Informed Public. Caulfield explains that traditional critical thinking skills can actually be counterproductive:

“The goal of disinformation is to capture attention, and critical thinking is deep attention,” he wrote in 2018. People learn to think critically by focusing on something and contemplating it deeply – to follow the information’s logic and the inconsistencies. That natural human mind-set is a liability in an attention economy.

Instead, he recommends that we:

1. Stop.

2. Investigate the source.

3. Find better coverage.

4. Trace claims, quotes and media to the original context.

This approach helps us to disengage from the particular platform (which may be trying to trap us into digging deeper there) and to use another resource to corroborate or debunk the ideas.  Knowing about reliable and reasonable resources that can be used to evaluate claims or questionable sources is essential.  But even more important is being able to stop oneself from falling down the rabbit hole. 

Another important skill is building up one’s own bank of knowledge by reading and watching credible news sources. Finding multiple reliable and trustworthy sources of information, along with transparent expert sources (such as think tanks that openly describe their missions and ideologies), helps give us a good starting point for recognizing when we might be confronted with misinformation.

What are some experiences you have had with misinformation, disinformation, or radicalization? Have you discovered that a friend or family member has fallen into a trap of believing misinformation? Were you able to help them get out of it?

What are some things you would like to learn about fighting misinformation?  What kinds of topics would you like to see for your learners in this area?  Feel free to leave a comment or send one to us here.

Dr. Sabrina M. Weiss is an instructor with Dayla Learning who specializes in teaching ethics, social issues, and interdisciplinary science and technology.