Stefan Rollnick

Disinfo Briefing: You are vulnerable to sharing fake news, here's how

This article is from my fortnightly Disinfo Briefing. If you'd like to subscribe, you can do so here.


New research: Why do people fall for fake news?

In brief: A new study from Nigeria looks at some of the cognitive processes behind fake news sharing (specifically Covid-19 fake news), allowing us to draw conclusions about which interventions might interrupt those processes.

Why it's important: Firstly, too much of our knowledge comes from studies in the Western world, and by adding another country-based case study we can start to determine which findings hold across all countries and which vary between cultures.

Secondly, while social media companies need to take responsibility for stemming the spread of false content (something the study explicitly mentions), we simultaneously need to be protecting the public against bad information. Big Tech action is like quarantining and social distancing; educating the public is like the vaccination program.

How the study worked: The researchers used two lenses to approach this problem. First up was something called Affordance Theory, which centres around how social media's functionality (e.g. likes, comments, shares) impacts the spreadability of fake news. The second was Cognitive Load Theory, which looks at how the bombardment of information we get from social media is processed by our limited brain capacity.

What they found: They were able to rank the key factors behind what makes people vulnerable to fake news on social media, giving us some hints at what might be able to stem the spread. Here are some of their most interesting findings, in order of most to least significant:

1) "News-find-me" thinking: News-find-me thinking is our tendency to believe that we no longer have to seek out the news independently: that purely through passive exposure to social media we can inform ourselves about the world. This is how many of us get our news now, in what experts call the larger "social flow" of information (e.g. An article about the latest Covid numbers sandwiched on your newsfeed between an argument about a local housing development and a photo of your new baby nephew). This passive information absorption leaves us more vulnerable to internalising disinformation.

Potential fix: Help people to decouple their "news feeds" from "actual news".

2) Information overload: Social media is awash with ambiguous, imprecise and downright false information, and the sheer volume of it can overwhelm our internal processing. Interestingly, previous studies have also found that information overload can reduce social trust between individuals, pointing to it as a potential contributor to the decline in political discourse we've seen over the last ten years. These researchers found information overload to be the second strongest predictor of fake news sharing.

Potential fix: Encourage people to rely on a small number of mainstream sources.

3) Online information trust: "Don't believe everything you read on the internet". A common refrain that many of us claim to adhere to, but have you ever found yourself explaining an interesting fact to someone, only to fail to recall exactly where you saw it? This may be a result of passive absorption of information as you frantically navigate your way through social media – and it leaves you vulnerable to fake news.

Potential fix: Encourage people to be more generally sceptical of what they see on social media.

4) Status seeking: This is one of the really interesting punchlines of the research. The researchers predicted – based on previous studies – that social media users' desire to be seen as informed and reliable would reduce the chances of them sharing fake news, as they would take more care to curate their public image. But they were wrong: an increased desire to seek status on social media actually increased the likelihood of sharing fake news. The researchers put this down to individuals hastily sharing updates so they can be seen to be "in the know" without checking the details first.

Potential fix: Create a greater perceived social cost to sharing bad information.

Analysis: This study is largely built on the assumption that social media users live in a shared factual universe, but as we've seen across the world – that's not always true. Stoked by politicians, large groups of people can suspend belief in traditionally universal facts (e.g. the outcomes of elections) because of their identity-based commitment to those politicians.

A committed anti-vaxxer is unlikely to stop spreading fake news just because there is a perceived "social cost" to doing so: in their mind they are sharing important facts, so the real social cost would lie in not sharing them.

This doesn't undermine the research, but it does limit its relevance to the well-meaning, often anxious, folks in the "middle" of public debate, rather than those on the extremes.
