
‘If misinformation does behave like a virus, then we can also create a vaccine’

Sander van der Linden studies how and why people share misinformation – research that he outlines in detail in his new book, Foolproof (Fourth Estate). Interview by science writer David Robson.

15 May 2023

At the centre of your thesis is the idea that misinformation acts like a virus of the mind. In what ways is this an apt metaphor?

When they hear about my work, some people think this is just a catchy metaphor that came out of the times we’re living in – but I wrote most of the book before the pandemic.

People have been studying how information behaves like a virus for a long time, and it’s interesting how literal that analogy is. We can use models from epidemiology with little or no adaptation, and they work really well in explaining how misinformation spreads. And then, on the belief level, there are analogies to the ways viruses attack host cells; they take over some of the machinery with the goal of reproducing themselves. I think that happens to some people who get so consumed by conspiracy theories that it takes over part of their cognition. Their memory and perception can be distorted, and it alters the way they behave so that they reproduce the misinformation.
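
As a rough illustration of how literally the epidemiological analogy can be taken, the classic SIR (‘susceptible–infected–recovered’) equations can be applied almost unchanged to a spreading claim, with ‘infected’ read as ‘currently sharing the misinformation’. The sketch below is not van der Linden’s own model; the parameter values are arbitrary and chosen only to show the shape of the dynamics.

```python
# Toy SIR model of a spreading rumour (illustrative only; parameters are made up).
def simulate_sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, days=120, dt=1.0):
    """Euler integration of dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I, dR/dt = gamma*I."""
    s, i, r = s0, i0, 1.0 - s0 - i0
    history = []
    for day in range(days):
        new_infections = beta * s * i * dt   # contact with current sharers 'infects' the susceptible
        new_recoveries = gamma * i * dt      # sharers eventually stop spreading the claim
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((day, s, i, r))
    return history

# Print the epidemic-style curve of the rumour every 20 simulated days.
for day, s, i, r in simulate_sir()[::20]:
    print(f"day {day:3d}  susceptible={s:.2f}  sharing={i:.2f}  no longer sharing={r:.2f}")
```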

The more positive aspect is that if misinformation does behave like a virus, then we can also create a vaccine.

Why is inoculation necessary?

Debunking and fact-checking are good, but they have limitations; once people have already been infected, it’s very hard to undo the damage. That’s why we have focused on inoculation.

In a medical vaccine, you expose people to a weakened or inactivated strain of a virus to try to trigger the production of antibodies to help confer resistance against future infection.

The amazing thing, to me, is that you can do the same with information. You expose people to a weakened dose of the misinformation, and then – this is the crucial part – you help people’s psychological immune system to recognise and neutralise it.

Forewarned is forearmed, so you have to tell people that there are actors out there trying to manipulate them, and that they use certain strategies to achieve that goal. And once you have people motivated to pay attention, you then give people the tools to deal with the misinformation and recognise why it is wrong. For those who don’t like the medical metaphor, this process is sometimes known as ‘pre-bunking’.

There’s a lot of flexibility with this. You can do fact-based inoculations – where you tackle a specific claim – or technique-based inoculations that target the general strategies people use to spread misinformation.

So how do you go about inoculating people?

It’s possible to pre-bunk claims with simple text. For example, you can tell people that they’ll hear that there’s a lot of disagreement among scientists about climate change. Then we can explain that this is called ‘casting doubt on the consensus’ when in fact, 97 per cent of scientists agree on the causes of climate change. Then when people come across this tactic of sowing doubt, they've been inoculated against it.

A more interesting way of combatting misinformation is ‘active inoculation’. A lot of research shows that when people have agency and control over what they’re doing, they’re more likely to remember things and to use what they learn. And so we’ve created some games in which you play the role of a manipulator, and you find out how bad actors make use of six broad techniques that are used to spread misinformation, such as polarising people, using emotions to stoke fear and outrage, and floating conspiracy theories.

We know that conspiracy theories have certain ingredients, like there's always some evil actor working behind the scenes to dupe people. It's always casting doubt on the mainstream narrative. There's usually some persecuted victim in the story. And there’s usually a causal story created around random events. For example, they would often talk about the apparent link between 5G phone masts and Covid outbreaks, right? A third (non-conspiratorial) factor, population density, links the two – when there are more people there are more phone masts and more outbreaks. But conspiracy theorists can tie it into a nice causal story by falsely suggesting that 5G is causing Covid.
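
To see how a confounder like population density can manufacture an apparent link of that kind, here is a toy simulation (not taken from the book): density drives both the number of phone masts and the number of cases, so the two correlate strongly even though, by construction, neither causes the other.

```python
import random

random.seed(0)

# Simulate 1,000 local areas. Population density (the confounder) drives BOTH
# the number of 5G masts and the number of Covid cases; there is no causal
# link between masts and cases in this data by construction.
areas = []
for _ in range(1000):
    density = random.uniform(100, 10000)            # people per square km (made up)
    masts = 0.002 * density + random.gauss(0, 2)    # more people -> more phone masts
    cases = 0.01 * density + random.gauss(0, 10)    # more people -> more outbreaks
    areas.append((masts, cases))

def pearson(pairs):
    """Plain Pearson correlation coefficient, no external libraries."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    vx = sum((x - mx) ** 2 for x, _ in pairs)
    vy = sum((y - my) ** 2 for _, y in pairs)
    return cov / (vx * vy) ** 0.5

# High correlation despite zero causation: the 'nice causal story' is spurious.
print(f"correlation between masts and cases: {pearson(areas):.2f}")
```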

We found that you can take these ingredients and let people build their own conspiracy theory. And then, over time, it turns out that they can better recognise these building blocks in new variants of conspiracy theories that they haven't seen before. You can’t fact-check everything, but this technique-based inoculation gives a broader spectrum of immunity.

You’ve also worked with some big tech companies to roll out these misinformation inoculations. Tell me about your YouTube videos.

Google was concerned about extremists using misinformation techniques to lure people into YouTube videos. One common technique is the use of false dichotomies, which are often deployed to push people into more extremist modes of thinking. It’s like, either you join ISIS or you’re not a good Muslim, or we have to fix the homelessness problem before we can think about immigrants. To inoculate people, we show them a clip from Star Wars: Revenge of the Sith, in which Anakin Skywalker is talking to Obi-Wan Kenobi and says, ‘either you’re with me or you’re my enemy’. And then Obi-Wan replies, ‘Only a Sith deals in absolutes’. And nobody wants to be a Sith, right? You don’t want to deal in absolutes and be an evil person.

When we test people with real social media content, we find that this inoculation really helps people identify the use of false dichotomies.


Our examples are highly biased by our 80s and 90s upbringings, and so for TikTok, which has a different generation of users, we need something else. We’re always thinking of new ways of delivering the vaccine.

Has your work considered anxiety as both a driver and consequence of misinformation?

In the book I talk about the ‘six degrees of manipulation’ that producers of misinformation often use. The use of negative emotions (such as appealing to fear and outrage) is a popular tactic. For example, analyses of anti-vaccination rhetoric show that most anti-vaxx websites use negative emotional appeals to scare people.

Studies show that even in the mainstream media the use of negative emotions has been increasing over recent decades. In our own research, with millions of datapoints across both Twitter and Facebook, we’ve also found that negative emotional words significantly increase the viral spread of content on social media. Conspiracy theories in particular tend to prey on people’s anxieties (about a pandemic, a mass shooting, or global warming, say) by restoring a false sense of agency and control over the world. That often results in maladaptive coping strategies such as denial (for example, the belief that Covid-19 is a hoax or that mass shootings are false flag operations), harassment (such as trolling), and sometimes even violence (riots, vandalism) or death (refusal of medical treatment).

Do you think that inoculation techniques should now be incorporated into education?

Yes, absolutely – I think it’d be important to build resilience at the population level. We work with schools and universities, but we can only do so much, and I think it would be better to have a standardised national curriculum, where we prepare people for misinformation in advance.

The media often call you Cambridge’s ‘Defence Against the Dark Arts’ teacher. How do you feel about that title?

At the beginning, I wasn’t sure about it, but over time I’ve come to accept it, though my students have jokingly pointed out that in the Harry Potter books, most don’t last very long in the role. So I make that the end-point of my book: I’m still around for now, but just in case I’m not, here are some lessons that I want to pass on!

- David Robson also spoke with Sander van der Linden about this work in his award-winning feature for us in May 2020; and you can find more from Sander in our archive.
Visit Sander's website for more on the book.