Blog · 24th March 2022

What works in disinformation wars?

As the war in Ukraine escalates, Russia is again using a capability it has been perfecting for many years: disinformation. The Russian government is taking full advantage of its state-owned media outlets to spread false information about what Russian officials refer to as a “special operation” in Ukraine. Among the false reports are claims from President Putin himself regarding the motivation for the operation – namely, to “protect” the people of Ukraine from “drug-addicted, neo-Nazi” leaders.

The Russian disinformation machine has been dangerously effective in the past. It disrupted the US election in 2016 and helped sow concerns about COVID vaccines. But so far at least, Russian disinformation seems to have been much less effective in the current conflict in Ukraine. Why?

One factor is that the US and its NATO allies are prebunking Russian disinformation. The US government released intelligence findings that would typically be classified, revealing its understanding of Russia’s military plans and the disinformation campaign it would wage – for example, exposing planned false flag attacks and unfounded allegations of chemical weapons use by the Ukrainians before Russia could disseminate them.

This prebunking approach appears to stem from inoculation theory. The technique was developed by psychologist William McGuire in the 1960s in response to US prisoners of war in Korea being successfully “brainwashed” because they were unprepared for exposure to critical views of America. Since then, a growing body of evidence has shown that pre-empting false narratives – debunking lies before they are shared – is more effective than trying to counter them after they have spread. The approach has recently proved effective in improving resilience to misinformation related to COVID-19 and climate change.

Prebunking is effective, but it is not always possible. What can we do about false information once it has already been released into the world?

Accuracy prompts: It is estimated that people spend around two seconds engaging with content on social media before sharing it, and as a result people often share misinformation unintentionally. Recent studies have shown that an effective way to reduce susceptibility to misinformation is to ask people to slow down for a moment and engage with the content before sharing it.

In a BIT study conducted in Ukraine in 2019, we found that asking both Ukrainian and Russian speakers to stop for just seven seconds before deciding whether a misleading news headline was truthful was effective in reducing belief in fake news. This study was conducted in an online simulated environment, but follow-up work has confirmed the efficacy of this type of attention-focusing intervention in reducing the spread of misinformation in the real world.

Implementing these ‘moments of consideration’ on social media platforms could encourage people to process information in a more considered way, without restricting their ability to share views freely.
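To make this concrete, here is a minimal sketch of what such a ‘moment of consideration’ could look like in code. It is illustrative only: the Post shape, the promptUser and publish callbacks, and the seven-second pause are our own assumptions, loosely based on the intervention described above, not any platform’s actual API.

```typescript
// Sketch of an accuracy-prompt interstitial (assumed names, not a real platform API).

interface Post {
  id: string;
  headline: string;
}

const PAUSE_MS = 7_000; // the brief "moment of consideration" before sharing

function sleep(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

// Gate the share action behind a short pause and an accuracy question.
// `promptUser` stands in for whatever UI layer the platform provides;
// `publish` stands in for the platform's actual share pipeline.
async function shareWithAccuracyPrompt(
  post: Post,
  promptUser: (question: string) => Promise<"share" | "cancel">,
  publish: (post: Post) => Promise<void>,
): Promise<boolean> {
  // Hold the share for a few seconds, nudging the user to read what they are about to amplify.
  await sleep(PAUSE_MS);
  const decision = await promptUser(
    `Do you think this headline is accurate?\n"${post.headline}"\nShare anyway, or cancel?`,
  );
  if (decision === "share") {
    await publish(post);
    return true;
  }
  return false; // the user reconsidered; nothing is published
}
```

The design point is simply that the share cannot complete until the user has paused and engaged with an accuracy question; everything else is left to the platform.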

Rules of thumb: While it is tempting to assume that educational programmes aiming to improve media literacy could help people spot fake news, there is little evidence that this approach is effective. One promising avenue is to provide people with easy-to-remember rules of thumb that can help them spot misinformation. The findings from BIT’s study in Ukraine suggested that providing evidence-based rules of thumb (e.g. be wary of news you would like to be true) before exposure to misleading content was effective in improving discernment.

Correct with kindness: When countering misinformation, it is important to be respectful of, and empathetic towards, those who believe the misinformation – or want to believe it. Former California Governor and Hollywood actor Arnold Schwarzenegger recently posted a video addressed to the Russian people on Twitter that encapsulates best practice in countering misinformation. He established his legitimacy as someone who is not anti-Russian and made explicit the common ground between himself, Ukrainians, and Russians. Rather than attacking the listener, he engaged in a positive, empathetic, conversational manner using simple language. Schwarzenegger bolstered his argument by referring to agreement among experts, and by stating the truth and repeating it; familiarity makes us more likely to remember information and to believe it to be true.

Counter cautiously: In addition to prebunking Russian disinformation, the West should also be thoughtful about how its own communications land with Russian audiences. Messages that we instinctively feel will have a particular impact may not be perceived the same way by the Russian population. For example, in our past research in the Ukrainian context we found that disinformation portraying Russia as aggressively undermining European democracies was in fact perceived as positive news by Russian speakers. Instead, it was disinformation portraying Russia as weak, or Putin as ill, that Russian speakers found most concerning.

All these techniques can be used to help prevent misinformation from spreading and from having a malicious effect on the people exposed to it. It’s worth noting, though, that with notable exceptions, most of the research on debunking disinformation comes from simulated environments, and we don’t actually know what happens in the real world. These academic theories need to be tested in real-world settings to give us greater confidence in the right way to tackle misinformation.

This is an important issue, not only in Ukraine. We spend a large proportion of our lives online, and we are living in a time when the information landscape is more crowded than ever. Misinformation has a meaningful impact on the way individuals react to a wide range of social challenges, from climate change to war.
