
Can we use behavioural science to reduce the risk of nuclear weapons?

  • Blog
  • 10th Sep 2024

If you were asked to imagine the process for launching a nuclear weapon, the images that come to mind probably involve something like the following: strict protocols, stern people in uniform and heavily guarded missiles in secret bunkers.

However, the reality is far more mundane – and prone to exactly the kind of human mishaps and errors that we’d expect in any organisation.

In 2013, a group of US Air Force officers stationed at a North Dakotan air base were fired following a string of almost comical blunders, including accidentally loading nuclear weapons onto aircraft before taking off, erroneously exporting nuclear weapons parts, and, most commonly, falling asleep while guarding nuclear weapons.

These are not isolated incidents. Contrary to popular expectation, errors are concerningly common, as demonstrated by the failed Trident test in the UK earlier this year. An error involving nuclear weapons can happen at any time because each country’s nuclear system is constantly active. We should therefore assume that the stories that come to light are just the tip of the iceberg. 

While politicians and policy-makers are often focused on the headline-grabbing risks of nuclear weapons in the hands of rogue states and terrorists, the very human frailties that lie at the heart of our nuclear weapons systems are often overlooked. So, in a joint project with Chatham House, BIT sought to understand exactly what the ‘behavioural risks’ in our nuclear weapons programmes really are – and what consequences they might pose – focusing on four areas in particular: errors, miscommunication, overconfidence, and public and political salience.

Workplace errors

The examples cited above show that nuclear weapon environments are not always that different from any other workplace.

The reality is that the work of nuclear weapons operators is often unvaried, unstimulating and, for many, unrewarding. This is where many errors originate, whether through negligence and corner-cutting, or even alcohol and drug abuse.

And, while each of these individual errors might not lead to a crisis, bigger errors often result from several smaller errors which have been normalised over time. 

Although these are often complex behavioural issues, once they are recognised and acknowledged, sometimes straightforward behavioural interventions are possible:

  • Address ‘cognitive underload’. In the airport security industry, this has been attempted by using mock explosives to sensitise staff to the possibility of encountering an actual explosive.
  • Design out errors rather than relying on people to avoid them. For example, instead of training personnel to lock doors, install doors which lock automatically.
  • Use errors as opportunities for ‘teachable moments’. This prevents errors being covered up, which can ultimately lead to more and more severe errors.

Miscommunication between powers

The closest thing nuclear states have to a ‘default’ position is their nuclear posture, which sets out the circumstances under which they would use nuclear weapons. The clarity of these posture declarations is therefore key. But how well do we know the mindset of our nuclear rivals? The invasion of Ukraine has shown us that our assumptions about what our opponents think is in their best interest may be far from accurate.

Red teams (groups of specialists tasked with role-playing the enemy) armed with sufficient linguistic and cultural knowledge of nuclear powers such as China, Russia and North Korea, could help diagnose areas of potential misunderstanding. While the Ministry of Defence has published guidance on operating red teams, it’s unclear if they are actually being used to test nuclear scenarios.

Correctly understanding your opponent is vital, but what if they don’t want to be understood? Troublingly, nuclear postures are often made deliberately ambiguous to increase their deterrence effect. So it would be just as beneficial, perhaps more so, to make plans based on the assumption that communication will fail.

Old war: deterrence and the ‘rational’ actor

Nuclear weapons were developed in a world where the theories of the rational economic actor remained unchallenged. Since then, our thinking on human behaviour has evolved. 

But the logic of deterrence theory, centred upon the ability of the US and the (then) USSR to annihilate the other, remains largely unopposed. Fatal miscommunication can occur when actors – particularly political leaders – believe that the conditions which would deter them will also deter their adversary. Perspective rotations can be a useful way of gaining new insights into the actual motives of an adversary, without acquiring new intelligence (see how to conduct a perspective rotation in our report). 

Overconfidence and fool-proof planning

Military planning is often prone to overconfidence. In particular, governments have continually failed at the planning stage to realistically foresee how their wars will pan out. If a nuclear-armed state overconfidently predicts its prospects of winning a conflict, it may be left with only the nuclear option when its plans begin to fail. 

But below the level of geopolitical calculation lie the normal processes that all human decision-makers go through – and which can be distorted in a nuclear conflict.

For example, during war the opportunities for feedback and learning are limited, making it difficult for military and political leaders to assess whether their confidence is misplaced until it’s too late.

And even when the point of no return is approaching, there’s no guarantee that the conflict won’t escalate, as the ‘sunk costs’ of the endeavour, as well as human emotions like pride, stop people from changing their minds and course of action.

Seeing that small risks can scale

Overconfidence can also increase the likelihood of errors and miscommunication. Humans are very bad at dealing with small probabilities, which can lead to optimism bias when planning for errors – that is, if the probability of an error occurring is very small, it is treated as though it were impossible. However, when a system is in constant operation (as nuclear weapons systems are), even very small probabilities of error accumulate into non-trivial risks.
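
To get a feel for how quickly small risks add up, here is a rough back-of-the-envelope illustration; the figures are purely hypothetical and are not estimates for any real nuclear system:

```python
# Purely hypothetical numbers: how a small per-day chance of a serious error
# compounds when a system is in constant operation.
p_daily = 1e-4           # assumed probability of a serious error on any given day
days = 365 * 30          # thirty years of continuous operation

# Chance of at least one error = 1 minus the chance of zero errors on every day
p_at_least_one = 1 - (1 - p_daily) ** days
print(f"{p_at_least_one:.0%}")   # roughly 67%
```

Even a one-in-ten-thousand daily chance, sustained over three decades, gives a roughly two-in-three chance of at least one serious error.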

Correcting overconfidence: calibration and pre-diagnosis

Based on our previous work ‘Behavioural Government’, we think there are concrete ways that overconfidence can be challenged among decision-makers at all levels.

One way of doing this is through calibration exercises, where participants make a prediction and state their degree of confidence in it. Often, these exercises reveal a degree of overconfidence in our beliefs about the future or what we know about the present. BIT has built a calibration quiz specifically about nuclear weapons, and we’d love you to try it out.
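
As an illustration of how such an exercise might be scored, the sketch below uses the Brier score, a standard measure of calibration; the forecasts and outcomes are invented for the example and are not taken from our quiz or report:

```python
# Invented example: scoring one participant's calibration with the Brier score.
# Each forecast is a stated probability; each outcome is 1 (it happened) or 0 (it did not).
forecasts = [0.9, 0.8, 0.95, 0.7]
outcomes = [1, 0, 1, 0]

brier = sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)
print(f"Brier score: {brier:.2f}")  # 0 would be perfect; high confidence on
                                    # predictions that don't come true pushes it up
```

A well-calibrated forecaster’s stated confidence roughly matches how often they turn out to be right; persistent gaps between the two are exactly the overconfidence these exercises are designed to surface.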

Performing ‘pre-mortems’, where groups imagine their plan has failed or an error has occurred and then work back to identify potential causes of the failure or error, can also help identify weak spots at the planning stage. This is because when thinking about a project, we (as humans) generally imagine it going well.

Public and political pressure

The level of awareness and concern among the general public regarding a particular issue can have a significant impact on policy, and nuclear weapons are no different.

Unfortunately, the end of the Cold War started a trend of disengagement by civil society from nuclear weapons safety. And this has reinforced the growing tendency of policy-makers to let nuclear weapons slip down their agendas.

Without the issue being forced into the political spotlight, there is little impetus behind measures to reduce nuclear risks, such as arms control agreements. Yet public engagement on nuclear weapons policy has achieved real progress in the past: one expert we interviewed told us that a US Senator was convinced to vote to ratify the Comprehensive Test Ban Treaty by his church minister.

However, the relationship between public and political opinion on nuclear weapons is complex: there is some evidence that Europeans have actually become more favourable towards nuclear weapons since the Russian invasion of Ukraine, and in Russia itself the Orthodox Church is used to legitimise Moscow’s nuclear posture.

Building nuclear anxiety

But what can be done to increase concern about the risks of nuclear weapons among the public and political elites?

Having watched the film The Day After, which depicts the effects of a nuclear strike on a small Kansas town, Ronald Reagan, then president, confided in his diary that:

“The film was very effective & left me greatly depressed … My own reaction was one of our having to do all we can to have a deterrent & to see there is never a nuclear war.”

Despite living in a digital, multimedia era, the defence policy world still relies on the written word to lobby and persuade politicians, when visual media clearly also have great power to influence even decision-makers at the highest level. Visceral experiences, such as nuclear simulations, could be more impactful still, as could visiting areas affected by nuclear detonation – such as test sites. That is not to say that writing has no place: compelling scenarios of nuclear conflict can be built using facts, interviews, and data, such as this recent example.

Conclusion

In summary, nuclear weapons policy is afflicted with many of the same behavioural issues as other kinds of policy-making. Human errors, overconfidence among leaders, and miscommunication between actors can all creep in, making nuclear weapons systems across the world less safe.

The difference from other policy areas is the stakes. When things go seriously wrong with nuclear weapons, the risk is truly existential. Although the world has so far successfully avoided nuclear catastrophe, nuclear weapons are not going anywhere. It therefore remains our duty to find whatever tools we can to reduce the risks they present.

Our exploratory report, co-written with Chatham House, has many ideas for reducing the risks of nuclear weapons. If you’re interested in discussing the intersection of behavioural science and nuclear weapons policy further, please do get in touch by emailing oliver.adcock@bi.team. We’re particularly excited to work on testing these interventions with policy-makers, and thinking about how to implement the most promising ideas.
