Creating Oppositions in Academia and Policy

3rd Jun 2014

Every now and again an article appears in the UK press that we think needs a bit of clarification. One such article appeared in the Observer over the weekend: link.

Creating oppositions between academics and organisations is a nice journalistic device – and of course, a well-established pattern in academic journals and the promotion of books. But sometimes it looks a little strange. The supposed ‘debunking’ of Danny Kahneman by Gerd Gigerenzer, and the battle between them dramatized in the article, will leave many readers puzzled.

To be clear, the Behavioural Insights Team certainly have great regard for both men’s work. Danny Kahneman’s work is now well known, and world leading. Gerd’s is less well known, at least in the UK, but also very interesting. For those who don’t know Gerd’s work, he shows amongst other things that professionals – not just the public – can dramatically misunderstand statistics and risks. But he also shows that when the same information is presented in a different way, and especially with a little training in ‘natural frequencies’, these errors can be dramatically reduced. We ourselves have used Gigerenzer’s earlier research to inform our work on clinical judgement and decision making in children’s social work (1). We were certainly delighted to have him visit us, as we are with other visiting scholars.

What then is the battle – the grand ‘debunking’? The core findings of both Kahneman and Gigerenzer have much in common. But Gerd, at least in relation to the specific but important issue of judging risks and probabilities, has highlighted how training and reframing can eliminate the errors. In contrast, Kahneman’s work is sometimes interpreted – though not necessarily by Danny himself – as showing that such errors are fundamentally and irredeemably baked into us.

A linked charge is therefore sometimes made against behavioural economists and psychologists: that by identifying human errors of judgement – ‘predictably irrational’ in Ariely’s phrase – the discipline somehow regards people as ‘dumb’. It’s certainly not a view we hold – though our mental shortcuts occasionally get us into difficulty, they also enable us to handle the complexity of the world around us with astonishing speed and ability.

Rather, our mantra is to support people to make better choices. Far from assuming that people are ‘dumb’, this is absolutely about respecting people’s capacity and ability to decide. But we should also recognise that the world is highly complex, and that policymakers can get far by seeking to cut through this complexity to aid consumers and users of services. This is why we often talk (as do Gigerenzer and Richard Thaler) about simplification. It’s why we have started programmes like midata (with BIS) which are all about giving people new channels and rules of thumb to cut through complex markets (2).

Finally, above all else, we’re interested in ‘what works’. If a nudge is the most effective way of achieving a policy objective, great: it will often also be highly cost-effective. More usually, a nudge will help to move the dial in the right direction alongside other interventions. In order to help policymakers and commissioners of services understand what mix of interventions works best, we’ve been working with the Cabinet Office on the creation of new ‘What Works’ centres, whose remit is to give commissioners and citizens easy access to collations of evidence, including specific and detailed work about how to make sense of statistical data (3).