The rise of evidence-based policymaking?

Blog 31st Jan 2022

Last week saw the publication of the Evidence Commission report from the Global Commission on Evidence.

Weighing in at an impressive 144 pages, the report was supported by Commissioners from research and policy communities across six continents (disclosure: including me). Particular credit goes to the secretariat based at McMaster University in Canada, who have pulled together an unusually comprehensive summary of the global evidence ‘system’, along with a long list of exhibits, materials and recommendations, available in multiple languages.

Beyond the headline call to action, the recommendation that is most likely to attract attention is the call for governments and foundations to benchmark at least 1% of their spend to R&D: experimentation, evaluation, and the strengthening of the evidence-building and evidence-using system.

It may not sound like a very ambitious target, particularly when set against the OECD average spend of 2.4% on R&D. But few government or public service bodies get close to 1%, let alone 2.4%.

Summary of the Evidence Commission findings

  • Decision-makers should recognise the scale of the problem: evidence is not being systematically and effectively used, leading to poor decisions.
  • Multilateral organizations should support evidence-related global public goods and equitably distributed capacities to produce, share and use evidence.
  • All national and sub-national governments should review their existing evidence-support and fill the gaps identified as well as report publicly on their progress.
  • Evidence can and should help citizens make effective decisions about their own and their families’ wellbeing, and governments should ensure people have access to the best relevant evidence.
  • Evidence intermediaries should step forward to fill gaps left by government, provide continuity where staff turnover in government is frequent, and leverage strong connections to global networks.
  • News and social media platforms should build relationships with evidence intermediaries to help them draw on the best sources of evidence.
  • Governments, foundations and other funders should spend 1% of funding on evidence infrastructure.

In the UK, the Commission report comes in the wake of a National Audit Office report highlighting the lack of evaluation or learning in UK government expenditure. The NAO noted that of more than £400bn of new government program expenditure, only 8% was subject to a meaningful impact evaluation (though it would have been good to give credit to the joint Cabinet Office-Treasury-BIT team that did this analysis, and the Ministers who put the 8% number into the public domain…!). Why does it matter? Every year, governments spend billions of dollars, with surprisingly little evidence on which of this expenditure does good, harm, or nothing at all.

An organisation that doesn’t evaluate is one that isn’t learning or getting better. Of course, R&D and the effective use of evidence go beyond evaluation. It’s also about discovering new and better ways of delivering – and identifying ideas and interventions that don’t work before blindly turning them into national policy.

But there is good news here too. Within the UK, Ministers and the Treasury have backed the creation of an Evaluation Task Force (ETF), and have written into the latest Spending Review settlements a requirement for robust evaluation of new and legacy programs. The ETF has also been given considerable resources, including leverage over the flagship £200m ‘shared outcomes’ innovation fund and a new £15m Evaluation Accelerator Fund, along with an institutional home for the 50-person-strong Trial Advisory Panel and the What Works Council.

To make sure the message is getting through, the Cabinet Office (UK) is hosting an ‘Evaluation Week’ in late February, in which the Chancellor and a host of other Ministers and international experts in evaluation are taking part. Civil servants, faced with the choice of whether to evaluate, are apt to say or think: ‘but does the Minister really want to know if the program works?’ Having high-profile Ministers saying ‘yes’ really does matter. Even better if it’s a message reinforced by Parliament, media and the public.

We still have so far to go. Data from PubMed suggests that around 10,000 medical trials are published every year. How many are published covering education, crime and justice, economic growth, net zero, conflict prevention and every other public service area you can think of? Surely fewer than a tenth of that.

Figuring out what does and doesn’t work, and getting that evidence into the hands of policymakers and practitioners, is a public good. There are financial, and sometimes political, costs in creating such knowledge. But it is of enormous benefit to all.

The Global Evidence Commission drives home that it’s not just a national public good, but an international one. Whether you live in Birmingham, Boston or Beirut, we’re all asking what’s a better way of teaching our kids maths, reducing crime, or boosting sustainable growth. It’s mad not to learn from each other. 

It is the arc of an argument that leads to the G20 in Indonesia and beyond. Building and democratising evidence. Now that would be something to celebrate.
