29th Aug 2019

Your training programme isn’t working – here’s how to make sure it does

More than £200bn is spent globally on learning and development every year. Entrepreneurs, small business owners and corporate workers sit through millions of hours of training on the latest management fad or tips on how to grow their business. 

Yet does any of it actually work? Do the behaviours of people who attend training, on average, change? And does behaviour change translate into better organisational outcomes?

We recently partnered with CDC Group, the UK’s development finance institution, to answer these questions. We reviewed the existing evidence base for leadership and networking programmes and collaborated on a new evaluation framework for The Africa List (TAL), CDC Group’s leadership and networking programme for African business leaders. In a new report, published today, we share our findings and present a set of practical tools for evaluating leadership and networking programmes.

Do leadership development programmes work? 

Overall, the evidence suggests that traditional short-term business training and entrepreneurship programmes fail to improve business performance. Whilst participants often enjoy the experience and learn new business practices, this does not, on average, translate into higher profits or business growth.

A potential explanation is that programmes do not change the workplace behaviours that matter for performance. Another possibility is that even when changes in business practices occur they are too small to influence business performance. Improvements in business outcomes take time and may be fully realised only several years after the programme (which is beyond the scope of many evaluations). Methodological limitations such as small sample sizes and differences between businesses also make it hard to properly evaluate programmes.

Account for context, offer practical insights and focus on behaviour change (and, where possible, evaluate) 

Despite our overall findings, we think there is still hope for carefully constructed business leadership and networking programmes. 

Understanding the local context, participants’ backgrounds and the practical and behavioural barriers they face to starting and growing a business is, we think, a necessary ingredient for success. Programmes that are practice-focused, support participants in applying lessons in their business and include regular feedback appear more effective. 

It is not only know-how. Attitudes matter too. Take the story of Akouélé Ekoué Hettah, a clothing boutique owner in Togo. She experienced significant success after attending a tailored training programme focused on developing a proactive, self-starting and future-oriented mindset. On average, participants in this specially designed programme run by the World Bank saw their profits increase by 30% two years after the programme ended compared to entrepreneurs who received no training.

Alongside tailored programmes, there is emerging evidence that structured and intensive networking for business owners can have a positive impact on business performance. Such a programme with business leaders from Chinese firms resulted in an 8.1% increase in firm revenues. Mentoring could also be a cost-effective way to help small businesses. However, as with business training, context, participants’ needs, programme design and implementation matter.

Whilst these findings are encouraging, there is a general lack of robust, high-quality evidence on how to design and run business training for different groups. We believe, therefore, that it is vital to evaluate programmes. 

A guide for evaluating leadership and networking programmes for businesses in sub-Saharan Africa

We are, of course, proponents of using randomised controlled trials (RCTs) or rigorous quasi-experimental approaches to establish causal links between training programmes and business outcomes. This should sit alongside strategic use of qualitative research to understand why specific activities work. However, we recognise that parts of this approach are not possible in every environment, given the number of participants in leadership programmes and the resources available to providers. 
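As a rough illustration of what an RCT-style analysis can look like, the sketch below compares an outcome for businesses randomly assigned to a programme against a control group. The data and column names ("treated", "profit_change") are invented for the example; a real evaluation would need a far larger sample and careful outcome measurement.

```python
# Illustrative sketch only: estimating the average effect of a training
# programme in a randomised controlled trial. The data and column names
# are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# Each row is one participating business; "treated" is 1 if the business
# was randomly assigned to the programme, 0 if it was in the control group.
data = pd.DataFrame({
    "treated": [1, 1, 1, 0, 0, 0],
    "profit_change": [0.12, 0.30, 0.05, 0.02, -0.01, 0.04],
})

# With random assignment, regressing the outcome on treatment status gives
# an unbiased estimate of the average treatment effect.
model = smf.ols("profit_change ~ treated", data=data).fit()
print(model.summary())
```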

Therefore, our report with CDC Group provides a set of tools at different levels of complexity: essential, extended, involved and future-oriented. Organisations should start with the essential tier, which provides a straightforward way to begin collecting data and understanding the links between activities and outcomes. The process comprises collecting survey data at the start, middle and end of a programme, plus process tracing (see our report for survey templates and analysis instructions), as sketched below. Given the lack of robust evidence in this space, we suggest it would be a great leap forward if more organisations used the essential tools. The other tiers in our framework suggest further practical ways – and tools – for systematically expanding the essential approach over time.
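To make the essential tier concrete, here is a minimal sketch of how repeated survey responses might be summarised across the three waves. The question, the 1–5 scale and the column names are hypothetical; the actual survey templates and analysis instructions are in our report.

```python
# Minimal sketch of summarising "essential" survey data collected at the
# start, middle and end of a programme. All names and values are invented.
import pandas as pd

# Three participants ("A"-"C") answering the same 1-5 behaviour question
# at each of the three survey waves.
surveys = pd.DataFrame({
    "participant": ["A", "B", "C"] * 3,
    "wave": ["start"] * 3 + ["middle"] * 3 + ["end"] * 3,
    "delegates_decisions": [2, 3, 2, 3, 3, 4, 4, 4, 5],
})

# Track how the average self-reported behaviour shifts across the waves.
trend = (
    surveys.groupby("wave")["delegates_decisions"]
    .mean()
    .reindex(["start", "middle", "end"])
)
print(trend)
```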

Ultimately, evaluating programme effectiveness requires time and resources. However, developing and running these programmes requires even more. We believe providers should take a serious approach to understanding what works, both early on and throughout their programmes. No one wants to live through a learning and development version of Groundhog Day – repeated days of useless training (without the saving grace of Bill Murray and Andie MacDowell’s romantic tension!). Taking an experimental approach to evaluation will help leadership and networking programmes ensure their investments are making a real impact. 

If you’d like to know more, or want to apply evidence-based approaches and behavioural insights to leadership and networking programmes, please get in touch (robbie.tilleard@bi.team)! 
