
evaluation of learning effectiveness

Does your organisation evaluate learning or does it simply assume that training is worth doing? Richard Walton discusses trends in the evaluation of learning.

We have observed a trend for organisations to systematise the collection of baseline data across their training to Level 2, and to sample specific training programmes to Level 3.

Organisations routinely collect "happy sheets" to gauge reactions to training and help ensure standards of delivery are maintained from course to course. Most organisations take this further by requiring training programmes to specify business and learner objectives and demonstrate how these are met through specific activities and content linked to the objectives. However, routine collection of data to validate whether these objectives are actually met in the training and retained or used afterward is much less frequently done.

In this article we examine some of the approaches to the evaluation of learning that our clients are using and discuss their pros and cons.

How learning is evaluated

Evaluation of learning effectiveness can be conducted to varying degrees of depth, first categorised by Donald Kirkpatrick (Levels 1-4) and later augmented by Dr Jack Phillips (Level 5). These levels are:

    1. Reaction & Planned Action: Measures participant satisfaction with the programme and captures participants' planned actions

    2. Learning: Tests and measures changes in knowledge, skills and behaviours

    3. Application: Measures changes in on-the-job skills and behaviours

    4. Business Impact: Measures business impact of the changes

    5. Return on Investment: Compares the financial benefits achieved to the cost of investing in the training (illustrated below)
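
For illustration, Level 5 is commonly expressed using the Phillips return-on-investment formula; the figures in the worked example below are invented:

\[
\text{ROI}\ (\%) = \frac{\text{net programme benefits}}{\text{programme costs}} \times 100,
\qquad \text{net programme benefits} = \text{monetised benefits} - \text{programme costs}
\]

So a programme costing £50,000 that yields £150,000 of monetised benefits has net benefits of £100,000 and an ROI of 200%.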

Most in-house training that we observe is evaluated to Levels 1 and 2. The methods we see are usually a combination of the following:

Typical Level 1 activities

  • "Happy sheets" are completed during or after training by participants to record immediate reactions to the training.
  • Action plans are completed by participants during and after the training, often with participants sharing their actions or "takeaways" with the rest of the class.
  • Trainer feedback sheets capture trainers' views on learning outcomes and the organisation of the event.

Information may be paper-based or captured on-line. Typically a 90%+ return rate will be obtained for paper-based assessment (the 10% that got away probably had to leave early). This is significantly higher than the return rate achieved when participants are asked to complete on-line evaluations back in the workplace, although on-line return rates can be driven up if incentives are provided.

Typical Level 2 activities

  • Exercises and role plays within events to embed and assess participants' ability to apply skills. To properly assess whether learning is taking place the instructors need to be able to distinguish between group and individual learning.
  • Individual exams during or at the close of training. These can be unpopular with participants, and instructors' self-interest may lead them to mark leniently, so independent marking or moderation should be used.
  • Opportunities to give and receive feedback at both a collective and individual level. Pre-course questionnaires asking participants to assess their baseline skills and knowledge can be contrasted with similar post-course questionnaires, with questions based around the learning objectives set for the course (a sketch of this contrast follows this list). However, participants often underestimate their skills and knowledge prior to attending training, so these surveys can give a biased view of the success of the training.
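
As a minimal sketch of that pre/post contrast, the following Python fragment summarises the shift in self-assessed scores per learning objective. The objectives, the 1-5 scale and all scores are invented for illustration:

```python
# Minimal sketch: contrast pre- and post-course self-assessment scores per
# learning objective. Objectives, the 1-5 scale and all scores are invented.

pre_scores = {
    "Budget preparation": [2, 3, 2, 1, 3],
    "Variance analysis":  [1, 2, 2, 2, 1],
}
post_scores = {
    "Budget preparation": [4, 4, 3, 3, 4],
    "Variance analysis":  [3, 3, 4, 3, 2],
}

def mean(values):
    return sum(values) / len(values)

for objective, before in pre_scores.items():
    after = post_scores[objective]
    shift = mean(after) - mean(before)
    # A positive shift suggests learning took place, but note the caveat
    # above: participants often under-rate their baseline, inflating shifts.
    print(f"{objective}: {mean(before):.1f} -> {mean(after):.1f} ({shift:+.1f})")
```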

Level 1 and 2 activities are usually supported with reports summarising and analysing the collected results.

Taking evaluation to Level 3

In our experience, assessment to Level 3 requires significantly more resource to conduct, can be difficult to do well and is much less frequently done. Where it is done, it is usually targeted at a sub-set of training rather than applied systematically across the range of training provided. As this can be expensive, organisations should be clear on their motivation, which may not simply be a desire to conform to best practice. We have seen organisations evaluate for the following reasons:

  • To compare the respective value for money of training programmes competing for limited funding;
  • To help the learning and development department demonstrate its worth to the organisation;
  • To keep existing training providers on their toes;
  • To gather evidence to obtain or maintain an accreditation.

The main problem with Level 3 evaluation is that it requires observing measurable changes in behaviour over a period of time and isolating those changes from other effects. This in turn requires agreement on:

  • What behaviours should change;
  • How those changes will be measured;
  • What measurements will be taken before training;
  • How long after training the measures will be re-taken;
  • How to isolate the effect of training on those measures from other factors (one approach is sketched after this list);
  • How the change will be validated with participants in the training, their managers and their subordinates;
  • How the results will be interpreted.
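
To illustrate the isolation step, one simple device (where a comparable untrained group exists) is to net off the change seen in that group, a basic difference-in-differences calculation. A minimal Python sketch, with an invented metric and figures:

```python
# Minimal sketch: isolate the training effect by netting off the change seen
# in a comparable untrained group (difference-in-differences). The metric and
# all figures are invented - e.g. average days taken to process an invoice.

trained_before, trained_after = 30.0, 22.0      # group that was trained
untrained_before, untrained_after = 31.0, 28.0  # comparable untrained group

trained_change = trained_after - trained_before         # -8.0 days
background_change = untrained_after - untrained_before  # -3.0 days

# The untrained group's change estimates what would have happened anyway
# (other initiatives, seasonal effects), so the residual is attributed
# to the training.
effect = trained_change - background_change             # -5.0 days
print(f"Change attributable to training: {effect:+.1f} days")
```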

As a way around some of these issues, we see organisations turning to on-line self-assessment by participants. The questions focus on whether the training has been of benefit to the participant, whether the participant has had the opportunity to put new skills into practice, and whether they feel the business has benefited. While subjective, this body of evidence can be supported with a smaller number of more objective 360-degree assessments, conducted through interview or observation.

Evaluation beyond Level 3

Much of the literature promoting evaluation to Level 4 and above cites its use where very specific skills-based training is introduced; an example frequently given is training sales staff in selling techniques. In practice, training is often much less specific than that, control groups cannot be observed, or training forms part of a wider change programme where the effects of training are difficult to isolate from the wider business benefits.

If you have read this far hoping to find out how to evaluate beyond Level 3, you are going to be disappointed: in our experience it rarely happens in practice. Usually the organisation decides that the benefits of evaluating beyond Level 3 are not justified by the costs of doing so, and the business benefits of training are taken on trust. However, training that has had good feedback at Levels 1 and 2 will frequently be credited with contributing to the achievement of business objectives. Examples we have seen include:

  • Measurable business benefits from the successful introduction of a new IT system, where it was taken for granted that the benefits would not have been achieved without high take-up of well-received end-user training.
  • Improvement in the CPA Use of Resources rating of a local authority, where training in core finance and business processes was viewed by the senior leadership team as a key factor in lifting its rating from 1 star to 3 stars over a period of three years.
  • A reduction of 20 days in the time taken to pay suppliers as a consequence of new business processes that were rolled out with practical end-user training.

If you are in an organisation that has conducted detailed evaluation to Level 4 or beyond and would be prepared to share your experiences with our readers we would be delighted to hear from you!

How we can help

As part of the development of training, we feel it is essential to develop from the outset a workable, pragmatic, fit-for-purpose evaluation strategy to assess the impact of any training. Our action-oriented, output-based approach to training design helps to ensure beneficial outputs for individuals and the organisation. This extends to evaluation and the measurement of benefits, where we have extensive experience of working with clients to devise and implement evaluation strategies.
