Tips for Evaluating A Training Program

Begin With The End In Mind

Without evaluating the success of a training venture, there is little basis for gauging an initiative’s effectiveness or identifying future areas of developmental focus.

Donald Kirkpatrick created the definitive training evaluation model, one that has been adopted across most of the modern adult learning world.  The model consists of four levels:

  1. Reaction – what participants thought and felt about the training (satisfaction; “smile sheets”)
  2. Learning – the resulting increase in knowledge and/or skills, and change in attitudes. This evaluation occurs during the training in the form of either a knowledge demonstration or test.
  3. Behavior – transfer of knowledge, skills, and/or attitudes from the classroom to the job (change in job behavior due to the training program). This evaluation takes place three to six months after training, while the trainee is performing the job, and usually relies on observation.
  4. Results – the final results that occurred because of attendance and participation in a training program (can be monetary, performance-based, etc.)

These levels have been the cornerstone of contemporary evaluation theory since they debuted in the late 1950s, and they were later captured in Kirkpatrick’s seminal 1975 work, “Evaluating Training Programs”.

Kirkpatrick’s son, James, has continued his father’s work and uses what he calls ROE (Return on Expectations).  ROE requires that clear success measures be agreed upon with the executive sponsor so that something tangible is measured in terms of outcomes. In short: “begin with the end in mind” and identify what is expected early on, so that trainers can ensure they deliver the goods.

Xponents relies heavily on Kirkpatrick’s research and published work, and has also identified a few practical elements that assist in the process.  What we’ve found: when evaluating a training program, stick to the F.A.C.T.S.:

Focused:
Be very clear about what you plan to evaluate and how you intend to gauge effectiveness.  Keep your questions focused on the skills or behaviors the training is meant to develop. For example, “Leadership” is not a skill; it is the result of multiple skills and beliefs.  When framing your questions, don’t drift into broad generalizations or ask respondents to connect invisible dots. Ask direct questions related to specific training focal points.

Applicable:
Can evaluation results be quickly applied to the system’s needs? Evaluation forms are great tools for proving ROI on a training initiative, but don’t let that be their only purpose.  If the form is properly focused, areas of opportunity should be immediately evident.  Have a plan to respond quickly and give participants the tools they need to apply their learning to the workplace.

Comprehensive:
If you only hand out an evaluation form at the end of a training session, you’re getting only half of the picture.  Collect participant data before and after the training to see the evolution of understanding, not just a final snapshot.  A documented improvement in the numbers helps to prove program effectiveness, highlight areas for participant improvement, and pinpoint specific curriculum strengths and opportunities.  In addition to program pre- and post-assessments, it is also a good idea to offer pre- and post-evaluations of the system’s overall mood, ability to work together, and general effectiveness.
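As a rough illustration of how pre- and post-assessment numbers might be compared, here is a minimal sketch in Python; the skill names, scores, and five-point scale are hypothetical examples, not drawn from any actual Xponents instrument:

```python
# Minimal sketch: comparing hypothetical pre- and post-training assessment
# scores (five-point scale) per skill area to document improvement.
pre_scores = {"active listening": 2.4, "delegation": 3.1, "giving feedback": 2.8}
post_scores = {"active listening": 3.6, "delegation": 3.4, "giving feedback": 3.9}

for skill, pre in pre_scores.items():
    post = post_scores[skill]
    change = post - pre
    pct = (change / pre) * 100 if pre else 0.0
    print(f"{skill}: {pre:.1f} -> {post:.1f} ({pct:+.0f}%)")
```

Even a simple before-and-after comparison like this makes it easier to show which curriculum areas moved the needle and which need further attention.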

Transparent:
This should sound obvious, but it must be said: forms and analytics should be shared with all appropriate parties.  This doesn’t mean that trainees should be required to sign their names to each assessment; in fact, anonymity yields the most honest feedback.  But if aspects of an initiative fell short, that should be reflected in the numbers and shared openly.  If evaluations are completed by hand, all original hard copies should be saved and documented.

Systematic:
The evaluation process and the forms themselves should be simple and systematic.  The bulk of training effort should go into program development, delivery, and determining action steps for the future, so the delivery and assessment of evaluation results should be quick and efficient.  Online forms yield the most immediate results, especially for online learning programs, but they are not always practical in live, in-person training situations. Questions should be straightforward: easy to answer and easy to tabulate.  Forms should be easily customizable to accommodate varying needs.
