Pull Off that Band-Aid: Learn What Trainees Really Think of Your Training
How many times have you left a training course asking yourself, "What was the point of that?", yet still marked "satisfied" on the end-of-course critique? I have done it, we have all done it, but we are doing the instructor (and other trainees) a disservice.
Training can be unsuccessful for a variety of reasons: bad instructors, a poorly planned curriculum, or an unexciting topic.
Any trainer, or organization that provides training, should want to know whether the course they delivered was useful and imparted knowledge in a way that trainees can apply. If you are getting training from someone else, you should be asking the same questions.
But how do we figure this out?
First, by asking. Asking trainees what they think about the training they receive is the first step in evaluating training, and likely the easiest.
Why evaluate? Because it tells you whether the financial and human resources you are expending on training are worth it. Training costs money, and even if training is provided "free" by a government entity, you still have to pay for your officers' time to take it. That gets expensive. There is also an opportunity cost: one training takes away the opportunity for another. Training evaluations will help you determine whether you are getting value for your money. If you want to learn more about in-service training costs and solutions, check out our blog posts on in-service training obstacles and our instructor-led versus online cost calculator.
About Evaluation and Reaction
According to Kirkpatrick (1998), there are four generally accepted levels of evaluation. Level 1 is "Reaction," or what trainees say about their experience, and it is probably the easiest to collect. You have likely provided Level 1 data after a training. It is an important source of data, one that can be used both to identify the value of training and to improve it.
A Level 1 evaluation involves assessing what trainees think about what they learned. These are self-reports, and they are subjective: the material is likely to resonate differently with different attendees. One may like a training; another may not.
While subjective, reaction data is an essential gauge for assessing training. If trainees did not find what they learned helpful or relevant, they will be less likely to retain it, and even less likely to implement it in the real world.
Evaluating reaction involves asking trainees what they think and feel about their training, often shortly after delivery. It can include questions like:
How much do you think you got out of the program?
How much did you like the program?
Will what you learned help you?
While liking and enjoying a training may not be enough to get officers to put it into action, not liking it, or finding it not useful, could be enough to make sure they don't.
Reaction Data and Behavioral Threat Assessment Training
I am going to use our active behavioral threat assessment training as an example of the benefit of collecting and learning from Level 1 data. This threat assessment approach involves the identification of immediate, imminent, or active threats by a law enforcement officer. These threats could be against the public or the first responder. You can read more about active threat assessment in our threat assessment blog.
We collect reaction data throughout all of our threat assessment training courses. In our longer instructor-led courses we use daily 10-question critiques; see an example of this daily critique in Figure 1. We ask trainees to report on perceived application and utility, and on what they might add or take away. Our online training is broken into four short courses, and trainees complete a critique after each.
These mid-course assessments are an awesome way to gauge how a class is going, make mid-course corrections, or respond to problems. This is especially important for a new class or when training an entirely new audience. Our evaluation questions in online and instructor-led courses are similar, allowing us to compare the results between and across training methods and deliveries.
Not to toot our own horn too loudly, but we are proud of how our trainees perceive what they learn in our courses – 97% of our law enforcement trainees agree that what they learned will help them do their job and that the course leaves them better prepared to identify a suspicious person.
It is important to stress, however, that reaction data is subjective, and other efforts should be taken to measure what trainees learned and whether they can use those skills on the job. You can read about measuring learning in our blog post on evaluating threat assessment skills.
Learning from Trainee Reaction
Let me give you an example of the utility of Level 1 data. In our online Threat Awareness program's final assessment, we ask trainees if they had any problems with our system. A number of our online course modules include multiple demonstration and example videos. We learned from a 20+ year veteran police officer that he was having trouble tracking which videos he had already watched. While a small problem, fixing it was worthwhile and would improve the training experience.
I did some digging, and it was a simple fix in our video player settings. Now all videos are automatically marked to help our trainees figure out what they already watched, so they know what to do next.
This marking may seem like a small thing, but it makes it easier for every trainee to complete their training. This is great. We want our trainees to focus on learning active threat assessment skills, NOT on figuring out how to use our online platform.
The Benefit of Evaluating Reaction
Before we focus on the benefits of evaluating reaction, it is critical to realize what Level 1 data is NOT. It is not objective. It does not measure skill retention, knowledge gain, or the ability to use the training in a real-world setting. These data are collected in other ways.
Level 1 evaluations are a vital part of instructional design and should be incorporated into any training program. They are a perfect opportunity to improve training and figure out what does and does not resonate with trainees. The approach does not have to be complicated. It can be a short online assessment using a free tool such as Survey Monkey or Qualtrics, or a form that you hand out at the end of the class.
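Analyzing the results does not have to be complicated either. As an illustrative sketch (the question wording, the five-point agree/disagree scale, and the sample responses below are assumptions for demonstration, not our actual critique form), a few lines of Python can tally percent agreement per question from exported survey data:

```python
from collections import Counter

# Hypothetical Likert responses (1 = strongly disagree ... 5 = strongly agree),
# keyed by critique question. Real data would come from your survey tool's export.
responses = {
    "What I learned will help me do my job": [5, 4, 5, 3, 4, 5, 4, 2, 5, 4],
    "The course was a good use of my time":  [4, 4, 5, 5, 3, 4, 2, 4, 5, 5],
}

def percent_agree(scores, threshold=4):
    """Share of trainees answering 'agree' (4) or 'strongly agree' (5)."""
    agree = sum(1 for s in scores if s >= threshold)
    return 100.0 * agree / len(scores)

for question, scores in responses.items():
    distribution = dict(sorted(Counter(scores).items()))
    print(f"{question}: {percent_agree(scores):.0f}% agree, responses: {distribution}")
```

Running something like this after each course delivery makes trends easy to spot: a question whose agreement drops between cohorts, or between online and instructor-led deliveries, is a flag worth digging into.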
I know everyone is busy, but you also need to give your instructors the time and resources to collect the data, analyze it, and make changes based on the results. Without time to review, reflect, and revise, it will be harder to make needed changes. You spend time and money to get your officers training; spend a little more to make sure they are learning something useful.
Contact us if you would like to learn more about active threat assessment training or how to improve your collection and analysis of reaction data from your trainees.
Kirkpatrick, D. (1998). Evaluating Training Programs: The Four Levels. San Francisco: Berrett-Koehler Publishers.