Why Knowledge Checks are Measuring the Wrong Thing

When I taught middle school math, tests assessed knowledge comprehension, some application through word problems, and a few complex questions requiring logic proofs. Results were captured as a score, a metric, if you will, of how well you answered the questions, and that is entirely appropriate in academia.

In our quest for training evaluation metrics, we have borrowed the idea of testing someone's knowledge as a measure of effectiveness. This implies that a corporate classroom mirrors an educational classroom and that testing means the same thing: a measure of knowledge comprehension. However, professors, colleges, universities and other academic institutions are not held to the same results-oriented standard. In the business world, results need to be performance-oriented, not measured in knowledge gained.

So why are we still using tests?

Call it a quiz, a knowledge check or any other name; it is still assessing some form of knowledge comprehension. In training-effectiveness parlance, it is also known as a Level 2 evaluation. Having the knowledge is no guarantee that it will be used correctly back on the job. Two very common situations occur in the life science arena where "the quiz" and knowledge checks are heavily used: the Annual GMP Refresher and the Read & Understand approach for SOPs.

Life sciences companies are required by law to conduct annual regulations training (GMP Refreshers) so as to remain current. To address the training effectiveness challenge, a quiz / questionnaire / knowledge assessment (KA) is added to the event. But what is the KA measuring? Is it mapped to the course/session objectives, or are the questions so general that they can be answered correctly without having to attend the sessions? Or worse yet, are the questions being recycled from year to year and event to event? What does it mean for the employee to pass the knowledge check or score 80% or better? When does s/he learn of the results? In most sessions, there is no time left to debrief the answers. This is a lost opportunity to leverage feedback into a learning activity. How do employees know if they are leaving the session with the "correct information"?

The other common practice is to include a five-question multiple-choice quiz as a knowledge check for Read & Understood (R & U) SOPs, especially for revisions. What does it mean if employees get all five questions right? That they will not make a mistake? That the R & U method of SOP training is effective? The search function in most e-doc systems is really good at finding the answers; it doesn't necessarily mean that employees read the entire procedure and retained the information correctly. What does it mean for the organization if human errors and deviations from procedures are still occurring? Does it really mean the training is ineffective?

What should we be measuring?

The conditions under which employees are expected to perform need to be the same conditions under which we "test" them. So it makes sense to train them under those same conditions as well. What do you want or need your employees (learners) to do after the instruction is finished? What do you want them to remember and use from the instruction in the heat of their work moments? Both the design and the assessment need to mirror these expectations. And that means developing objectives that guide the instruction and form the basis of the assessment. (See Performance Objectives are not the same as Learning Objectives.)

So ask yourself: when in their day-to-day activities will employees need to use this GMP concept? Or where in the employees' workflow will this procedure change need to be applied? Isn't this what we are training them for? Your knowledge checks need to ensure that employees have the knowledge, confidence and capability to perform as trained. It's time to re-think what knowledge checks are supposed to do for you. – VB

Need to write better Knowledge Check questions? Need to advise peers and colleagues on the do's and don'ts of writing test questions?

Retraining and Refresher Training: Aren't they one and the same?

I say no, not at all. Ask an Operations Manager and he'll acknowledge that what it's called is less important than getting the "assignment" done and entered into the LMS. He's usually more concerned about the loss of productivity during the training than about its effectiveness at the time. It isn't until later, when the training may have to be delivered again (repeated), that the comment "training doesn't really work" is heard.

Retraining is typically delivered as repeat training. Corrective actions from *CAPAs usually trigger these types of required training events. In the context of the specific CAPA, we uncover the error, mistake, non-conformance or what I like to call the performance discrepancy from the expected outcome. It is believed that by delivering the training again, the cause of the discrepancy will be resolved; that is, if the root cause was determined to be a lack of knowledge, a lack of skill or not enough practice.

Some folks believe that more is better and that with several repeated training sessions, employees will eventually get it right. It always amazes me that we find time to do repeat training over and over again but complain very loudly about refresher training, significant **SOP revision training or even new content training. (*Corrective Actions Preventive Actions, **Standard Operating Procedures)

Refresher Training implies that training was already provided at least once. The intention here is to review that content. A lot of regulatory training requirements are generated to satisfy this need. Common examples are Annual GMP Refreshers and several OSHA standards, such as Bloodborne Pathogens training. While the aim is to refresh on the content, it is not necessarily meant to simply repeat the training. Also included is the phrase "so as to remain current" with current practice, trends and new updates. Hence, refresher training needs to include new material based on familiar content.

Upon Biennial SOP Review

There are some folks who would like to use this required SOP activity to coincide with the need to "refresh" on SOPs already read and/or trained. The rationale: if the SOP hasn't been revised in two or three years, more than likely the training hasn't been repeated either. So it sounds like a good idea to require that SOPs be "refreshed" on the same review cycle. One could argue it prevents errors; in theory, this sounds very proactive.

But donning my Instructional Designer hat, I ask you: what is the definition of training? To close a knowledge gap or a skill gap. What value is there in forcing a mandatory "refresher reading" of SOPs just because the procedure is due for technical review? In practice, this becomes one huge check-mark exercise leading to a paperwork/LMS backlog, and it might actually increase errors due to "information overload"! Again, what gap are you trying to close? In the above refresher scenario, we are avoiding a compliance gap by satisfying regulatory requirements.


Defending Your Training Process

For those of you who have fielded questions from regulators, you can appreciate how the very training record produced generates follow-up questions. How you describe, or "label," the conditions under which the training occurred can impact the message you are sending as well. Calling it retraining instead of refresher training implies that training had to be repeated because performance did not meet expectations or standards. Refresher training, in contrast, occurs at a defined cycle to ensure that the forgetting curve or lack of practice is not a factor in poor performance. It is a routine activity for satisfying regulatory expectations.

For end users, clarifying the difference between refresher training and "repeat" training in your Policy/SOP not only defines the purpose of the training session, it also provides the proper sequence of steps to follow to ensure maximum effectiveness of the training. There's a difference between training content that is new/updated and content delivered as a repeat of the same materials. Yes, new and/or updated design takes resources and time. But how many times do you want to sit through the same old same old and get nothing new from it? Recall the definition of insanity: doing more of the same while hoping for change. You just might want to review your Training SOP right about now. – VB

 

 

With a little help from my Validation Colleagues – The Training Protocol

In the 17 June 2014 issue (http://hpiscblog.hpisconsulting.com/2014/06/learning-on-the-fly/), I blogged about an urgent learning need requiring instructor-led classroom delivery, facilitated among a group of talented SMEs. During the needs assessment, I hit a huge barrier.

“I teach GMP Basics and conduct Annual GMP Refreshers several times a year, and I preach to audiences that you must follow the procedure; otherwise it’s a deviation. And in less than two weeks, I am expected to teach a process that is changing daily! On the other hand, how could I teach a work instruction that is known to be broken, is being re-designed and is not yet finalized?”

My dilemma challenged the essence of my “learned” compliance belief system about following the first basic GMP principle: “thou shalt follow written procedures”! The instructional designer side of me screamed: how can you teach flawed content? That’s wasted training that results in scrap learning. How is that training going to be effective beyond a check in the box?

And then it hit me: validation engineers use protocols to capture their “change in process” work, whether it’s experimental batches, three batches for process validation or *IQ-OQ-PQ protocols for equipment qualifications. They validate the procedure or the new process before it can become the standard operating procedure by developing the plan, defining acceptance criteria, managing the unexpected deviations and capturing the results. So why couldn’t I borrow the concept and adapt it to my situation?

While it was the intention of the business unit leader to deviate from the approved set of work instructions, a planned deviation would not be appropriate in this case. The purpose of the training sessions was to test the new sequence of steps and confirm the robustness of the decision criteria where needed. The learners would still be in compliance with the quality policy document and would still meet the intent of the quality system regulation. They were essentially testing the future “how-to” steps for the proposed new work instructions.

Now before you fire off a rant of emails to me: I did not copy and paste the validation protocol template. I did, however, include a “please pardon our appearance while we are under construction” paragraph in the training plan to document the departure from the current set of work instructions. This protocol-like section also included our intended outcomes for the sessions and stipulated required SOP training for all affected users once the finalized set of work instructions was approved and went into effect.

Sometimes the very solution can be found around the next cubicle.  –VB

*Installation Qualification, Operational Qualification, Performance Qualification

Analyses du jour: Isn’t it really all the same thing?

So there’s root cause analysis, gap analysis and now performance cause analysis? Is there a difference? Do they use different tools? It can be overwhelming to decipher the jargon, no doubt! I think it depends on which industry you come from and whether your focus is a regulatory/quality system point of view or a performance consulting perspective. To me, it doesn’t change the outcome. I still want to know why the deviation occurred, how the mistake was made and/or what allowed the discrepancy to happen. Mixing and matching the tools allows me to leverage the best techniques from each.

Why we love root cause analysis

For starters, it’s GMP, and we get to document our compliance with CAPA requirements. It allows us to use tools and feel confident that our “data doesn’t lie.” This bodes well for our credibility with management. And it provides the strategic connection between our training solution (as a corrective action) and site quality initiatives, thus elevating the importance, and quite possibly the priority, of completing the corrective action on time.

Asking the right questions

Root cause analysis and problem-solving steps dovetail nicely. (See sidebar below.) The process requires us to slow down and ask questions methodically and sequentially. More than one question is asked, for sure. When you rush the process, it’s easy to grab what appears to be obvious. And that’s one of the early mistakes that can be made with an over-reliance on the tools. The consequence? Jumping to the wrong conclusion that automatic re-training or refresher training is the needed solution. Done, checkmark. On to the next problem that needs a root cause analysis. But when the problem repeats or returns with a more serious consequence, we question why the training did not transfer, or we wonder what’s wrong with the employee: why is s/he not getting this yet?

(Sidebar image: root cause analysis and problem-solving steps.)

No time to do it right, but time to do it twice!

Solving the problem quickly and rapidly closing the CAPA lets us get back to our other pressing tasks. Unfortunately, “band-aids” fall off. The symptom was only covered up and temporarily put out of sight; the original problem wasn’t solved. So now we must investigate again (spend more time) and dig a little deeper. We have no time to do it right, but we find the time to do it twice. Madness!

Which tool to use?

My favorite human performance cause tool is the fishbone diagram, although the “5 Whys” technique is a close second. Both tools force you to dig a little deeper into the causes. Yes, the end result often reveals that something is amiss with “the training,” but is it man, machine, method or materials? Ah-hah, that is very different from repeat training on the procedure! When we have asked enough of the right questions, we are led to the true cause(s). That is the ultimate outcome I seek, no matter what you call the process or which tool is used. -VB

HPIS C. has articles, impact stories and white papers.
Published article – Why the Band Aids Keep Falling Off

 
