Remind Me Again Why We Need Compliance Refreshers

For many organizations, the sole purpose of refresher training is to satisfy compliance requirements; hence, the focus is on simply delivering the content. Ironically, the intent behind the 211.25 regulation is to ensure that employees receive training more than just at orientation, and frequently enough to remain current. The goal is to ensure compliance with GMPs and SOPs and to improve performance where gaps exist. Improved business performance is the result, not just a checkmark for 100% attendance.

And the practice of repeating the same video year after year as the annual refresher? Efficient, yes. Effective? Well, just look at your deviation and CAPA data to answer that one. When you shift your focus from delivering content as the sole objective to a more learner-centered design, your sessions become more performance-oriented and your effectiveness reaches beyond just passing the GMP quiz.

From passive lecture GxP refreshers to active, learner-centered sessions

Yet senior leaders are not grasping that just “telling them the GMPs” is not an effective training technique, nor is it engaging. Even if it’s backed up with a slide deck, it’s either “death by PowerPoint” or click-to-advance CBT refresher modules. Koreen Pagano, in her June 2014 T&D article “The Missing Piece,” describes it as “telling employees how to swim, then sending them out to sink, hoping they somehow can use the information we’ve provided to them to make it to shore” (p. 42). To make matters worse, employees can end up with disciplinary letters for deviations and CAPAs for failure to follow GMPs.

Look at the GXP Refresher course outline for the last three years at your company. What is the ratio of content to interactivity? When I dig a little deeper, I usually discover that a lack of instructional design skills and minimal creativity are factors. And then I hear, “Oh, but we have so little time and all this content to cover, there’s no more room. If I had more time, you know, I’d add it in.” Koreen informs us that “training is supposed to prepare employees to be better, and yet training professionals often stop after providing content” (p. 43).

See What’s so special about SMEs as Course Designers?

What about using previously developed compliance materials?

I am not criticizing the use of previous course materials if they were effective. But asking an SME to “deliver training” using a previously created PowerPoint presentation does not guarantee effective delivery. Neither does replacing clip art with new images or updating the slide deck to incorporate the new company template. These visual “updates” will not change the effectiveness of the course unless the content is revised and the activities are improved.

For many SMEs and Trainers, having a previous slide deck is both a gift and a curse. While they are not starting with a blank storyboard, there is a tendency to use it as-is and try to embellish it with speaker notes, because the original producer of the slides was not in the habit of entering his/her speaking points for someone else to deliver. Speaker notes embedded at the bottom of the notes pages within PowerPoint slides are not a leader’s guide. While handy for scripting what to say for the slide above, they do not provide ample space for managing other aspects of the course, such as visual cues, “trainer only” tips, and handouts.

The SME bears the burden of making content decisions: what content is critical, and what content can be cut if time runs out. Perhaps even more crucial is how to adapt content and activities to different learner groups or off-shift needs. Without a leader’s guide, the SME is unsupported and will fall back on lecture to fill the duration of the course.

“SMEs put down those speaker’s notes and step away from the podium!” Vivian Bringslimark, HPIS Consulting, Inc.

Better Training Means an Investment in Instructional Design Skills

Interactive, immersive, and engaging are great attributes that describe active training programs. But they come at a price: an investment in instructional design skills. Trained course designers have spent time and budget to create an instructional design that aligns with business needs and has measurable performance outcomes. The course materials “package” is complete only when a leader’s guide is also created that spells out the design rationale and the vision for delivery, especially when someone else, such as an SME in the classroom, will be delivering the course.

The Leader’s Guide: invaluable for effective course delivery

A well-designed leader’s guide identifies the key objectives and the essential learning points to cover. These learning points are appropriately sequenced, with developed discussion questions to be used with activities, thus removing the need for the Trainer/SME to think on demand while facilitating the activity. This also reduces the temptation to skip over an exercise or activity if s/he is nervous or not confident with interactive formats such as virtual breakout groups.

A really good guide will also include how to segue to the next slide and manage seamless transitions between topic sections. Most helpful are additional notes about what content MUST be covered, tips about expected responses for activities, and clock-time comments for keeping to the classroom schedule. SMEs as Facilitators (Instructor-Led SMEs | ILT SMEs) need to study the leader’s guide and pay attention to the icons and notes provided there. These cues mark the shift from lecture to an activity, whether that activity is individual, small group, or large group.

Given all the time and effort needed to produce the leader’s guide, it is wasted if the course designer and the SME as Facilitator do not have a knowledge transfer session. Emailing the guide or downloading it from a SharePoint site will not help the SME follow the guide during delivery unless an exchange occurs in which SMEs can begin to mark up their copy.

Resource from SME as Facilitators Workshop

During the knowledge transfer session/discussion with the course designer, ILT SMEs make notes on how the instructor transitions from one slide to the next and how s/he provides instructions for each activity. This is a good time for ILT SMEs to ask how to modify content or an activity if certain conditions occur. Especially important is asking what content is critical and what content can be skipped if time runs short. It is always a good idea for the ILT SME to mark up his/her copy of the materials, and then again after the first delivery, to really make it their own leader’s guide. For example, SMEs may want to experiment with different ways to “open a session” to gain experience with a variety of techniques and observe which ones yield better results.

Why do ILT SMEs need their own Qualified Trainers workshop?

  • When GMP courses are designed with the learner’s needs in mind, including adequate duration for exercises and activities, learners begin to engage with the content through the skill of a qualified facilitator who can guide the experiential activities.
  • When learner-generated responses are immediately incorporated into the session and leveraged to enhance the debriefings, the involvement and future application back on the job is even greater.

In order to pull this off, ILT SMEs need to learn how to facilitate learning experiences, such as preparing for a facilitated discussion. One of the biggest fears ILT SMEs have when asked to facilitate an exercise or an interactive activity is the fear of it bombing, and discussions are a prime example.

Discussions can often bomb

While popular and commonly used, discussions can also fail miserably if not designed well. Relying on the SME to facilitate the discussion without carefully preparing the path to the targeted outcome leaves it to chance that the SME knows how to execute the activity successfully. That preparation includes the upfront questions to ask, pertinent examples for reference, and application-type activities in which clarifying comments can be addressed.

“It takes effort to get out of your head and connect with individuals.” Ludwig, D., Training Industry, Fall 2015, p. 23.

“… So as to remain current in the practices they perform …”

Is a once-a-year GXP refresher enough? Before you rush to answer this question, consider the following. Do you have:

  • a lot of human-error or operator-error related deviations?
  • or regulatory observations that include failure to thoroughly investigate …?
  • or a large percentage of repeat deviations?

Then you might be sending the mixed message that your employees are NOT trained well enough or sufficiently grounded in their knowledge and application of the GXPs.

Twice a year has fast become the new norm.

There’s a difference between GXP training content that is delivered as a repeat of the same materials and content that is new and/or updated. Yes, new content takes resources and time. But how many times do you want to sit through the same old slides and get nothing new from them? Recall the definition of insanity: doing more of the same while hoping for change. – VB

References:

  • Ludwig, D. “Let’s Get Serious about Live Instructor-Led Training,” Training Industry, Fall 2015, p. 23.
  • Pagano, K. “The Missing Piece,” T&D, June 2014, pp. 41–45.
  • Rock, D. “Your Brain on Learning,” CLO, May 2015, pp. 30–33, 48.
  • Silberman, M. (1990). Active Training: A Handbook of Techniques, Designs, Case Examples, and Tips. Lexington Books, New York.

What’s so special about SMEs as Course Designers?

They have expertise and experience and are expected to share it by training their peers. But now the venue is the classroom as well, and it’s training in course design methodology that is needed. SMEs, and most trainers, do not automatically have this knowledge. Some develop it by reading A LOT, attending well-designed courses, and, over time, through trial and error and painful feedback. The faster way is to provide funds to expose SMEs as Course Designers to how to effectively design learning experiences so that they can influence whether the objectives are met.

This is management support for SMEs as Trainers. SMEs who attend an ID basics course learn how to use design checklists for previously developed materials. These checklists allow them to confidently assess the quality of the materials and justify what needs to be removed, revised, or added, thus truly upgrading previously developed materials.

HPISC Library has articles, impact stories and white papers.


Who is the Author, Vivian Bringslimark?

(c) HPIS Consulting, Inc.

Batteries Not Included: Not All Trainers are Instructional Designers or Classroom Facilitators

When I left the manufacturing shop floor and moved into training, full-time trainers presented in the classroom using a host of techniques and tools, relying on their platform skills to present content. Subject matter experts (or the most senior person) conducted technical training on the shop floor in front of a piece of equipment, at a laboratory station, or at a workbench.

For years, this distinction was clearly practiced where I worked. Trainers were in the classroom and SMEs delivered OJT. Occasionally a “full-time” trainer would consult with an SME on content or request his/her presence in the room during delivery as a back-up or for the Q&A portion of a “presentation.” The boundaries at the time seemed so well understood that one could determine the type of training simply by where it was delivered.

Training boundaries are limitless today

Today, that’s all changed. No longer confined by location or delivery method, full-time trainers can be found on the shop floor, fully gowned, delivering GMP (Good Manufacturing Practices) content, for example. And SMEs are now in the classroom more each day, with some of the very tools used by full-time trainers! What distinguishes a full-time trainer from an SME is less important; what is necessary is what defines effective instruction.

Your title might have the word trainer in it. One of your responsibilities might be serving as a qualified trainer. And you know how to use PowerPoint (PPT). Does this make you an Instructional Designer as well? Some say yes, and others cry foul as they cling to their certificates and advanced degrees. So forgive me when I say that not every Trainer or Training Manager has the ID skill set or competency embedded in his/her toolbox. It’s analogous to the toy box on the shelf at Toys R Us: “NOTE: Batteries Not Included.” Except in our case, the note may be missing from the resume but is definitely embedded in the job description if you are in QA L&D or HR Training and Development.

Instructional Design is a recognized profession 

Instructional Design (ID) as a field of study has been offered by many prominent universities for quite some time and is now better known as Instructional Technology. Underlying the design of a course or a learning event is a methodology for “good” instructional design, and really good instructional designers will confess that there is a bit of an art form to it as well. Unfortunately, with shrinking budgets and downsized L&D staffs, there are fewer resources available to develop the traditional course materials of the past, not to mention shrinking timelines for the deliverables. So it makes sense to tap SMEs for more training opportunities, since many are already involved in training at their site. But pasting their expert content into a PPT slide deck is not instructional design.

What is effective design? 

Basic Elements of Course Design

To me, effective design is when learners not only meet the learning objectives during training but also transfer that learning experience back to the job and achieve the performance objectives/outcomes. That’s a tall order for an SME, and even for full-time trainers who have not had course design training.

The methodology a course designer follows, be it ADDIE, Agile, SAM (Successive Approximation Model), Gagne’s 9 Conditions of Learning, etc., provides a process with steps for the design rationale and then the development of content, including implementation and evaluation of effectiveness. It ensures that key elements, such as needs assessment or effectiveness evaluation, are not unintentionally left out or forgotten about until after the fact. In an attempt to expedite training, these methodology-driven elements are easily skipped without a full understanding of the impact that leaving them out can have on overall training effectiveness. There is a science to instructional design.

PowerPoint Slides are only a visual tool

Using PowerPoint slides by themselves does not make the training successful. Slides are one of the tools a trainer uses to meet the objectives of the learning event, albeit often the main one. The “art form” occurs when a designer creates visually appealing slides or eLearning scenes along with aligned activities and engaging exercises designed to provide exploration, practice, and proficiency for the performance task back on the job. But there is a difference between a course that is created to help the Trainer achieve his/her agenda and one that successfully engages learners to participate, learn, and then transfer their insights back to the job, where changed behavior improves the department’s metrics.

The more trainer/instructor driven the course is, the less participation is required from the learner. For example, the instructor makes all the decisions about the course objectives and content, develops the course, delivers the course, and conducts the assessment.

From passive to active to immersive

As you move along the Learner Participation Continuum, the learner is required to participate more, and the trainer does less “talking”. The learner acquires knowledge and skills through activities that s/he experiences with the assistance of a “facilitator”.  The facilitator is focused on helping the learners meet their needs and interests. It is through these firsthand experiences and facilitated dialogue with other learners that thoughtful analysis and interpretation can become the focus of the instruction. The end result is that learners take full responsibility for decisions, actions and consequences.

Moving from Presenter Controlled Training to Learner Focused Facilitation

Moving to a more learner-focused approach shifts the effort of the design from “deliver this content” to facilitating learning transfer for performance back on the job, which is, after all, the end goal of a training event. The new design includes opportunities for group participation, utilization of participants’ expertise, and real-life problem solving: key principles of adult learning.

At one end of the continuum is the lecture, which is one-way communication and requires very little participation. At the other end, we have experiential learning and now immersive learning environments, with the introduction of 3D graphics, virtual simulations, and augmented reality.

Most Trainers and SMEs tend to suffer from the “curse of too much knowledge” and find it difficult to separate the need-to-know from the nice-to-know content. As a result, it shows up in the slide deck as overburdened slides filled with a lot of “stuff.” Training for them takes on a lecture-style format. The thought of facilitating an activity gives most SMEs a case of the jitters and anxiety.

So, in the “SME as Facilitator” workshop, nominated SMEs as Facilitators are encouraged to step away from the podium and use their eyes, hands, and voice to engage with their audience. Easier said than done, yes. That’s why the course is designed to allow them to take small steps within the safety of a workshop environment.

But rather than trying to pull off a fully immersive session, SMEs as Facilitators are introduced to techniques that “liven up” the lecture. They are shown how to move back and forth from passive listening (sit, hear, see) to active involvement (write, construct, discuss, move, speak). This requires the ability to:

  • follow a well-organized design plan
  • capture and hold the attention of learners
  • use relevant examples and deviations if possible
  • show authentic enthusiasm
  • involve audience both directly and indirectly
  • respond to questions with patience and respect.

While lecture has its merits, today’s learners want engaging content that is timely, relevant, and meaningful. And while virtual reality and simulations are engaging and very immersive, courses and learning events using these techniques rely on well-funded budgets. Most Training Departments are not that fortunate. In the middle of the range are “lively lectures” and alternate methods such as:

Shift one step to the right to begin the move to active learning

  • Demonstrations
  • Case Study
  • Guided Teaching
  • Group Inquiry
  • Read and Discuss
  • Information Search.

Take the 1st shift right.

It’s really about starting with the learners’ expectations and the current organizational culture and then moving one step to the right. If they are used to lectures from SMEs, then work on delivering effective lectures before experimenting with alternate training methods. The overnight shift may be too big of a change for the attendees to adjust to despite their desire for no more boring lectures. Small incremental steps are the key.

Shift to the right when ready for the upgrade

Moving from Lecture to Delivering an EFFECTIVE Lecture

Thoroughness in the preparation reflects care and thoughtfulness. Learners appreciate the personal desire to deliver a livelier lecture. Stepping away from the podium forces the Trainer/SME to take action and allow the learners to “get up close” with the SME as Facilitator. This in turn is reflected in the learner’s desire to respond to questions and dialogue during a facilitated discussion. The rule of thumb for lecturing is approximately 8-10 minutes max. For virtual sessions, the rule of thumb is approximately 5 minutes. 

Take the 2nd Shift: Cut Content to Add Interactivity

How is this done? Upfront, in the design of the course materials. The course designers have spent time and budget to prepare a leader’s guide that captures their vision for delivering the course. SMEs as Facilitators (Classroom SMEs) need to study the leader’s guide and pay attention to the icons and notes provided there. These cues mark the shift from lecture to an activity, whether that activity is individual, small group, or large group. While it may be tempting to skip exercises to make up for lost time, it is better for learner participation to skip lecture and modify an activity if possible.

“STOP TALKING and get learners engaged in some form of activity, practice or reflection exercise”, Vivian Bringslimark, HPIS Consulting, Inc. 

One of the benefits of shifting to this learner-focused design is the opportunity for learners to process the content, make it meaningful for themselves, and then associate memory links to it for later recall when the moment of need is upon them. This can’t happen while the trainer is lecturing. It happens during activities and reflection exercises designed to let learners generate their own ideas during small-group interactions and link them back to the course content/objectives. Learners are prompted to openly discuss issues and problems within a “learning lab” style environment. Trainers become empathetic listeners as they create a climate of trust and safety. They become Facilitators.

Of course, this shift also requires that site leadership and local management not only support the facilitated learning lab concept but follow through on issues and concerns that surface. Failure to do so undermines not only the facilitator’s credibility but the entire training program.

Wow, won’t this take longer to design, you ask? Yes, in the sense that the design is now from the learner’s point of view. This means that the designer will need to research examples, collect data, and might have to develop a story from an incident, a deviation, or a significant CAPA. The reward is that the Trainer/Classroom SME stops talking and gives employees more engaging learning sessions. Learners become more accountable for participating, and guess what: the SME’s session is no longer a boring podium speech.

References:

Silberman, M. (1990). Active Training: A Handbook of Techniques, Designs, Case Examples, and Tips.  Lexington Books, New York.


Why Knowledge Checks are Measuring the Wrong Thing

When I taught middle school math, tests were used to assess knowledge comprehension and some application, with word problems and a few complex questions requiring logic proofs. Results were captured via a score: an indication of how well you answered the questions. Whether you call it a quiz, a knowledge check, or any other name, it is still assessing some form of knowledge comprehension.

Life sciences companies are required by law to conduct annual regulations training (GXP Refreshers) so that employees remain current in the operations being performed. Given the design and delivery of most refresher training sessions, the result is heavy content delivered to a passive learner. Listening to a speaker or reading slide after slide and clicking the next button is not active engagement. It’s boring, mind-numbing and, in some cases, downright painful for learners. As soon as the training is over, it’s very easy to move past the experience. That also means most of the content is soon forgotten. Regulatory agencies are no stranger to this style of compliance training and delivery. But when departures from SOPs and other investigations reveal a lack of GXP knowledge, they will question the effectiveness of the GXP training program.

So why are we still using tests?

In our quest for effective compliance training, we have borrowed the idea of testing someone’s knowledge as a measure of effectiveness.  This implies that a corporate classroom mirrors an educational classroom and testing means the same thing – a measure of knowledge comprehension.  However, professors, colleges, universities and academic institutions are not held to the same results standard.  In the Life Sciences, two very common situations occur where knowledge checks and “the quiz” are heavily used: Annual GXP Refresher and the Read & Understand Approach for SOPs. 

Typical Knowledge Check

But what is the knowledge assessment measuring? Is it mapped to the course objectives, or are the questions so general that they can be answered correctly without having to attend the sessions? Or worse yet, are the questions being recycled from year to year and event to event? What does it mean for the employee to pass the knowledge check or receive 80% or better? When does s/he learn of the results? In most sessions, there is no time left to debrief the answers. This is a lost opportunity to leverage feedback into a learning activity. How do employees know if they are leaving the session with the “correct information”?

The other common practice is to include a multiple-choice quiz consisting of 5 questions as the knowledge check for SOPs that are “Read & Understood” especially for revisions.  What does it mean if employees get all 5 questions right?  That they will not make a mistake?  That the R & U method of SOP training is effective?  The search function in most e-doc systems is really good at finding the answers.  It doesn’t mean that they read the entire procedure and retained the information correctly.  What does it mean for the organization if human errors and deviations from procedures are still occurring?  Does it really mean “the training” is ineffective?

Conditions must be the same for evaluation as for performance and training

What should we be measuring?

The conditions under which employees are expected to perform need to be the same conditions under which we “test” them.  So, it makes sense to train them under those same conditions as well.  What do you want/need your employees (learners) to do after the instruction is finished? What do you want them to remember and use from the instruction in the heat of their work moments?  Both the design and assessment need to mirror these expectations. 

Ask yourself: when in their day-to-day activities will employees need to use this GMP concept? Or, where in the employees’ workflow will this procedure change need to be applied? Isn’t this what we are training them for? Your knowledge checks need to ensure that employees have the knowledge, confidence, and capability to perform as trained. It’s time to re-think what knowledge checks are supposed to do for you. And that means developing objectives that guide the instruction and form the basis of the assessment.

Objectives drive the design and the assessment

Instructional Design 101 begins with well-developed objective statements for the course, event, or program. These statements, aka objectives, determine the content, and they also drive the assessment. For example, a quiz or a knowledge check is typically used for classroom sessions to ask questions about the content. In order for learners to be successful, the course must include that content, whether delivered in class or as pre-work. But what are the assessments really measuring? How much of the content learners remember, and maybe how much of it they anticipate applying when they return to work?

When the desired outcome is compliance back on the job, knowledge checks need to become more application-oriented (performance-based), using “what-if” situations and real scenarios that require an employee to analyze the situation, propose a workaround, and then evaluate whether the problem-solving idea meets the stated course objectives.

Performance objectives drive a higher level of course design

Training effectiveness is really an evaluation of whether we achieved the desired outcome.  So, I ask you, what is the desired outcome for your GXP training and SOP Training: to gain knowledge (new content) and/or to use the content correctly back in the workplace? The objectives need to reflect the desired outcomes in order to determine the effectiveness of training; not just knowledge retention.

What is the desired outcome?

When you begin with the end in mind namely, the desired performance outcomes, the objective statements truly describe what the learners are expected to accomplish.  While the content may be the same or very similar, how we determine whether employees are able to execute what they learned post training requires more thought about the accuracy of the assessment.  It must be developed from the performance objectives in order for it to be a valid “instrument”.  The learner must perform (do something observable) so that it is evident s/he can carry out the task according to the real workplace conditions.

Accuracy of the assessment tools

The tools associated with the 4 Levels of Evaluation can be effective when used for the right type of assessment. For example, Level 1 (Reaction) surveys are very helpful for formative assessments (see the formative vs. summative distinction below). Level 2 (Learning) knowledge assessments are effective in measuring retention and minimum comprehension and go hand in hand with learning-based objectives. But when the desired outcomes are actually performance-based, Level 3 (Behavior) checklists can be developed for skills demonstrations and samples of finished work products. Note: Level 4 is Results and is business-impact-focused.

Difference between formative and summative assessments

Trainers are left out of the loop

Today’s trainers don’t always have a developed instructional design skill set. They do the best they can with the resources given, including reading books and scouring the Internet. For the most part, their training courses are decent and the assessments reflect passing scores. But when Level 4 (Results) impact questions come from leadership, it becomes evident that trainers are left out of the business analysis loop and are therefore missing the business performance expectations. This is where the gap exists. Instead, trainers build courses based on knowledge/content and develop learning objectives that determine what learners should learn. They create assessments to determine whether attendees have learned the content, but this does not automatically confirm that learners can apply the content back on the job in various situations under authentic conditions.

Levels of objectives, who knew?

Many training managers have become familiar with the 4 Levels of Evaluation over the course of their time in training, but fewer are acquainted with Bloom’s Taxonomy of Objectives. Yes, objectives have levels of increasing complexity, resulting in higher levels of performance. Revised in 2001, the levels were renamed to better describe what is required of the learner to be successful in meeting the objective.

Bloom’s Revised Taxonomy of Objectives

Take note: remembering and understanding are the lowest levels of cognitive complexity, while applying and analyzing are mid-range. Evaluating and creating are at the highest levels. If your end in mind is knowledge gained ONLY, continue to use the lower-level objectives. If, however, your desired outcome is to improve performance or apply a compliant workaround in the heat of a GMP moment, your objectives need to shift to a higher level of reasoning in order for the training design to be effective and meet the stated performance outcomes. Fortunately, much has been written about writing effective objective statements, and resources are available to help today’s trainers.

Did we succeed as intended? Was the training effective in achieving the desired outcomes?

To ensure learner success on the chosen assessment, the training activities must also be aligned with the level of the objectives.  This requires the design of the training event to shift from passive lecture to active engagement intended to prepare learners to transfer what they experienced in the event back to their workspace.  The training design also needs to include practice sessions where making mistakes is an additional learning opportunity, and teach learners how to recognize when a deviation is occurring.  Michael Allen refers to this as “building an authentic performance environment.”

Our training outcomes need to be both knowledge-gained and performance-based.  Now the agency expects us to document that our learners have the knowledge AND can apply it successfully in order to follow SOPs and comply with the regulations.  Thus, trainers and subject matter experts will need to upgrade their instructional design skills if you really want training to succeed as intended.  Are you willing to step up and do what it takes to ensure training is truly effective? – VB

Allen, M. “Design Better Design Backward,” Training Industry Quarterly, Content Development Special Issue, 2017, p. 17.

Who is Vivian Bringslimark, the Author?


(c) HPIS Consulting, Inc.

Retraining and Refresher Training: Aren’t they one and the same?

I say no, not at all. Ask an Operations Manager and he’ll acknowledge that what it’s called is less important than getting the “assignment” done and entered into the LMS. He’s usually more concerned about the loss of productivity during the training than about the effectiveness of the training at that time. It isn’t until later, when the training may have to be delivered again (repeated), that the comment “training doesn’t really work” is heard.

Retraining is typically delivered as repeat training. Corrective actions from *CAPAs usually trigger these types of required training events. In the context of the specific CAPA, we uncover the error, mistake, non-conformance, or what I like to call a performance discrepancy from the expected outcome. It is believed that by delivering the training again, the cause of the discrepancy will be resolved. That is, if the root cause was determined to be a lack of knowledge, a lack of skill, or not enough practice.

Some folks believe that more is better and that with several repeated training sessions, employees will eventually get it right. It always amazes me that we find time to do repeat training over and over again, but complain very loudly about refresher training, significant **SOP revision training, or even new content training.  (*Corrective Actions Preventive Actions; **Standard Operating Procedures)

Refresher training implies that training was already provided at least once. The intention here is to review that content.  A lot of regulatory training requirements are generated to satisfy this need. Common examples are Annual GMP Refreshers and several OSHA standards such as Bloodborne Pathogens training. While the aim is to refresh on the content, it is not necessarily meant to just repeat the training. Also included is the phrase “so as to remain current”: current with practice, trends, and new updates. Hence, refresher training needs to include new material based on familiar content.

Upon Biennial SOP Review

There are some folks who would like to use this required SOP activity to coincide with the need to “refresh” on SOPs already read and/or trained on. The rationale is that if the SOP hasn’t been revised in two or three years, more than likely the training hasn’t been repeated either. So it sounds like a good idea to require that SOPs be “refreshed” on the same SOP review cycle. One could argue it prevents errors; in theory, this sounds very proactive.

But donning my Instructional Designer hat, I ask you: what is the definition of training? To close a knowledge gap or skill gap. What value is there in forcing a mandatory “refresher reading” of SOPs just because the procedure is due for technical review? In practice, this becomes one huge check-mark exercise leading to a paperwork/LMS backlog and might actually increase errors due to “information overload”! Again, what gap are you trying to solve? In the above refresher scenario, we are avoiding a compliance gap by satisfying regulatory requirements.


Defending Your Training Process

For those of you who have fielded questions from regulators, you can appreciate how the very training record produced generates follow-up questions.  How you describe the conditions under which the training occurred, or how it is “labeled,” can impact the message you are sending as well. Calling it retraining instead of refresher training implies that training had to be repeated as a result of a performance problem not meeting expectations or standards, whereas refresher training occurs on a defined cycle to ensure that the forgetting curve or lack of practice is not a factor in poor performance. It is a routine activity for satisfying regulatory expectations.

For end users, clarifying the difference between refresher training and “repeat” training in your Policy/SOP not only defines the purpose of the training session, it also provides the proper sequence of steps to follow to ensure maximum effectiveness of the training. There’s a difference between training content that is new/updated and content delivered as a repeat of the same materials.  Yes, new and/or updated design takes resources and time.  How many times do you want to sit through the same old same old and get nothing new from it? Recall the definition of insanity: doing more of the same while hoping for change.  You just might want to review your Training SOP right about now. – VB

With a little help from my Validation Colleagues – The Training Protocol

In the blog “Learning on the Fly“, I wrote about an urgent learning need requiring instructor-led classroom delivery that needed to be facilitated among a group of talented SMEs.  During the needs assessment portion, I hit a huge barrier.

“I teach GMP Basics and conduct Annual GMP Refreshers several times a year and preach to audiences that you must follow the procedure, otherwise it’s a deviation.  And in less than two weeks, I am expected to teach a process that is changing daily!  Then again, how could I teach a work instruction that is known to be broken, is being re-designed, and is not yet finalized?”

My dilemma challenged the essence of my “learned” compliance belief system about following the first basic GMP principle: “thou shall follow written procedures”!  The instructional designer side of me screamed: how can you teach flawed content?  That’s wasted training that results in scrap learning.  How is that training going to be effective beyond a check in the box?

And then it hit me: validation engineers use protocols to capture their “change in process” work, whether it’s experimental batches, three batches for process validation, or *IQ-OQ-PQ protocols for equipment qualifications.  They validate the procedure or the new process before it can become the standard operating procedure by developing the plan, defining acceptance criteria, managing the unexpected deviations, and capturing the results.  So why couldn’t I borrow the concept and adapt it to my situation?

While it was the intention of the business unit leader to deviate from the approved set of work instructions, a planned deviation would not be appropriate in this case.  The purpose of the training sessions was to test the new sequence of steps and confirm the robustness of the criteria to make correct decisions where needed.  The learners would still be in compliance with the quality policy document and would still meet the intention of the quality system regulation.  They were essentially testing the future “how-to steps” for the proposed new work instructions.

Now before you fire off a rant of emails to me, I did not copy and paste the validation protocol template.  I did, however, include a “please pardon our appearance while we are under construction” paragraph in the training plan to document the departure from the current set of work instructions.  This protocol-like section also included our intentions for the outcomes of the sessions and stipulated required SOP training for all affected users once the finalized set of work instructions was approved and went into effect.

Sometimes the very solution can be found around the next cubicle.  –VB

*Installation Qualification, Operational Qualification, Performance Qualification


Analyses du jour: Isn’t it really all the same thing?

So there’s root cause analysis, and gap analysis, and now performance cause analysis?  Is there a difference? Do they use different tools?  It can be overwhelming to decipher the jargon, no doubt!  I think it depends on which industry you come from and whether your focus is a regulatory/quality system point of view or a performance consulting perspective.  To me, it doesn’t change the outcome.  I still want to know why the deviation occurred, how the mistake was made, and/or what allowed the discrepancy to happen.  Mixing and matching the tools allows me to leverage the best techniques from all.

Why we love root cause analysis

For starters, it’s GMP, and we get to document our compliance with CAPA requirements.  It allows us to use tools and feel confident that our “data doesn’t lie.”  This bodes well for our credibility with management.  And it provides the strategic connection between our training solution (as a corrective action) and site quality initiatives, thus elevating the importance and quite possibly the priority for completing the corrective action on time.

Asking the right questions

Root cause analysis and problem-solving steps dovetail nicely.  See sidebar below.  It requires us to slow down and ask questions methodically and sequentially.  More than one question is asked, for sure.  When you rush the process, it’s easy to grab what appears to be obvious.  And that’s one of the early mistakes that can be made with an overreliance on the tools.  The consequence?  Jumping to the wrong conclusion that automatic retraining or refresher training is the needed solution.  Done, checkmark.  On to the next problem that needs a root cause analysis.  But when the problem repeats or returns with a more serious consequence, we question why the training did not transfer, or we wonder what’s wrong with the employee: why is s/he not getting this yet?

Side Bar – Double Click to Enlarge.

No time to do it right, but time to do it twice!

Solving the problem quickly and rapidly closing the CAPA allows us to get back to our other pressing tasks.  Unfortunately, “band-aids” fall off.  The symptom was only covered up and temporarily put out of sight; the original problem wasn’t solved.  So now we must investigate again (spend more time) and dig a little deeper.  We have no time to do it right, but find the time to do it twice.  Madness!

Which tool to use?

My favorite human performance cause tool is the fishbone diagram, although the “5 Whys” technique is a close second.  Both tools force you to dig a little deeper into the causes.  Yes, the end result often reveals something is amiss with “the training,” but is it man, machine, method, or materials?  Ah-hah, that is very different than repeat training on the procedure!  Alas, when we have asked enough of the right questions, we are led to the true cause(s).  That is the ultimate outcome I seek, no matter what you call the process or which tool is used. – VB

Published article – Why the Band Aids Keep Falling Off

