Did we succeed as intended? Was the training effective?

When you think about evaluating training, what comes to mind? It’s usually a “smile sheet,” the feedback survey about the course, the instructor, and what you found useful. As a presenter/instructor, I find the results from these surveys very helpful, so thank you for completing them. I can make changes to the course objectives, modify content, or tweak activities based on the comments. I can even pay attention to my platform skills where noted. But does this information help us evaluate whether the course was successful?

Formative vs. Summative Distinction

Formative assessments provide data about the course design. Think form-ative; form-at of the course. The big question to address is whether the course as designed met the objectives. For example, the type of feedback I receive from surveys gives me comments and suggestions about the course.

Summative assessments are less about the course design and more about the results and impact. Think summative; think summary. It’s more focused on the learner, not the instructional design. But when performance expectations are not met or the “test” scores are marginal, the focus shifts back to the course, the instructor/trainer, and the instructional designer with the intent to find out what happened. What went wrong? When root cause analysis fails to find the cause, it’s time to look a little deeper at the objectives.

Objectives drive the design and the assessment

Instructional Design 101 begins with well-developed objective statements for the course, event, or program. These statements, aka objectives, determine the content, and they also drive the assessment. For example, a written test or knowledge check that asks questions about the content is typically used for classroom sessions. For learners to be successful, the course must include that content, whether delivered in class or as pre-work. But what are the assessments really measuring? How much of the content learners remember, and maybe how much of it they can apply when they return to work?

Training effectiveness, on the other hand, is really an evaluation of whether we achieved the desired outcome. So I ask you, what is the desired outcome for your training: to gain knowledge (new content), or to use the content correctly back in the workplace? The objectives need to reflect the desired outcome in order to determine the effectiveness of training.

What is your desired outcome from training?

Levels of objectives, who knew?

Many training professionals have become familiar with Kirkpatrick’s 4 Levels of Evaluation over the course of their careers, but fewer are acquainted with Bloom’s Taxonomy of Objectives. Yes, objectives have levels of increasing complexity resulting in higher levels of performance. When the taxonomy was revised in 2001, the levels were renamed to better describe what’s required of the learner to be successful in meeting the objective. Take note: remembering and understanding are the lowest levels of cognitive complexity, applying and analyzing are mid-range, and evaluating and creating are at the highest levels.

If your end in mind is knowledge gained ONLY, continue to use the lower-level objectives. If, however, your desired outcome is to improve performance or to apply a compliant workaround in the heat of a GMP moment, your objectives need to shift to a higher level of reasoning for the training design to be effective and meet performance expectations. They need to become more performance-based. Fortunately, much has been written about writing effective objective statements, and resources are available to help today’s trainers.

Accuracy of the assessment tools

The tools associated with the 4 levels of evaluation can be effective when used for the right type of assessment. For example, Level 1 (Reaction) surveys are very helpful for formative assessments. Level 2 (Learning) assessments are effective in measuring retention and minimum comprehension and go hand in hand with learning-based objectives. But when the desired outcomes are actually performance-based, Level 2 knowledge checks need to shift up to become more application-oriented, such as “what if” situations and scenarios requiring analysis, evaluation, and even problem solving. Or shift altogether to Level 3 (Behavior) and develop a new level of assessments, such as demonstrations and samples of finished work products.

Trainers are left out of the loop

But today’s trainers haven’t always developed the instructional design skill set. They do the best they can with the resources given, including reading books and scouring the Internet. For the most part, their training courses are decent and the assessments reflect passing scores. But when it comes to Level 4 (Results) impact questions from leadership, it becomes evident that trainers are left out of the business analysis loop and are therefore missing the performance expectations. This is where the gap exists. Trainers build courses based on knowledge/content instead and develop learning objectives that determine what learners should learn. They create assessments to determine whether attendees have learned the content, but this does not automatically confirm learners can apply the content back on the job in various situations under authentic conditions.

Performance objectives drive a higher level of course design

When you begin with the end in mind, namely the desired performance outcomes, the objective statements truly describe what the learners are expected to accomplish. While the content may be the same or very similar, how we determine whether employees are able to execute post-training requires more thought about the accuracy of the assessment. It must be developed from the performance objectives in order for it to be a valid “instrument.” The learner must perform (do something observable) so that it is evident s/he can carry out the task under real workplace conditions.

To ensure learner success with the assessment, the training activities must also be aligned with the level of the objectives. This requires the design of the training event to shift from passive lecture to active engagement intended to prepare learners to transfer what they experienced in the event back to their workspace. This includes making mistakes and learning how to recognize that a deviation is occurring. Michael Allen refers to this as “building an authentic performance environment.” Thus, trainers and subject matter experts will need to upgrade their instructional design skills if we really want training to succeed as intended. Are you willing to step up and do what it takes to ensure training is truly effective? – VB


Allen, M. “Design Better Design Backward.” Training Industry Quarterly, Content Development Special Issue, 2017, p. 17.

Why Knowledge Checks are Measuring the Wrong Thing

When I taught middle school math, tests were used to assess knowledge comprehension and some application, with word problems and a few complex questions requiring logic proofs. Results were captured via a score: a metric, if you will, of how well you answered the questions, and very appropriate in academia.

In our quest for training evaluation metrics, we have borrowed the idea of testing someone’s knowledge as a measure of effectiveness. This implies that a corporate classroom mirrors an educational classroom and that testing means the same thing: a measure of knowledge comprehension. However, professors, colleges, universities, and academic institutions are not held to the same results-oriented standard. In the business world, results need to be performance-oriented, not knowledge gained.

So why are we still using tests?

Call it a quiz, a knowledge check, or any other name; it is still assessing some form of knowledge comprehension. In training effectiveness parlance, it is also known as a Level 2 evaluation. Having the knowledge is no guarantee that it will be used correctly back on the job. Two very common situations occur in the life science arena where “the quiz” and knowledge checks are heavily used: the Annual GMP Refresher and the Read & Understand approach for SOPs.

Life sciences companies are required by law to conduct annual regulations training (GMP Refreshers) so as to remain current. To address the training effectiveness challenge, a quiz / questionnaire / knowledge assessment (KA) is added to the event. But what is the KA measuring? Is it mapped to the course/session objectives, or are the questions so general that they can be answered correctly without having to attend the sessions? Or worse yet, are the questions being recycled from year to year and event to event? What does it mean for the employee to pass the knowledge check or score 80% or better? When does s/he learn of the results? In most sessions, there is no time left to debrief the answers. This is a lost opportunity to leverage feedback into a learning activity. How do employees know if they are leaving the session with the “correct information”?

The other common practice is to include a 5-question multiple choice quiz as a knowledge check for Read & Understood (R&U) SOPs, especially for revisions. What does it mean if employees get all 5 questions right? That they will not make a mistake? That the R&U method of SOP training is effective? The search function in most e-doc systems is really good at finding the answers; a perfect score doesn’t necessarily mean that employees read the entire procedure and retained the information correctly. What does it mean for the organization if human errors and deviations from procedures are still occurring? Does it really mean the training is ineffective?

What should we be measuring?

The conditions under which employees are expected to perform need to be the same conditions under which we “test” them. So it makes sense to train ‘em under those same conditions as well. What do you want/need your employees (learners) to do after the instruction is finished? What do you want them to remember and use from the instruction in the heat of their work moments? Both the design and assessment need to mirror these expectations. And that means developing objectives that guide the instruction and form the basis of the assessment. (See Performance Objectives are not the same as Learning Objectives.)

So ask yourself: when in their day-to-day activities will employees need to use this GMP concept? Or where in the employees’ workflow will this procedure change need to be applied? Isn’t this what we are training them for? Your knowledge checks need to ensure that employees have the knowledge, confidence, and capability to perform as trained. It’s time to re-think what knowledge checks are supposed to do for you. – VB

Need to write better Knowledge Check questions?  Need to advise peers and colleagues on the Do’s and Don’ts for writing test questions?

Instructional Design: Not Just for Full Time Trainers Anymore

When I left the manufacturing shop floor and moved into training, full time trainers presented in the classroom using a host of techniques and tools, and relied on their platform skills to present content. Subject matter experts (or the most senior person) conducted technical training on the shop floor in front of a piece of equipment, at a laboratory station, or at a work bench.

For years, this distinction was clearly practiced where I worked. Trainers were in the classroom and SMEs delivered OJT. Occasionally a “full time” trainer would consult with a SME on content or request his/her presence in the room during delivery as a back-up or for the Q & A portion of a “presentation.” It seemed that the boundaries at the time were so well understood that one could determine the type of training simply by where it was delivered.

Training boundaries are limitless today

Today, that’s all changed. No longer confined to location or delivery methods, full time trainers can be found on the shop floor, fully gowned, delivering GMP (Good Manufacturing Practices) content, for example. And SMEs are now in the classroom more each day with some of the very tools used by full time trainers! What distinguishes a full time trainer from a SME is less important; what is imperative, however, is what defines effective instruction.

Instructional Design is a recognized profession

What goes into good instructional design?

Believe it or not, instructional design (ID) / instructional technology is a degreed program offered at numerous colleges and universities. Underlying the design is a methodology for “good” course design, and really good instructional designers will confess that there is a bit of an art form to it as well. Unfortunately, with shrinking budgets and downsized L&D staffs, there are fewer resources available to develop training materials. Not to mention shrinking timelines for the deliverables. So it makes sense to tap SMEs for more training opportunities, since many are already involved in training at their site. But pasting their expert content into a PowerPoint slide deck is not instructional design. Nor is asking a SME to “deliver training” using a previously created PowerPoint presentation effective delivery.

What is effective design?

To me, effective design is when learners not only meet the learning objectives during training but also transfer that learning experience back on the job and achieve the performance objectives/outcomes. That’s a tall order for a SME, even for full time trainers who have not had course design training. The methodology a course designer follows, be that ADDIE, Agile, SAM (Successive Approximation Model), Gagne’s 9 Events of Instruction, etc., provides a process with steps to facilitate the design rationale and then the development of content, including implementation and evaluation of effectiveness. It ensures that key elements are not unintentionally left out or forgotten about until after the fact, like evaluation/effectiveness or needs assessment. In an attempt to expedite training, these methodology-driven elements are easily skipped without fully understanding the consequences for overall training effectiveness. There is a science to instructional design.

The “art form” occurs when a designer creates visually appealing slides and eLearning scenes, as well as aligned activities and engaging exercises designed to provide exploration, practice, and proficiency for the performance task back on the job. The course materials “package” is complete when a leader’s guide is also created that spells out the design rationale and vision for delivery, especially when someone else will be delivering the course, such as SMEs as Classroom Facilitators.

The Leader’s Guide

Speaker notes embedded at the bottom of the notes pages within PowerPoint slides are not a leader’s guide. While handy for scripting what to say for the slide above, they do not provide ample space for facilitating other aspects of the course, such as visual cues, “trainer only” tips, and managing handouts. A well-designed leader’s guide has the key objectives identified and the essential learning points to cover. These learning points are appropriately sequenced, with developed discussion questions to be used with activities, thus removing the need for the facilitator to think on demand while facilitating the activity. This also reduces the temptation to skip over the exercise/activity if s/he is nervous or not confident with interactive activities.

A really good guide will also include how to segue to the next slide and manage seamless transitions to the next topic sections. Most helpful are additional notes about what content MUST be covered, tips about expected responses for activities, and clock-time comments for keeping to the classroom schedule. Given all the time and effort to produce the leader’s guide, it is wasted if the course designer and the SME as Facilitator do not have a knowledge transfer session. Emailing the guide or downloading it from a SharePoint site will not help the SME follow the guide during delivery unless an exchange occurs in which the SME can begin to mark up his/her copy.

Using previously developed materials

I am not criticizing previous course materials if they were effective. But replacing clip art with new images and updating the slide deck to incorporate the new company background is not going to change the effectiveness of the course unless the content was revised and the activities were improved. For many SMEs, having a previous slide deck is both a gift and a curse.

While they are not starting with a blank storyboard, there is a tendency to use the deck as-is and try to embellish it with speaker notes, because the original producer of the PowerPoint slides did not include them or, worse, provided no leader’s guide. The SME has the burden of making content decisions, such as what content is critical and what content can be cut if time runs short. Perhaps even more crucial is how to adapt content and activities to different learner groups or off-shift needs. SMEs who attend an HPISC ID basics course learn how to use design checklists for previously developed materials. These checklists allow them to confidently assess the quality of the materials and justify what needs to be removed, revised, or added, thus truly upgrading previously developed materials.

What’s so special about SMEs as Course Designers?

They have expertise and experience and are expected to share it by training their peers. But now the venue is the classroom as well. It’s training on course design methodology that is needed; SMEs and most trainers do not automatically have this knowledge. Some develop it by reading A LOT, attending well-designed courses, and, over time, through trial and error and painful feedback. The faster way is to provide funds to get SMEs as Course Designers at least exposed to how to design effective learning experiences so that they can influence the outcome of the objectives. This is management support for SMEs as Trainers. -VB

Facilitating the Shift from Passive Listening to Active Learning

At one end of “The Learner Participation Continuum” is lecture, which is one-way communication and requires very little participation. At the other end, we have experiential learning and now immersive learning environments with the introduction of 3D graphics, virtual simulations, and augmented reality.

In the middle of the range are effective “lectures” and alternate methods such as:

  • Demonstrations
  • Case Study
  • Guided Teaching
  • Group Inquiry
  • Read and Discuss
  • Information Search

Shift one step to the right to begin the move to active learning.

Now before you insist that the SME as Facilitator move to the far right and conduct only immersive sessions, a word of caution is in order. It’s really about starting with the learners’ expectations and the current organizational culture, and then moving one step to the right. If learners are used to lectures from SMEs, then work on delivering effective lectures before experimenting with alternate training methods. An overnight shift may be too big a change for the attendees to adjust to, despite their desire for no more boring lectures. Small incremental steps are the key.

How is this done? Upfront, in the design of the course materials. The course designers have spent time and budget to prepare a leader’s guide that captures their vision for delivering the course. SMEs as Facilitators (Classroom SMEs) need to study the leader’s guide and pay attention to the icons and notes provided there. These cues indicate the shift from lecture to an activity, whether that be individual, small group, or large group. While it may be tempting to skip exercises to make up for lost time, it is better for learner participation to skip lecture and modify an activity if possible.

During the knowledge transfer session/discussion with the course designer and/or instructor, Classroom SMEs should make notes of how the instructor transitions from one slide to the next and how s/he provides instruction for the activity. This is a good time for Classroom SMEs to ask how to modify content or an activity if certain conditions should occur. Especially important for SMEs to ask is what content is critical and what content can be skipped if time runs short. It is always a good idea for the Classroom SME to mark up his/her copy of the materials, and then again after the first delivery, to really make it his/her own leader’s guide. -VB

Speaking of personalizing their leader’s guide, SMEs may want to experiment with different ways to “open a session” to gain experience with a variety of techniques and observe which ones yield better results.

I’m in love with my own content!

Many QA/HR Training Managers have the responsibility for providing a train-the-trainer course for their designated trainers. While some companies send their folks to public workshop offerings, many choose to keep the program in-house. And then an interesting phenomenon occurs. The course content grows with an exciting and overwhelming list of learning objectives.

The supervisors of the SMEs struggle with the loss of productivity for the 2–3 day duration and quickly develop a “one and done” mindset. Given the opening to “train” newly identified SMEs as Trainers, the instructional designer gets one opportunity to teach them how to be trainers. So s/he tends to add “a lot of really cool stuff” to the course in the genuine spirit of sharing, all justifiable in the eyes of the designer. However, there is no hope of breaking this adversarial cycle if the Training Manager doesn’t know how to cut content.

I used to deliver a two-day (16 hour) workshop for OJT Trainers. I included all my favorite topics. Yes, the workshop was long. Yes, I loved teaching these concepts. I honestly believed that knowing these “extra” learning theory concepts would make my OJT Trainers better trainers. Yes, I was in love with my own content. And then one day, that all changed.


Do they really need to know Maslow’s Hierarchy of Needs?

During a rapid design session I was leading, I got questioned on the need to know Maslow’s Hierarchy of Needs. As I began to deliver my auto-explanation, I stopped mid-sentence. I had an epiphany. My challenger was right. Before I continued with my response, I feverishly racked my brain, thinking about the training Standard Operating Procedures (SOPs) we revised and the forms we created, and reminded myself of the overall goal of the OJT Program. I was searching for that one moment during an OJT session when Maslow was really needed. When, if ever, would an OJT Qualified Trainer use this information back on the job, I asked myself?

It belongs in the Intermediate Qualified Trainers Workshop, I said out loud. In that moment, that one-question exercise was like a laser beam cutting out all the nice-to-know content. I eventually removed up to 50% of the content from the workshop.


Oh, but what content do we keep?

Begin with the overall goal of the training program: a defendable and reproducible methodology for OJT. The process is captured in the redesigned SOPs and does not need to be repeated in the workshop. See “Have you flipped your OJT TTT Classroom yet?”

Seek agreement with key stakeholders on what the OJT QTs are expected to do after the workshop is completed.  If these responsibilities are not strategic or high priority, then the course will not add any business value.  Participation remains simply a means to check the compliance box.  Capture these expectations as performance objectives.

How to align the purpose of a course to business goals

Once there is agreement on the stated performance objectives, align the content to match them. Yes, there is still ample room in the course for learning theory, but it is tailored to the need-to-know topics only.

In essence, the learning objectives become evident.  When challenged to add certain topics, the instructional designer now refers to the performance objectives and ranks the consequences of not including the content in the workshop against the objectives and business goal for the overall program.


What is the value of the written assessment?

With the growing demand for training effectiveness, the addition of a written test was supposed to illustrate the commitment to compliance expectations around effectiveness and evaluation. To meet this client need, I put on my former teacher hat and created a 10-question, open-book written assessment. This proved to need additional time to execute, and hence more content was cut to accommodate the classroom duration.

My second epiphany occurred during the same rapid design project, albeit a few weeks later. What is the purpose of the classroom written assessment when, back on the job, the OJT QTs are expected to deliver (perform) OJT, not just know it from memory? The true measure of effectiveness for the workshop is whether they can deliver OJT according to the methodology, not whether they retained 100% of the course content! So I removed the knowledge test and created a qualification activity for the OJT QTs to demonstrate their retained knowledge in a simulated demonstration using their newly redesigned OJT checklist. Now the OJT QT Workshop is value-added, and management keeps asking for another round of the workshop to be scheduled. -VB

Are you ready to update your OJT TTT Course?



With a little help from my Validation Colleagues – The Training Protocol

In the blog post “Learning on the Fly,” I wrote about an urgent learning need requiring instructor-led classroom delivery, facilitated among a group of talented SMEs. During the needs assessment portion, I hit a huge barrier.

“I teach GMP Basics and conduct Annual GMP Refreshers several times a year, and I preach to audiences that you must follow the procedure, otherwise it’s a deviation. And in less than two weeks, I am expected to teach a process that is changing daily! Yet on the other hand, how could I teach a work instruction that is known to be broken, is being re-designed, and is not yet finalized?”

My dilemma challenged the essence of my “learned” compliance belief system about following the 1st basic GMP principle: “thou shall follow written procedures”! The instructional designer side of me screamed: how can you teach flawed content? That’s wasted training that results in scrap learning. How is that training going to be effective beyond a check in the box?

And then it hit me: validation engineers use protocols to capture their “change in process” work, whether it’s experimental batches, 3 batches for process validation, or *IQ-OQ-PQ protocols for equipment qualifications. They are validating the procedure or the new process before it can become the standard operating procedure by developing the plan, developing acceptance criteria, managing the unexpected deviations, and capturing the results. So why couldn’t I borrow the concept and adapt it to my situation?

While it was the intention of the business unit leader to deviate from the approved set of work instructions, a planned deviation would not be appropriate in this case.  The purpose of the training sessions was to test the new sequence of steps and confirm the robustness of the criteria to make correct decisions where needed.  The learners would still be in compliance with the quality policy document and would still meet the intention of the quality system regulation.  They were essentially testing the future “how-to steps” for the proposed new work instructions.

Now before you fire off a rant of emails to me, I did not copy and paste the validation protocol template. I did, however, include a “please pardon our appearance while we are under construction” paragraph in the training plan to document the departure from the current set of work instructions. This protocol-like section also included our intentions for the outcomes of the sessions and stipulated required SOP training for all affected users once the finalized set of work instructions was approved and went into effect.

Sometimes the very solution can be found around the next cubicle.  –VB

*Installation Qualification, Operational Qualification, Performance Qualification


“Learning on the fly” or is this what they meant by the Agile Learning Model?

When Rapid Design for eLearning found its way into my vocabulary, I loved it and all the derivatives, like rapid prototyping. And soon I started seeing Agile this and Agile that. It seemed that Agile was everywhere I looked. When Michael Allen published his book, Leaving ADDIE for SAM, I was intrigued and participated in an ATD (formerly known as ASTD) sponsored webinar. It made a lot of sense to me and “I bought into the concept.” Or so I thought …


A few weeks back, I joined a project that was already in progress and had to “hit the ground running to get caught up to speed.” The element of urgency was the anticipation of an FDA visit following a consent decree. If you’ve experienced this “scene” before, you can relate to the notion of expedited time. As part of remediation efforts, training events needed to be conducted. I learned during a meeting sometime in my first week that I was to be the trainer. Okay, given my background and experience, that made sense. Sure, in a few weeks when we have the new procedure in place, I’d be happy to put the training materials together, is what I was thinking. Wait, in two weeks? Are you kidding me? I’m not the SME, and I don’t even have the software loaded on my laptop yet. Well, some cleaned-up version of those words was my response.


But what about all that buzz for rapid design and prototyping I’ve been reading about? In theory, I totally bought it. But this is different, I argued with myself. This is compliance with a quality system for a company that is undergoing transformative change as a result of a consent decree! I teach GMP Basics and conduct Annual GMP Refreshers several times a year, and I preach to audiences that you must follow the procedure, otherwise it’s a deviation. And in less than two weeks, I am expected to teach a process that is changing daily! Yet on the other hand, how could I teach a work instruction that is known to be broken, is being re-designed, and is not yet finalized? Stay tuned for a future blog about how I overcame this dilemma.


My bigger issue was to get out of my own design way. I’m classically schooled in *ADDIE, and with 25+ years as an instructional designer, I am very comfortable with how to design, develop, and deliver training. All I needed was more time. And then it hit me! I was so focused on what I needed that I was missing the urgency of the learners’ needs. It was time to put theory into practice and take the agile plunge into the domain of the unknown.


By shifting the prioritization away from perfectly designed classes with pristine training materials, I was able to diagnose that the need was to get the learners into a live classroom. They needed to see the database software in action and “play in the sandbox”; the training materials could follow afterwards. I shifted my role to facilitator and found the true SMEs to navigate the software screens and explain how to complete field transactions. To my surprise and delight, trainer-wannabes volunteered to paste screenshots into participant worksheets so attendees could take notes. I became a scribe and worked on sequencing these pages for the next round of attendees. Together, we all collaborated to meet the urgent need of the learners. And we documented it! Once they had the tour and sandbox time, the learners were paired up with a buddy for guided instruction on real entry into the live system. The following week, the department was able to go live with a project plan that focused on a series of interim roles, changed roles, and transitioning responsibilities within established roles. The project launched on time to meet commitments promised to the agency.


It was energizing and empowering for the learners and a truly collaborative experience for the SMEs, and the biggest surprise of all was that they thanked me. Me? I did not deliver the training; I was not the SME; nor did I provide perfect training materials. If I had pursued my classically trained ADDIE technique, we’d still be waiting to deliver those sessions. However, I’m not ready to throw ADDIE overboard yet. She has served me well and continues to be an appropriate technique in most of my training situations.


My lesson learned was this: when the need is for speed and the design is not the key focus, I need to give up control to the SMEs and learners and focus on facilitating the best learning experience given the daily change challenges and system constraints. Is this “learning on the fly” or agile learning in practice? You decide.


*NOTE: ADDIE = Analyze, Design, Develop, Implement, Evaluate, the classic phases of the Instructional Systems Design (ISD) technique.