Taking the Agile Learning Plunge

When Rapid Design for eLearning found its way into my vocabulary, I loved it, along with all the derivatives like rapid prototyping. Soon I started seeing agile this and agile that; it seemed that agile was everywhere I looked. When Michael Allen published his book, Leaving ADDIE for SAM, I was intrigued and participated in an ATD-sponsored webinar. It made a lot of sense to me and “I bought into the concept.” Or so I thought …

I joined a project that was already in progress and had to “hit the ground running” to get up to speed. The element of urgency was the anticipated FDA follow-up visit after a consent decree. If you’ve experienced this “scene” before, you can relate to the notion of expedited time. As part of remediation efforts, training events needed to be conducted. I learned during a meeting sometime in my first week that I was to be the trainer. Okay, given my instructional design background and classroom facilitation experience, that made sense. Sure, in a few weeks, once we have the new procedure in place, I’d be happy to put the training materials together, is what I was thinking. Wait, what, in two weeks? Are you kidding me? I’m not the SME, and I don’t even have the software loaded on my laptop yet. Well, some cleaned-up version of those words was my response.

My biggest challenge was to get out of my own design way

I’m classically schooled in *ADDIE, with 30+ years as an instructional designer, and very comfortable with how to design, develop, and deliver training. All I needed was more time: more than two weeks, for a process that was changing daily! And then I found myself thinking about all the buzz for rapid design and prototyping I had been reading about.

*ADDIE = Analysis, Design, Development, Implementation, Evaluation: a project management approach to training projects.

In theory, I totally bought into it. But this is different, I argued with myself. This is compliance with a quality system for a company undergoing transformative change as a result of a consent decree! Furthermore, I teach GMP Basics and conduct Annual GMP Refreshers several times a year. My GMP dilemma challenged the very essence of my “learned” compliance beliefs about following the first basic GMP work habit: “thou shalt follow written procedures”; otherwise, it’s a deviation.

Are we really planning to deviate from the SOP while under a consent decree?

While it was the intention of the business unit leader to deviate from the approved set of work instructions, a planned deviation would not be appropriate in this case. We were talking about a corrective action for a consent decree item. Were we really considering a PLANNED DEVIATION to intentionally teach unapproved procedures and then submit the documentation to the agency as a completed corrective action for the CAPA? I was truly baffled by how I was going to pull this off in two weeks. I’m not a magician; I can’t pull this rabbit out of my laptop, is what I was thinking when I left the VP’s office.

Yet on the other hand, how could I teach a work instruction that was known to be broken, was being redesigned, and was not yet finalized? The instructional designer side of me screamed: how can you teach flawed content? That’s wasted training that results in scrap learning. How would that training be effective, not to mention having to explain a failed effectiveness check during the follow-up inspection?

And then it hit me! I was so focused on WHAT I NEEDED that I was missing the urgency of the learners’ needs. Julia Lewis Satov refers to this situation as “agility by fire”: “the ability to move quickly but not easily, and still excel” (Satov, 2020, p. 50). It was time to put theory into practice and take the agile learning plunge into the realm of the unknown. If I could come up with a way to document what we were doing and get it approved, then I could reconcile my GMP dilemma and satisfy my inner instructional designer.

With a little help from my validation colleagues: the training implementation plan

Validation engineers use protocols to capture their “change in process” work, whether it’s experimental batches, three batches for process validation, or **IQ-OQ-PQ protocols for equipment qualifications. They validate the procedure or the new process before it can become the standard operating procedure by developing the plan, defining acceptance criteria, managing deviations, and capturing the results. So why couldn’t I borrow the concept and adapt it to my situation?

**Installation Qualification, Operational Qualification, Performance Qualification

The purpose of the initial training session was to test the new sequence of steps, confirm the robustness of the software responses for each field entry, and then make correct decisions where needed. The learners were still in compliance with the quality policy for complaint handling and were still meeting the intent of the Medical Device Reporting requirements. They were essentially testing the future “how-to steps” for the proposed new work instructions.

Agile QTs processing their learning experience

I did not copy and paste the validation protocol template. I did, however, include a “please pardon our appearance while we are under construction” paragraph in the training plan to document the departure from the current set of work instructions. This protocol-like section also included our intentions for the outcomes of the sessions. We also stipulated that required SOP training of all affected users, including the Qualified Trainers (QTs), would be mandatory once the finalized set of work instructions was approved.

Anybody want to play in the sandbox?

By shifting the prioritization away from perfectly designed classes with pristine training materials, I was able to see that the real need was to get the learners into a live classroom. But first I needed a small group of super users who wanted to see the database software in action and “play in the sandbox”; the training materials could follow afterwards.

It didn’t take long for them to find me. These “learning-agile individuals” not only wanted the challenge of learning something new but seemed to thrive on the idea that they would be managing their part of the training implementation plan. They were not at all worried about the lack of available training materials for themselves. They allowed the learning experience to occur spontaneously. Their previously learned skills did not get in the way of acquiring new knowledge and skills. They embraced the changes rather than resisting them.

A new breed of SMEs as Agile Qualified Trainers?

I shifted my role to facilitator and allowed these learning-agile SMEs to navigate the software screens and then work out the explanation of how to complete field transactions. In the Center for Creative Leadership “Learning Agility” white paper, authors Adam Mitchinson and Robert Morris explain that learning-agile individuals “understand that experience alone does not guarantee learning; they take time to reflect, seeking to understand why things happen, in addition to what happened” (p. 2).

“SMEs are true front-line and onsite educators,” says Satov. Every organization has employees who are brimming with intelligent and diverse ideas and are eager to share their talent by producing work deliverables. “[…] Our focus must shift to finding and developing individuals who are continually able to give up skills, perspectives, and ideas that are no longer relevant, and learn new ones that are” (Mitchinson and Morris, 2014, p. 1).

We documented these sessions as training because we all learned how to navigate the screens, albeit on the fly. We recognized that learning the software was the primary goal; developing the process steps and, eventually, the work instructions was the secondary goal. This training documentation became the qualifying evidence for their train-the-trainer knowledge transfer. And collectively they decided which choices end users were to pick from the drop-down tables.

Is this “learning on the fly” or agile learning in practice? You decide.

1 + 1 + 1 is more than 3

I shifted my role again to become a scribe and worked on sequencing these pages for the next round of end-users. To my surprise and delight, my new breed of Agile QTs volunteered to paste screenshots into participant worksheets so their “students” could take additional notes. Together, we all collaborated to meet the urgent need of the end-users. Each of us in our niche roles experienced first-hand the value the others brought with them to that room. And in that time away from our regular job tasks, we became more valuable to the organization.

The learners were paired up with their Agile QT for guided instruction on real entries into the live system. The following week, the department was able to go live with a project plan that focused on a series of interim roles, changed roles, and transitioning responsibilities within established roles. The project launched on time to meet commitments promised to the agency.

Why are they thanking me?

It was an energizing and empowering learning experience for the super users, a truly collaborative experience for the SMEs, and the biggest surprise of all was that they thanked me. Me? I did not deliver the training; I was not the SME, nor did I provide perfect training materials. If I had pursued my classically trained ADDIE approach, we would have waited for the perfect SOP to deliver those sessions and woefully missed the FDA-committed timelines. While I’m not ready to throw ADDIE overboard yet, Satov makes a compelling plea: “move aside elite and long-standing establishments of formal education.”

My lesson learned was this: when the demand is for speed and content design is not the key focus, I need to give up control to the true onsite educators and focus on facilitating the best learning experience given the daily change challenges and system constraints. Satov would agree: “the role of learning is to capitalize and create the architecture of the hybrid-mind.” Is this “learning on the fly” or agile learning in practice? You decide. But agile instructional design is here to stay if QA L&D is going to keep up with the fast-paced, often reactive, and regulated world of the Life Sciences industry. – VB

  • Allen, M. Leaving ADDIE for SAM: An Agile Model for Developing the Best Learning Experiences. ASTD, 2012.
  • Mitchinson, A. & Morris, R. Learning Agility. Center for Creative Leadership white paper, 2014.
  • Satov, J. M. L. “Agile by Fire.” Chief Learning Officer, July/August 2020, p. 50.

(c) HPIS Consulting, Inc.

Why Knowledge Checks are Measuring the Wrong Thing

When I taught middle school math, tests were used to assess knowledge comprehension and some application, with word problems and a few complex questions requiring logic proofs. Results were captured via a score, an indication of how well you answered the questions. Whether you call it a quiz, a knowledge check, or any other name, it is still assessing some form of knowledge comprehension.

Life sciences companies are required by law to conduct annual regulations training (GXP Refreshers) so that employees remain current in the operations being performed. Given the design and delivery of most refresher training sessions, the content is heavy and delivered to a passive learner. Listening to a speaker, or reading slide after slide and clicking the next button, is not active engagement. It’s boring, mind-numbing, and, in some cases, downright painful for learners. As soon as the training is over, it’s very easy to move past the experience. That also means most of the content is soon forgotten. Regulatory agencies are no stranger to this style of compliance training and delivery. But when departures from SOPs and other investigations reveal a lack of GXP knowledge, they will question the effectiveness of the GXP training program.

So why are we still using tests?

In our quest for effective compliance training, we have borrowed the idea of testing someone’s knowledge as a measure of effectiveness. This implies that a corporate classroom mirrors an educational classroom and that testing means the same thing: a measure of knowledge comprehension. However, professors, colleges, universities, and academic institutions are not held to the same results standard. In the Life Sciences, two very common situations occur where knowledge checks and “the quiz” are heavily used: the Annual GXP Refresher and the Read & Understand approach for SOPs.

Typical Knowledge Check

But what is the knowledge assessment measuring? Is it mapped to the course objectives, or are the questions so general that they can be answered correctly without having to attend the sessions? Or worse yet, are the questions being recycled from year to year and event to event? What does it mean for the employee to pass the knowledge check or score 80% or better? When does s/he learn of the results? In most sessions, there is no time left to debrief the answers. This is a lost opportunity to leverage feedback into a learning activity. How do employees know if they are leaving the session with the “correct information”?

The other common practice is to include a multiple-choice quiz consisting of five questions as the knowledge check for SOPs that are “Read & Understood,” especially for revisions. What does it mean if employees get all five questions right? That they will not make a mistake? That the R & U method of SOP training is effective? The search function in most e-doc systems is really good at finding the answers; getting them right doesn’t mean employees read the entire procedure and retained the information correctly. What does it mean for the organization if human errors and deviations from procedures are still occurring? Does it really mean “the training” is ineffective?

Conditions must be the same for evaluation as for performance and training

What should we be measuring?

The conditions under which employees are expected to perform need to be the same conditions under which we “test” them.  So, it makes sense to train them under those same conditions as well.  What do you want/need your employees (learners) to do after the instruction is finished? What do you want them to remember and use from the instruction in the heat of their work moments?  Both the design and assessment need to mirror these expectations. 

Ask yourself: when in their day-to-day activities will employees need to use this GMP concept? Or, where in the employees’ workflow will this procedure change need to be applied? Isn’t this what we are training them for? Your knowledge checks need to ensure that employees have the knowledge, confidence, and capability to perform as trained. It’s time to re-think what knowledge checks are supposed to do for you. And that means developing objectives that guide the instruction and form the basis of the assessment.

Objectives drive the design and the assessment

Instructional Design 101 begins with well-developed objective statements for the course, event, or program. These statements, aka objectives, determine the content, and they also drive the assessment. For example, a quiz or a knowledge check is typically used for classroom sessions to ask questions about the content. In order for learners to be successful, the course must include the content, whether delivered in class or as pre-work. But what are the assessments really measuring? How much of the content learners remember, and maybe how much of it they anticipate applying when they return to work?

When the desired outcome is compliance back on the job, knowledge checks need to shift to become more application-oriented (performance-based), such as “what-if” situations and real scenarios that require an employee to analyze the situation, propose a workaround, and then evaluate whether the problem-solving idea meets the stated course objectives.

Performance objectives drive a higher level of course design

Training effectiveness is really an evaluation of whether we achieved the desired outcome. So, I ask you, what is the desired outcome for your GXP training and SOP training: to gain knowledge (new content) and/or to use the content correctly back in the workplace? The objectives need to reflect the desired outcomes in order to determine the effectiveness of training, not just knowledge retention.

What is the desired outcome?

When you begin with the end in mind, namely the desired performance outcomes, the objective statements truly describe what the learners are expected to accomplish. While the content may be the same or very similar, determining whether employees are able to execute what they learned post-training requires more thought about the accuracy of the assessment. It must be developed from the performance objectives in order for it to be a valid “instrument.” The learner must perform (do something observable) so that it is evident s/he can carry out the task under real workplace conditions.

Accuracy of the assessment tools

The tools associated with the 4 Levels of Evaluation can be effective when used for the right type of assessment. For example, Level 1 (Reaction) surveys are very helpful for formative assessments (see the formative vs. summative distinction below). Level 2 (Learning) knowledge assessments are effective in measuring retention and minimum comprehension and go hand in hand with learning-based objectives. But when the desired outcomes are actually performance-based, Level 3 (Behavior) checklists can be developed for the performance of skills demonstrations and samples of finished work products. Note: Level 4 (Results) is business impact-focused.

Difference between formative and summative assessments

Trainers are left out of the loop

Today’s trainers don’t always have a developed instructional design skill set. They do the best they can with the resources they are given, including reading books and scouring the Internet. For the most part, their training courses are decent and the assessments reflect passing scores. But when it comes to Level 4 (Results) impact questions from leadership, it becomes evident that trainers are left out of the business analysis loop and therefore are missing the business performance expectations. This is where the gap exists. Trainers build courses based on knowledge/content and develop learning objectives that determine what learners should learn. They create assessments to determine whether attendees have learned the content, but this does not automatically confirm learners can apply the content back on the job in various situations under authentic conditions.

Levels of objectives, who knew?

Many training managers have become familiar with the 4 Levels of Evaluation over the course of their time in training, but fewer are acquainted with Bloom’s Taxonomy of Objectives. Yes, objectives have levels of increasing complexity resulting in higher levels of performance. Revised in 2001, the levels were renamed to better describe what’s required of the learner to be successful in meeting the objective.

Bloom’s Revised Taxonomy of Objectives

Take note: remembering and understanding are the lowest levels of cognitive complexity, while applying and analyzing are mid-range. Evaluating and creating are at the highest levels. If your end in mind is knowledge gained ONLY, continue to use the lower-level objectives. If, however, your desired outcome is to improve performance or apply a compliant workaround in the heat of a GMP moment, your objectives need to shift to a higher level of reasoning in order for the training design to be effective and meet the stated performance outcomes. Fortunately, much has been written about writing effective objective statements, and resources are available to help today’s trainers.

Did we succeed as intended? Was the training effective in achieving the desired outcomes?

To ensure learner success via the chosen assessment, the training activities must also be aligned with the level of the objectives. This requires the design of the training event to shift from passive lecture to active engagement that prepares learners to transfer what they experienced in the event back to their workplace. The training design also needs to include practice sessions where making mistakes is an additional learning opportunity and where learners are taught how to recognize that a deviation is occurring. Michael Allen refers to this as “building an authentic performance environment.”

Our training outcomes need to reflect both knowledge gained and performance. The agency now expects us to document that our learners have the knowledge AND can apply it successfully in order to follow SOPs and comply with the regulations. Thus, trainers and subject matter experts will need to upgrade their instructional design skills if we really want training to succeed as intended. Are you willing to step up and do what it takes to ensure training is truly effective? – VB

Allen, M. Design Better Design Backward. Training Industry Quarterly, Content Development Special Issue, 2017, p. 17.

(c) HPIS Consulting, Inc.