Taking the Agile Learning Plunge

When Rapid Design for eLearning found its way into my vocabulary, I loved it, along with all of its derivatives like rapid prototyping. And soon I started seeing agile this and agile that; it seemed agile was everywhere I looked. When Michael Allen published his book, Leaving ADDIE for SAM, I was intrigued and participated in an ATD-sponsored webinar. It made a lot of sense to me, and I “bought into the concept”. Or so I thought …

I joined a project that was already in progress and had to “hit the ground running” to get up to speed. The urgency came from an anticipated FDA follow-up visit under a consent decree. If you’ve experienced this “scene” before, you can relate to the notion of expedited time. As part of the remediation efforts, training events needed to be conducted. Sometime during my first week, I learned in a meeting that I was to be the trainer. Okay, given my instructional design background and classroom facilitation experience, that made sense. Sure, in a few weeks when we have the new procedure in place, I’d be happy to put the training materials together, is what I was thinking. Wait, what, in two weeks? Are you kidding me? I’m not the SME, and I don’t even have the software loaded on my laptop yet. Well, some cleaned-up version of those words was my response.

My biggest challenge was to get out of my own design way

I’m classically schooled in *ADDIE with 30+ years as an instructional designer, and I am very comfortable with how to design, develop, and deliver training. All I needed was more time, certainly more than two weeks, for a process that was changing daily! And then I found myself thinking about all the buzz for rapid design and prototyping I had been reading about.

*ADDIE = Analysis, Design, Development, Implementation, Evaluation: a project management approach to training projects.

In theory, I totally bought into it. But this is different, I argued with myself. This is compliance with a quality system for a company undergoing transformative change as a result of a consent decree! Furthermore, I teach GMP Basics and conduct Annual GMP Refreshers several times a year. My GMP dilemma challenged the very essence of my “learned” compliance beliefs about following the first basic GMP Work Habit, “thou shalt follow written procedures”; otherwise, it’s a deviation.

Are we really planning to deviate from the SOP while under a consent decree?

While it was the intention of the business unit leader to deviate from the approved set of work instructions, a planned deviation would not be appropriate in this case. I mean, we were talking about a corrective action for a consent decree item. Were we really considering a PLANNED DEVIATION to intentionally teach unapproved procedures and then submit the documentation to the agency as a completed corrective action for the CAPA? I was truly baffled by how I was going to pull this off in two weeks. I’m not a magician; I can’t pull this rabbit out of my laptop, is what I was thinking when I left the VP’s office.

Yet on the other hand, how could I teach a work instruction that was known to be broken, was being redesigned, and was not yet finalized? The instructional designer side of me screamed: how can you teach flawed content? That’s wasted training that results in scrap learning. How is that training going to be effective, not to mention having to explain a failed effectiveness check during the post-inspection?

And then, it hit me! I was so focused on WHAT I NEEDED that I was missing the urgency of the learners’ needs. Julia Lewis Satov refers to this situation as “agility by fire”: “the ability to move quickly but not easily, and still excel” (2020, p. 50). It was time to put theory into practice and take the agile learning plunge into the realm of the unknown. If I could come up with a way to document what we were doing and get it approved, then I could reconcile my GMP dilemma and satisfy my inner instructional designer.

With a little help from my validation colleagues – the training implementation plan

Validation engineers use protocols to capture their “change in process” work, whether it’s experimental batches, three batches for process validation, or **IQ-OQ-PQ protocols for equipment qualifications. They validate the new procedure or process before it can become the standard operating procedure by developing the plan, defining acceptance criteria, managing deviations, and capturing the results. So why couldn’t I borrow the concept and adapt it to my situation?

**Installation Qualification, Operational Qualification, Performance Qualification

The purpose of the initial training session was to test the new sequence of steps and confirm the robustness of the software responses for each field entry and then make correct decisions where needed.  The learners were still in compliance with the quality policy for complaint handling and were still meeting the intention for Medical Device Reporting requirements.  They were essentially testing the future “how-to steps” for the proposed new work instructions.

Agile QTs processing their learning experience

I did not copy and paste the validation protocol template. I did, however, include a “please pardon our appearance while we are under construction” paragraph in the training plan to document the departure from the current set of work instructions. This protocol-like section also included our intentions for the outcomes of the sessions. We also stipulated that required SOP training for all affected users, including the Qualified Trainers, would be mandatory once the finalized set of work instructions was approved.

Anybody want to play in the sandbox?

By shifting the prioritization away from perfectly designed classes with pristine training materials, I was able to diagnose that the real need was to get the learners into a live classroom. But first I needed a small group of super users who wanted to see the database software in action and “play in the sandbox”; the training materials could follow afterwards.

It didn’t take long for them to find me. These “learning-agile individuals” not only wanted the challenge of learning something new but seemed to thrive on the idea that they would be managing their part of the training implementation plan. They were not at all worried about the lack of available training materials for themselves. They allowed the learning experience to occur spontaneously. Previously learned skills did not get in the way of their acquiring new knowledge and skills. They embraced the changes rather than resisting them.

A new breed of SMEs as Agile Qualified Trainers?

I shifted my role to facilitator and allowed these learning-agile SMEs to navigate the software screens and then work out the explanation of how to complete field transactions. In the Center for Creative Leadership “Learning Agility” white paper, authors Adam Mitchinson and Robert Morris explain that learning-agile individuals “understand that experience alone does not guarantee learning; they take time to reflect, seeking to understand why things happen, in addition to what happened” (p. 2).

“SMEs are true front-line and onsite educators,” says Satov. Every organization has employees who are brimming with intelligent and diverse ideas and are eager to share their talent by producing work deliverables. “[…] Our focus must shift to finding and developing individuals who are continually able to give up skills, perspectives, and ideas that are no longer relevant, and learn new ones that are” (Mitchinson and Morris, 2014, p. 1).

We documented these sessions as training because we all learned how to navigate the screens, albeit on the fly. We recognized that learning the software was the primary goal; developing the process steps, and eventually the work instructions, was the secondary goal. This training documentation became the qualifying evidence for their train-the-trainer knowledge transfer. And collectively, they decided which choices end users were to pick from the drop-down tables.

Is this “learning on the fly” or agile learning in practice? You decide.

1 + 1 + 1 is more than 3

I shifted my role again to become a scribe and worked on sequencing these pages for the next round of end-users. To my surprise and delight, my new breed of Agile QTs volunteered to paste screen shots into participant worksheets so their “students” could take additional notes.  Together, we all collaborated to meet the urgent need of the end-users. Each of us in our niche roles experienced first-hand the value the others brought with them to that room.  And in that time away from our regular job tasks, we became more valuable to the organization.

The learners were paired up with their Agile QT for guided instruction, making real entries into the live system. The following week, the department was able to go live with a project plan that focused on a series of interim roles, changed roles, and transitioning responsibilities within established roles. The project launched on time to meet the commitments promised to the agency.

Why are they thanking me?

It was an energizing and empowering learning experience for the super users, a truly collaborative experience for the SMEs, and the biggest surprise of all was that they thanked me. Me? I did not deliver the training; I was not the SME; nor did I provide perfect training materials. If I had pursued my classically trained ADDIE approach, we would have waited for the perfect SOP to deliver those sessions and woefully missed the FDA-committed timelines. While I’m not ready to throw ADDIE overboard yet, Satov makes a compelling plea: “move aside elite and long-standing establishments of formal education”.

My lesson learned was this: when the demand is for speed and content design is not the key focus, I need to give up control to the true onsite educators and focus on facilitating the best learning experience given the daily change challenges and system constraints. Satov would agree: “the role of learning is to capitalize and create the architecture of the hybrid-mind”. Is this “learning on the fly” or agile learning in practice? You decide. But agile instructional design is here to stay if QA L&D is going to keep up with the fast-paced, often reactive, and regulated world of the life sciences industries. – VB

  • Allen, M. Leaving ADDIE for SAM: An Agile Model for Developing the Best Learning Experiences. ASTD Press, 2012.
  • Mitchinson, A. & Morris, R. Learning Agility. Center for Creative Leadership white paper, 2014.
  • Satov, J.L. “Agility by Fire”. Chief Learning Officer, July/August 2020, p. 50.

Who is the Author, Vivian Bringslimark?

(c) HPIS Consulting, Inc.

The Silver Bullet for Performance Problems Doesn’t Exist

Oh, but if it did, life for a supervisor would be easier, right? Let’s face it: “people” problems are a big deal for management. Working with humans presents its challenges, such as miscommunications between staff, data entry errors, or rushed verification checks. Sometimes the task at hand is so repetitive that the result is assumed to be okay and gets “a pass”. Add constant interruptions to the list, and it becomes even harder not to get distracted and lose focus or attention to detail.

Actual behavior vs. performing as expected

In their book, Performance Consulting: Moving Beyond Training, Dana Gaines Robinson and James C. Robinson describe performance as what the performer should be able to do. A performance problem occurs when the actual behavior does not meet expectations (as in what the performer should have been able to do). Why don’t employees perform as expected? Root cause analysis helps problem solvers and investigators uncover a myriad of possible reasons. For Life Sciences companies, correcting mistakes and preventing them from recurring is at the heart of CAPA (Corrective Actions Preventive Actions) systems.

A closer look at performance gaps

Dana and James Robinson conducted research regarding performer actions and sorted their results into three categories of obstacles:

  • Conditions of performers
  • Conditions of the immediate managers
  • Conditions of the organization


But, weren’t they trained and qualified?

Hopefully, employees are trained using an approved OJT (On-the-Job Training) methodology in which they are shown how to execute the task and then given opportunities to practice multiple times to become proficient. During these sessions, they are coached by Qualified Trainers, given feedback on what’s right (as expected), and given specific instructions to correct what’s not right, with suggestions for tweaking their performance so that their final performance demonstration is on par with their peer group. At the conclusion of the qualification event, employees must accept that they own their deviations (mistakes) from this point forward. So what gets in the way of performing “as they should”, or in compliance speak, according to the procedure?

Is it a lack of knowledge or skill, or is it something else?

The Robinsons explain that performance is more than the training event. It’s a combination of the overall learning experience and the workplace environment that yields performance results. Breaking that down into a formula, they suggest the following: learning experience x workplace environment = performance results. Because the relationship is multiplicative, a strong learning experience in a weak workplace environment still yields weak performance results.

The root cause investigation will include a review of training and the qualification event as well as a discussion with the performer.

  • Is it a lack of frequency; not a task often performed?
  • Is it a lack of feedback or delayed feedback in which the deviation occurred without their awareness?
  • Is it task interference?

The work environment includes the organizational systems and business unit processes that together enable the performer to produce the outcomes as “expected”. These workplace factors don’t always work in perfect harmony, resulting in obstacles that get in the way of “expected” performance:

  • Lack of authority – unclear roles, confusing responsibilities?
  • Lack of time – schedule conflicts; multi-tasking faux pas?
  • Lack of tools – reduced budgets?
  • Poorly stored equipment/tools – lost time searching?

Isn’t it just human nature?

Once the root cause investigation turns its attention to the human element, it’s easy to focus on the performer and stop there. If it’s the first time for the performer, or the first instance related to the task, it’s tempting to label the event as an isolated incident. But when it comes back around again, it becomes apparent there was a “failure to conduct an in-depth investigation” to correct and prevent. Not surprisingly, pushback on “Operator Error as Root Cause” has forced organizations to look deeper into the root causes involving humans.

Whose human nature?

Recall that one of the categories of researched obstacles was “conditions of the immediate managers”. This makes managers uncomfortable. With so much on their plates, managing a people performance problem is not what they want to see. A silver bullet like a re-training event is a nice activity that gets a big red check mark on their to-do list. However, Robert Mager and Peter Pipe, in their book, Analyzing Performance Problems, provide insights into how the way managers handle direct reports may lead to unintended consequences. (It’s not always the performer’s fault.)

It takes all three to correct a performance problem

The third category of researched obstacles clustered around “conditions of the organization”. I’ve already discussed task interference above. To suggest that organizations are setting up their employees to fail is pushing it just a bit too far. So I won’t go there, but it is painful for some leaders to come to terms with the implication. In order to prevent issues from recurring, an examination of the incidents, and quite possibly a restructuring of systems, has to occur, because automatic re-training is not the only solution to a “people performance problem”. –VB

Robinson, D.G. & Robinson, J.C. Performance Consulting: Moving Beyond Training. San Francisco: Berrett-Koehler, 1995.

Mager, R. & Pipe, P. Analyzing Performance Problems. Belmont: Lake Publishing, 1984.

Retraining and Refresher Training: Aren’t they one and the same?

I say no, not at all. Ask an Operations Manager and he’ll acknowledge that what it’s called is less important than getting the “assignment” done and entered into the LMS. He’s usually more concerned about the loss of productivity during the training than about the effectiveness of the training at that time. It isn’t until later, when the training may have to be delivered again (repeated), that the comment “training doesn’t really work” is heard.

Retraining is typically delivered as repeat training. Corrective Actions from *CAPAs usually trigger these types of required training events. In the context of the specific CAPA, we uncover the error, mistake, non-conformance, or what I like to call a performance discrepancy from the expected outcome. It is believed that by delivering the training again, the cause of the discrepancy will be resolved. That is, if the root cause was determined to be a lack of knowledge, a lack of skill, or not enough practice.

Some folks believe that more is better and that with several repeated training sessions, employees will eventually get it right. It always amazes me that we find time to do repeat training over and over again but complain very loudly about refresher training, significant **SOP revision training, or even new content training. (*Corrective Actions Preventive Actions, **Standard Operating Procedures)

Refresher Training implies that training was already provided at least once. The intention here is to review that content. A lot of regulatory training requirements are generated to satisfy this need. Common examples are Annual GMP Refreshers and several OSHA standards, such as Bloodborne Pathogens training. While the aim is to refresh on the content, it is not necessarily meant to just repeat the training. Also included is the “so as to remain current” part: staying current with practice, trends, and new updates. Hence, refresher training needs to include new material based on familiar content.

Upon Biennial SOP Review

There are some folks who would like to use this required SOP activity to coincide with the need to “refresh” on SOPs already read and/or trained on. The rationale is that if the SOP hasn’t been revised in two or three years, more than likely the training hasn’t been repeated either. So it sounds like a good idea to require that SOPs be “refreshed” on the same review cycle. One could argue this prevents errors; thus, in theory, it sounds very proactive.

But donning my Instructional Designer hat, I ask you: what is the definition of training? To close a knowledge gap or skill gap. What value is there in forcing a mandatory “refresher reading” of SOPs just because the procedure is due for technical review? In practice, this becomes one huge check-mark exercise leading to a paperwork/LMS backlog, and it might actually increase errors due to “information overload”! Again, what gap are you trying to solve? In the above refresher scenario, we are avoiding a compliance gap by satisfying regulatory requirements.


Defending Your Training Process

For those of you who have fielded questions from regulators, you can appreciate how the very training record produced generates follow-up questions. How you describe, or “label”, the conditions under which the training occurred can impact the message you are sending as well. Calling it retraining instead of refresher training implies that training had to be repeated as a result of a performance problem: expectations or standards were not met. Refresher training, by contrast, occurs at a defined cycle to ensure that the forgetting curve or lack of practice is not a factor in poor performance. It is a routine activity for satisfying regulatory expectations.

For end users, clarifying the difference between refresher training and “repeat” training in your Policy/SOP not only defines the purpose of the training session, it also provides the proper sequence of steps to follow to ensure maximum effectiveness of the training. There’s a difference between training content that is new/updated and content delivered as a repeat of the same materials. Yes, new and/or updated design takes resources and time. But how many times do you want to sit through the same old same old and get nothing new from it? Recall the definition of insanity: doing more of the same while hoping for change. You just might want to review your Training SOP right about now. – VB


Tired of repeat errors? Ask a Performance Consultant to help you design a better corrective action

In this last installment of the “Making HPI Work for Compliance Trainers” series, I blog about one of the biggest complaints I hear over and over again from Compliance Trainers: management doesn’t really support training. It’s hard to ask for “more of the same” even though you know your programs are now different. In previous blogs, I shared why management hasn’t totally bought into the HPI methodology yet. See the blog “Isn’t this still training?”


Given the constant pressure to shrink budgets and improve the bottom line, managers don’t usually allow themselves the luxury of being proactive, especially when it comes to training. So they tend to fall back on quick-fix solutions that give them a check mark and “clear their desk” momentarily. For the few times this strategy works, there are twice as many times when those fixes backfire and the unintended consequences are worse.


In the article “Why the Band Aids Keep Falling Off”, I provide an alternate strategy that emphasizes moving away from an events-only focus to exploring the three levels of interaction that influence performance: the individual performer, the task/process, and organizational quality systems. These same three levels are where performance consultants carry out their best work when supported by their internal customers. The good news is that the first step is the same; it begins with a cause analysis. See the blog “Analyses du jour” for more thoughts on why these are essentially the same approach.


The difference is that the corrective action is not a reactive quick fix but a systems approach to correcting the issue and preventing it from showing up again. System-based solutions are the foundation of many HPI/HPT projects that require cross-functional support and collaborative participation across the site/organization. And this is where a performance consultant needs support from senior leaders.


We wrap up this series here and introduce the next series, Gaining Management Support, where I blog about credibility, trust, and access, and how these three concepts impact relationship management.

Analyses du jour: Isn’t it really all the same thing?

So there’s root cause analysis and gap analysis, and now performance cause analysis? Is there a difference? Do they use different tools? It can be overwhelming to decipher the jargon, no doubt! I think it depends on which industry you come from and whether your focus is a regulatory/quality system point of view or a performance consulting perspective. To me, it doesn’t change the outcome. I still want to know why the deviation occurred, how the mistake was made, and/or what allowed the discrepancy to happen. Mixing and matching the tools allows me to leverage the best techniques from all of them.

Why we love root cause analysis

For starters, it’s GMP, and we get to document our compliance with CAPA requirements. It allows us to use tools and feel confident that our “data doesn’t lie”. This bodes well for our credibility with management. And it provides the strategic connection between our training solution (as a corrective action) and site quality initiatives, thus elevating the importance, and quite possibly the priority, of completing the corrective action on time.

Asking the right questions

Root cause analysis and problem-solving steps dovetail nicely. See the sidebar below. The process requires us to slow down and ask questions methodically and sequentially. More than one question is asked, for sure. When you rush the process, it’s easy to grab what appears to be obvious. And that’s one of the early mistakes that can be made with an over-reliance on the tools. The consequence? Jumping to the wrong conclusion that automatic re-training or refresher training is the needed solution. Done, checkmark. On to the next problem that needs a root cause analysis. But when the problem repeats or returns with a more serious consequence, we question why the training did not transfer, or we wonder what’s wrong with the employee: why is s/he not getting this yet?

Side Bar -Double Click to Enlarge.

No time to do it right, but time to do it twice!

Solving the problem quickly and rapidly closing the CAPA allows us to get back to our other pressing tasks. Unfortunately, “band-aids” fall off. The symptom was only covered up and temporarily put out of sight; the original problem wasn’t solved. So now we must investigate again (spend more time) and dig a little deeper. We had no time to do it right, but we find the time to do it twice. Madness!

Which tool to use?

My favorite human performance cause tool is the fishbone diagram, though the “5 Whys” technique is a close second. Both tools force you to dig a little deeper into the causes. Yes, the end result often reveals something is amiss with “the training”, but is it man, machine, method, or materials? Aha, that is very different from repeat training on the procedure! Alas, when we have asked enough of the right questions, we are led to the true cause(s). That is the ultimate outcome I seek, no matter what you call the process or which tool is used. -VB

Published article – “Why the Band Aids Keep Falling Off”


