Thank you LTEN for publishing the article!
Who is the Author, Vivian Bringslimark?
(c) HPIS Consulting, Inc.
For many organizations, the sole purpose of refresher training is to satisfy compliance requirements; hence, the focus is on just delivering the content. Ironically, the intent behind the 21 CFR 211.25 regulation is to ensure that employees receive training beyond orientation, and frequently enough to remain current. The goal is compliance with GMPs and SOPs and improved performance where gaps exist. Improved business performance is the result, not just a checkmark for 100% attendance.
And the practice of repeating the same video year after year as the annual refresher? Efficient, yes. Effective? Just look at your deviation and CAPA data to answer that one. When you shift your focus from delivering content to a more learner-centered design, your sessions become more performance-oriented and their effectiveness reaches beyond just passing the GMP quiz.
From passive lecture GXP refreshers to active, learner-centered sessions
Yet senior leaders are not grasping that just “telling them the GMPs” is neither an effective training technique nor an engaging one. Even when it’s backed up with a slide deck, it’s either “death by PowerPoint” or click-to-advance CBT refresher modules. Koreen Pagano, in her June 2014 T&D article “The Missing Piece,” describes it as “telling employees how to swim, then sending them out to sink, hoping they somehow can use the information we’ve provided to them to make it to shore” (p. 42). To make matters worse, employees can end up with disciplinary letters for deviations and CAPAs for failure to follow GMPs.
Look at the GXP refresher course outlines for the last three years at your company. What is the ratio of content to interactivity? When I dig a little deeper, I usually discover that a lack of instructional design skills and minimal creativity are factors. And then I hear, “Oh, but we have so little time and all this content to cover, there’s no more room. If I had more time, you know, I’d add it in.” Pagano informs us that “training is supposed to prepare employees to be better, and yet training professionals often stop after providing content” (p. 43).
See What’s so special about SMEs as Course Designers?
What about using previously developed compliance materials?
I am not criticizing the reuse of previous course materials if they were effective. But asking an SME to “deliver training” from a previously created PowerPoint presentation does not guarantee effective delivery. Neither does replacing clip art with new images or updating the slide deck to the new company template. These visual “updates” will not change the effectiveness of the course unless the content is revised and the activities are improved.
For many SMEs and trainers, inheriting a slide deck is both a gift and a curse. While they are not starting with a blank storyboard, there is a tendency to use it as-is and embellish it with speaker notes, because the original producer of the slides was not in the habit of recording his/her speaking points for someone else to deliver. Speaker notes embedded at the bottom of PowerPoint notes pages are not a leader’s guide. While handy for scripting what to say for the slide above, they do not provide ample space for managing other aspects of the course, such as visual cues, “trainer only” tips, and handout logistics.
The SME bears the burden of content decisions: what content is critical, and what can be cut if time runs out. Perhaps even more crucial is how to adapt content and activities to different learner groups or off-shift needs. Without a leader’s guide, the SME is unsupported and will fall back on lecture to fill the duration of the course.
“SMEs put down those speaker’s notes and step away from the podium!” Vivian Bringslimark, HPIS Consulting, Inc.
Better Training Means an Investment in Instructional Design Skills
Interactive, immersive, and engaging are great attributes of active training programs. But they come at a price: an investment in instructional design skills. Trained course designers have spent time and budget to create an instructional design that aligns with business needs and has measurable performance outcomes. The course materials “package” is complete only when a leader’s guide is also created that spells out the design rationale and the vision for delivery, especially when someone else, such as an SME in the classroom, will be delivering the course.
The Leader’s Guide: invaluable for effective course delivery
A well-designed leader’s guide identifies the key objectives and the essential learning points to cover. These learning points are appropriately sequenced, with developed discussion questions to be used with activities, thus removing the need for the trainer/SME to think on demand while facilitating. This also reduces the temptation to skip an exercise or activity if s/he is nervous or not confident with interactive formats such as virtual breakout groups.
A really good guide will also include how to segue to the next slide and manage seamless transitions between topic sections. Most helpful are additional notes about what content MUST be covered, tips about expected responses for activities, and clock-time comments for keeping to the classroom schedule. SMEs as facilitators (instructor-led SMEs, or ILT SMEs) need to study the leader’s guide and pay attention to the icons and notes provided there. These cues signal the shift from lecture to an activity, whether individual, small group, or large group.
Given all the time and effort it takes to produce the leader’s guide, it is wasted if the course designer and the SME as facilitator do not have a knowledge transfer session. Emailing the guide or posting it to a SharePoint site will not help the SME follow the guide during delivery unless an exchange occurs in which SMEs can begin to mark up their copies.
During the knowledge transfer session with the course designer, ILT SMEs note how the instructor transitions from one slide to the next and how s/he gives instructions for each activity. This is a good time for ILT SMEs to ask how to modify content or an activity if certain conditions occur. Especially important is asking which content is critical and which can be skipped if time runs short. It is always a good idea for the ILT SME to mark up his/her copy of the materials, and again after the first delivery, to truly make the leader’s guide their own. For example, SMEs may want to experiment with different ways to “open a session” to gain experience with a variety of techniques and observe which ones yield better results.
Why do ILT SMEs need their own Qualified Trainers workshop?
To pull this off, ILT SMEs need to learn how to facilitate learning experiences, such as preparing for a facilitated discussion. One of the biggest fears ILT SMEs have when asked to facilitate an exercise or interactive activity, especially a discussion, is the fear of it bombing.
Discussions can often bomb
While popular and commonly used, discussions can also fail miserably if not designed well. Relying on the SME to facilitate a discussion without carefully preparing the path to the targeted outcome leaves it to chance that the SME knows how to execute the activity successfully. That preparation includes the upfront questions to ask, pertinent examples for reference, and application-type activities in which clarifying comments can be addressed.
“It takes effort to get out of your head and connect with individuals.” Ludwig, D., Training Industry, Fall 2015, p. 23.
“… So as to remain current in the practices they perform …”
Is once a year GXP refresher enough? Before you rush to answer this question, consider the following. Do you have:
Then you might be sending the mixed message that your employees are NOT trained well enough or sufficient in their knowledge and application of the GXPs.
There’s a difference between GXP training content that is a repeat of the same materials and content that is new or updated. Yes, new content takes resources and time. But how many times do you want to sit through the same old slides and get nothing new from them? Recall the definition of insanity: doing more of the same while hoping for change. – VB
What’s so special about SMEs as Course Designers?
They have expertise and experience and are expected to share it by training their peers. But now the venue is the classroom as well, and it’s training on course design methodology that is needed. SMEs, and most trainers, do not automatically have this knowledge. Some develop it by reading a lot, attending well-designed courses, and, over time, through trial and error and painful feedback. The faster way is to provide funds to at least expose SMEs as Course Designers to how to effectively design learning experiences so that they can influence the achievement of the objectives.
This is management support for SMEs as trainers. SMEs who attend an ID basics course learn how to use design checklists for previously developed materials. These checklists allow them to confidently assess the quality of the materials and justify what needs to be removed, revised, or added, thus truly upgrading previously developed materials.
When I left the manufacturing shop floor and moved into training, full-time trainers presented in the classroom using a host of techniques and tools, relying on their platform skills to present content. Subject matter experts (or the most senior person) conducted technical training on the shop floor: in front of a piece of equipment, at a laboratory station, or at a workbench.
For years, this distinction was clearly practiced where I worked. Trainers were in the classroom, and SMEs delivered OJT. Occasionally a full-time trainer would consult with an SME on content or request his/her presence in the room during delivery as a backup or for the Q&A portion of a “presentation.” The boundaries at the time were so well understood that one could determine the type of training simply by where it was delivered.
Training boundaries are limitless today
Today, that’s all changed. No longer confined by location or delivery method, full-time trainers can be found on the shop floor, fully gowned, delivering GMP (Good Manufacturing Practices) content, for example. And SMEs are in the classroom more each day, with some of the very tools used by full-time trainers! What distinguishes a full-time trainer from an SME is less important; what is necessary is what defines effective instruction.
Your title might have the word trainer in it. One of your responsibilities might be serving as a qualified trainer. And you know how to use PowerPoint (PPT). Does this make you an instructional designer as well? Some say yes; others cry foul as they cling to their certificates and advanced degrees. So forgive me when I say that not every trainer or training manager has the ID skill set embedded in his/her toolbox. It’s analogous to the toy box on the shelf at Toys R Us: “NOTE: Batteries Not Included.” Except in our case, the note may be missing from the resume but definitely embedded in the job description if you are in QA L&D or HR Training and Development.
Instructional Design is a recognized profession
Instructional Design (ID) as a field of study has been offered by many prominent universities for quite some time and is now better known as Instructional Technology. Underlying the design of a course or learning event is a methodology for “good” instructional design, and really good instructional designers will confess that there is a bit of an art form to it as well. Unfortunately, with shrinking budgets and downsized L&D staffs, there are fewer resources available to develop the traditional course materials of the past, not to mention shrinking timelines for deliverables. So it makes sense to tap SMEs for more training opportunities, since many are already involved in training at their site. But pasting their expert content into a PPT slide deck is not instructional design.
What is effective design?
To me, effective design is when learners not only meet the learning objectives during training but also transfer that learning experience back to the job and achieve the performance objectives and outcomes. That’s a tall order for an SME, and even for full-time trainers who have not had course design training.
The methodology a course designer follows, be it ADDIE, Agile, SAM (Successive Approximation Model), Gagné’s Nine Events of Instruction, etc., provides a process with steps for the design rationale and the development of content, including implementation and evaluation of effectiveness. It ensures that key elements, such as needs assessment or effectiveness evaluation, are not unintentionally left out or forgotten until after the fact. In an attempt to expedite training, these methodology-driven elements are easily skipped without fully understanding the impact leaving them out can have on overall training effectiveness. There is a science to instructional design.
PowerPoint Slides are only a visual tool
Using PowerPoint slides by themselves does not make training successful. Slides are one of the tools a trainer uses to meet the objectives of the learning event, albeit often the main one. The “art form” occurs when a designer creates visually appealing slides or eLearning scenes as well as aligned activities and engaging exercises designed to provide exploration, practice, and proficiency for the performance task back on the job. But there is a difference between a course created to help the trainer get through his/her agenda and one that successfully engages learners to participate, learn, and then transfer their insights back to the job, where changed behavior improves the department’s metrics.
The more trainer/instructor driven the course is, the less participation is required from the learner. For example, the instructor makes all the decisions about the course objectives and content, develops the course, delivers the course, and conducts the assessment.
As you move along the Learner Participation Continuum, the learner is required to participate more, and the trainer does less “talking”. The learner acquires knowledge and skills through activities that s/he experiences with the assistance of a “facilitator”. The facilitator is focused on helping the learners meet their needs and interests. It is through these firsthand experiences and facilitated dialogue with other learners that thoughtful analysis and interpretation can become the focus of the instruction. The end result is that learners take full responsibility for decisions, actions and consequences.
Moving from Presenter Controlled Training to Learner Focused Facilitation
Moving to a more learner-focused approach shifts the effort of the design from “deliver this content” to facilitating learning transfer for performance back on the job, which is, after all, the end goal of a training event. The new design includes opportunities for group participation, utilization of participants’ expertise, and real-life problem solving: key principles of adult learning.
At one end of the continuum is the lecture, which is one-way communication and requires very little participation. At the other end, we have experiential learning and now immersive learning environments, with the introduction of 3D graphics, virtual simulations, and augmented reality.
Most trainers and SMEs tend to suffer from the “curse of too much knowledge” and find it difficult to separate need-to-know from nice-to-know content. The result shows up in the slide deck: overburdened slides filled with a lot of “stuff.” Training for them takes on a lecture-style format, and the thought of facilitating an activity gives most SMEs a case of the jitters.
So, in the “SME as Facilitator” workshop, nominated SMEs as Facilitators are encouraged to step away from the podium and use their eyes, hands, and voice to engage with their audience. Easier said than done, yes. That’s why the course is designed to allow them to take small steps within the safety of a workshop environment.
But rather than trying to pull off a fully immersive session, SMEs as Facilitators are introduced to techniques that “liven up” the lecture. They are shown how to move back and forth from passive listening (sit, hear, see) to active involvement (write, construct, discuss, move, speak). This requires the ability to:
While lecture has its merits, today’s learners want engaging content that is timely, relevant, and meaningful. And while virtual reality and simulations are engaging and very immersive, courses and learning events using these techniques rely on well-funded budgets, and most training departments are not that fortunate. In the middle of the range are “lively lectures” and alternate methods such as:
Take the 1st Shift: One Step to the Right
It’s really about starting with the learners’ expectations and the current organizational culture and then moving one step to the right. If they are used to lectures from SMEs, then work on delivering effective lectures before experimenting with alternate training methods. An overnight shift may be too big a change for attendees to adjust to, despite their desire for no more boring lectures. Small, incremental steps are the key.
Moving from Lecture to Delivering an EFFECTIVE Lecture
Thoroughness in preparation reflects care and thoughtfulness, and learners appreciate the personal desire to deliver a livelier lecture. Stepping away from the podium forces the trainer/SME to take action and allows the learners to “get up close” with the SME as facilitator. This in turn is reflected in the learners’ willingness to respond to questions and dialogue during a facilitated discussion. The rule of thumb for lecturing is approximately 8–10 minutes maximum; for virtual sessions, approximately 5 minutes.
Take the 2nd Shift: Cut Content to Add Interactivity
How is this done? Upfront, in the design of the course materials. The course designers have spent time and budget to prepare a leader’s guide that captures their vision for delivering the course. SMEs as facilitators (classroom SMEs) need to study the leader’s guide and pay attention to the icons and notes provided there. These cues signal the shift from lecture to an activity, whether individual, small group, or large group. While it may be tempting to skip exercises to make up for lost time, it is better for learner participation to cut lecture and modify an activity if possible.
“STOP TALKING and get learners engaged in some form of activity, practice or reflection exercise”, Vivian Bringslimark, HPIS Consulting, Inc.
One of the benefits of shifting to this learner-focused design is the opportunity for learners to process the content, make it meaningful for themselves, and then attach memory links to it for later recall when the moment of need is upon them. This can’t happen while the trainer is lecturing. It happens during activities and reflection exercises designed to let learners generate their own ideas in small-group interactions and link them back to the course content and objectives. Learners are prompted to openly discuss issues and problems within a “learning lab” style environment. Trainers become empathetic listeners as they create a climate of trust and safety. They become facilitators.
Of course, this shift also requires that site leadership and local management not only support the facilitated learning lab concept but follow through on issues and concerns that surface. Failure to do so undermines not only the facilitator’s credibility but the entire training program.
Wow, won’t this take longer to design, you ask? Yes, in the sense that the design is now from the learner’s point of view. This means the designer will need to research examples, collect data, and perhaps develop a story from an incident, a deviation, or a significant CAPA. The reward is that the trainer or classroom SME stops talking and gives employees more engaging learning sessions. Learners become more accountable for participating, and guess what: the SME’s session is no longer a boring podium speech.
Silberman, M. (1990). Active Training: A Handbook of Techniques, Designs, Case Examples, and Tips. New York: Lexington Books.
When I taught middle school math, tests were used to assess knowledge comprehension and some application, with word problems and a few complex questions requiring logic proofs. Results were captured as a score: an indication of how well you answered the questions. Whether you call it a quiz, a knowledge check, or any other name, it is still assessing some form of knowledge comprehension.
Life sciences companies are required by law to conduct annual regulations training (GXP refreshers) so that employees remain current in the operations being performed. Given the design and delivery of most refresher sessions, the training is heavy with content delivered to a passive learner. Listening to a speaker, or reading slide after slide and clicking the next button, is not active engagement. It’s boring, mind-numbing, and in some cases downright painful for learners. As soon as the training is over, it’s very easy to move past the experience, which means most of the content is soon forgotten. Regulatory agencies are no stranger to this style of compliance training and delivery. But when departures from SOPs and other investigations reveal a lack of GXP knowledge, they will question the effectiveness of the GXP training program.
So why are we still using tests?
In our quest for effective compliance training, we have borrowed the idea of testing someone’s knowledge as a measure of effectiveness. This implies that a corporate classroom mirrors an educational classroom and that testing means the same thing: a measure of knowledge comprehension. However, professors, colleges, universities, and academic institutions are not held to the same results standard. In the life sciences, there are two very common situations where knowledge checks and “the quiz” are heavily used: the annual GXP refresher and the read-and-understand approach for SOPs.
But what is the knowledge assessment measuring? Is it mapped to the course objectives, or are the questions so general that they can be answered correctly without attending the session? Or, worse yet, are the questions recycled from year to year and event to event? What does it mean for an employee to pass the knowledge check or score 80% or better? When does s/he learn of the results? In most sessions, there is no time left to debrief the answers: a lost opportunity to leverage feedback into a learning activity. How do employees know whether they are leaving the session with the correct information?
The other common practice is a five-question multiple-choice quiz as the knowledge check for SOPs that are “read and understood,” especially for revisions. What does it mean if employees get all five questions right? That they will not make a mistake? That the R&U method of SOP training is effective? The search function in most e-doc systems is really good at finding answers; passing doesn’t mean employees read the entire procedure and retained the information correctly. What does it mean for the organization if human errors and deviations from procedures are still occurring? Does it really mean “the training” is ineffective?
What should we be measuring?
The conditions under which employees are expected to perform need to be the same conditions under which we “test” them. So, it makes sense to train them under those same conditions as well. What do you want/need your employees (learners) to do after the instruction is finished? What do you want them to remember and use from the instruction in the heat of their work moments? Both the design and assessment need to mirror these expectations.
Ask yourself, when in their day to day activities will employees need to use this GMP concept? Or, where in the employees’ workflow will this procedure change need to be applied? Isn’t this what we are training them for? Your knowledge checks need to ensure that employees have the knowledge, confidence and capability to perform as trained. It’s time to re-think what knowledge checks are supposed to do for you. And that means developing objectives that guide the instruction and form the basis of the assessment.
Objectives drive the design and the assessment
Instructional Design 101 begins with well-developed objective statements for the course, event, or program. These statements, aka objectives, determine the content, and they also drive the assessment. For example, a quiz or knowledge check that asks questions about the content is typically used for classroom sessions. For learners to be successful, the course must include that content, whether delivered in class or as pre-work. But what are such assessments really measuring? How much of the content learners remember, and maybe how much they anticipate applying when they return to work.
When the desired outcome is compliance back on the job, knowledge checks need to become more application-oriented (performance-based), with “what if” situations and real scenarios that require an employee to analyze the situation, propose a workaround, and then evaluate whether the problem-solving idea meets the stated course objectives.
Performance objectives drive a higher level of course design
Training effectiveness is really an evaluation of whether we achieved the desired outcome. So I ask you: what is the desired outcome for your GXP and SOP training, to gain knowledge (new content), to use the content correctly back in the workplace, or both? The objectives need to reflect the desired outcomes in order to determine the effectiveness of training, not just knowledge retention.
When you begin with the end in mind, namely the desired performance outcomes, the objective statements truly describe what the learners are expected to accomplish. While the content may be the same or very similar, determining whether employees can execute what they learned post-training requires more thought about the accuracy of the assessment. It must be developed from the performance objectives in order to be a valid “instrument.” The learner must perform (do something observable) so that it is evident s/he can carry out the task under real workplace conditions.
Accuracy of the assessment tools
The tools associated with the 4 Levels of Evaluation can be effective when used for the right type of assessment. For example, Level 1 (Reaction) surveys are very helpful as formative assessments; see below for the formative vs. summative distinction. Level 2 (Learning) knowledge assessments are effective in measuring retention and minimum comprehension and go hand in hand with learning-based objectives. But when the desired outcomes are actually performance-based, Level 3 (Behavior) checklists can be developed for skills demonstrations and samples of finished work products. Note: Level 4 (Results) is business-impact focused.
Trainers are left out of the loop
Today’s trainers don’t always have a fully developed instructional design skill set. They do the best they can with the resources given, including reading books and scouring the Internet. For the most part, their training courses are decent and the assessments reflect passing scores. But when Level 4 (Results) impact questions come from leadership, it becomes evident that trainers are left out of the business analysis loop and are therefore missing the business performance expectations. This is where the gap exists. Trainers build courses based on knowledge and content instead, and develop learning objectives that determine what learners should learn. They create assessments to determine whether attendees have learned the content, but this does not automatically confirm that learners can apply the content back on the job, in various situations, under authentic conditions.
Levels of objectives, who knew?
Many training managers have become familiar with the 4 Levels of Evaluation over their time in training, but fewer are acquainted with Bloom’s Taxonomy of Objectives. Yes, objectives have levels of increasing complexity, resulting in higher levels of performance. Revised in 2001, the levels were renamed to better describe what’s required of the learner to be successful in meeting the objective.
Take note: remembering and understanding are the lowest levels of cognitive complexity, applying and analyzing are mid-range, and evaluating and creating are at the highest levels. If your end in mind is knowledge gained ONLY, continue to use the lower-level objectives. If, however, your desired outcome is to improve performance or apply a compliant workaround in the heat of a GMP moment, your objectives need to shift to a higher level of reasoning for the training design to be effective and meet the stated performance outcomes. Fortunately, much has been written about writing effective objective statements, and resources are available to help today’s trainers.
Did we succeed as intended? Was the training effective in achieving the desired outcomes?
To ensure learner success via the chosen assessment, the training activities must also be aligned with the level of the objectives. This requires the design of the training event to shift from passive lecture to active engagement intended to prepare learners to transfer what they experienced in the event back to their workspace. The training design also needs to include practice sessions where making mistakes is an additional learning opportunity and where learners are taught to recognize when a deviation is occurring. Michael Allen refers to this as “building an authentic performance environment.”
Our training outcomes need to be both knowledge-gained and performance-based. The agency now expects us to document that our learners have the knowledge AND can apply it successfully in order to follow SOPs and comply with the regulations. Thus, trainers and subject matter experts will need to upgrade their instructional design skills if we really want training to succeed as intended. Are you willing to step up and do what it takes to ensure training is truly effective? – VB
Allen, M. (2017). Design Better Design Backward. Training Industry Quarterly, Content Development Special Issue, p. 17.
What do I mean? You know: job aids, tools users have created, and SME cheat sheets. I’ve even seen task instruction sheets, quick reference guides for completing forms, and process flow diagrams. But I’m not talking about posters on the wall describing how to turn on the projector in the conference room. In this third issue of the Making It Work for Compliance Trainers series, I blog about why creating and openly sharing user-generated tools may not be a good thing in a regulated environment.
As a Performance Consultant (PC) or HPT specialist works with SMEs, key performers, or STAR employees, s/he invariably discovers that these experts have “other” tools they’ve developed that help them be so good at what they do. While these tools are helpful to the key performers, they present a dilemma for the PC who is also a compliance trainer or a QA manager: “If I expose the source of their secret sauce, will I break trust and create a barrier to the relationship? On the other hand, if I don’t speak up about this tool, what assurance do I have that the content is approved by the quality control unit (per GMP) and is version controlled?”
Why create them in the first place?
To get grounded, the PC/compliance trainer needs to perform a quick cause analysis upon discovering the tool. Why was it created in the first place? Is information or a step missing from the standard operating procedure (SOP)? Was the tool created to “chunk up” the steps or to create bite-sized training materials that evolved into a job aid? Or is it a maneuver to bypass the change control system? The answers to these questions could provide the basis for a more user-friendly revision, or at least for officially approving the tool as a supporting document upon the SOP’s next version release.
What’s the big deal?
Rejection of product, deviation from approved written instruction that could result in adulterated product, additional follow up testing, and rework are all forms of waste to the organization. Not to mention that consistency is the key to compliance and assuring public confidence in approved marketed products. If folks are not using the approved procedure, then there’s an issue somewhere.
What level of control is needed?
That is the most sought-after question regarding job aids and user tools. The answer lies in each company’s level of risk and its document hierarchy. I’ve seen extreme cases where “NO paper” on the floor means not even an SOP is allowed in hand. I do believe some level of control is needed to ensure that the content is valid, that it is in sync with the current procedure, and that users have the most current version of the tool. Can your organization defend its level of control? Are you sure about that? Or do you use a “don’t tell, we won’t ask” policy? Are folks making errors because they followed an uncontrolled worksheet instead of the approved procedure?
Tips for Establishing Level of Control
For example, some companies have a separate numbering system for these exhibits, and the storage location may not be in the same folder directory as the parent SOP.
Calling all User Generated Tools Home
The purpose of the initiative is to allow users to admit that they have these tools, with no performance consequences when they surrender them. The second focus of the program is to find a proper home for the tools that are deemed valuable. They need proper care and nourishment; in other words, their content must be valid, accurate, up to date, and approved for use. The PC/compliance trainer is the ideal conduit to make this happen.
One company that I visited did just that and more. Once it was discovered that a series of mistakes was coming from an old tool that had been downloaded and copied to desktops, a team of auditors was dispatched to observe the removal of all tools from employees’ desktops. The second phase of their program was the identification of an owner for the SharePoint site, who now manages access and content revisions. The third phase includes a content/tool submission process vetted by a designated user group of SMEs.
Is it time for a Job Aid/Users Tool Amnesty Project where you work? – VB