Embedding Forcing Functions into Scenario Design to Enhance Assessment Capabilities

Many people design scoring instruments for simulation encounters as part of an assessment plan. They are used for various reasons, ranging from tools that help provide feedback, to research purposes, to high-stakes pass/fail criteria. Enhancing the ability of assessment tools to function as intended is often closely linked to scenario design.

Oftentimes checklists are employed. When designing checklists it is critical that you ask the question “Can I accurately measure this?” It is easy to design checklists that seem intuitively simple and filled with common sense (from a clinical perspective) but are not actually able to accurately measure what you think you are evaluating.

It is quite common to see checklists with items such as “Observes Chest Rise,” “Identifies Wheezing,” or “Observes Heart Rate.” During faculty training sessions focusing on assessment tool development we routinely run scenarios that contain deliberate errors of omission. Yet some of these items are routinely scored, or “checked,” as completed. Why is this? Part of the answer is that we are injecting our own clinical bias into what we think the simulation participant is doing or thinking. This raises the possibility that we are not measuring what we intend to measure or assess.

Consider two checklist items for an asthma scenario: one is “Auscultates Lung Sounds”; another is “Correctly Interprets Wheezing.” The former we can reasonably infer by watching the scenario and seeing the participant listen to the lung fields on the simulator. The latter, however, is more complicated. We don’t know whether the participant recognized wheezing just by watching them listen to the lungs. Many people would check yes for “Correctly Interprets Wheezing” if the next thing the participant did was order a bronchodilator. This would be an incorrect assumption, but it could be rationalized in the mind of the evaluator because of a normal clinical sequence and context.

However, that assumption may be completely wrong: the participant may never have interpreted the sounds as wheezing, but ordered a treatment because of a history of asthma. Or what would happen if the bronchodilator was ordered before auscultation of the lungs? What you have, then, is an item on your checklist that seems simple enough, but is practically unmeasurable through simple observation.

This is where linking scenario design and assessment tools can come in handy. If the item you are trying to assess is a critical element of the learning and assessment plan, perhaps something in the simulation, in the transition out of it, or during the debriefing can cause the information to be made available so the item can be assessed more correctly and accurately.

A solution allowing real-time assessment during the flow of the scenario is possible within the design of the scenario itself. Perhaps insert a confederate playing a nurse caring for the patient who is scripted to ask “What did you hear?” after the participant auscultates the lung fields. This forces the data to become available during the scenario for the assessor to act upon. Hence the term, forcing function.

Another possibility would be to have the participant complete a patient note on the encounter and evaluate their recording of the lung sounds. Alternatively, the participant could simply write down their interpretation of the lung sounds, or the question could be embedded into the context of the debriefing. Any of these methods would provide a more accurate evaluation of the assessment item “Correctly Interprets Wheezing.”
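The distinction between directly observable items and inferential ones can be made explicit in the checklist design itself. As a minimal sketch (the field names, item wording, and `forcing_function` attribute are my own illustrative inventions, not part of any standard instrument), one could tag each item with whether it is scoreable by observation alone, and if not, which forcing function surfaces the evidence:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ChecklistItem:
    """One item on a simulation assessment checklist."""
    text: str
    directly_observable: bool                # can a rater score this by watching alone?
    forcing_function: Optional[str] = None   # scenario/debrief element that surfaces the data

    def is_measurable(self) -> bool:
        # An item is defensibly measurable if it can be directly observed,
        # or if the scenario design forces the evidence into the open.
        return self.directly_observable or self.forcing_function is not None


items = [
    ChecklistItem("Auscultates lung sounds", directly_observable=True),
    ChecklistItem("Correctly interprets wheezing", directly_observable=False,
                  forcing_function='Confederate nurse asks "What did you hear?"'),
    ChecklistItem("Observes chest rise", directly_observable=False),  # inferential as written
]

for item in items:
    print(f"{item.text}: measurable={item.is_measurable()}")
```

Running a checklist draft through a filter like this forces the uncomfortable question from earlier in the post: any item that is neither observable nor backed by a forcing function probably needs a scenario redesign before it belongs on the instrument.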

While not trying to create an exhaustive list of methods, I am trying to provide two things in this post. One is to have you critically evaluate your ability to accurately assess something that occurs within a scenario with higher validity. The second is to recognize that the creation of successful, reliable and valid assessment instruments is linked directly to scenario design. This can occur during the creation of the scenario, or as a modification to an existing scenario to enhance assessment capabilities.

This auscultation item just serves as a simple example. Recognizing the challenges of accurately assessing a participant’s performance is important to the development of robust, valid and reliable tools. The next time you see or design a checklist or scoring tool, ask yourself: can I really, truly evaluate that item accurately? If not, can I modify the scenario or debriefing to force the information to be made available?

 


Filed under scenario design

Beware of the Educational Evangelist!

They are everywhere nowadays, like characters in Pokémon Go. They seem to hang out in high concentration around new simulation centers.

You know the type. Usually they start off by saying how terrible it is for someone to give a lecture. Then they go on to espouse the virtues and values of student-centered education, claiming active participation and small group learning are the pathway to the glory land. They often toss in terms like “flipped classroom.” And just to ensure you don’t question their educational expertise, they use a word ending in “-gogy” in the same paragraph as the phrase “evidence-based.”

If you ask them where they have been in the last six months, you find out that they probably went to a weekend healthcare education reform retreat or something equivalent…

My principal concern with today’s educational evangelist is that they are in search of a new way of doing everything. Oftentimes they recommend complete and total overhauls of existing curricula without a true understanding of how to efficiently and effectively improve them, and without analyzing the resources required to carry out such changes.

Further, the evangelist usually has a favorite methodology, such as “small group learning,” “problem-based learning” or “simulation-based learning,” to which they are trying to convert everyone through prophecy.

An easy target of all educational evangelists is the lecture, and often that is where the prophecy begins. They usually want to indicate that if lecture is happening, learning is not. As I discussed in a previous blog article, lecture is not dead; when done well, it can be quite engaging, create significant opportunities for learning, and be maximally efficient in terms of resources.

If you think about it critically, it is just as easy to do lousy small group facilitation as it is to do a lousy lecture. In either case the potential gains in learning will not reach their maximum. The difference is that small group facilitation, like simulation, generally takes significantly more faculty resources.

The truth is the educational evangelist is a great person to have in and amongst the team. Their desire for change, generally instilled with significant passion, is often a source of great energy. When harnessed, they can help advance and revise curricula to maximize and modernize various educational programs.

However, to be maximally efficient, all significant changes should undergo pre-analysis, hopefully derived from a needs assessment, whether formal or informal. Secondly, it is worth having more than one opinion when deciding the prioritization of what needs to be changed in a given curriculum. While the evangelist will suggest that the entire curriculum is broken, oftentimes a more balanced review reveals that some areas of the curriculum would benefit from such an overhaul, while other aspects are performing just fine.

When you begin to change aspects of the curriculum, start small and measure the change if possible. Moving forward on a step-by-step basis will usually produce a far better revised curriculum than an approach that throws out the baby with the bathwater. Mix the opinions of the stalwarts of the existing curriculum methods with those of the evangelists. Challenge existing axioms, myths and entrenched beliefs like “Nothing can replace the real patient for learning…” If this process is led well, it will allow the decision-making group to reach a considerably more informed position that will lead to sound decisions, change strategies, and guide investments appropriately.

So if you’re the leader or a member of a team responsible for a given curriculum of healthcare instruction and confronted with the educational evangelist, welcome their participation. Include them in the discussions moving forward with a balanced team of people, and have them strive to create an objective prioritization of the needs for change. This will allow you to make excellent decisions with regard to new technologies and/or methods that you should likely embrace for your program. More importantly, you will avoid tossing out the things that are working and are cost efficient.


Filed under Curriculum, Uncategorized

Learning from Simulation – Far more than the Debriefing

Most people have heard someone say, “In simulation, debriefing is where all of the learning occurs.” I frequently hear this when running faculty development workshops and programs, which isn’t as shocking as hearing it espoused at national and international meetings in front of large audiences! What a ridiculous statement, without a shred of evidence or a realistic, common sense reason to think it would be so. Sadly, I fear it represents an unfortunate instructor-centered perspective and/or a serious lack of appreciation for the potential learning opportunities provided by simulation based education.

Many people academically toil over the technical definition of the word feedback and try to contrast it with a description of debriefing as if the two were juxtaposed. They often present it as if one is good and the other is bad. There is a misguided notion that feedback means telling someone, or lecturing to someone, to get a point across. I believe that is a narrow interpretation of the word. I think there are tremendous opportunities for learning from many facets of simulation that may be considered feedback.

Well-designed simulation activities hopefully provide targeted learning opportunities, part of which is experiential, sometimes immersive, in some way. I like to think of debriefing as one form of feedback that a learner may encounter during simulation based learning, commonly occurring after engaging in some sort of immersive learning activity or scenario. Debriefing can be special if done properly and will actually allow the learner to “discover” new knowledge, perhaps reinforce existing knowledge, or maybe even have corrections made to inaccurate knowledge. No matter how you look at it, at the end of the day it is a form of feedback that can lead, or contribute, to learning. But to think that the debriefing is the only opportunity for learning is incredibly short-sighted.

There are many other forms of feedback and learning opportunities that learners may experience in the course of well-designed simulation based learning. The experience of the simulation itself is ripe with opportunities for feedback. If a learner puts supplemental oxygen on a simulated patient who is demonstrating hypoxia on the monitor via the pulse oximetry measurements and the saturations improve, that is a form of feedback. Conversely, if the learner(s) forgets to provide the supplemental oxygen and the saturations or other signs of respiratory distress continue to worsen, that can be considered feedback as well. These two examples are what I refer to as intrinsic feedback, as they are embedded in the scenario design to provide clues to the learners, as well as to approximate what may happen to a real patient in a similar circumstance.

With regard to intrinsic feedback, it is only beneficial if it is recognized and properly interpreted by the learner(s), either while actively involved in the simulated clinical encounter or, if not, perhaps in the debriefing. The latter should be employed if the intrinsically designed feedback is important to accomplishing the learning objectives germane to the simulation.

There are still other forms of feedback that likely contribute to learning and are not part of the debriefing. In the setting of a simulated learning encounter involving several learners, the delineation of duties and the acceptance or rejection of treatment suggestions are all potentially ripe for learning. If a learner suggests a therapy that is embraced by the team, or perhaps stimulates a group discussion during the course of the scenario, the resultant conversation and ultimate decision can significantly add to the learning of the involved participants.

Continuing that same idea, perhaps the decision to provide, withhold, or check the dosage of a particular therapy prompts a learner to consult a reference that provides valuable information and solidifies a piece of knowledge in the mind of the learner. The learner may announce such findings to the team while the scenario is still underway, thereby sharing the knowledge with the rest of the treatment team. Voilà… more learning that may occur outside of the debriefing!

Finally, I believe there is an additional source of learning that occurs outside of the debriefing. Imagine when a learner experiences something, or becomes aware of something, during a scenario that causes them to realize they have a knowledge gap in that particular area. Maybe they forgot a critical drug indication, dosage or adverse interaction. Perhaps something simply stimulated their natural curiosity. It is possible that those potential learning items are not covered in the debriefing, as they may not be core to the learning objectives. This may indeed stimulate the learner to engage in self-study to close that perceived knowledge gap. What???? Why yes, more learning outside of the debriefing!

In fact, we hope that this type of stimulation occurs on a regular basis as part of active learning prompted by the experiential aspects of simulation. Such individual stimulation of learning is identified in the seminal publication of Dr. Barry Issenberg et al. in Vol. 27 of Medical Teacher in 2005 describing the key features of effective simulation.

So hopefully I have convinced you, or reinforced your belief, that the potential for learning from simulation based education spans far beyond the debriefing. Please recognize that the statement quoted above likely reflects a serious misunderstanding and underappreciation of the learning that can and should be considered with the use of simulation. Such short-sightedness can have huge impacts on the efficiency and effectiveness of simulation, beginning with curriculum and design.

So the next time you are incorporating simulation into your educational endeavor, sit back and think of all of the points at which learning may occur. Of course the debriefing is one such activity during which we hope learning occurs. Thinking beyond the debriefing and designing for the bigger picture of potential learning that can be experienced by the participants is likely to help you achieve positive outcomes from your overall efforts.


Filed under Uncategorized

Simulation Curriculum Integration via a Competency Based Model

One of the challenges for healthcare education is the reliance on random opportunity for clinical events to present themselves for a given group of learners to encounter as part of a structured learning curriculum. This uncertainty of exposure, and of the eventual development of competency, is part of what keeps our educational systems time-based, which is fraught with inefficiencies by its very nature.

Simulation curriculum design at present often embeds simulation in a rather immature development model in which there is an “everybody does all of the simulations” approach. If there is a collection of core topics that are part and parcel of a given program, combined with a belief, or perhaps proof, that simulation is the preferred modality for the topic, then those exposures make sense. But let’s move beyond the topics or situations that are best experienced by everyone.

Take the model of physician residency training, for example: curriculum planners “hope” that over the course of a year a given first year resident will adequately manage an appropriate variety of cases. The types of cases, often categorized by primary diagnosis, are embedded in some curriculum accreditation document under the label “Year 1.” For the purposes of this discussion let’s change the terminology from Year 1 to Level 1 as we look toward the future.

What if we had a way to know that a resident managed the cases, and managed them well, for Level 1? Perhaps one resident could accomplish the Level 1 goals in six months, and do it well. Let’s call that resident Dr. Fast. This could then lead to a more appropriate advancement of the resident through the training program, as opposed to advancing by the date on the calendar.

Now let’s think about it from another angle. Another resident didn’t quite see all of the cases, or the variety of cases needed, but manages things well when given the opportunity. Let’s call them Dr. Slow. A third resident in the program is managing an adequate number and variety of cases, but is having quality issues. Let’s refer to them as Dr. Mess. An honest assessment of the current system is that all three residents will likely be advanced to higher levels of responsibility based on the calendar, without a substantial attempt at remediation or an understanding of the underlying deficiencies.

What are the program or educational goals for Drs. Fast, Slow and Mess? What are the differences? What are the similarities? What information does the program need to begin thinking in this competency based model? Is that information available now? Will it likely be in the future? Does it make sense that we will spend time and resources to put all three residents through the same simulation curriculum?

While many operational, cultural, and historical models and working conditions provide barriers to such a model, thinking about a switch to a competency based model forces one to think more deeply about the details of the overall mission. The true forms of educational methods, assessment tools, and exposure to cases and environments should be explored for both efficiency and effectiveness. Ultimately, the outcomes we are trying to achieve for a given learner progressing through a program would be unveiled. Confidence in the underlying data will be a fundamental component of a competency based system. In this simple model, the two functional data points are the quantity and the quality of given opportunities to learn and demonstrate competence.

This sets up intriguing possibilities for the embedding of simulation into the core curriculum to function in a more dynamic way and contribute mightily to the program outcomes.

Now think of the needs of Dr. Slow and Dr. Mess. If we had insight combined with reliable data, we could customize the simulation pathway for each learner to maximally benefit their progression through the program. We may need to provide supplemental simulations to Dr. Slow to allow practice with a wider spectrum of cases, or with a specific diagnosis, category of patient, or situation to which they need exposure. Ideally this additional exposure, which provides deliberate practice opportunities, could also include learning objectives to help them increase their efficiency.

In the case of Dr. Mess, the customization of the simulation portion of the curriculum provides deliberate practice opportunities with targeted feedback directly relevant to their area(s) of deficiency, i.e., a remediation model. This exposure for Dr. Mess could be constructed to present a certain category of patient, or perhaps a situation, that they are reported to handle poorly. The benefit in the case of Dr. Mess is that the simulated environment can often be used to tease out the details of the underlying deficiency in a way that learning in the actual patient care environment is unable to expose.

Lastly, recall that in our model Dr. Fast may not require any “supplemental” simulation, thus freeing up scarce simulation and human resources necessary to conduct it. This is part of the gains in efficiency that can be realized through a competency-based approach to incorporating simulation into a given curriculum.
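The two functional data points named above, quantity and quality, are enough to sketch how such a model might route Drs. Fast, Slow and Mess. The thresholds, field names, and recommendation labels below are invented purely for illustration; a real program would derive them from its accreditation requirements and validated assessment data:

```python
from dataclasses import dataclass

# Hypothetical Level 1 thresholds -- invented for this example only.
REQUIRED_CASES = 20       # quantity: distinct cases managed
REQUIRED_QUALITY = 0.80   # quality: fraction of cases managed well


@dataclass
class ResidentRecord:
    name: str
    cases_managed: int
    cases_managed_well: int

    @property
    def quality(self) -> float:
        # Fraction of managed cases that were handled well.
        return self.cases_managed_well / self.cases_managed if self.cases_managed else 0.0

    def recommendation(self) -> str:
        enough = self.cases_managed >= REQUIRED_CASES
        good = self.quality >= REQUIRED_QUALITY
        if enough and good:
            return "advance"                   # Dr. Fast: done, regardless of calendar
        if good:
            return "supplemental simulation"   # Dr. Slow: widen case exposure
        return "targeted remediation"          # Dr. Mess: deliberate practice with feedback


for r in [ResidentRecord("Dr. Fast", 22, 21),
          ResidentRecord("Dr. Slow", 12, 11),
          ResidentRecord("Dr. Mess", 25, 15)]:
    print(r.name, "->", r.recommendation())
```

The point of the sketch is not the arithmetic but the routing: once quantity and quality are tracked per learner, the simulation curriculum stops being a fixed conveyor belt and becomes a resource allocated where the data say it is needed.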

Considering a switch to a competency based curriculum in healthcare education can be overwhelming simply based on the number of operational and administrative challenges. However, using competency based implementation as a theoretical model can help envision a more thoughtful approach to the curricular integration of simulation. If we move forward in a deliberate attempt to utilize simulation in a more dynamic way, it will lead to increases in efficiency and effectiveness, along with better stewardship of scarce resources.

 


Filed under Uncategorized

Evaluating Inpatient Crisis Response


As the Medical Director of patient safety for a large healthcare system, I can say that conducting unannounced “mock codes” (Inpatient Crisis Response Evaluation System is the title of our program) is a critical pillar of our safety and quality improvement efforts. WISER oversees our program and provides the evaluation and consultation service to many of our 20 hospitals, in close collaboration with local hospital physician and nursing leadership.

The unannounced aspect allows a true system evaluation of such a response. The events are closely choreographed with our simulation team (led by a physician medical director), as well as with local hospital leadership. Our evaluation system has afforded us, as a system, the opportunity to unveil many latent system threats, as well as to identify opportunities for targeted training efforts. With regard to simulation and training, it is a TRUE needs analysis in this way.

With regard to acceptance, I believe it is related to the maturity of the overall organization and of the simulation personnel conducting the events. In the words of James Reason on high reliability organizations: “They anticipate the worst and equip themselves to deal with it at all levels of the organization. It is hard, even unnatural, for individuals to remain chronically uneasy, so their organizational culture takes on a profound significance. Individuals may forget to be afraid, but the culture of a high reliability organization provides them with both the reminders and the tools to help them remember.” Thus I believe that in highly mature safety culture organizations it is incumbent upon both the leadership and the healthcare clinicians to accept “external” evaluations of such critical moments as inpatient crisis events.

I also believe that the naming of the program has significant implications. The title “Mock Code,” in my opinion, sounds somewhat trivial, extra, perhaps of marginal utility, or at the very least “fake.” If that is the intent, then it is easier to argue that the events should be pre-planned and/or avoid being completely “unexpected.” However, if the intent is to seriously evaluate a high reliability organization’s response to an unexpected patient situation, and to identify needs, process improvement opportunities and latent threats, I would argue for the unannounced methodology.

Our health system shares a deep commitment to continuing on the journey to high reliability, and we believe our Inpatient Crisis Response Evaluation System is an important component of our success. As WISER is accredited by the SSH in Systems Integration (among other categories), we believe a fully integrated approach is necessary, very safe, feasible, and our responsibility to execute and to provide feedback to our health system.

As anyone who provides actual care for patients knows, there are risks and benefits to ALL decisions that are made, from therapeutics, to staffing, to salting the parking lot. There are certainly safety items that must be attended to in any of our simulation efforts, particularly those which occur in proximity to actual care. However, carefully crafted programs, processes and execution will ultimately ensure the benefits outweigh the risks.

I truly believe the undiscovered latent system threats to inpatients are a greater risk than the conduct of the mock code itself.


Filed under Uncategorized

Lecture: It’s not Dead Yet

Fellow simulationists, let’s get real. We should not be the enemy of lecture. Lecture is a very valuable form of education. What we should be campaigning against are bad lectures, and the use of lecture when it isn’t the best tool for the associated educational goal.

We have all listened to lectures that were horrific and/or delivered by speakers with horrific public speaking or presenting skills. But in essence a good lecture can be an incredibly efficient transfer of information. The one-to-many configuration that is inherent in the format of lecture can lead to an amazing amount of material being covered, interpreted and/or organized by the presenter to raise the level of knowledge or understanding of the people in attendance.

Like anything else in education, we need to stratify the needs of what we are trying to teach and create solutions by which to teach them. With regard to lecture as a tool, we need to find ways to engage the audience in active participation to enhance the comprehension, learning and attention of the participants. There are many tools available for this, some involving technology, some not. The onus is on the presenter to seek out techniques, technologies, and creative ways to draw the audience into an active learning process.

I don’t think of simulation as an alternative to, or better way to teach than, lecture. I view lecture and simulation as two different tools available to the educational design process to effect good learning, much the same way that I would not say a screwdriver is a better tool than a pair of pliers.

Too many times at simulation meetings and in discussions with simulation enthusiasts I hear empirical lecture-bashing, as if lecture were old school, outmoded, or something lacking value. During these conversations it becomes readily apparent that the speaker doesn’t have full command of the fact that the main goal is education, not simulation, and that there are many ways to create effective learning environments.

Now, lecture can deservedly get a bad rap. Go to a meeting and listen to a boring, monotonous speaker drone on and read from their PowerPoint slides while not even recognizing that there is an audience in front of them. Unfortunately that is still more common than not at many physician and nursing meetings. Or worse yet, in the new age of converting to flipped classrooms and online learning, people are taking the easy way out, plopping videos of lectures online and calling it online learning. How pitiful. How painful. The only thing I can imagine worse than a bad lecture in person is a bad lecture in web based learning that I would have to suffer through.

So I still teach and lead workshops on helping people enhance their lecturing and presentation skills, in part because I continue to recognize that not only will lecture be around for a long time, it should be around for a long time, because it CAN be incredibly powerful with the right preparation and in the right hands. I also continue to recognize the value of modern healthcare education efforts being carefully thought out to understand which tool is best for which phase of learning, after careful evaluation of the intended learner group and the topic at hand.

We need to end the silo-like thinking that simulation is better than lecture and convert to a more outcomes oriented thought process that evaluates and implements the appropriate educational tool for the intended educational accomplishments.

So let’s commit to each other to never do a simulation that could be just as effective as an engaging lecture, AND let’s all agree to never do a lecture that sucks.


Filed under Uncategorized

Simulation can be Fun. And Serious.

I was recently energized while sitting in the back of one of our simulation rooms where two of my faculty colleagues were running simulations for some of our Emergency Medicine residents. They had prepared the session well and had clearly established a great and trusting relationship with the residents, in a safe learning kind of way.

The residents seemed relaxed and smiling, and many attended the session dressed in the likes of khaki shorts, Tevas and a Hawaiian shirt or two. During one of the scenarios the faculty member operating the simulator made a mistake, and the “patient” took a turn for the worse when the correct treatment was ordered. He was on the other side of the glass and immediately said something funny about his mistake over the room speakers, in a self-deprecating way. Everyone in the room was cracking up, including the other faculty members, me, all of the team members and the resident observers. The simulation came to an end a few minutes later once the rest of the learning objectives were met.

During the debriefing the faculty member called out his mistake once again, to another round of snickers. Superficially it seemed that he was trying to be funny. On a deeper level, I think he was level setting to ensure there wasn’t confusion about the change in the patient’s status. Additionally, he was demonstrating the safe learning environment insofar as declaring that he was capable of making mistakes as well.

A few moments later the residents were engaged in a debriefing using the Structured and Supportive Debriefing Model and the GAS tool. During the debriefing many topics were covered, ranging from teamwork, to the initial care and stabilization of the patient, to aberrancies in the electrical system of the heart that lead to wide complex tachycardia that can mimic ventricular tachycardia.

A few minutes later the debriefing was wrapped up expertly by the faculty member. Another scenario ensued with a new group of residents and again, unplanned, something funny happened. Again laughter, then back to work, then the end. The debriefing commenced, and the second debriefing led to a discussion of how cyanide poisoning disrupts cellular metabolic pathways via the cytochrome oxidase system, and of the therapeutics that should be considered to save the patient’s life. During the conversation a few light hearted comments by residents created more laughing.

I sat back thinking… this is really fun… There they are, dressed in their Tevas and shorts… learning, of all things… imagine that. This is truly learner-centric simulation. Innovative education occurring in a comfortable atmosphere, helping these future emergency physicians perfect their diagnostic, therapeutic and leadership skills. They don’t need to be in scrubs, shirts and ties, or wearing hospital badges to optimize this learning opportunity. They are not going to show up to work in the hospital wearing shorts and Tevas. They are professionals. You know what? They are in fact adult learners being treated as adults.

I was a bit envious of my faculty colleagues for having created this amazingly relaxed environment where the residents felt comfortable to speak up, right or wrong, in front of each other and faculty members alike. In fact they were encouraged to explore during the cases. And they were learning: learning new concepts, or at least reviewing topics and learning objectives that were appropriate for their training program.

Guys and gals dressed as if they were going to a picnic, learning from each other, laughing and feeling free to explore and demonstrate their knowledge, skills and attitudes for the purpose of improving. Were they not taking it seriously? Cytochrome oxidase and conduction aberrancies sure sounded serious to me, as did the discussion of teamwork and leadership.

Sometimes I think we can easily take ourselves too seriously in the simulation world. While I would be the first to argue there are times to do just that, I am reminded that there are times when it is not the case. People can get so caught up in defining rules of how things should and shouldn’t be done in simulation encounters that huge opportunities to find new and interesting ways to engage learners are missed. I think that these faculty members knew their participants well and designed amazing learning opportunities for them that harnessed some of the power of simulation.

After all, we are not trying to simulate reality, we are trying to use simulation to create a milieu that will enhance our ability to carry out learning and assessment objectives that will eventually influence the care that is delivered by the healthcare system.

It was a great day for me, simulation and especially for future patients!


Filed under Uncategorized