Category Archives: Curriculum

Sherlock Holmes and the Students of Simulation

I want to draw a comparison between Sherlock Holmes and the students of our simulations! It has important implications for our scenario design process. When you think about it, our students are hypervigilant during a simulation, looking for clues to figure out what we want them to do. Analyzing those clues is much like the venerable detective Sherlock Holmes's process when investigating a crime.

Video version of this post

This has important implications for our scenario design work because we often get confused by the idea that our job is to create reality, when in fact that is not our job at all. As simulation experts, our job is to create an environment with sufficient reality to allow a student to progress through various aspects of the provision of health care. We need to be able to make a judgment and say, "hey, they need some work in this area," and "hey, they're doing well in this area."

To accomplish this, we create facsimiles of what they will experience in the actual clinical environment, transported into the simulated environment, to help them adjust their mindset so they can progress down the pathway of caring for those (simulated) patients.

We must be mindful that in the simulated environment, people engage their best Sherlock Holmes and, as the famous song goes, [they are] "looking for clues at the scene of the crime."
Let’s explore this more practically.

Suppose I am working in the emergency department, and I walk into the room and see a knife sitting on the tray table next to a patient. I immediately think, "wow, somebody didn't clean this room up after the last patient, and there's a knife on the tray." I would probably apologize to the patient and their family.

Fast forward…..

Put me into a simulation as a participant, and I walk into the room. I see the knife on the tray next to the patient's bed, and I immediately think, "Ah, I'm probably going to do a cric (cricothyrotomy) or some other invasive procedure on this patient."

How does that translate to our scenario design work? We must be mindful that the students of our simulations are always hypervigilant, always looking for these clues. Sometimes we include things in the simulation merely as window dressing, or to try to (re)create some reality. Stop and consider, however, that students may misinterpret those items as things that must be incorporated into the simulation for success.

Suddenly, the student sees this thing sitting on the table and decides it is essential to use it in the simulation, and now they are using it, and the simulation is going off the tracks! As the instructor, you're left saying that what happened is not what was supposed to happen!

At times we must be able to go back and look objectively at the scenario design process and recognize that maybe, just maybe, something we did in the design of the scenario, including the setup of the environment, misled the participant(s). If we see multiple students making the same mistakes, we must go back and analyze our scenario design. I like to call the extra things we put into a simulation scenario design "noise." The potential for that noise to blow up and drive the simulation off the tracks goes up exponentially with every component we include in the space. Be mindful of this, and be aware of the hypervigilance of students undergoing simulation.

We can mitigate some of these problems with a good orientation, and by incorporating good practice into our simulation scenario design so that we include only the items in the room that are germane to accomplishing the learning objectives.

Tip: If you see the same mistakes happening again and again, please introspect: go back, look at the design of your simulation scenario, and recognize there could be a flaw! Who finds such flaws in the story? Sherlock Holmes, that's who!

Filed under Curriculum, design, scenario design, simulation

5 Tips to Improve Interrater Reliability During Healthcare Simulation Assessments

One of the most important concepts in simulation-based assessment is achieving reliability, and specifically interrater reliability. While I have previously discussed in this blog that every simulation is an assessment, in this article I am speaking of the type of simulation assessment that requires one or more raters to record data associated with the performance, typically using an assessment tool.

Interrater reliability, simply put, means that if we have multiple raters watching a simulation and using a scoring rubric or tool, they will produce similar scores. Achieving interrater reliability is important for several reasons, including that we usually use more than one rater to evaluate simulations over time. Other times we are engaged in research or other high-stakes uses of assessment tools and want to be certain that we are reaching correct conclusions.
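To make "similar scores" concrete, here is a minimal sketch (in Python, with made-up illustrative ratings) of two common ways to quantify agreement between two raters scoring the same checklist items: raw percent agreement, and Cohen's kappa, which corrects for the agreement you would expect by chance alone.

```python
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Fraction of items on which the two raters gave the same score."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (observed - chance) / (1 - chance) agreement."""
    n = len(rater_a)
    p_o = percent_agreement(rater_a, rater_b)
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    # Chance agreement: probability both raters pick a category independently.
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n)
              for c in set(rater_a) | set(rater_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical checklist scores ("done" / "not done") from two raters.
a = ["done", "done", "not done", "done", "not done", "done"]
b = ["done", "done", "not done", "not done", "not done", "done"]
print(percent_agreement(a, b))          # 5 of the 6 items match
print(round(cohens_kappa(a, b), 2))     # lower, because chance is discounted
```

Kappa is always lower than raw agreement when chance agreement is possible, which is why it is the more honest number to report for the high-stakes uses described above.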

Improving assessment capabilities for simulation requires a significant amount of effort. The amount of time and effort that goes into the assessment process should be directly proportional to the stakes of the assessment.

In this article I offer five tips to consider for improving interrater reliability when conducting simulation-based assessment.

1 – Train Your Raters

The most basic and most often overlooked aspect of achieving interrater reliability is the training of the raters. Raters need to be trained on the process, the assessment tools, and each item of the assessment on which they are rendering an opinion. It is tempting to think of subject matter experts as knowledgeable enough to fill out simple assessments; however, detailed testing will often reveal that the scoring of an item is truly in the eye of the beholder. Even simple items like "asked medical history" may be difficult to score reliably if they are not defined prior to the assessment activity. Other things that require rater calibration/training may also affect the assessment, such as limitations of the simulation, how something is being simulated, and overall familiarity with the technology that may be used to collect the data.

2 – Modify Your Assessment Tool

Modifications to the assessment tool can enhance interrater reliability. Sometimes the change is as extreme as removing an assessment item because you discover you cannot achieve reliability despite iterative attempts at improvement. Less drastic changes can come in the form of clarifying the text directives associated with an item. Sometimes removing qualitative wording such as "appropriately" or "correctly" can improve reliability. Adding descriptors of expected behavior, or behaviorally anchored statements, to items can also help. However, these modifications and qualifying statements should then be addressed in the training of the raters, as described above.

3 – Make Things Assessable (Scenario Design)

An often-overlooked way to improve interrater reliability is to modify the simulation scenario itself to make things more "assessable." We make a sizable number of decisions when creating simulation-based scenarios for educational purposes, and additional decisions and functions can be designed into the scenario to make assessments more accurate and reliable. For example, if we want to know whether someone correctly interpreted wheezing in the lung sounds of the simulator, we can introduce design elements into the scenario that help us gather this information accurately and thus increase interrater reliability. We could embed a person in the scenario playing the role of another healthcare provider who simply asks the participant what they heard. Alternatively, we could have the participant fill out a questionnaire at the end of the scenario, or even complete an assessment form regarding the simulation encounter. Lastly, we could embed the assessment tool into the debriefing process and simply ask the participants during the debriefing what they heard when they auscultated the lungs. There is no single correct way to do this; I am trying to articulate different solutions to the same problem that could work depending on the context of your scenario design.

4 – Assessment Tool Technology

Gathering assessment data electronically can help significantly. Compared to a paper-and-pencil collection scheme, technology-enhanced or "smart" scoring systems can assist. For example, if there are many items on a paper scoring tool, the page can become unwieldy to monitor. Electronic systems can continuously update and filter out data that does not need to be displayed at a given point during the unfolding of the simulation assessment. Simply having previously evaluated items disappear off the screen can reduce the clutter associated with scoring tools.
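As an illustration of this filtering idea, a hypothetical electronic checklist (the class and item names below are mine, not from any real product) might simply hide items once they are scored, so the rater only ever sees what is still pending:

```python
class SmartChecklist:
    """Minimal sketch of a scoring tool that hides already-scored items."""

    def __init__(self, items):
        # None means "not yet scored by the rater".
        self.scores = {item: None for item in items}

    def score(self, item, value):
        self.scores[item] = value

    def pending(self):
        # Only unscored items are displayed, reducing on-screen clutter.
        return [item for item, value in self.scores.items() if value is None]

checklist = SmartChecklist(["Washed hands", "Asked medical history",
                            "Checked airway"])
checklist.score("Washed hands", "done")
print(checklist.pending())  # only the two items still awaiting a score
```

A real system would also timestamp entries and sequence items by scenario phase, but the core benefit is the same: the rater's attention stays on what remains to be observed.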

5 – Consider Video Scoring

For high-stakes assessment and research purposes, it is often wise to consider video scoring. By high stakes I mean pass/fail criteria associated with advancement in a program, heavy weighting of a grade, licensure, or practice decisions. The ability to add multiple camera angles, as well as the ability to rewind and replay events that occurred during the simulation, is valuable in improving the scoring accuracy of the collected data, which in turn improves interrater reliability. Video scoring requires considerable time and effort, and thus should be reserved for the times when it is necessary.

I hope that you found these tips useful. Assessment during simulations can be an important part of improving the quality and safety of patient care!

If you found this post useful please consider subscribing to this blog!

Thanks and until next time! Happy Simulating.

Filed under assessment, Curriculum, design, scenario design

Adjuncts to Enhance Debriefing

I wanted to discuss some ideas of using adjuncts as part of your debriefing.

When we think about debriefing, we often think about a conversation between faculty member or members and participants of simulation with a focus on everyone developing an understanding of what they did right as well as what they need to improve upon.  We rarely think about the possibility of including other “things” to enhance the learning that comes from the debriefing.

I tend to incorporate adjuncts into many of the debriefings associated with the courses I design. By adjuncts, I mean things added to the debriefing process/environment that can enhance the discussion. Sometimes I do so with deliberate purpose, and other times just to mix it up a little so that the debriefing is not only a dialogue between the participants and the faculty. An adjunct may be something technical, or something as simple as a paper handout.

Simple Task Trainer as an Adjunct

Some ideas for adjuncts include a PowerPoint slide deck, or a few targeted slides, that helps review a complex topic, one that requires deeper understanding, or a subject that benefits from repeated exposure.  Another type of adjunct is the simulator log file, which can help set the stage for the debriefing and create a pathway of discussion that chronologically follows what happened during the simulation.  Another could be a partial task trainer or a model that helps describe or demonstrate something.  For example, if the students forgot to do a jaw thrust or open the airway, we can incorporate a task trainer or a teaching aide into the discussion during the debriefing.

Example of an Algorithm Poster on the Wall

Other things that I use are charts, graphs, and algorithms that represent best practices.  When I debrief during my difficult airway management course for physicians, I have the algorithm hanging on the wall as a poster.  We use the algorithm poster as a pathway to compare the performance of the participants of the simulation with the ideal case.  You can use the adjunct learning aid as a reference to standards.  This can help take you out of a direct argument of right vs. wrong.  The adjunct becomes a third-party messenger of best practices when I have the participants compare their performance against what appears on the algorithm, allowing them to discover their own variations from the expected standard.  It tends to create powerful learning moments without the faculty having to be "the bearer of bad news!"

I think that if you start to think strategically about how to incorporate adjuncts into your debriefings, you will find the students more satisfied with the debriefing.  It also increases the stickiness of the learning and creates a more enjoyable experience for the faculty member as well as the participants.  Try it!  It does not have to be fancy!

Thanks, and as always,

Happy Simulating!

Filed under Curriculum, debriefing

Cognitive Load Control and Scenario Design in Healthcare Simulation

As the design architects of simulation scenarios, we must remain cognizant of our ability to have influence over the cognitive load of those experiencing our simulations in the role of learners.

When caring for patients in real life, we expend cognitive energy in doing so to ensure we make the right decisions to provide the absolute best care for every patient. We engage in critical thought processes, that guide our interpretation of the enormous number of facts surrounding each patient so we can make further decisions to provide various therapies, or advice to the patient.

When we design simulations for our learners, we are creating environments similar to those noted above, environments that demand a significant cognitive workload for the participant to successfully navigate the case and care for the [simulated] patient. In addition, I argue that we add further cognitive workload by subjecting someone to the simulated environment insofar as they are engaged in a conscious, or perhaps subconscious, pursuit of deciding what is simulated and what is not. I have previously written about this and dubbed it the cognitive third space of simulation.

Nonetheless, there is mental energy spent in the care of the patient as well as the interpretation of the simulation. We also must realize that our design choices inside of the scenario contribute to the adjustment of the cognitive load endured by the learner(s) associated with our simulations. It is important that we be deliberate in our design to ensure that we are allowing all involved to achieve the desired learning outcomes.

Some specific examples may help bring this cognitive load influence into focus. Take a test result, for example. If one looks in the electronic health record and sees the values reported for a simple test, like a basic metabolic profile (which consists of sodium, chloride, potassium, CO2, BUN, creatinine, and glucose), a certain amount of mental energy goes into the interpretation of the numeric data presented for each of the seven items. Some electronic health records may color-code the results to assist in the processing of normal versus abnormal values, and some may not.

Such decisions in the human factors design of the electronic health record actually influence the amount of cognitive spend on the interpretation of a given value. Further, as experienced clinicians are keenly aware, we must interpret the lab value in the context of the patient for whom the test has been ordered. What is normal for one patient may not be normal for another. Thus, even in the interpretation of a simple test, there is a significant amount of cognitive processing (critical thought) that should be applied.

How does this relate to simulation scenario design? We have the ability to engineer the scenario design to help the participants channel cognitive energy into the things that are important and away from those that are not. If we continue with the basic metabolic profile example, we have choices in how those values are reported to the participants of our simulation.

We could have the participants look it up in the simulated electronic health record, which takes time and cognitive processing as described above. We could give them a piece of paper, or display the results on a screen, showing the seven values; this still takes significant cognitive processing to interpret. Or we could simply indicate that the basic metabolic profile result was "normal."  This last method significantly decreases the cognitive processing associated with the seven values and how they are to be interpreted in the context of the scenario. One could also argue that we are offering subtle, or perhaps not-so-subtle, clues that the basic metabolic profile is not a major part of what needs to be processed in the care of this particular patient.
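To make this design choice tangible, one could imagine (purely hypothetically; the names and values below are mine) a scenario definition that records, per test, how much detail is surfaced to the participant:

```python
# Hypothetical sketch: a scenario author chooses, per test, how results are
# surfaced, trading realism against cognitive load.
FULL_VALUES = "full_values"    # all seven numeric results, as in the EHR
SUMMARY_ONLY = "summary_only"  # a single interpreted word, e.g. "normal"

# Illustrative (normal-range) basic metabolic profile values.
bmp_results = {"Na": 140, "Cl": 102, "K": 4.1, "CO2": 24,
               "BUN": 14, "Cr": 0.9, "Glu": 95}

def report_result(results, mode):
    """Return what the participant actually sees for this test."""
    if mode == FULL_VALUES:
        # High cognitive load: the participant must interpret each value.
        return ", ".join(f"{name} {value}" for name, value in results.items())
    # Low cognitive load: the interpretation is done for the participant.
    return "normal"

print(report_result(bmp_results, SUMMARY_ONLY))  # -> normal
```

The point of writing it down this way is that the reporting mode becomes an explicit, reviewable design decision tied to the learning objectives, rather than an accident of whatever the sim lab happened to print out.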

It is important to realize that all the options above are viable, and no one of them is superior to another. What matters is that the decision made during the design of the case allows the participant(s) to focus the appropriate cognitive spend on what the designers of the scenario feel is most important. In other words, if the learning objectives call for the participant to evaluate the actual values of the basic metabolic profile, then of course it is appropriate to provide the information at that level of detail. If, however, the results of the same test are incidental to the bigger picture of the case, then one should consider a different mechanism of resulting the values to the simulation participant.

A common misperception in the design of healthcare simulation scenarios is to try to re-create the realistic environment of the clinical atmosphere. While this is always a tempting choice, it is not without consequences. It comes from the mistaken belief that the goal of simulation scenarios is to re-create reality. Modern, successful simulationists need to recognize this outmoded, immature thought process.

When the basic metabolic profile is not significantly important to the case, we should not design the "dance" (scenario) to include the steps of looking in the electronic health record and making determinations about the values associated with the test. It is a waste of time and, more importantly, a waste of cognitive processing, which is already artificially increased by the participant being involved in the simulation in the first place. It is, in my opinion, a violation of the learner contract between faculty and students.

While I am focusing on the simple example of a single test, I hope you can imagine how this concept extrapolates to the many, many decisions made in the scenario design process. Think about a chest x-ray, for example. Do you result the chest x-ray as "normal," "abnormal," or otherwise during the run time of the scenario? Or do you show an image of the chest x-ray and have your participants interpret it? One answer is not superior to the other. It is just critically important that you evaluate what is best for the cognitive load of the learners involved in your scenario, and how the decision relates to the learning objectives you wish to achieve during the simulation activity.

In moderately to highly complex healthcare simulation cases, the designer, or architect, of the simulation has a responsibility to craft the scenario to accomplish the intended learning objectives. In many scenarios, hundreds of decisions are made about how participants will extract data from the experience to incorporate into their performance. It is critically important that as designers of such learning events we remain cognizant of the cognitive load placed upon our learner(s): the load associated with the normal care of patients, plus the extra load imposed by participating in a simulation-based case.

Many of the decisions we incorporate into the design of our scenarios have significant influence over this cognitive load and the mental energy participants must spend to engage. We need to understand the impact of our choices and be deliberate with our design decisions to enhance the efficiency and effectiveness of the overall simulation-based learning process.

Filed under Curriculum, design, scenario design, simulation, Uncategorized

Where do we Debrief?

Selecting the location for the debriefing after a simulation is a decision with many variables. Sometimes choices are limited, and the decision is dictated by what space is available, or by which space holds the technology deemed essential to the debriefing. Other times there is deliberate planning and selection.

This short video explores some of the basics of how such decisions are made and some of the pros and cons associated with the final choices.

Filed under Curriculum, debriefing, simulation, Uncategorized

Exploring the Elements of Orientation and (Pre)Briefing in Simulation Based Learning Design

I want to explore a little bit about orientation and (pre)briefing(s) as simulation-based education design concepts. The words are often tossed about somewhat indiscriminately. However, it is important to realize that both are important elements of successful healthcare simulation, and that they serve distinct purposes.

When we look in the Healthcare Simulation Dictionary, we find that the definition of Orientation is aligned with an overview preparation process including “… intent of preparing the participants.” Examples include center rules, timing and the simulation modalities.

On the other hand, according to the same dictionary the definition of the word Briefing includes “An activity immediately preceding the start of a simulation activity where participants receive essential information about the simulation scenario….”

I look at orientation as the rules of engagement. I like to think of orientation linked to the overall educational activity in total. Some essential components include orientation to the simulation center, the equipment, the rules, and the overall schedule for the learning activity.

At a somewhat deeper level of thought I think the orientation is linked to the learning contract. What do I mean by that?

I think it is essential that we as the faculty are establishing a relationship with our learners and begin to establish trust and mutual respect. To that end, we can use orientation to minimize surprises. Adult learners do not like surprises!

We need to have the adult learner understand what they can expect. I always orient the learners as to what will feel real, and I am similarly honest with them about what will not feel real. If they will be interacting with a computerized simulator for example, I orient them to the simulator before the start of the program.

In the simulation world we throw around words like debriefing, scenario, and task training. To clinical learners these terms may be unfamiliar, or may carry different contexts. This can cause anxiety, and during the orientation we need to walk them through the experience they are about to embark upon.

Some factors can influence the amount and depth of the orientation, such as the familiarity your participants have with simulation, with your simulation center, and with your simulation-based encounters. For example, learners who come to your center on a monthly basis probably need less total orientation than those who are reporting for the first time. Learners familiar with the fact that debriefings occur after every simulation may already be acclimated to that concept, but people coming to the sim center for the first time may not be aware of it at all.

Participants just meeting you for the first time might need a little more warming up, and that can come in the form of orientation. Overall, though, it is not so much about telling them what's going on as it is about using the opportunity to earn their trust and confidence in the simulated learning encounter(s) and the value associated with them as professionals.

Switching the focus to the brief, briefing, or (pre)briefing: the briefing is linked more to the scenario than the orientation is. The briefing should focus on the details of the case at hand, introducing information that allows participants to acclimate to what they are going to need to accomplish during the simulation. What are their roles and goals in the scenario they are about to embark upon? If you are going to ask people to play roles different from those they hold in real life, it is very important that this fact is crystal clear in the briefing.

I think that the briefing should also bring context to the healthcare experience. It is important to orient the learner, before the impending encounter, to what they are to perceive and think of as real as they experience the simulation. You, as simulation faculty, may think it obvious that a room in your simulation center represents an ICU. The participant may not, and they deserve clarity prior to the start of the simulation so they do not feel they are being tricked or duped. During the briefing, the statement "You are about to see a patient in the ICU….." can remove such ambiguity.

Another critical briefing point is to clarify the faculty-student engagement rules to be expected during the scenario runtime, if this was not covered in the orientation. There are many correct ways to conduct simulation scenarios, with varying levels of interaction between the faculty members running the simulation and the learners participating in it. This should be clarified before the scenario starts.

For example, are you going to let the learners ask questions of the faculty member during the simulation? Or not? This should be covered upfront in the briefing, and perhaps aspects of it in the orientation as well.

While not a requirement, I think that parameters around time expectations are always good to give in a briefing. For example: "You are going to have 10 minutes in the scenario to accomplish X, Y, and Z, and then we will have a ten-minute debriefing before the next scenario."

Remember, our adult learners don't like surprises! I always use the briefing before a scenario to remind the participant(s) that afterward we are going to have a debriefing. I remind them so they know they should collect their thoughts and ideas and be ready for that discussion. Second, I am saying, in an unspoken way, that if they are uncomfortable about something, or have questions, there will be an opportunity for discussion during the debriefing. (In other words, you're sort of giving some control back to the learner…. helping to build the trusting relationship.)

Some of the variations of the briefing are similar to those of the orientation mentioned above. People who are more familiar with simulation, your particular programs, and your style may require slightly less of a briefing than others. Additionally, if you are running multiple scenarios as part of a simulation-based course, you will find that after the first couple of scenarios the briefing can be shortened compared to the beginning of the day.

So, in summary, orientation and briefings are distinct elements of simulation-based learning, each useful in different ways that contribute to the success of your simulations.

Think of the orientation as linked to the bigger picture and the learner contract that makes the relationship comfortable between the participants and the faculty. The orientation covers the rules of engagement, orientation to the technology, and being explicit about what is expected of the participant. Think of the briefing as linked more to the scenario: roles, goals, and introductory patient and environment information that helps the participant mentally acclimate to what they are about to dive into.

Filed under Curriculum, scenario design, simulation, Uncategorized

Simulation, Music, and Dancing

Many of you know of my crazy thoughts and ideas to try to connect things together, with contrasts and comparisons, to help people understand concepts and ideas. Well…. here goes another one of them!

I find that people continuously struggle to understand the true relationship of the scenario (defined as the collective information, tools, and techniques presented to participants of simulations) to the outcomes of the simulation. The confusion arises because people receive inappropriate messaging during the formative times of their simulation careers.

People gain the idea that the scenario must be as real as possible, or must perfectly mimic some aspect of real-life healthcare, in order to be effective, instead of recognizing that its sole purpose is to create a script and stage that allow participants to perform. Some people believe the overall goal of simulation is to recreate reality. The sad part is that those misguided thoughts often lead to over-production of the scenario, and to the scenario becoming the primary focus of the activity. This can have the unintended consequences of increasing the workload of the simulation relative to the value of the performance improvement, and/or introducing confusion for the participants of the scenario. Neither is desirable.

It occurred to me recently that a terrific analogy can be made by evaluating the relationship of music, to competitive dance. As it turns out the scenario is simply the music.

Think about it. When a dancer or group of dancers competes, a number of things must be in place. First, there is an understanding that the dance will be carried out to the playing of music. The activity will last a certain length of time, involve one or more people who are supposed to do certain things at certain times, and various details will be assessed or evaluated along the way. At times the evaluation may be structured to focus on improvement (formative), with feedback shared along the way (deliberate practice preparing for a competition), while at other times it may be a high-stakes evaluation (summative) resulting only in a score (the actual competition).

Now let's focus on the music. What is its purpose in a dance competition? If you think about it, the music provides the framework or backdrop against which the dancing occurs. It helps coordinate the tone, the tempo, and the activities associated with the dance. If the objective is to assess a pair of dancers doing a waltz, then a waltz is played, and the learning objective would read: at the conclusion of this five-minute activity, the participants will demonstrate the ability to perform a waltz. If we wanted to evaluate a Latin dance, we would play Latin music and have appropriate assessment criteria by which to guide the improvement of the activity.

While it is technically possible for the assessment to occur in the absence of the music, it would be awkward for the participants and the evaluators as well. Further, a piece of music may be specifically chosen to encourage a certain dance move that would facilitate the evaluation of the activity, let’s say a twirl or a flip. If we needed to evaluate or score how well one performed a flip, a flip would need to occur during the dance.

When using the methods of simulation in the healthcare world, we need to see people dance. The dance we need to see is often a complex one involving the delivery of healthcare, but it is a dance nonetheless, involving specific movements, communications, and other activities toward a specific goal. There are times when we need to see individuals dance, other times teams.

If we are to evaluate a certain element of healthcare, then we must have carefully composed the music that prompts the desired activity to occur during the dance. As they dance, we perform an assessment with the goal of helping them improve through various feedback mechanisms. Such feedback may occur through active reflection and facilitated discussion (debriefing), self-reflection, peer-to-peer engagement, or perhaps the delivery of a more formal score in the case of summative feedback.

The bigger point is, the scenario is constructed and executed (composed) to provide the background milieu that forms the basis of the dance, i.e., to have participants perform the activity that we wish to assess. We choose different types of music to play that are concordant with the activity we wish to evaluate. At times we play a tune that accentuates the evaluation of critical thinking skills, perhaps the performance of a complex skill, or maybe one that allows a whole team to dance together, requiring teamwork that will benefit from feedback.

So, the next time you are composing your scenario, give careful consideration to the moves that you desire to evaluate. The music that plays should allow and encourage your dancers to perform the steps and activities that will be evaluated and turned into useful information to facilitate improvement.

Compose, have people dance and help them get better!

If you enjoy or find my musings helpful, please sign up for my blog!


Filed under Curriculum, scenario design, Uncategorized

Beware of Simulation Posers!

You may be a simulation poser if you say or do three or more of the following things…..

1. You say something like “In simulation all of the learning occurs during the debriefing.”
Appraisal: Not true. You are lying, uninformed, or not creative.
Not even close. If you believe this, you are not paying attention to the other learning opportunities that participants of simulation can avail themselves of. Think about the status changes of a simulator in response to proper or improper treatment. Think about potential participant-to-participant interactions. Think about the potential for instructor–participant interactions that may contribute to learning. The possibilities are practically limitless! For more, see this blog post.

2. You claim there is a magic ratio of simulation time to debriefing time. “for every 15 minutes of simulation you must debrief for 45 minutes…. Etc.”
Appraisal: Rubbish.
No such thing exists. In fact, if you think about it, this is utterly ridiculous given the number of variables that may potentially influence the debriefing time: things like the topic, the number of learners, the experience level of the learners, the number of faculty, the experience of the faculty, and on and on. Just stop saying it and the perception of your (simulation) IQ will rise by 10.

3. You espouse that during simulation encounters the students and faculty must be separated by something like a glass wall.
Appraisal: Lack of creative thinking.
While there are a lot of good reasons to design simulations that physically isolate the faculty from the participants, there are as many compelling reasons to have faculty in the same room and even, at times, interacting ……. (agghhhast) with the participants. Think about the possibilities. Faculty side by side with students could engage in coaching and formative assessment, or more easily conduct pause-and-discuss or pause-and-reflect types of learning encounters that can be more awkward from the other side of the wall!

4. You say the simulator should never die during a simulation.
Appraisal: Wrong
‘Nuff Said on this one.

5. Simulations must have every aspect designed to be as real as possible.
Appraisal: Simply Crap.
Trying to create an ultra-real environment can increase setup and cleanup time and otherwise make the simulation less efficient. Worse yet, creating a lot of simulated artifacts can actually lead to increased confusion. How? Read this blog post on the cognitive third space of simulation. Simulations should be designed and outfitted with enough realism to enable the accomplishment of the learning objectives. Everything else is a waste of time, money, and/or people resources (ironically, the same things you probably say you don’t have enough of).

6. You say during simulations participants must/will suspend disbelief.
Appraisal: Ridiculous.
Out of the other side of your mouth you probably babble about adult learning theory……
If we are educating seriously smart adults, we don’t want them to think the plastic simulator is real. Seriously. I like to think of a more mature understanding of the situation that gives the participants a bit more credit for their lifetime of cerebral accomplishments. How about a message like… “We have created this learning encounter using simulation so we can work together to help you become a better healthcare provider. Some of what you are going to experience will seem realistic and some will not. But we promise to make the best use of your time and treat you with dignity and respect as we help you learn and practice.” Now that’s how adults talk. (Mic drop)

7. You claim one debriefing model is far superior to another. Or one has been validated.
Appraisal: Crap that gets sold at debriefing training programs.
If you are saying this, you probably don’t use a structure in your debriefing, don’t believe in learning objectives, or only know one model of debriefing.
Truth is, there are a bunch of good debriefing models in existence. You would do well to learn a few. Different models of debriefing are like tools in a toolbox: some are good for certain topics, learners, and situations, and some for others.

8. You state that you should always use video while debriefing.
Appraisal: Industry sponsored rubbish.
You have drunk some serious Kool-Aid, have had the wrong mentor, or had an improper upbringing if you believe this. Further, if you make your participants watch the entire simulation on video, you should receive a manicure with a belt sander. Lastly, if you say you use the video to settle disputes about what a student did or didn’t do, you may be hopeless.
Video can be a tool that is strategically used to enhance debriefings at times. But more often, video playback gets used as a crutch to make up for a lack of quality debriefing skills and to fill time.
There is also a misguided belief that students want to watch their videos. They don’t. They hate it. They think they look fat and their hair doesn’t look good.
Harnessing the power of a good debriefing is hard work and requires skill. But active reflection and guiding students toward self-discovery of what they did well and what they need to change for the future is serious active learning. The more you can do that, the more learning will occur. Watching a video of a simulation is like watching a bad movie. I always find it fascinating that simulation programs will spend a fortune putting in a video system that could film a Hollywood movie, but won’t invest even a fraction of that cost in the development of the faculty.

9. You use the terms “High and Low Fidelity Simulations” when you are referring to the use of a high technology simulator in your simulations.
Appraisal: You are feeding into the biggest industry-sponsored term there is. In fact, the word fidelity is so perverse it should be banned. See this additional blog post on banning the “F” word.
The highest-fidelity human simulator I know of is a real person playing the role of a standardized or simulated patient. Everything else is, overall, lower fidelity.
Seriously folks… Somewhere along the way, industry labeled a couple of simulators high fidelity because they had a feature or two that approximated those of a human. The label stuck and continues to perpetuate great confusion throughout the simulation community, in practice and in the literature as well. Some centers even name their rooms like this!!!

Sadly, this crazy definition even made its way into the simulation dictionary of the Society for Simulation in Healthcare (which is otherwise excellent, I might add). Do high-technology simulators have some very cool and very useful high-technology features? Absolutely! But real like a person, i.e., high fidelity? Not so much.

The next time you think your SimMan or HPS is a high-fidelity simulator, try doing a knee exam and compare it to a real person. Better yet, lock yourself in a room with either or both of them and hold a 30-minute conversation. Then send me a note on how the fidelity strikes you.

10. You tell your institution you will make a profit with your new simulation center.
Appraisal: You’re setting yourself up for trouble.
It just doesn’t happen very much. Everyone has a “business plan” and tries to justify the costs, appeasing finance people with rows and rows of imagined potential revenue sources that often include internal and external components. Somehow, some way, they just never seem to all pan out. Most simulation programs are a cost center to the institution that sponsors them. They are an important investment, but not a profit-motivated investment for the institution. It is far better to focus on the value statement that you are bringing to your institution(s) than to try to convince your boss’s boss that the institution will get rich off of your program. Focusing on the value you produce that is aligned with your institution’s mission may help you grow support for your program, as well as help you keep your job a little bit longer.


Filed under Curriculum, debriefing, design, simulation

The First Four Steps of Healthcare Simulation Scenario Design

How can you make your scenario design process more consistent and efficient? One way is by following a step-by-step method to create your masterpieces!

In this post I cover the first four steps of a proven scenario design process.
There are four core steps that must be done in order. After the first four are accomplished, you can branch out and be a little more variable in your approach to scenario design.


Step One: Pick A Topic

Picking a topic may seem like common sense but there is a lot to think about.

In healthcare simulation we have many topics to choose from. But in step one we want to be a little more specific and figure out what the major topic to be covered is. We may be teaching physiologic, diagnostic, or treatment content where people are going to be making critical decisions, ordering medications, and delivering other therapy; or perhaps our primary focus is going to be on team training, teamwork, communication, and team leadership. You get to pick!

Step Two: Define the Learner(s)

This is really important because in order to go on to the next step, designing the learning objectives, we have to understand our learner population. For example, what do you expect of a fourth-year medical student in terms of being able to evaluate and treat a simulated patient who is complaining of chest pain? Now contrast that with learners who are second-year medical students and haven’t had any clinical experience. In other words, we can take the same topic, but as applied to two different populations, our expectations and what we are going to be evaluating are very different.

Step Three: Design the Learning Objectives

This is where you want to go into detail, great painstaking detail, about what you’re trying to accomplish with the simulation scenario. It is very important to take time on this step. Many people tend to gloss over this step which can create confusion later.

Let’s take a topic example: asthma in the emergency department. When you think about asthma in the emergency department, there are many subtopics or areas from which to choose. The focus could be on competence in managing a minor asthma attack, or it could be a first-ever asthma attack, the management of chronic asthma, or a major, life-threatening situation.

Carefully consider what we want from the learner group we defined in step two. Do you want them to diagnose? To treat? To critically compare and contrast the case with other causes of shortness of breath in an acute patient? You get to choose!

Perhaps we want to focus on the step-by-step history presentation or the physical exam or maybe we want to see the learners perform treatment. Or maybe we want to see the overall management or the critical thinking that goes on for managing asthma in the emergency department. There are many possibilities, largely driven by your intended learner group demographics.

So, in other words, we are taking the big topic of asthma and narrowing it down to answer the question of exactly what we want our learners to accomplish by the end of the scenario. We can’t just assume that what is supposed to happen in the real clinical environment will or should happen in the simulation environment. That rarely works. We actually want to later engineer the story and situation to allow us to focus on the learning objectives.

Step Four: Define the Assessment Plan

How are you going to assess that each objective defined in step three was accomplished? That is the fundamental thought process for step four.

What are you going to be watching for when you, the creator of this simulation scenario, watch the participants do their thing? What are you going to focus your attention on that you will bring into the debriefing? What are you picking up on that you might use to fill out assessment tools?

Define your assessment plan with specificity about what you’re looking for. This is different from designing the assessment tools, which could come later, or perhaps not at all. It is important to remember that every simulation is an assessment of sorts. See the previous blog post on this!

This doesn’t mean that every simulation needs an assessment tool like a checklist, rating scale, or formal grading scheme. It simply refers to considering how to focus the facilitating faculty member, or teacher, or whatever you call them, who is observing the simulation. Remember that to help the learner(s) of the simulation get better, the faculty need to be focused on certain things to ensure that the goals of the scenario are accomplished for our selected learner group, associated with the topic we chose in step one.
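If it helps to see the four steps side by side, they amount to filling in a small design template, in order, before any story is written. Here is a minimal sketch in Python; the `ScenarioDesign` structure and the example asthma values are hypothetical illustrations for this post, not a published tool:

```python
from dataclasses import dataclass, field

@dataclass
class ScenarioDesign:
    """The first four design steps, captured in order, before any story exists."""
    topic: str                                                 # Step 1: pick a topic
    learners: str                                              # Step 2: define the learner(s)
    objectives: list[str] = field(default_factory=list)        # Step 3: learning objectives
    assessment_plan: list[str] = field(default_factory=list)   # Step 4: what observers watch for

# Example: the asthma-in-the-ED topic, narrowed for a specific learner group
design = ScenarioDesign(
    topic="Life-threatening asthma in the emergency department",
    learners="Fourth-year medical students",
    objectives=[
        "Recognize signs of a severe, life-threatening asthma exacerbation",
        "Initiate appropriate first-line treatment",
    ],
    assessment_plan=[
        "Did the participant verbalize an assessment of severity?",
        "Were first-line treatments ordered in a timely sequence?",
    ],
)

# Step 4 asks: for each objective, what will the observing faculty watch for?
assert len(design.assessment_plan) >= len(design.objectives) - 1
print(f"{design.topic}: {len(design.objectives)} objectives defined")
```

Notice that there is still no field for the story; the story is engineered afterward to make the objectives and the assessment plan observable.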

Lastly, what I want to point out to you is that you should notice something missing. The story!

The story comes later. Everybody wants to focus on the story because the story is fun. It’s often related to what we do clinically, and it replicates the fun things that bring in the theatrics of simulation! But what we really want to do is use the theatrics of simulation to cause the actors on the stage (the participants) to do some activity. This activity gives us the situation in which to focus our observations on the assessment of performance. This in turn allows us to accomplish the learning objectives of the scenario and help the participants improve for the future!

Until next time, Happy Simulating!


Filed under Curriculum, design, scenario design, simulation, Uncategorized

Five Tips for Effective Debriefing

There is no doubt that debriefing is an important part of simulation-based education efforts. Further, doing a good debriefing is not necessarily easy. Practice, self-reflection, and getting training can help dramatically. Seeking out help from experts and experienced people can be invaluable. Also, there are many resources from which to learn more about debriefing. I encourage you to take advantage of them!

Here are just five random tips in no particular order to help you increase the effectiveness of your debriefing!


  1. Know what the goal(s) are. Be specific.

Too many times, simulation scenarios are executed and the faculty member is just kind of winging it during the debriefing. It is a far more effective strategy to be keenly aware of the learning outcomes and goals prior to the simulation. This will allow you to focus your thoughts and ideas on helping the participants get better during the simulation, which can be carried forward into your debriefing efforts. If you constrain the debriefing to the learning objectives of the simulation, it is often easier to organize the information and get across the salient points needed to achieve the learning outcomes. It is particularly important to remember that you can’t teach everything with every scenario. The participant brain can only take in or process so much information in any one sitting. Think of a sponge completely saturated with water that can’t take any more!

  2. Have a framework or structure in mind

Having a structure for your debriefing ahead of time, or perhaps adopting a model of debriefing, can help you significantly in overcoming the challenging parts of debriefing. Some of the challenges occur in organizing the information. There are a number of debriefing models out there to consider adopting, and there is no reason to believe that one is better than another. I highly recommend that you learn several models and become comfortable with them. What you’ll find is that some models work better than others in varying situations, based on a number of factors such as the experience and expertise of the debriefer, the subject matter that is the focus of the simulation, and the level of the learners.

  3. Involve all the learners

If you are debriefing a group of students, a challenging task can be involving all of the learners. Oftentimes one or two learners will engage in a dialogue with the debriefer, and without conscious effort and skill it is easy to continue that dialogue and allow the other members of the participating team to feel marginalized. Often this dialogue occurs with the person who was in the “hotseat.” Making a conscious effort during the debriefing to include all of the students in a meaningful way can create significantly more learner engagement. Further, if you are running multiple scenarios, I believe that engaging all the learners encourages them to pay closer attention when they are in an observation role for subsequent scenarios.

  4. Pull the ideas, don’t push the facts

I like to think of the debriefing as the time when we explore the learners’ thought processes. If we are transmitting information or pushing facts to them, the situation becomes more of a lecture. In fact, I see many novice debriefers break into song and start delivering mini-lectures during attempts at debriefing. It is important to remember that when you are pushing facts to the participants, it limits the amount of assessment you can do of their understanding of the material and of what you need to do to create deeper learning. So, if you find yourself making many declarative statements, pull back and start to ask some questions. Encourage critical thinking and self-reflection, and ensure you are helping to create linkages between what went well during the scenario and why it was good, along with allowing the participants to discover and identify what they should do differently to improve if they were to face a similar situation in real life or another simulation.

  5. Create a summary of the take-home points

Novice debriefers tend to struggle with creating an adequate summary. Also, beware: this is another time when the debriefing is at risk of turning into a mini-lecture. It is helpful to have a list of the major take-home points associated with the scenario. Even if you have the summary points written out prior to the simulation, you can contextually adapt the summary to the performance that occurred during the scenario. It is important to remember that during a debriefing, many areas can be covered and touched upon. Learners should be engaged to identify the major learning points that they experienced in the simulation, as well as to understand how the simulation was relevant to helping them become better healthcare providers.
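Taken together, the tips amount to preparing a small debriefing plan before the simulation runs, rather than winging it. Here is a minimal sketch in Python; the `DebriefingPlan` structure, its field names, and the example values are hypothetical illustrations, not a formal instrument:

```python
from dataclasses import dataclass

@dataclass
class DebriefingPlan:
    objectives: list[str]        # Tip 1: know the goal(s), be specific
    framework: str               # Tip 2: pick a structure/model ahead of time
    take_home_points: list[str]  # Tip 5: pre-written summary points, adapted on the day

    def opening_questions(self) -> list[str]:
        # Tips 3 and 4: pull ideas from every learner rather than pushing facts,
        # by turning each objective into an open question for the whole group.
        return [
            f"What went well regarding {obj.lower()}, and what would you change?"
            for obj in self.objectives
        ]

# Example plan, written before the scenario is run
plan = DebriefingPlan(
    objectives=["Timely recognition of deterioration", "Closed-loop communication"],
    framework="Plus/Delta",  # one of several reasonable models; learn more than one
    take_home_points=["Escalate early", "Read back critical orders"],
)

for question in plan.opening_questions():
    print(question)
```

The point of the sketch is simply that the goals, the structure, and the summary exist before the debriefing starts; only the learners' answers are discovered in the room.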

So, these were five random tips on how to improve the effectiveness of your debriefing strategy. I hope that you found them useful!

Now, go forth and do great debriefings!

 

Until next time,

Happy Simulating!


Filed under Curriculum, debriefing, simulation