Category Archives: simulation

Simulation Professionals: Don’t let the Vocal Minority Get You Down!

The social psychologist Barbara Fredrickson coined the phrase, “The negative screams while the positive only whispers.” I don’t know about you, but this is extraordinarily true when reviewing course evaluations after simulation-based education programs!

Post-course evaluations are essential for measuring a program’s effectiveness and participant perceptions, and they are a tool to help with quality-improvement initiatives. However, feedback from a vocal minority can sometimes overshadow the opinions of the silent majority. After pouring blood, sweat, and tears into creating what you believe to be a successful simulation-based program, receiving negative evaluations can be a blow to your motivation. At times the feedback can be petty and personal, and it can sting.

Receiving negative feedback can be challenging for many reasons. First and foremost, it can feel like a personal attack on the hard work and effort you’ve put into a project or program. It’s natural to feel defensive or upset when someone criticizes something you’ve invested so much time and energy in creating. Additionally, negative feedback can be difficult to process and use constructively. It’s easy to get caught up in the emotions of the moment and feel overwhelmed by the criticism. This can make it difficult to see the feedback as an opportunity for growth and improvement rather than as a setback or failure.

This can be problematic because the feedback may not accurately represent the experiences of most participants, though it can certainly feel that way. It is also important to recognize the opportunities that come with critical feedback. Particularly when it is delivered constructively, with a focus on improvement, negative feedback can be a powerful tool for enhancing the quality of simulation-based education programs and for developing resilience in educators and learners alike. It can help educators and course designers identify areas for improvement, develop new strategies, and implement changes that benefit future participants.

It is also important to remember that most participants with positive experiences may not feel the need to provide feedback, while those who have negative experiences may be more inclined to do so. So, I challenge you to go back and look at the design of your course evaluation tools. Remember that the silent majority can be an important ally in the success of your program. By actively seeking out their feedback and insights, you can ensure that your program is meeting the needs of all participants, not just the most vocal. I’m not suggesting that we ignore critical feedback; we just need to find a way to balance it within a healthy model that contributes to resilience.

Cultivating a growth mindset is essential to building resilience for those running simulation programs. It involves embracing challenges and staying motivated even when things get tough. Instead of seeing failures and setbacks as signs of inadequacy, individuals with a healthy mindset view them as opportunities for growth and learning. One powerful tool I use is remaining patient-centric in the decisions made regarding our simulations. Thinking about the downstream benefits, the higher quality of care patients receive because of our efforts, helps me keep my eye on the ball.

Lastly, remember that we can’t be all things to all people. While we remain excited about and recognize the power of simulation-based education, not everyone will share our enthusiasm. As we move forward, remember that we can learn from the naysayers and from the people unhappy that they are required to participate in some of our programs. Try not to let the negative screaming in your ear mislead you into believing it represents the majority opinion. Stay focused on the idea that patients will benefit from our efforts, and that many participants likely perceive value in them.

Leave a comment

Filed under Curriculum, simulation

When Simulation Is NOT the Answer: Own It!

Obviously, we are happy that simulation has become a popular method of education in healthcare. Simulation can provide a hands-on approach to learning that allows participants to experience real-life situations in a safe and controlled environment.

However, while simulation has many benefits, it is not necessarily the best option for every type of education. As a modality, simulation is relatively complex, expensive, and resource-intensive compared to other educational methodologies. That said, we all know that at times it is an irreplaceable methodology that allows education, competency assessment, and systems-assessment information to be used in the improvement of healthcare. The key is to have a stratification process or policy in place to evaluate opportunities and decide when simulation is the optimal deployment tool.

As leaders and managers of simulation programs, we are charged with creating the return on investment for our programs. We are entrusted by those who provide our funding to be good stewards of the initial investment and the ongoing operational support of our simulation efforts. It is up to us to hold the keys to the vault we call simulation so that it is engaged, deployed, and utilized in a fashion that generates the expected outcomes with the highest efficiency and effectiveness.

In short, don’t simulate because you can, simulate because you need to!

As your simulation center becomes a more recognized resource within your institution, there will often be an increase in requests for services. As this occurs, it is critically important that program leaders ensure that the simulations are bringing value.

For example, if someone wants you to do simulation training for an entire unit to roll out a simple new policy or procedure change, do not just say yes. Instead, use a framework that advises the requester whether simulation is the best modality.

When contemplating the value of simulation as a modality, I think it is best to go back to the creation of learning objectives for the anticipated scenarios. I always like to say that if you do a knowledge, skills, and attitudes (KSA) analysis of your learning objectives and they all come up as K’s, you should reevaluate whether simulation is the best method.

Web-based education, including courses, videos, lectures, or assigned reading, may accomplish the same objectives as your planned simulation. If this is the case, as a leader in simulation it is important that you recognize this and recommend modalities other than simulation. It will likely save your organization time and money. More importantly, it may enhance your credibility and your reputation as a problem solver for the institution, and as someone who is fiscally responsible. Over time it can be valuable for a simulation program to enjoy a reputation as “the solution deployment” expert, not simply the “simulation” expert.

It is important to remember that the true value we provide is the end result of creating higher-quality healthcare along with a safer environment for patients. In this day and age, it has become increasingly important that our engagement is thoughtful and prudent, with cost considerations in mind. While we are all passionate about simulation, the leaders of the future will garner success through a lens of efficiency and effectiveness in the programs they deploy.

In conclusion, healthcare simulation is an important tool for education and patient safety, but it is not always the best tool. Simulation program managers and leaders should identify the specific learning outcomes they hope to achieve and carefully consider which educational modality is most appropriate for their learners. By doing so, they can ensure that they are providing the best possible, most cost-efficient training for their staff and ultimately improving patient outcomes.

Remember: Don’t simulate because you can, simulate because you need to!

Let me know what you think in the comments! If you enjoyed this post, please let me know by liking it, or subscribing to my Blog!

Until next time,

Happy Simulating!

Leave a comment

Filed under Curriculum, return on investment, simulation

Not Every Simulation Scenario Needs to Have a Diagnostic Mystery!

It is quite common to mistakenly believe that there needs to be a diagnostic mystery associated with a simulation scenario. This could not be further from the truth.

Sometimes it arises from our clinical hat being confused with our educator hat (meaning we let our view of the actual clinical environment become the driving factor in the design of the scenario). We must carefully consider the learning objectives and what we want to accomplish. One of the powerful things about simulation is that we get to pick where we start and where we stop, as well as the information given or withheld during the scenario.

Let us take the example of an inferior wall myocardial infarction (IWMI). Imagine that we want to assess a resident physician’s ability to manage the case. Notice I said to manage the case, not to diagnose and then manage the case. This distinction has important implications for how we choose to begin the scenario. If the objectives were to diagnose and manage, we might start the case with a person complaining of undifferentiated chest pain and have the participant work toward the diagnosis and then demonstrate the treatment. Conversely, if we were looking to have them demonstrate proficiency only in the management of the case, we might hand them an EKG showing an IWMI (or perhaps not even hand them the EKG), start the case by saying, “Your patient is having an IWMI,” and direct them to begin care.

What is the difference? Does it matter?

In the former approach, the participant has to work through the diagnostic conundrum of undifferentiated chest pain to arrive at the diagnosis of IWMI. Further, it is possible that the participant does not arrive at the proper diagnosis, in which case you would not be able to observe and assess their management of the case. Thus, your learning objectives have become dependent on one another. By the way, there’s nothing wrong with this as long as it is intended. We tend to set up cases like this because that is the way the sequencing would happen in the actual clinical environment (our clinical hat interfering). However, this takes up valuable minutes of simulation, which are expensive and should be planned judiciously. So, my underlying point is that if you are deliberately creating the scenario to observe both the diagnostic reasoning and the treatment, then the former approach is appropriate.

The latter approach, however, should accomplish the learning objective associated with demonstrating the management of the patient. Thus, if that is truly the intended learning objective, the case should be fast-forwarded to eliminate the diagnostic-reasoning portion of the scenario. Not only will this save valuable simulation time, it will also conceivably leave more time to carefully evaluate the treatment steps associated with managing the patient. Additionally, it will eliminate the potential for prolonged simulation periods that do not contribute to accomplishing the learning objectives and/or get stuck because of a failure to achieve the initial objective (in this case, the diagnosis).

So, the next time you make decisions in the scenario’s design, take a breath and ask yourself, “Am I designing it this way because this is the way we always do it? Am I designing it this way because this is the way it appears in the real clinical environment?”

The important point is to ask yourself, “How can I stratify my design decisions so that the scenario is best crafted to accomplish the intended learning objectives?” If you do, you will be on the road to designing scenarios that are efficient and effective!

Leave a comment

Filed under scenario design, simulation

Sherlock Holmes and the Students of Simulation

I want to draw a comparison between Sherlock Holmes and the students of our simulations! It has important implications for our scenario design process. When you think about it, there is hypervigilance among our students, who look for clues during the simulation. They are doing so to figure out what we want them to do. Analyzing such clues is much like the process the venerable detective Sherlock Holmes follows when investigating a crime.

Video version of this post

This has important implications for our scenario design work because many times we get confused by the idea that our job is to create reality, when in fact that is not our job at all. As simulation experts, our job is to create an environment with sufficient realism to allow a student to progress through various aspects of the provision of healthcare. We need to be able to make a judgment and say, “Hey, they need some work in this area,” and “Hey, they’re doing well in this area.”

To accomplish this, we create facsimiles of what they will experience in the actual clinical environment, transported into the simulated environment, to help them adjust their mindset so they can progress down the pathway of caring for those (simulated) patients.

We must be mindful that during the simulated environment, people engage their best Sherlock Holmes, and as the famous song goes, [they are] “looking for clues at the scene of the crime.”
Let’s explore this more practically.

Suppose I am working in the emergency department, and I walk into a room and see a knife sitting on the tray table next to a patient. I immediately think, “Wow, somebody didn’t clean this room up after the last patient, and there’s a knife on the tray.” I would probably apologize to the patient and their family.

Fast forward…..

Put me into a simulation as a participant, and I walk into the room. I see the knife on the tray next to the patient’s bed, and I immediately think, “Ah, I’m probably going to do a cric or some other invasive procedure on this patient.”

How does that translate to our scenario design work? We must be mindful that the students of our simulations are hypervigilant and always looking for these clues. Sometimes we include things in the simulation merely as window dressing, or to try to (re)create some reality. However, stop to think: the student may misinterpret them as items that must be incorporated into the simulation for their analysis to succeed.

Suddenly, the student sees this thing sitting on the table and thinks it is essential to use in the simulation; now they are using it, and the simulation is going off the tracks! As the instructor, you find yourself saying that what happened is not what was supposed to happen!

At times we must be able to go back and objectively look at the scenario design process and recognize that maybe, just maybe, something we did in the design of the scenario (which includes the setup of the environment) misled the participant(s). If we see multiple students making the same mistakes, we must go back and analyze our scenario design. I like to call the extra things we put into a simulation scenario design “noise,” and the potential for that noise to blow up and drive the simulation off the tracks grows with every component we include in the space. Be mindful of this, and be aware of the hypervigilance of students undergoing simulation.

We can mitigate some of these issues with a good orientation, and by incorporating good practice into our simulation scenario design so that we include only items in the room that are germane to accomplishing the learning objectives.

Tip: If you see the same mistakes happening again and again, please introspect: go back, look at the design of your simulation scenario, and recognize that there could be a flaw! Who finds such flaws in the story? Sherlock Holmes, that’s who!

1 Comment

Filed under Curriculum, design, scenario design, simulation

Cognitive Load Control and Scenario Design in Healthcare Simulation

As the design architects of simulation scenarios, we must remain cognizant of our influence over the cognitive load of those experiencing our simulations as learners.

When caring for patients in real life, we expend cognitive energy to ensure we make the right decisions and provide the absolute best care for every patient. We engage in critical thought processes that guide our interpretation of the enormous number of facts surrounding each patient so we can make further decisions about providing various therapies or advice to the patient.


When we design simulations for our learners, we are creating environments similar to those noted above, which demand a significant cognitive workload for the participant to successfully navigate the case and care for the [simulated] patient. In addition, I argue that we add further cognitive workload by subjecting someone to the simulated environment, insofar as they are engaged in a conscious or perhaps subconscious pursuit of deciding what is simulated and what is not. I have previously written about this and dubbed it the cognitive third space of simulation.

Nonetheless, there is mental energy spent in the care of the patient as well as the interpretation of the simulation. We also must realize that our design choices inside of the scenario contribute to the adjustment of the cognitive load endured by the learner(s) associated with our simulations. It is important that we be deliberate in our design to ensure that we are allowing all involved to achieve the desired learning outcomes.

Some specific examples of this influence on cognitive load may help bring forth an understanding. Take a test result, for example. If one looks in the electronic health record and sees the values reported for a simple test, like a basic metabolic profile (which consists of sodium, chloride, potassium, CO2, BUN, creatinine, and glucose), a certain amount of mental energy goes into interpreting the numeric data presented for each of the seven items. Some electronic health records may color-code the results to assist in distinguishing normal from abnormal, and some may not.

Such a decision in the human-factors design of the electronic health record actually influences the amount of cognitive energy spent interpreting a given value. Further, as experienced clinicians are keenly aware, we must interpret the lab value in the context of the patient for whom the test was ordered. What is normal for one patient may not be normal for another. Thus, even the interpretation of a simple test involves a significant amount of cognitive processing (critical thought).

How does this relate to simulation scenario design? We have the ability to engineer the scenario design to help participants channel cognitive energy into those things that are important and away from those that are not. If we continue with the basic metabolic profile, we have choices about how those values are reported to the participants of our simulation.

We could have the participants look it up in the simulated electronic health record, which takes time and cognitive processing as described above. We could give them a piece of paper, or display the results on a screen, showing the seven values; this still takes significant cognitive processing to interpret. Or we could simply indicate that the basic metabolic profile result was “normal.” This last method significantly decreases the cognitive processing associated with the seven values and how they are to be interpreted in the context of the scenario. One could also argue that we are offering subtle, or perhaps not-so-subtle, clues that the basic metabolic profile is not a major part of what needs to be processed in the care of this particular patient.

It is important to realize that all the options above are viable, and no one of them is inherently superior. What matters is that the decision made during the design of the case allows the participant(s) to focus the appropriate cognitive energy on what the scenario designers feel is most important. In other words, if the learning objectives call for the participant to evaluate the actual values of the basic metabolic profile, then of course it is appropriate to provide the information at that level of detail. If, however, the results of the same test are incidental to the bigger picture of the case, then one should consider a different mechanism for resulting values to the simulation participant.

A common misperception in the design of healthcare simulation scenarios is the attempt to re-create the realistic environment of the clinical atmosphere. While this is always a tempting choice, it is not without consequences. It comes from the mistaken belief that the goal of simulation scenarios is to re-create reality. Modern, successful simulationists need to recognize and move past this outmoded thought process.

In a case where the basic metabolic profile is not significantly important, we should not design the “dance” (scenario) to include the steps of looking in the electronic health record and making determinations about the values associated with the test. It is a waste of time and, more importantly, a waste of cognitive processing, which is already artificially increased by the participant being involved in the simulation in the first place. It is, in my opinion, a violation of the learner contract between faculty and students.

While I am focusing on the simple example of a single test, I hope you can see that this concept extrapolates to the many, many decisions made in the scenario design process. For example, think about a chest x-ray. Do you result a chest x-ray as “normal,” “abnormal,” or otherwise during the run time of the scenario? Or do you show an image of the chest x-ray and have your participants interpret it? One answer is not superior to the other. It is just critically important that you evaluate what is best for the cognitive load of the learners involved in your scenario, and how the decision relates to the details of the learning objectives you wish to achieve during the simulation activity.

In moderately to highly complex cases in healthcare simulation, the designer, or architect, of the simulation has a responsibility to craft the scenario to accomplish the intended learning objectives. In many scenarios, hundreds of decisions are made about how participants extract data from the experience to incorporate into their performance of the simulation. It is critically important that, as the designers of such learning events, we remain cognizant of the cognitive load placed upon our learner(s): the load associated with the normal care of patients, as well as the extra load imposed by participating in a simulation-based case.

Many of the decisions we incorporate into the design of our scenarios have significant influence over this cognitive load and the mental energy participants will spend to engage. We need to understand the impact of our choices and be deliberate with our design decisions to enhance the efficiency and effectiveness of the simulation-based learning process.

Leave a comment

Filed under Curriculum, design, scenario design, simulation, Uncategorized

Where do we Debrief?

Selecting the location for the debriefing after a simulation is a decision that often involves many variables. Sometimes there are limited choices, and the decision is dictated by what is available or by which space holds the technology deemed essential to the debriefing. Other times there is deliberate planning and selection.

This short video explores some of the basics of how such decisions are made and some of the pros and cons associated with the final choices.

Leave a comment

Filed under Curriculum, debriefing, simulation, Uncategorized

Exploring the Elements of Orientation and (Pre)Briefing in Simulation Based Learning Design


I want to explore a little bit about orientation and (pre)briefing(s) as they relate to simulation-based education design. The words are often tossed about somewhat indiscriminately. However, it is important to realize that they are both essential elements of successful healthcare simulation and that they serve distinct purposes.

When we look in the Healthcare Simulation Dictionary, we find that the definition of Orientation is aligned with an overview preparation process, including the “… intent of preparing the participants.” Examples include center rules, timing, and the simulation modalities.

On the other hand, according to the same dictionary the definition of the word Briefing includes “An activity immediately preceding the start of a simulation activity where participants receive essential information about the simulation scenario….”

I look at orientation as the rules of engagement. I like to think of orientation linked to the overall educational activity in total. Some essential components include orientation to the simulation center, the equipment, the rules, and the overall schedule for the learning activity.

At a somewhat deeper level of thought I think the orientation is linked to the learning contract. What do I mean by that?

I think it is essential that we as the faculty are establishing a relationship with our learners and begin to establish trust and mutual respect. To that end, we can use orientation to minimize surprises. Adult learners do not like surprises!

We need to have the adult learner understand what they can expect. I always orient the learners as to what will feel real, and I am similarly honest with them about what will not feel real. If they will be interacting with a computerized simulator for example, I orient them to the simulator before the start of the program.

In the simulation world we throw around words like debriefing, scenario, and task training. To clinical learners these terms may be unfamiliar or may carry different contexts. This can cause anxiety, and during the orientation we need to walk them through the experience they are about to embark upon.

Some factors can influence the amount and depth of the orientation, such as the familiarity your participants have with simulation in general, with your simulation center, and with your simulation-based encounters. For example, learners who come to your center on a monthly basis probably need less total orientation than those who are reporting for the first time. Learners who know that debriefings occur after every simulation may already be acclimated to that concept, but people coming to the sim center for the first time may not be aware of it at all.

Participants just meeting you for the first time might need a little more warming up, and that can come in the form of orientation. Overall, though, it is not so much about telling them what’s going on as it is about using the opportunity to earn their trust and confidence in the simulated learning encounter(s) and the value of those encounters to them as professionals.

Switching the focus to the brief, briefing, or (pre)briefing: the briefing is linked to the scenario more than the orientation is. The briefing should focus on the details of the case at hand, introducing information that allows participants to acclimate to what they are going to need to accomplish during the simulation. What are their roles and goals in the scenario they are about to embark upon? If you are going to ask people to play roles different from those they hold in real life, it is very important that this fact be crystal clear in the briefing.

I think that the briefing should also bring context to the healthcare experience. It is important to orient the learner, ahead of the impending encounter, to what they are to perceive and think of as real as they experience the simulation. You, as simulation faculty, may think it is obvious that a room in your simulation center represents an ICU room. The participant may not, and they deserve clarity before the start of the simulation so they do not feel they are being tricked or duped. During the briefing, the statement “You are about to see a patient in the ICU…” can remove such ambiguity.

Another critical briefing point is to clarify the faculty-student engagement rules expected during the scenario runtime, if they were not covered in the orientation. There are many correct ways to conduct simulation scenarios, with varying levels of interaction between the faculty members running the simulation and the learners participating in it. This should be clarified before the scenario starts.

For example, are you going to let the learners ask questions of the faculty member during the simulation? Or not? This should be covered up front in the briefing, and perhaps aspects of it in the orientation.

While not a requirement, I think that parameters associated with time expectations are always good to give in a briefing. For example: “You are going to have 10 minutes in the scenario to accomplish X, Y, and Z, and then we will have a ten-minute debriefing before the next scenario.”

Remember, our adult learners don’t like surprises! I always use the briefing before a scenario to remind the participant(s) that afterward we are going to have a debriefing. I remind them so that they know they should collect their thoughts and ideas and be ready to have that discussion. Secondly, I am saying, in an unspoken way, that if they are uncomfortable about something, or have questions, there will be an opportunity for discussion during the debriefing. (In other words, you’re sort of giving some control back to the learner, helping to build the trusting relationship.)

Some of the variations in the briefing are similar to those of the orientation mentioned above. People who are more familiar with simulation, your particular programs, and your style may require slightly less of a briefing than others. Additionally, if you are running multiple scenarios as part of a simulation-based course, you will find that after the first couple of scenarios the briefing can be shortened compared to the beginning of the day.

So, in summary, orientation and briefings are distinct elements of simulation-based learning that serve different purposes, both of which contribute to the success of your simulations.

Think of orientation as linked to the bigger picture and to the learning contract that helps make the relationship comfortable between the participants and the faculty. The orientation covers the rules of engagement, orientation to the technology, and being explicit about what is expected of the participant. Think of the briefing as linked more to the scenario: roles, goals, and introductory patient and environment information that help the participant mentally acclimate to what they are about to dive into.

Leave a comment

Filed under Curriculum, scenario design, simulation, Uncategorized

5 Elements in My Approach to the Learning Contract in Simulation

In simulation-based education there is a relationship between the faculty of the program and the participants that is important during all aspects of simulation. The relationship has tenets of trust and respect that must be considered when designing as well as conducting simulations. I have heard this relationship referred to by a few titles, such as psychological contract, fiction contract, and learning contract, all of which generally refer to the same thing.

Probably more important than the title, is what such a relationship embodies or focuses on. I view it as an agreement between two or more parties that acknowledges several aspects of simulation based programs and works to establish rules of engagement and principles of interactions between those involved.

In my practice of using simulation for clinical education, I work a great deal with practicing professionals, who by and large are physicians. I generally adhere to five elements or premises over the course of the interactions that I design and provide for the participants of my programs.

  1. Meaningful use of Your Time.

I acknowledge up front that participating in learning activities takes time away from their busy schedules. I assure them that the content of the program is carefully crafted to meet the needs of their learning cohort in the most timely way possible. I refer to refinements of the course that have occurred in response to feedback from prior participants, which help increase the efficiency and effectiveness of the program.

  2. This is NOT real and that’s really ok!

During the orientation I am always careful to point out that not everything they are going to experience will look or feel real. I include the idea that things are “real-enough” to help us create a successful learning environment. I also let them know the things that may feel somewhat real during the simulation. Additionally, I emphasize that the “realness” is not the primary focus and point out that the learning and reinforcement of high-quality clinical practice is the ultimate outcome.

  3. We are not here to trick you.

I find that practicing professionals often come to simulation training endeavors with an idea that we design programs to exploit their mistakes. I assure them this is not the case. I am careful to include an overview of what they can expect during all phases of the learning. For example, when I am conducting difficult airway programs, I carefully orient them to every feature of the simulator’s airway mechanics before starting any scenarios. I also let them know that the cases associated with our scenarios are modeled after actual cases of clinical care. I explain that while we don’t model every detail of the case, we work hard to design situations that provide the opportunity to promote the discussion and learning that would have, or should have, resulted from the actual case.

  4. Everyone makes mistakes. We are here to learn from each other.

At the most basic level of this element, I point out that WE all make mistakes and that this is part of being human. I let them know that everyone is likely to make a mistake throughout the learning program. I carefully weave in the idea that it is far better to make mistakes in the simulated environment than while providing actual clinical care.

Further, I advance the idea that we can learn from each other. As everyone in clinical practice knows, there are many ways to do most things correctly. While this idea can be challenging because often people feel that “their way” is the correct way, I point out that with an open mind and professional, collaborative discussion we can share learnings with each other.


  5. We are here to help you be the best you can be.

Leveraging the idea that almost all practicing professionals hold themselves to high performance standards, as well as the desire to improve, can provide a powerful connection between the faculty and participants of a healthcare simulation program. I put forth this idea while carefully tying in a review of the prior four elements. Further, I point out that the opportunity to perfect the routine exists in our learning programs. I then pivot to highlight that some aspects of the program exist to practice and learn from situations that they may encounter infrequently but that may have high stakes for the patient.

So, in summary, I believe the relationship between faculty members and participants of simulation-based education programs is multi-factorial and demands attention. Depending on the learners and the topics of the program, the elements that serve as the underpinning of the relationship may range from few to many, and moderate to significant in complexity.

In my simulation work providing clinical education that involves practicing physicians as participants, I pay close attention to the five elements described above throughout the design as well as the conducting of the learning encounters.

I invite you to reflect upon your approach to the development and maintenance of the relationship between your faculty and participants of your simulation efforts.


2 Comments

Filed under debriefing, design, simulation

Education may NOT be the Return on Investment Value of Healthcare Simulation

It’s January 2019 and I am flying to San Antonio, TX to attend the International Meeting for Simulation in Healthcare. While traveling (in coach) I cannot help but ponder where we are in simulation and where we are going. While I feel that simulation has a bright future and will earn a deservedly important role in healthcare, it feels as if it is taking longer than it should.

In my overly simplistic view of simulation, I envision two primary user groups: those who utilize simulation to teach students of various healthcare professions (schools), and those who use simulation to somehow improve the quality of healthcare delivery. The latter group likely includes education of individuals as well, but more of the ilk of practicing healthcare professionals and those in the apprentice phases of training, such as resident physicians.

For the purpose of this post, I will be focusing on simulation efforts associated with healthcare delivery. Toward the end, I will circle back to the “school” environment again.

As healthcare dollars for the delivery of healthcare continue to be under more pressure and harder to come by, there is great interest in controlling spending, and increasing vigilance by corporate overlords over money being spent on investments. Investments, or capital purchases, are under higher levels of scrutiny than ever before. Simulationists must bear in mind that simulation is an investment, or at the least a capital expense, for healthcare systems. This realization is accompanied by the stark reality that whatever you want to purchase for your simulation efforts, whether it be a single simulator or a suite of training equipment, is competing against other “things” also associated with the delivery of care. Pesky things such as CT scanners, ultrasound machines, laparoscopic surgical equipment for the operating room, or dialysis machines.

Why pesky? From my view as a simulation and safety leader, I am envious. I am flat out jealous of how easy it is for the purchasers of the above-listed examples to justify their return on investment (ROI). Huh? What’s that? In simple terms, the ROI is the business term and set of calculations that allow spreadsheet drivers to determine how much profit an investment of dollars in a “thing” will bring back to the institution.

Perhaps looking at an overly simplistic explanation will help. Let’s say someone wants to put in a new CT scanner. The costs of the scanner and installation, maintenance, staffing, and operational expenses are calculated. Then how much can be charged for each scan, how many scans can be done per hour, and how many hours per day the scanner will be running together yield the revenue that the new CT scanner will bring in. After the installation is paid for, all of the remaining revenue, once expenses are deducted, is profit. Thus, at least when justifying the new CT scanner, a requester of funds will create a fancy business proposal with colors and graphs that show money flowing in as a result of the purchase after a given period of time. Purchase approved!
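For the spreadsheet-inclined, the back-of-the-envelope math above can be sketched in a few lines of code. Every dollar figure and volume below is a made-up illustrative assumption, not real equipment pricing:

```python
# A minimal sketch of the CT scanner ROI arithmetic described above.
# All figures are hypothetical placeholders, not actual costs or reimbursements.

def simple_roi(total_cost, revenue_per_scan, scans_per_hour, hours_per_day,
               operating_cost_per_day, days_per_year=360):
    """Return (annual profit, payback period in years) for a capital purchase."""
    daily_revenue = revenue_per_scan * scans_per_hour * hours_per_day
    annual_profit = (daily_revenue - operating_cost_per_day) * days_per_year
    payback_years = total_cost / annual_profit
    return annual_profit, payback_years

annual_profit, payback = simple_roi(
    total_cost=2_000_000,         # purchase + installation (hypothetical)
    revenue_per_scan=500,         # hypothetical charge per scan
    scans_per_hour=3,
    hours_per_day=12,
    operating_cost_per_day=4_000, # staffing, maintenance, supplies
)
print(f"Annual profit: ${annual_profit:,.0f}, payback: {payback:.1f} years")
```

This is exactly the kind of clean, defensible number a radiology department can hand to finance — and exactly what most simulation proposals lack.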

Now let’s take a typical cost-justification discussion between a simulationist (SIM) and the Chief Financial Officer (CFO) of a healthcare system:

Simulationist (Sim): I’d like $250,000 to buy a simulator.

CFO: How is that going to make us more money?

SIM: To educate people and make them smarter and reduce mistakes?

CFO: We have lots of smart doctors and nurses working here. You should be reducing mistakes anyway.

SIM: There is a study showing a reduction of central line infections saves money.

CFO: Save who money? We still make money when the patient is in the hospital. And besides, you’re not asking for a central line simulator.

SIM: But insurers are not going to pay for errors and hospital-acquired infections anymore.

CFO: Maybe not. We still make money when the patient is in the hospital. What’s your return on investment for this doll?

SIM: We are buying the simulator to train people to work together better. To work as highly functional interdisciplinary teams.

CFO: Right. We have lots of smart doctors and nurses working here. You should be reducing mistakes anyway. They are smart enough to work as teams. They do it every day.

SIM: But we can make the teams work better and make people enjoy working together more and improve patient care.

CFO: People like working here. You should be improving patient care. Where is the proof that simulation is needed to train teams AND that team training improves patient care?

SIM: The airlines have been doing it for years.

CFO: Where is the proof that airline simulation improves the airlines?

SIM: Everybody just knows. It makes sense. And planes don’t crash as much as they used to.

CFO: Hospitals don’t burn down either. You know, we bought the new CT scanner last year, and we have been able to make money on it. It’s just like radiology predicted in their purchase proposal. Let me think about your request and I’ll get back to you.

While the above scenario is somewhat tongue in cheek, sadly, I think it is closer to real life than many of the simulations we conduct. The fact of the matter is that the true ROI of simulation is buried in nuances and potential opportunities, mired in anecdotal enthusiasm, with a scant amount of hard-core evidence providing the black-and-white spreadsheet numbers that make the bean counters excited.

It is upon us to figure out ways to describe the ROI of simulation more coherently, accompanied by facts and figures that make a difference to the leaders of healthcare systems. Let me give you a hint… it ain’t about education.

We must transcend the long-held belief and common assumption that the value of simulation is the education. I think the realization, and as yet unlocked true potential, of simulation remains tied up in the ability to assess. It is tough to pivot from thinking that simulation is primarily an education methodology. But I encourage you to do so. Now before you set your hair on fire and leave me nasty comments, I’m not suggesting that we abandon simulation, which we know to be an incredibly powerful education platform/modality. I just believe that if you think its main power is education first and foremost, it becomes difficult to strategically plan, document, and provide leadership in other directions.

I think in the healthcare delivery space a more powerful argument that can contribute to the ROI of simulation is to harness the ability of simulation to identify the best deployment of judicious resources. So, what does this mean? Stop teaching with simulation? No, of course not.

Focusing more on the use of simulation as an assessment and surveillance tool can help to create bigger value. When teaching with simulation, assessing in a more quantitative way what people, or perhaps units, are doing well and what they are struggling with can help to identify the true needs of the organization. Understanding the local struggles, and perhaps what the local community is not struggling with, allows for a smarter utilization strategy for simulation.

Now before the heads pop off of the safe-learning-environment people, I’m not suggesting we need to turn every simulation into a summative performance assessment and give passing and failing grades that will ruin people’s lives. However, gathering data to show improvement is critically important as you do all of your great education work. After you collect the data in a systematic way, have the courage to abandon what participants always do well on, and focus on the areas with the greatest opportunity for improvement.

Carefully collect the data if you use your simulation activities for on-boarding. Don’t ask if they liked the simulation. That’s not the data you need for your ROI justifications. Can you shorten aspects of on-boarding through the use of simulation? Showing credible evidence that nursing on-boarding can be shortened by X number of days or weeks through the strategic and judicious use of simulation will bring music to the ears of the bean-counting crew, who don’t fancy paying for the training of people when they could be working.
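The on-boarding argument above translates directly into spreadsheet language. A minimal sketch, where the headcount, days saved, and daily cost are hypothetical placeholders rather than measured figures:

```python
# Hedged sketch of the dollar value of shortening on-boarding with simulation.
# All inputs are illustrative assumptions, not data from any real program.

def onboarding_savings(hires_per_year, days_saved_per_hire, cost_per_day):
    """Annual savings from trimming on-boarding by days_saved_per_hire days."""
    return hires_per_year * days_saved_per_hire * cost_per_day

savings = onboarding_savings(
    hires_per_year=120,     # hypothetical nursing hires per year
    days_saved_per_hire=5,  # on-boarding days trimmed via simulation
    cost_per_day=600,       # fully loaded daily cost of a hire still in training
)
print(f"Estimated annual savings: ${savings:,}")
```

A single defensible number like this, backed by before-and-after on-boarding data, speaks the CFO’s language far better than satisfaction scores.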

Other thoughts… Using simulation as an evaluation tool in human factors applications can assist other departments in increasing efficiency and improving throughput. Think about the importance of that. What????? Not your cup of tea? Think back a few paragraphs to the calculations leading to justifying the need for the new CT scanner.

Carefully documenting that simulation-trained anesthesiologists, CRNAs, endoscopists, and surgeons, for example, may shorten OR time, which means more surgeries can occur and more revenue can be generated, is the part of the ROI that should be in capital letters. This is the data that matters for the ROI justifications.

In-situ programs can give valuable feedback to hospital safety and quality leaders by demonstrating volatilities in the system with regard to process, staffing, human performance, etc. They can also demonstrate where the strengths lie. If there is unnecessary training going on where the strengths lie, then redeploy or readjust to the actual needs of your system. Additionally, forming such relationships with the quality and safety leaders of your institution, and letting them know that your true capabilities are more than making people happy and smarter through education, can win you some powerful allies in the corporate leadership suites.

Lastly, circling back to the schools… Looking past the education benefits of simulation to use it as a tool to create data that leads to information that underpins significant change, cost savings, and allocation of precious resources (people and money) will serve you well. With the exception of more students, I don’t think it is likely that windfalls of money are coming your way either…

So if you are carefully assessing your simulation efforts and activities in a thoughtful manner, you can help to reduce redundancy, unnecessary training intervals, repetitions, and on and on. Doing less of that which is ineffective saves money. Saving money is a variable of the ROI that your CFO will pay attention to.

Leave a comment

Filed under assessment, hospital, return on investment, simulation

Beware of Simulation Posers!

You may be a simulation poser if you say or do three or more of the following things…..

1. You say something like “In simulation all of the learning occurs during the debriefing.”
Appraisal: Not true. You are lying, uninformed, or not creative.
Not even close. If you believe this you are not paying attention to the other learning opportunities that participants of simulation can avail themselves of. Think about the status changes of a simulator in response to proper or improper treatment. Think about potential participant-to-participant interactions. Think about the potential for instructor–participant interactions that may contribute to learning. The possibilities are practically limitless! For more see this blog post.

2. You claim there is a magic ratio of simulation time to debriefing time. “for every 15 minutes of simulation you must debrief for 45 minutes…. Etc.”
Appraisal: Rubbish.
No such thing exists. In fact, if you think about it, this is utterly ridiculous given the number of variables that may potentially influence the debriefing time: things like the topic, number of learners, experience level of the learners, number of faculty, experience of the faculty, and on and on. Just stop saying it and the perception of your (simulation) IQ will rise by 10.

3. You espouse that during simulation encounters the students and faculty must be separated by something like a glass wall.
Appraisal: Lack of creative thinking.
While there are a lot of good reasons to design simulations that physically isolate the faculty from the participants, there are as many compelling reasons to have faculty in the same room and even, at times, interact… (agghhhast) with the participants. Think about the possibilities. Faculty side by side with students could engage in coaching and formative assessment, or more easily conduct pause-and-discuss or pause-and-reflect types of learning encounters that can be more awkward from the other side of the wall!

4. You say the simulator should never die during a simulation.
Appraisal: Wrong
‘Nuff Said on this one.

5. Simulations must have every aspect designed to be as real as possible.
Appraisal: Simply Crap.
Trying to create the ultra-real environment can lead to increased time to set up and clean up, and otherwise make the simulation less efficient. Worse yet, creating a lot of simulated artifacts can actually lead to increased confusion. How? Read this blog post on the cognitive third space of simulation. Simulations should be designed and outfitted to provide enough realism to accomplish the learning objectives. Everything else is a waste of time, money, and/or people resources (ironically, the same things you probably say you don’t have enough of).

6. You say during simulations participants must/will suspend disbelief.
Appraisal: Ridiculous.
Out of the other side of your mouth you probably babble about adult learning theory……
If we are educating seriously smart adults, we don’t want them to think the plastic simulator is real. Seriously. I prefer a more mature understanding of the situation that gives the participants a bit more credit for their lifetime of cerebral accomplishments. How about a message like… “We have created this learning encounter using simulation so we can work together to help you become a better healthcare provider. Some of what you are going to experience will seem realistic and some will not. But we promise to make the best use of your time and treat you with dignity and respect as we help you learn and practice.” Now that’s how adults talk. (Mic drop)

7. You claim one debriefing model is far superior to another. Or one has been validated.
Appraisal: Crap that gets sold at debriefing training programs.
If you are saying this, you probably don’t use a structure to your debriefing, don’t believe in learning objectives, or you only know one model of debriefing.
Truth is there are a bunch of good debriefing models in existence. You would do well to learn a few. Different models of debriefing are like tools in the toolbox. Some are good for certain topics, learners and situations and some for others.

8. You state that you should always use video while debriefing.
Appraisal: Industry sponsored rubbish.
You have drunk some serious kool-aid, have had the wrong mentor, or had an improper upbringing if you believe this. Further, if you make your participants watch the entire simulation on video, you should receive a manicure with a belt sander. Lastly, if you say you use the video to solve disputes about what a student did or didn’t do, you may be hopeless.
Video can be a tool that can be strategically used to enhance debriefings at times. But more often video playback gets used as a crutch to make up for a lack of quality debriefing skills and to fill time.
There is also a misguided belief that students want to watch their videos. They don’t. They hate it. They think they look fat and their hair doesn’t look good.
Harnessing the power of a good debriefing is hard work and requires skill. But active reflection, guiding students toward self-discovery of what they did well and what they need to change for the future, is serious active learning. The more you can do that, the more learning will occur. Watching a video of a simulation is like watching a bad movie. I always find it fascinating that simulation programs will spend a fortune putting in a video system that could film a Hollywood movie, but won’t invest even a fraction of that cost into the development of the faculty.

9. You use the terms “High and Low Fidelity Simulations” when you are referring to the use of a high technology simulator in your simulations.
Appraisal: You are feeding into the biggest industry sponsored word there is. In fact, the word fidelity is so perverse it should be banned. See additional blog post here on banning the “F” word.
The highest-fidelity human simulator I know is a real person playing the role of a standardized or simulated patient. Everything else is, overall, lower fidelity.
Seriously folks… Somewhere along the way, industry labeled a couple of simulators high fidelity because they had a feature or two that approximated those of a human. The label stuck and continues to perpetuate great confusion throughout the simulation community, in practice and in the literature as well. Some centers even name their rooms like this!!!

Sadly, this crazy definition even made its way into the simulation dictionary of the Society for Simulation in Healthcare (which is otherwise excellent I might add). Do high technology simulators have some very cool and very useful high-technology features? Absolutely! But real like a person, ie high fidelity? Not so much.

The next time you think your SimMan or HPS is a high-fidelity simulator, try doing a knee exam and compare it to a real person. Better yet, lock yourself in a room with either or both of them and hold a 30-minute conversation. Then send me a note on how the fidelity strikes you.

10. You tell your institution you will make a profit with your new simulation center.
Appraisal: You’re setting yourself up for trouble.
It just doesn’t happen very much. Everyone has a “business plan” and tries to justify the costs, appeasing finance people with rows and rows of imagined potential revenue sources that often include internal and external components. Somehow, some way, they just never seem to pan out. Most simulation programs are a cost center for the institutions that sponsor them. They are an important investment, but not a profit-motivated investment for the institution. It is far better to focus on the value statement that you are bringing to your institution(s) than to try to convince your boss’s boss that the institution will get rich off of your program. Focusing on the value you produce that is aligned with your institution’s mission may help you grow support for your program, as well as help you keep your job a little bit longer.

Leave a comment

Filed under Curriculum, debriefing, design, simulation