Category Archives: design

Beware of Simulation Posers!

You may be a simulation poser if you say or do three or more of the following things…..

1. You say something like “In simulation all of the learning occurs during the debriefing.”
Appraisal: Not true. You are lying, uninformed, or not creative.
Not even close. If you believe this, you are not paying attention to the other learning opportunities that participants in simulation can avail themselves of. Think about the status changes of a simulator in response to proper or improper treatment. Think about potential participant-to-participant interactions. Think about the potential for instructor-participant interactions that may contribute to learning. The possibilities are practically limitless! For more see this blog post.

2. You claim there is a magic ratio of simulation time to debriefing time: "For every 15 minutes of simulation you must debrief for 45 minutes," etc.
Appraisal: Rubbish.
No such thing exists. In fact, if you think about it, this is utterly ridiculous given the number of variables that may potentially influence the debriefing time: the topic, the number of learners, the experience level of the learners, the number of faculty, the experience of the faculty, and on and on. Just stop saying it and the perception of your (simulation) IQ will rise by 10.

3. You espouse that during simulation encounters the students and faculty must be separated by something like a glass wall.
Appraisal: Lack of creative thinking.
While there are a lot of good reasons to design simulations that physically isolate the faculty from the participants, there are as many compelling reasons to have faculty in the same room and even, at times (aghast!), interacting with the participants. Think about the possibilities. Faculty side by side with students can engage in coaching and formative assessment, or more easily conduct pause-and-discuss or pause-and-reflect learning encounters that can be more awkward from the other side of the wall!

4. You say the simulator should never die during a simulation.
Appraisal: Wrong
‘Nuff Said on this one.

5. Simulations must have every aspect designed to be as real as possible.
Appraisal: Simply Crap.
Trying to create the ultra-real environment can increase the time needed to set up and clean up, and otherwise make the simulation less efficient. Worse yet, creating a lot of simulated artifacts can actually lead to increased confusion. How? Read this blog post on the cognitive third space of simulation. Simulations should be designed and outfitted to provide enough realism to enable accomplishment of the learning objectives. Everything else is a waste of time, money, and/or people resources (ironically, the same things you probably say you don't have enough of).

6. You say during simulations participants must/will suspend disbelief.
Appraisal: Ridiculous.
Out of the other side of your mouth you probably babble about adult learning theory……
If we are educating seriously smart adults, we don't want them to think the plastic simulator is real. Seriously. I prefer a more mature framing of the situation that gives the participants a bit more credit for their lifetime of cerebral accomplishments. How about a message like: "We have created this learning encounter using simulation so we can work together to help you become a better healthcare provider. Some of what you are going to experience will seem realistic and some will not. But we promise to make the best use of your time and treat you with dignity and respect as we help you learn and practice." Now that's how adults talk. (Mic drop)

7. You claim one debriefing model is far superior to another. Or one has been validated.
Appraisal: Crap that gets sold at debriefing training programs.
If you are saying this, you probably don't use a structure in your debriefing, don't believe in learning objectives, or only know one model of debriefing.
Truth is, there are a bunch of good debriefing models in existence. You would do well to learn a few. Different models of debriefing are like tools in a toolbox: some are good for certain topics, learners, and situations, and some for others.

8. You state that you should always use video while debriefing.
Appraisal: Industry sponsored rubbish.
You have drunk some serious Kool-Aid, had the wrong mentor, or had an improper upbringing if you believe this. Further, if you make your participants watch the entire simulation on video, you should receive a manicure with a belt sander. Lastly, if you say you use the video to settle disputes about what a student did or didn't do, you may be hopeless.
Video can be a tool that is strategically used to enhance debriefings at times. But more often video playback gets used as a crutch to make up for a lack of quality debriefing skills and to fill time.
There is also a misguided belief that students want to watch their videos. They don’t. They hate it. They think they look fat and their hair doesn’t look good.
Harnessing the power of a good debriefing is hard work and requires skill. But active reflection and guiding students toward self-discovery of what they did well and what they need to change for the future is serious active learning. The more you can do that, the more learning will occur. Watching a video of a simulation is like watching a bad movie. I always find it fascinating that simulation programs will spend a fortune putting in a video system that could film a Hollywood movie, but won't invest even a fraction of that cost in developing the faculty.

9. You use the terms “High and Low Fidelity Simulations” when you are referring to the use of a high technology simulator in your simulations.
Appraisal: You are feeding into the biggest industry-sponsored word there is. In fact, the word fidelity is so perverse it should be banned. See the additional blog post here on banning the "F" word.
The highest-fidelity human simulator I know is a real person playing the role of a standardized or simulated patient. Everything else is, overall, lower fidelity.
Seriously, folks… Somewhere along the way industry labeled a couple of simulators high fidelity because they had a feature or two that approximated those of a human. The label stuck and continues to perpetuate great confusion throughout the simulation community, in practice and in the literature as well. Some centers even name their rooms like this!!!

Sadly, this crazy definition even made its way into the simulation dictionary of the Society for Simulation in Healthcare (which is otherwise excellent, I might add). Do high-technology simulators have some very cool and very useful high-technology features? Absolutely! But real like a person, i.e., high fidelity? Not so much.

The next time you think your SimMan or HPS is a high-fidelity simulator, try doing a knee exam and compare it to a real person. Better yet, lock yourself in a room with either or both of them and hold a 30-minute conversation. Then send me a note on how the fidelity strikes you.

10. You tell your institution you will make a profit with your new simulation center.
Appraisal: You're setting yourself up for trouble.
It just doesn't happen very much. Everyone has a "business plan" and tries to justify the costs, appeasing finance people with rows and rows of imagined potential revenue sources that often include internal and external components. Somehow, some way, they just never seem to pan out. Most simulation programs are a cost center to the institution that sponsors them. They are an important investment, but not a profit-motivated investment for the institution. It is far better to focus on the value you are bringing to your institution(s) than to try to convince your boss's boss that the institution will get rich off of your program. Focusing on the value you produce that is aligned with your institution's mission may help you grow support for your program, as well as help you keep your job a little bit longer.


Filed under Curriculum, debriefing, design, simulation

The First Four Steps of Healthcare Simulation Scenario Design

How can you make your scenario design process more consistent and efficient? One way is by following a step-by-step method to create your masterpieces!

In this post I cover the first four steps of a proven scenario design process.
There are four core steps that must be done in order. After the first four are accomplished you can branch out and be a little bit more variable in your approach to scenario design.


Step One: Pick A Topic

Picking a topic may seem like common sense but there is a lot to think about.

In healthcare simulation we have many topics to choose from. But in step one we want to be a little more specific and figure out what the major topic is that will be covered. We may be covering physiology, diagnostics, or treatment, where people are going to be making critical decisions, ordering medications and other therapies; or perhaps our primary focus is going to be on team training, teamwork, communications, or team leadership. You get to pick!

Step Two: Define the Learner(s)

This is really important because, in order to go to the next step of designing the learning objectives, we have to understand our learner population. For example, what do you expect of a fourth-year medical student in terms of being able to evaluate and treat a simulated patient who is complaining of chest pain? Now contrast that with learners who are second-year medical students and haven't had any clinical experience. In other words, we can take the same topic, but as applied to two different populations, our expectations and what we will be evaluating are very different.

Step Three: Design the Learning Objectives

This is where you want to go into detail, great painstaking detail, about what you're trying to accomplish with the simulation scenario. It is very important to take time on this step. Many people tend to gloss over it, which can create confusion later.

Let's take a topic example: say, asthma in the emergency department. When you think about asthma in the emergency department, there could be many subtopics or areas from which to choose. It could be focused on competence in managing a minor asthma attack, or a first-ever asthma attack, or management of chronic asthma, or a life-threatening situation.

Carefully consider what we want from the learner group we defined in step two. Do we want them to diagnose? To treat? To critically compare and contrast the case with other cases of shortness of breath in an acute patient? You get to choose!

Perhaps we want to focus on the step-by-step history presentation or the physical exam or maybe we want to see the learners perform treatment. Or maybe we want to see the overall management or the critical thinking that goes on for managing asthma in the emergency department. There are many possibilities, largely driven by your intended learner group demographics.

So, in other words, we're taking the big topic of asthma and we are going to cone it down to answer the question of exactly what we want our learners to accomplish by the end of the scenario. We can't just assume that what is supposed to happen in the real clinical environment will or should happen in the simulation environment. That rarely works. We actually want to later engineer the story and situation to allow us to focus on the learning objectives.

Step Four: Define the Assessment Plan

How are you going to assess that each objective defined in step three was accomplished? That is the fundamental thought process for step four.

What are you going to be watching for when you, the creator of this simulation scenario, watch the participants do their thing? What are you going to focus your attention on that you will bring into the debriefing? What are you picking up on as you fill out assessment tools?

Define your assessment plan with specificity about what you're looking for. This is different from designing the assessment tools, which could come later, or perhaps not at all. It is important to remember that every simulation is an assessment of sorts. See the previous blog post on this!

This doesn't mean that every simulation needs an assessment tool like a checklist, rating scale, or formal grading scheme. It simply refers to considering how to focus the facilitating faculty member, or teacher, or whatever you call them, who is observing the simulation. Remember that to help the learner(s) of the simulation get better, the faculty need to be focused on certain things to ensure that the goals of the scenario are accomplished for our selected learner group, associated with the topic we chose in step one.

Lastly, what I want to point out to you is that you should notice something missing. The story!

The story comes later. Everybody wants to focus on the story because the story is fun. It's often related to what we do clinically, and it brings in the theatrics of simulation! But what we really want is for the theatrics of simulation to cause the actors on the stage (the participants) to do some activity. This activity gives us the situation on which to focus our observations and assessment of the performance. This in turn allows us to accomplish the learning objectives of the scenario and help the participants improve for the future!

Until next time, Happy Simulating!


Filed under Curriculum, design, scenario design, simulation, Uncategorized

Don’t be Confused! Every Simulation is an Assessment


Recently, as I lecture and conduct workshops, I have been asking people who run simulations how often they do assessments with their simulations. The answers are astounding. Every time, more than a few people report that they perform assessments less than 100% of the time that they run their simulations. Then they are shocked when I tell them that they do assessments EVERY TIME they run their simulations.

While some of this may be a bit of a play on words, careful consideration should be given to the fact that each time we run a simulation scenario we must be assessing the student(s) who are the learners. If we are going to deliver feedback, whether intrinsic to the design of the simulation or through promoting discovery during a debriefing process, somewhere at some point we had to decide what we thought they did well and identify areas for needed improvement. To be able to do this, you had to perform an assessment.


Now let's dissect a bit. Many people tend to equate the word assessment with some sort of grade assignment. Classically we think of a test that has some threshold for passing or failing, or that contributes in some way to figuring out whether someone has mastered certain learnings. Often this may be part of the steps one needs to move on, graduate, or perhaps obtain a license to practice. The technical term for this type of assessment is summative. People in healthcare are all too familiar with such types of assessment!

Other times, however, assessments can be made periodically with the goal NOT of determining whether someone has mastered something, but with more of a focus on figuring out what one needs to do to get better at what they are trying to learn. The technical term for this is formative assessment. Stated another way, formative assessment is used to promote more learning, while summative assessment determines whether something was learned.

Things can get even more confusing when assessment activities have components or traits of both types. Nonetheless, less important than the technical details is the self-realization and acceptance by simulation faculty members that every time you observe a simulation and then lead a debriefing, you are conducting an assessment.

Such realization should allow you to understand that there is really no such thing as a non-judgmental debriefing or a non-judgmental observation of a simulation-based learning encounter. All goal-directed debriefing MUST be predicated upon someone's judgment of the performance of the participant(s) of the simulation. Otherwise you cannot optimally promote discovery of the areas that require improvement, and/or understanding of the topic, skills, or decisions that were carried out correctly during the simulation.

So, if you are going to take the time and effort to conduct simulations, please be sure to understand that assessment, and rendering judgment of performance, is an integral part of the learning process. Once this concept is fully embraced by the simulation educator, greater clarity can be gained in ways to optimize assessment vantage points in the design of simulations. Deciding the assessment goals with some specificity early in the process of simulation scenario design can lead to better decisions about the associated design elements of the scenario. Optimizing scenario design to enhance "assess-ability" will help you whether you are applying your assessments in a formative or summative way!

So, go forth and create, facilitate and debrief simulation-based learning encounters with a keen fresh new understanding that every simulation is an assessment!

Until Next Time Happy Simulating!


Filed under assessment, Curriculum, design, scenario design, simulation

Don’t Let the Theory Wonks Slow Down the Progress of Healthcare Simulation


Those of us in the simulation community know well that, when used appropriately and effectively, simulation allows for amazing learning and helps students and providers of healthcare improve their craft. We also know there is very little published literature that conclusively demonstrates the "right way to do it."

Yet in the scholarly literature there is still a struggle to define best practices and ways to move forward. I believe this is becoming a rate-limiting step in helping people get started, grow, and flourish in the development of simulation efforts.

I believe that part of the struggle is the diversity of missions among simulation programs, ranging from entry-level students to practicing professionals, with varying foci on individual competence versus teamwork and communications training, etc. Part of the challenge in these types of scholarly endeavors is that people try to describe a "one-size-fits-all" approach to the solution of best practices. To me, this seems ridiculous when you consider the depth and breadth of possibilities for simulation in healthcare.

I believe another barrier (and FINALLY, the real point of this blog post 🙂 ) is trying to overly theorize everything that goes on with simulation and shooting down scholarly efforts to publish and disseminate successes in simulation based on some missing link to an often-esoteric deep learning theory. While I believe that attachments to learning theory are important, I think it is ridiculous to expect that every decision, best practice, policy, or experimental design in simulation needs to reach back and be tied to some learning theory to be effective.

As I have the good fortune to review a significant number of simulation papers, it concerns me to see many of my fellow reviewers shredding people's efforts based on ties to learning theories, as well as their own interpretations of how simulation should be conducted. They have decided that the literature that is out there (which offers little, if any, conclusive argument on best practices) has become a standard.

My most recent example is a manuscript I reviewed describing an experimental design that conducted simulation one way with a certain technology and compared it to conducting the simulation another way without the technology. The authors then reported the resulting differences. As long as the testing circumstances are clearly articulated, along with the intentions and limitations, this is the type of literature that needs to appear for the simulation community to evaluate, digest, and build upon.

Time after time after time recently, I am seeing arguments steeped in theory attachments that seem to indicate this type of experimental testing is irrelevant, or worse yet inappropriate. There is a time and place for theoretical underpinnings, and separately there is a time and place for attempting to move things forward with good, solid implementation studies.

The theory wonks are holding up the valuable dissemination of information that could assist simulation efforts moving forward. Such information is crucial to help us collectively advance the community of practice of healthcare simulation and improve healthcare globally. There is a time to theorize and a time to get work done.

While I invite the theorists to postulate new and better ways to do things based on their philosophies, let those in the operational world tell their stories of successes and opportunities as they are discovered.

Or perhaps it is time that we develop a high-quality forum or publication that provides a better vehicle for dissemination of such information.

So… in the meantime… beware of the theory wonks. Try not to let them deter you from your efforts not only to move your own simulation investigations forward, but also to disseminate and share them with the rest of the world!


Filed under Curriculum, design, patient safety, return on investment

FIVE TIPS on effectively engaging adult learners in healthcare simulation


Filed under Curriculum, design