Tag Archives: education

Five Tips for Effective Debriefing

There is no doubt that debriefing is an important part of simulation-based education efforts. Doing a good debriefing, however, is not necessarily easy. Practice, self-reflection and getting training can help dramatically. Seeking out help from experts and experienced people can be invaluable. Also, there are many resources from which to learn more about debriefing. I encourage you to take advantage of them!

Here are just five random tips in no particular order to help you increase the effectiveness of your debriefing!


  1. Know what the goal(s) are. Be specific.

Too many times simulation scenarios are executed and the faculty member is just kind of winging it during the debriefing. It is a far more effective strategy to be keenly aware of the learning outcomes and goals prior to the simulation. This allows you to focus your thoughts and ideas on helping the participants get better during the simulation, and that focus carries forward into your debriefing efforts. If you constrain the debriefing to the learning objectives for the simulation, it is often easier to organize the information and get across the salient points needed to achieve the learning outcomes. It is particularly important to remember that you can’t teach everything with every scenario. A participant’s brain can only take in or process so much information in any one setting. Think of a sponge completely saturated with water: it can’t take in any more!

  2. Have a framework or structure in mind

Having a structure for your debriefing ahead of time, or perhaps adopting a model of debriefing, can help you significantly in overcoming the challenging parts of debriefing. Some of the challenges occur in organizing the information. There are a number of debriefing models out there to consider adopting. There is no reason to believe that one is better than another. I highly recommend that you learn several models and become comfortable with them. What you’ll find is that some models work better than others in varying situations, based on a number of factors such as the experience and expertise of the debriefer, the subject matter that is the focus of the simulation, and the level of the learners.

  3. Involve all the learners

If you are debriefing a group of students, a challenging task can be involving all the learners. Often there will be one or two learners who engage in a dialogue with the debriefer, and without conscious effort and skill it is easy to continue that dialogue and allow the other members of the participating team to feel marginalized. Often this dialogue occurs with the person who was in the “hot seat”. Making a conscious effort during the debriefing to include all of the students in a meaningful way can create significantly more learner engagement. Further, if you are running multiple scenarios, I believe that engaging all the learners encourages them to pay closer attention when they are in an observation role for subsequent scenarios.

  4. Pull the ideas, don’t push the facts

I like to think of the debriefing as the time when we explore the learners’ thought processes. If we are transmitting information or pushing facts to them, the situation becomes more of a lecture. In fact, I see many novice debriefers slip into delivering mini-lectures during attempts at debriefing. It is important to remember that pushing facts to the participants limits the amount of assessment you can do in terms of their understanding of the material and what you need to do to create deeper learning. So, if you find yourself making many declarative statements, pull back and start to ask some questions. Encourage critical thinking and self-reflection, help the participants link what went well during the scenario to why it was good, and allow them to discover and identify what they should do differently to improve if they were to face a similar situation in real life or another simulation.

  5. Create a summary of the take-home points

Novice debriefers tend to struggle with creating an adequate summary. Also, beware: this is another point at which the debriefing risks turning into a mini-lecture. It is helpful to have a list of the major take-home points associated with the scenario. Even if you have the summary points written out prior to the simulation, you can contextually adapt the summary to the performance that occurred during the scenario. It is important to remember that during a debriefing many areas can be covered and touched upon. Learners should be engaged to identify the major learning points they experienced in the simulation, as well as to understand how the simulation was relevant to helping them become better healthcare providers.

So, this was intended to be five random tips on how to improve the effectiveness of your debriefing strategy. I hope that you found them useful!

Now, go forth and do great debriefings!

 

Until next time,

Happy Simulating!


Filed under Curriculum, debriefing, simulation

Don’t be Confused! Every Simulation is an Assessment

 

Recently, as I lecture and conduct workshops, I have been asking people who run simulations how often they do assessments with their simulations. The answers are astounding. Every time, a few too many people report that they perform assessments less than 100% of the time that they run their simulations. Then they are shocked when I tell them that they do assessments EVERY TIME they run their simulations.

While some of this may be a bit of a play on words, careful consideration should be given to the fact that each time we run a simulation scenario we must be assessing the student(s) who are the learners. If we are going to deliver feedback, whether intrinsic to the design of the simulation or through promoting discovery during a debriefing process, at some point we had to decide what we thought they did well and identify areas needing improvement. To be able to do that, we had to perform an assessment.


Now let’s dissect a bit. Many people tend to equate the word assessment with some sort of grade assignment. Classically we think of a test that has some threshold for passing or failing, or that contributes in some way to determining whether someone has mastered certain material. Often this is part of the steps one needs to move on, graduate, or perhaps obtain a license to practice. The technical term for this type of assessment is summative. People in healthcare are all too familiar with such types of assessment!

Other times, however, assessments are made periodically with the goal NOT of deciding whether someone has mastered something, but with more of a focus on figuring out what one needs to do to get better at what they are trying to learn. The technical term for this is formative assessment. Stated another way, formative assessment is used to promote more learning, while summative assessment determines whether something was learned.

Things can get even more confusing when an assessment activity has components or traits of both types. Nonetheless, what is less important than the technical details is the self-realization and acceptance by simulation faculty members that every time you observe a simulation and then lead a debriefing, you are conducting an assessment.

Such realization should allow you to understand that there is really no such thing as a non-judgmental debriefing or non-judgmental observation of a simulation-based learning encounter. All goal-directed debriefing MUST be predicated upon someone’s judgement of the performance of the participant(s) in the simulation. Otherwise you cannot provide, and optimally promote discovery of, the needed understanding of areas that require improvement, and/or of the topics, skills, or decisions that were carried out correctly during the simulation.

So, if you are going to take the time and effort to conduct simulations, please be sure to understand that assessment, and rendering judgement of performance, is an integral part of the learning process. Once this concept is fully embraced by the simulation educator, greater clarity can be gained in ways to optimize assessment vantage points in the design of simulations. Deciding the assessment goals with some specificity early in the process of simulation scenario design can lead to better decisions about the associated design elements of the scenario. Optimizing scenario design to enhance “assess-ability” will help you whether you are applying your assessments in a formative or summative way!

So, go forth and create, facilitate and debrief simulation-based learning encounters with a keen fresh new understanding that every simulation is an assessment!

Until next time, Happy Simulating!


Filed under assessment, Curriculum, design, scenario design, simulation

Three Things True Simulationists Should NEVER Say Again

From Wiktionary: Noun. simulationist (plural simulationists): 1. An artist involved in the simulationism art movement. 2. One who designs or uses a simulation. 3. One who believes in the simulation hypothesis.


 

After attending, viewing or being involved in hundreds if not thousands of simulation lectures, webinars, workshops, briefings and conversations, there are a few things I hear that make me cringe more than others. In this post I am trying to boil it down to the top three things that I think we should ban from the conversations and vocabularies of simulationists around the globe!

1. Simulation will never replace learning from real patients!: Of course it won’t! That’s not the goal. In fact, in some aspects simulation offers some advantages over learning on real patients. And doubly in fact, real patients have some advantages too! STOP being apologetic for simulation as a methodology. When this is said it essentially defers to real patients as some sort of holy grail or gold standard against which to measure. CRAAAAAAAZY…… Learning on real patients is but one methodology by which to attack the complex journey of teaching, learning and assessing the competence of a person or a team of people who are engaged in healthcare. All the methodologies associated with this goal of education have their own advantages, disadvantages, capabilities and limitations. When we agree with people who say simulation will never replace learning from real patients, or allow that notion to go unchallenged, we are doing a disservice to the big picture of creating a holistic education program for learners. See previous blog post on learning on real patients.

2. In simulation, debriefing is where all of the learning occurs!: You know you have heard this baloney before. Ahhhhhhhhhhhhh such statements are purely misinformed, not backed up by a shred of evidence, kind of contrary to COMMON SENSE, and demeaning to the participants as well as to the staff and faculty who construct such simulations. The people who still make this statement are stuck in a world of instructor centricity. In other words, they are saying, “Go experience all of that…… and then when I run the debriefing the learning will commence.” The other group of people are trying to hard-sell you some training on debriefing and make you think it is some mystical power held by only a certain few people on the planet. Kinda cra’ cra’ (slang for crazy) if you think about it.

When someone says that learning cannot occur during the simulation itself, they are confirming that they are quite unthoughtful about how they construct the entire learning encounter. It also hints at the fact that they don’t take the construct of the simulation itself very seriously. The immersive experience that people are exposed to during the simulation and before the debriefing can and should be constructed in a way that provides built-in feedback, observations, and experiences that contribute to a feeling of success and/or recognition of the need for improvement. See previous blog post on learning beyond debriefing.

3. Recreation of reality provides the best simulation! [or some variant of this statement]: When I hear this concept even alluded to, I get tachycardic and diaphoretic, and my pupils dilate. My fight-or-flight nervous system gets fully engaged, and trust me, I have no plans on running. 😊

[disclaimer on this one: I’m not talking about the type of simulation that is designed for human factors, and/or critical environmental design decisions, or packaging/marketing etc., which depend upon a close replication of reality.]

This is one of the signs of a complete novice and/or a misinformed person, or sometimes groups of people! If you think it through, it is a rather ludicrous position. Further, I believe trying to conform to this principle is one of the biggest barriers to the success of many simulation endeavors. People spend inordinate amounts of time trying to put their best theatrical foot forward to re-create reality. Often what is actually occurring is an expansion of the time to set up the simulation, the time to reset the simulation, and, dramatically, the time to clean up from the simulation. (All of the aforementioned time intervals increase the overall cost of the individual simulation, thereby reducing efficiency.) While I am a huge fan of loosely modeling scenarios off of real cases in an attempt to create an environment with some sense of familiarity to the clinical analog, I frequently see people going to extremes trying to re-create details of reality.

We have hundreds and thousands of design decisions to make for even moderately complex scenarios. Every decision we make to include something that imitates reality has the potential to cause confusion if not carefully thought out. It is easy to introduce confusion in attempts to re-create reality, since learners engage in simulation with a sense of hyper-vigilance that likely does not occur in the same fashion when they are in the real clinical learning environment. See previous blog post on cognitive third space.

If you really think about it, the simulation is designed to have people perform something to allow them to learn, as well as to allow observers to form opinions about the things that the learner(s) did well and the areas that can be improved upon. Carefully selecting how a scenario unfolds, and/or the equipment that is used to allow this performance to occur, is part of the complex decision-making associated with creating simulations. The scenario should be engineered to exploit the areas, actions, situations or time frames that are the desired focal points of the learning and assessment objectives. Attention should be paid to the specifics of the learning and assessment objectives to ensure that the included cache of equipment and/or environmental accoutrements is selected to minimize the potential for confusion and to create the most efficient pathway for the assessment that contributes to improving the learning.

Lastly, let’s put stock in the learning contract we are engaging in with our learners. We need to treat them like adult learners. (After all, everybody wants to throw in the phrase adult learning principles…. Haha).

Let’s face it: a half-amputated leg of a trauma patient, with other signs and symptoms of hemorrhagic shock and a blood-soaked towel under it, is probably good enough for our adult learners to get the picture; we don’t actually need blood shooting out of the wound and all over the room. While the former might not be as theatrically sexy, the latter certainly contributes to the overall cost (time and resources) of the simulation. We all need to realistically ask, “What’s the value?”

While my time is up for this post, and I promised to limit my comments to only three, I cannot resist sharing with you two other statements or concepts that were in the running for the top three. The first is “If you are not video recording your scenarios you cannot do adequate debriefing”, and the second is “The simulator should never die.” (Maybe I’ll expand the rant about these and others in the future 😉).

Well… That’s a wrap. I’m off to a week of skiing with family and friends in Colorado!

Until next time,

Happy Simulating!


Filed under Curriculum, debriefing, scenario design, simulation

Don’t Let the Theory Wonks Slow Down the Progress of Healthcare Simulation


Those of us in the simulation community know well that, when used appropriately and effectively, simulation allows for amazing learning and contributes to students and providers of healthcare improving their craft. We also know there is very little published literature that conclusively demonstrates the “right way to do it”.

Yet in the scholarly literature there is still a struggle to define best practices and ways to move forward. I believe it is becoming a rate-limiting step in helping people get started, grow and flourish in the development of simulation efforts.

I believe that part of the struggle is the diversity of missions across simulation programs, ranging from entry-level students to practicing professionals, and from varying foci on individualized learning and competence to teamworking and communications training, etc. Part of the challenge is that in these types of scholarly endeavors people try to describe a “one-size-fits-all“ approach to the solution of best practices. To me, this seems ridiculous when you consider the depth and breadth of possibilities for simulation in healthcare.

I believe another barrier (and FINALLY, the real point of this blog post 🙂 ) is the tendency to overly theorize everything that goes on with simulation and to shoot down scholarly efforts to publish and disseminate successes in simulation based on some missing link to some often-esoteric deep learning theory. While I believe that attachments to learning theory are important, I think it is ridiculous to think that every decision, best practice and policy in simulation, or experimental design, needs to reach back and be tied to some learning theory to be effective.

As I have the good fortune to review a significant number of simulation papers, it is concerning to me to see many of my fellow reviewers shredding people’s efforts based on ties to learning theories, as well as their own interpretations of how simulation should be conducted. They have decided that the literature that is out there (of which there is very little, if any, offering conclusive arguments on best practices) has become a standard.

My most recent example is a manuscript I reviewed describing an experimental design that compared conducting a simulation one way, with a certain technology, to conducting the simulation another way, without the technology. The authors then went on to report the resulting differences. As long as the testing circumstances are clearly articulated, along with the intentions and limitations, this is the type of literature that needs to appear for the simulation community to evaluate, digest, and build upon.

Time after time after time, more recently, I am seeing arguments steeped in theory attachments that seem to indicate this type of experimental testing is irrelevant, or worse yet, inappropriate. There is a time and place for theoretical underpinnings, and separately there is a time and place for attempting to move things forward with good, solid implementation studies.

The theory wonks are holding up the valuable dissemination of information that could assist simulation efforts in moving forward. Such information is crucial to help us collectively advance the community of practice of healthcare simulation and improve healthcare globally. There is a time to theorize and a time to get work done.

While I invite the theorists to postulate new and better ways to do things based on their philosophies, let those in the operational world tell their stories of successes and opportunities as they are discovered.

Or perhaps it is time that we develop a high-quality forum or publication that provides a better vehicle for the dissemination of such information.

So…… in the meantime….. beware of the theory wonks. Try not to let them deter you from your efforts to not only move your own simulation investigations forward, but also to disseminate and share them with the rest of the world!


Filed under Curriculum, design, patient safety, return on investment

FIVE TIPS on effectively engaging adult learners in healthcare simulation


Filed under Curriculum, design

Recreating Reality is NOT the goal of Healthcare Simulation

Discussing the real goals of healthcare simulation as it relates to the education of individuals and teams. Avoiding the tendency to put the primary focus on recreating reality, and instead providing an adequate experience that allows deep reflection and learning, should be the priority. This will help you achieve more from your simulation efforts!

 


Filed under scenario design

Fire Alarm Systems and Simulation Programs in Hospitals – What is the ROI?

How do you respond to your financial administrator or controller of the purse strings when they ask you what the return on investment is for your hospital-based simulation program? It’s quite complicated.

Return on investment in today’s vernacular implies that there is a financial spreadsheet that can show a positive bottom line after revenue (or direct cost savings) and expenses are accounted for. This is really difficult to do with simulation.
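
As a purely hypothetical illustration of what that spreadsheet view of ROI looks like, here is a minimal sketch; the cost and savings figures are invented for this example and are not drawn from any real simulation program:

```python
# Hypothetical figures only, illustrating the spreadsheet-style ROI calculation
# described above; they are not taken from any real simulation center.
annual_program_cost = 450_000         # assumed: staff, space, equipment upkeep
annual_revenue_and_savings = 120_000  # assumed: course fees plus directly attributable cost savings

roi = (annual_revenue_and_savings - annual_program_cost) / annual_program_cost
print(f"Spreadsheet ROI: {roi:.0%}")  # prints "Spreadsheet ROI: -73%"
```

On paper that looks like a losing proposition, which is exactly the problem: the avoided-harm value discussed below never shows up as a line item.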

I have seen business plan after business plan for simulation centers that promised their administration they would become financially positive and start bringing in big bucks for the institution in some given period of time. Usually this promise is part of the business plan that justifies standing up the simulation center in the first place. I think I can count on one hand the simulation programs that have actually achieved this status. Why is this?

The answer is that calculating a discrete return on investment from simulation alone is extraordinarily difficult to do. While there are some examples in the literature that attempt to quantify a return on investment in dollar terms, they are few and far between. The work is largely confined to low-hanging fruit, with the most commonly published example focusing on central line training.

Successfully integrated, hospital-focused simulation programs have likely found a way to quantify part of their offerings in a dollars-and-cents accounting scheme, but they are likely providing tremendous value to their organizations that is extraordinarily difficult, if not impossible, to demonstrate on a spreadsheet.

What is the value the simulation center may bring to the ability of a hospital to recruit more patients because the community is aware of patient safety efforts and advanced training to improve care? What is the value of a simulation center in its ability to create exciting training opportunities that allow the staff to feel like the system is investing in them and ultimately helping with recruiting of new staff, along with retention of existing staff members?

What is the value, or potential value, of the ability to avoid causing harm to patients, such as a mismanaged difficult airway, because of simulation training of physicians and other providers who provide such care? What is the value of litigation avoidance for the same topic?

Also, the value proposition of a successfully implemented simulation program for patient safety extinguishes itself over time if it significantly reduces or eliminates the underlying problem. This is the so-called phenomenon of safety being a dynamic non-event. Going back to the more concrete example of the airway: if your airway management mishap rates have been essentially zero over several years, the institutional memory may become fuzzy on why you invest so much money in difficult airway training….. A conundrum, to be sure.

I think of fire alarm systems in the hospital as a similar situation. Let’s compare the two. Fire alarm systems detect or “discover” fires, begin to put the fire out, and disseminate the news. Simulation programs have the ability to “detect” or discover potential patient safety problems through the identification of latent threats, poor systems design or staffing, for example. Once identified, the simulation program develops training that helps “put out” the patient safety threat. One could argue that the training itself is the dissemination of information that a patient safety “fire” exists.

Fire alarm systems in hospitals cost hundreds of thousands, possibly millions, of dollars to install and run on an annual basis. But the chief financial officer never asks what the return on investment is. Why is that?

Well, perhaps it is a non-issue because fire alarm systems have successfully been written into law, building code regulations and so on. Regulation is an interesting idea for simulation, to be sure, but probably not for a long time.

However, if you think about it beyond a regulatory requirement, a given fire alarm system actually saving a life is probably significantly less probable than a well-integrated simulation program that provides patient safety programs designed around the needs of the institution it serves. Admittedly, the image of hundreds of people being trapped in a burning building is probably more compelling to the finance guy than one patient at a time dying from hypoxia from a mismanaged difficult airway.

Do you really know what to do when the fire alarm system goes off in your hospital? I mean, we have little rituals like closing the doors, etc. But what next? Do we run? If we run, do we run toward the fire? Or away from the fire? Do we evacuate all the patients? Do we individually call the fire department? Do we find hoses and start squirting out the fire?

When we conduct simulation-based training in hospitals that is aligned with the patient safety needs of the given institution, we are extinguishing or minimizing the chance that patients will undergo or suffer from unintended harm. The existence of simulation programs and attention to patient safety education are a critical part of the infrastructure of any hospital caring for patients.

The more we can expand upon this concept and allow our expertise in simulation to contribute to the overall mission of the institution in reducing potential harm to patients and hospital staff, the more likely we are to receive continuing support and be recognized as infrastructure important to providing the highest quality and safest care to our patients.

Just like the fire alarm systems.

 

 


Filed under return on investment, Uncategorized

Patient Centered Debriefing – Putting the Patient First – A MUST for Healthcare Simulation

Debriefing in healthcare education is a specific type of communication designed to allow enhanced learning through a post hoc analysis and (ideally) structured conversation about an event. While there are many different styles and methods commonly described for use in healthcare simulations, there are generally some consistent principles. A common feature of the goals of just about every debriefing method is attempting to ensure that the participants involved in the event leave with an understanding of the areas in which they performed well and the areas they could improve upon should they face a similar situation in the future.

Debriefing is not easy to do well for a variety of reasons, and suffice it to say it generally improves with practice and a focus on improvement. Depending on the facilitator and/or the learner(s), many people struggle with ensuring learners depart the debriefing with a clear understanding of the areas needing improvement. Other times, debriefers can make the mistake of focusing only on the negative, forgetting to elucidate the things that may have been done well.

I believe we need to always incorporate the needs of the patient into the debriefing. The thought that the simulation ultimately benefits the patient should permeate the planning of all events in healthcare simulation, including the debriefing.

With the proliferation of simulation-based learning over the last two decades there has been an increased interest in faculty development and in training people to develop debriefing skills. Nearly every discussion of faculty training in the healthcare simulation space includes some discussion of the safe learning environment and student-centered learning. These concepts are embedded in nearly every discussion and every publication on debriefing and feedback.

Ostensibly, the safe learning environment refers to a facilitator controlling the environment of simulations and debriefings to provide a sense of comfort that encourages participants to share freely what is on their mind during the simulation and the debriefing without fear of repercussion, ridicule or reprisal. I also believe that it should encourage simulation faculty to remain vigilant for moments that need some sort of active facilitation to assist a participant thought to be struggling with the situation because of an emotional or stressful stimulus.

Having been involved in the teaching of healthcare providers for almost thirty years, and thinking back to the late eighties, I personally participated in early “simulations” designed to “knock students off of their game”. Thus, I can certainly relate to, and applaud, the emergence of the concept of a safe environment.

However, I now believe that the concept of a student-centered approach to healthcare education contributes to the illusion that the student is the ultimate beneficiary of healthcare education programs. The concept has evolved because of a natural parental feeling of protection for students, along with the fact that experiential learning can be stressful. Balancing these factors can likely contribute to highly effective learning as well as a positive learning experience for the participant.

When applied to healthcare education, student-centered learning can be a bit misleading, perhaps a bit irresponsible, insofar as it completely ignores the fact that the patient is the ultimate recipient of the educational efforts. It may be more comfortable for the faculty in the immediate term because the student is present and the patient is not. However, if you think about it, downstream it is likely incomplete and ultimately may do a disservice to both the learners and their patients.

The challenge is that when the pervasive thought process is student-centered, the culture, requisite curriculum and learning opportunity design will favor such a position. This can subtly influence the debriefing and interactions with participants in a way that fails to correct inaccurate or poor performance and/or reinforce decisions or actions that should be carried forward to actual care.

My colleagues and I have coined the term Patient-Centered Debriefing. I originally talked about it on my simulation blog in 2013. In the training of debriefers and the modeling of debriefing, we encourage consideration of the needs of the patient, and this seems to pull the conversation toward a more appropriate anchor point. This slight shift in focus can also help to humanize the situation beyond the needs of the learner. Taking on the responsibility of the eventual care of an actual patient can shift the mindset of the instructor to ensure the real goals of the simulations are met.

What does patient-centered debriefing look like? At casual observation it would appear the same as any other debriefing conducted with acceptable methods in 2017 under a premise of student-centered debriefing. The difference is that the facilitator(s), as well as perhaps the students, would be considering the ultimate patient outcomes associated with the learning objectives of the given scenario. Thus, if properly conducted, the facilitator(s) would be less likely to gloss over or omit reconciliation of mistakes and/or errors of commission or omission that occurred during a simulation and that would likely contribute to adverse sequelae for the patient in a comparable actual healthcare setting. Simultaneously, however, the facilitator would be maintaining the enshrined traditional “safe learning environment”.

In considering the needs of the patient there is a subtle reminder that it is our job as healthcare educators to best prepare learners for this reality, and that the time we have to do it in is precious. Further, particularly in simulation-based learning, it should be an ever-present reminder that this is our ultimate purpose. I think it is particularly important for simulation facilitators who are not actively involved in the care of patients to consider this position. This is not to suggest that they are not doing a great job, but it seems like a reasonable active reminder to consider the needs of the patients who will be cared for by the learners involved in the simulation.

I am not suggesting that we abandon the attention to providing a safe learning environment for simulations as well as clinical learning environments. I do believe that this contributes to effective learning, particularly in the simulated setting. However, I also believe that we need to reconsider the concept of student-centered learning insofar as the student is thought of as the epicenter of the overall education process and outcomes.

Reserving the definition and concepts of student centricity for considering scholarly needs, learning styles, designs and appeals to intrinsic motivating factors seems more appropriate. Any learning program in healthcare is far better off having a patient-centered axis from which all other actions and designs emerge.

I invite you to consider adopting a patient-centered debriefing into your work!


Filed under debriefing, Uncategorized

Learning from Simulation – Far more than the Debriefing

Most people have heard someone say “In simulation, debriefing is where all of the learning occurs.” I frequently hear this when running faculty development workshops and programs, which isn’t as shocking as hearing it espoused at national and international meetings in front of large audiences! What a ridiculous statement, without a shred of evidence or a realistic, common-sense reason to think it would be so. Sadly, I fear it represents an unfortunate instructor-centered perspective and/or a serious lack of appreciation for the potential learning opportunities provided by simulation-based education.

Many people academically toil over the technical definition of the word feedback and try to contrast it with a description of debriefing as if the two are juxtaposed. They often present it as if one is good and the other is bad. There is a misguided notion that feedback is telling someone, or lecturing to someone, to get a point across. I believe that is a narrow interpretation of the word. I think that there are tremendous opportunities for learning from many facets of simulation that may be considered feedback.

Well-designed simulation activities hopefully provide targeted learning opportunities, part of which is experiential, and sometimes immersive, in some way. I like to think of debriefing as one form of feedback that a learner may encounter during simulation-based learning, commonly occurring after engaging in some sort of immersive learning activity or scenario. Debriefing can be special if done properly and will actually allow the learner to “discover” new knowledge, perhaps reinforce existing knowledge, or maybe even have corrections made to inaccurate knowledge. No matter how you look at it, at the end of the day it is a form of feedback that can lead, or contribute, to learning. But to think that the debriefing is the only opportunity for learning is incredibly short-sighted.

There are many other forms of feedback and learning opportunities that learners may experience in the course of well-designed simulation-based learning. The experience of the simulation itself is ripe with opportunities for feedback. If a learner puts supplemental oxygen on a simulated patient that is demonstrating hypoxia on the monitor via the pulse oximetry measurements and the saturations improve, that is a form of feedback. Conversely, if the learner(s) forgets to provide the supplemental oxygen and the saturations or other signs of respiratory distress continue to worsen, then that can be considered feedback as well. The latter two examples are what I refer to as intrinsic feedback, as they are embedded in the scenario design to provide clues to the learners, as well as to approximate what may happen to a real patient in a similar circumstance. (A minimal sketch of this idea appears below.)
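
To make the idea of intrinsic feedback concrete, here is a minimal sketch of how a scenario's physiologic state might be programmed so that the monitor itself "answers" the learner's action; the function name, numbers, and rates are invented for illustration and are not from any particular simulator platform:

```python
# Minimal sketch of intrinsic feedback built into a scenario's state model.
# All names and numbers are illustrative assumptions, not a real simulator API.
def next_spo2(current_spo2: int, oxygen_applied: bool) -> int:
    """Return the saturation shown on the monitor at the next time step."""
    if oxygen_applied:
        return min(current_spo2 + 3, 97)  # visible improvement once oxygen is given
    return max(current_spo2 - 2, 72)      # worsening hypoxia if oxygen is forgotten

spo2 = 85
for minute in range(5):
    spo2 = next_spo2(spo2, oxygen_applied=(minute >= 2))  # oxygen applied at minute 2
    print(f"minute {minute + 1}: SpO2 {spo2}%")
```

The learner who applies oxygen sees the saturation climb, while the learner who forgets watches it fall: feedback delivered by the scenario design itself, before any debriefing happens.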

With regard to intrinsic feedback, it is only beneficial if it is recognized and properly interpreted by the learner(s), either while actively involved in the simulated clinical encounter or, if not, perhaps in the debriefing. The latter should be employed if the intrinsically designed feedback is important to accomplishing the learning objectives germane to the simulation.

There are still other forms of feedback that likely contribute to the learning and that are not part of the debriefing. In the setting of a simulated learning encounter involving several learners, the delineation of duties and the acceptance or rejection of treatment suggestions are all potentially ripe for learning. If a learner suggests a therapy that is embraced by the team, or perhaps stimulates a group discussion during the course of the scenario, the resultant conversation and ultimate decision can significantly add to the learning of the involved participants.

Continuing that same idea, perhaps the decision to provide, withhold, or check the dosage of a particular therapy prompts a learner to check a reference, or otherwise look something up, providing valuable information that solidifies a piece of knowledge in the mind of the learner. The learner may announce such findings to the team while the scenario is still underway, thereby sharing the knowledge with the rest of the treatment team. Waaah Laaaah…… more learning that may occur outside of the debriefing!

Finally, I believe there is an additional source of learning that occurs outside of the debriefing. Imagine when a learner experiences something or becomes aware of something during a scenario that causes them to realize they have a knowledge gap in that particular area. Maybe they forgot a critical drug indication, dosage or adverse interaction. Perhaps there was something that just stimulated their natural curiosity. It is possible that those potential learning items are not covered in the debriefing, as they may not be core to the learning objectives. This may indeed stimulate the learner to engage in self-study to close that perceived knowledge gap. What???? Why yes, more learning outside of the debriefing!

In fact, we hope that this type of stimulation occurs on a regular basis as a part of active learning prompted by the experiential aspects provided by simulation. Such individual stimulation of learning is identified in the seminal publication of Dr. Barry Issenberg et al. in Vol. 27 of Medical Teacher in 2005 describing key features of effective simulation.

So hopefully I have convinced you, or reinforced your belief, that the potential for learning from simulation-based education spans far beyond the debriefing. Please recognize that this statement made by others likely reflects a serious misunderstanding and underappreciation of the learning that can and should be considered with the use of simulation. The implications of such short-sightedness can have huge impacts on the efficiency and effectiveness of simulation, beginning with curriculum and design.

So the next time you are incorporating simulation into your education endeavor, sit back and think of all of the points at which learning may occur. Of course the debriefing is one such activity during which we hope learning occurs. Thinking beyond the debriefing and designing for the bigger picture of potential learning that can be experienced by the participants is likely going to help you achieve positive outcomes from your overall efforts.


Filed under Uncategorized

Simulation Curriculum Integration via a Competency Based Model

One of the challenges for healthcare education is the reliance on random opportunity for clinical events to present themselves for a given group of learners to encounter as part of a structured learning curriculum. This uncertainty of exposure and eventual development of competency is part of what keeps our educational systems time-based, which is fraught with inefficiencies by its very nature.

Simulation curriculum design at present often embeds simulation in a rather immature development model in which there is an “everybody does all of the simulations” approach. If there is a collection of core topics that are part and parcel of a given program, combined with a belief, or perhaps proof, that simulation is a preferred modality for the topic, then those exposures make sense for everyone. Let’s move beyond the topics or situations that are best experienced by everyone.

If you use the model of physician residency training, for example, curriculum planners “hope” that over the course of a year a given first-year resident will adequately manage an appropriate variety of cases. The types of cases, often categorized by primary diagnosis, are embedded in some curriculum accreditation document under the label “Year 1.” For the purposes of this discussion, let’s change the terminology from Year 1 to Level 1 as we look toward the future.

What if we had a way to know that a resident managed the cases, and managed them well, for Level 1? Perhaps one resident could accomplish the Level 1 goals in six months, and do it well. Let’s call that resident Dr. Fast. This could then lead to a more appropriate advancement of the resident through the training program, as opposed to advancing them by the date on the calendar.

Now let’s think about it from another angle. Another resident hasn’t quite seen all of the cases, or the variety of cases needed, but manages things well when they do. Let’s call them Dr. Slow. A third resident of the program is managing an adequate number and variety of cases, but is having quality issues. Let’s refer to them as Dr. Mess. An honest assessment of the current system is that all three residents will likely be advanced to higher levels of responsibility based on the calendar, without substantial attempt at remediation or understanding of the underlying deficiencies.

What are the program or educational goals for Drs. Fast, Slow and Mess? What are the differences? What are the similarities? What information does the program need to begin thinking in this competency based model? Is that information available now? Will it likely be in the future? Does it make sense that we will spend time and resources to put all three residents through the same simulation curriculum?

While there may be many operational, cultural, and historical models and work conditions that provide barriers to such a model, thinking about a switch to a competency-based model forces one to think more deeply about the details of the overall mission. The true forms of educational methods, assessment tools, and exposure to cases and environments should be explored for both efficiency and effectiveness. Ultimately, the outcomes we are trying to achieve for a given learner progressing through a program would be unveiled. Confidence in the underlying data will be a fundamental, necessary component of a competency-based system. In this simple model, the two functional data points are the quantity and quality of given opportunities to learn and demonstrate competence.

This sets up intriguing possibilities for the embedding of simulation into the core curriculum to function in a more dynamic way and contribute mightily to the program outcomes.

Now think of the needs of Dr. Slow and Dr. Mess. If we had insight combined with reliable data, we could customize the simulation pathway for the learner to maximally benefit their progression through the program. We may need to provide supplemental simulations to Dr. Slow to allow practice with a wider spectrum of cases, or with a specific diagnosis, category of patient, or situation for which they need exposure. Ideally, this additional exposure, which provides deliberate practice opportunities, could also include learning objectives to help them increase their efficiency.

In the case of Dr. Mess, the customization of the simulation portion of the curriculum would provide deliberate practice opportunities with targeted feedback directly relevant to their area(s) of deficiency, i.e., a remediation model. This exposure for Dr. Mess could be constructed to provide a certain category of patient, or perhaps situation, that they are reported to handle poorly. The benefit in the case of Dr. Mess is that the simulated environment can often be used to tease out the details of the underlying deficiency in a way that learning in the actual patient care environment is unable to expose.

Lastly, in our model recall that Dr. Fast may not require any “supplemental” simulation, thus freeing up scarce simulation resources and the human resources necessary to conduct it. This is part of the gains in efficiency that can be realized through a competency-based approach to incorporating simulation into a given curriculum.
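
As a toy sketch of how the two functional data points mentioned earlier (quantity and quality of managed cases) might drive these decisions, here is a hypothetical triage rule; the thresholds, record fields, and scores are invented for illustration and would have to come from whatever assessment tools a real program trusts:

```python
# Hypothetical sketch: quantity and quality of managed cases driving the
# simulation plan for each resident. Thresholds and records are invented.
from dataclasses import dataclass

REQUIRED_CASES = 40       # assumed Level 1 case-volume target
QUALITY_THRESHOLD = 0.80  # assumed minimum acceptable quality score

@dataclass
class ResidentRecord:
    name: str
    cases_managed: int
    mean_quality: float  # 0.0 to 1.0, from the program's trusted assessment tool

def simulation_plan(r: ResidentRecord) -> str:
    enough_cases = r.cases_managed >= REQUIRED_CASES
    good_quality = r.mean_quality >= QUALITY_THRESHOLD
    if enough_cases and good_quality:
        return "no supplemental simulation; consider early advancement"
    if good_quality:
        return "supplemental simulations to broaden case exposure"
    return "targeted remediation scenarios with deliberate practice and feedback"

for resident in [ResidentRecord("Dr. Fast", 42, 0.91),
                 ResidentRecord("Dr. Slow", 25, 0.88),
                 ResidentRecord("Dr. Mess", 41, 0.62)]:
    print(f"{resident.name}: {simulation_plan(resident)}")
```

Running the sketch routes Dr. Fast toward early advancement, Dr. Slow toward broader case exposure, and Dr. Mess toward targeted remediation, mirroring the discussion above.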

Considering a switch to a competency-based curriculum in healthcare education can be overwhelming simply based on the number of operational and administrative challenges. However, using the concept of a competency-based implementation as a theoretical model can help envision a more thoughtful approach to curricular integration of simulation. If we move forward in a deliberate attempt to utilize simulation in a more dynamic way, it will lead to increases in efficiency and effectiveness, along with better stewardship of scarce resources.

 


Filed under Uncategorized