Excerpts from a recent plenary presentation regarding embedding simulation into patient safety as related to Dr. James Reason's Swiss Cheese model.
As educators involved in the design of simulation-based activities, we like to have a clear understanding of what we are trying to accomplish in a given educational encounter. We often talk about beginning with a needs analysis to determine what will bring the most impact. We ideally design learning objectives that are well matched to our intended participants. We execute scenarios and debrief them expertly, covering all the relevant topics. While this is an idealized workflow for the development of a simulation encounter, it doesn’t always translate to reality when we attempt to execute a program. Our success may lie in our ability to creatively adapt an educational program to the realities of the environment and situations in which it will be conducted.
One reason such discordance can occur is that we fail to incorporate operational realities into the design of our educational gems. The adage of trying to stuff 8 pounds of learning into a 5-pound bag is well known. It also seems to be a constant struggle in the design of simulation programs, where there is often an urgency to “teach them everything they need to know.” However, such attempts can put students, faculty members, and programs at a disadvantage. The consequences range from failing to accomplish the intended learning goals, to erosion of confidence in the program and the faculty, to hampering the program’s ability to make change as designed and/or desired.
Considering the operational realities of simulation program design is critically important when creating programs aimed at the training of practicing professionals, particularly when designing programs that may interrupt or be embedded into the normal workflow of caring for actual patients.
Let’s consider the design of a program that provides a mock-resuscitation scenario conducted impromptu in an ICU, in which the unit nurses, physicians, and respiratory therapists (RTs) will participate.
During our needs assessment, and drawing on expert opinion in the design of an educational encounter, we may imagine many things that need to be covered during the debriefing for a specific topic. Such topics may include demonstration of knowledge of the therapeutics the patient needs from the nurse, the physician, and the RT, as well as assessment of the patient, further testing needed, communication, teamwork, and so on.
First off is the obvious: operational parameters should be built in to provide criteria for a go/no-go decision for the scenario. While in this day and age all healthcare units are busy, it is not advisable, ethically or operationally from a patient safety perspective, to divert healthcare resources to a training activity if the target unit is already overwhelmed. Such decisions are ideally reached in advance, during the program design phase, through a collaborative discussion involving the simulation team and the clinical unit manager. They are best defined in advance depending on the overall educational and improvement goals.
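To make the idea concrete, pre-agreed go/no-go criteria can even be written down as an explicit checklist or decision rule. The sketch below is purely hypothetical; the specific criteria and thresholds (census, free nurses, active crises, charge nurse sign-off) are illustrative assumptions, not a prescription, and any real thresholds would come out of the collaborative discussion with the unit manager described above.

```python
from dataclasses import dataclass

@dataclass
class UnitStatus:
    """Snapshot of the clinical unit just before a planned in-situ scenario.
    All fields here are hypothetical examples of criteria a program might agree on."""
    census_pct: float           # occupied beds / staffed beds, 0.0-1.0
    nurses_free: int            # nurses who can step away without unsafe coverage
    active_codes: int           # resuscitations or other crises in progress
    charge_nurse_approves: bool # explicit sign-off from the unit leader on duty

def go_no_go(status: UnitStatus,
             max_census_pct: float = 0.9,
             min_nurses_free: int = 2) -> bool:
    """Return True only if every pre-agreed operational criterion is met."""
    return (status.active_codes == 0
            and status.census_pct <= max_census_pct
            and status.nurses_free >= min_nurses_free
            and status.charge_nurse_approves)

# A busy unit: census above threshold and only one free nurse -> postpone.
busy_unit = UnitStatus(census_pct=0.95, nurses_free=1,
                       active_codes=0, charge_nurse_approves=True)
print(go_no_go(busy_unit))  # False
```

The value of writing the rule down in advance is that the decision to postpone becomes a neutral application of agreed criteria rather than a judgment call made under pressure on the day of the event.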
The obvious next consideration is the time that is available. This includes time for the scenario as well as the debriefing. Carefully considering the needs of the learners as well as the dynamics of the operational realities is particularly important. Additionally, factoring in how the scenario is embedded into the overall curriculum is critical.
For example, is this a scenario that is once and done, meaning that the participants will not likely encounter any further simulations until next year? Or is this a recurring educational effort in this ICU, where we will have repeated engagement with the care providers over time? This could have significant bearing on the length of time you spend on various topics during the debriefing.
Continuing with our above example of the ICU resuscitation scenario, it is common to have much less time than anticipated to conduct the debriefing. Design considerations should include a prioritization of learning topics that adapts to the operational reality. For example, imagine there is a fifteen-minute period after the scenario for debriefing, and then everyone scampers off back to work. Let’s pretend there are learning objectives SPECIFIC to nursing care, physician care, and care provided by respiratory therapy. Then there are learning objectives around communication and teamwork that cut across the disciplines. We would want to design the debriefing discussion to focus on those topics that are multidisciplinary and would maximally benefit from a group discussion.
Thus, in this case, it would likely do a disservice to the domain-specific objectives or learning points to artificially shorten each one AND then also try to cover communication and teamwork. (In other words, none of the objectives get covered very well.) We may be better off focusing on communication and teamwork while the nurses, physicians, and therapists are all still in attendance, as that would likely give the biggest bang for the buck in that operational circumstance.
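This triage of objectives against a fixed time budget can be sketched as a simple planning rule: objectives that need the whole multidisciplinary team in the room are scheduled first, and discipline-specific objectives are deferred to follow-up channels. The objective names, time estimates, and the greedy rule below are all illustrative assumptions, not a validated method.

```python
def plan_debrief(objectives, minutes_available=15):
    """Greedy plan: objectives that need the whole team present go first;
    everything else is deferred to follow-up (email, brief e-learning).
    Each objective is a (name, estimated_minutes, needs_whole_group) tuple."""
    in_room, deferred, remaining = [], [], minutes_available
    # Sort so cross-disciplinary (needs_whole_group=True) objectives come first.
    for name, minutes, needs_group in sorted(objectives, key=lambda o: not o[2]):
        if needs_group and minutes <= remaining:
            in_room.append(name)
            remaining -= minutes
        else:
            deferred.append(name)
    return in_room, deferred

# Hypothetical objectives for the ICU mock resuscitation described above.
objectives = [
    ("closed-loop communication", 8, True),
    ("role clarity during the code", 6, True),
    ("nursing drug-preparation steps", 5, False),
    ("ventilator troubleshooting (RT)", 5, False),
]
in_room, deferred = plan_debrief(objectives)
```

Here the two teamwork objectives fill the fifteen minutes while everyone is present, and the nursing and RT items land on the deferred list, which leads directly into the follow-up strategies discussed next.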
Having accomplished the communication and teamwork objectives, we now need to turn our creative curricular design efforts to the domain-specific learning objectives. This is why it is critical for simulation educators to think more broadly and realize that they are healthcare educators who use simulation as part of a learning method.
As part of our overall design of the goals for the entire learning activity, perhaps we could email the nursing-specific protocols or highlights of the intended learning to the nurses after the event. Or perhaps direct them toward a brief online learning encounter specific to the goals of the scenario. We could do the same for the physicians and the RTs. The content in this case would be tailored specifically toward the care providers and would likely seem more relevant to the recipient.
I am of the opinion that once the care team experiences the simulation they will be more receptive to and engaged in the downstream feedback that they may receive after the encounter (meaning simulation and debriefing) is completed.
I believe this is true particularly if the information is specific to the practice domain, succinct, relevant, and tied directly to the simulation activity. Their participation in the simulation likely helps them realize areas needing self-improvement, along with an overall heightened engagement in the learning process. Contrast this with the educational motivation, or lack thereof, realized through a list of mandatory online training programs that one gets assigned annually as part of a regulatory requirement. (Can you say annual torture?)
So, as you move forward, be sure to consider operational realities and try to remain nimble about creative ways to accomplish the learning. It may be different than your initial vision of the activity. Simulation education creators should engage collaboratively with operational leaders, particularly if the encounters are embedded into the mix of healthcare operations.
Just about every successful simulation in the world has one thing in common. No, it’s not a high-fidelity design, great curricular integration, or a fabulous debriefing. What is it then? Answer: there was a Simulation Technician involved. Sim Techs are crucial to the success of programs and are integral to any team doing significant simulation. Sim Techs come in many varieties in terms of backgrounds and titles, and in some smaller programs the role is just one of many responsibilities a single person shoulders.
I do have a bias that I will disclose. I started my career as an Electronics Technician in the US Navy. After nearly two years of Navy training, I cut my teeth aboard an aircraft carrier, the USS John F. Kennedy (CV-67). Now that was truly an immersive learning experience! After I screwed something up one day when we were off the coast of Libya in 1986, the Electronics Materials Officer called me into the office and said, “Son, do I have to remind you where this boat is pointed?” Me: “No, Sir!” Him: “Now stop being a technician and join this team as a thinker.”
It is important to engage the Sim Tech in every aspect of the simulation. Too often they are thought of as “just a tech”, but this is a HUGE mistake. Engaged professional Sim Techs are capable of many things that can add value to your program beyond setting up, driving mannequins and cleaning up.
Sim Techs are capable of learning how to evaluate and sort high-quality simulation from that needing improvement, or good debriefing from less than good. Dare I say… they can also be trained to conduct or participate in debriefings in very creative ways. Sim Techs interacting with your participants can help alleviate anxiety and get ahead of problems before they occur. They can play a significant role in your quality improvement programs. After all, just imagine how much simulation they see!
Sim Techs are highly capable of helping to orient faculty and getting faculty functioning at a higher level. This may include how to operate A/V equipment, drive a simulator, or reset a simulation room to be ready for the next group.
Engaged Sim Techs take pride in their work, become embedded in the effort, and share in the pain when something doesn’t go as planned. This level of ownership will often help transition a program into a high-reliability operation that has everyone beaming with pride.
The Sim Tech community as a whole harbors a huge supply of energy and creativity and loves to be part of solutions. Whether it’s moulage, a special environment, an app, a smell, a video, or a visual cue, you can call upon the technical community to solve it.
It is encouraging to see a more professional approach to the workforce development of the simulation community. Achieving certification as a Certified Healthcare Simulation Operations Specialist (CHSOS) through the Society for Simulation in Healthcare (SSH) can be a great source of professional pride for the technician, in addition to ensuring competence in several important areas that well-trained technicians should have as a minimum.
I highly recommend encouraging your technicians to take part in training, attend meetings with other techs, and engage in the available networking; it can pay great dividends. Staying current in their knowledge, and interacting closely with vendors not only to know what is coming down the pike but sometimes to influence future products, are other reasons to make a place in your budget for technician training. In addition to meetings such as the International Meeting for Simulation in Healthcare (IMSH), several high-quality specialty meetings dedicated solely to technical training have been created by the SSH as well as by specialty organizations such as SimGHOSTS. High-quality training programs are also offered by simulation centers, both in person and online, such as ours at WISER.
No Simulation Technician is “just a tech” unless the program leadership makes them that. Embrace your technicians. Nurture their professional development and status in and among the team. Push the envelope of their capabilities and creativity to expand into new roles and ownership of your simulation efforts. You will not only be thankful, but wonder why it took so long to realize this is a vital ingredient in the sauce of success of highly capable simulation programs!
How do you respond to your financial administrator or controller of the purse strings when they ask you what the return on investment is for your hospital-based simulation program? It’s quite complicated.
Return on investment in today’s vernacular implies that there is a financial spreadsheet that can show a positive bottom line after revenue (or direct cost savings) and expenses are accounted for. This is really difficult to do with simulation.
I have seen business plan after business plan of simulation centers that have promised their administration that they will become financially positive and start bringing in big bucks for the institution in some given period of time. Usually it’s part of the business plan that justifies standing up the simulation center. I think I can count on one hand the simulation programs that have actually achieved this status. Why is this?
The answer is that calculating a discrete return on investment from simulation alone is extraordinarily difficult to do. While there are some examples in the literature that attempt to quantify return on investment in dollar terms, they are few and far between. The work is largely confined to low-hanging fruit, with the most commonly published example focusing on central line training.
Successfully integrated, hospital-focused simulation programs have likely found a way to quantify part of their offerings in a dollars-and-cents accounting scheme, but they likely also provide tremendous value to their organizations that is extraordinarily difficult, if not impossible, to demonstrate on a spreadsheet.
What is the value a simulation center brings to a hospital’s ability to attract more patients because the community is aware of its patient safety efforts and advanced training to improve care? What is the value of a simulation center’s ability to create exciting training opportunities that make the staff feel the system is investing in them, ultimately helping with recruiting of new staff along with retention of existing staff members?
What is the value, or potential, of the ability to avoid causing harm to patients, such as a mismanaged difficult airway, because of simulation training of the physicians and other providers who deliver such care? What is the value of litigation avoidance for the same topic?
Also, the value proposition of a successfully implemented simulation program for patient safety extinguishes itself over time if it significantly reduces or eliminates the underlying problem. This is the so-called phenomenon of safety being a dynamic non-event. Going back to the more concrete example of the airway: if your airway management mishap rates have been essentially zero over several years, the institutional memory may become fuzzy on why you invest so much money in difficult airway training. A conundrum, to be sure.
I think of fire alarm systems in the hospital as a similar situation. Let’s compare the two. Fire alarm systems detect or “discover” fires, begin to put the fire out, and disseminate the news. Simulation programs have the ability to “detect” or discover potential patient safety problems through the identification of latent threats, poor systems design, or staffing issues, for example. Once a threat is identified, the simulation program develops training that helps “put out” the patient safety threat. One could argue that the training itself is the dissemination of the information that a patient safety “fire” exists.
Fire alarm systems in hospitals cost hundreds of thousands, possibly millions, of dollars to install and run on an annual basis. But the chief financial officer never asks what the return on investment is. Why is that?
Well, perhaps it is a non-issue because fire alarm systems have successfully been written into law, regulations, building codes, and so on. Regulation is an interesting idea for simulation, to be sure, but probably not for a long time.
However, if you think about it beyond a regulatory requirement, the likelihood of a given fire alarm system actually saving a life is probably significantly lower than that of a well-integrated simulation program providing patient safety programs designed around the needs of the institution it serves. Admittedly, the image of hundreds of people trapped in a burning building is probably more compelling to the finance folks than one patient at a time dying from hypoxia due to a mismanaged difficult airway.
Do you really know what to do when the fire alarm system goes off in your hospital? I mean, we have little rituals, like closing the doors, etc. But what next? Do we run? If we run, do we run toward the fire? Or away from the fire? Do we evacuate all the patients? Do we individually call the fire department? Do we find hoses and start squirting out the fire?
When we conduct simulation-based training in hospitals aligned with the patient safety needs of the given institution, we are extinguishing or minimizing the chance that patients will suffer unintended harm. Simulation programs and attention to patient safety education are a critical part of the infrastructure of any hospital caring for patients.
The more we can expand upon this concept and allow our expertise in simulation to contribute to the overall mission of the institution in reducing potential harm to patients and hospital staff, the more likely we are to receive continuing support and be recognized as infrastructure important to providing the highest quality and safety to our patients.
Just like the fire alarm systems.
Debriefing in healthcare education is a specific type of communication designed to enhance learning through post hoc analysis and (ideally) structured conversation about an event. While there are many different styles and methods commonly described for use in healthcare simulation, there are generally some consistent principles. Common to the goals of just about every debriefing method is the attempt to ensure that the participants involved in the event leave with an understanding of the areas in which they performed well and the areas they could improve upon should they face a similar situation in the future.
Debriefing is not easy to do well, for a variety of reasons, and suffice it to say it generally improves with practice and a focus on improvement. Depending on the facilitator and/or the learner(s), many people struggle with ensuring learners depart the debriefing with a clear understanding of areas needing improvement. Other times, debriefers make the mistake of focusing only on the negative, forgetting to elucidate the things that may have been done well.
I believe we need to always incorporate the needs of the patient into the debriefing. The thought that the simulation benefits the patient should permeate throughout the planning of all events in healthcare simulation including the debriefing.
With the proliferation of simulation-based learning over the last two decades, there has been increased interest in faculty development and in training people to develop debriefing skills. Nearly every discussion of faculty training in the healthcare simulation space includes some discussion of the safe learning environment and student-centered learning. These concepts are embedded in nearly every discussion and every publication on debriefing and feedback.
Ostensibly, the safe learning environment refers to a facilitator controlling the environment of simulations and debriefings to provide a setting of comfort that encourages participants to share freely what is on their mind during the simulation and the debriefing, without fear of repercussion, ridicule, or reprisal. I also believe that it should encourage simulation faculty to remain vigilant for moments that need some sort of active facilitation to assist a participant thought to be struggling with an emotional or stressful stimulus from the situation.
Having been involved in the teaching of healthcare providers for almost thirty years, and thinking back to the late eighties, I personally participated in early “simulations” designed to “knock students off of their game.” Thus, I can certainly relate to, and applaud, the emergence of the concept of a safe learning environment.
However, I now believe that the concept of a student-centered approach to healthcare education contributes to the illusion that the student is the ultimate beneficiary of healthcare education programs. The concept has evolved because of a natural parental feeling of protection for students, along with the fact that experiential learning can be stressful. Balancing these factors can contribute to highly effective learning as well as a positive learning experience for the participant.
When applied to healthcare education, student-centered learning can be a bit misleading, perhaps even a bit irresponsible, insofar as it completely ignores the fact that the patient is the ultimate recipient of the educational efforts. It may be more comfortable for the faculty in the moment because the student is present and the patient is not. However, if you think about it, downstream it is likely incomplete and may ultimately do a disservice to both the learners and their patients.
The challenge is that when the pervasive thought process is student-centered, the culture, requisite curriculum and learning opportunity design will favor such a position. This can subtly influence the debriefing and interactions with participants in a way that fails to correct inaccurate or poor performance and/or reinforce decisions or actions that should be carried forward to actual care.
My colleagues and I have coined the term Patient-Centered Debriefing; I originally wrote about it on my simulation blog in 2013. In the training of debriefers and the modeling of debriefing, we encourage consideration of the needs of the patient, and this seems to pull the conversation toward a more appropriate anchor point. This slight shift in focus can also help humanize the situation beyond the needs of the learner. Taking on responsibility for the eventual care of an actual patient can shift the mindset of the instructor to ensure the real goals of the simulation are met.
What does patient-centered debriefing look like? On casual observation, it would appear the same as any other debriefing conducted with acceptable methods in 2017 under a premise of student-centered debriefing. The difference is that the facilitator(s), and perhaps the students, would be considering the ultimate patient outcomes associated with the learning objectives of the given scenario. Thus, if properly conducted, the facilitator(s) would be less likely to gloss over or omit reconciliation of mistakes and/or errors of commission or omission that occurred during a simulation and that would likely contribute to adverse sequelae for the patient in a comparable actual healthcare setting. Simultaneously, however, the facilitator maintains the enshrined traditional “safe learning environment.”
Considering the needs of the patient is a subtle reminder that it is our job as healthcare educators to best prepare learners for this reality, and that the time we have to do it in is precious. Further, particularly in simulation-based learning, it should be an ever-present reminder that this is our ultimate purpose. I think it is particularly important for simulation facilitators who are not actively involved in the care of patients to consider this position. This is not to suggest that they are not doing a great job, but it seems like a reasonable active reminder to consider the needs of the patients who will be cared for by the learners involved in the simulation.
I am not suggesting that we abandon attention to providing a safe learning environment in simulated as well as clinical learning environments. I do believe that it contributes to effective learning, particularly in the simulated setting. But I also believe that we need to reconsider the concept of student-centered learning insofar as the student is thought of as the epicenter of the overall education process and outcomes.
Reserving the definition and concepts of student centricity for considering scholarly needs, learning styles, designs, and appeals to intrinsic motivating factors seems more appropriate. Any learning program in healthcare is far better off having a patient-centered axis from which all other actions and designs emerge.
I invite you to consider adopting a patient-centered debriefing into your work!
All too often it is easy to be stuck in a mindset that creates tunnel vision. In the simulation world, one such blind spot can come from overall short-sightedness about the usefulness, power, wisdom, and change that can result from well-run simulation efforts. Many people have heard the adage “with simulation, it is within the debriefing that all the learning occurs.” While phrases like this are meant to underscore the importance of the debriefing following a simulation, taken too literally they can result in a failure to recognize the total value of simulation program investments and contributions.
This phenomenon is prevalent when evaluating the impact of simulation programs as part of patient safety efforts in healthcare systems and hospitals. In-situ simulation programs, or mock code evaluation programs, are of unquestionable value to those of us in patient safety leadership roles. Undoubtedly, learning can occur during the simulation itself, as I discussed in a previous blog post. Further, we all recognize the value of learning that can occur during well-run debriefing sessions. Lastly, and perhaps most importantly, great value can come from the information obtained during the simulation.
Scenario and debriefing sessions in in-situ and other simulation programs that involve practicing professionals as participants have their limitations. First, and most practically, is the operational recognition that healthcare professionals can only be kept “off-line” for a certain period of time to accomplish the simulation and debriefing. Second, some topics may be more sensitive than others and are not appropriate to address directly with individuals during a debriefing that involves peers and other healthcare colleagues. This point should be considered when evaluating the politics and perceptions of your in-situ programs as received by the staff. Lastly, when you execute such a simulation, there is only so much that can be absorbed at one point in time before cognitive overload becomes a significantly limiting factor.
Thinking traditionally from a “simulationist” point of view, it is easy to think that all of the learning comes from the performance of the simulation combined with the debriefing. But with structure, planning, and a systems-based approach to the simulation efforts, data can be gathered and analyzed to help a given hospital or health system understand the capabilities and limitations of its various clinical delivery systems. This can be invaluable learning for the system itself, which can then be incorporated into a plan of change to improve safety or, in other cases, efficiency in the delivery of care.
The given plan of change may incorporate additional educational efforts or policy, procedure, or process changes that will be made in a more informed way than if the data from the simulation were not available. To garner such useful information at a systems level, it is important that the curriculum integration be developed with consistent measurement strategies, objectives, and tools that will allow meaningful information to accrue.
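The practical point about consistent measurement strategies can be illustrated with a toy sketch: if every in-situ event records the same fixed item names, results can accrue across events and the most frequent failures surface as candidate latent threats. The item names and event data below are invented for illustration only.

```python
from collections import Counter

# Each in-situ event is scored against the same fixed item names (a consistent
# measurement strategy), so results can accrue and be compared across events.
# These items and outcomes are hypothetical examples.
events = [
    {"defibrillator located < 60 s": False, "airway cart fully stocked": True},
    {"defibrillator located < 60 s": False, "airway cart fully stocked": True},
    {"defibrillator located < 60 s": True,  "airway cart fully stocked": False},
]

failures = Counter()
for event in events:
    for item, passed in event.items():
        if not passed:
            failures[item] += 1

# Items failing most often are candidate latent threats for the safety team.
for item, count in failures.most_common():
    print(f"{item}: failed in {count}/{len(events)} events")
```

Note that this kind of accrual only works because the item names and scoring rules are identical from event to event; ad hoc, scenario-by-scenario measures would leave nothing comparable to aggregate.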
A well-planned, needs-based, targeted implementation strategy will create larger value than simulation efforts occurring in a silo, disconnected from a larger strategic plan of improvement. If you think about a simulation event, it is easy to picture small groups of people learning a great deal from participating in the scenario or program. Simulation has the unique capability to abstract information that provides insight into the aspects of patient care that go smoothly, while simultaneously identifying opportunities for improvement alongside the deployment of useful learning.
Once these opportunities are catalogued and recognized, a transformation of greater scale can take place through careful planning and implementation of further patient safety efforts with defined targets. Partnering with your risk management or patient safety colleagues to work on the integration plan can be valuable for increasing leadership buy-in for supporting your simulation efforts.
So I challenge you! If you are running simulations in situ, make sure that you keep in mind that your educational efforts during the simulation scenario are part of a bigger picture of increasing the safety and/or efficiency of care provided to patients, thus bringing a higher return on investment for the simulation efforts you are conducting.
Until next time…… Happy Simulating!
Many people design scoring instruments for simulation encounters as part of an assessment plan. They are used for various reasons, ranging from tools to help provide feedback, to research purposes, to high-stakes pass/fail criteria. Enhancing the ability of assessment tools to function as intended is often closely linked to scenario design.
Oftentimes checklists are employed. When designing checklists, it is critical that you ask the question, “Can I accurately measure this?” It is easy to design checklists that seem intuitively simple and filled with common sense (from a clinical perspective) but that cannot actually measure what you think you are evaluating.
It is quite common to see checklists with items such as “Observes Chest Rise,” “Identifies Wheezing,” or “Observes Heart Rate.” During faculty training sessions focusing on assessment tool development, we routinely run scenarios that contain deliberate errors of omission. Some of these items are nonetheless routinely scored, or “checked,” as completed. Why is this? Part of the answer is that we interject our own clinical bias into what we think the simulation participant is doing or thinking. This raises the possibility that we are not measuring what we intend to measure, or assess.
Consider two checklist items for an asthma scenario: one is “Auscultates Lung Sounds”; another is “Correctly Interprets Wheezing.” The former we can reasonably infer by watching the scenario and seeing the participant listen to the lung fields on the simulator. The latter, however, is more complicated. We don’t know whether the participant recognized wheezing just by watching them listen to the lungs. Many people would check yes for “Correctly Interprets Wheezing” if the next thing the participant did was order a bronchodilator. This would be an incorrect assumption, but it could be rationalized in the mind of the evaluator because of a normal clinical sequence and context.
However, it may be completely wrong: the participant may never have interpreted the sounds as wheezing, but ordered a treatment because of the history of asthma. Or what happens if the bronchodilator is ordered before auscultation of the lungs? What you have, by itself, is an item on your checklist that seems simple enough but is practically unmeasurable through simple observation.
This is where linking scenario design and assessment tools can come in handy. If the item you are trying to assess is a critical element of the learning and assessment plan, perhaps something in the simulation, in the transition out of it, or during the debriefing can cause the information to be made available to more correctly and accurately assess the item.
A real-time assessment during the flow of the scenario is possible within the design of the scenario itself. Perhaps insert a confederate playing a nurse caring for the patient, scripted to ask “What did you hear?” after the participant auscultates the lung fields. This forces the data to become available during the scenario for the assessor to act upon; hence the term forcing function.
Another possibility would be to have the participant complete a patient note on the encounter and evaluate their recording of the lung sounds. Another would be simply to have the participant write down their interpretation of the lung sounds. Or perhaps embed the question into the context of the debriefing. Any of these methods would provide a more accurate evaluation of the assessment item “Correctly Interprets Wheezing.”
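One way to audit a checklist for this problem is to record, for each item, whether it is directly observable and, if not, what design fix (forcing function) surfaces the needed data. The sketch below is a hypothetical illustration; the item wording and the `forcing_function` field are assumptions for the example, not part of any published instrument.

```python
from dataclasses import dataclass

@dataclass
class ChecklistItem:
    text: str
    directly_observable: bool   # can a rater score this from observation alone?
    forcing_function: str = ""  # scenario/debrief design that surfaces the data

# Hypothetical items from the asthma scenario discussed above.
items = [
    ChecklistItem("Auscultates lung sounds", directly_observable=True),
    ChecklistItem("Correctly interprets wheezing", directly_observable=False,
                  forcing_function='Embedded nurse asks "What did you hear?"'),
    ChecklistItem("Recognizes hypoxia", directly_observable=False),
]

# Flag items that cannot be scored by watching alone and have no design fix yet.
unmeasurable = [i.text for i in items
                if not i.directly_observable and not i.forcing_function]
print(unmeasurable)  # ['Recognizes hypoxia']
```

Running this kind of audit during tool development makes the unmeasurable items explicit, so each one either gets a forcing function added to the scenario design or gets moved to a note, write-down, or debriefing question as described above.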
While not trying to create an exhaustive list of methods, I am trying to provide two things in this post. One is to have you critically evaluate your ability to accurately assess something that occurs within a scenario. The second is to recognize that the creation of successful, reliable, and valid assessment instruments is linked directly to scenario design. This can occur during the creation of the scenario, or as a modification to an existing scenario to enhance its assessment capabilities.
This auscultation item serves as a simple example. Recognizing the challenges of accurately assessing a participant’s performance is important for developing robust, valid, and reliable tools. The next time you see or design a checklist or scoring tool, ask yourself: Can I really, truly evaluate that item accurately? If not, can I modify the scenario or debriefing to force the information to be made available?
They are everywhere nowadays, like characters in Pokémon Go. They seem to hang out in high concentration around new simulation centers.
You know the type. Usually they start off by saying how terrible it is for someone to give a lecture. Then they go on to espouse the virtues and values of student-centered education, claiming active participation and small group learning is the pathway to the glory land. They often toss in terms like “flipped classroom.” And just to ensure you don’t question their educational expertise, they use a word ending in “-gogy” in the same paragraph as the phrase “evidence-based.”
If you ask them where they have been in the last six months, you find out that they probably went to a weekend healthcare education reform retreat or something equivalent…
My principal concern with today’s educational evangelist is that they are in search of a new way of doing everything. Oftentimes they recommend complete and total overhauls of existing curricula without a true understanding of how to improve them efficiently and effectively, or an analysis of the resources required to carry out such changes.
Further, the evangelist usually has a favorite methodology, such as “small group learning,” “problem-based learning,” or “simulation-based learning,” to which they are trying to convert everyone through prophecy.
An easy target of all educational evangelists is the lecture, and often that is where the prophecy begins. They usually want to indicate that if lecture is happening, learning is not. As I discussed in a previous blog article, lecture is not dead; when done well, it can be quite engaging, can create significant opportunities for learning, and is maximally efficient in terms of resources.
If you think about it critically, it is just as easy to do lousy small group facilitation as it is to give a lousy lecture, and the potential gains in learning will likewise go unrealized. The difference is that small group facilitation, like simulation, generally takes significantly more faculty resources.
The truth is the educational evangelist is a great person to have amongst the team. Their desire for change, generally coupled with significant passion, is often a source of great energy. When harnessed, they can help advance and revise curricula to maximize and modernize various educational programs.
However, to be maximally efficient, all significant changes should undergo pre-analysis, ideally derived from a needs assessment, whether formal or informal. Secondly, it is worth having more than one opinion when deciding the prioritization of what needs to change in a given curriculum. While the evangelist will suggest that the entire curriculum is broken, a more balanced review often reveals that some areas of the curriculum would benefit from such an overhaul while other aspects are performing just fine.
When you begin to change aspects of the curriculum, start small and measure the change if possible. Moving forward on a step-by-step basis will usually produce a far better revised curriculum than an approach that throws the baby out with the bathwater. Mix the opinions of the stalwarts of the existing curriculum with those of the evangelists. Challenge existing axioms, myths, and entrenched beliefs like “Nothing can replace the real patient for learning….” If this process is led well, it will allow the decision-making group to reach a considerably more informed position that will lead to sound decisions, change strategies, and appropriately guided investments.
So if you’re the leader or a member of a team responsible for a given curriculum of healthcare instruction and confronted with the educational evangelist, welcome their participation. Include them in the discussions moving forward with a balanced team of people, and have them strive to create an objective prioritization of the needs for change. This will allow you to make excellent decisions with regard to new technologies and/or methods that you should likely embrace for your program. More importantly, you will avoid tossing out the things that are working and are cost efficient.
Most people have heard someone say, “In simulation, debriefing is where all of the learning occurs.” I frequently hear this when running faculty development workshops and programs, which isn’t as shocking as hearing it espoused at national and international meetings in front of large audiences! It is a ridiculous statement, without a shred of evidence or common-sense reasoning to support it. Sadly, I fear it represents an unfortunate instructor-centered perspective and/or a serious lack of appreciation for the potential learning opportunities provided by simulation based education.
Many people academically toil over the technical definition of the word feedback and try to contrast it with a description of debriefing as if the two are juxtaposed. They often present it as if one is good and the other is bad. There is a misguided notion that feedback is telling someone, or lecturing to someone, to get a point across. I believe that is a narrow interpretation of the word. I think there are tremendous opportunities for learning from many facets of simulation that may be considered feedback.
Well-designed simulation activities hopefully provide targeted learning opportunities, part of which is experiential, sometimes immersive, in some way. I like to think of debriefing as one form of feedback that a learner may encounter during simulation based learning, commonly occurring after engaging in some sort of immersive learning activity or scenario. Debriefing can be special if done properly: it can allow the learner to “discover” new knowledge, reinforce existing knowledge, or even correct inaccurate knowledge. But no matter how you look at it, at the end of the day it is a form of feedback that can lead to, or contribute to, learning. To think that the debriefing is the only opportunity for learning is incredibly short-sighted.
There are many other forms of feedback and learning opportunities that learners may experience in the course of well-designed simulation based learning. The experience of the simulation itself is ripe with opportunities for feedback. If a learner puts supplemental oxygen on a simulated patient that is demonstrating hypoxia on the monitor via the pulse oximetry measurements, and the saturations improve, that is a form of feedback. Conversely, if the learner(s) forget to provide the supplemental oxygen and the saturations or other signs of respiratory distress continue to worsen, that can be considered feedback as well. These two examples are what I refer to as intrinsic feedback, as they are embedded in the scenario design to provide clues to the learners and to approximate what may happen to a real patient in a similar circumstance.
With regard to intrinsic feedback, it is only beneficial if it is recognized and properly interpreted by the learner(s), either while actively involved in the simulated clinical encounter or, if not, in the debriefing. The latter should be employed if the intrinsically designed feedback is important to accomplishing the learning objectives germane to the simulation.
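For scenario programmers, intrinsic feedback amounts to scripting the simulated patient’s state so it responds to (or deteriorates without) the expected intervention. The minimal sketch below illustrates the idea only; the class, method names, and vital-sign numbers are hypothetical and not drawn from any real simulator API:

```python
class SimulatedPatient:
    """Toy model of a hypoxic patient whose saturations react to treatment."""

    def __init__(self, spo2=88):
        self.spo2 = spo2        # pulse oximetry reading shown on the monitor
        self.on_oxygen = False

    def apply_oxygen(self):
        """Learner's intervention: place supplemental oxygen."""
        self.on_oxygen = True

    def tick(self):
        """Advance the scenario one time step and return the new reading."""
        if self.on_oxygen:
            self.spo2 = min(98, self.spo2 + 3)  # saturations improve: positive intrinsic feedback
        else:
            self.spo2 = max(70, self.spo2 - 2)  # distress worsens: intrinsic feedback as well
        return self.spo2
```

Whether scripted in software or driven by an operator, the design principle is the same: the clue is embedded in the scenario itself, available to the learner before any debriefing happens.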
There are still other forms of feedback that likely contribute to learning and are not part of the debriefing. In a simulated learning encounter involving several learners, the delineation of duties and the acceptance or rejection of treatment suggestions are all potentially ripe for learning. If a learner suggests a therapy that is embraced by the team, or that stimulates a group discussion during the course of the scenario, the resultant conversation and ultimate decision can significantly add to the learning of the involved participants.
Continuing that same idea, perhaps the decision to provide, withhold, or check the dosage of a particular therapy prompts a learner to consult a reference that provides valuable information, solidifying a piece of knowledge in the mind of the learner. The learner may announce such findings to the team while the scenario is still underway, thereby sharing the knowledge with the rest of the treatment team. Voilà… more learning that may occur outside of the debriefing!
Finally, I believe there is an additional source of learning that occurs outside of the debriefing. Imagine a learner experiences or becomes aware of something during a scenario that causes them to realize they have a knowledge gap in that particular area. Maybe they forgot a critical drug indication, dosage, or adverse interaction. Perhaps something simply stimulated their natural curiosity. It is possible that those potential learning items are not covered in the debriefing, as they may not be core to the learning objectives. This may stimulate the learner to engage in self-study to close that perceived knowledge gap. What???? Why yes, more learning outside of the debriefing!
In fact, we hope that this type of stimulation occurs on a regular basis as part of the active learning prompted by the experiential aspects of simulation. Such individual stimulation of learning is identified in the seminal publication of Dr. Barry Issenberg et al. in Vol. 27 of Medical Teacher in 2005 describing the key features of effective simulation.
So hopefully I have convinced you, or reinforced your belief, that the potential for learning from simulation based education spans far beyond the debriefing. Please recognize that the statement quoted above likely reflects a serious misunderstanding and underappreciation of the learning that can and should be designed into the use of simulation. Such short-sightedness can have huge impacts on the efficiency and effectiveness of simulation, beginning with curriculum and scenario design.
So the next time you are incorporating simulation into your educational endeavor, sit back and think of all of the points at which learning may occur. Of course, the debriefing is one such activity during which we hope learning occurs. Thinking beyond the debriefing and designing for the bigger picture of potential learning that participants can experience is likely to help you achieve positive outcomes from your overall efforts.
One of the challenges for healthcare education is the reliance on random opportunity for clinical events to present themselves to a given group of learners as part of a structured learning curriculum. This uncertainty of exposure, and of the eventual development of competency, is part of what keeps our educational systems time-based, which is fraught with inefficiencies by its very nature.
Simulation curriculum design at present often embeds simulation in a rather immature development model: an “everybody does all of the simulations” approach. If there is a collection of core topics that are part and parcel of a given program, combined with a belief, or perhaps proof, that simulation is a preferred modality for the topic, then those exposures make sense. But let’s move beyond the topics or situations that are best experienced by everyone.
Take the model of physician residency training as an example: curriculum planners “hope” that over the course of a year a given first-year resident will adequately manage an appropriate variety of cases. The types of cases, often categorized by primary diagnosis, are embedded in some curriculum accreditation document under the label “Year 1.” For the purposes of this discussion, let’s change the terminology from Year 1 to Level 1 as we look toward the future.
What if we had a way to know that a resident managed the cases, and managed them well, for Level 1? Perhaps one resident could accomplish the Level 1 goals in six months, and do it well. Let’s call that resident Dr. Fast. This could lead to advancing the resident through the training program as appropriate, rather than by the date on the calendar.
Now let’s think about it from another angle. A second resident hasn’t quite seen all of the cases, or the variety of cases needed, but manages things well when they do. Let’s call them Dr. Slow. A third resident is managing an adequate number and variety of cases but is having quality issues. Let’s refer to them as Dr. Mess. An honest assessment of the current system is that all three residents will likely be advanced to higher levels of responsibility based on the calendar, without a substantial attempt at remediation or at understanding the underlying deficiencies.
What are the program or educational goals for Drs. Fast, Slow, and Mess? What are the differences? What are the similarities? What information does the program need to begin thinking in this competency-based model? Is that information available now? Will it likely be in the future? Does it make sense to spend time and resources putting all three residents through the same simulation curriculum?
While many operational, cultural, and historical models and working conditions present barriers to such a model, thinking about a switch to a competency-based model forces one to think more deeply about the details of the overall mission. The educational methods, assessment tools, and exposure to cases and environments should all be explored for both efficiency and effectiveness. Ultimately, the outcomes we are trying to achieve for a given learner progressing through a program would be unveiled. Confidence in the underlying data will be a fundamental component of a competency-based system. In this simple model, the two functional data points are the quantity and quality of given opportunities to learn and demonstrate competence.
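The decision logic implied by those two data points can be sketched in a few lines. This is an illustrative toy only — the thresholds, labels, and function name are hypothetical, not part of any accreditation framework:

```python
def progress_profile(cases_managed, quality_score,
                     required_cases=50, required_quality=0.8):
    """Classify a resident's progress from case quantity and quality.

    Thresholds are made-up placeholders; a real program would derive
    them from its own accreditation requirements and assessment data.
    """
    enough_cases = cases_managed >= required_cases
    good_quality = quality_score >= required_quality
    if enough_cases and good_quality:
        return "advance"                # Dr. Fast: ready, regardless of the calendar
    if good_quality:
        return "supplement exposure"    # Dr. Slow: add simulations for case variety
    return "targeted remediation"       # Dr. Mess: deliberate practice with feedback
```

The point is not the code but the shift it represents: with trustworthy quantity and quality data, the simulation curriculum can be allocated per learner rather than uniformly.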
This sets up intriguing possibilities for the embedding of simulation into the core curriculum to function in a more dynamic way and contribute mightily to the program outcomes.
Now think of the needs of Dr. Slow and Dr. Mess. If we had insight combined with reliable data, we could customize the simulation pathway to maximally benefit each learner’s progression through the program. We may need to provide supplemental simulations to Dr. Slow to allow practice with a wider spectrum of cases, or with a specific diagnosis, category of patient, or situation to which they need exposure. Ideally, this additional exposure providing deliberate practice opportunities could also include learning objectives to help them increase their efficiency.
In the case of Dr. Mess, the customization of the simulation portion of the curriculum provides deliberate practice opportunities with targeted feedback directly relevant to their area(s) of deficiency, i.e., a remediation model. This exposure could be constructed around a certain category of patient, or perhaps a situation, that they are reported to handle poorly. The benefit in the case of Dr. Mess is that the simulated environment can often be used to tease out the details of the underlying deficiency in a way that the actual patient care environment cannot.
Lastly, in our model, recall that Dr. Fast may not require any “supplemental” simulation at all, freeing up scarce simulation resources and the human resources necessary to conduct it. This is part of the gain in efficiency that can be realized through a competency-based approach to incorporating simulation into a given curriculum.
Considering a switch to a competency-based curriculum in healthcare education can be overwhelming simply because of the number of operational and administrative challenges. However, using competency-based implementation as a theoretical model can help us envision a more thoughtful approach to the curricular integration of simulation. If we move forward in a deliberate attempt to utilize simulation in a more dynamic way, it will lead to increases in efficiency and effectiveness, along with better stewardship of scarce resources.