Tag Archives: design

Recreating Reality is NOT the goal of Healthcare Simulation

This post discusses the real goals of healthcare simulation as they relate to the education of individuals and teams. The primary focus should not be on recreating reality, but on providing an experience adequate to allow deep reflection and learning. This will help you achieve more from your simulation efforts!

 


Learning from Simulation – Far more than the Debriefing

Most people have heard someone say, “In simulation, the debriefing is where all of the learning occurs.” I frequently hear this when running faculty development workshops and programs, which isn’t as shocking as hearing it espoused at national and international meetings in front of large audiences! It is a ridiculous statement, without a shred of evidence or a common-sense reason to think it would be so. Sadly, I fear it represents an unfortunate instructor-centered perspective and/or a serious lack of appreciation for the potential learning opportunities provided by simulation-based education.

Many people academically toil over the technical definition of the word feedback and try to contrast it with a description of debriefing, as if the two are opposites. They often present them as if one is good and the other is bad. There is a misguided notion that feedback means telling someone something, or lecturing at them to get a point across. I believe that is a narrow interpretation of the word. I think there are tremendous opportunities for learning from many facets of simulation that may be considered feedback.

Well-designed simulation activities provide targeted learning opportunities, part of which is experiential and often immersive in some way. I like to think of debriefing as one form of feedback that a learner may encounter during simulation-based learning, commonly occurring after engaging in some sort of immersive learning activity or scenario. Debriefing can be special if done properly: it allows the learner to “discover” new knowledge, reinforce existing knowledge, or even have inaccurate knowledge corrected. No matter how you look at it, at the end of the day it is a form of feedback that can lead to, or contribute to, learning. But to think that the debriefing is the only opportunity for learning is incredibly short-sighted.

There are many other forms of feedback and learning opportunities that learners may experience in the course of well-designed simulation-based learning. The experience of the simulation itself is rife with opportunities for feedback. If a learner puts supplemental oxygen on a simulated patient whose monitor shows hypoxia via the pulse oximetry measurements, and the saturations improve, that is a form of feedback. Conversely, if the learner(s) forgets to provide the supplemental oxygen and the saturations or other signs of respiratory distress continue to worsen, that can be considered feedback as well. These two examples are what I refer to as intrinsic feedback, as they are embedded in the scenario design to provide clues to the learners, as well as to approximate what may happen to a real patient in a similar circumstance.

With regard to intrinsic feedback, it is only beneficial if it is recognized and properly interpreted by the learner(s), either while actively involved in the simulated clinical encounter or, if not, perhaps in the debriefing. The latter should be employed if the intrinsically designed feedback is important to accomplishing the learning objectives germane to the simulation.
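To illustrate what designed-in intrinsic feedback can look like mechanically, here is a minimal, purely hypothetical sketch in Python (not tied to any real simulator platform or API; all names and numbers are illustrative assumptions) of scenario logic in which the simulated saturations respond to whether the learners have applied supplemental oxygen:

```python
# Hypothetical sketch of intrinsic feedback embedded in scenario design.
# Not tied to any real simulator API; names and values are illustrative only.

from dataclasses import dataclass

@dataclass
class SimulatedPatient:
    spo2: int = 86                  # starting saturation, demonstrating hypoxia
    on_supplemental_o2: bool = False

    def advance_one_minute(self) -> None:
        """Each simulated minute, the vitals respond to the learners' actions."""
        if self.on_supplemental_o2:
            self.spo2 = min(self.spo2 + 3, 99)   # improvement is the intrinsic feedback
        else:
            self.spo2 = max(self.spo2 - 2, 70)   # continued deterioration is feedback too

patient = SimulatedPatient()
patient.advance_one_minute()        # learners have not yet acted: saturations drift down
patient.on_supplemental_o2 = True   # a learner applies supplemental oxygen
patient.advance_one_minute()        # saturations begin to improve during the scenario
print(patient.spo2)
```

The point is simply that the improvement or deterioration itself is the feedback, delivered during the scenario rather than waiting for the debriefing.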

There are still other forms of feedback that likely contribute to learning and are not part of the debriefing. In a simulated learning encounter involving several learners, the delineation of duties and the acceptance or rejection of treatment suggestions are all potentially rich with learning. If a learner suggests a therapy that is embraced by the team, or that stimulates a group discussion during the course of the scenario, the resultant conversation and ultimate decision can significantly add to the learning of the involved participants.

Continuing that same idea, perhaps the decision to provide, withhold, or check the dosage of a particular therapy prompts a learner to consult a reference that provides valuable information and solidifies a piece of knowledge in the mind of the learner. The learner may announce such findings to the team while the scenario is still underway, thereby sharing the knowledge with the rest of the treatment team. Voilà… more learning that may occur outside of the debriefing!

Finally, I believe there is an additional source of learning that occurs outside of the debriefing. Imagine a learner experiences or becomes aware of something during a scenario that causes them to realize they have a knowledge gap in that particular area. Maybe they forgot a critical drug indication, dosage, or adverse interaction. Perhaps something simply stimulated their natural curiosity. It is possible that those potential learning items are not covered in the debriefing, as they may not be core to the learning objectives. This may stimulate the learner to engage in self-study to close that perceived knowledge gap. What???? Why yes, more learning outside of the debriefing!

In fact, we hope that this type of stimulation occurs on a regular basis as part of active learning prompted by the experiential aspects of simulation. Such individual stimulation of learning is identified in the seminal publication of Dr. Barry Issenberg et al. in Vol. 27 of Medical Teacher in 2005 describing the key features of effective simulation.

So hopefully I have convinced you, or reinforced your belief, that the potential for learning from simulation-based education spans far beyond the debriefing. Please recognize that the statement quoted above likely reflects a serious misunderstanding and underappreciation of the learning that can and should be considered with the use of simulation. Such short-sightedness can have huge impacts on the efficiency and effectiveness of simulation, beginning with curriculum and scenario design.

So the next time you are incorporating simulation into your education endeavor, sit back and think of all of the points at which learning may occur. Of course, the debriefing is one such activity during which we hope learning occurs. Thinking beyond the debriefing and designing for the bigger picture of potential learning that participants can experience is likely to help you achieve positive outcomes from your overall efforts.


Simulation Curriculum Integration via a Competency Based Model

One of the challenges for healthcare education is the reliance on random opportunity for clinical events to present themselves for a given group of learners to encounter as part of a structured learning curriculum. This uncertainty of exposure and eventual development of competency is part of what keeps our educational systems time-based, which is fraught with inefficiencies by its very nature.

Simulation curriculum design at present often embeds simulation in a rather immature development model in which there is an “everybody does all of the simulations” approach. If there is a collection of core topics that are part and parcel of a given program, combined with a belief, or perhaps proof, that simulation is a preferred modality for the topic, then it makes sense for everyone to have those exposures. But let’s move beyond the topics or situations that are best experienced by everyone.

Using physician residency training as an example, curriculum planners “hope” that over the course of a year a given first-year resident will adequately manage an appropriate variety of cases. The types of cases, often categorized by primary diagnosis, are embedded in some curriculum accreditation document under the label “Year 1.” For the purposes of this discussion, let’s change the terminology from Year 1 to Level 1 as we look toward the future.

What if we had a way to know that a resident managed the Level 1 cases, and managed them well? Perhaps one resident could accomplish the Level 1 goals in six months, and do it well. Let’s call that resident Dr. Fast. This could then lead to a more appropriate advancement of the resident through the training program, as opposed to advancing by the date on the calendar.

Now let’s think about it from another angle. Another resident hasn’t quite seen the number or variety of cases needed, but manages things well when they do. Let’s call them Dr. Slow. A third resident is managing an adequate number and variety of cases, but is having quality issues. Let’s refer to them as Dr. Mess. An honest assessment of the current system is that all three residents will likely be advanced to higher levels of responsibility based on the calendar, without any substantial attempt at remediation or understanding of the underlying deficiencies.

What are the program or educational goals for Drs. Fast, Slow, and Mess? What are the differences? What are the similarities? What information does the program need to begin thinking in this competency-based model? Is that information available now? Will it likely be in the future? Does it make sense to spend time and resources putting all three residents through the same simulation curriculum?

While many operational, cultural, and historical models and working conditions present barriers to such a model, thinking about a switch to a competency-based model forces one to think more deeply about the details of the overall mission. The educational methods, assessment tools, and exposure to cases and environments should be explored for both efficiency and effectiveness. Ultimately, the outcomes we are trying to achieve for a given learner progressing through a program would be unveiled. Confidence in the underlying data will be a fundamental component of a competency-based system. In this simple model, the two functional data points are the quantity and quality of a given learner’s opportunities to learn and demonstrate competence.
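As a loose illustration of those two data points, here is a minimal sketch in Python (the thresholds, categories, and labels are hypothetical assumptions, not drawn from any accreditation standard) of how the quantity and quality of managed cases might be tracked and used to distinguish the three residents described above:

```python
# Hypothetical sketch of a competency-based tracking model.
# Thresholds and categories are illustrative assumptions, not an accreditation standard.

from dataclasses import dataclass, field

REQUIRED_CASES_PER_CATEGORY = 5   # assumed quantity target for "Level 1"
QUALITY_THRESHOLD = 0.8           # assumed minimum acceptable average quality score

@dataclass
class ResidentRecord:
    name: str
    # case category -> quality scores (0.0 - 1.0) for each case managed
    cases: dict[str, list[float]] = field(default_factory=dict)

    def log_case(self, category: str, quality: float) -> None:
        self.cases.setdefault(category, []).append(quality)

    def level_one_status(self, categories: list[str]) -> str:
        """Classify progress using the two data points: quantity and quality."""
        enough = all(
            len(self.cases.get(c, [])) >= REQUIRED_CASES_PER_CATEGORY for c in categories
        )
        scores = [s for c in categories for s in self.cases.get(c, [])]
        good = bool(scores) and (sum(scores) / len(scores)) >= QUALITY_THRESHOLD
        if enough and good:
            return "ready to advance"             # the Dr. Fast pattern
        if good:
            return "needs supplemental exposure"  # Dr. Slow: quality fine, quantity/variety lacking
        return "needs targeted remediation"       # Dr. Mess: quantity may be fine, quality is not
```

A record like this, if the underlying data could be trusted, is what would let a program route Dr. Slow and Dr. Mess into different simulation pathways while allowing Dr. Fast to advance.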

This sets up intriguing possibilities for the embedding of simulation into the core curriculum to function in a more dynamic way and contribute mightily to the program outcomes.

Now think of the needs of Dr. Slow and Dr. Mess. If we had insight combined with reliable data, we could customize the simulation pathway for each learner to maximally benefit their progression through the program. We may need to provide supplemental simulations to Dr. Slow to allow practice with a wider spectrum of cases, or with a specific diagnosis, category of patient, or situation to which they need exposure. Ideally, this additional exposure providing deliberate practice opportunities could also include learning objectives to help them increase their efficiency.

In the case of Dr. Mess, the customization of the simulation portion of the curriculum would provide deliberate practice opportunities with targeted feedback directly relevant to their area(s) of deficiency, i.e., a remediation model. This exposure for Dr. Mess could be constructed to present a certain category of patient, or perhaps a situation, that they are reported to handle poorly. The benefit in the case of Dr. Mess is that the simulated environment can often be used to tease out the details of the underlying deficiency in a way that the actual patient care environment cannot.

Lastly, recall that in our model Dr. Fast may not require any “supplemental” simulation, thus freeing up scarce simulation resources and the human resources necessary to conduct it. This is part of the gain in efficiency that can be realized through a competency-based approach to incorporating simulation into a given curriculum.

Considering a switch to a competency-based curriculum in healthcare education can be overwhelming simply because of the number of operational and administrative challenges. However, using a competency-based implementation as a theoretical model can help us envision a more thoughtful approach to the curricular integration of simulation. If we move forward in a deliberate attempt to utilize simulation in a more dynamic way, it will lead to increases in efficiency and effectiveness, along with better stewardship of scarce resources.

 


Lecture: It’s not Dead Yet

Fellow simulationists, let’s get real. We should not be the enemy of lecture. Lecture is a very valuable form of education. What we should be campaigning against are bad lectures, and the use of lecture when it isn’t the best tool for the educational task at hand.

We have all listened to lectures that were horrific and/or delivered by speakers with horrific public speaking or presenting skills. But in essence, a good lecture can be an incredibly efficient transfer of information. The one-to-many configuration inherent in the lecture format can lead to an amazing amount of material being covered, interpreted, and/or organized by the presenter to raise the level of knowledge or understanding of the people in attendance.

Like anything else in education, we need to stratify the needs of what we are trying to teach and create solutions by which to teach them. With regard to lecture as a tool, we need to find ways to engage the audience in active participation to enhance the comprehension, learning, and attention of the participants. There are many tools available for this, some involving technology, some not. The onus is on the presenter to seek out techniques, technologies, or creative ways to draw the audience into an active learning process.

I don’t think of simulation as an alternative to, or better way to teach than, lecture. I view lecture and simulation as two different tools available to the educational design process to effect good learning, much the same way that I would not say a screwdriver is a better tool than a pair of pliers.

Too many times at simulation meetings and in discussions with simulation enthusiasts, I hear lecture bashed as if it were old-school, outmoded, or something lacking value. During these conversations it becomes readily apparent that the person speaking doesn’t have full command of the fact that the main goal is education, not simulation, and that there are many ways to create effective learning environments.

Now, lecture can deservedly get a bad rap. Go to a meeting and listen to a boring, monotonous speaker drone on and read from their PowerPoint slides while not even recognizing that there is an audience in front of them. Unfortunately, that is still more common than not at many physician and nursing meetings. Or worse yet, in the new age of converting to flipped classrooms and online learning, people are taking the easy way out by recording videos of lectures, plopping them online, and calling it online learning. How pitiful. How painful. The only thing I can imagine worse than a bad lecture in person is a bad lecture in web-based learning that I would have to suffer through.

So I still teach and lead workshops on helping people enhance their lecturing and presentation skills, in part because I recognize that not only will lecture be around for a long time, it should be around for a long time, because it CAN be incredibly powerful with the right preparation and in the right hands. I also continue to see the value of modern healthcare education efforts being carefully thought out to understand which tool is best for which phase of learning, after careful evaluation of the intended learner group and the topic at hand.

We need to end the silo-like thinking that simulation is better than lecture and convert to a more outcomes-oriented thought process that evaluates and implements the appropriate educational tool for the intended educational accomplishments.

So let’s commit to each other to never run a simulation when an engaging lecture would be just as effective, AND let’s all agree to never give a lecture that sucks.


Are Routine Pre-and Post Simulation Knowledge Tests Ethical? Useful? To whom?

Disclaimer (before you read on): This post is not referring to research projects that have been through an institutional review board or other ethics committee review.

What I am actually referring to is the practice of many simulation programs of administering a routine written pre-test, followed by a written post-test, to attempt to document a change in the learners’ knowledge as a result of participating in the simulation. This is a very common practice among simulation programs. It seems the basis of such testing is to eventually be able to use the anticipated increase in knowledge as a justification for the effectiveness of the simulation-based training.

However, we must stop and wonder: is this ethical? As described in some of my previous posts, I believe there is a contract between the participants of simulation encounters and those who are the purveyors of such learning activities. As part of this contract, we are agreeing to use the participants’ time in a way that is most advantageous to their educational efforts and helps them become better healthcare providers.

With regard to pre-testing, we could argue from an educational standpoint that we are going to customize the simulation education, tailoring the learning to the needs of the learners as guided by the results of the pre-test, i.e., using the pre-test as a needs analysis of sorts. But this argument requires that we actually use the results of said pre-test in this fashion.

The second argument, and one we employ in several of the programs I have designed, is that we are assessing baseline knowledge to evaluate the effectiveness of pre-course content, or pre-course knowledge, that participants are expected to complete or possess prior to coming to the simulation center, i.e., a readiness assessment of sorts. In other words, the question is: is this person cognitively prepared to engage in the simulation endeavors I am about to ask them to participate in?

Finally, another educational argument for pre-testing could be made that we would like to point out to the participants areas of opportunity to enhance their learning. We could essentially say that we are helping the learners direct where they will pay close attention and focus during the simulation activities or their participation in the program. Again, this is predicated on there being a review of the pre-test answers, and/or at least feedback to the intended participants on the topic areas, questions, or subjects they did not answer successfully.
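All three of these arguments stand or fall on whether the pre-test results are actually acted upon. As a minimal sketch of that principle (the threshold and topic names are purely illustrative assumptions, not from any particular program), the results might be turned directly into a readiness decision and a set of focus areas fed back to the learner:

```python
# Hypothetical sketch: pre-test results actually drive a decision and learner feedback.
# The threshold and topic names are illustrative assumptions only.

READINESS_THRESHOLD = 0.7  # assumed minimum fraction correct per topic

def pretest_feedback(scores: dict[str, float]) -> str:
    """Turn pre-test scores into a readiness decision plus focus areas for the learner."""
    weak_topics = [topic for topic, score in scores.items() if score < READINESS_THRESHOLD]
    if not weak_topics:
        return "Ready: proceed to the simulation session as scheduled."
    return ("Before the session, review the pre-course content on: "
            + ", ".join(weak_topics)
            + ". Expect these areas to be a focus during the simulation.")

print(pretest_feedback({"drug dosing": 0.55, "airway management": 0.90}))
```

If nothing like this happens with the results, the educational justifications above become much harder to defend.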

The post-test argument becomes a bit more difficult from an ethical perspective, outside of the aforementioned justification of the simulation-based education. I suppose we could say that we are trying to continue to advise the learner on areas where we believe there is opportunity for improvement, and hopefully inspire self-directed learning.

However, my underlying belief is that if we look at ourselves in the mirror, myself included, we are trying to collect the data over time so that we can perform some sort of retrospective review and hopefully uncover a significant change between pre-test and post-test scores that we can use to justify our simulation efforts, in whole or in part.

This becomes more and more concerning, if for no other reason than it can lead to sloppy educational design. What I mean is that if we are able to ADEQUATELY assess the objectives of a simulation program with a given pair of written tests, we are likely assessing mostly knowledge-domain items, and we always have to question whether simulation is the most efficient and effective modality for that effort. If this is the case, maybe every time I give a lecture I should give a pre- and post-test (although this would make the CME industry happy) to determine the usefulness of my education and justify the time of the participants attending the session. In this example, if I were lecturing and potentially enhancing knowledge, perhaps one could argue that a written test is the correct tool; however, the example is intended to point out the impracticality and limited usefulness of such an endeavor.

As we continue to build arguments for the value of simulation and overcome the hurdles both apparent and hidden, I think we owe it to ourselves to decide whether such ROUTINE use of pre- and post-testing is significantly beneficial to the participants of our simulation, or whether we are justifying the need to do so on behalf of the simulation entity. We owe it to our participants to ensure that an honest appraisal reflects the former.


Simulation Programs Should Stop Selling Simulation

Whatever do I mean? Many established simulation programs believe that their value lies in creating simulation programs through which people attain knowledge and skills and/or perfect aspects of what is needed to effectively care for patients. All of that is true, obviously. However, I believe that the true value of many established simulation programs is in the deep educational infrastructure that they provide to the institution with which they are affiliated. Whether that expertise is in the project management of educational design, educational design itself, the housing of a cadre of people who are truly interested in education, or the operational scheduling and support needed to pull off a major program, I believe these examples are the true understated value of many simulation programs.

Simulation programs tend to attract a variety of people who are truly interested in education. While I don’t think that everyone who is passionate about teaching in healthcare needs to be an educational expert, I do believe it is important that the people involved in the development and deployment of innovative education are truly interested in teaching. Many hospitals and universities rely on personnel who are subject matter experts to conduct their education programs, but who may or may not have the desire, interest, or capabilities needed for teaching.

Many people who are passionate about teaching in healthcare have a particular topic or two that they like to teach, but lack the skills of critical analysis and the deeper knowledge of educational design principles needed to parse their education efforts into the appropriate methods and create maximal efficiency in the uptake of the subject matter. This very factor is likely why we still rely on good old-fashioned lecture as a cornerstone of healthcare education, whether in the school setting or in the practicing healthcare arena. Not that I believe there is anything wrong with lecture; I just believe that it is often overused, often done poorly, and often done in a way that does not encourage active engagement or active learning between the lecturer and the participants.

Simulation programs are often the water cooler in many institutions, around which people who are truly interested in, and may have some additional expertise in, education tend to congregate. The power of this proximity creates an environment rich for brainstorming, enthusiasm for pushing the envelope of capabilities, and continuous challenge to improve the methods by which we undertake healthcare education.

Simulation programs that have curricular development capabilities often have project management expertise as well as operational expertise to create complex educational solutions. This combination of skills can be exceptionally valuable to the development of any innovative education program in healthcare, whether or not simulation is part of the equation.

Many times, healthcare education endeavors are undertaken by one or two people who quickly become overwhelmed without the supporting infrastructure it takes to put on educational activities of higher complexity than a simple lecture. Oftentimes this supporting technology or set of resources resides inside the walls of “simulation centers” or programs. By not providing access to these para-simulation resources to the rest of the institution, I argue that simulation programs are selling themselves short.

If you consider educational outcomes from a leadership perspective (e.g., CEO, Dean, etc.), leaders are much less concerned about how the educational endeavor occurred and far more focused on the outcomes. So while there are many topics and situations that are perfect for simulation proper, we all know there is a larger need for educational designs of greater complexity than a lecture that may not involve simulation.

If a given simulation program partners with those trying to create complex educational offerings that don’t directly involve simulation, but that are good for the mission of the overall institution with which it is aligned, it is likely to endear the program to the senior leadership team, or at least create awareness of the need to continue or expand its support.

If you sit back and think about it, isn’t that an example of great teamwork?


SIMULATION AND THE ELECTRONIC HEALTH RECORD: MIND YOUR OBJECTIVES

There is a lot of discussion recently about incorporating the electronic health record (EHR) into simulations. Which vendor? Which product? What features are needed? The disturbing thing about most of these discussions, in my mind, is that no one is talking about what they are trying to accomplish by including the electronic health record in the simulation environment.

What is the purpose of the EHR in a simulation? Is it simply to provide realism? If so, is the EHR that is implemented likely to be the one the student will experience in the practice environment? Because if not, it is missing the mark, likely adding confusion as well as increasing the orientation time necessary for a given simulation. Is the EHR supposed to provide crucial information that will help learners make healthcare decisions during the simulation encounter? Is the entire simulation designed around an efficient query for specific information from a patient’s history? Are entries made in the EHR by the participants going to be analyzed for knowledge or critical thinking regarding the case? There are so many possibilities! I would argue, however, that integrating the EHR into the simulation simply for realism’s sake will likely be a colossal waste of time.

Much like any other component included in simulation, the EHR should be incorporated thoughtfully and carefully, driven by a needs analysis based on the learning objectives of the educational encounter. EHR technology can be overwhelming to understand and navigate by itself, and the fact that there are many different types of systems for different practice environments makes it unwieldy to become expert in all brands, systems, or examples.

Similarly, if you have successfully implemented the EHR into your simulations, I would recommend that you carefully decide for each and every simulated encounter whether you need to include it or not. Again, this decision should rest upon the learning objectives and the intended educational outcomes of the event. Interacting with the EHR can be a time-consuming, frustrating part of the delivery of healthcare, and it is up to the creator of the educational encounter to determine the usefulness and necessity of such integration.

The thoughtful use of the EHR in select simulated encounters can lead to increased observation of critical thought processes and attention to detail, as well as an overall appreciation of the depth and breadth of a participant’s understanding of a given case. Additionally, it could serve as another avenue for assessment. If the integration of the EHR is predicated on these efforts, then clearly the addition of the EHR component is both worthwhile and necessary. Simulations involving workflow and human factors can also benefit from such integration, knowing that in today’s delivery of healthcare the interaction with the EHR is a daily reality.

I must close, however, by reminding the simulation community that it is not our job to recreate reality; it is our job to create an innovative educational encounter from which we can form opinions and engage in discussions that help healthcare providers on their quest toward excellence.
