Don’t be Confused! Every Simulation is an Assessment

 

Recently, as I lecture and conduct workshops, I have been asking people who run simulations how often they do assessments with their simulations. The answers are astounding. Every time, there are a few too many people reporting that they perform assessments less than 100% of the time that they run their simulations. Then they are shocked when I tell them that they do assessments EVERY TIME they run their simulations.

While some of this may be a bit of a play on words, careful consideration should be given to the fact that each time we run a simulation scenario we must be assessing the learner(s). If we are going to deliver feedback, whether intrinsic to the design of the simulation or through promoting discovery during a debriefing process, at some point we had to decide what we thought they did well and identify areas in need of improvement. To be able to do this, we had to perform an assessment.


Now let’s dissect a bit. Many people tend to equate the word assessment with some sort of grade assignment. Classically, we think of a test that has some threshold of passing or failing, or that contributes in some way to figuring out whether someone has mastered certain material. Often this may be part of the steps one needs to move on, graduate, or perhaps obtain a license to practice. The technical term for this type of assessment is summative. People in healthcare are all too familiar with such types of assessment!

Other times, however, assessments can be made periodically with the goal NOT of deciding whether someone has mastered something, but with more of a focus on figuring out what one needs to do to get better at what they are trying to learn. The technical term for this is formative assessment. Stated another way, formative assessment is used to promote more learning, while summative assessment determines whether something was learned.

Things can get even more confusing when assessment activities have components or traits of both types. Nonetheless, what is less important than the technical details is the self-realization and acceptance by simulation faculty members that every time you observe a simulation and then lead a debriefing, you are conducting an assessment.

Such a realization should allow you to understand that there is really no such thing as a non-judgmental debriefing or a non-judgmental observation of a simulation-based learning encounter. All goal-directed debriefing MUST be predicated upon someone’s judgment of the performance of the participant(s) in the simulation. Otherwise you cannot provide, and optimally promote discovery of, the needed understanding of the areas that require improvement and/or of the topics, skills, or decisions that were carried out correctly during the simulation.

So, if you are going to take the time and effort to conduct simulations, please be sure to understand that assessment, and rendering judgment on performance, is an integral part of the learning process. Once this concept is fully embraced by the simulation educator, greater clarity can be gained about ways to optimize assessment vantage points in the design of simulations. Deciding on the assessment goals with some specificity early in the process of simulation scenario design can lead to better decisions about the associated design elements of the scenario. Optimizing scenario design to enhance “assess-ability” will help you whether you are applying your assessments in a formative or summative way!

So, go forth and create, facilitate and debrief simulation-based learning encounters with a keen, fresh understanding that every simulation is an assessment!

Until next time, Happy Simulating!


Filed under assessment, Curriculum, design, scenario design, simulation

Three Things True Simulationists Should NEVER Say Again

From Wiktionary: simulationist (noun; plural simulationists): an artist involved in the simulationism art movement; one who designs or uses a simulation; one who believes in the simulation hypothesis.


 

After attending, viewing or being involved in hundreds if not thousands of simulation lectures, webinars, workshops, briefings and conversations, there are a few things I hear that make me cringe more than others. In this post I am trying to simmer it down to the top three things that I think we should ban from the conversations and vocabularies of simulationists around the globe!

1. Simulation will never replace learning from real patients!: Of course it won’t! That’s not the goal. In fact, in some aspects simulation offers advantages over learning on real patients. And doubly in fact, real patients have some advantages too! STOP being apologetic for simulation as a methodology. When this is said, it essentially defers to real patients as some sort of holy grail or gold standard against which to measure. CRAAAAAAAZY……   Learning on real patients is but one methodology by which to attack the complex journey of teaching, learning and assessing the competence of a person or a team of people engaged in healthcare. All the methodologies associated with this educational goal have their own advantages, disadvantages, capabilities and limitations. When we agree with people who say simulation will never replace learning from real patients, or allow that notion to go unchallenged, we are doing a disservice to the big picture of creating a holistic education program for learners. See previous blog post on learning on real patients.

2. In simulation, debriefing is where all of the learning occurs!: You know you have heard this baloney before. Ahhhhhhhhhhhhh, such statements are purely misinformed, not backed up by a shred of evidence, kind of contrary to COMMON SENSE, and demeaning to the participants as well as to the staff and faculty who construct such simulations. The people who still make this statement are stuck in a world of instructor centricity. In other words, they are saying, “Go experience all of that…… and then when I run the debriefing the learning will commence.” The other group of people are trying to hard-sell you some training on debriefing and then make you think it is some mystical power held by only a certain few people on the planet. Kinda cra’ cra’ (slang for crazy) if you think about it.

When one says something to the effect that learning cannot occur during the simulation, one is confirming that one is quite unthoughtful about how the entire learning encounter is constructed. It also hints that they don’t take the construct of the simulation itself very seriously. The immersive experience that people are exposed to during the simulation, before the debriefing, can be and should be constructed in a way that provides built-in feedback, observations, and experiences that contribute to a feeling of success and/or a recognition of the need for improvement. See previous blog post on learning beyond debriefing.

3. Recreation of reality provides the best simulation! [or some variant of this statement]: When I hear this concept even alluded to, I get tachycardic and diaphoretic, and my pupils dilate. My fight-or-flight nervous system gets fully engaged and, trust me, I have no plans of running. 😊

[Disclaimer on this one: I’m not talking about the type of simulation that is designed for human factors work, critical environmental design decisions, packaging/marketing, etc., which depends upon a close replication of reality.]

This is one of the signs of a complete novice and/or a misinformed person, or sometimes groups of people! If you think it through, it is a rather ludicrous position. Further, I believe trying to conform to this principle is one of the biggest barriers to the success of many simulation endeavors. People spend inordinate amounts of time trying to put their best theatrical foot forward to re-create reality. Often what is actually occurring is an expansion of the time to set up the simulation, the time to reset the simulation, and, dramatically, the time to clean up from the simulation. (All of the aforementioned time intervals increase the overall cost of the individual simulation, thereby reducing efficiency.) While I am a huge fan of loosely modeling scenarios off of real cases in an attempt to create an environment with some sense of familiarity to the clinical analog, I frequently see people going to extremes trying to re-create the details of reality.
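To make the efficiency point concrete, here is a rough back-of-envelope sketch. Every number in it (block length, scenario length, turnaround times) is a hypothetical illustration chosen for this post, not measured data.

```python
# Back-of-envelope throughput model; all numbers are hypothetical illustrations.

def runs_per_block(block_minutes, scenario_minutes, turnaround_minutes):
    """How many scenario runs fit into one booked block of simulation lab time."""
    return block_minutes // (scenario_minutes + turnaround_minutes)

BLOCK = 240                  # a four-hour block of lab time
SCENARIO = 20                # minutes of actual immersive scenario
LEAN_TURNAROUND = 10         # modest set-up/reset/clean-up per run
THEATRICAL_TURNAROUND = 40   # elaborate attempts to re-create reality

lean_runs = runs_per_block(BLOCK, SCENARIO, LEAN_TURNAROUND)              # 8 runs
theatrical_runs = runs_per_block(BLOCK, SCENARIO, THEATRICAL_TURNAROUND)  # 4 runs

print(f"Lean design: {lean_runs} runs per block; theatrical design: {theatrical_runs} runs per block")
# With fixed faculty and room costs per block, the cost per learner run
# roughly doubles when turnaround time balloons.
```

The exact figures do not matter; the point is that turnaround time, not the scenario itself, often ends up driving the cost per learner.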

We have hundreds, if not thousands, of design decisions to make for even moderately complex scenarios. Every decision we make to include something that imitates reality has the potential to cause confusion if it is not carefully thought out. It is easy to introduce confusion in attempts to re-create reality, since learners engage in simulation with a sense of hyper-vigilance that likely does not occur in the same fashion when they are in the real clinical learning environment. See previous blog post on the cognitive third space.

If you really think about it, the simulation is designed to have people perform something to allow them to learn, as well as to allow observers to form opinions about the things that the learner(s) did well and those areas that can be improved upon. Carefully selecting how a scenario unfolds, and/or the equipment that is used to allow this performance to occur, is part of the complex decision-making associated with creating simulations. The scenario should be engineered to exploit the areas, actions, situations or time frames that are the desired focal points of the learning and assessment objectives. Attention should be paid to the specifics of the learning and assessment objectives to ensure that the included cache of equipment and/or environmental accoutrements is selected to minimize the potential for confusion and create the most efficient pathway to the assessment that contributes to improving the learning.

Lastly, let’s put stock in the learning contract we are engaging in with our learners. We need to treat them like adult learners. (After all, everybody wants to throw in the phrase adult learning principles…. Haha).

Let’s face it: a half-amputated leg on a trauma patient with other signs and symptoms of hemorrhagic shock, with a blood-soaked towel under it, is probably good enough for our adult learners to get the picture, and we don’t actually need blood shooting out of the wound and all over the room. While the former might not be as theatrically sexy, the latter certainly contributes to the overall cost (time and resources) of the simulation. We all need to realistically ask, “What’s the value?”

While my time is up for this post, and I promised to limit my comments to only three, I cannot resist sharing with you two other statements or concepts that were in the running for the top three. The first is “If you are not video recording your scenarios you cannot do adequate debriefing”, and the second is “The simulator should never die.” (Maybe I’ll expand the rant about these and others in the future 😉).

Well… That’s a wrap. I’m off to a week of skiing with family and friends in Colorado!

Until next time,

Happy Simulating!


Filed under Curriculum, debriefing, scenario design, simulation

Don’t Let the Theory Wonks Slow Down the Progress of Healthcare Simulation


Those of us in the simulation community know well that, when used appropriately and effectively, simulation allows for amazing learning and helps students and providers of healthcare improve their craft. We also know there is very little published literature that conclusively demonstrates the “right way to do it”.

Yet in the scholarly literature there is still a struggle to define best practices and ways to move forward. I believe this is becoming a rate-limiting step in helping people get started, grow and flourish in the development of simulation efforts.

I believe that part of the struggle is the diversity of missions across simulation programs, which range from serving entry-level students to practicing professionals, with varying foci on individual learner competence versus team-based communication training, and so on. Part of the challenge is that, in these types of scholarly endeavors, people try to describe a “one-size-fits-all“ approach to the solution of best practices. To me, this seems ridiculous when you consider the depth and breadth of possibilities for simulation in healthcare.

I believe another barrier (and FINALLY, the real point of this blog post 🙂 ) is the tendency to overly theorize everything that goes on with simulation and to shoot down scholarly efforts to publish and disseminate successes in simulation based on some missing link to some often-esoteric deep theory of learning. While I believe that attachments to learning theory are important, I think it is ridiculous to expect that every decision, best practice, policy or experimental design in simulation needs to reach back and be tied to some learning theory to be effective.

As I have the good fortune to review a significant number of simulation papers, it is concerning to me to see many of my fellow reviewers shredding people’s efforts based on ties to learning theories, as well as on their own interpretations of how simulation should be conducted. They have decided that their reading of the literature that is out there (which offers very few, if any, conclusive arguments on best practices) has become a standard.

My most recent example is a manuscript I reviewed describing an experimental design that compared conducting a simulation one way, with a certain technology, to conducting it another way, without the technology. The authors then went on to report the resulting differences. As long as the testing circumstances are clearly articulated, along with the intentions and limitations, this is the type of literature that needs to appear for the simulation community to evaluate, digest, and build upon.

Time after time recently, I am seeing arguments steeped in theory attachments that seem to indicate this type of experimental testing is irrelevant or, worse yet, inappropriate. There is a time and place for theoretical underpinnings, and separately there is a time and place for attempting to move things forward with good, solid implementation studies.

The theory wonks are holding up the valuable dissemination of information that could assist simulation efforts in moving forward. Such information is crucial to help us collectively advance the community of practice of healthcare simulation and, in turn, improve healthcare globally. There is a time to theorize and a time to get work done.

While I invite the theorists to postulate new and better ways to do things based on their philosophies, let those in the operational world tell their stories of successes and opportunities as they are discovered.

Or perhaps it is time that we develop a high-quality forum or publication that provides a better vehicle for the dissemination of such information.

So…… in the meantime….. beware of the theory wonks. Try not to let them deter you from your efforts not only to move your own simulation investigations forward, but also to disseminate and share them with the rest of the world!


Filed under Curriculum, design, patient safety, return on investment

FIVE TIPS on effectively engaging adult learners in healthcare simulation


Filed under Curriculum, design

True Systems Integration for Hospital Based Simulation Programs

Hospital-based simulation programs serve a different need than their counterparts housed in schools of medicine and nursing. The stakeholders, the mission, the program assessment and the development of curriculum vary significantly. Not to over-generalize, but the overall mission of school-focused simulation programs is based around being integrated into the education processes that contribute to the development of successful students who will be called graduates. Many times, these students end up taking licensing, certifying or other high-stakes examinations that can serve as a convenient data set for assessing the impact of the programs.

The mission of hospital- or health-system-based programs can be more complex in terms of alignment within the organization. There is a myriad of possibilities within the healthcare delivery environment that can drive the objectives of simulation programs. Examples range from employee training and education, to quality-, safety- or risk-based work, to facilities engineering perspectives. With all of these possibilities, the potential measurement markers for evaluating the success of the program can become blurry, and at times it is harder to have ready access to the necessary information.

In the era of healthcare cost reduction that we are now experiencing in the United States and many other areas of the world, there is significant pressure coming from many different sides to reduce costs while at the same time improving the quality of care. Thus, to prevail in this era of medicine, any entity within the healthcare delivery system that costs money to operate (like a simulation program) needs to ensure it is providing value to the hospital or system which supports it.

Determining such value can be very challenging. While there are a couple of examples in the literature of isolated value calculations (such as central line training), the utility of such reports is limited in isolation; in total, they address only a minute part of the safety problems associated with the delivery of care in the hospital.
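For readers who have not seen one, an isolated value calculation of this sort is, at its core, simple arithmetic. The sketch below is purely illustrative; every figure in it is a hypothetical placeholder chosen for this post, not data from the published central line studies.

```python
# Hypothetical sketch of an isolated value (return on investment) calculation.
# All figures are made-up placeholders, not published results.

avoided_events = 8         # assumed adverse events prevented per year
cost_per_event = 30_000    # assumed incremental cost of one event, in dollars
program_cost = 100_000     # assumed annual cost of the training program, in dollars

savings = avoided_events * cost_per_event        # 240,000
roi = (savings - program_cost) / program_cost    # 1.4, i.e., a 140% return

print(f"Estimated annual savings: ${savings:,}; ROI: {roi:.0%}")
```

Even when the numbers look favorable, a single calculation like this speaks to only one narrow slice of the institution's safety and quality picture, which is exactly why broader integration is needed.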

Determining the best value of a hospital-based program can be achieved through a series of needs assessments that require the simulation leadership to establish relationships with hospital leadership teams, or C-suites, beyond the people involved in education. The true needs assessment comes from developing a deep understanding of the existing problems, challenges, solutions and successes that the C-suite encounters in executing the mission of the hospital. This information is often housed in the offices of risk management, quality or patient safety.

Integration with the risk management team can better position the simulation program to understand the legal risks from the errors and litigation that the hospital is currently facing. From that vantage point, trends and subject matter that could benefit from simulation training can emerge.

Quality offices generally have significant amounts of information regarding the initiatives that the hospital should be, or is, focusing on to provide better care to patients. Such initiatives are often based on measurement programs from payers (insurance companies, whether private or government, such as Medicare) that result in significant financial risks and/or benefits for the organization. Thus, identifying simulation solutions that could benefit the initiative in some form or fashion can result in value creation for the program.

Patient safety offices (sometimes under, or aligned with, quality offices) house much of the data on mistakes, small and large, and in some cases near misses, that are occurring in an institution. Such data will also contain information on trends, as well as whether harm reached the patient.

Access to this data over time can help to identify the true needs of the organization and help direct a value-based implementation of the simulation efforts. Importantly, though, a careful analysis of this data can also help the simulation program recognize what is not likely to bring as much value to the organization.

Two things are important when considering such integration efforts. The first is that, even though a new era of transparency is emerging regarding patient safety, the information is sensitive. To achieve true integration, the simulation program leadership needs to establish relationships across the organization. Ideally, the program wants not only access to the data, but also a presence that positions it closer to the core of the analysis and decision-making. Many simulation programs remain peripheral to such processes and thus experience a contractor-vendor type of relationship instead of one more akin to an active partnership. It takes time, trust and effort to develop such relationships.

Secondly, a dispassionate evaluation of the data obtained from the needs analysis is necessary to properly interpret the value provided by the simulation program. Many simulation programs are born of a passion to simulate, the passion of the first faculty members, and an attachment to legacy programs that have been running for years. For true alignment within a complex organization, and to survive future value analysis initiatives (i.e., to remain supported and funded), a program needs to take a hard look at its existing offerings and ensure they are pegged to the overall “true” needs of the institution at large.

While this post is not representative of all the possible strategies for integrating a simulation program, it is meant to give insight into a few examples of the possibilities and to articulate the depth of the relationships that should be developed.

 


Filed under hospital, patient safety, return on investment, Uncategorized

Recreating Reality is NOT the goal of Healthcare Simulation

Discussing the real goals of healthcare simulation as they relate to the education of individuals and teams. Rather than putting the primary focus on recreating reality, the primary focus should be on providing an adequate experience that allows deep reflection and learning. This will help you achieve more from your simulation efforts!

 


Filed under scenario design

Simulation Programs, Hospitals and Health Systems: Where is the organizational fit?

Some excerpts from a plenary speech I recently delivered in Taipei, Taiwan, to healthcare leaders and education directors. It is important that simulation programs position themselves within complex healthcare systems so they can deliver maximal benefit to the organization. High-performing simulation programs need to deliver more than educational resources to the organization.



Filed under hospital, patient safety