Can you give us a brief career path update – how did you get to your current role with Global Medical Response (GMR) and how have your experiences in the industry shaped your approach to patient safety?
I spent the formative years of my career working as an emergency department nurse. After nearly a decade, I moved to the intensive care unit (ICU) and then on to flight nursing. All in all, I took care of patients for approximately 25 years. It was such a gratifying career. One of the best aspects of nursing for me was the opportunity to care for people, no matter who they were, where they came from, or how they arrived at their current condition. I was blessed to work alongside and learn from incredible colleagues and mentors who were not only excellent professionally, but who were also remarkably compassionate. We witnessed so much human suffering and pain, caring for patients and families in what was often their darkest hour. But there was also joy, healing, humor, and constant reminders of the resilience of the human spirit. I learned that, even in the midst of the most difficult and tragic situations, beauty can rise from the ashes. I always try to remember that when times are tough.
Even early in my career, I was struck by the needless pain, suffering, and costs – both economic and non-economic – related to preventable illnesses and injuries. I developed a keen interest in prevention and pursued a graduate degree in nursing focused on population-based health. My interest then was illness and injury prevention, keeping people healthy and out of the hospital to begin with, and also decreasing emergency department recidivism.
Personal experience and ‘Just Culture’
My focus shifted after I became a flight nurse. Five years into my flight nursing career, in 2005, we had a fatal crash in which three friends and colleagues perished. To say this was devastating is an understatement. The helicopter crashed in deep water and parts of the aircraft were irretrievable, so the question as to why the crash occurred remains – as does our grief.
In a surreal twist to my career and life, I was in a helicopter crash a month later. We were in a brand-new aircraft, complete with all the technological bells and whistles touted to be the ‘silver bullet’ for safety at the time. We lifted from a rooftop helipad at a hospital with a patient on board, lost power, and crashed to the ground below. Thankfully we all survived without permanent injuries. The aircraft was destroyed.
This time, as everyone survived and the aircraft was retrievable, we could learn why the crash happened. My highly experienced pilot had inadvertently lifted from the rooftop helipad with one engine in idle. His employment was subsequently terminated. My perception at the time was that his employer felt the problem was solved with his termination, and I struggled with that notion. In hindsight, I was looking for a ‘Just Culture’ before Just Culture was ‘a thing’.
I’ve since spent years studying how to: 1) prevent adverse events from occurring; 2) respond to events in a way that minimizes damage and maximizes learning; 3) recover individually and organizationally from adverse events; and 4) apply lessons learned and improve processes to mitigate future risk.
The US National Transportation Safety Board (NTSB) report validated my suspicion that there was much more to this crash than ‘pilot error’. It cited inconsistencies in training, among other issues. Furthermore, in my opinion, the fact that the aircraft allowed the pilot to lift at all with one engine in idle constituted a single point of failure for a critical process that should be engineered out of aircraft designs, or any critical equipment designs for that matter.
The more I analyzed this crash and all of the contributing factors beyond the pilot, the more I realized that the latent hazards and conditions remained in place, just waiting for the next fallible human to wander into the crosshairs of a flawed system. These latent hazards and system weaknesses needed to be addressed if we were to truly solve the problem. Simply removing the individual from the pointy end was not, in and of itself, a solution – as is true in most cases.
The NTSB is responsible for identifying the ‘probable cause’ of crashes. In the air medical transport industry, the vast majority of NTSB reports cite ‘pilot’s failure to…’ (aka ‘pilot error’) as the probable cause of crashes. My perception is that we in the air medical industry have historically found a false sense of security in this, as the pilots involved are either terminated or, tragically, have died. In reality, the pilot may be a part of the proximate cause of a crash, but there are almost always upstream contributing, and potentially causal, factors as well. As Drs Scott Shappell and Douglas Wiegmann point out in their book A Human Error Approach to Aviation Accident Analysis: The Human Factors Analysis and Classification System (2003), pp. 26-28: ‘the human is rarely, if ever, the sole cause of an error or accident. So, if your goal is to reduce accidents… efforts must focus on the system as a whole, not just the human component.’
If we continue to blame the individuals at the pointy end and look no further, we will continue to have the same kinds of incidents and accidents over and over again because we’re not addressing the system problems. This is just as true in healthcare as it is in aviation. In the medical transport industry, we operate at the intersection of healthcare and transport – both of which are highly complex, high-consequence domains. If we want to prevent harm, it’s paramount that we evaluate our systems and refuse to oversimplify as we seek contributing factors as well as risk mitigation strategies. And we have to look out for any unintended consequences our proposed mitigation strategies may produce.
This is not to say we simply absolve people from personal accountability. Not at all. Individuals are responsible for their choices and behaviors within any given system. We, as individuals, are also responsible for helping to identify and communicate shortcomings and weaknesses in our systems. We should always strive to reach our highest potential and peak performance as individuals and teams. And often, we are truly the last line of defense when it comes to preventing harm to other people and ourselves. But we, as humans, are prone to error, complacency, and drift, and our systems need to be designed to help prevent our fallibility from causing harm.
Thus, organizations need to make a concerted effort to evaluate their systems and optimize them, in as much as that’s humanly possible, to maximize the potential for our people to be successful at what they do. My hope is that the pendulum will swing away from predominantly blaming the individual at the pointy end towards a systems-oriented approach, but without being exclusively focused on that either. The pendulum needs to stay in the middle, striking a balance, and creating truly shared accountability between people and systems.
Shared accountability is a foundational tenet of a Just Culture, and I’ve seen many organizations that believe they have a Just Culture but, in reality, focus exclusively on the individual and neglect to look at themselves in the mirror to determine how their systems contributed to the issue at hand. Shared accountability requires self-reflection from the organization, so a culture is not truly ‘Just’ until that occurs.
I do believe the industry has come a long way when it comes to managing risk and I am truly standing on the shoulders of giants. But we’ll always be a work in progress – the pursuit of excellence and high-reliability is, and should always be, endless.
Survivors Network for the Air Medical Community
Although I continued to work as a flight nurse for nearly a year after the crash I was in, I found that I was experiencing symptoms of post-traumatic stress injury. That, coupled with the fact that I was struggling with the lack of a systems-approach to the crash, led me to step away from flying. I got some help and ultimately recovered well. In the meantime, I met some other helicopter crash survivors from across the industry and together we formed the Survivors Network for the Air Medical Community, which is not only a resource for crash survivors, but for everyone surviving life in this and related professions.
I proceeded to split my professional time between the Survivors Network, volunteering for the National EMS Memorial/Air Medical Memorials, and working in emergency departments. After a few years, I received a call from an industry colleague who was familiar with the work I’d done, who was wondering if I’d be interested in a patient safety position at Med-Trans Corporation. It seemed to be right up my alley, so I applied for and was accepted into the position.
In this position I applied the philosophy and approach that I wished had been applied to the crash I was in. I continued to study and refine my approach to risk mitigation and responding to events effectively, focusing largely on bringing risk mitigation strategies from aviation to the clinical care arena. I combined theories and models that incorporated the major principles I’d learned over the years since my crash: High Reliability, Just Culture, Human Factors, Crew Resource Management, and personal accountability/pursuit of peak performance.
Most importantly, at the heart of this model is the health and wellbeing of our employees. It is foundational to everything we do. Life in this and related industries was difficult enough before all of the events of the past two years. Now more than ever we need to support our people, ensure they have appropriate and accessible resources, and create a culture that de-stigmatizes asking for and receiving help. They need to be able to find joy in their lives, have sustainable careers, and be healthy enough to manage the risks and complexities of the work they do every day. I found this approach, at least anecdotally, to be appreciated and accepted by our frontline teams and leadership alike. It made sense to them.
Global Medical Response
After Global Medical Response (GMR) was formed, I was asked if I was interested in the National Director of Patient Safety position. While it seemed incredibly daunting, I agreed. I am fortunate to have landed on both the clinical leadership and safety leadership teams, each under the guidance of long-time industry leaders for whom I have immense respect. I find myself in the company of many like-minded people who are committed to the pursuit of excellence, who are not afraid to ask difficult questions, and who work hard to find and implement effective solutions and strategies for mitigating risks. I am grateful to have a voice at this table, and we have a lot of work to do to continue to contribute to the improvement of care delivery during air and ground transport, in EMS, and within our communities.
How is GMR ensuring that safety – of patients and crews – is paramount to the operations on a daily basis (investment in fleet/avionics and training, for instance)? What moves is GMR making to enhance patient care onboard its air medical fleet?
When we talk about ‘safety’ I always think of this quote from Jerome Lederer of the Flight Safety Foundation and NASA. I agree with his point about ‘risk management’ versus ‘safety’:
“Risk management is a more realistic term than safety. It implies that hazards are ever present, that they must be identified, analyzed, evaluated and controlled, or rationally accepted.”
When we talk about ‘GMR’ it’s important to remember that GMR is a conglomeration of multiple air certificates and ground operations from across the country. So, there is variation across cultures, operations, training, and clinical practice that is driven by certificate, state, and local regulations, as well as local cultures of GMR entities countrywide. This is true of EMS in general across the US as well.
Our leadership at GMR recognizes this variation and is committed to building a consistent foundational culture in which people are the focus – our patients, our teams, and our communities. This is a critical step in helping to manage the multitude of risks that are ever-present in this industry. Furthermore, considering everything our teams, along with everyone in healthcare, have been through in the past two years, there’s no better time than the present to pause and reset as we shift from crisis mode and set a deliberate course towards culture-building and a renewed focus on excellence.
By creating a culture in which people are treated respectfully and feel supported, aren’t afraid to speak up, and aren’t afraid to report errors, we can collaboratively identify risks and system weaknesses and work toward mitigation strategies. We can become more proactive than reactive, especially when people report near-misses. We can build better systems with the input of those who know best – the people who do the work every day.
The next investment GMR has made along these lines is instituting a high reliability approach to risk management. Organization-wide education and training is underway to prepare our teams to look at risk and systems, and to treat people fairly and consistently. It is essentially the next evolution of Just Culture.
A holistic approach
Further, GMR created a department called GMR Life, which is dedicated to ensuring our teams have resources for wellness and mental health, along with streamlined access to trauma-informed providers should they seek professional help. Burnout, stress injury, and compassion fatigue are significant issues impacting healthcare professions, especially after the events of the last few years, and it’s essential that our teams get the support they need. GMR Life also provides education to our teams on a wide variety of health and wellness-related topics, from stress injury formation to healthy relationships and beyond. As I mentioned earlier, the health and wellness of our teams is paramount to their identification and mitigation of the risks they encounter daily, as they need to have capacity for complexity. This is our foundation, along with the culture we are continuing to build.
Next, GMR belongs to a Patient Safety Organization (PSO), the Center for Patient Safety. This allows us to have transparent conversations around events within our organization that are protected, and offers us a venue to share de-identified data along with lessons learned with the EMS/air medical community at large.
With the foundational components of wellness and culture, of which the high reliability approach is a cornerstone, the next critical step is giving our teams a voice. We have a safety/patient safety reporting system, which has been in place for many years on the air side, and we are now expanding it to the ground side. This will help to empower our teams to speak up when they identify systems issues, errors, near-misses, or other incidents. Then, using the High Reliability/Just Culture methodology, we can collaboratively manage the issue at hand and develop strategies to mitigate future risk. The reporting system is not only a venue to manage events acutely; it also helps us to monitor and manage trends, and collect data to share with the Patient Safety Organization.
Between healthy people, a supportive culture, and a venue to discuss and manage risks, we can collaboratively build better systems. We can create safety nets that help prevent some human error to begin with (it is not realistic to think we can eliminate human error) and that also catch errors before they result in catastrophe. We can employ engineering controls when possible, which are the strongest of interventions.
Effective planning and preparation
In all aspects of our organization, to manage risks effectively we need to understand what the risks are to begin with. It is incumbent upon us as an organization to provide the equipment/tools, policy/processes, training, and support our teams require to identify and manage risk, and ultimately provide excellent service and care, no matter their role in the organization.
I think one of the best examples of this is a transport AirMed International completed recently. They successfully flew an ECMO patient from Portland, Oregon, to Dubai. Their planning and preparation took weeks with incredible attention to detail. Between the people and systems: training, equipment, policy/processes, preparation, support, and expertise, they were able to successfully mitigate the risks of this highly complex transport and provide excellent service and care to this patient. This is an extreme example of complexity and risk management, but we can and should replicate this approach and tailor it to all of our operations.
What if something had gone wrong during this transport? When we consider the incredible complexity, variables, systems beyond our own at play, the criticality of the patient, and other confounding factors, it would make zero sense to exclusively focus on the teams involved in the transport. Ideally, we would examine all of the complexities, variables, and systems to determine how the team ended up in the position they did and how they made decisions within that context, and we would respond accordingly with a spirit of shared accountability.
Identifying systemic issues
In the US, a nurse has recently been criminally charged for a fatal medication error that occurred in a hospital. Just as in the aforementioned transport, there are significant complexities and variables that contribute to risk in delivery of care within healthcare systems. According to reports, the nurse self-reported her error when she realized it. The regulator’s report on the hospital identified substantial system issues that contributed to the event.
This case has sparked debate, inflamed emotions, and deeply impacted not just nurses but healthcare professionals across the spectrum – and with good reason. In the context of what we have asked of our healthcare professionals caring for patients through a pandemic, with all the challenges and a spike in complexity and risk that is likely unprecedented in modern-day healthcare, it is unrealistic to think errors won’t occur. And it seems unthinkable that errors could be criminalized. I’m not saying that we can’t take action, punitive or otherwise, for reckless behavior and choices – we absolutely must. But criminalizing? I can’t imagine my pilot’s error being criminalized – that would have been devastating and completely counterproductive. Conversations across social media illustrate significant concern that this verdict could lead to a reluctance to report errors and a detrimental impact on patient safety. This verdict has created a culture of fear.
This is a tragedy all the way around. First and foremost, the consequence of the error was tragic for the patient and her family and my heart breaks for them. My heart goes out to the nurse as well. Even though she should certainly be held accountable for substandard nursing practice, putting her behind bars doesn’t solve any problems and actually creates a myriad of new ones.
It is incumbent upon all of us to learn from this and do our best to prevent this and similar tragedies from occurring again. We are all vulnerable to error, and this tragedy could have happened anywhere. But how can we learn if people are afraid of reporting? How do we respond to leading, rather than lagging, indicators if we are reluctant to have these important conversations because of fear? The tragedy is magnified if we aren’t able to learn because this verdict has hobbled our patient safety efforts.
From my perspective, this case actually demonstrates the need to increase reporting efforts so we can collectively identify the systems issues that allow these types of tragedies to occur. We need to heed the Institute for Healthcare Improvement’s wisdom: “This case should be a wake-up call to health system leaders who need to proactively identify system faults and risks and prevent harm to patients and those who care for them.”
We can’t let fear stemming from this verdict hold us back. And, since safety and risk management shouldn’t be proprietary, we need to continue to collaborate within the industry as we work on risk mitigation strategies for patient care as well as for general safety in aviation and ground operations.
Improving healthcare delivery
The last thing we want at GMR is a culture of fear. We are continuously working to develop and support our people and create the culture and systems conducive to managing risks, learning lessons, improving processes, and contributing to the national conversation on improving healthcare delivery. It’s all a work in progress in this organization of massive scope and scale, but I believe we have the vision and are heading in the right direction. Reflecting on my early interest in prevention and on the crash I was in, this is the approach I was looking for all those years ago. It seems I’ve come full circle, and I am grateful to have a voice at the table at GMR and to help support the care our frontline teams provide every day.
What role do you think accreditation of air medical services plays in enhancing patient care and safety in the industry?
I think accreditation is important because it sets a bar and standards that might not otherwise be set.