Dicerra

Michael Sandler Describes A Medication Error Due To Fatigue

Michael Sandler: So you asked for a clinical scenario that highlights the issues clinicians face when it comes to safety. I think every single one of us, regardless of how long we've practiced, can identify with at least one safety incident that occurred with our patients. One that comes to mind for me, that really encapsulates the whole issue around safety, was a medication error. I was working in a large quaternary referral intensive care unit in a province in Canada. It was a night shift, it was a weekend, and I had come to work already fatigued; it was my third night shift in a row. It was Saturday night, and I remember that because the Olympics were in town and Sidney Crosby had just scored the golden goal, and we had spent the afternoon reveling in the amazingness that was Team Canada. I was assigned to a very sick patient, and I started off the shift not firing on all cylinders. Obviously, I was tired, I was fatigued, I was a bit hungry, and I had a whole host of distracting pieces in my life, including the fact that all of my friends were still celebrating the golden goal while I was at work. So, you know, I was in a space that probably was already at risk for a safety or quality issue. The individual I was taking report from was just as excited to be gone from there as I was unexcited to be there, and so our handover was less robust than normal, and we missed the opportunity to exchange some very important information about the patient. And I was working with a crew who was slightly less experienced than you would normally expect to see in the unit I was working in, with a crew of supporting residents and allied health practitioners who were also less experienced than you would normally expect in an ICU of that size and nature. 
All related, of course, to the Olympics, an external issue that had nothing to do with the patients who were in the unit and deserved our undivided attention and care. And, of course, it was one o'clock in the morning, which is always a dangerous time for the application of clinical care. You're aware of the research that indicates that after 16 hours awake, it is like operating a vehicle under the influence of alcohol, with a blood alcohol level, I think, of 0.05, if I recall. And as you move past that, you get more and more intoxicated. I say that not in the sense of being truly intoxicated, but your reaction times and your thought processes and all of the things you need to safeguard your patient against an error are impaired the same way they would be if you had been drinking. So I was undoubtedly impaired. And my patient's blood pressure was falling, and I was confused as to why this was happening; I didn't have a good clinical understanding of what the scenario was. Now, it's not an uncommon thing to see a fall in blood pressure in an intensive care unit, and yet I had this inability to solve the problem in front of me because of these cognitive barriers: physical, emotional, and mental. And I ended up choosing a course of action that, in the cold light of day, in retrospect, you would look back on and say to yourself, what exactly were you thinking when you decided that was a good idea? What I essentially did was, instead of using the medication that was already at the bedside and infusing it to support the patient's blood pressure, I decided that I would go get an entirely new one and start this new medication, and that would solve the problem. 
And so I did that. I went to the Omnicell, which is our medication dispensing machine, and I put in the patient's name and PHN, and I put in the medication that I wanted, and it said to me, are you sure you want to do this? A big warning, like, you know, this is a high-risk medication, and you really want to do this? And I remember thinking to myself, yes, yes, I really want to do this. Just give me the medication; this is a time-sensitive scenario, I would like to do this. And so I brought this medication back to the bedside, and I got to the point where I was about to administer it. Now, we also have a policy in place, another forcing function, that says you need to bring someone else in to double check, to make sure that this is a good idea, because really, we don't want you doing this entirely by yourself when you're tired and fatigued and hungry. Okay, so, you know, knock on the glass: hey, come help me out. And this is where your biases come in. This is where bias training is so important, and I had none of those things at the time. I remember holding up the medication and saying, hey, is this medication X? Yep, this is what you see, yeah? You see this? And so, unfortunately, my partner was like, oh yeah, that's exactly what I see, because I had told them what they were seeing, when in fact it had nothing to do with what I was holding. And so, you know, now I've completed this independent double check, and I hang this medication, and I start the medication, and I go to chart the medication. And a colleague who I've known for a very long time is literally walking by with a cup of coffee. And this is where you definitely don't want to predicate your safety systems on the individual walking by with a cup of coffee as the final check in terms of making sure that someone doesn't get irreparably harmed. And they happened to glance over at me. 
And they happened to look down at what I was doing, and they asked me a question: what are you doing? I just started this medication. And she said to me, well, that's not what you're writing down; that's not the medication that you're charting. And I looked at her, and I looked at the medication that I was running, which was not the medication that I thought I was running. I had been positive that I was giving this patient a medication that was going to solve a problem, when in fact I was giving them a medication that was going to create problems. And so, thankfully, I had this interaction at that moment, before there was any irreparable harm. But it crystallized for me the safety conversation that we have in the practice environment every day. Some people refer to it as the Swiss cheese model; some people refer to it as the bias model, the slow-channel, fast-channel thinking. There are all of these ways in which you can look at it, but I had let my human factors get in the way of my thinking process, and I had slid into a fast-channel thinking process, which removed the final safety barriers that should be applied in any safe system. It was by sheer luck that I was in a position to not harm the patient. There was an opportunity to create great harm, and it was providence that somebody wandered by at the right moment and had the wherewithal to actually look at what I was doing; that probably saved this patient's life. And it is those near-miss events in your career that forcefully propel you into the conversation around safety, and why safety and safety systems, human factors training, and the ability to spend time really engaging with what a good safety process should look like become so important. Because I think every clinician has experienced what I experienced: that near miss that could have turned out disastrous, but didn't, and only because we got lucky. 
And luck is not a great way to engage in the delivery of care.

TK: I absolutely agree. And I agree with your statement that everybody else has likely experienced some version of that over the course of their career, and if they haven't, they will. There's the hierarchy triangle that you see in aviation, where you have one fatality at the top of the triangle, and underneath it there are 10 major accidents, and underneath that, 30 minor ones, and then underneath that there's a wedge of 600 near misses, which is, I guess, just interesting research on the likelihood of a near miss contributing to a catastrophe in aviation. And I would wager that it's something very similar in healthcare in terms of the number of near misses that happen, that people get away with on sheer luck, where all the holes in the cheese line up but one at the very end, whether it's fatigue getting in the way or systems getting in the way. There are so many of those near misses that we ought to learn from, that we probably have no way of capturing, or at least haven't captured. We've got to wrap up here for time, but can you close this out, maybe on whether or not that event was captured in a write-up of some kind?

Michael Sandler: This is the issue. So the event was captured in the safety system that was in place at the time, and it went to an individual who reviewed it, came and talked to me about it, and concluded the investigation there. There was no further opportunity to really delve into the slices of the Swiss cheese that had lined up so perfectly that evening. There was no conversation about fatigue mitigation, for example, utilizing scheduling, or staff composition, or scope of practice, or any of those things that could have interrupted that process. There was no conversation around systems safety training, no conversation around human factors training and being able to identify the antecedent pieces that led to some of that decision making. There was no safety checklist that we developed out of that conversation, nothing that said, hey, listen, if you show up to work and you're fatigued, you're hungry, and you're tired. We never implemented any of those things. We could take learnings from the aviation industry's safety checklists: this is what you need to do to be safe to come to work, you have to have so many hours of rest before you work, you can't have all of those things present. We never spent any time taking a look at it after the fact. And so I think this is our opportunity to really dig into a process that allows not only organizations to take a look at what those pieces are and how they lined up, but also clinicians to feel comfortable sharing that and saying, listen, this is where I was; this is my slice of cheese in this process. How can I ensure that that hole doesn't line up with the other three on either side? So I'm hopeful that Dicerra is well positioned to be able to answer that question, but I am confident that I am not the first or the last clinician to have a story like this.

TK: Thanks, Michael. Much appreciated.


Related posts

Part 2 – Dr. Bill Bestic, Ex-special Forces Operator Turned Trauma Physician Talks Human Performance In Medicine & Aviation

A uniquely talented commercial helicopter pilot, special forces officer in the NZSAS, anaesthetist, and trauma physician; Dr. Bill Bestic talks about human performance, excellence, and fallibility in medicine from an Australian perspective.

Dr. Brian Goldman On Blame Culture – Highlights

Dicerra interviews the host of CBC's White Coat, Black Art on the culture of reporting and its effects on human performance in healthcare.

Full Interview – Dr. Brian Goldman On Improving Healthcare Delivery In Canada

Dicerra's full interview with Dr. Brian Goldman from CBC's White Coat, Black Art where he discusses human performance, human kindness, and the obstacles that must be overcome to improve healthcare delivery.

Part 1 – Dr. Bill Bestic, Ex-special Forces Operator Turned Trauma Physician Talks Human Performance In Medicine & Aviation

A uniquely talented commercial helicopter pilot, special forces officer in the NZSAS, anaesthetist, and trauma physician; Dr. Bill Bestic talks about human performance, excellence, and fallibility in medicine from an Australian perspective.
