Dicerra

Part 1 - Dr Bill Bestic, a former special forces operator turned trauma physician, discusses human performance in medicine and aviation.

TK: Hello folks and welcome to today's episode of the Dicerra Podcast. I'm Theon te Koeti, the CEO and founder of Dicerra, a web and mobile platform designed to advance the professions of healthcare and aviation. On today's episode, we are super lucky to welcome Dr. Bill Bestic, who has a super interesting career span that I think would take most of us three lifetimes to achieve. A few points to note: he was a Special Forces officer in the New Zealand SAS, and for those who are watching and listening that don't know what that is, it's a tier one special forces unit. There's actually a great docuseries available on YouTube, I believe it's called "NZSAS: First Among Equals," highly recommended. But that wasn't enough; he left the SAS, became a doctor and an anaesthetist, and just to put the icing on top, a helicopter pilot as well. So we're super lucky to have you, Dr. Bestic. I'd like to welcome you and ask if you could please introduce yourself and maybe share some of the highlights of your incredible path into medicine.

 

BESTIC: Thanks, TK. Thanks for inviting me. You know, we've chatted earlier, but certainly I think we've got a lot of crossover, with your background as a pilot and in aviation. I can certainly recognize a lot of the same mindset that you have. And certainly the various professions I've had, although they might seem quite disparate, I haven't really had to fully retrain in any of them, because there's so much crossover between each of those professions, and we'll probably touch on some of those crossovers a bit later. But yeah, the mindset, the skill set of each of those is built on the one previous. I'm a very, very junior pilot, very much a fraud to call myself a pilot, but I work part time as a commercial helicopter pilot. And even though I'm very, very junior in my flying status, the aviation mindset really resonated with me. Right from those early days of being trained as a student pilot, the skills and mindset being drilled into me by instructors were things that took me years to learn in other professions. So I was really impressed that aviation starts with that approach, whereas in the others it's either unspoken or it's just gradually formed. So yeah, to go back to your original question: I started in the New Zealand Army, in the New Zealand Defence Force, which I know you were a member of as well. I spent a few years there in the infantry, then did selection and ended up in the unit for a while, and then came over to the Australian Army and spent a couple of years there. I decided to go to med school on a bit of a whim; there was a TV show called ER, if you remember back in the day.

 

TK: I remember it, yeah.

 

BESTIC: I thought, yeah, I'll have a crack at that. The absolute naivety and blustery, youthful confidence, without actually fully investigating what might be involved, which is sometimes helpful, right?

 

TK: Right. 

 

BESTIC: So I stumbled into that, following a dream. In the meantime, I paid my way through med school by doing private security consulting in the Middle East and South America and other places. And then afterwards, when I had a bit more time on my hands, I went and did my commercial helicopter licence, and now I just sort of split my time between the two jobs.

 

TK: That's huge. The last time we were chatting, we discussed some of the advantages the aviation mindset has brought to your practice as a physician, particularly in the human factors context, which you kind of alluded to. Can you explain to our listeners the difference between, say, aviation and healthcare in the way fatigue, for example, is treated?

 

BESTIC: I think between the two, in one industry it's driven from within, and in the other industry it's driven from without. My observation is that, amongst pilots (generally speaking, of course, it's a bell curve), the concept of managing fatigue is seen to be important.

 

TK: That’s right.

 

BESTIC: That to fly tired is a really stupid thing to do. That it will cause accidents, that it's reckless, and there's nothing cool about being up all night and then getting in an aircraft and flying. Now, part of that might be because I came to aviation in my 40s versus my 20s. But it's generally accepted, and I noticed when I worked on the rescue helicopter as a doctor about 10 years ago, the paramedics on the helicopter and the pilots were always most concerned about whether the doctor had been up all night. Because frequently we had been. We didn't think anything of it; we might have been up all night in the hospital and then turned up for a helicopter shift in the morning. And in the morning brief, everyone would look at you as a doctor and go, were you on night shift or on call last night? And you'd think, uh, maybe. And if you had been, they would take you off the helicopter, or they would ground the helicopters for the day.

 

TK: Okay. So you're, like, part of the flight crew and

 

BESTIC: You're part of the crew, and your behaviour has put all of us at risk with your fatigue. Even though you're just a numpty in the back, a passenger basically. So I was really kind of like, wow, this is pretty weird. In fact, the morning brief would start with: okay, what's the status of the helicopters? How many hours have we got left on the airframe? What jobs do we know about? What's the weather? Then, to finish the brief, we'd go around the room; there were always two helicopters on, generally. Everyone would say anything that might affect them on the duty today. Someone might say, "Actually, I'm going through a divorce and my mind is not in the game today." Someone else might say, "You guys will know, but I've got a new baby, and trying to get it to sleep has kept me awake at night. So keep an eye on me today." And I would look around the room at all these rough, tough, type-A blokes talking about their feelings and how they might be tired, and think, "this is really weird. I've never seen this." Now contrast this to the medical world, where they're trying to bring in safe working hours from above, from without. But the culture within is "suck it up, man."

 

TK: And that starts in residency, doesn't it? 24-hour shifts. "We did it, so you've got to suck it up and do it."

 

BESTIC: Exactly, exactly. There's a bit of a running joke that, you know, as a consultant, when you leave to go home during the night, you usually hand over to a senior trainee, and you'll sometimes look at them and say, "if you need me, just get on the phone, you know, don't be afraid to cope." It's this kind of thing like, yeah, I'm supposed to tell you to call me, but feel free to cope on your own, right?

 

TK: Right.

 

BESTIC: Because actually, if you don't call me, then I'm completely absolved of anything that goes wrong. And certainly don't call me to say, "Hey, I just want to let you know I'm doing this and I don't need you," because then you've woken me up for no reason.

 

TK: Right. I mean, all of the literature on the effects of fatigue on the human body, what it does, and how inebriated you can be after 18-plus hours of continuous wakefulness, it comes from the medical community. That's what's informed aviation's, you know, safe practice with respect to crew rest. Yet it seems like within the medical community, from whence it came, there is still a cultural resistance to adopting it. And I don't mean to completely dump on healthcare. It's also, you know, you have problems with scheduling and with manning and personnel, and there's only so many doctors, there's only so many on-call staff. And if you need to have an ER staffed, perhaps you have no other choice than to have somebody on for really, really long shifts. Is there another way to mitigate this?

 

BESTIC: Yeah, I mean, again, there is always a way to mitigate it. You know, if I look at, say, my colleagues that work for some of the airlines, they have standby pilots, right? So if you call in sick, there's a standby roster to call them in, even if you call in fatigued, right? "Hey, I'm fatigued." And the standby guy gets called. We don't even have a backup in medicine. If you call in sick for your shift, there's no cover for you. In fact, it's just more work for your colleagues. So you're guilted into coming to work tired or unwell, because there is no system; the system won't pay for a second set of medical people to be available for the ones that are sick. There's a bit of a can-do culture. I work in a major tertiary trauma centre, and if I'm running theatres at night, we're generally staffed for one operating theatre to run all through the night. And it's not uncommon for us to do that, every single night of the year. If we have to run a second case, say we're doing an urgent head injury and then someone needs an urgent caesarean section for a baby, we can sort of just run two theatres at a bit of a pinch. It's not fully resourced, but we can do it. And then if a third case comes in, which it sometimes does, we're not actually resourced for that at all, but frequently we'll just do it anyway.

 

TK: So you're using the same staff to bounce around between those three, okay.

 

BESTIC: So I might say to my anaesthetic nurse, who's supposed to be allocated to my theatre, "Hey, I'm pretty good here for a while, why don't you duck next door, to the theatre next door." And we'll just kind of keep the door open between the theatres and help each other out, and we just sort of muddle our way through. Because the alternative is we just let the patient bleed to death sitting at the entrance to the operating theatre, because we don't have enough people. And I've had nurses say to me, you know, that they get really tired of this, because every time we make it happen, no one fixes the problem. And if I escalate that up the chain, to say, hey, look, we had to do this case last night with not enough people, I get told, "Well, no one made you do it." So you've got two options as a physician at two o'clock in the morning: you just do it, or you complain about it and nothing happens. If you did nothing and let that patient die, ethically and morally, that's pretty difficult to do, and you'd be widely criticised by your colleagues. For what, you were tired? Like, seriously. I once called in fatigued, once, on the helicopter actually. Because I thought I'd picked up a bit of the fatigue culture. I'd been up all day, got called in that evening, went all night doing a fixed-wing retrieval of a sick patient. And as I handed them off at seven o'clock the next morning to a hospital, they redeployed me for a third task. By midday on the second full day, they tried to deploy me for a fourth task. And this is the medical component of the retrieval unit. And I said, "Hey, look, I'll do it if there's no one else, but I've got to tell you, I've watched three shift changes of paramedics and nurses in the same time that I've been on. Three, and I'm the same person." And the response I got at the other end was, "oh, we're calling in fatigued, are we?"

 

TK: That’s unbelievable. 

 

BESTIC: So I felt really guilty. Now, this was 10 years ago, so maybe things have changed. But there's this real resistance to declaring that fatigue is a thing. Like, if you've got COVID? Well, no one can debate that. If you come in with COVID, we don't want to give it to patients, so actually they won't allow you to. But if you don't test, well, that's easy too; you just come in with COVID. So there's, A, no appreciation that it's a problem; B, the system's not supporting you, as in paying for it; and C, within the culture, you know, I'll get a trainee say, "Hey, I'm really tired, because I've done two night shifts this week," and my immediate thought is, "well, when I was training, I would have done twice that many night shifts," you know.


TK: When you say, you know, there's just not support at the system level, the first thing that brings to mind is a couple of things. One, that when an accident happens, or when care is compromised in some way, it usually only affects maybe one or a handful of people rather than three or four hundred people at a shot. So it's not a newsworthy incident if fatigue contributes to compromised care in healthcare, as it is in aviation. And the second thing would be, there probably aren't other data points to support it either. Because when you're dealing with that single report to a single person in the chain, and they kind of dismiss you like "oh, you're fatigued, are you," it never goes anywhere. As opposed to having, you know, 30,000 different data points reported over a very short period of time, which leadership at the system level have to take into account; now the data is there in front of them, and it's inexcusable. But of course, that doesn't help you guys on the frontlines, at the clinician level. In the interim, it's going to take time to change culture, and it takes time to build up data. So is there anything that you can do when you know you're fatigued, and you're integrated within a team that may not be quite as fatigued as you? Something you can suggest to your team, like, the holes in the cheese are starting to line up, and can you provide an additional layer of safety, knowing that I'm potentially compromised right now? How would you approach that in a medical context with your team?

 

BESTIC: Yeah, so there are certainly strategies that we can employ as workarounds to fatigue. In fact, I should find it, there was a memo from one hospital years ago that suggested that you should drink coffee when you're tired as a doctor. That came from management. So rather than acknowledge that you might be tired, it said just have coffee. Thanks, that's great, cheers. Appreciate that. I hope you sent that email before you went home at five o'clock too, that's important.

 

TK: If the coffee doesn't work, try a Red Bull.

 

BESTIC: Yeah, well, none of the coffee machines work, so you need to bring your own. So, the workarounds that I've developed and seen used well. I was given some of these tips, for example, by senior colleagues when I finished my training. One of them was: when you're really tired, don't put any chairs in the operating theatre, then you won't sit down and fall asleep. The fact that we even discussed that. Imagine that being mentioned in the cockpit: "why are you standing?" "I'm just too tired. If I sit I'll fall asleep, so I'm just going to try and stand and fly this thing; that should keep me more alert." So that was one. The other thing I've tended to do, following on from the aviation style, is, well, I'll give you an example. Very recently, I ended up being awake for 18 hours. Then I had a four-hour break, and then I did another 11-hour shift. And that's stupid, right? Stupid. I shouldn't have done that. But the way it sort of insidiously comes on is that, you know, I'm in a public hospital on the Tuesday, and then I'm in the private hospital on the Wednesday, so there's no real cover for me on that next day. And I happened to get called in for a major trauma that took a lot longer than expected and didn't finish till two or three in the morning. So I go to the private list the next morning, and there are 11 patients on it. And I say to the nurse, who I know, "hey, look, I just want to let you know that I haven't had much sleep. And I know it's unprofessional. But you need to keep an eye on me." Because I now have the insight that I'm going to make errors that I'm not even aware I'm making. That's the really, really dangerous part of fatigue. You know, if you were filmed on a day that you were tired, and they played it back: here are the 14 errors that you made that you didn't even know you made.
Did you know that you picked up the phone and said something that was totally unintelligible and hung up and thought that was normal? I mean, all that stuff, right? So I said, look, I am going to make mistakes today that I'm not even going to be aware of, so I need you to be doubly vigilant on me. One of the 11 patients, I know, has an allergy to cefazolin, and every patient is going to get cefazolin. So let's together come up with a strategy to stop me accidentally giving that to the patient. How about for every patient, you say, "Hey, Bill, is this the patient with the cefazolin allergy?" And I'll engineer the error out by not pre-drawing it up like I usually do. So worst case, I forget the drug. But it's only prophylaxis, so that's better than giving a drug that's going to kill someone. So I'm sort of managing the risk there. Let's schedule a break at 10 o'clock, get caffeinated and fed. And because I'm going to be freshest in the morning, I'll pre-draw a lot of the drugs for the cases and label them, so that'll reduce drug error down the track. And I'll let the surgeon know that I'm a bit tired as well. So there are some human factors strategies that I learned from aviation, potentially, some workarounds to fatigue, that have been quite useful; essentially threat and error management. But then I think back to, say, my time in the military and the unit. We never managed fatigue there either. We just boxed on. But I think we were younger, and there wasn't as much really at stake. Although we say it's life and death, it's actually a lot worse when you're flying an aircraft or you're dealing with a patient.

 

TK: It's interesting, you know, going back to what you were saying about some of these suggestions to take away chairs, or just to drink coffee and that will solve everything. It seems like addressing symptoms rather than the root cause. And what you were able to drill down to, using a little bit of human factors background from aviation, was kind of reverse engineering the root cause analysis, to say, "What could possibly go wrong? And where would we trace it back to?" Drawing up these drugs, and the fact that somebody has an allergy and the other 10 patients are all going to get the same thing. So taking the starting point of where an investigation would go back to, and then nipping the potential error in the bud, right at the source. How valuable do you think it would be for other folks in healthcare, who have grown up in the culture of just suck it up and, you know, the pride of working 24-hour shifts, who have had no exposure really to the other culture, the flip-side culture in aviation, to have maybe some CRM or some human factors training? Just to make people a little bit more aware of their fallibility, and that it's not a pride issue, it's just part of the human condition that we ought to be aware of, and there are tools you can use to leverage the team around you. How important would training be, and how important would leadership be, at like the small team level, to implement that?

 

BESTIC: Oh, it would be critical. Like I said, it has to come from within. The physician, the individual, has to feel that fatigue is a problem. And I think when we've started to look at culture and cultural issues within institutions that I've worked in, I've found the most successful approach is to couch it in terms of performance, because you've got to find a self-interest. People are pretty self-centred, I think. So if you can say, "This will reduce your error rate, this will make you better, this will make you better than the rest. This will be a performance advantage, a life hack," people are more interested than if you say, you know, "This is a safety issue. It comes from above." So I think it's important that it's couched in terms of performance. The second issue is one you alluded to earlier, around investigation. I have never seen an investigation into a medical error that mentioned fatigue, or even asked what the doctor's roster was like. Not once, ever.

 

TK: That's quite, that's shocking, actually, coming from an aviation background, where it's always a question that gets asked.

 

BESTIC: So this week's been quite interesting in Australia. There's been very public reporting about the death of a young man in a hospital intensive care unit. It was a motor vehicle accident, and on about day 12 he had a tracheostomy, so that's a tube put in directly through the front of the neck. When patients are intubated through the mouth for more than nine or 10 days, in the intensive care unit they'll frequently take the tube out and put it in through the neck, because it can last longer. And he also had facial fractures. During his admission, this got dislodged, and he ended up dying of a hypoxic brain injury. And the coroner has just published their report. This is open source reporting; anybody can find it. There are some interesting points out of that. The coroner has referred something like four of the doctors for disciplinary action to the medical board for unsatisfactory professional conduct. So this raises some really interesting issues. They had a bit of a strategy, they'd pre-written a plan, and when they followed the plan, the plan didn't work. So they were heavily criticised for following the plan. They were told you should have exercised better clinical judgement. And I think this report speaks to the heart of the conflict that doctors have at work. If we blindly follow a protocol, we're criticised. If we breach a protocol and it goes wrong, we're criticised. Those doctors gave free and open evidence at the coroner's inquest, and are now having that evidence used to find them unsatisfactory. So what's the level of trust for other doctors, when they read that, in future coronial inquests? You'd better lawyer up, and you'd better keep your mouth shut, because this can be used against you. The family were highly critical that the hospital seemed to be hiding notes and not providing information to them. And that probably speaks to the hospital's litigious, defensive nature and the people involved.
So when an accident happens, people generally run for cover, because the hospital is very quick to chuck you out if they can, because they want to protect their own reputation. And you contrast this to aviation, where, when you buy a ticket, you indemnify the pilot from personal error. It doesn't mean the pilot can't be found responsible to some level. But you as a passenger can't, to my understanding, go and personally sue that pilot. You can do it to a doctor.

 

TK: That culture of litigiousness in medicine is a differentiating factor between these two industries, and a big one, a significant one, because the blame culture for medical stuff is very different. And I can absolutely see how it would undermine the trust of people to report freely and openly with the intent of sharing advice on best practice, knowing full well we are fallible. And oftentimes when things happen, it's not even that a mistake was made, or an error was made; it could be a broken process. So, I mean, an investigation should be: let's look at the root cause analysis, and look, maybe the process needs to be fixed, or maybe there's some other systemic problem that we can address. We can actually be informed by these four people that were involved, to help us make the system better. Coming down on them with punitive measures, and saying, "We've figured out the answer to the problem, and they are the problem. And if we punish them, the problem goes away," doesn't really help anyone, and it creates a whole other problem, which is now people don't want to report, like you mentioned. I don't know what the best way forward is for the Australian system. I know that New Zealand fairly recently, within the last maybe five years, implemented an approach towards non-punitive and anonymous reporting. Perhaps partially influenced by the way reporting is conducted in aviation. And I'm not sure what the data is on whether there's been an uptick in submissions because people are more open to reporting. But I think what you said earlier about framing it in terms of human performance might be a really valuable thread that we should pull on here. Because when you frame it in terms of, "Hey, this is a safety problem," it implies that somebody's responsible somewhere, whereas if you frame it as a performance problem, you're thinking, "how can I just get better?" instead of "how can I punish someone else?"

 

BESTIC: Well, anonymous reporting is interesting. You know, there's been some powerfully negative data out of the United States around anonymous reporting. And we're starting to implement, or we have already implemented, anonymous reporting systems in Australia, under this beautiful title of Speaking Up for Patient Safety. I mean, who wouldn't want to implement a program called Speaking Up for Patient Safety? Just marketing genius, right? And the evidence is that these programs are wildly flawed. The evidence is, they call it the weaponization of safety systems. The vast majority of reporting is actually geared around airing personal grievances, dobbing in your competition. You know, if you're an up-and-coming plastic surgeon and you want to knock off some competition, you can put in a stack of anonymous reports around their performance and they get suspended and investigated. Online reviews of doctors, people with grudges. A nurse that is unhappy with a particular doctor on the ward can get her and her friends to put in a stack of complaints against that doctor. Maybe the nursing unit manager speaks to one of the junior nurses about her punctuality at work, and that junior nurse decides to initiate a series of anonymous complaints against the nursing unit manager. That's what the evidence shows. In fact, the United States experience is that it skews the reporting towards minority groups, females, and pregnant females. So it hasn't been successful.

 

TK: Is there a way to? Because I can definitely see how that can be weaponized. And I can actually think back to a case in aviation where a flight safety system was weaponized by an individual going after other people, in a very similar method to exactly what you described, which, you know, you've got to avoid at all costs. So would an appropriate measure be that when you have these systems, there is no mechanism by which one individual can inform on another individual? The anonymity is: I want to bring something up that's process related, or even something I succumbed to myself, to make sure everybody knows about it. But there are no identifying features of anybody involved, so that the data points we have are, "okay, well, this individual, and we don't know who it was, was awake for 27 straight hours, and this cluster of events happened. There's a systemic problem here. And we can log that as a single data point and then compare it with 30,000 other very similar ones, to now bring a case for some kind of change." You would have to remove the mechanism by which one person can, you know, leverage a career against another person. That would be, you know, fundamental. It's surprising that Australia is pursuing a model where that is the case, knowing the way that humans are, the way that you just described.

 

BESTIC: Well, it sits very well from an executive and management perspective. You know, there is definitely a trend away from strong leadership. Strong leaders are not rewarded for being strong leaders. If you want to survive in an executive position, you don't make strong decisions. It's a bit like we see with politicians: you don't make a decision, you make a referendum, you have another inquiry, you have another royal commission, you let someone else make the decisions. And we see this a lot in healthcare. People don't want to put their head above the parapet. They don't walk the floor and talk to people and find out what the problems are, because they don't want to know what the problems are. And by having anonymous, bottom-up reporting systems, you actually disempower managers at every level, because that manager is now watching their back all the time. Why even bring up someone's punctuality? Just keep your mouth shut; you're better off doing nothing. Because the moment you start engaging with people, you have interpersonal conflicts, which has to happen when you're a good manager, if you want to manage performance. And performance and safety, to me, are linked: if you've got a high-performing organisation, it's safe. So all those little things matter, people's behaviour, level of professionalism, and so on. That often requires a manager to correct behaviour, and in really mature organisations, individuals correct each other.

 

TK: That’s right. 

 

BESTIC: But if you've got a system that empowers all of your subordinates to complain about you, and puts weight on that, then you're going to encourage managers to take a back seat. And it looks great from outside. "Yeah, look, we've got a system called Speaking Up for Patient Safety, high five, look at all the reports we're generating, and this person didn't wash their hands, check that out." I mean, we just get an absolute emphasis on minutiae, and like you're saying, we don't have the data points to say, well, is that actually important? And we've got two groups of people that are trained to think differently: doctors and nurses. Opening a can of worms here.

 

TK: Tread carefully, this is a minefield.

BESTIC: Yeah, might need to edit this out. The general mindset in nursing training is that protocols are important. Protocols save lives; systems and protocols and procedures exist, they're important, and they must be followed. The medical training encourages a bit more independent, free clinical thinking. In the military context, we would call it an immediate action: patrolling in the jungle, you get shot at, all the soldiers are trained to perform an immediate action in response to being shot at, and that allows the platoon commander to then make some decisions. So there's an immediate reaction followed by deliberate decision making. Same in counterterrorism or anything else; you know, that's what SOPs do. They take away that: "okay, here's my immediate response. I have an engine failure in a helicopter, I immediately enter autorotation, and then I consider what I need to do next." So from a medical perspective, it's like, right, we've had the immediate reaction; now the doctor arrives and starts to make clinical decisions. That's where we can clash. The nurse goes, "but there's a protocol for this," and the doctor might be saying, and may not be articulating it very well, "sure, but I'm going to deviate from that, because I've got experience in the area," or what have you. And you look at this latest coroner's report: the doctors are criticised for not showing independent clinical thought. So when you put those two systems together, they're going to clash. And because the belief system within the nursing culture is a bit more "when you deviate from protocol, it's bad," then every time they see that deviation, they're going to report it. So the system is going to get swamped with things that may or may not be relevant, and it's going to become wildly distracted by things that are not actually important. We're going to chew up an awful amount of our cognitive space as an institution on stuff that's just not important.
And in the meantime, leadership just sails off into the distance.

 

TK: If you're going to implement some version of this reporting, it must be non-punitive. If you have a punitive element, you're still looking at "who can I blame for something" instead of "what can we fix?" And the answer might be nothing can be fixed right now; the answer might be we need a little bit more time and data, and we'll figure out a better way ahead. Or it might be a technological fix that we haven't come up with yet. But it really ought not to be "let's find the person, the individual, to blame," because that's just such an unhealthy culture. And I think we've managed to get to a space, maybe, in aviation where that's not the case. It would be a rare case that somebody faces a heavy penalty for negligence in aviation. Most of the time, it's the human factors, the HFACS classifications. You know, somebody files a report on themselves for a runway incursion at an airport in the US on NASA's Aviation Safety Reporting System, and they do a thorough background check, and they classify it in terms of the human factors involved, and they publish it for everybody to read and learn from. That pilot, you know, conceivably may not lose their licence. There may not be punitive measures, because they might not be warranted. But, you know, healthcare, like you said, you need to get to a point where it's a mature culture, where peers keep each other in line without punitive measures necessarily needing to be mandated, and it requires strong leadership. And those two things, it's going to take some time to build that kind of a culture in a system that has been allowed to grow in a different direction for so long, right?

 

BESTIC: Yeah, I don't know. It'll be interesting to see whether it can ever happen. The Civil Aviation Safety Authority in Australia, CASA, publishes a regular flight safety magazine. It's ironic when you compare the two professions. You can write in about your own error, and people write about the dumbest stuff, you know, "I flew when it was dark, and I don't have a night rating," "I flew into a storm, despite the fact that it was," I mean, you wouldn't script it, and they get $500 if it gets published. Now, in the medical world, if I stand up at a department meeting and admit, "hey, look, I made a really bad error the other day, and the patient ended up dying or something bad happened," or there's a poor outcome, me talking to the department could be subpoenaed and used by the family's lawyer against me. So when we discuss cases in a department, that's not privileged. That information can be subpoenaed. So everyone's a bit careful about what they say. Two hospitals can be over the road from each other, and they won't share their near misses with each other.

 

TK: Yeah, and you're in no position to advance performance or advance the profession when it's that siloed, and when people are afraid of a blame culture.

 

BESTIC: The only thing we can do is work within our own silos; we can actually use the silos to our advantage. And this is the point I've come to in my career: I've given up trying to fix the system, because it's exhausting. You can't fix something that doesn't want to be fixed, or doesn't see that it has a problem. You know, there's this concept of the narrative fallacy. Humans like a simple story. We don't like complex stories; they're too hard. So the story that the doctor made an error is more appealing, because looking at all the causative factors, you know, just takes a long time. Whereas in aviation, the immediate understanding is that there is going to be more than one cause of this crash. Always, always a contributing factor. Sure, the pilot might have made a really bad error. And maybe in this particular crash, fatigue wasn't an issue. But when we look at the culture of this particular company, there is a culture of over-rostering and not letting people call in sick that is going to cause another problem down the track. And because all those little organisations know that if a crash happens, the investigator is going to come and look at everything, that carrot and stick kind of works. And when investigators come, they are professionally trained investigators from an external body. We don't really have that; we might just get a doctor from another hospital to come and investigate. So we don't have this professional body that moves around investigating things, and we should. I made a controversial statement a while ago, asking: how would people feel if we recorded, by voice, all the theatre conversation? For every case?

 

TK: Like an FDR or a CVR.

 

BESTIC: Exactly right. I said, why don't we? Like the monitoring on the anaesthetic machine, that's all your flight data. I record that, sometimes manually, for a whole case. I do. The very data that might hold me responsible, I'm the person recording it. So when the blood pressure is a bit low, it's on my honesty to record that accurately. It shouldn't be like that. That should be recorded in a way that I can't access. But when I suggested the recording, people were like, "why would we do that?" I said, "Well, cockpits are recorded. It doesn't stop me talking about whatever I want to talk about." So there's this absolute: there's no way we want visibility on what we're doing inside our operating theatre. Absolutely not. People are mortified about the recent cases where patients have hidden listening devices on themselves, little USBs in their hair bun or something, and then recorded it. There was quite a famous case out in the States, where a patient had a colonoscopy, the anaesthetist and surgeon were disparaging about the patient, and the patient went to the media with it. But you know, the horrifying thing shouldn't be that the patient recorded it; it should be that there's nothing happening that we'd mind having visibility on. But because we're personally liable and responsible, that changes the nature of it. So I think there's got to be that understanding that there is always more than one causative factor. And I've even noticed in aviation, I had a colleague recently who had a minor crash. People are very careful not to, I mean, it's human nature to go, oh yeah, well, I'm not surprised he did that, or, you know, we've got a competitive nature where we want to put that person down if we can; it makes us feel better. But pilots, I've found, are very critical people. People say to me, the only thing two pilots in a room will agree on is that they're better than the third pilot who's not there.
So pilots are very, very critical of each other. But they're also quick, I've found, to say, "we've got to wait for the whole report." That doesn't happen in medicine. We won't talk about accidents; we'll keep it quiet. And so the only way to change it, coming back to the siloed thing and using that to our advantage, is to say: "Well, why don't you, as an individual, stand up in front of the department and talk openly about your errors?" Having the courage to do that will create a culture amongst junior staff in which it's okay to talk about errors. And at the tertiary hospital I work at, that's generally what happens: the most senior, most respected, most capable people will frequently jump up and talk about something basic. I want to introduce a system called Consultant Confessional, where the consultants talk about the dumbest things they've done in the last week, or the cannulas they've missed, or anything, because what it does is tell people that we're human, and we're not pretending to be something that we're not, and that would open the door a bit.

 

TK: Have you implemented that yet, or is that a future goal?

 

BESTIC: No, we've got it on an ad hoc basis. You know, yesterday in fact, on a Thursday, so two days ago, in front of junior trainees, I spoke to one of my senior anaesthetists, who mentored me, and I've got a great deal of respect for the guy. And I said, "Chris, I need to confess something to you that I did yesterday." And I told him that I'd given a drug to a patient that I shouldn't have. I said, "I gave this drug, the patient was in heart failure, they nearly had an arrest; it was a really dumb thing to do." And he goes, "Well, that was stupid. I've told you not to do that." I said, "I know." He goes, "Did you think of me when you did it?" I said, "I did." "And did you have a bad sleep last night?" "I did." "Do you feel better for telling me?" "A little bit." I said, "I just feel annoyed with myself that this far into my experience, I'm still making basic errors. It really annoys me." But the reality is, of course, that we make errors all throughout our careers. Part of it is that I've got to get that off my chest and not hide it. I've got to give it airtime. And it also encourages other people. I went back to one of the trainees and said to him, "hey, look, you saw me give that drug yesterday. You knew I shouldn't have, right?" He said, "I did think that." Some people, you know, but anyway. So look, this is not solving the world. It's not changing the whole system. But when we get overwhelmed with the frustration of "Well, God, how do I fix this?", we can just start at our own level. We can start by, you know, when an instructor is pushing you through a pre-flight, wanting to take more time to do it properly. We can effect change within ourselves, and seek mentors, and show leadership. It's not going to change the world, but it's going to make you a better human. And again, we come back to performance: the greatest accolade we can get as physicians is whether our own nurses, the ones who see us every day, would trust us with their family.
Because they see us at our worst. So if they trust us, that means something. If your mum or your dad, or someone, needs to see a cardiologist and you don't know any cardiologists, you ask the recovery nurses, or the nurses who work with the cardiologists. They'll know.

 

TK: Yeah.

BESTIC: Because they see.

 

TK: I was an instructor pilot for a number of years, and I had a similar measure: would I want my family flying with this pilot, if they were a student I was teaching or checking?

Related posts

Full Interview – Dr. Brian Goldman on Improving Healthcare Delivery in Canada

Dicerra's full interview with Dr. Brian Goldman of CBC's White Coat, Black Art, in which he discusses human performance, human kindness, and the obstacles to overcome in improving healthcare delivery.

Dr. Brian Goldman on Blame Culture – Highlights

Dicerra interviews the host of CBC's White Coat, Black Art about reporting culture and its effects on human performance in healthcare.
