Quality 2012: Evolving Frontiers in Quality & Patient Safety

Presentation: How to Prevent Human Errors Transcript

(00:00) We are very pleased to have Craig Clapper with us today. Mr. Clapper brings 20-plus years of experience in performance improvement. A lot of his experience early on was in the energy industry and nuclear power plants and commercial aviation. Chuck probably has some symmetry there with him. But then, you know, over a number of years, he has also worked with a number of outstanding healthcare organizations around performance improvement, around failure analysis, around increasing reliability. Some of his former clients read like my who's who of health-care quality, like Sentara and Sharp, and so I'm very pleased and I'm looking forward to hearing him speak today. And without further ado, Craig.

(00:53) Thank you. Well, the first thing I do when I listen to a presentation, I always peek ahead to see how many slides I'll have to endure. I won't ask anybody if they've already done that. But I know in a PowerPoint slide show you can type a number on the keyboard, hit "Enter," and it takes you to the last slide, which is 40, and there's my contact information. So we'll hit "1" and "Enter," and we go back. You should always get something useful out of every talk, and that might be it, just that little PowerPoint tip.

(01:23) Slide two is really an introduction to our company. We're a reliability company. Most people pass us off as a safety company. Here are those six aims of the IOM; they put "safe" right in the middle. Safe is one of those where zero is the perfect number. You want zero cases of harm. And around the other five is where one is the appropriate number, because one is the perfect number to an engineer like me. That means unity: every patient, every time. So you might say a hundred percent, and that's how you get zero and a hundred percent as the right numbers.

(02:00) If you can't remember all six, then try these three. Do nothing to cause harm. Do everything you can to heal. Try to treat them like a person in the process. The little girl is the patient. I'll point out her ID bracelet right there. I hear sometimes the health-care people can't see those very well. The other two are her brothers, and to my knowledge, nothing bad has ever happened to that patient. I just like the picture. What do you say to a little girl to get that look? Most of them just turn away and gasp when I talk to them.

(02:34) Not today at the hospital, but later when you're at home, I want you to go onto YouTube and listen to Jenny's story. The reason I don't want you to try it at the hospital is that the SafeSurf filter will block you and you won't be able to see it, and you'll get reported to your IT division. Jenny tells her own story, and what I think is interesting about this case of harm is that it was coordination of care. She didn't receive the antibiotic before the procedure. They knew that. They elected to proceed anyway. Then you listen to the story with the number of procedures that are racking up, and she even has, right here in this line, brain aneurysm, loss of sight in her right eye. She elects to have the leg removed. She said, "I need to get my life back." Big weight loss. She's been hospitalized for quite a while. Even though the physicians think they can still save the leg, she elects to have it removed. And at the very end, the bottom line: "My life wasn't ruined by this, but it certainly was changed." So who was the recovering optimist? Dr. Denham, was it? I think she's in that same group.

(03:51) I grew up in the nuclear power industry. If you look at this comparative unreliability slide, over there in the far right-hand column, the chances that a member of the public would die in a nuclear power accident: 1 out of 100,000,000 per year. Pretty reliable. Chances that a passenger would die in a commercial flight: 1 out of 10,000,000 per departure. Chances that a patient would die from a human error in a hospital setting: 1 out of 1,000. If you look at the other dimensions, though, there's not that much difference in reliability. So I think one dirty secret is that aviation and nuclear power are essentially one-trick ponies. Like hiring a pony for the birthday party, they're very good at that one trick, which is safety, but not if you look at the other dimensions.

(04:48) So I chose this number; it's from Medicare. Do our patients get recommended care for heart attack? They said 82 percent of the time. Very close to Charlotte Douglas airport, which was number one that year for on-time arrivals. Who here is going through Charlotte airport? Can you believe they're number one for on-time arrival? And then over here, capacity factor: how much electricity do they make? When you get to cost, it's not apples and oranges. It's probably more like apples and anvils.

(05:20) So here's cost per kilowatt hour, price per available seat mile, price per average admission in the U.S., if there is such a thing as an average admission. Across the bottom, this one says, "Number of people who agreed or strongly agreed that my care provider listened to me and understood my needs." I was surprised the number was so high, because when I talk with you folks, you say, "Maybe we don't really do that that well." But 89 percent said, "I think they really understood what I needed." But here is Continental, your hometown airline, number one with a whopping 664 out of a thousand points according to J.D. Power. And then over here in the power industry, nobody really knows where the electricity comes from or where it goes. It's really unknown.

(06:10) What they do in Texas, because there are three grids: there's an eastern grid, there's a western grid, and then there's the Texas interconnect. So you get your own grid in Texas. Everything's bigger in Texas. Whenever I say, "Here's our thinking on culture change" or "here is what our data say," it really comes from these 105 hospitals spread over 27 systems. When we were working in this timeframe, I was still with Dr. Chu at Performance Improvement International out of San Clemente, and then here is our existing company, Healthcare Performance Improvement. There were a couple of surprises, like Novant Health and Memorial Hermann; I think they have a place not too far from here.

(06:56) If you want to read more about it, I recommend these three case studies. They're probably the best published. Up at the top, the Memorial people up in Savannah, they published mostly results: $21 million in cost savings spread over two years. Here are the Sentara people in Norfolk. They're best published through the American Hospital Association McKesson Quest for Quality Award. They publish mostly on spread. You know, we can do this in our hospitals. We can do it in every hospital. We can do it in our medical group. We can do it in home care. Everywhere they had Sentara people, they said, "I think we'll try this," and showed that they had results. They crossed over that 50 percent mark at 18 months. And then if you'd like a narrative, the Advocate people up in Chicago, eight pages to describe here's how we went about it. I think the author did a very good job capturing the reality of the situation.

(07:49) So I want to look at some outcome data. This first one is serious safety event rate. For something to be a serious safety event, you have to say, "It was our poor care that caused that bad outcome, and that bad outcome was greater than moderate temporary harm." Always remember the word "harm" has built-in negativity; it means you caused something bad to happen, so not every bad outcome is a case of harm.

(08:26) The bars are the number of cases in any one month; that reads off the right-hand scale. That red line is a 12-month rolling average. Over here, this is the number of cases involving serious harm per 10,000 adjusted patient days. Not being a clinician, I don't judge patient care directly, but if you were to tell me, "Oh, I think that hospital across the street is pretty good," I would infer one case for every 10,000. If you told me, "I don't think they're very good over there. I wouldn't go over there myself, but a lot of people do," I would think it's probably closer to three. If you were to say, "Nobody in town would ever go over there," that would be like 10 cases per 10,000.
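
Since this metric comes up again later, here is a minimal sketch in Python of how a rate like this can be computed. It is an illustration, not the speaker's tool; the event counts and patient-day figures are made up, and only the per-10,000 denominator and the 12-month rolling window come from the talk.

```python
# Minimal sketch (illustrative, not the speaker's tool) of the two curves
# described above: monthly event counts (the bars) and a trailing 12-month
# rate per 10,000 adjusted patient days (the red line).

def rolling_event_rate(monthly_events, monthly_patient_days, window=12):
    """Trailing rate of serious safety events per 10,000 adjusted patient days."""
    rates = []
    for i in range(window - 1, len(monthly_events)):
        events = sum(monthly_events[i - window + 1 : i + 1])
        days = sum(monthly_patient_days[i - window + 1 : i + 1])
        rates.append(events / days * 10_000)
    return rates

# Hypothetical numbers: ~1 per 10,000 is the "pretty good" benchmark above.
events = [2, 1, 0, 3, 1, 2, 1, 0, 2, 1, 1, 2]   # serious safety events per month
days = [18_000] * 12                             # adjusted patient days per month
print(rolling_event_rate(events, days))          # [0.7407...] -- about 0.74 per 10,000
```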

(09:06) So here they were, underneath that one threshold, and they started their safety culture journey right there with the little green diamond. They have cut down that event rate by 50 percent. Here is a single hospital, a little bigger, 1,000 beds. Here is that one; one is a good number to know. Here is where they started their journey. They're down about 70 percent. So cases come and cases go. Easiest way to have an event-free year: have an event-free day 365.25 times.

(09:40) So how do you have an event-free day? I think that's what we're going to talk about in safety culture: getting everybody focused on what it will take today to have patient care without harm. Henry Mintzberg at McGill pointed out in his book, Managers Not MBAs, "Managing by the numbers is like playing tennis looking at the scoreboard." So where do we look when we're playing tennis? At the ball. Keep your eye on the ball. And that's how you make the day safe: getting everybody to keep their eye on the ball. Let's have a safe day today. And then you can sit back later and look at the numbers.

(10:21) Oddly, this is the most important slide in the entire presentation. It's set up as a reliability slide, where higher is better. So using scientific notation, ten to the minus six: that would be one defect for every million opportunities. We all know that Six Sigma is based around this idea that 3.4 defects per million opportunities would be a good delivered quality level. Think of this as a time axis. You can look at it like 18 or 24 months. You can look at it like 50 years in some cases. This right here, this lighter blue oval, is the clinical bundle. It's the end-all, be-all of a symposium. I went to the symposium; I came back with great clinical bundles. These are all evidence-based best practices that we can do in our hospital, and that will improve patient care.
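
As an aside, the notation and the Six Sigma figure can be tied together in a couple of lines of Python. This worked example is not from the slide; the conversion uses the conventional 1.5-sigma shift from standard Six Sigma practice, and the 3.4-per-million figure is the one quoted above.

```python
# Worked example (not from the slide) connecting the 10^-6 notation to
# the Six Sigma quality level quoted above.
from statistics import NormalDist

def dpmo(defects: float, opportunities: float) -> float:
    """Defects per million opportunities."""
    return defects / opportunities * 1_000_000

def sigma_level(dpmo_value: float) -> float:
    """Short-term sigma level, using the conventional 1.5-sigma shift."""
    return NormalDist().inv_cdf(1 - dpmo_value / 1_000_000) + 1.5

print(dpmo(1, 1_000_000))   # 1.0  -- one defect per million, i.e. ten to the minus six
print(sigma_level(3.4))     # ~6.0 -- the classic Six Sigma target
```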

(11:10) And then as you look at the outcomes, where people are using the same clinical bundles, there's quite a bit of variability in the results that they achieve. And I can explain that readily by using a simple parallel. I have the same golf clubs as Tiger Woods. I don't see him busting his hump, you know, flying around giving these lunch-hour talks where you guys get to eat and I don't. They sneak me a strawberry every once in a while. It must be something that Tiger Woods is doing with those golf clubs.

(11:45) That darker blue oval, that's the people bundle. Whenever you have a people bundle, where they have behaviors associated with high reliability, then you have that synergy, and it makes the clinical bundle work. So you get that shift up to this higher line, and you can get better outcomes in the same timeframe, and in the long run even more. I think that is the big difference now: we're all playing off the same evidence base. You know, we know what good looks like. It's our ability to deliver that every day, keeping the eye on the ball.

(12:19) Here is one piece of quality data. This system was looking at lowering harm; if we have fewer cases of harm, that would be better for patient care. What they didn't realize is that safety culture, being a reliability intervention, also improves their clinical outcome data as well. Here is one other system, so another county heard from. Keep in mind where it says "adverse drug events," these are only on anticoagulants. That was measured independently of problem reporting, using global trigger tool methodology. And then just one more slide on data.

(12:56) Ed Hall really helped me out with his data. He said, "You know, Craig, in the U.S., money is a harm measure." So when there is harm, you oftentimes will get this letter from an attorney, a notice of intent, and they will ask to be compensated for that harm. Well, this system was interested: if we improve patient safety, can we see it in the number of claims and losses, the moneys that we pay out to settle those claims? So these red bars are the national average in losses per acute care bed. So if you want to figure it for Methodist, take the number of acute care beds in the system and multiply, let's say, by $3,000 to make the math easy, and that's money in addition to insurance costs.

(13:45) The blue bars are this one system's data, and they always manage their losses better than the national average. That blue line is the number of claims. And as they see the claims go down, down, and down, they see their losses go down a little bit. But they haven't seen the flat line that everybody else has seen. If you want to look just empirically at these kinds of data: for every unit of event-rate reduction, about a 0.7-unit decrease in losses; that's all I'm trying to say. So they're not really one to one, but they're there.
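
For anyone who wants to reproduce that back-of-the-envelope math, here is a small sketch. The $3,000-per-bed round number and the roughly 0.7 ratio come from the talk; the 800-bed system size is hypothetical.

```python
# Back-of-the-envelope sketch of the arithmetic above. The $3,000-per-bed
# round number and the ~0.7 ratio are the speaker's; the bed count is made up.
LOSS_PER_BED = 3_000   # dollars of malpractice losses per acute-care bed per year
LOSS_RATIO = 0.7       # units of loss reduction per unit of event-rate reduction

def estimated_annual_losses(acute_care_beds: int) -> int:
    """National-average losses implied by system size, on top of insurance costs."""
    return acute_care_beds * LOSS_PER_BED

def expected_loss_reduction(event_rate_reduction: float) -> float:
    """E.g., a 50% event-rate cut suggests roughly a 35% cut in losses."""
    return LOSS_RATIO * event_rate_reduction

print(estimated_annual_losses(800))    # 2400000 dollars for a hypothetical 800-bed system
print(expected_loss_reduction(0.50))   # 0.35
```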

(14:25) Now at this point I have used this expression, "human error," quite a bit. I just wanted to point out that in a human-based system, care is delivered through people. So whenever care is delivered poorly, even though it's a system issue, it manifests through human error, so you can say "acts" if you like. You can also tell a lot about a person by the way they say the word "error." You know, I grew up in the Midwest; we say it as one syllable, human "air," like the air that you breathe. I notice when I'm on the East Coast, they really hit that second syllable hard, like a computer would say it: "err-or, err-or." Down south they really say it nicely. They put an "a" on it. It almost sounds like a good thing to have. It's one of them human "era" stories. Well, today the patient came into the ED and there was an "era."

(15:24) In our root cause practice we think the number is eight: eight inappropriate acts, a euphemism for human error, combining to cause harm. When we look at our clients' data, we see closer to four. I think what we need to do in the future is look a little harder for those things that would have, could have, should have. Sidney Dekker pointed out that if there are people involved, you can count on two things. Over on the left, they're going to make those mistakes from time to time. So if this is the dart board at the pub, I think they want the right stuff. They want it to be safe and high quality and satisfaction and low cost. It's just the variability that gives you the spread. It looks like they're aiming for the center.