Reducing Medical Misadventures: “My name is Atul Gawande and I’m a surgeon”

January 16, 2009

Safety is NOT the result of people not making mistakes – people make mistakes all the time. Safety occurs when a system is robust enough to catch mistakes.

When a team gathers in a hospital operating theatre to perform an operation, it seems a matter of basic politeness that the members should introduce themselves to one another. It turns out, however, to be more than that. According to a report just published online by The New England Journal of Medicine, improbable as it may seem, it is one of a set of basic actions that help determine whether the patient lives or dies.

The paper reports a study conducted simultaneously in hospitals in eight countries around the world, covering nearly 8000 patients. It tallied outcomes for 3733 who underwent surgery before a procedural improvement was introduced and 3955 who underwent it afterwards.

The patient death rate almost halved after the change was introduced, dropping from 1.5% to 0.8%, and the rate of inpatient complications also fell materially.

So what was the improvement?

It was the introduction of a 19-item checklist covering such mundane matters as "PATIENT HAS CONFIRMED IDENTITY, SITE, PROCEDURE AND CONSENT" and "CONFIRM ALL TEAM MEMBERS HAVE INTRODUCED THEMSELVES BY NAME AND ROLE".

Why have such items not always been standard practice? An important reason, although the NEJM paper does not discuss it, is the tradition of a steep authority gradient in medicine. That is to say, specialist surgeons stand close to God. Registrars (essentially doctors serving post-graduate apprenticeships) are a long way down, and registered nurses rate little more courtesy than "Scalpel, nurse!". As for hospital administrators, they think they are above everyone else and everyone else thinks exactly the opposite.

Authority gradient has been studied extensively by aviation safety researchers. In two-pilot cockpits teamwork is essential for safe flying (it is subsumed under the broader heading of "crew resource management"), and teamwork shortcomings have been a causative factor in a number of accidents. Here's an analysis of how an Indonesian Boeing 737 came to grief when the pilot flying the aircraft was distracted by his colleague (search for "gradient" to find the discussion towards the end of the document).

Atul Gawande is a Boston surgeon and a writer for The New Yorker. He is also director of the World Health Organization's Global Challenge for Safer Surgical Care, which sponsored the research reported in the NEJM. In 2007 he wrote a New Yorker feature, "The Checklist", in which he described how a critical-care specialist at Johns Hopkins University, Peter Pronovost, had started investigating the use of checklists, and what Pronovost did about the authority gradient:

"On a sheet of plain paper, he plotted out the steps to take in order to avoid infections when putting a (intra-arterial or intravenous) line in. Doctors are supposed to (1) wash their hands with soap, (2) clean the patient's skin with chlorhexidine antiseptic, (3) put sterile drapes over the entire patient, (4) wear a sterile mask, hat, gown, and gloves, and (5) put a sterile dressing over the catheter site once the line is in. Check, check, check, check, check. These steps are no-brainers; they have been known and taught for years. So it seemed silly to make a checklist just for them. Still, Pronovost asked the nurses in his I.C.U. to observe the doctors for a month as they put lines into patients, and record how often they completed each step. In more than a third of patients, they skipped at least one.

"The next month, he and his team persuaded the hospital administration to authorize nurses to stop doctors if they saw them skipping a step on the checklist; nurses were also to ask them each day whether any lines ought to be removed, so as not to leave them in longer than necessary. This was revolutionary. Nurses have always had their ways of nudging a doctor into doing the right thing, ranging from the gentle reminder ("Um, did you forget to put on your mask, doctor?") to more forceful methods (I've had a nurse bodycheck me when she thought I hadn't put enough drapes on a patient). But many nurses aren't sure whether this is their place, or whether a given step is worth a confrontation. (Does it really matter whether a patient's legs are draped for a line going into the chest?) The new rule made it clear: if doctors didn't follow every step on the checklist, the nurses would have backup from the administration to intervene.

"Pronovost and his colleagues monitored what happened for a year afterward. The results were so dramatic that they weren't sure whether to believe them: the ten-day line-infection rate went from eleven per cent to zero. So they followed patients for fifteen more months. Only two line infections occurred during the entire period. They calculated that, in this one hospital, the checklist had prevented forty-three infections and eight deaths, and saved two million dollars in costs…."

Pronovost was able to extend use of the checklist to hospital intensive care units across Michigan:

"In December, 2006, the [Michigan] Initiative published its findings in a landmark article in The New England Journal of Medicine. Within the first three months of the project, the infection rate in Michigan's I.C.U.s decreased by sixty-six per cent. The typical I.C.U. – including the ones at Sinai-Grace Hospital – cut its quarterly infection rate to zero. Michigan's infection rates fell so low that its average I.C.U. outperformed ninety per cent of I.C.U.s nationwide. In the first eighteen months the hospitals saved an estimated hundred and seventy-five million dollars in costs and more than fifteen hundred lives. The successes have been sustained for almost four years – all because of a stupid little checklist."  

In fact, if it comes to that, the use of checklists was pioneered in aviation safety. One of the foremost experts in the field, Manchester University Professor James Reason, became interested in the study of mistakes thanks to his impatient cat. One day, while he was making a pot of tea, the cat was clamouring to be fed; he opened a tin of cat food and then absentmindedly spooned the cat food into the teapot. (He wrote about this in a piece in the British Medical Journal in 2000, but unfortunately it is now behind a paywall.)

He developed the theory that safety is not the result of people not making mistakes – people make mistakes all the time, as he did with the cat food. Safety occurs when a system is robust enough to catch mistakes before they have bad consequences, and to correct them.

Reason's conceptual model envisages defences against error or failure as consecutive slices of Swiss cheese. Each slice has holes in it, placed at random. If there are enough slices, the chance of the holes lining up in every one, letting an error all the way through, becomes vanishingly small. There is further discussion of the model in the context of aviation at www.aviation.unsw.edu.au/about/articles/swisscheese.html
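To put some illustrative numbers on that (figures made up for the sake of the arithmetic, not taken from Reason's work): if each defence independently let one error in ten slip past it, the chance of an error slipping past four defences in a row would be 0.1 × 0.1 × 0.1 × 0.1 = 0.0001, or one in ten thousand – provided, of course, that the holes in the slices really are independent of one another.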

The culture that has evolved among aviation professionals reflects exactly that model: we are all occasionally absentminded and we all make mistakes, so it is obligatory for a junior co-pilot to question a senior pilot whose action seems clearly wrong (although crews from cultures that strongly venerate status and age can find this difficult).

Safety and methods improvement in aviation, offshore oil drilling and a range of other potentially dangerous pursuits is driven by research, meticulous investigation of accidents, and evidence. (The Australian Transport Safety Bureau has reports on hundreds of incidents and accidents over the years at http://www.atsb.gov.au/). Regrettably, medicine, the field most readers probably assume is more devoted to their health and safety than any other, is not – or not always, not yet.

Gawande's web site is at www.gawande.com.

MikeM is roadkill in the wake of the capitalist juggernaut but his voice continues to protest that he is not an individual.
