This tool will help you improve patient safety

February 9, 2015

Reducing medical errors requires taking a structured approach that includes teamwork, communication and tools to help eliminate bias and oversights, experts say.

Clear communication and well-constructed checklists may be the most powerful tools in the battle against medical errors.

As the complexity of medicine in general has increased in recent decades, says Stephen Helms, M.D., so has the potential for errors. A recent book estimates that approximately 6,000 drugs - and 4,000 medical and surgical procedures - are in use worldwide.1

"The numbers are staggering," says Dr. Helms, who is a professor in the Department of Dermatology at the University of Mississippi Medical Center in Jackson, Mississippi.

Meanwhile, adds Eliot Mostow, M.D., the number of medical diagnoses has mushroomed to a level approaching "diagnostic overload." And with the number of publications, e-mail blasts and other electronic communication generated in the medical field, he adds, "There's a medical literature overload." He is professor and chair of dermatology, Northeast Ohio Medical University, and associate clinical professor of dermatology, Case Western Reserve University School of Medicine.


Targeting trouble spots

In this context, Dr. Mostow says, preventing medical errors begins with identifying their potential sources. Specifically, he and Dr. Helms suggest using a process analysis approach like that of the military and airline industries. In the latter area, they say, the discovery during the 1980s that up to 80% of commercial airline accidents stemmed from human error led to the development of an approach called Crew Resource Management (CRM). The concept didn't take off in the commercial airline industry, however, until some pilots overcame their resistance to accepting tools such as checklists, which they initially considered to be "beneath them," Dr. Helms says.

In dermatology, Dr. Mostow recommends taking a similar systematic approach to identifying potential error sources, examining not only your practice, but also the practices that refer to you. Within this system, he says, anything from a misdiagnosis to a misspelled phone message can have minimal to disastrous consequences.

To prevent such problems, Dr. Mostow says, "Think about your day - what are all the steps that happen, and can you create a map showing potential problem areas where you can intervene? For example, your practice gets a phone call. A patient walks in the door. Who's greeting them? Who brings the patient back" to an examination room and takes the patient's history?

Just as any point of patient contact can introduce errors, so can mishandling of any materials related to a patient, Dr. Mostow says. For example, Dr. Helms says, surgeons at Columbus Children's Hospital have noted significantly poorer results if prophylactic antibiotics are given more than one hour before or after a procedure.1

"This is a simple slipup that may make a difference - it has nothing to do with training of nurses, anesthesiologists or other providers. It does, however, highlight a lack of communication between team members," Dr. Helms says.

Cracking cognitive code

Technical errors such as the one above often make headlines, Dr. Helms says. But according to expert analysis, he says, "The majority of errors are in physician thinking." In one study of medical mistakes, 80% stemmed from cognitive errors,2 he says.

"Cognitive errors are much harder to wrap our heads around," Dr. Mostow says. "Sometimes we don't know the diagnosis, or maybe what the best treatment is. Maybe we don't know new tests are available that we should do. It's very easy to jump to conclusions," or misread something in a moment of haste, Dr. Helms says.

He adds, "We have perceptions, judgments and biases that affect our approach to the patient and how we make a diagnosis. And we don't realize what our biases are." In these areas, Dr. Helms says, "We don't know what we don't know - overconfidence can be a problem."

The main forms of cognitive bias that impact medicine include the following:2

  • Anchoring – Latching on too quickly and firmly to a single diagnosis. Dr. Helms likens it to prematurely closing the investigation without considering multiple diagnoses and forming a differential diagnosis. When other diagnoses are properly considered, he says, bedside tests or laboratory investigations may be critical to distinguishing among them.

  • Attribution – Failing to consider all possible sources of a problem and thereby attributing it to factors or conditions that have been influenced by one's biases. "This happens very commonly," says Dr. Helms. Attempting to take a thorough patient history, he explains, medical professionals might formulate a dozen ways to ask a patient with eczema, "What are you using on your skin?" In such cases, "We might not be asking the right questions." In one such case, Dr. Helms says, a male patient's contact dermatitis proved difficult to diagnose because he was reacting to benzocaine in a medication he had applied to his wife's skin.

  • Availability – Overemphasizing the most likely diagnosis, or the easiest one to recall. In one case, a patient presented with cheilitis but no apparent oral cavity symptoms. Because the patient was a young female, says Dr. Helms, "I honed right in on what kind of lipstick, gloss or balm she was using," as well as toothpaste and other potentially allergenic products. "I'm so into contact dermatitis, it's easy for me to go down that road. But fortunately I did a KOH exam," which showed that the patient had an unusual candidiasis presentation.

Many errors combine cognitive and technical components, Dr. Mostow says. "I think of it as a continuum. Some items may overlap - there may be action items you assign to the wrong lesion, or there was a typo - left versus right, malignant or nonmalignant."

Other errors can stem from inaction. In this regard, Dr. Mostow says, "The largest basal cell carcinoma I ever found - 8 cm - was under an elderly female patient's breast. Over the course of several years, no one looked." In other instances, a laboratory may lose or simply fail to report back regarding a specimen. "There's lots of potential for issues we loosely call 'errors.' But it's not all black and white."

Checking it twice

To minimize cognitive and technical errors, Drs. Helms and Mostow recommend applying root cause analysis – the plan, do, check, act (PDCA) cycle. Once you've identified a problem, Dr. Mostow says, devise and test a solution. If it fails, "Come up with another plan and do it again. It's not rocket science."

Such an approach can deliver stellar results, though. A pioneering paper showed that more than 100 hospital intensive care units in Michigan reduced catheter-related bloodstream infection rates 66% after implementing an infection-control checklist.3 The checklist included only five steps, Dr. Helms says, "And if they skipped one step, the statistics were affected. They made everybody do the exact same thing."

A good checklist functions as a cognitive safety net. Yet even the best checklists require revision and frequent updating, Dr. Mostow says. "Every checklist is terrible the first time. According to Atul Gawande, M.D., M.P.H., a good checklist is usually revised 15 to 18 times."1

The checklist development process begins with identifying clear, concise objectives, Dr. Helms says. At this stage, he also recommends adding items to improve communication among team members, and involving all team members in the checklist creation process. "Many times, they can tell you things you wouldn't even think of."

To draft the checklist itself, he suggests:

  • use natural breaks in workflow as pause points;

  • use simple sentences and basic language; and

  • make sure the checklist fits on one page, avoiding potentially distracting extraneous language.

Validating the checklist requires ensuring that it fits the actual workflow and detects errors while they're still correctable, Dr. Helms says. Finally, testing the checklist with front-line users allows one to modify it in response to their feedback, he adds.

The result should be both thorough and simple. For example, Dr. Helms' practice has devised a checklist that goes beyond the iPLEDGE requirements to help minimize isotretinoin-related risks. Beyond logging a patient's childbearing status and birth control regimen, "For medicolegal reasons as well as good medical care, we made sure we made mention of any GI complaints, and we always want to ask about neurological history, to monitor for depression as well as to not miss pseudotumor cerebri. Additionally, we keep a log of total isotretinoin dose and all applicable lab results."

In surgical situations, Dr. Mostow adds, checklists increasingly include a "timeout." Before any procedure, "You say, 'timeout - do I have everything right?' I do it especially for laser procedures - do we have the right spot? Is everybody in agreement regarding what we're doing? Do we have the right personnel? Does everyone have their goggles on? Then we go forward."

Dr. Helms says much of the public believes that as a dermatologist, "You treat acne and warts. That's your day. They don't realize how many times we pick up significant problems related to internal disease," or solve a baffling bout of contact dermatitis.

By the same token, he says, some physicians still believe that when confronted with a pruritic patient, "Just give a steroid - it doesn't matter which one. Betamethasone dipropionate and clotrimazole 'cures' everything, and we don't have to worry" about potential allergies or contraindications.

However, Dr. Helms concludes, the subtleties of modern medicine demand that dermatologists continue refining systems and strategies to keep such information from falling through the cracks.

References

1 Gawande A. The Checklist Manifesto: How to Get Things Right. New York: Metropolitan Books; 2010.

2 Groopman J. How Doctors Think. New York: Houghton Mifflin Company; 2007.

3 Pronovost P, Needham D, Berenholtz S, et al. An intervention to decrease catheter-related bloodstream infections in the ICU. N Engl J Med. 2006 Dec 28;355(26):2725-32.

Drs. Mostow and Helms report no relevant financial interests.

For more information:

www.aad.org

http://asq.org/learn-about-quality/project-planning-tools/overview/pdca-cycle.html