A dubious Six Sigma claim

January 17, 2005

In the first part of these Six Sigma articles, it was noted that Six Sigma is a collection of quality tools and procedures.

Recently, an interference problem with BD Vacutainer test tubes caused significant biases in a variety of immunoassays, affected a huge number of patients, and led to retesting (1,2).

The problem was detected by an endocrinologist and initially investigated by clinical chemists, both at NIH.

What was striking was the following quote from a vice president of quality at BD Diagnostics Preanalytical Systems (1):

Through the application of Six Sigma methods, BD scientists and engineers have been able to identify an interference between a constituent of our product and a limited number of assays.

The most beneficial use of Six Sigma is to prevent problems, for example through tools such as FMEA (Failure Mode and Effects Analysis). Thus, one could call this episode a failure to use Six Sigma effectively. I’m sure that BD scientists studied the problem, developed corrective actions, and looked at ways to prevent this type of problem in the future. This too is Six Sigma, but given the failure to prevent the problem in the first place, it hardly qualifies as the Six Sigma success implied by the quote.

  1. Parham S. Test tube interference shows ‘chink in armor’. CAP Today 2005;19:1, 68, 70, 72. Available online at http://www.cap.org/apps/docs/cap_today/feature_stories/0105Tubes.html
  2. Bowen RA, Chan Y, Cohen J, Rehak NN, Hortin GL, Csako G, Remaley AT. Effect of blood collection tubes on total triiodothyronine and other laboratory assays. Clin Chem, doi:10.1373/clinchem.2004.043349. Available at http://www.clinchem.org/papbyrecent.shtml

Frequency of Preventable Medical Errors – Myth or Reality

January 17, 2005

There has been a wealth of publicity surrounding preventable medical errors, including the Institute of Medicine report (1), which stated that up to 98,000 people die each year in the US from such errors. The report, published in 2000, drew upon earlier studies, including a 1991 New England Journal of Medicine study (2).

Medical errors are reported not just in medical journals but also in the popular media. Examples include Betsy Lehman, a health reporter for the Boston Globe, who died from an overdose during chemotherapy, and Willie King, who had the wrong leg amputated. When I talk to people about wrong site surgery, they often recall someone they knew who had surgery and was repeatedly asked about the surgery site to confirm its correctness.

Since then, with all of the public awareness and regulatory programs, is the rate of preventable medical errors improving? A recent article suggests that the answer is no, at least for wrong site surgery. Lauran Neergaard, an AP medical writer, noted in a June 22, 2004 article (3) that according to JCAHO there have been 275 cases of wrong site surgery since 1999 – “a steady increase each year and a problem it calls undoubtedly undercounted”. It is hard to imagine that the number for this type of error is this high, but it is.

FMEA and RCA

FMEA (Failure Mode and Effects Analysis) and RCA (Root Cause Analysis) are techniques for dealing with potential errors (FMEA) and observed errors (RCA). One never wants to observe a wrong site surgery event; hence FMEA is the more appropriate tool.
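To make the distinction concrete: FMEA is commonly operationalized by scoring each potential failure mode for severity, occurrence, and detectability, and ranking the results by a risk priority number (RPN = severity × occurrence × detection). The Python sketch below illustrates the idea; the failure modes and scores are hypothetical, chosen only to show how the ranking directs prevention effort before any error is observed.

    # Minimal sketch of FMEA-style risk scoring (hypothetical failure modes
    # and scores, using the conventional 1-10 scales). RPN = severity x
    # occurrence x detection; higher RPN means higher preventive priority.
    from dataclasses import dataclass

    @dataclass
    class FailureMode:
        name: str
        severity: int    # 1 (negligible harm) .. 10 (catastrophic harm)
        occurrence: int  # 1 (rare) .. 10 (frequent)
        detection: int   # 1 (almost always caught) .. 10 (almost never caught)

        @property
        def rpn(self) -> int:
            return self.severity * self.occurrence * self.detection

    modes = [
        FailureMode("Wrong site recorded on consent form", 10, 2, 4),
        FailureMode("Site marking skipped before incision", 10, 3, 3),
        FailureMode("Images reviewed for the wrong patient", 9, 2, 5),
    ]

    # Rank the potential errors so preventive effort goes to the riskiest first.
    for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
        print(f"{m.name}: RPN = {m.rpn}")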

References

  1. Kohn LT, Corrigan JM, Donaldson MS, eds. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press, 2000. An electronic summary is available online.
  2. Brennan TA, Leape LL, Laird NM, et al. Incidence of adverse events and negligence in hospitalized patients: Results of the Harvard Medical Practice Study I. N Engl J Med 1991;324:370-376. See also: Leape LL, Brennan TA, Laird NM, et al. The nature of adverse events in hospitalized patients: Results of the Harvard Medical Practice Study II. N Engl J Med 1991;324(6):377-384.
  3. See: http://msnbc.msn.com/id/5264092/

Focus on systems not people

January 13, 2005

In a medical diagnostics company, an engineering manager was charged with improving the reliability of a blood gas analyzer that broke down too often. He kept asserting that the failures were due to customer misuse. This was convenient: if management accepted it, no action was required of the engineering manager. Management asserted instead that an improved design focusing on ease of use, together with better instruction manuals (which engineering also supplied), would reduce customer misuse.

Similarly, if a person in a hospital commits an error that leads to patient injury or death, one could argue that better people and/or more training are needed. This is the “people” solution, which may at times be valid; however, a “systems” solution may be more appropriate. Consider a real example.

“A hospitalized patient connected to a portable blood pressure (BP) monitoring device was transported to radiology for an MRI. A length of tubing that led from the monitor’s BP cuff inflator had a male Luer connector. This fit into a female connector on a shorter length of white tubing that was integrated with a Critikon disposable BP cuff. The tubing and cuff were disconnected before the MRI since the Luer connector on the monitor’s tubing was metal. After the test, a radiology employee reconnected the tubing and transported the patient back to his room. Upon arrival, a family member immediately noticed that the tubing from the monitor was attached incorrectly to a needleless Y-injection port on the patient’s IV line! A nurse was contacted and she quickly disconnected the tubing. Normally, the device cycles at preset intervals, inflating the cuff with more than 500 mL of air at pressures up to 300 mm Hg. If no resistance is met with an inflated cuff, two additional cycles quickly occur. Thus, more than 1,500 mL of air might have entered the patient’s vascular system. Fortunately, this did not happen, as the machine had not yet cycled to take a BP reading. Another patient was not as lucky. In that case, the patient died from an air embolism after a nurse mistakenly connected the monitor tubing to his IV line.” (1) Reproduced with permission from ISMP Medication Safety Alert! June 12, 2003.

Some potential mitigations to prevent this error include:

  1. provide training to all relevant staff to make them aware of the issue
  2. label all BP and IV connectors with a warning message
  3. change all BP connectors, making it impossible for them to be connected to an IV line

These mitigations illustrate the spectrum from a people solution (1) to a systems solution (3). Solution 2 can be thought of as a combination of the two. Solution 3 follows Ralph Evans’s advice to make it easy to do the right thing and hard to do the wrong thing, but it is probably the most expensive, since it would involve replacing or modifying existing equipment; the sketch below illustrates the same forcing-function idea in software terms.
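As a software analogy (not part of the original incident report), the same forcing-function idea can be expressed with incompatible interfaces: just as redesigned connectors physically cannot mate with an IV port, distinct types let a static checker reject the wrong pairing before it ever happens. All names below are hypothetical, for illustration only.

    # Hypothetical sketch: a "systems" fix encoded as incompatible interfaces.
    # An IVPort simply has no method that accepts BP tubing, so the wrong
    # connection cannot be expressed; a type checker (e.g., mypy) flags any
    # attempt before it reaches a patient - the software analogue of solution 3.
    class IVLine: ...
    class BPTubing: ...

    class IVPort:
        def connect(self, line: IVLine) -> None:
            print("IV line connected")

    class BPMonitorPort:
        def connect(self, tubing: BPTubing) -> None:
            print("BP tubing connected")

    IVPort().connect(IVLine())            # the right thing is easy
    BPMonitorPort().connect(BPTubing())   # the right thing is easy
    # IVPort().connect(BPTubing())        # the wrong thing is rejected by mypy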

References

  1. From the Institute for Safe Medication Practices (ISMP) web site, accessed 4/29/04: http://www.ismp.org/MSAarticles/blood.htm

Additional reading (references based on a discussion with Michael Astion, MD, PhD, University of Washington), added 3/12/2005

  1. Reason J. Human error: models and management. BMJ 2000;320:768-770. http://bmj.bmjjournals.com/cgi/content/full/320/7237/768
  2. Marx D. Patient Safety and the “Just Culture”: A Primer for Health Care Executives. http://www.mers-tm.net/support/Marx_Primer.pdf