When a hospital patient dies unnecessarily, standard procedure is to search for the point of failure. In the 2017 death of 75-year-old Charlene Murphey, that point of failure was identified as RaDonda Vaught, the nurse who injected Murphey with a fatal dose of vecuronium bromide, a muscle relaxant, instead of Versed, a sedative, while Murphey was at Vanderbilt University Medical Center for a brain injury. Prosecutors found that Vaught overrode a computer system when she couldn’t find Versed and instead chose the first medication on the list beginning with V-E, then ignored the red cap on the vecuronium bromide vial that read “Warning: Paralyzing Agent.” She later told investigators that she had been “distracted with something.” She was sentenced on May 13 to three years’ probation.

But the serious problem of medical accidents in the United States and other countries goes far beyond errors by individuals, be they nurses, technicians or doctors. If deaths occurred only in cases of egregious misbehavior, they would be rare. In fact, deaths during treatment are distressingly common. Hundreds of thousands of people may die annually partly as a result of medical error, according to some estimates. (That equals the loss of lives from several large passenger jets crashing every day.)

To save lives, hospitals and their regulators need to stop focusing on single points of failure. A potential solution can be found, surprisingly, in the engineering department at the Massachusetts Institute of Technology. Nancy Leveson, a professor there, says hospitals ought to use “systems thinking,” that is, analyzing how accidents can arise from unforeseen interactions within a complex system.

I featured Leveson’s work in a newsletter last year. She emailed me after the verdict in the Murphey case to alert me to research by another scholar, Elizabeth White Baker, in which Baker proposes a better way for hospitals to detect and prevent errors in the administration of medications. Baker is a professor of information systems in the Virginia Commonwealth University School of Business who studied in Leveson’s lab and wrote a thesis on her findings.

Baker analyzed a case that’s similar to the one involving Vaught, though less serious: an unidentified 81-year-old woman who was given the wrong medication to treat a heart irregularity known as ventricular tachycardia. The doctor intended to prescribe 40 milliequivalents of potassium chloride (chemical formula KCl), but the patient was instead given 40 milligrams of vitamin K. The doctor gave the order orally to a procedure nurse, who in turn relayed it orally to the unit nurse as “40 of K.” The unit nurse tried to contact the physician to confirm but was told that he was busy. The hospital pharmacist detected the error, but not until after the unit nurse had dispensed the wrong medicine by overriding the automated medication dispensing system. The patient, fortunately, was not harmed.

Baker wrote that the original incident report focused on finding the weak link in the causal chain and “did not include nursing supervisors or hospital administration,” who were not directly involved but have a great deal of influence over how the system is set up.

Understaffing is a huge problem. People are more likely to make avoidable mistakes when they are exhausted and overwhelmed. Poor design is another. Baker said in an interview that automated medication dispensing systems aren’t designed or bought by the people who actually use them. They emit lots of warning beeps but don’t guide nurses toward making the right choice when there’s confusion. She said hospitals could learn a thing or two from McDonald’s, which makes it easy for employees to enter customer orders correctly.

Hospitals over-rely on computer systems to guarantee safety, she said. The problem is that a computer is only as good as the information that’s entered into it. When the computer isn’t up to date, staffers must make decisions on their own. “Inevitably,” she wrote in her thesis, “this consistent reliance on human judgment as a control mechanism in this scenario will break down and lead to losses because the system design is inherently hazardous.”

Doctors and nurses override safety measures regularly because they think it’s the only way to get their jobs done, Baker wrote, noting, “As one provider mentioned, ‘It is about saving lives in a timely manner, not always following rules.’” The solution, according to Baker, is to change the system so medical professionals don’t perceive a trade-off between saving lives and following rules.

Baker argues that the two main ways that hospitals study errors, namely root cause analysis and failure mode and effects analysis, are about looking for the weak link in a causal chain, not viewing a system holistically. She contends that they aren’t “systems thinking,” even if they’re sometimes labeled as such. The alternative that Baker and Leveson favor, known as the System-Theoretic Accident Model and Processes, or STAMP, “avoids assigning blame for an accident to a human operator and instead looks for why the systems and structures to prevent the events were unsuccessful,” Baker stated in her paper.

Baker received a master’s degree from M.I.T. for her thesis, complementing the M.B.A. and doctorate she had already earned at other schools. She said her work at M.I.T. taught her that hospital administrators sometimes try to save money by stinting on safety measures, even though improving safety is cheaper in the long run because it reduces costs such as defending against malpractice lawsuits.

I contacted another expert in the field for his reaction to Baker’s thesis: Lars Harms-Ringdahl, who works at the Institute for Risk Management and Safety Analysis in Stockholm. “I agree completely,” he said in an interview. “In health care they look too much at the lower-level errors and not very much on systems design and higher-level errors.”

Harms-Ringdahl said the problems Baker and Leveson identified in the United States exist in Sweden and in many other countries. He said he couldn’t name a country that has licked the problem. “I have to admit that it is a mystery why there are so little improvements made in hospitals and from regulatory authorities,” Harms-Ringdahl wrote in an email.


The number of job openings in the United States, which hit a record 11.5 million in March, according to the government’s preliminary count, has probably peaked but isn’t falling rapidly, the Goldman Sachs economist Ronnie Walker wrote in a client note on Friday. Indexes of online job listings have fallen about 3 percent from their high, foreshadowing a decline in the official data, which is reported with a lag by the Bureau of Labor Statistics, Walker wrote. The decline is broad-based, with about 80 percent of economic sectors experiencing fewer job openings, according to the Goldman analysis.


“One might even say that the beneficent ‘invisible hand’ envisioned by Adam Smith has become for increasing numbers of Americans a clumsy, heedless ‘invisible foot,’ which tramples on social, human and environmental values, rather than responding to them.”

— Hazel Henderson, “The Politics of the Solar Age: Alternatives to Economics,” 1988. Henderson, a self-taught environmentalist and futurist, died on May 22.

Have feedback? Send me a note at coy-newsletter@nytimes.com.


