Dr Richard Cook, an anaesthetist at the University of Chicago, works daily with the most advanced equipment available. Yet he keeps several syringes nearby, filled with either succinylcholine (suxamethonium chloride) or thiopental (thiopentone). "The world isn't always a perfect place," he explained. When disaster looms, "you need a path of retreat, a way to back away." The importance of finding simple alternatives to complex equipment was one of the recommendations Cook made last week at the Southern California Patient Safety forum in Los Angeles.
Traditional approaches to disaster management tell us nothing about how accidents actually happen, Cook said. Defining an accident as "an incident with bad consequences," he explained that accidents occur in complex systems such as hospitals because of the combination of multiple small failures, each insufficient in itself to cause a disaster.
Conventional disaster management, however, assumes that accidents result from single-point failures, and the conventional response in medicine is to "blame and train": installing automated systems that are even more complex and therefore more prone to error; instituting ever more restrictive rules and policies; and imposing stiffer sanctions on the individual blamed for the disaster of the day. Cook also pointed out that "talking about safety is not safe," because it is more politically correct to appear safe than to have an honest discussion about a system's shortcomings and the trade-offs required to increase safety.
Also speaking at the conference was Dr Karlene Roberts of the University of California, Berkeley. She heads a group of investigators who have been examining organisations in which error can have catastrophic consequences, including intensive care units in hospitals, commercial airlines, hostage and terrorist negotiation units in the United States and France, and community emergency services (police, fire, and emergency medicine) in the United States and the United Kingdom.
Roberts and her team have identified five ingredients necessary for patient safety: an established system of ongoing checks and balances designed to spot risk and identify safety problems; appropriate rewards for workers who identify possible safety risks, including mistakes they themselves have made; a system of quality control; the perception and acknowledgement of risk by the organisation; and a system of command and control that permits decisions to be made by the people with the most experience, even if they are of lower rank than others on their team.
Roberts encouraged her listeners to ask themselves if their organisations had all of these characteristics. "If [the organisation] doesn't, it is probably risk prone. And the costs of being risk prone are always ultimately higher than the costs of prevention."
Norra MacReady, Los Angeles