Chemical structure of edrophonium

Edrophonium

Edrophonium is a readily reversible acetylcholinesterase inhibitor. By inhibiting the enzyme acetylcholinesterase, mainly at the neuromuscular junction, it prevents breakdown of the neurotransmitter acetylcholine.


Clinical uses

Because its duration of action is only about 10 minutes, edrophonium is used to differentiate myasthenic crisis from cholinergic crisis. In myasthenic crisis, where a person is not able to produce enough neuromuscular stimulation, edrophonium will reduce the muscle weakness. In a cholinergic crisis, where a person has too much neuromuscular stimulation, edrophonium will make the muscle weakness worse.
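Stripped to its logic, the test is a single branch on the observed response. The sketch below is purely schematic and not a clinical tool; the function name and response labels are invented for this illustration.

```python
def interpret_edrophonium_test(response: str) -> str:
    """Schematic reading of the edrophonium (Tensilon) test in a patient
    with worsening weakness, following the description above. The
    response labels are hypothetical, chosen only for this sketch."""
    if response == "weakness improves":
        # Too little neuromuscular stimulation: blocking acetylcholinesterase helps.
        return "suggests myasthenic crisis"
    if response == "weakness worsens":
        # Already too much stimulation: more acetylcholine makes things worse.
        return "suggests cholinergic crisis"
    return "equivocal response: re-evaluate"

# Example: a transient improvement within the drug's roughly 10-minute window.
print(interpret_edrophonium_test("weakness improves"))
```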

Edrophonium is available under the trade names Enlon (AstraZeneca), Reversol, and Tensilon.



Why did it take so long to make the diagnosis?
From Physician Executive, 1/1/05 by Michael S. Smith

A patient with fever has two blood cultures ordered. Two days later, on the weekend, both cultures grow S. epidermidis.

What might happen in your institution?

* The report is filed, but the nurse doesn't see it because it is buried among many other lab results.

* The nurse sees the result but has learned that S. epidermidis is a contaminant. This is rule-based functioning (1): useful, but occasionally a rule is wrong or misapplied ("strong but wrong"), and amenable to education.

* The on-call physician sees the patient, notes a rectal temperature of 100.6°F and suspects nothing. To her intuition, the patient doesn't look sick.

* The on-call physician sees the patient, notes the temperature, sees the blood cultures and is not concerned. We deal with our complicated world, especially the complexities of medicine, by finding patterns, and the first pattern we settle on is resistant to change. Ironically, had the physician recently seen a case in which the same organism turned out not to be a contaminant, the thinking would likely have been different. We actively look for ways to fit data (here, two positive blood cultures) into the pattern, or we ignore them. The blood cultures are positive, but S. epidermidis is often a contaminant. If the patient doesn't appear sick on examination, we may discount tests that don't fit a preconceived scenario. This is normal behavior, and most of the time it works well.

* The on-call physician is concerned about two of two cultures being positive and orders a third culture.

Two days later, the third blood culture is also positive for S. epidermidis. Now, what might happen?

* The attending physician sees the latest blood culture and is not concerned, because the fever the day before has subsided and the two prior positive blood cultures were not seen--"out of sight, out of mind."

* The attending physician remembers the patient has a history of a non-functional heart murmur. She looks at the chart and sees the other positive blood cultures.

In this particular case, the diagnosis of SBE (subacute bacterial endocarditis) was delayed by 10 days. It is understandable to miss a diagnosis because of human variability; it is unfortunate to miss one because of a bad system.

Fortunately, much of clinical medicine is, to use the term, loosely coupled, which allows recovery from some errors. Those running nuclear power plants or aircraft carrier flight decks don't have 10 days to recover, and their systems are far more complex. (2)

How would your QA committee deal with the above?

* These things happen. Next time, be more careful. (Exhortation)

* Write a letter to the attending physician on the case and ask for a reply. (Sanction)

* Ask for explanations from both the physician and the head nurse on the unit. (Sanction: training)

* Ask "Why did this happen?" several times until nothing more can be learned. (Redesign the system).

There is no reason that these things should happen. A good system would:

* Emphasize when more than one culture grows the same organism.

* Ensure that pertinent information (the reason for the cultures, heart murmur, other cultures) is easily visible to all caregivers.

* Have adequate communication between physicians and between physicians and nursing staff.

In short, how do you make it impossible to miss the diagnosis? This requires work, but so does caring for patients who suffer complications, or discussing the delay in committee.
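The article stops at the requirements, but the first two can be pictured as a simple laboratory-alert rule. The following is a minimal sketch under stated assumptions: the CultureResult record, the flag_repeat_growth function, and the two-culture threshold are invented for illustration and are not features of any real laboratory information system.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class CultureResult:
    """Hypothetical record of a finalized culture result."""
    patient_id: str
    organism: str            # e.g. "S. epidermidis"; empty string if no growth
    reason_for_culture: str  # clinical context that should travel with the alert
    notes: str = ""          # e.g. "history of heart murmur"

def flag_repeat_growth(results, threshold=2):
    """Raise an alert whenever at least `threshold` cultures from the same
    patient grow the same organism, bundling the clinical context so it is
    visible to every caregiver who sees the alert."""
    grouped = defaultdict(list)
    for r in results:
        if r.organism:  # ignore cultures with no growth
            grouped[(r.patient_id, r.organism)].append(r)

    alerts = []
    for (patient_id, organism), hits in grouped.items():
        if len(hits) >= threshold:
            alerts.append({
                "patient_id": patient_id,
                "organism": organism,
                "positive_cultures": len(hits),
                "context": sorted({f"{h.reason_for_culture}; {h.notes}".strip("; ")
                                   for h in hits}),
            })
    return alerts

# Usage: two S. epidermidis cultures on the same patient trigger one alert
# that carries the reason for the cultures and the murmur history with it.
if __name__ == "__main__":
    results = [
        CultureResult("A123", "S. epidermidis", "fever", "history of heart murmur"),
        CultureResult("A123", "S. epidermidis", "fever", "history of heart murmur"),
        CultureResult("B456", "", "pre-op screen"),
    ]
    for alert in flag_repeat_growth(results):
        print(alert)
```

Carrying reason_for_culture and notes inside the alert speaks to the second requirement: the context travels with the result instead of sitting in a chart that no one opens.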

How our brains set us up for error

1. Heuristics

Heuristics are hunches, rules of thumb, and widespread behavioral approaches we use to simplify decision-making in an ambiguous, complex world.

Availability heuristic (1)

We tend to judge situations based on what comes to mind first, rather than logically examining all data. (2) The first information received anchors and distorts subsequent processing. Frequent, shocking, or impressive events are easier to recall.

The heuristic is helpful but may cause us to err and lose time when we are wrong. (3) Under stress, we narrow our attention, making it possible to miss the obvious or to overlook creative solutions.

Example: A neurology resident saw a patient in an ED waiting room herniate from a subdural hematoma. Years later, as an attending, he continued to instruct residents that any question of a subdural required a stat CT scan.

Representative heuristic (1)

We assign higher probabilities than warranted to scenarios that fit our expectations. For example, many people are more concerned about West Nile virus than motor vehicle fatalities, even though the latter are 100-200 times more frequent.

If two situations appear related, we consider the relationship causal. We err towards the familiar and expected, even if cumbersome. We think of what should occur rather than what should not occur. What doesn't fit tends to be ignored. The likelihood of an event is often based on its resemblance to other well-defined events.

Be cautious of positive findings in tests ordered for the wrong reason. (3)

Be careful when dealing with atypical manifestations of disease, equivocal findings, or rare diseases. Require compelling evidence before diagnosing a rare disease. Require extremely compelling evidence before diagnosing a rare disease that has a poor prognosis.
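The demand for compelling evidence is, at bottom, a base-rate argument. As a rough numerical illustration (the prevalence and test characteristics below are invented, not taken from the article), Bayes' theorem shows why a single positive result rarely settles the diagnosis of a rare disease.

```python
def posterior_probability(prevalence: float, sensitivity: float, specificity: float) -> float:
    """Bayes' theorem: probability of disease given a positive test."""
    true_positive = sensitivity * prevalence
    false_positive = (1 - specificity) * (1 - prevalence)
    return true_positive / (true_positive + false_positive)

# Hypothetical numbers: a rare disease (1 in 10,000) and a good test
# (95% sensitive, 99% specific). One positive result still leaves the
# diagnosis unlikely, which is why compelling evidence is required.
print(f"{posterior_probability(0.0001, 0.95, 0.99):.1%}")  # prints roughly 0.9%
```

Even with a good test, the posterior probability stays under 1 percent, so a rare diagnosis, especially one with a poor prognosis, should rest on more than a single suggestive finding.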

Think of an iatrogenic complication (medications, procedures) when seeing an unexplained change in the clinical course of a hospitalized patient; a first seizure, for example, is usually due to either something the patient received or something the patient didn't receive.

A good diagnosis is both coherent and adequate--it makes sense and explains everything. Be concerned if it is not.

2. Fixation error

A patient develops horizontal gaze palsy and vertical nystagmus eight weeks after gastric stapling, when the procedure was new. The working diagnosis is myasthenia gravis, although there is no response to edrophonium and nystagmus is not common in myasthenia.

Only later, when the physician happens to speak to students about the initial description of Wernicke's encephalopathy in the setting of prolonged vomiting and malnutrition, does he realize the scope of his error. Finally, he is able to discard the entrenched hypothesis of myasthenia in favor of the correct, and adequate, diagnosis.

Fixation errors (3) are persistent failures to revise the diagnosis or plan in the face of evidence, often readily available, that revision is needed. A second opinion, or a second look, may help. Once myasthenia gravis was considered the diagnosis, other possibilities were ignored until they practically smacked the physician in the forehead.

3. Pattern thinking

When we assemble something, we often fiddle with the parts rather than carefully following the directions. We look for a familiar pattern instead of following the directions, because following directions involves "cognitive strain." Fortunately, the world's patterns are usually predictable, and our processing is efficient if we find and use the right pattern.

Example: "I can't believe I didn't see it." A radiologist sees an abnormality in the small finger, matching the clinical history. A boxer's fracture of the fifth metacarpal, however, is missed. The biggest enemy of making a diagnosis is having made another one.

We see what we expect to see and hear what we expect to hear, whether or not it actually exists. We see faces on Mars, pronator drifts, or ST segment depression; we hear heart murmurs if we expect to or believe we should.

Frequency gambling (1)

When faced with an unusual situation, we use what has worked best in the past. In general, this approach is helpful, but it may be erroneous.

Example: A patient with leg weakness was felt to have lumbar disk disease. During the imaging procedure, injected contrast precipitated a focal seizure of the involved extremity, changing which part of the nervous system needed to be imaged.

Example: Bizarre-appearing movements in a teenager were felt to be hysterical until recognition of the pattern by a second observer led to the discovery that a drink had been spiked with an antipsychotic. The acute dystonic reaction was successfully treated with benztropine.

4. Overconfidence and optimism (1)

We tend to think we are better at what we do than we really are.

Example: In one facility, the complication rate of carotid endarterectomy was believed to be 1 percent. A review of the procedure showed a major complication rate of 12 percent and an overall complication rate of 23 percent.

References:

1. Reason J. Human Error. Cambridge: Cambridge University Press; 1990.

2. Perrow C. Normal Accidents. Princeton, NJ: Princeton University Press; 1999.

3. Kassirer JP, Kopelman RI. Learning Clinical Reasoning. Baltimore: Williams & Wilkins; 1991.

By Michael S. Smith, MD, MS

Michael S. Smith, MD, MS, a statistician, wants to help people in the medical community use statistics to make better, faster and easier decisions. He is self-employed and may be reached at 520-410-7917 or mssq@comcast.net

COPYRIGHT 2005 American College of Physician Executives
COPYRIGHT 2005 Gale Group
