Book Review: Bad Medicine
"If we want things to stay as they are, things will have to change."
—Giuseppe Tomasi di Lampedusa, The Leopard, 1958
Evolutionary biologists have shown that the more adapted (i.e., comfortable) you are to your existing environment, the less able you are to adapt to environmental changes. Struggle is good for us.
Rigidity is what organizations manifest when they are faced with either superior competition or outdated business models. They blindly cling to “that is the way we have always done it” in defiance of the evidence that this way is no longer relevant to success.
This is the history of business. New ideas, inventions, and business models from the tinkerer in the garage change the world, while rendering obsolete the existing modes of production, infrastructure, and business models, in a never-ending “perennial gale of creative destruction,” as described by economist Joseph Schumpeter.
The Diffusion of Theories
David Wootton is a historian at the University of York and the author of Bad Medicine: Doctors Doing Harm Since Hippocrates, one of the most important books I have read in a long time. He is no basher of the medical profession: he thanks modern medicine for saving his life, and he proudly notes that his daughter is a doctor.
Not only is the book incredibly well written—even if, like me, you have no particular interest in the history of medicine—it is a mesmerizing look at how a supposedly scientific and evidence-based profession rejected new knowledge, innovations, and theories while stubbornly clinging to its old—and completely ineffectual, if not downright lethal—therapies.
Bad Medicine Drives Out Good Medicine
The history of medicine begins with Hippocrates in the fifth century BC. Yet until the invention of antibiotics in the 1940s, doctors, in general, did their patients more harm than good. In other words, for 2,400 years patients believed doctors were doing good; for 2,300 years they were wrong.
The Case Against Medicine
The author makes three devastating arguments. First, if medicine is defined as the ability to cure diseases, then there was very little medicine before 1865. Prior to that—a period the author calls Hippocratic medicine—doctors relied on bloodletting, purges, cautery, and emetics, all totally ineffectual, if not positively deleterious (no matter how efficiently they were administered).
Second, effective medicine could only begin when doctors began to count and compare, such as using clinical trials.
Third, the key development that made modern medicine possible is the germ theory of disease.
We all assume that good ideas and theories will drive out bad ones, but that is not necessarily true. Historically, bad medicine drove out good medicine, as Wootton explains:
We know how to write histories of discovery and progress, but not how to write histories of stasis, of delay, of digression. We know how to write about the delight of discovery, but not about attachment to the old and resistance to the new (Wootton 2006: 14-15).
This is not to say that advances in knowledge were not made prior to 1860. Unfortunately, those advances had no pay-off in terms of advances in therapy, or what Wootton calls technology—that is, therapies, treatments, and techniques to cure.
Wootton describes how the advances in knowledge did not change therapies, in perhaps the most devastating conclusion in the book:
The discovery of the circulation of the blood (1628), of oxygen (1775), of the role of haemoglobin (1862) made no difference; the discoveries were adapted to the therapy [bloodletting] rather than vice versa.
...[I]f you look at therapy, not theory, then ancient medicine survived more or less intact into the middle of the nineteenth century and beyond.
Strangely, traditional medical practices—bloodletting, purging, inducing vomiting—had continued even while people’s understanding of how the body worked underwent radical alteration. The new theories were set to work to justify old practices. [Emphasis added] (Wootton 2006: 17)
In a reversal of the scientific method, the therapies guided the theory, not the other way around. Diffusing a new theory into a population is no easy task, nor is it quick. Wootton describes in captivating detail how various innovations in medicine were rejected by the medical establishment (the full list is much longer):
- Joseph Lister is credited with putting germ theory into practice in 1865, yet there was considerable evidence for the theory dating back to 1546, and certainly by 1700. Until then, infections were thought to be caused by stale air and water.
- Even though it was understood by 1628 that the heart pumped blood through the arteries, tourniquets were not used in amputations until roughly a century later.
- The microscope was in use by 1677—invented around the same time as the telescope, which led to new discoveries in astronomy—yet as late as 1820 it had no place in medical research, being regarded as nothing more than a toy.
- Penicillin was first discovered in 1872, not in 1941 as popularly believed; its effectiveness was doubted for nearly 70 years.
Why the delay?
Wootton believes the primary obstacle to progress was not practical, nor theoretical, but psychological and cultural—“it lay in doctors’ sense of themselves.” Consider the psychological obstacles:
Medicine has often involved doing things to other people that you normally should not do. Think for a moment about what surgery was like before the invention of anesthesia in 1842.
Imagine amputating the limb of a patient who is screaming and struggling. Imagine training yourself to be indifferent to the patient’s suffering, to be deaf to their screams. Imagine developing the strength to pin down the patient’s thrashing body (Wootton 2006: 21).
To think about progress, you must first understand what stands in the way of progress—in this case, the surgeon’s pride in his work, his professional training, his expertise, his sense of who he is (Wootton 2006: 23).
The cultural obstacles, Wootton believes, are based on a somewhat counterintuitive observation: institutions have a life of their own. Not all actions can be said to be performed by individuals; some are performed by institutions. For instance, a committee may reach a decision that was nobody’s first choice.
This is especially true for institutions that are shielded from competition and hermetically sealed in orthodoxy. In a competitive market, germ theory would have been tested by rival firms and would have diffused through the population much faster than it did within the institutions of the medical community.
Why Is This Relevant to the Professions?
The parallels are illustrative: hourly billing, timesheets, and Frederick Taylor’s efficiency metrics are the professions’ bad medicine, while value pricing is the innovation the establishment resists.
If a supposedly scientific and evidence-based profession is this slow to change, what chance do lawyers, CPAs, and other professionals have to move away from the discredited labor theory of value—the modern-day equivalent of bloodletting?
Will the professions resist change for as long as doctors did? Are the cultural and institutional legacies that entrenched? Do professionals really want to define themselves by how many hours they log on a timesheet?
Comments

Financier, Producer, Physicist, Neuroscientist, Impresario, and Playwright (9 years ago): Shattering when you think about it: "The discovery of the circulation of the blood (1628), of oxygen (1775), of the role of haemoglobin (1862) made no difference; the discoveries were adapted to the therapy [bloodletting] rather than vice versa."
Founder at MeditateBetter.com (12 years ago): While modern medicine isn't particularly "bad," it is still woefully ignorant of the cause of *modern* diseases. But before I get to that, let me first make the case that modern medicine has badly oversold itself. Consider these points:

1. For decades, medicine has been taking credit for extended lifespans. But while it has certainly saved lives, the actual credit for the statistical rise in lifespan belongs mostly to sanitation, with additional credit going to transportation systems and trained EMTs who keep wounds sanitized, thereby preventing secondary infections. Credit also goes to OSHA and unions, which have eliminated most life-threatening occupational hazards, and to evolved military strategies that no longer send thousands of men across a field laced with deadly fire. Those advances *more* than account for the statistical increase in lifespan.

2. Although medicine has routinely claimed credit for increased lifespan, the current generation is the first in our history to face a reduced lifespan, despite billions of dollars spent on research, diagnostic equipment, and the development of ever newer drugs.

3. Although the United States has the most advanced system of modern medicine known to man, it ranks the *worst* among all industrial nations with respect to infant mortality. In fact, it ranks lower than some *non*-industrialized nations, and lower than all other industrial nations on *every* measurement of health and longevity.

4. Even when medicine does "extend" a life, in many cases it merely prolongs death, because a bed-ridden patient can hardly be said to have a "life."

With that prelude, I think we can safely say that "modern medicine" does not have a real handle on the problem at all. If it did, the statistics would be accruing favorably. Instead, they are damning. The fact is that "modern" medicine, which rejected the germ theory of disease for nearly a century (at great cost to many), has been, and even now is, rejecting the "dietary" theory of disease: the one that explains heart disease (50% of disease-based mortality) and cancer (25% of disease-based mortality). Those diseases represent a systemic breakdown caused by a combination of inadequate nutrition and what are literally dietary *poisons* in the American food supply. And just like the surgeons who tried to save lives in unsanitary conditions, these doctors prescribe pills and expensive treatments in a vain attempt to stem a rising tide of conditions they don't understand and can't control. "Modern" medicine persists in the belief that it can "cure" those conditions once created, when the reality is that they can only be *prevented*. In addition, it ignores the fact that the dietary and environmental insults people are subjected to have more subtle costs, because long before a serious disease manifests itself, a person's life has already been significantly impaired. The problem, at bottom, is that medicine concerns itself with curing disease, and that is its *only* concern. That would not be particularly bad, were it not for the fact that it sells itself as a system of *health*, when the reality is that it is not concerned with health at all. If it were, it would have revised its mission long ago.
Radio Talk-Show Host, The Soul of Enterprise at VoiceAmerica Talk Radio (12 years ago): Great points, and discussion, from all. Perhaps I'll expand on this topic in a later post, but I also believe that if health care were paid for by patients, rather than by third parties (either government or insurance companies), the competitive dynamics and innovation would be very different. Essentially, we don't have health insurance in this country. You purchase insurance for things you don't want and can't afford (like earthquakes, disability, etc.). What we have is prepaid health care, run through insurance companies, which sets up a different dynamic between doctors and patients. If you want to see what a free market in medicine would look like, check out non-covered care: plastic surgery, Lasik, etc. Prices have been driven down through competition, and it doesn't seem to have stalled innovation in the least.