A Prescription For the Health Care Crisis

With all the shouting about America's healthcare crisis, many are probably finding it difficult to concentrate, much less understand the cause of the problems confronting us. I find myself dismayed at the tone of the discussion (though I understand it: people are scared), as well as bemused that anyone would presume themselves qualified to know how best to improve our health care system simply because they've encountered it, when people who've spent entire careers studying it (and I don't mean politicians) aren't sure what to do themselves.

Albert Einstein is reputed to have said that if he had an hour to save the world, he'd spend 55 minutes defining the problem and only 5 minutes solving it. Our healthcare system is far more complex than most who offer solutions admit or recognize. Unless we focus most of our efforts on defining its problems and thoroughly understanding their causes, any changes we make are just as likely to worsen them as to improve them.

Though I've worked in the American health care system as a physician since 1992 and have seven years' worth of experience as an administrative director of primary care, I don't consider myself qualified to thoroughly evaluate the viability of most of the suggestions I've heard for improving our health care system. I do think, however, that I can contribute to the discussion by describing some of its troubles, taking reasonable guesses at their causes, and outlining some general principles that should be applied in attempting to solve them.

THE PROBLEM OF COST

No one disputes that U.S. healthcare spending has been rising dramatically. According to the Centers for Medicare and Medicaid Services (CMS), healthcare spending is projected to reach $8,160 per person per year by the end of 2009, compared to $356 per person per year in 1970. This rate of increase has been roughly 2.4% faster per year than the growth of GDP over the same period. Though GDP varies from year to year and is therefore an imperfect way to assess the rise in healthcare costs relative to other expenditures from one year to the next, we can still conclude from this data that over the last 40 years, the percentage of our national income (personal, business, and governmental) we've spent on health care has been rising.

Despite what most assume, this may or may not be bad. It all depends on why spending on health care has been increasing relative to our GDP and how much value we’ve been getting for each dollar we spend.

WHY HAS HEALTH CARE BECOME SO COSTLY?

This is a harder question to answer than many would believe. The rise in the cost of health care (on average, 8.1% per year from 1970 to 2009, calculated from the data above) has exceeded the rate of inflation (4.4% on average over that same period), so we can't attribute the increased cost to inflation alone. Healthcare expenditures are closely associated with a country's GDP (the wealthier the nation, the more it spends on healthcare), yet even in this, the United States remains an outlier (Figure 3).
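As a rough check on these figures, the sketch below reproduces the roughly 8.1% average annual growth rate from the CMS per-person numbers cited above and compares it to the 4.4% average inflation rate. It's a back-of-envelope calculation only; the 40-year compounding window is an assumption for illustration.

```python
# Back-of-envelope check of the growth figures cited above. The per-person
# spending numbers ($356 in 1970, $8,160 projected for 2009) come from the
# CMS figures quoted in the text; the 40-year window is an assumption.
spend_1970 = 356       # dollars per person per year, 1970
spend_2009 = 8160      # dollars per person per year, projected for 2009
years = 40             # approximate compounding window, 1970 to end of 2009

# Compound annual growth rate of per-person health care spending
growth = (spend_2009 / spend_1970) ** (1 / years) - 1
print(f"Average annual growth in spending: {growth:.1%}")               # ~8.1%

# Compare with the ~4.4% average inflation cited for the same period
avg_inflation = 0.044
print(f"Growth in excess of inflation: {growth - avg_inflation:.1%}")   # ~3.7% per year
```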

Is it because of spending on health care for people over 75 (five times what we spend on people between the ages of 25 and 34)? In a word, no. Studies show this demographic trend explains only a small percentage of health expenditure growth.

Is it because of the monstrous profits the health insurance companies are raking in? Probably not. It's difficult to know for certain, since not all insurance companies are publicly traded and therefore have balance sheets available for public review. But Aetna, one of the largest publicly traded health insurance companies in North America, reported a 2009 second-quarter profit of $346.7 million, which, if projected out, predicts a yearly profit of around $1.3 billion from the approximately 19 million people they insure. If we assume their profit margin is average for their industry (even if untrue, it's unlikely to be orders of magnitude different from the average), the total profit for all private health insurance companies in America, which insured 202 million people (2nd bullet point) in 2007, would come to approximately $13 billion per year. Total healthcare expenditures in 2007 were $2.2 trillion (see Table 1, page 3), which yields a private health insurance industry profit of approximately 0.6% of total healthcare costs (though this analysis mixes data from different years, it can perhaps be permitted, as the numbers aren't likely to differ by any order of magnitude).
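The arithmetic behind that estimate can be laid out explicitly. The sketch below simply reproduces the back-of-envelope calculation using the figures quoted above; scaling Aetna's per-member profit to all 202 million privately insured people is the simplifying assumption. Projecting a single quarter straight to a full year gives a slightly higher total (about $14.7 billion, or 0.7%) than the rounded $13 billion and 0.6% cited, but the same order of magnitude, which is the point.

```python
# Back-of-envelope estimate of total private insurer profit as a share of
# national health spending, using the figures cited in the text. Scaling by
# members insured is a simplifying assumption, not hard data.
aetna_q2_2009_profit = 346.7e6   # Aetna's reported Q2 2009 profit, dollars
aetna_members = 19e6             # approximate number of people Aetna insures
privately_insured = 202e6        # privately insured Americans, 2007
total_spending_2007 = 2.2e12     # total U.S. healthcare expenditures, 2007

annual_profit = aetna_q2_2009_profit * 4            # project one quarter to a full year
profit_per_member = annual_profit / aetna_members   # profit per insured person
industry_profit = profit_per_member * privately_insured
share_of_spending = industry_profit / total_spending_2007

print(f"Estimated industry profit: ${industry_profit / 1e9:.1f} billion")  # ~$14.7 billion
print(f"Share of total health spending: {share_of_spending:.1%}")          # ~0.7%
```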

Is it because of healthcare fraud? Estimates of losses due to fraud range as high as 10% of all healthcare expenditures, but it's hard to find hard data to back this up. Though some percentage of fraud almost certainly goes undetected, the best way to estimate how much money is lost to fraud is by looking at how much the government actually recovers. In 2006, this was $2.2 billion, only 0.1% of the $2.1 trillion (see Table 1, page 3) in total healthcare expenditures for that year.

Is it due to pharmaceutical costs? In 2006, total prescription drug expenditures were approximately $216 billion (see Table 2, page 4). Though this amounted to 10% of the $2.1 trillion (see Table 1, page 3) in total healthcare expenditures for that year and must be considered significant, it remains only a small percentage of total healthcare costs.

Is it from administrative costs? In 1999, total administrative costs were estimated to be $294 billion, 25% of the $1.2 trillion (Table 1) in total healthcare expenditures that year. This was a significant percentage in 1999, and it’s hard to imagine it’s shrunk significantly since then.

In the end, though, what probably has contributed the greatest amount to the increase in healthcare spending in the U.S. are two things:

1. Technological innovation.

2. Overutilization of health care resources by both patients and health care providers.

Technological innovation. Data proving that increasing health care costs are due mostly to technological innovation are surprisingly difficult to obtain, but estimates of technological innovation's contribution to the rise in health care costs range from 40% to 65% (Table 2, page 8). Though we mostly have only indirect evidence for this, several examples illustrate the principle. Heart attacks used to be treated with aspirin and prayer.

Now they're treated with drugs to control shock, pulmonary edema, and arrhythmias, as well as thrombolytic therapy, cardiac catheterization with angioplasty or stenting, and coronary artery bypass grafting. You don't have to be an economist to determine which scenario is more expensive. We may learn to perform these same procedures more cheaply over time (the same way we've figured out how to make computers cheaper), but as the cost per procedure decreases, the total amount spent often increases because the number of procedures performed increases. A laparoscopic cholecystectomy costs 25% less than an open cholecystectomy, yet the rates of both have increased by 60%. As technological advances become more widely available, they become more widely used, and one thing we're great at doing in the United States is making technology available.
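To see how a cheaper procedure can still drive total spending up, consider a toy calculation. The baseline cost and volume below are hypothetical; only the 25% price reduction and 60% volume increase come from the figures above.

```python
# Illustration of how a cheaper procedure can still raise total spending when
# volume grows. Baseline cost and volume are hypothetical; the 25% price
# reduction and 60% volume increase are the figures cited in the text.
open_cost = 10_000        # hypothetical cost per open cholecystectomy, dollars
baseline_volume = 1_000   # hypothetical number of procedures per year

lap_cost = open_cost * 0.75          # laparoscopic approach costs ~25% less
new_volume = baseline_volume * 1.6   # procedure rates rise ~60%

spending_before = open_cost * baseline_volume
spending_after = lap_cost * new_volume

print(f"Spending before: ${spending_before:,.0f}")              # $10,000,000
print(f"Spending after:  ${spending_after:,.0f}")               # $12,000,000
print(f"Change: {spending_after / spending_before - 1:+.0%}")   # +20% despite the cheaper procedure
```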

Overutilization of health care resources by both patients and health care providers themselves. We can easily define overutilization as the unnecessary consumption of healthcare resources. What's not so easy is recognizing it. Every year from October through February, most patients who come into the Urgent Care Clinic at my hospital are, in my view, doing so unnecessarily. What are they coming in for? Colds. I can offer support, reassurance that nothing is seriously wrong, and advice about over-the-counter remedies, but none of these things will make them better faster (though I often can reduce their concern). Further, patients have a hard time believing that the key to arriving at a correct diagnosis lies in history gathering and careful physical examination rather than in technology-based testing (not that the latter isn't important, just less so than most patients believe). How much patient-driven overutilization costs the health care system is hard to pinpoint, as we have mostly only anecdotal evidence.

Further, doctors often disagree about what constitutes unnecessary healthcare consumption. In his excellent article, “The Cost Conundrum,” Atul Gawande argues that regional variation in doctors’ overutilization of healthcare resources best accounts for the regional variation in Medicare spending per person. He argues that if doctors could be motivated to rein in their overutilization in high-cost areas of the country, it would save Medicare enough money to keep it solvent for 50 years.

That seems a reasonable approach. To get it to happen, however, we need to understand why doctors overutilize healthcare resources in the first place:

1. Judgment varies in cases where the medical literature is vague or unhelpful. Variation in practice invariably occurs when doctors face diagnostic dilemmas or diseases for which standard treatments haven’t been established. If a primary care doctor suspects her patient has an ulcer, does she treat it empirically herself or refer the patient to a gastroenterologist for an endoscopy? If certain “red flag” symptoms are present, most doctors would refer. If not, some would and some wouldn’t, depending on their training and the intangible exercise of judgment.

2. Inexperience or poor judgment. More experienced physicians tend to rely on histories and physicals more than less experienced physicians and consequently order fewer and less expensive tests. Studies suggest primary care physicians spend less on tests and procedures than their subspecialty colleagues but obtain similar and sometimes even better outcomes.

3. Fear of being sued. This is especially common in Emergency Room settings but extends to almost every area of medicine.

4. Patients tend to demand more testing rather than less, as noted above, and physicians often have difficulty refusing patient requests for many reasons (e.g., wanting to please them, fear of missing a diagnosis and being sued, etc.).

5. In many settings, overutilization makes doctors more money. Doctors have no reliable incentive to limit their spending unless their pay is capitated or they receive a straight salary.

Gawande’s article implies there exists some level of utilization of health care resources that’s optimal: use too little, and you get mistakes and missed diagnoses; use too much, and excess money gets spent without improving outcomes, sometimes paradoxically even worsening them (likely as a result of complications from all the extra testing and treatments).

How can we get doctors to employ uniformly good judgment to order the right number of tests and treatments for each patient–the “sweet spot”–to yield the best outcomes with the lowest risk of complications? Not easily. Fortunately or unfortunately, there is an art to good healthcare resource utilization. Some doctors are more gifted at it than others. Some are more diligent about keeping current. Some care more about their patients. An explosion of studies of medical tests and treatments has occurred in the last several decades to help guide doctors in choosing the most effective, safest, and even cheapest ways to practice medicine, but the diffusion of this evidence-based medicine is a tricky business. For example, beta-blockers have been shown to improve survival after heart attacks, but that doesn’t mean every physician knows this or prescribes them. Data clearly show many don’t. How information spreads from the medical literature into medical practice is a subject worthy of an entire post unto itself. Getting it to happen uniformly has proven extremely difficult.

In summary, then, most of the increase in spending on health care seems to have come from technological innovation coupled with its overuse by doctors working in systems that motivate them to practice more medicine rather than better medicine, as well as patients who demand the former thinking it yields the latter.

But even if we could snap our fingers and magically eliminate all overutilization today, health care in the U.S. would remain among the most expensive in the world, requiring us to ask next:

WHAT VALUE ARE WE GETTING FOR THE DOLLARS WE SPEND?

According to an article in the New England Journal of Medicine titled The Burden of Health Care Costs for Working Families—Implications for Reform, growth in health care spending “can be defined as affordable as long as the rising percentage of income devoted to health care does not reduce standards of living. When absolute increases in income cannot keep up with absolute increases in health care spending, health care growth can be paid for only by sacrificing consumption of goods and services unrelated to health care.” When would this ever be an acceptable state of affairs? Only when the incremental cost of health care buys equal or greater total value. If, for example, you were told that in the near future you’d be spending 60% of your income on health care, but as a result you’d enjoy a 30% chance of living to the age of 250, perhaps you’d judge that 60% a small price to pay.

It seems to me that the debate on healthcare spending needs to be about this. Certainly, we should work on ways to eliminate overutilization. But the real question isn’t what absolute amount of money is too much to spend on health care. The real question is, what are we getting for the money we spend, and is it worth what we must give up?

People alarmed by the notion that as health care costs increase, policymakers may decide to ration health care don’t realize that we’re already rationing at least some of it. It just doesn’t appear as if we are, because we’re rationing it on a first-come, first-served basis, leaving it at least partially up to chance rather than to policy, which we’re uncomfortable defining and enforcing. Thus, we don’t realize that the reason our 90-year-old father in Illinois can’t have the liver he needs is that a 14-year-old girl in Alaska got in line first (or maybe our father was in line first and got it while the 14-year-old girl didn’t). Given that most of us remain uncomfortable with the notion of rationing health care based on criteria like age or utility to society, as technological innovation continues to drive up health care spending, we very well may at some point have to make critical judgments about which medical innovations are worth our entire society sacrificing access to other goods and services (unless we’re so foolish as to repeat the crucial mistake of believing we can keep borrowing money forever without ever having to pay it back).

So, what value are we getting? It varies. The risk of dying from a heart attack has declined by 66% since 1950 as a result of technological innovation. Because cardiovascular disease ranks as the number one cause of death in the U.S., this would seem to rank high on the scale of value, as it importantly benefits a huge proportion of the population. As a result of advances in pharmacology, we can now treat depression, anxiety, and even psychosis far better than anyone could have imagined, even as recently as the mid-1980s (when Prozac was first released). Thus, some increases in healthcare costs have yielded enormous value we wouldn’t want to give up.

But how do we decide whether we’re getting good value from innovations? Scientific studies must prove the innovation (whether a new test or treatment) provides a clinically significant benefit (Aricept is a good example of a drug that works but doesn’t provide great clinical benefit: demented patients score higher on tests of cognitive ability while on it but probably aren’t significantly more functional or substantially better able to remember their children compared to when they’re not). But comparative effectiveness studies are extremely costly, take a long time to complete, and can never be perfectly applied to every patient, which means healthcare providers must still exercise good medical judgment with every patient problem.

Who’s best positioned to judge the value to society of the benefit of an innovation—that is, to decide if an innovation’s benefit justifies its cost? I would argue the American public, as the group that ultimately pays for it. However, how the public’s views could be reconciled and communicated to policymakers efficiently enough to affect actual policy lies far beyond this post’s scope (and perhaps anyone’s imagination).

THE PROBLEM OF ACCESS

A large portion of the population is uninsured or underinsured, which limits or eliminates their access to health care. As a result, this group turns to the path of least (and cheapest) resistance—emergency rooms—which has significantly impaired our nation’s ER physicians’ ability to render timely emergency care. Also, surveys suggest a looming primary care physician shortage relative to the demand for their services. In my view, this imbalance between supply and demand explains most of the poor customer service patients face in our system every day: long waits for doctors’ appointments, long waits in doctors’ offices once the appointment day arrives, short times spent with doctors inside exam rooms, difficulty reaching their doctors between office visits, and delays in getting test results. This imbalance would likely be only partially alleviated by less healthcare overutilization by patients.

GUIDELINES FOR SOLUTIONS

As Freakonomics authors Steven Levitt and Stephen Dubner state, “If morality represents how people would like the world to work, then economics represents how it does work.” Capitalism is based on enlightened self-interest: a system of incentives designed to yield behavior that benefits suppliers, consumers, and society alike. But when incentives get out of whack, people begin to behave in ways that benefit themselves, often at the expense of others or even at their own cost down the road. Whatever changes we make to our health care system (and there’s always more than one way to skin a cat), we must be sure to align incentives so that the resulting behavior in each part of the system contributes to its sustainability rather than its ruin.