The Healthcare Meltdown – Part III

Medicare

<< Back to Part I: How Insurance Works
<< Back to Part II: How Medical Insurance Was Broken

Thanks for sticking with the series so far. I know the last section was long and technical and I appreciate you slogging through it. The next parts should be a breeze in comparison.

In part II we saw how the employer tax deduction for health insurance tied insurance to jobs, transformed insurance from a protection against disaster to a way to obtain all care at a discount, and led to overutilization of care by shifting costs away from patient payments to premiums taken out of paychecks. Tying health insurance to patients’ jobs created another problem. It left retirees out of the tax subsidy that employees enjoyed. As healthcare became more and more expensive there was increased political pressure to extend affordable coverage to retirees.

(Note again how after World War II the language changed from getting patients high quality affordable care to getting them insurance coverage. Insurance became essentially the only way to access care.)

A simple way to level the playing field for retirees, and to undo the myriad problems detailed in the last post, would have been to eliminate the employer tax deduction. But doctors and hospitals were not keen on giving up their enormous subsidy. So in 1965 amendments to the Social Security Act created Medicare (and Medicaid). Medicare provides health insurance to Americans 65 and older and is funded by a payroll tax.

Medicare magnified many of the problems of the employer tax exemption. By creating a whole group of patients whose care was paid indirectly by all employees, costs were redistributed over even more people, further reducing any incentive to conserve. Utilization and costs rose even faster. The cost of Medicare doubled every four years between 1966 and 1980. This explosion in costs further cemented the fallacy in the minds of most patients that care is something that can only be obtained through insurance.
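
To get a feel for what doubling every four years means, here is the arithmetic (a quick sketch; only the doubling figure comes from the text above, the rest is just math):

```python
# "Doubling every four years" as an annual growth rate and as total growth
# over the 1966-1980 period (14 years, or 3.5 doublings).
annual_growth = 2 ** (1 / 4) - 1      # about 0.189, i.e. roughly 19% per year
total_growth = 2 ** (14 / 4)          # about 11.3x over the 14 years

print(round(annual_growth * 100, 1))  # 18.9
print(round(total_growth, 1))         # 11.3
```

In other words, a program that doubles in cost every four years grows more than tenfold in a decade and a half, which is why the explosion was so alarming.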

Now, I take care of lots of Medicare beneficiaries, and I can hear some of you objecting: “I might be able to afford to buy insurance if I didn’t have Medicare, but lots of people couldn’t. Have you seen how much medicines and doctor visits cost? Most people just can’t afford that.”

But that’s just it. Medicines and healthcare are expensive because they’re covered by insurance and patients aren’t exposed to the actual cost. There’s no viable business that produces goods and services that no one can afford. If patients paid directly, prices would plummet and patients (not insurance bureaucrats) would have to make the difficult decisions about how much they wanted to spend for the latest drug when a cheaper one is almost as good, or for the latest unproven surgery, or for the latest unproven scan. The point is that different patients would make different decisions, but many would save a lot of money by consuming less care than they do now. Catastrophic insurance would then be inexpensive and rarely used.

There will always be the truly indigent who cannot afford any care at any price. They would rely on Medicaid, county facilities and private charities. But as in other marketplaces, such as food, housing and transportation, they would be a tiny fraction of the population. Most people could afford their own care. The miscalculation that the insurance industry wants you to keep making is that insurance is what makes care affordable. Insurance, including Medicare, is what makes care unaffordable.

Next week, I’ll propose some changes that would actually help, though they are less likely to happen than my being drafted by the Lakers.

Forward to Part IV: A Recipe for Reform >>


The Healthcare Meltdown – Part II

How Medical Insurance Was Broken

<< Back to Part I: How Insurance Works

In the last post we learned about the legitimate, valuable role that insurance plays in collectivizing risk.  In this post I will explain how that model broke for health insurance, and how we are still suffering the consequences.

It seems incredible that our current difficulties with healthcare originated in the 1940s, but that is exactly the case.  During World War II the federal government imposed wage and price controls.  Because wages were kept below their market value, companies had a hard time attracting excellent applicants.  So to circumvent the wage cap companies began offering increasingly generous fringe benefits.  One of these benefits was health insurance.  Employer-provided health insurance became so popular that Congress passed a law making it tax deductible.  The wage and price controls were abolished after the war and are long forgotten, but the employer tax deduction for health insurance is with us over six decades later and has thoroughly disfigured the healthcare marketplace.  Let me explain how.

Remember the example of Bob’s Insurance Company in the last post?  Imagine, for example, a world in which employer-provided home insurance became tax deductible.  Everyone would get their home insurance policy through their work.  Let’s also agree, just to keep the numbers simple, that most employees pay about a quarter of their income in taxes.  The homeowner’s policy in the last post cost $1.50 per year, but prior to the tax exemption law each family would have to generate $2 in income to pay 50 cents in tax (25%) and have $1.50 left over to pay as their insurance premium.  After the tax exemption law passes, their boss can buy insurance for each employee for $1.50 pre-tax, leaving the remaining $0.50 of pre-tax income for additional salary or other benefits.

But this also completely skews what insurance can and should be used for.  In the last post we explained why insurance is a terrible deal for routine, frequent expenses.  Now, imagine that a gardening service costs $150 per month.  Prior to the tax exemption, this would require $200 in pre-tax income (since a quarter, or $50, would go to taxes).  But now Bob has a strong incentive to sell a policy that pays for gardening.  He can offer the policy for $175 per month, which the employer can deduct from the employee’s salary.  Since employers can buy the insurance with pre-tax dollars, it’s suddenly cheaper to hire a gardener through your job-provided home insurance than directly.  Everyone wins in the short term:  Bob can pay the gardener $150 and still make $25 in profit, the employee gets a gardener for $175 rather than $200, and the employer saves on payroll taxes.
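
The arithmetic above can be laid out in a few lines (a sketch using only the numbers from the example: a 25% flat tax, a $150 gardener, and Bob’s $175 policy):

```python
TAX_RATE = 0.25        # flat 25% tax, as in the example
gardener_cost = 150.0  # monthly cost of the gardening service
policy_price = 175.0   # what Bob charges for a policy that covers gardening

# Hiring the gardener directly requires enough pre-tax income to cover
# both the tax and the $150 bill.
income_needed_direct = gardener_cost / (1 - TAX_RATE)  # $200

# The employer buys the policy with pre-tax dollars, so the cost to the
# employee is just the premium itself.
income_needed_via_insurance = policy_price             # $175

print(income_needed_direct)         # 200.0
print(income_needed_via_insurance)  # 175.0
```

Bob’s middleman markup ($175 for a $150 service) is smaller than the tax wedge ($50), so routing the purchase through insurance wins even though it wastes $25.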

See what’s happened?  The tax deduction has made it cheaper to buy something through pre-tax insurance than to buy the same thing directly with post-tax wages.  Suddenly the insurance company has been transformed from a protection against unpredictable and unaffordable disaster to a way to get routine services at a discount.

Obviously, in the real world the tax deduction was for health insurance not home insurance.  Now we begin to understand why health insurance has become so dysfunctional, even though most other types of insurance (think of life, auto, home…) are relatively inexpensive and almost never used.  The reason is that it became cheaper for all of us to get routine care through our employer-provided insurance rather than to pay for it ourselves.

But this had other destructive consequences.  First of all, it ties our ability to obtain routine healthcare to our jobs.  With equally important goods and services, like food and housing, losing your job doesn’t cost you access to the marketplace; in this new system, losing your job usually means losing access to healthcare.  In the old system healthcare was just like food and housing – your employer gave you a salary and you shopped for the quality and price that were right for you.  If you lost your job you might have less money, but you were still in control of what you wanted to spend your money on.  You could still afford a doctor’s visit.  Even if you briefly lost your insurance, that wouldn’t have been as big a loss as it is today, since insurance only covered disasters.  Now, insurance is the key to the doctor’s waiting room.

This handcuffing of employees to their jobs would have been bad enough, but the tax deduction had other perverse incentives.  For the employee, it made healthcare expenditures cheaper than general household expenses.  Using the simple example of 25% taxes, $400 in income can be used to either buy $400 in additional health insurance or $300 in other household expenses (after paying $100 in taxes).  So this scheme encourages more spending on healthcare and less on everything else.  Of course, doctors and hospitals didn’t complain as this amounted to a huge (but hidden) tax subsidy of the healthcare industry.

But a bigger problem caused by the tax exemption was that suddenly routine care was “covered”.  To individual patients/employees that meant that their employer was paying for their routine care by paying for their insurance and that their out-of-pocket expense for any specific item of care was very low.  This encouraged over-utilization.  Any economics student knows that when anything gets cheaper consumers will demand more of it.  Routine care became cheaper to the patient, so patients wanted more.  But remember what we learned in part I.  Redistributing cost through insurance doesn’t make anything cheaper.  It adds a layer of expense since the insurance company keeps some of the money as profit.  But this additional expense isn’t paid by the patient at the time care is delivered.  It’s siphoned out of his paycheck whether he goes to the doctor or not.  Each visit is cheap to the patient, so there is no incentive to conserve; the only incentive is to consume.

Predictably, utilization of healthcare and prices for services exploded.  In fact, since World War II healthcare prices have consistently risen faster than general inflation.

This led to efforts to control costs and utilization through increasingly complex bureaucracy.  Managed care was born.  Physicians had to comply with progressively more onerous rules about what was covered and what wasn’t.  Increasingly physicians worked for a boss other than their patient who dictated the quality and the reimbursement for the care they delivered.  Patients found themselves unable to demand quality and shop for a better price (like they could in every other marketplace) while the amount taken out of their paycheck for their insurance continued to climb.

To summarize, the employer tax deduction for health insurance led directly (though unintentionally) to a system in which

  • Employees lose access to care when they lose their job,
  • There is a bias toward spending on healthcare versus other household expenses,
  • All care (rather than just catastrophic care) is purchased through insurance,
  • Utilization and costs are not constrained by price and must be constrained bureaucratically, and
  • Doctors are increasingly paid by, regulated by, and answerable to third-party payers, not patients.

This sounds horrific enough, but in 1965 we demonstrated that no marketplace is so terrible that we can’t make it worse.  That will be the subject of next week’s post in part III.  In two weeks the series will conclude with my suggestions of some ways out of this mess.

Forward to Part III: Medicare >>
Forward to Part IV: A Recipe for Reform >>


The Healthcare Meltdown – Part I

How Insurance Works

For many families healthcare is increasingly expensive while simultaneously increasingly mediocre.  A recent study in the American Journal of Medicine found that two thirds of bankruptcies were due in part to medical expenses, and surprisingly, over three quarters of the individuals going bankrupt had health insurance.  There is no denying that the American healthcare marketplace is broken.  The problem for many of us is that we don’t know enough history to understand how it broke and we don’t know enough economics to know how to fix it.  So we’re left listening to politicians, insurance lobbyists and doctors’ groups, each of which has its own (self-interested) agenda.

I would like in the next few posts to explain how American healthcare got here.  The complexity of the problem works to our disadvantage because it makes us reach for any solution without understanding the details, and some of the currently proposed solutions would be even worse than the status quo.  (And some solutions have already been tried in other countries or in American states with disappointing results.)  Every involved group has a lobby advancing its interests except patients and taxpayers (who are approximately the same people), so it’s hard to imagine a good outcome unless we all accept the difficult burden of democracy: informing ourselves.

The following may seem obvious to the point of condescension to those with a math or economics background, but please bear with me as I try to make the details clear to a broader audience.

Before we start, we have to understand the legitimate purpose of insurance and how it works in settings other than healthcare.  That’s the goal of this first post.

To understand the point of insurance, let’s imagine a world in which insurance hasn’t been invented yet.  Let’s imagine a city of a million homes and let’s say that the homes average one million dollars in value.  Now, occasionally some unforeseen disaster happens – a fire burns a house down.  And let’s also assume that this happens on average to one home per year.  To the average family in this town this would be financially ruinous.  They would be unable to afford rebuilding their home and would lose the lifetime of work that was their equity in their home.  Besides the families who are irreversibly impoverished by the actual fire, many other families are very worried that their house could be next.

So one of the town residents, Bob, finds a solution.  He realizes that the loss of a house is too big of a loss for any single family to afford, but not for the whole town.  So he suggests that the town protect itself by having each family pay $1.50 into a fund every year.  That fund would be used to rebuild any house that is lost to fire.  Since on average the fund would pay out $1,000,000 every year but would take in $1,500,000 (since a million families are each paying $1.50) there should be extra money left over to pay Bob to handle the administrative work, make a profit, and save for the occasional year that two houses burn down.  Thus the first insurance company is born.
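
The town’s arithmetic, using only the numbers in the story, looks like this:

```python
homes = 1_000_000          # homes in the town
premium = 1.50             # dollars each family pays per year
fires_per_year = 1         # average number of houses lost annually
cost_per_fire = 1_000_000  # dollars to rebuild a house

collected = homes * premium                       # $1,500,000 collected per year
expected_payout = fires_per_year * cost_per_fire  # $1,000,000 paid out on average
bobs_margin = collected - expected_payout         # $500,000 for admin, profit, reserves

# Each family's average annual loss without insurance is tiny...
expected_loss_per_family = cost_per_fire * fires_per_year / homes  # $1.00
# ...but the worst case is ruinous. The extra 50 cents per family per year
# converts a ruinous risk into a small, predictable expense.

print(collected, expected_payout, bobs_margin)  # 1500000.0 1000000 500000.0
```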

The important point to learn here is that the town is losing money on every house it rebuilds, since it’s paying a middleman, Bob, to rebuild the houses.  It’s paying a million and a half dollars annually for a million-dollar house.  Nevertheless, everyone wins, because what each family is purchasing with the extra money that Bob keeps is peace of mind.  By collectivizing their risk, they each lose a little money but avoid going broke.  That’s the legitimate service that insurance provides: the insurance company takes a risk off your hands and makes a profit for doing so.

The take-home points are that

  • Insurance is valuable for events that are both unpredictable and unaffordable.
  • Insurance doesn’t make anything cheaper.  It makes it more expensive but distributes the cost over many people.

So in general you should never buy insurance against an event that is affordable.  For example, buying an extended warranty on a new TV is rarely a good idea.  If the TV breaks, most of us could survive without it until we saved up enough to buy a new one.  Buying the insurance just means paying extra to buy the replacement through a middleman.  Since the company selling you the insurance knows the likelihood that the TV will break (and you don’t), the price of the policy will always be more than enough to cover the risk and make the company a profit.  Unless you’ll lose sleep over the event you’re insuring against, that’s a bad deal.
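
A rough expected-value sketch shows why (the TV price, failure rate, and warranty price here are invented purely for illustration):

```python
tv_price = 500.0            # hypothetical TV price
failure_probability = 0.05  # hypothetical 5% chance of failure during the warranty
warranty_price = 60.0       # hypothetical price of the extended warranty

# The average cost of skipping the warranty is the expected loss.
expected_loss = failure_probability * tv_price  # $25

print(expected_loss, warranty_price)  # 25.0 60.0
# The seller, who knows the true failure rate, prices the warranty well
# above the expected loss. Unless the loss would be unaffordable, the
# buyer is paying a markup for nothing but peace of mind.
```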

For the same reason, it doesn’t make sense to insure against an event that happens frequently.  For example, if the residents in Bob’s town wanted their insurance policy to also pay for their weekly visit from the gardener or to repaint their house every year, they would be foolish.  Bob would be happy to sell them such a policy, but would charge them more than the gardener or the painting would cost.  It’s much cheaper for each family to pay for predictable costs themselves and only buy insurance for rare and devastating ones.

This is how most insurance works when it works well, and this is how American health insurance worked for a long time.  It covered only catastrophes.  For everything else, patients paid themselves.  Doctors and pharmacists set their prices and patients paid them.  Health insurance was relatively inexpensive and was used rarely.  Doctors were affordable because they had to be; an unaffordable doctor would have no patients.  So how did we end up with health insurance that is both expensive and doesn’t protect people from bankruptcy?

In 1943, with the best of intentions, the old health insurance system was destroyed.  Sixty-six years later we are still reeling from the consequences.  Next week I’ll explain what happened.

Learn more:

Wall Street Journal Health Blog:  Medical Bills Are Found Linked to Most Bankruptcies

Forward to Part II:  How Medical Insurance Was Broken >>
Forward to Part III:  Medicare >>
Forward to Part IV: A Recipe for Reform >>


Angst about Acetaminophen

When many of us get a headache, a fever, or just suffer the aches and pains of physical exertion we don’t think twice about reaching for an over-the-counter pain reliever.  Acetaminophen, which is the medicine in the well-known brand Tylenol, has long been considered the safest pain medication.  Non-steroidal anti-inflammatory medications (NSAIDs) can cause stomach irritation and ulcers and can decrease kidney function.  Opiates (morphine and its relatives) can cause drowsiness, constipation and addiction.  Acetaminophen has none of these side effects and remains the first choice of many physicians when safety (not efficacy) is paramount.

But yesterday an FDA working group released a report reminding us that even acetaminophen has risks.  Specifically, acetaminophen in high doses can cause serious, even fatal, liver injury.  Liver injury can happen at even lower doses in people who drink alcohol regularly or who have other liver diseases.  Every year some people die of liver failure due to acetaminophen overdose.  Some of these overdoses are intentional, and some are due to misunderstanding medications with multiple ingredients.  For example, some prescription medicines like Vicodin and Percocet contain an opiate pain medicine and also acetaminophen.  Patients who don’t know this and take Tylenol in addition may inadvertently take a dangerously high dose of acetaminophen.

The FDA working group recommended lowering the maximum adult daily dose of acetaminophen to 3,250 mg.  (It’s currently 4,000 mg.)  The maximum dose would be lower still for patients drinking three or more alcoholic drinks daily.  The group also recommended eliminating the 500 mg “extra strength” tablet, limiting tablets to 325 mg and single adult doses to a maximum of 650 mg.

I still think acetaminophen is the safest available pain reliever. We all need to be more careful about keeping track of the ingredients in the over-the-counter and prescription medicines we take, and in the case of acetaminophen, we need to keep a close eye on our total daily dose.
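
As a back-of-the-envelope illustration of how quickly the total can climb (the 325 mg and 500 mg tablet sizes come from the report above; the dosing schedule itself is invented, so check your own labels):

```python
# One hypothetical day's acetaminophen intake across multiple products.
doses_mg = [
    500, 500,  # two extra-strength tablets in the morning
    325, 325,  # two combination (opiate + acetaminophen) pills at midday
    500, 500,  # two more extra-strength tablets in the afternoon
    325, 325,  # two more combination pills at night
]
total_mg = sum(doses_mg)
print(total_mg)  # 3300
# 3,300 mg exceeds the proposed 3,250 mg daily limit even though no single
# dose seemed large; this is the inadvertent-overdose scenario the FDA
# working group worries about.
```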

Learn more:

ABC News article:  FDA Group Issues Cautions on Acetaminophen Overdose

FDA report:  Recommendations for FDA Interventions to Decrease the Occurrence of Acetaminophen Hepatotoxicity (The report is 286 pages long.  I recommend reading the executive summary on the fifth page.)


Torpedoing Primary Care

For the last few years the future of primary care has been looking bleak.  Fewer and fewer medical students are choosing primary care careers, just as baby-boomers retire and will need more care.  Primary care physicians, meanwhile, are retiring early or cutting back their practices in record numbers, worsening the coming shortage.

The current issue of the Annals of Internal Medicine publishes a perspective article by Dr. David Norenberg that heaps on the gloom.  Describing himself as closer to the end of his career than the beginning, he mourns the end of the golden age of primary care.  While he was attracted to the close and prolonged relationship with patients that only primary care can provide, he sees young students turned off by the recent trends in medicine – insurance companies dictating care, reimbursement set by arcane algorithms, and a focus on quantity, not quality.

A recent Medical Economics article (link below) lends credence to Dr. Norenberg’s pessimism.  It details how next year’s residency positions were filled by medical school graduates.  Yet again, the number of graduates choosing primary care has declined as students stampede into subspecialties.

Dr. Norenberg’s observations are right on the mark.  Then he proceeds to offer a solution that is sure to fail: a single-payer medical system based on Medicare that pays primary care doctors more.  Versions of this general scheme – giving everyone Medicare either as the sole insurer or as a “public option” alongside private insurance – are being considered as possible overhauls of our healthcare system.  Meanwhile, just this month the Government Accountability Office announced that Medicare will run out of money by 2017.

The hull of the ship is leaking.  Time to board more passengers.

Demanding that insurers (either private or government) pay primary care doctors more will only lead to an internecine fight with specialists over who gets a bigger slice of the pie.  We miss the bigger picture that the whole pie will be gone in a decade.  We’re fighting over crumbs.

The case that primary care is valuable must be made to patients, not policy makers.  Patients will vote with their own dollars and decide for themselves the kind of healthcare they prefer.  The insurance model in which we all pay for each other’s care is failing catastrophically, but because of entrenched interests we will stay on that sinking ship until the water is up to our necks.

Eventually, out of the wreckage, patients will build a new system in which they each largely pay for their own care, using insurance only for unforeseen disasters.  How long that takes depends on when we notice the water rising.  Some of us are already heading for the lifeboats.

Learn more:

Annals of Internal Medicine article:  The Demise of Primary Care: A Diatribe From the Trenches

Medical Economics article:  “Match Day” delivers another blow to primary care

Financial Times article:  Medicare forecast to run out of money in 2017

Previous related posts:

Will Primary Care Survive?

On Being Doc and Being Happy

Pay for Performance: Peril for Patients


Folic Acid: Fabulous for Fertile Females, Feckless for Fellows

Folic acid, a vitamin found naturally in green leafy vegetables and legumes, is essential for making the building blocks of DNA.  And since copying DNA is an important part of what cells do before they divide, it’s critical for cell division.  Developing fetuses have very rapidly dividing cells, so it’s not surprising that folic acid deficiency has been linked to birth defects, specifically brain and spinal cord abnormalities.

To prevent these birth defects, physicians have for many years recommended folic acid supplements to pregnant women and women planning pregnancy.  The problem is that folic acid deficiency harms babies in the first weeks of pregnancy, before many women know they’re pregnant and before they seek prenatal care.  In an effort to end folic acid deficiency more comprehensively, in 1998 the U.S. began requiring that flour and other grains be fortified with folic acid.  The incidence of brain and spinal cord birth defects subsequently declined.

So if folic acid is good for pregnant women, might it have benefits for everyone else?

Well, unfortunately, no other major benefits have been found from taking folic acid supplements.  Folic acid deficiency can cause anemia, but that’s rare and is easily treated (with folic acid!) when diagnosed.  Back in the 1940s it was noted that leukemia patients tended to have low folic acid levels.  It was hypothesized that folic acid deficiency played a role in leukemia, and a trial was done in which leukemia patients were given folic acid.  Surprisingly, they died sooner than the patients getting placebo.  Their folic acid levels were low because the folic acid was being used up by their rapidly dividing leukemia cells; giving them more helped the leukemia cells divide faster.

Since then folic acid supplementation has been linked with other cancers.  Though the findings are not definitive, given the absence of proven benefit for anyone other than women of child-bearing age, there is no compelling reason to recommend folic acid for everyone.

This Monday’s LA Times had two very helpful articles summarizing the controversy.

The U.S. Preventive Services Task Force (USPSTF) reviewed the evidence on folic acid and reissued their recommendations this month.  The USPSTF recommends that all women planning or capable of pregnancy take a daily supplement containing 0.4 to 0.8 mg (400 to 800 micrograms) of folic acid.  There is no recommendation for men or for women not in their child-bearing years.

The rest of us should probably just eat our veggies.  If you do take a folic acid supplement (and I don’t) make sure it doesn’t contain more than 1 mg (1,000 micrograms) of folic acid.

Thanks to Ron T. for pointing me to the LA Times articles.

Tangential miscellany:

On Memorial Day my partner, Dr. Dorothy Lowe, and some of our staff and I will be riding in the Acura LA Bike Tour.  Register to ride with us, or come out to cheer and watch the spectacle of thousands of riders taking to the streets.  You don’t have other plans for 5 a.m., do you?

Learn more:

LA Times articles:  Folic acid might be losing its sheen and Folic acid is important, but take care not to overdo it

Folic Acid for the Prevention of Neural Tube Defects: U.S. Preventive Services Task Force Recommendation Statement


Vaccine Refusal: Turning Back Two Centuries of Progress

Vaccines have become victims of their own success.  In 1809 Massachusetts became the first state to pass a law requiring a vaccination – against smallpox – ushering in a series of public health victories over a number of serious diseases.  In the past 200 years smallpox has been eradicated, and measles, polio, rubella and tetanus have become so rare that they have disappeared from public consciousness.

The number of children who contract vaccine-preventable diseases today is tiny compared to the number before the era of vaccines.  Before measles vaccination there were 500,000 reported cases of measles annually in the US.  In the last few years the average has been 62 per year.

Perhaps because of this spectacular success, parents are now much less aware of the terrible consequences of vaccine-preventable diseases, and some are increasingly concerned about the risks of vaccines.  Despite the fact that the risks associated with vaccines are extremely small, unfounded rumors and beliefs about these risks continue to circulate.  In the last few years increasing numbers of parents have been refusing vaccination for their children.

An important article in this week’s New England Journal of Medicine summarized the trends in vaccine refusal, the reasons parents cite for refusal, and the risks of refusal.  The article supports what we already thought we knew.  Obviously, unvaccinated children are more likely to contract vaccine-preventable diseases.  But more importantly, clusters of unvaccinated children put the other children around them at risk.  For example, children who cannot be vaccinated because of medical problems depend on the general immunity of their surrounding community to keep them healthy.  Children whose parents refuse vaccination put those children at risk.

Southern California is known for its wonderful heterogeneity of ideas and lifestyles.  We think ourselves cool because we drink free range coffee and eat nothing but organic tofu and weave our sandals from post-consumer hemp.  But some ideas, besides being false, are also profoundly harmful.  While public health officials struggle with crafting policies to make vaccination more ubiquitous, you and I have to make it clear that refusing to vaccinate your kids is just not cool.

Tangential miscellany:

I received many positive responses (and some new readers) from last week’s post about the virus previously known as swine flu.  I hope the new readers aren’t bored when I get back to writing about diabetes and cancer screening.  Most health topics aren’t as funny as the potential worldwide spread of a new virus!

There is much less buzz (thank goodness) about H1N1 (swine) flu this week, but I thought a brief update would be useful.

  • Over 30,000 people die of the regular garden-variety flu in the US annually.
  • Swine flu by any other name is still not transmissible by eating pork.
  • As of today, the number of confirmed cases in the US is 1639, and the number of deaths is 2.  Both deaths were in patients with other chronic health problems.
  • The number of Tamiflu prescriptions I’ve written since this started is zero.
  • I still stand by everything I said last week.  To clarify, the reason you shouldn’t panic isn’t because you won’t get it.  We’ll all get it (or a vaccine) eventually.  The reason not to panic is that it won’t be that bad.
  • In my opinion the CDC and WHO have handled this epidemic wonderfully and the media have handled it terribly.
  • If in retrospect you believe that you were made more scared about this than you should have been, maybe it’s time to stop getting information from television.  Get your information on the web from reliable sources.  (See links below.)

Learn more:

New England Journal of Medicine article:  Vaccine Refusal, Mandatory Immunization, and the Risks of Vaccine-Preventable Diseases

A year ago I wrote about a measles outbreak in unvaccinated kids:  U.S. Measles Cases at Highest Numbers Since 2001

My post last week on the virus previously known as swine flu:  Swine Flu: Unlikely to End the World

The Centers for Disease Control page on H1N1 (swine) flu


Swine Flu: Unlikely to End the World

I thought it might be a good idea to write my weekly post early this week since there is so much anxiety about swine flu.

The media and officials in many countries have contributed to much fear and misunderstanding which may turn out to be more harmful than swine flu itself.  Let me try to shed some light without raising the heat.

The swine flu virus has been around for a long time as a cause of respiratory illness in pigs.  Sporadically, it has caused illness in humans who had a lot of contact with pigs.  What’s unusual now, and causing concern, is that the swine flu virus has for the first time evolved the ability to be transmitted from person to person.  (By the way, though unfortunate, this is a beautiful demonstration of evolution happening before our eyes.  Yay, Darwin!)  The swine flu that is currently infecting humans is a blend of genes from the avian flu, the human flu and the swine flu.  Not surprisingly, given its makeup, it causes an illness in humans just like the flu: (human) patients with the swine flu have fever, muscle aches, a cough and a sore throat, just like a regular flu.

At this point the attentive reader is asking “But the swine flu must be worse than the regular flu, otherwise, why the big kerfuffle?”  Nope.  There is no reason to believe that the swine flu causes an illness worse than the regular flu.  As of today, 40 cases have been identified in the US.  There have been no fatalities and only one hospitalization.  All of the patients have recovered.

“So why the hubbub?”

Well, what makes this unusual, and to public health officials potentially worrisome, is that a new virus that can spread from human to human has about 6.7 billion potential hosts to infect who don’t have immunity to it.  Old viruses, like measles or chicken pox, can cause a little outbreak here or there, but most people are immune either due to vaccination or because they already had the disease.  Other old viruses like the regular flu or cold viruses constantly change, so they can infect new people all the time, but there’s still only a limited population that hasn’t been exposed to the current strain.  What’s different now is that, depending on the rate of spread of this virus, we might all catch it at about the same time.  That’s not a big deal for most of us.  Again, the illness won’t be worse than a regular flu.  But imagine if even a quarter of the 3.8 million people in Los Angeles had the flu at the same time.  The consequences for the vast majority of individuals would be just a major inconvenience, but for the frailest among us, and for public health and safety services, it would be catastrophic.

“But what about all the fatalities in Mexico?  Those numbers sound a lot worse than a regular flu.”

There have been some fatalities from respiratory illnesses in Mexico.  Some of them have not been confirmed to be from the swine flu, but some have.  Some of the apparent difference in severity between the cases in the US and the cases in Mexico has to do with the very aggressive surveillance and tracking being done by the CDC in the US.  The CDC is aggressively trying to find as many cases as possible, so the 40 cases that we know about are the result of those efforts.  In Mexico there’s no way to know how many cases there have been.  Some of the cases only came to the attention of public health officials because they were fatalities.  So Mexico may have had thousands or tens of thousands of undiagnosed cases that recovered without treatment, and dozens of fatalities that were noticed.  Just like a regular flu season.  (Epidemiologists call this ascertainment bias – an apparent difference between two groups, when no actual difference exists, caused by detecting cases in the two groups differently.)
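For readers who like to see the arithmetic, here is a toy simulation of that surveillance effect.  The numbers are entirely made up for illustration (they are not real swine flu data): both "countries" have the same infections and the same true fatality rate, but the one that mostly notices severe cases appears to have a far deadlier disease.

```python
import random

random.seed(0)

# Made-up numbers for illustration only.
TRUE_FATALITY_RATE = 0.001    # same true rate in both countries
TRUE_CASES = 100_000          # same true number of infections

# True outcomes: a tiny fraction of infections are fatal.
outcomes = [random.random() < TRUE_FATALITY_RATE for _ in range(TRUE_CASES)]

# Country A: aggressive surveillance detects 40% of ALL cases, mild or severe.
detected_a = [fatal for fatal in outcomes if random.random() < 0.40]

# Country B: passive surveillance detects 90% of fatal cases
# but only 1% of mild ones.
detected_b = [fatal for fatal in outcomes
              if random.random() < (0.90 if fatal else 0.01)]

rate_a = sum(detected_a) / len(detected_a)
rate_b = sum(detected_b) / len(detected_b)
print(f"Country A apparent fatality rate: {rate_a:.2%}")
print(f"Country B apparent fatality rate: {rate_b:.2%}")
```

Country B's apparent fatality rate comes out dozens of times higher than Country A's, even though the underlying illness is identical.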

“But won’t the swine flu get more lethal as it mutates?”

There’s no way to tell how the swine flu will change in the future, but what we know about previous epidemics suggests that it won’t get more lethal.  A mutation that makes the virus more quickly incapacitating and lethal is unlikely to be passed on to other hosts.  On the other hand a mutation that makes the disease milder so that the host goes to work, doesn’t feel so bad, and coughs all over his coworkers for the next two weeks before recovering will infect a lot more people.  That’s why in general epidemics get milder as time goes on, not more severe.

“I’ve been avoiding eating pork.”

That might be good for your cholesterol, but you can’t get swine flu from eating pork.  It’s transmitted from person to person, like the regular flu.

“What do I do if I get sick?”

If you develop flu symptoms, don’t go to work.  Cover your cough.  Call or see your doctor right away, since the same anti-viral medicines that shorten the duration of the regular flu also work for the swine flu.

“So I should call my doctor now, and demand a prescription for anti-viral medicine just to have around in case I get sick later?”

No.  You should not get anti-viral medicine unless you are sick.  We will not run out of anti-viral medicine.  By the way, what happened to that Cipro you forced me to prescribe for you back in 2001 because you were worried about anthrax?

“I still have that.  It’s got cobwebs on it.  Is there anything I can do to avoid getting sick?”

Avoid sick people.  Wash your hands frequently.

“So we’re not all doomed?”

No.  The world will not end because of the swine flu.

“But what about the coming zombie apocalypse?”

Well!  Look at the time!  Gotta go.

Learn more:

The Centers for Disease Control and Prevention swine flu information page

Today’s press briefing from the Centers for Disease Control and Prevention

Do your homework about zombies before it’s too late


What We Don’t Know About Diabetes – Part 3

In the last year I’ve written about a major change in our understanding of diabetes treatment.  The goal of treatment used to be to get blood glucose as close to normal non-diabetic levels as possible.  That usually meant increasing medication doses or adding medications until the glycated hemoglobin was down to normal.  (Glycated hemoglobin, or hemoglobin A1c, is a blood test that measures the average blood glucose over the previous 3 months.)  Targeting a normal glycated hemoglobin frequently meant complex medication regimens, occasional low blood sugars and side effects from the medications.  But we were willing to accept this because we thought that getting the glycated hemoglobin down to normal prevented strokes, heart attacks, and other complications of diabetes.
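As an aside for readers who like numbers: a hemoglobin A1c can be translated into an estimated average glucose using the commonly cited linear conversion from the ADAG study (estimated average glucose in mg/dL ≈ 28.7 × A1c − 46.7).  A small sketch of that conversion:

```python
def a1c_to_avg_glucose(a1c_percent: float) -> float:
    """Estimated average glucose in mg/dL for a given hemoglobin A1c (%).

    Uses the linear relationship reported by the ADAG study:
    eAG = 28.7 * A1c - 46.7
    """
    return 28.7 * a1c_percent - 46.7

# A normal A1c of about 5% works out to roughly 97 mg/dL;
# an A1c of 7% corresponds to roughly 154 mg/dL.
for a1c in (5.0, 6.0, 7.0, 8.0):
    print(f"A1c {a1c:.1f}%  ->  ~{a1c_to_avg_glucose(a1c):.0f} mg/dL average glucose")
```

This is only an estimate of the 3-month average; it says nothing about the day-to-day swings between high and low sugars.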

Then the ACCORD and ADVANCE studies turned that thinking on its head.  They were both large randomized trials that looked at the effects of strict glucose control on outcomes in diabetes.  Rather than confirm the benefits we assumed, they showed that diabetics who were treated to get their glucose down to the normal range did no better (and in one trial did worse) than diabetics who were treated with more lax glucose control.  Suddenly the goal for the glucose and the glycated hemoglobin was no longer clear.  How low should we go?

A perspective article just released in the Annals of Internal Medicine attempts to review the issue and make recommendations.  The review reminds us of what we definitely know about tight blood-sugar control in diabetes: it causes more frequent hypoglycemia, more weight gain, and frequently requires a more complex and more expensive medication regimen.  But the benefits are now uncertain.  The article suggests that we instead focus on the interventions that have proven benefit in diabetes: smoking cessation, blood pressure control, cholesterol control, dietary modification and exercise.

The authors suggest for the glucose goals:

“Glycemic control efforts should individualize hemoglobin A1c targets so that those targets and the actions necessary to achieve them reflect patients’ personal and clinical context and their informed values and preferences.”

Which sounds to me like a nice way of saying “We’re not really sure what we’re doing any more, so try to prescribe medicines that the patient can afford and won’t cause harm, and focus on the sugars a little less.”

Learn more:

Annals of Internal Medicine article:  Glycemic Control in Type 2 Diabetes: Time for an Evidence-Based About-Face?

My previous posts:

What We Don’t Know About Diabetes – Part 1

What We Don’t Know About Diabetes – Part 2


The Common Cold

Several of my patients have developed nasty colds in the last few weeks, so it seemed like a good time to cover this perennial source of misery.  Even though the cold is one of the most common illnesses, many people are still confused about how to treat it and how to distinguish it from other illnesses.

Symptoms
Colds typically cause a scratchy or sore throat, runny or congested nose, cough and fatigue.  There is usually no fever.

Cause
Colds are caused by viruses, typically rhinovirus, coronavirus or respiratory syncytial virus, though many other viruses can be the culprit.

Diagnosis
No specific test is routinely done to diagnose a cold.  The diagnosis is made by the presence of typical symptoms in the absence of symptoms suggesting another diagnosis.

Treatment
The treatment of colds causes much misunderstanding and grief.  Antibiotics don’t help, and in fact nothing has been found that decreases the duration of symptoms.  The cold always resolves, which makes it a prime candidate for quackery: whatever you take for your cold, you’ll definitely improve.  Nevertheless, many of my patients swear by vitamin C and Echinacea despite the consistent evidence for their lack of efficacy.  The best you can do is treat the symptoms to minimize the misery until the cold resolves on its own.

Pseudoephedrine (Sudafed and generic store brands) is fairly effective for treating nasal congestion.  That can also help decrease the sore throat and the cough that are frequently caused by post-nasal drip.  In California, pseudoephedrine is no longer on the shelf.  Patients need to ask for it at the pharmacy counter and show identification.  This has also caused much confusion, as patients have mistakenly purchased phenylephrine (Sudafed PE), which is available on the shelf but is less effective.  Pseudoephedrine should be used with caution in patients with high blood pressure and in men over 50.  It also makes some people feel jittery and can cause insomnia.  For patients who cannot tolerate pseudoephedrine, there is a safe alternative by prescription.

Over the counter cough suppressants (containing dextromethorphan) can help decrease coughing, though they are usually only modestly effective.  Pain relievers can help with sore throat.  Non-medicinal alternatives like inhaling steam can help loosen mucus and can soothe irritated airways.

Prevention
Colds are very infectious, and no prevention method is perfect.  Frequent hand washing and avoiding people with colds are probably the most effective steps.

Some other diagnoses that should be excluded

  • Streptococcal pharyngitis, or Strep throat, is usually marked by a severe sore throat, fever and swollen lymph nodes in the neck.  Nasal symptoms and cough are usually absent.  Viruses frequently cause these same symptoms, so a rapid Strep test or throat culture should be done to confirm the presence of Strep.  Strep throat requires antibiotic treatment.
  • Otitis media (middle ear infection) usually causes pain in one ear.  Fever and nasal congestion may be present.  It is usually diagnosed by the doctor looking at the eardrum.  It is usually treated with antibiotics.
  • Influenza (the flu) causes high fevers, chills, cough and diffuse body aches.  It can be diagnosed with a nasal swab and treated with antiviral medicines, ideally in the first 48 hours of symptoms, so call your doctor right away.
  • Acute sinusitis (sinus infection) causes pain or pressure in the cheeks or forehead or upper tooth pain, and usually fever.  Antibiotics used to be the standard of care for sinusitis, but current recommendations are to use nasal decongestants and pain medicine for 7 days and to prescribe antibiotics only if symptoms persist after that.
  • Acute bronchitis is an infection of the airways marked by a productive cough and usually low-grade temperatures.  It is usually caused by viruses and resolves without antibiotics.
  • Pneumonia is a lung infection, usually marked by high fever, shaking chills, a productive cough and sometimes shortness of breath.  It requires medical attention and usually is treated with antibiotics.

Now that you’re an expert at managing the common cold, your medical education is well on its way.  Next week you’ll be performing organ transplants.

Learn more:

WebMD Guide to the Common Cold

My review of acute bronchitis

My review of vitamin C in prevention and treatment of the common cold
