After suffering from a nasty cold for more than a few days, many of us think about checking in with our doctor just to be sure it’s not something worse. The result of those visits is, all too often, a simple confirmation of what you already knew: you have a cold. You’ll get better. But in many cases you get something else—a prescription for an antibiotic. Sometimes the fact that you probably have a viral infection rather than a bacterial infection is never discussed or acknowledged; other times the antibiotic is justified on the conjecture that the virus will “make you more susceptible to an infection.”
“Survival of the fittest” is the backbone of the well-established theory of evolution, and while this process generally produces good things—the survival and procreation of stronger, more adaptable beings—it can also produce some bad things. For example, farmers must deal with organisms that over time become resistant to pesticides and herbicides. This raises the costs of farming and propels the farmer into an ongoing battle to stay a step ahead of resistant organisms. Another example is, of course, antibiotic resistance, which occurs when bacteria adapt to antimicrobial treatments and develop resistance to them. As this process unfolds, microbes can morph into “superbugs” that don’t respond to conventional antibiotics.
To be fair to all of you upper respiratory virus sufferers who have taken the antibiotics prescribed by your physicians: you are not the main driver of antibiotic resistance globally, though you and your physician are certainly not helping. Antibiotics are prescribed in much larger volumes and with much greater frequency in emergency departments, hospitals, and other critical and acute care settings. Antibiotics are often prescribed when physicians are unsure of the diagnosis [see, for example, our work on the overuse of antibiotics in the treatment of conjunctivitis: Schneider et al. (2014), “Epidemiology and Economic Burden of Conjunctivitis: A Managed Care Perspective,” Journal of Managed Care Medicine 17(1)], or when formal or informal treatment protocols and standards of care call for the “empirical” administration of antibiotics, where physicians prescribe antibiotics “automatically” without awaiting the results of laboratory tests. This is more or less the case in most U.S. critical care settings.
There is widespread agreement that these practices—prescribing antibiotics in spite of high levels of uncertainty and prescribing empirically without the benefit of laboratory test results—lead to the overuse of antibiotics. The most visible and direct consequence of the overuse of antibiotics is the avoidable cost. Although the antibiotic market is flooded with inexpensive generics, high utilization rates result in non-trivial aggregate costs to payers, much of which is avoidable.
In addition to the direct costs attributable to the overuse of antibiotics, the indirect costs of antimicrobial resistance have been shown to be substantial, yet they remain understudied and poorly understood. Several years ago the U.S. Centers for Disease Control and Prevention (CDC) held a workshop on methods to assess the costs of antimicrobial resistance [see Howard, D., et al. (2001), “Measuring the economic costs of antimicrobial resistance in hospital settings: summary of the Centers for Disease Control and Prevention–Emory Workshop,” Clin Infect Dis 33(9): 1573–1578]. The group concluded that “efforts to provide this information have yielded widely variable and often conflicting estimates” and that the “lack of reproducibility is largely attributable to problems in study design and in the methods used to identify and measure costs.” Nevertheless, costs have been shown to be substantial: antibiotic resistance can add as much as $40,000 per hospitalization (Tansarli et al. 2013) and more than $40 per antibiotic prescription (Michaelidis et al. 2013).
Fortunately, there are options to reduce the prevalence of resistant microbes and reduce the economic and humanistic harm caused by them. Chief among them is for all health care providers to adopt some form of antibiotic stewardship.
According to Chung et al. (2013), antimicrobial stewardship is “defined by a series of strategies and interventions aimed toward improving appropriate prescription of antibiotics in humans in all healthcare settings.” The main goal is the “preservation of current and future antibiotics against the threat of antimicrobial resistance,” although important secondary goals include improving patient safety and reducing healthcare costs. Antibiotic stewardship should also include the use of rapid tests (e.g., a test for viral conjunctivitis) or short-duration laboratory tests for bacterial infections (e.g., procalcitonin) in critical care settings. These tests have been shown to reduce both costs and treatment failure in a variety of settings and health systems.
–John Schneider, PhD & Cara Scheibling