Antibiotic usage regulations – too little, too late?

This post is cross-posted from a blog post I wrote for the Science and Policy Exchange. For more insightful writing on science and how it relates to government, the media, and society at large, visit their site.

In Dr. Seuss’s The Lorax, the narrator reflects sadly upon his past. He recounts arriving at a pristine forest of Truffula trees, which he cuts down to make Thneeds, a product that “everybody needs”. Ignoring warnings and later pleadings from the titular Lorax, he cuts down trees one by one, then four at a time, growing his business enormously in the process. He continues cutting until, all of a sudden, the last tree is gone. Without any more Truffula trees, his business collapses, he falls into poverty, and he tells his story so future generations won’t make the same mistake. He realizes how he was blinded by the promise that his business would only grow bigger, overlooking the fact that he was rapidly removing the key resource he and others depended upon. By the time he recognized his wrongs, it was already too late – the forest was gone, along with its animals – the Brown Bar-ba-loots, Swomee-Swans, Humming-Fish, and the Lorax himself.


Sure, The Lorax is a children’s story, but it illustrates some profoundly adult concerns. Take too much from a public good, and it will collapse, leaving everyone worse off in the long term. This story obviously parallels real-world issues like deforestation, pollution, and overfishing, but also applies to abstract public resources like transportation networks, taxation, and stability of financial markets. To protect common resources for the good of the public as a whole, regulations are needed to prevent individuals from exploiting these resources for short-term personal gain.

A somewhat obscure public good, but one that medicine depends upon, is the efficacy of antibiotics. Antibiotics are only effective when the microbes they are used to treat are susceptible to the drug. If resistance to antibiotics spreads, antibiotics lose their utility, and this public resource is lost. We need these drugs not just to treat infections, but also to facilitate surgery and cancer therapy, and to protect immunocompromised patients, so losing our effective antimicrobials would be an unmitigated disaster.

Unfortunately, to retain antimicrobial effectiveness, we must fight against formidable evolutionary forces. Add a strong selective pressure (an antibiotic), and evolution drives the selection of the fittest (resistant) microbes. These resistant microbes thrive where their non-resistant counterparts die, taking over the competition-free environment left behind. For this reason, we instruct patients to take the full course of antibiotics to completely eradicate an infection. Otherwise it could come back, stronger and tougher, and spread to other patients as well.

This example is usually framed around medical patients, but when we look into the emergence of antibiotic resistance, the more important subjects of antimicrobial use aren’t humans, but chickens, pigs, and cattle. Almost 80% of antibiotics sold in the US are used in agriculture. These antibiotics are used in three ways: to treat acute infections, as a feed additive for “prevention and control”, and as growth promoters. The last usage doesn’t make much sense at first – why would chemicals that kill microbes alter the growth of livestock? The short answer is that we don’t really know; the question sits at the forefront of the burgeoning field of microbiomics. What matters for farmers and feedlot managers, however, is that it works: their animals grow faster, bigger, or both, which means better profitability. It makes perfect economic sense to routinely add antibiotics to feed as growth promoters.

The problem is that this constant background of growth-promoting antibiotics in agriculture generates a selective pressure that drives the emergence and spread of antibiotic resistance. This was long suspected based upon our understanding of natural selection. Careful research has now shown that indeed, these growth-promoting antibiotics lead to increased emergence and spread of antimicrobial-resistant bacteria, and further, that these resistant bacteria cross over to human patients. Knowing this to be true, the widespread use of antimicrobials for growth promotion in agriculture is now viewed as a considerable public health hazard.

Recognizing this hazard, in 1977 the US Food and Drug Administration (FDA) committed to putting restrictions on antibiotic use in agriculture, a crucial step to limit the spread of resistance. The agency took the first step last December, 36 years later. Its Final Guidance for Industry 213 recommends that antimicrobial manufacturers stop labelling their antibiotics for use as growth-promoting agents. Some are celebrating this as an important first step. Many are rolling their eyes at what appears to be a mostly empty gesture.

The FDA recommendation took far too long to happen. It also lacks teeth. First, it is voluntary, and it doesn’t seem likely that many manufacturers will embrace the opportunity to cut their revenues in half. Farmers operate on such narrow margins that discontinuing antibiotic use could force them out of business, and so change won’t happen there either. The recommendation also has a gigantic loophole: by simply rebranding antibiotic use from “growth promotion” to “prevention and control”, farmers and feedlot managers can continue business, fully compliant with the recommendation, without changing their de facto usage at all.

The FDA recommendation is a first step, but a feeble one. A knot of competing interests seems to be preventing much from happening. The medical community and food purity movements want antibiotics removed from agriculture. While the FDA seems to concur, its hands are often tied – the agency is caught between protecting public health and strong economic incentives to keep the status quo in agriculture. The US Congress has meddled considerably in the matter, and no doubt lobbyists are pulling strings along the way. Comparatively low food costs have also contributed to the problem, as they push producers to squeeze every drop of efficiency they can from the system, with growth promoters forming an important part of that plan. Given these opposing forces, it becomes less surprising that it took the FDA 36 years to publish its recommendation, but at this pace, by the time any substantial change happens, it may be far too late.

The Lorax in this story is the scientific and medical community, who have been pleading with industry for decades to adopt more responsible agricultural antibiotic practices. Individuals within agriculture, like Dr. Seuss’s tragic narrator, are mostly concerned with the day-to-day operation of their business, not esoteric concerns about the future of medicine. But as a collective, the industry needs to realize that the loss of antibiotics will have long-term consequences for its business as well, and take steps towards more sustainable use. In the absence of this kind of industry action, firm regulations on responsible antibiotic use are essential to protect antibiotics as a public resource. Otherwise, our antibiotics will drop one by one from pharmacists’ shelves like Truffula trees, until none remain, and like the animals in the Lorax’s forest, other life-saving innovations of medicine could be lost as well.

Full disclosure: I study antimicrobial resistance in a biomedical research lab, so I do have a stake in this matter. I’m also sympathetic towards farmers – I grew up on a beef farm, where my family still raises Hereford cattle.

Edit 2015.09.02: I’ve updated my account since I first wrote this post. If you’d like to follow me online, find me @superhelical