By Yashar Saghai and Joanna Mackenzie
According to a new CDC report, antibiotic resistance is a growing public health threat that requires immediate action. In the US alone, at least 2 million antibiotic-resistant infections and 23,000 deaths due to antibiotic resistance occur annually. The CDC cites the inappropriate and unnecessary use of antibiotics in humans and animals as a major driver of antibiotic resistance.
Almost 80% of all antibiotics sold in the US are administered to animals raised for food. Why? Do animals really get sick that often? Although some of these antibiotics are used to treat ill animals, most are added in low doses to animal feed to prevent disease and promote growth.
This practice, called “sub-therapeutic antibiotic use,” is common in concentrated animal feeding operations (the infamous CAFOs). Aside from the serious moral issues with CAFOs related to animal welfare, a topic to be addressed later, sub-therapeutic antibiotic use has ethical implications due to its potential impact on human welfare.
Clearly, if it were established that sub-therapeutic antibiotic use in livestock increases antibiotic resistance in strains that affect humans, governments of well-functioning states would be not only morally entitled to but also obligated to use their authority to protect their populations’ health against this threat. The problem is that causation is difficult to prove and governments need to make decisions in the face of uncertainty and imperfect knowledge.
When is it time to err on the side of caution? This question is important because the sub-therapeutic use of antibiotics was banned in Sweden as early as 1986, and then banned by the EU in 1998, whereas in the US it continues unrestricted and unmonitored even today.
How can governing bodies see the same evidence and choose to act so differently? Here, a bit of history might help set the record straight. When Sweden decided to ban sub-therapeutic antibiotic use, the link to antibiotic resistance was a possibility, not an established causal relationship.
When, in 1998, the EU banned the use in animal feed of all antibiotics also used in human medicine, it acted against the advice of its scientific advisory board, which had concluded that it was unable to assess the risk associated with antibiotic use in farm animals.
The EU Council of Ministers based its decision on the precautionary principle. The precautionary principle is a rule of thumb used by policy-makers when they face situations of uncertainty or risk. Roughly speaking, the precautionary principle puts the burden of proof on the supporters of an activity to show that what they propose to do is very unlikely to pose a significant threat to human health and safety or the environment.
It is a controversial principle, both because it might be used to slow the adoption of new and potentially useful technology, and because it might be invoked to justify pre-emptive war.
Critics of the precautionary principle argue that the EU decision was premature.
When did the evidence solidify? Professor Alan Goldberg, a Global Food Ethics team member and a former member of the Pew Commission on the Impact of Industrial Farm Animal Production, stated that the evidence linking sub-therapeutic antibiotic use in animals to antibiotic resistance in humans was not clear when the Commission started its research in 2006. However, by the end of the investigation in 2008, the evidence was strong enough to support a call for a US ban.
Since then, numerous studies have shown that sub-therapeutic use in animals has contributed to the rise of antibiotic-resistant bacteria, including strains that affect humans.
(Here’s, roughly, how it works. The low-dose antibiotics enter an animal’s gut, killing off all the normal, healthy bacteria and most of the disease-causing bacteria. However, some of the “bad” bacteria survive, multiply faster in the absence of competition, and spread their resistance genes to other bacteria. The resistant bacteria then spread to humans by way of contaminated animal products, and also through crops and water that have been fertilized with manure.)
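The selection dynamic described above can be illustrated with a toy simulation. This sketch is purely illustrative: the growth rates, kill rates, and population sizes are invented assumptions, not measured biological parameters. It only shows the qualitative point that a dose too weak to clear resistant cells lets them take over the population.

```python
# Toy model of selection under a constant low ("sub-therapeutic") antibiotic dose.
# All numbers below are illustrative assumptions, not measured parameters.

def simulate(days=30, dose_kill=0.4):
    susceptible, resistant = 1_000_000.0, 10.0  # resistant strains start rare
    growth = 1.5           # assumed daily growth factor for both strains
    capacity = 2_000_000   # assumed shared carrying capacity of the gut
    for _ in range(days):
        # Both strains grow, competing for the same limited niche.
        total = susceptible + resistant
        room = max(0.0, 1 - total / capacity)
        susceptible *= 1 + (growth - 1) * room
        resistant *= 1 + (growth - 1) * room
        # The low dose kills a large share of susceptible cells each day
        # but barely touches resistant ones.
        susceptible *= 1 - dose_kill
        resistant *= 1 - dose_kill * 0.05
    return resistant / (susceptible + resistant)

print(f"Resistant fraction after 30 days: {simulate():.1%}")
```

With the dose applied, the initially rare resistant strain dominates within a few simulated weeks; with `dose_kill=0` it stays a negligible fraction, since nothing is suppressing its competitors.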
To take just one example of many, a newly released study conducted by researchers at Johns Hopkins showed that people living in communities close to swine farms and to crops fertilized with swine manure had an increased risk of contracting MRSA (methicillin-resistant Staphylococcus aureus) and other skin and soft tissue infections. MRSA kills more Americans each year than HIV/AIDS, emphysema, Parkinson’s disease and homicide combined.
So, setting aside thoughts of the precautionary principle and its opposite, the “innocent until proven guilty” principle, it is clear that the actions of the US government can no longer be explained by a mere difference in philosophy.
In fact, the American government’s stance on sub-therapeutic antibiotics is largely influenced by the interests of a few powerful constituents, such as pharmaceutical companies and agribusinesses, who consistently shut down any attempts to regulate or ban sub-therapeutic antibiotics.  
The evidence for banning sub-therapeutic antibiotic usage is now clear. What’s not so clear is how the US government can morally justify allowing the practice to continue in the face of overwhelming evidence that it contributes to human disease.
Joanna Mackenzie is a Registered Dietitian and Research Assistant for the Global Food Ethics Project at the Johns Hopkins Berman Institute of Bioethics. She is also working towards obtaining an MSPH in Health Education and Communication from the Department of Health, Behavior and Society at Johns Hopkins Bloomberg School of Public Health. She received a B.S. in Nutrition Science from Russell Sage College in 2010. Prior to coming to Hopkins, Joanna was a Public Health Nutritionist for the New York State Child and Adult Care Food Program. Her previous work included using nutrition interventions to enhance the quality of life of HIV/AIDS populations in upstate New York.