Do You Know Your Bottled Water?

Nearly 54% of the American population drinks bottled water.1 As they pick up packages of plastic-wrapped bottles off the shelf, consumers may believe the water within is cleansed of toxic chemicals, free of bacteria, and enhanced with minerals. In truth, depending on its source, bottled water is just as likely as tap water to contain harmful chemicals, may hold between 20,000 and 200,000 bacterial cells, and may lack any beneficial minerals.2

As sales of bottled water increase, consumption of tap water decreases.3 Since 1999, worldwide per capita consumption of bottled water has grown by about 7% each year,4 and bottled water now outsells milk and beer.5 This growing habit is costly to both the wallet and the environment. Companies price bottled water 500 to 1,000 times higher than tap water on average,4 and producing one bottle of water requires 1,000 to 2,000 times more energy than producing the same amount of tap water, in addition to generating plastic waste.5 Are these increased costs justified by the belief that bottled water is safer or cleaner than tap water? In reality, that perception is a marketing myth.
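
To put those multipliers in perspective, here is a minimal back-of-the-envelope sketch in Python. The tap-water price and daily intake used below are illustrative assumptions, not figures from the cited studies; only the 500-1,000x price ratio comes from the text.

```python
# Back-of-the-envelope cost comparison (illustrative assumption: tap water ~$0.002/L).
TAP_PRICE_PER_LITER = 0.002          # assumed price, USD per liter
PRICE_MULTIPLIER = (500, 1000)       # bottled water priced 500-1,000x tap water (from the text)
LITERS_PER_DAY = 2                   # assumed daily drinking-water intake

annual_liters = LITERS_PER_DAY * 365
tap_annual_cost = annual_liters * TAP_PRICE_PER_LITER
bottled_annual_cost = tuple(tap_annual_cost * m for m in PRICE_MULTIPLIER)

print(f"Tap water:     ~${tap_annual_cost:.2f} per year")
print(f"Bottled water: ~${bottled_annual_cost[0]:.0f}-${bottled_annual_cost[1]:.0f} per year")
```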

Contrary to popular belief, bottled water is held to lower safety standards than tap water. Two different agencies regulate drinking water: the Food and Drug Administration (FDA) oversees bottled water, and the Environmental Protection Agency (EPA) regulates tap water. In setting health standards, the FDA follows the EPA's lead: the EPA first sets limits on dangerous chemicals, biological wastes, pesticides, and microbial organisms in tap water, and the FDA then adopts those limits for bottled water. As a result, regulations on bottled water are no stricter than those on tap water. In fact, they are often weaker. Table 1 lists health standards that differ between bottled and tap water; bottled water has stricter limits only on copper, fluoride, and lead.

The higher lead level of 15 parts per billion (ppb) in tap water allowed by the EPA—three times the limit for bottled water—may alarm some, but research indicates lead exposure at 15 ppb does not elevate blood lead levels in most adults. When the FDA followed suit to establish a lead limit in bottled water, it lowered the limit to 5 ppb because the majority of bottled water manufacturers could reasonably reach that level.6 Bottled water does not need to travel through pipes made of lead, so a stricter limit is sensible and could only be beneficial.

Unfortunately, FDA standards are poorly enforced. Water that is bottled and sold within the same state is not covered by FDA rules: an estimated 60-70% of bottled water sold in the United States qualifies as intrastate commerce and is therefore regulated only by the state.6 A survey revealed that most states spend few, if any, resources policing bottled water, so compliance with standards for more than half of the bottled water on grocery shelves is effectively discretionary.6 With weaker health standards and lax enforcement, bottled water is not obligated to be safer than tap water. How do bottled and tap water compare in reality? Examination reveals differences in their mineral and microbial content that impact personal health.

Mineral Comparison

The mineral content of both bottled and tap water depends largely on source and treatment. Bottled water can be designated as spring, mineral, or purified. Spring water, including brands such as Ozarka® and Arrowhead®, originates from surface springs where water flows naturally from underground supplies. Mineral water is simply spring water with at least 250 parts per million (ppm) of dissolved minerals such as salts and metals.3 Purified water brands such as Aquafina® and Dasani® take water from either underground or tap water sources and filter it to remove all minerals.

Tap water is more simply categorized as sourced from either surface water or groundwater. Surface water refers to lakes, rivers, or oceans, while groundwater describes any reservoir located beneath the earth's surface. For example, most of the tap water in Houston, Texas originates from a single surface water source.3 The source of any drinking water affects which minerals ultimately remain in it.

Three minerals important for a healthy body are calcium, magnesium, and sodium. Adequate calcium intake is important to maintain and restore bone strength in the young and to prevent osteoporosis in the old. Insufficient magnesium consumption has been associated with heart disease, including arrhythmias and sudden death.3 On the other hand, excessive sodium intake is strongly associated with high blood pressure and death from heart disease.7 Drinking water that is high in calcium and magnesium and low in sodium can therefore help meet the recommended intake of all three minerals. In fact, magnesium in water is absorbed approximately 30% faster than magnesium in food.3

A comparative study in 2004 examined these three minerals in bottled and tap water across major U.S. regions. It concluded that drinking two liters per day of groundwater-sourced tap water in certain regions, or of certain brands of bottled mineral water, can significantly supplement a person's daily intake of calcium and magnesium (Table 2).3 To obtain mineral data, the study contacted tap water authorities in 21 cities spanning the U.S. and gathered published data for 37 North American brands of commercial bottled water. While tap water sources showed wide variation in calcium, magnesium, and sodium content, mineral levels of bottled water were more consistent within each category. In general, tap water from groundwater sources had higher levels of calcium and magnesium than tap water from surface water sources. High levels of calcium correlated with high levels of magnesium, while sodium levels varied more independently. Of the 12 states surveyed, water mineral levels were highest in Arizona, California, Indiana, and Texas. In half of the sources from those states, two liters of tap water supply adults with 8-16% of the calcium and 6-31% of the magnesium daily recommended intake (DRI). Additionally, more than 90% of all tap water sources contained less than 5% of the sodium DRI in two liters.
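
The arithmetic behind these DRI percentages is straightforward: multiply a water's mineral concentration by the volume consumed and divide by the daily recommended intake. The sketch below uses approximate adult DRI values (about 1,000 mg calcium, 320 mg magnesium, and a 2,300 mg sodium maximum); the example concentrations are placeholders for illustration, not figures from the study.

```python
# Fraction of the daily recommended intake (DRI) supplied by a given volume of water.
DRI_MG = {"calcium": 1000, "magnesium": 320, "sodium": 2300}  # approximate adult values

def percent_dri(concentration_mg_per_l: float, mineral: str, liters: float = 2.0) -> float:
    """Return the percentage of the DRI supplied by `liters` of water."""
    return 100 * concentration_mg_per_l * liters / DRI_MG[mineral]

# Hypothetical groundwater-sourced tap water: 50 mg/L calcium, 20 mg/L magnesium.
print(f"Calcium:   {percent_dri(50, 'calcium'):.0f}% of DRI from 2 L")
print(f"Magnesium: {percent_dri(20, 'magnesium'):.1f}% of DRI from 2 L")
```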

Among the bottled waters, spring water consistently contained low levels of all three minerals, while mineral waters contained relatively high levels of all three (Table 2). Ozarka® spring water, produced in Texas, provides less than 2% of the three minerals' DRIs. In contrast, one liter of Mendocino® mineral water supplies 30% of the calcium and magnesium DRIs for women, and one liter of Vichy Springs® mineral water provides more than 33% of the recommended maximum sodium intake. Based on these percentages, drinking bottled mineral water, as well as tap water from groundwater sources in certain cities, can supplement food intake to fulfill calcium and magnesium DRIs.

Microbial Comparison

Despite labels depicting pristine lakes and mountains, bottled drinking water nearly always contains living microorganisms. In general, processing drinking water serves to make the water safe, not necessarily sterile. The FDA and EPA only regulate bottled and tap water for coliform bacteria, which are themselves harmless but indicate the presence of other disease-causing organisms.1 E. coli is an example of a coliform bacterium that resides in human and animal intestines and is widely present in drinking water.8 The total amount of microorganisms in water is often measured by incubating samples and counting colony forming units (CFU), the bacteria that develop into visible colonies. Water with under 100 CFU/mL is considered microbially safe, while counts of 100-500 CFU/mL are questionable.8
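
The CFU thresholds above translate into a simple classification rule. The following sketch encodes the under-100 and 100-500 CFU/mL cutoffs cited in the text; the category labels are convenient names for illustration, not regulatory terms.

```python
# Classify a plate count (CFU/mL) using the thresholds cited in the text.
def classify_plate_count(cfu_per_ml: float) -> str:
    if cfu_per_ml < 100:
        return "acceptable"      # under 100 CFU/mL indicates microbial safety
    elif cfu_per_ml <= 500:
        return "questionable"    # 100-500 CFU/mL is considered questionable
    else:
        return "poor"            # above the questionable range

for sample in (0.2, 2.7, 150, 4900):
    print(f"{sample:>7} CFU/mL -> {classify_plate_count(sample)}")
```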

In 2000, a research group compared the microbial content of bottled and tap water by obtaining samples from the four tap water processing plants in Cleveland, Ohio and 57 samples of bottled water from a number of stores.9 The bottled water samples included products classified as spring, artesian, purified, and distilled. Bacteria levels in the bottled waters ranged from 0.01 to 4,900 CFU/mL, while the tap water samples varied from 0.2 to 2.7 CFU/mL.9 More specifically, 15 bottled water samples contained at least 10 times as many bacteria as the tap water average, three contained about the same amount, and 39 contained fewer. As shown in Figure 1, one-fourth of the bottled water samples, mainly spring and artesian waters, had more bacteria than tap water, demonstrating that bottled water is not reliably cleaner than tap water. The bacteria in both bottled and tap water can cause gastrointestinal discomfort or illness.10
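
The study's three-way grouping of bottled samples relative to the tap-water average (at least ten times higher, about the same, or lower) can be expressed in a few lines. The sample values below are made up for illustration; only the grouping logic follows the description above.

```python
# Group bottled-water samples relative to the mean tap-water plate count.
tap_counts = [0.2, 1.1, 2.0, 2.7]                  # hypothetical CFU/mL values for 4 tap plants
bottled_counts = [0.01, 0.5, 3.0, 45.0, 4900.0]    # hypothetical bottled-water samples

tap_mean = sum(tap_counts) / len(tap_counts)
groups = {"at least 10x tap average": 0, "about the same": 0, "lower": 0}
for cfu in bottled_counts:
    if cfu >= 10 * tap_mean:
        groups["at least 10x tap average"] += 1
    elif cfu >= tap_mean:
        groups["about the same"] += 1
    else:
        groups["lower"] += 1

print(f"Tap average: {tap_mean:.2f} CFU/mL")
for label, n in groups.items():
    print(f"{label}: {n} samples")
```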

What is the Healthiest and Cleanest Water to Drink?

Since both bottled and tap water contain varying levels of microbes, clean water is most reliably obtained by disinfecting tap water with commercially available water purifiers. Most purifiers also act as filters that remove chlorine, its byproducts, and other harmful chemicals that accumulate in tap water. However, chemical-removing filters also strip out any calcium and magnesium present, so purified tap water loses its mineral benefits. The loss of aqueous mineral intake can be offset by eating foods rich in calcium and magnesium; maintaining mineral DRIs supports better health and energy. Bottled water is not guaranteed to be cleaner than tap water, so drinking properly filtered tap water may be the most economical and healthful way to stay hydrated.

References

  1. Rosenberg, F. A. Clin. Microbiol. Newsl. 2003, 25, 41–44.
  2. Hammes, F. Drinking water microbiology: from understanding to applications; Eawag News: Duebendorf, Switzerland, 2011.
  3. Azoulay, A. et al. J. Gen. Intern. Med. 2001, 16, 168–175.
  4. Ferrier, C. AMBIO 2001, 30, 118–119.
  5. Gleick, P. H.; Cooley, H. S. Environ. Res. Lett. 2009, 4, 014009.
  6. Olson, E. D. et al. Bottled Water: Pure Drink or Pure Hype?; NRDC: New York, 1999.
  7. Chobanian, A. V.; Hill, M. Hypertension 2000, 35, 858–863.
  8. Edberg, S. C. et al. J. Appl. Microbiol. 2000, 88, 106S–116S.
  9. Lalumandier, J. A.; Ayers, L. W. Arch. Fam. Med. 2000, 9, 246–250.
  10. Payment, P. et al. Water Sci. Technol. 1993, 27, 137–143.

Cancer Ancestry: Identifying the Genomes of Individual Cancers

Cancer-causing mutations in the human genome have been a subject of intense research over the past decade. With increasing numbers of mutations identified and linked to individual cancers, the possibility of treating patients with customized treatment plans based on their individual cancer genomes is quickly becoming a reality.

Cancer arises when individual cells acquire mutations in their DNA. These mutations allow cancerous cells to proliferate uncontrollably, aggressively invade surrounding tissues, and metastasize to distant locations. Based on this progression, a potentially tremendous implication emerges: if every type of cancer arises from an ancestor cell that acquires a single mutation, then scientists should be able to trace every type of cancer back to its original mutation using modern genomic sequencing technologies. Large volumes of human genomic data have been analyzed in search of these ancestor mutations using a variety of techniques, the most common of which is a large Polymerase Chain Reaction (PCR) screen. In this type of study, DNA from up to one hundred cancer patients is sequenced; the sequences are then analyzed for repeating codons, the DNA units that determine single amino acids in a protein. Analyzing the enormous volume of data from these screens requires the efforts of several institutions. The first 90 cancer-causing mutations were identified at the Johns Hopkins Medical Institute, where scientists screened the genomes of 11 breast cancer and 11 colorectal cancer patients.3 After this study was published in 2006, researchers found these 90 mutations across every known type of cancer. These findings stimulated even more ambitious projects: if the original cancer-causing mutations are identified, scientists may be able to reverse the cancer process by removing faulty DNA sequences using precisely targeted DNA truncation proteins.
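
In spirit, these screens look for variants that recur across many patients' tumors. The toy sketch below is a heavily simplified illustration of that idea rather than the actual Johns Hopkins pipeline: it compares short, hypothetical patient sequences against a reference and tallies which positions are mutated in more than one patient.

```python
# Simplified illustration of a recurrence screen: tally positions mutated in many patients.
from collections import Counter

reference = "ATGGCCATTGTAATG"
patient_sequences = [           # hypothetical tumor-derived reads aligned to the reference
    "ATGGCCATTGTAATG",
    "ATGGACATTGTAATG",          # mutation at position 4
    "ATGGACATTGTCATG",          # mutations at positions 4 and 11
    "ATGGACATTGTAATG",          # mutation at position 4
]

mutated_positions = Counter()
for seq in patient_sequences:
    for pos, (ref_base, obs_base) in enumerate(zip(reference, seq)):
        if ref_base != obs_base:
            mutated_positions[pos] += 1

# Positions mutated in more than one patient are candidate recurrent ("driver-like") sites.
recurrent = {pos: n for pos, n in mutated_positions.items() if n > 1}
print(recurrent)   # e.g. {4: 3} for this toy data
```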

However, such a feat is obviously more easily said than done. One of the many obstacles in identifying cancer genomes is the fact that approximately 10 to 15% of cancers derive large portions of their DNA from viruses such as HIV and Hepatitis B. The addition of foreign DNA complicates the search for the original mutation, since viral DNA and RNA are propagated in human cells. This phenomenon masks human mutations that may have existed before the virus entered the host cell. In addition, because tumors are inherently unstable, cancers may lose up to 25% of their genetic code due to errors in cell division, making the task of tracing them even more difficult. Finally, the mutations in every individual cancer have accumulated over the patient’s lifetime; differentiating between mutations of the original cancerous cell line and those caused by aging and environmental factors is an arduous task.

In order to overcome these challenges, scientists use several approaches. First, they increase the sample size; this strategy ensures that the mutations are not specific to an individual or geographic area but are common to all patients with that type of cancer. Second, accumulated data on viral genomes allow scientists to screen for and mark regions of viral origin in patients' DNA. Several advances have already been made despite the difficulties: for instance, in endometrial cancer, a cancer originating in the uterine lining, mutations in the Nucleotide Excision Repair (NER) and MisMatch Repair (MMR) genes have been found to occur in 13% of all affected patients.4 NER and MMR are involved in DNA repair mechanisms and act as the body's "guardians" of the DNA replication process. In a healthy individual, both NER and MMR ensure that each new cell receives a complete set of functional chromosomes following cell division. In a cancerous cell, these two genes acquire mutations that permit replication of damaged and mismatched DNA sequences. Similarly, mutations in the normally tumor-suppressing Breast Cancer Susceptibility Genes 1 and 2 (BRCA1 and BRCA2) have been identified as major culprits in breast cancer. In prostate cancer, fusions between the Transmembrane Protease, Serine 2 (TMPRSS2) gene and E-26 Transformation Specific (ETS) family transcription factor genes have been found to initiate the disease process.5

One of the latest frontiers in cancer treatment is the identification and study of individual, disease-causing mutations. Thousands of tumor genomes have been sequenced to discover recurring mutations in each cancer, and tremendous advances have been made in this emergent field of cancer genomics. Further study will ultimately aim to tailor cancer treatment to the patient's specific set of mutations in the emerging field of personalized medicine. This strategy is already being used in the treatment of leukemia at the Cincinnati Children's Hospital, where a clinical study has been underway since August 2013.2 This trial uses a combined treatment program that includes standard drug therapy while targeting a specific mutation in the mTOR gene, a central regulator of cell growth and proliferation. Thus, less than a decade after researchers first began to identify unique cancer-causing mutations, treatment programs tailored to patient genomes are becoming a reality.

References

  1. Lengauer, C. et al. Nature 1998, 396, 643-649.
  2. Miller, N. http://www.cincinnatichildrens.org (accessed Nov 9, 2013).
  3. Sjöblom, T. et al. Science 2006, 314, 268-274.
  4. Stratton, M. et al. Nature 2009, 458, 719-724.
  5. Tomlins, S. et al. Science 2005, 310, 644-648. 

Stem Cells and Hearts

In the U.S., someone dies of heart disease every 33 seconds. Currently, heart disease is the number one cause of death in the U.S.; by 2020, this disease is predicted to be the leading cause of death worldwide.1 Many of these deaths can be prevented with heart transplants. However, only about 2,000 of the 3,000 patients on the wait-list for heart donations actually receive heart transplants every year. Furthermore, patients who do receive donor hearts often have to wait for months.2

The shortage of organ donors and a rise in demand for organ transplants have instigated research on artificial organ engineering.5 In Tokyo, Japan, cell biologist Takanori Takebe has successfully synthesized and transplanted a “liver bud,” a tiny functioning portion of a liver, into a mouse; his experiment was able to partially restore liver function.3 Dr. Alex Seifalian from University College London, who has previously conducted artificial nose transplants, is now working on engineering artificial cardiovascular components.5

Breakthroughs in artificial organ engineering are also being made more locally. At St. Luke’s Episcopal Hospital in the Texas Medical Center, Dr. Doris Taylor is working on cultivating fully functional human hearts from proteins and stem cells. She is renowned for her discovery of “whole-organ decellularization,” a process where organs are stripped of all living cells to leave a protein framework. Taylor has successfully used this method as the first step in breathing life into artificially grown hearts of rats and pigs, and she is attempting to achieve the same results with a human heart.2

In this method, Taylor uses a pig heart as a scaffold, or protein template, for the growth of the human heart, since the two organs are similar in size and physiological structure. To create such a scaffold, Taylor strips pig hearts of all their cells by immersing them for at least two days in a detergent commonly found in baby shampoo; this whole-organ decellularization leaves behind only the extracellular matrix, a framework free of foreign cells. The decellularized pig heart emerges from the detergent bath completely white, since the red color of organs is normally derived from the now-absent hemoglobin and myoglobin (two oxygen-carrying proteins) in the cells. Only the structural proteins of the organ, devoid of both color and life, remain.2

To bring this ghost heart to life, Taylor enlists the aid of stem cells from human bone marrow, blood, and fat. The immature stem cells have the potential to differentiate into any cell in the body and stimulate the growth of the artificial organ. After the stem cells are added,2 the artificial heart is placed in a bioreactor that mimics the exact conditions necessary for growth, including a separate blood and oxygen supply as well as a beating sensation.6 Amazingly, a heartbeat is observed after just a few days, and the artificial organ can successfully pump blood after just a few weeks.2

Of course, this method is not limited to the development of a single type of organ. Not only will Taylor's research benefit patients suffering from heart failure, it will also increase the availability of other artificial organs such as livers, pancreases, kidneys, and lungs. Taylor has already shown that decellularization and stem cell scaffolding is a practical possibility with other organs; additionally, she has completed successful lab trials with organ implants in rats.4 While the full growth of a human heart is still being refined and other organ experiments have only recently been completed, Taylor predicts that her team will be able to approach the Food and Drug Administration (FDA) with proposals for clinical trials within the next two years. Trials that integrate entire organs into human patients may be further in the future, but Taylor proposes starting with cardiac patches and valves, smaller functioning artificial portions of a heart, to demonstrate the safety and superiority of the decellularization and stem cell scaffolding process. Hopefully, after the procedure is refined and its success proven, whole-organ decellularization will be used to grow organs unique to every individual who needs them.2

While this process is useful for all transplant patients, it is especially important for people with heart disease. The muscle cells of the heart, cardiomyocytes, have no regenerative capabilities.4 Not only is heart tissue incapable of regeneration, but the transplant window for hearts is also extremely short: donor hearts typically last only four hours before they are rendered useless to the patient, which means that a heart of matching blood type and proteins must be transported to the hospital within that time period. Due to high demand and time limitations, finding compatible hearts within a reasonable distance is difficult. Though mechanical hearts are emerging as possible replacements for donor hearts, they are not perfect; use of a natural heart would be vastly superior.2 Mechanical hearts carry the risk of malfunction; natural hearts, which are designed for a human body, will better "fit" the individual and can be tailored to avoid patient rejection. With the advent of biologically grown hearts, more hospitals will have access to replacement organs, increasing patients' options for transplant. Another critical advantage of artificially grown hearts lies in the fact that the patient may not need anti-rejection medication. The patient's own stem cells could be used to grow the heart. The artificial tissue would then grow to have the same protein markers as the rest of the cells in the body, minimizing the chances that the organ would be rejected.2 Still, the use of stem cells could be problematic, as human stem cells decrease in number and deteriorate in function over time. In this respect, stem cells from younger patients are usually preferable, so eliminating anti-rejection medication entirely is not feasible in the near future.

The development of artificial organs provides a solution to issues of organ rejection, availability, compatibility, and mechanical failure. Dr. Taylor’s stem cell research also presents the possibility of improving current technologies that help patients with partially functioning hearts. Her work has the potential to grow skin grafts for burn centers and aid in dialysis treatment for liver failure in the near future.2

While other organs are not as fragile as the heart, decellularization and protein scaffolding can potentially benefit the body holistically. Like the heart, other organs such as the kidney can heal themselves only of small injuries; major damage still requires transplant and emergency care. Taylor's research, though still very much in development, could change the future of transplant medicine across all organs.

References

  1. The Heart Foundation. http://www.theheartfoundation.org/heart-disease-facts/heart-disease-statistics/ (accessed Oct 14, 2013).
  2. Galehouse, M. Saving Lives With Help From Pigs and Cells. Houston Chronicle, Houston, Jan 23, 2013.
  3. Jacobson, R. Liver Buds Show Promise, but Growing New Organs is Still a Long Way Off. http://www.pbs.org/newshour/rundown/2013/07/liver-buds-show-promise-but-growing-new-organs-is-still-a-long-way-off.html (accessed Oct 14, 2013).
  4. Moore, C. Texas Heart Institute’s Dr. Doris Taylor in the Forefront of Heart Tissue Regeneration Research. http://bionews-tx.com/news/2013/07/04/texas-heart-institutes-dr-doris-taylor-in-the-forefront-of-heart-tissue-regeneration-research/ (accessed Oct 14, 2013).
  5. Naik, G. Science Fiction Comes Alive as Researchers Grow Organs in Lab. http://online.wsj.com/news/articles/SB10001424127887323699704578328251335196648 (accessed Oct 14, 2013).
  6. Suchetka, D. 'Ghost Heart,' a Framework for Growing New Human Hearts, Could Be Answer for Thousands Waiting for New Heart. http://www.cleveland.com/healthfit/index.ssf/2012/08/ghost_heart_a_framework_for_gr.html (accessed Oct 14, 2013). 

Farming the Unknown: The Role of the Livestock Industry in Preserving Human Health

The livestock industry is a vast network of expectations. A farmer expects meat, dairy, and eggs from his animals, and a consumer expects to obtain these products from grocery stores. Industry expects profitable revenue from the sales of these products. Given the intensiveness of modern agriculture, this chain of action has been massively amplified. Meat production has doubled since the 1950s, and currently almost 10 billion animals, not including additional goods such as dairy and eggs, are consumed every year in the United States alone.1 Due to the magnitude of this industry, even small changes can bring about large-scale effects. Animal infections exemplify this chain of events.

Though animal infections might initially seem a lesser concern, their effects on human health are rapidly becoming more pronounced and pervasive. During the past few years, an increasing number of food-borne disease outbreaks have been traced to products such as beef, pork, poultry, and milk.2 These outbreaks are especially concerning because the pathogens involved are new strains that were previously harmless to humans; they have become infectious to humans through mutations acquired in animal hosts. Such diseases that jump from animals to humans are termed zoonotic. Within the food industry, zoonotic illnesses can be transmitted by consumption or through contact with animals. Crucially, zoonotic cases are much harder to treat because there is no precedent for their treatment.

How often does this transmission occur? Since 1980, 87 new human pathogens have been identified, out of which a staggering 80% are zoonotic.3 Furthermore, many of these have been found in domestic animals, which serve as reservoirs for a variety of infectious agents. The large number of zoonoses raises several key questions. Are these outbreaks the product of our management of livestock or simply a natural phenomenon? How far could zoonotic illnesses escalate in terms of human cases and mortality? What practices or perspectives should we modify to prevent further damage?

Prominent virologist and Nobel laureate in medicine Sir Frank MacFarlane Burnet provided a timeless perspective on this issue in the mid-20th century. He conceptualized infectious disease as no less fundamental than other interactions between organisms, such as predation, decomposition, and competition.4 Taking into account how we have harnessed nature, particularly with the aim of producing more food, we can see how farming animals has also inadvertently farmed pathogens.

Treating animals as living environments that can promote pathogenic evolution and diffusion is crucial to creating proper regulations in the livestock industry that protect the safety of consumers in the long run. Current practices risk the emergence of zoonotic diseases by facilitating transmission under heavily industrialized environments and by fostering antibiotic resistance in bacteria. Cooperative action between government, producers, and educated consumers is necessary to improve current practices and preserve good health for everyone.

Influenza: Old Threats, New Fears

The flu is not exactly a stranger to human health, but the influenza virus affects not only humans but also other species such as pigs and birds. In fact, what is known as "the flu" is not a single virus but rather a whole family of viruses. The largest family of influenza viruses, influenza A, comprises strains classified with a shorthand notation for their two main surface glycoproteins: H for hemagglutinin and N for neuraminidase (Figure 1). These surface glycoproteins are important because their structure and shape determine whether the virus can attach to the cellular receptors of a host and infect it. For example, the influenza H7N7 virus has a structure that allows it to infect horses but not humans. Trouble arises when these surface glycoproteins undergo structural changes and the virus gains the capacity to infect humans, as was the case during the 2003 avian flu and the 2009 swine flu pandemics, when the influenza virus jumped from poultry and swine to humans.
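
Because the H/N shorthand is a structured label, it can be split programmatically. The small Python sketch below parses subtype strings such as "H5N1" into their hemagglutinin and neuraminidase numbers; it is purely an illustration of the notation.

```python
# Parse influenza A subtype notation (e.g. "H5N1") into its glycoprotein numbers.
import re

def parse_subtype(label: str) -> tuple[int, int]:
    """Return (hemagglutinin, neuraminidase) numbers from an 'HxNy' label."""
    match = re.fullmatch(r"H(\d+)N(\d+)", label.strip().upper())
    if match is None:
        raise ValueError(f"Not an influenza A subtype label: {label!r}")
    return int(match.group(1)), int(match.group(2))

for label in ("H5N1", "H1N1", "H7N7"):
    h, n = parse_subtype(label)
    print(f"{label}: hemagglutinin H{h}, neuraminidase N{n}")
```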

Since 2003 when it was first documented in humans, avian influenza H5N1 has been responsible for over 600 human infections and associated with a 60% mortality rate due to severe respiratory failure.5 The majority of these cases occurred in Asia and Africa, particularly in countries such as Indonesia, Vietnam, and Egypt, which accounted for over 75% of all cases.5-6 Though no H5N1 cases have been reported in the U.S., there have been 17 low-pathogenicity outbreaks of avian flu in American poultry since 1997, and one highly pathogenic outbreak of H5N2 in 2004 with 7,000 chickens infected in Texas.5

Poultry is not the only area of livestock industry where flu viruses are a human health concern. The 2009 outbreak of influenza H1N1—popularly termed “swine flu” from its origin in pigs—was officially declared a pandemic by the WHO and the CDC. With an estimated 61 million cases and over 12,000 deaths attributed to the swine flu since 2009, H1N1 is an example of a zoonotic disease that became pandemic due to an interspecies jump that turned it from a regular pig virus to a multi-species contagion.7

The theory of how influenza viruses mutate to infect humans includes the role of birds and pigs as "mixing vessels" in which mutant viruses arise.8 In pigs, the genetic material from pig, bird, and human viruses (in any combination) reassorts within cells to produce a virus that can be transmitted among several species. This process also occurs in birds through the mixing of human viruses with domestic and wild avian viral strains. If this theory is accurate, one can infer that a high density of pigs in an enclosed area could easily become a springboard for the emergence of new, infectious influenza strains. Thus, the "new" farms of America, where pigs and poultry are stocked to minimize space and maximize production, provide just the right environment for one infected pig to transfer the disease to the rest. Human handlers then face the risk of exposure to a new disease that can be as fatal as it is infectious, as the 2009 swine flu pandemic and the 2003 avian flu cases demonstrated. As consumers, we should treat adequate care of our food sources as a priority not only for avoiding disease but also for national and global health.
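
The "mixing vessel" idea can be made concrete with a toy model: influenza A carries eight RNA gene segments, and when two strains co-infect a cell, a progeny virus can inherit each segment from either parent. The sketch below randomly reassorts segments from a hypothetical swine strain and a hypothetical human strain; it is a conceptual illustration, not an epidemiological model.

```python
# Toy reassortment model: each of the 8 influenza A gene segments is drawn from either parent.
import random

SEGMENTS = ["PB2", "PB1", "PA", "HA", "NP", "NA", "M", "NS"]  # the 8 genome segments

def reassort(parent_a: str, parent_b: str, seed=None) -> dict[str, str]:
    """Return a progeny genome with each segment taken from one of the two parents."""
    rng = random.Random(seed)
    return {segment: rng.choice([parent_a, parent_b]) for segment in SEGMENTS}

progeny = reassort("swine strain", "human strain", seed=1)
for segment, origin in progeny.items():
    print(f"{segment}: {origin}")
```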

Feeding our Food: Antibiotic Resistance in the Food Industry

Interspecies transmission is not the only way through which new diseases can become pathogenic to humans. In the case of bacteria, new pathogenic strains can arise in animals through another mechanism: antibiotic resistance. Antibiotic resistance results from a fundamental principle of evolutionary biology: individuals with advantageous traits that allow survival and reproduction pass these traits to their offspring. Even within the same population, antibiotic resistance varies among individual bacteria; some have a natural resistance to certain antibiotics while others simply die off when exposed. Thus, antibiotic use effectively selects for bacteria with such resistance or, in some cases, total immunity. In this way, the livestock industry provides a selective environment.
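
A minimal simulation makes the selection argument concrete. In the sketch below, a bacterial population begins with a tiny resistant fraction; each round of antibiotic exposure kills most susceptible cells, and the survivors of both groups regrow. The kill rate and growth factor are arbitrary assumptions chosen only to show the trend, not measured values.

```python
# Minimal selection model: antibiotic exposure enriches the resistant fraction each generation.
susceptible, resistant = 9_999_000, 1_000     # starting population, roughly 0.01% resistant
KILL_RATE = 0.99                              # assumed fraction of susceptible cells killed per dose
GROWTH = 3                                    # assumed regrowth factor per generation for survivors

for generation in range(1, 6):
    susceptible = int(susceptible * (1 - KILL_RATE)) * GROWTH
    resistant = resistant * GROWTH
    frac = resistant / (susceptible + resistant)
    print(f"Generation {generation}: resistant fraction = {frac:.1%}")
```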

The rise of these resistant strains, commonly termed "superbugs" for their extensive resistance to a variety of common antibiotics, has been a serious threat in hospitals; there, antibiotic use is widespread, and drug resistance causes almost 100,000 deaths each year from pathogens such as methicillin-resistant Staphylococcus aureus, Candida albicans, Acinetobacter baumannii, and dozens of other species.9 Our attention should not be focused exclusively on hospitals as sources of superbug infections, however. The widespread use of antibiotics in the livestock industry to avoid common bacterial diseases in food animals also poses the risk of emerging superbug strains, and it has not been without its share of outbreaks and casualties.

The Center for Science in the Public Interest, a non-profit organization that advocates for increased food safety in the U.S., has reported that antibiotic-resistant pathogens have been the cause of 55 major outbreaks since 1973, and that the majority of cases have come from dairy products, beef, and poultry. Furthermore, the same study reported that most of these pathogens exhibit resistance to over 7 different antibiotics.10 One of the main culprits identified in these outbreaks is the bacterium Salmonella typhimurium, which along with other Salmonella species accounts for over half of these cases. Salmonella is especially dangerous because it is so pervasive; it can lie dormant in a variety of livestock products such as uncooked eggs, milk, cheese, poultry, and beef until it reaches a live host, where it incubates and causes infection. Escherichia coli O157:H7 (commonly known as E. coli), a bacterium that usually resides in the intestines of mammals, has also been implicated in a number of outbreaks related primarily to beef products. Overall, antibiotic-resistant pathogens have been the cause of over 20,500 illnesses, with over 31,000 hospitalizations and 27 deaths.10

These cases demonstrate how the widespread use of antibiotics in the food industry perpetuates the risk of infection and damage to human health from antibiotic-resistant bacteria. Currently, the Food and Drug Administration (FDA) in the U.S. still approves the use of antibiotics as a treatment for sick animals; furthermore, the agency allows antibiotic use in healthy animals for prevention and even as a growth enhancer.11 In fact, over 74% of all antibiotics produced in the United States are used in livestock animals for these reasons.9,11 Using antibiotics in non-infected animals in this way generates greater environmental pressure for superbugs to emerge; this type of use in particular should be restricted. Managing the proper use of antibiotics to reduce the risk of emerging superbug strains should be prioritized in the food industry just as it is in health care.

Hungry for a Solution

Still open to debate is the question of how many resources should be allocated to the problem of widespread antibiotic use. Currently, diseases are transmitted from animals to humans faster than they are evolving within humans. Not only that, many of these zoonotic diseases have high potential to become a pandemic due to their high infectivity, as in the case of H5N1 avian influenza. Measures to prevent the transmission of viruses among livestock animals and to reduce the rate of emergent antibiotic-resistant strains need to take into account the environmental and evolutionary nature of a zoonosis.

More thorough surveillance of livestock animals and monitoring for signs of newly emerging strains are important in preventing the spread of such deadly pathogens. This strategy requires intensive molecular analysis, a larger number of professionals working in the field, and a nationwide initiative. Keeping an accurate record of where new strains arise and of the number of animal and human cases would significantly improve epidemiological surveillance of infectious disease. This process requires cooperation at multiple levels to ensure that the logistics and public support for these initiatives are ongoing and effective. Additionally, educating people about the nature of zoonotic pathogens is crucial to fostering the dialogue and action necessary to secure the good health of animals, producers, and consumers.

References

  1. Johns Hopkins Center for a Livable Future: Industrial Food Animal Production in America. Fall 2013. http://www.jhsph.edu/research/centers-and-institutes/johns-hopkins-center-for-a-livable-future/_pdf/research/clf_reports/CLF-PEW-for%20Web.pdf (accessed Oct 24, 2013).
  2. Cleaveland, S. et al. Phil. Trans. R. Soc. B. 2001, 356, 991.
  3. Watanabe, M. E. BioScience 2008, 58, 680.
  4. Burnet, F. M. Biological Aspects of Infectious Disease. Macmillan: New York, 1940.
  5. Centers for Disease Control and Prevention: Avian Flu and Humans. http://www.cdc.gov/flu/avianflu/h5n1-people.html. (accessed Oct 12, 2013)
  6. Cumulative number of confirmed human cases of avian influenza A(H5N1) reported to WHO. http://www.who.int/influenza/human_animal_interface/H5N1_cumulative_table_archives/en/ (accessed March 14, 2013)
  7. Chan, M. World Now at the Start of the 2009 Influenza Pandemic. http://www.who.int/mediacentre/news/statements/2009/h1n1_pandemic_phase6_20090611/en/ (accessed March 14, 2013).
  8. Ma, W. et al. J. Mol. Genet. Med. [Online] 2009, 3, 158-164.
  9. Mathew, A. G. et al. Foodborne Pathog. Dis. 2007, 4, 115-133.
  10. DeWaal, C. S.; Grooters, S. V. Antibiotic Resistance in Foodborne Pathogens. http://cspinet.org/new/pdf/outbreaks_antibiotic_resistance_in_foodborne_pathogens_2013.pdf (accessed March 14, 2014).
  11. Shames, L. Agencies Have Made Limited Progress Addressing Antibiotic Use in Animals http://louise.house.gov/images/user_images/gt/stories/GAO_Report_on_Antibioic_Resistance.pdf. (accessed Jan 20, 2014).
