Eating Wheat: Avoiding the Bad and Getting the Good

A bagel or bowl of cereal is common for breakfast, followed by a sandwich or burger for lunch. Dinner often stars pasta, pizza, or a casserole as the main dish. There is one ingredient that lurks in nearly every American meal.

Wheat. It’s the main ingredient in bread, the most purchased packaged food in the United States.1 It plays an integral role in many diets, but if not correctly consumed, it can damage the human body. The harmful effects include increased risk of weight gain, cardiovascular disease, and even cancer.2 To avoid adverse effects while reaping the benefits wheat offers, three factors should be considered: wheat type (whole-grain or refined), the portion size, and the accompanying ingredients.

Whole Grains Instead of Refined Grains

Guidelines by the United States Department of Agriculture (USDA) recommend that Americans consume whole-grain rather than refined wheat. Currently, the average consumption of whole-grain foods is approximately one serving a day, falling short of the recommended three servings.3 Wheat grains are divided into three parts: the endosperm, germ, and bran (Figure 1). Whole-grain wheat keeps the germ and bran intact. In contrast, refined grains have the bran and germ separated from the starchy endosperm, which comprises about 80% of the grain. Unfortunately, this processing robs wheat of the majority of its nutrients, which are concentrated in the bran and germ.

Whole-grain wheat has nearly ten times the dietary fiber, five times the vitamins and cancer-preventing phenolic compounds, and three times the essential minerals, including zinc, iron, and selenium (Table 1).

The extra dietary fiber of whole-grain wheat is itself a compelling reason to choose it over refined wheat. Increased consumption of dietary fiber has been observed to improve cholesterol concentrations, lower blood pressure, and aid in weight loss. These effects all reduce the risk of coronary heart disease, the leading cause of adult deaths in the United States.4 High-fiber foods provide beneficial metabolic effects and control caloric intake by increasing satiety. Dietary fiber, consisting of insoluble and soluble components, promotes gastrointestinal health by acting as a prebiotic for beneficial bacteria in the colon. Both types of fiber also provide cardiovascular benefits by lowering “bad” cholesterol, or LDL.

In the broader context of a person’s entire diet, high-fiber foods often have lower energy density and take longer to eat. These two traits promote satiety, curbing consumption of potentially unhealthy foods and lowering total caloric intake. Eating refined wheat, such as white bread and pasta, causes one not only to forgo nutrients but also to consume more calories before feeling full. Overconsumption of calories coupled with physical inactivity is a major risk factor for heart disease and obesity.5

Control Portion Size

In addition to considering what type of wheat one eats (e.g., whole-wheat instead of white bread for toast in the morning), an equally important factor is quantity. Regularly eating large portions of even whole-grain wheat results in damaging spikes in blood sugar that can lead to the chronic condition of Type 2 diabetes.6 Since diabetes is the leading cause of kidney failure in the United States and doubles the risk of stroke, its correlation with wheat consumption is important to understand.7

The biochemical phenomenon underlying this link is called insulin resistance. When carbohydrates are digested, they are broken down into glucose, which enters the bloodstream and raises blood sugar levels. In response, pancreatic beta cells secrete insulin, a hormone that stimulates various tissues to take up glucose from the blood and store it as glycogen. When tissues stop responding properly to insulin and the pancreas can no longer compensate, the result is Type 2 diabetes.

Even though the USDA advises adding whole-grain wheat to one’s diet, its guidelines do not account for the spiking effect on blood sugar when a large portion is eaten in a short time frame. The guidelines rely on a rating system called the glycemic index (GI), which is widely used in nutrition studies as a quality standard for carbohydrate foods.8 Wonder®, a fully enriched white bread, has a GI of 71, while bread made of 80% whole-grain and 20% refined wheat flour has a GI of 52.8 In practical terms, these values mean that the white bread raises blood sugar 71% as much as a comparable amount of pure glucose, while the mostly whole-grain bread raises it 52% as much. Based on the aforementioned pathogenic contribution of blood sugar spikes, the lower GI of whole-wheat bread quantitatively demonstrates its superiority over white bread.

However, consider the following: the Twix candy bar has an even lower GI of 44. Watermelon has a GI of 72. How does this make sense? The glycemic index fails to account for realistic portion sizes. When the foods are empirically tested on people for their effects on blood sugar, the quantities eaten are equivalent to 50 grams of carbohydrates. Three-quarters of a king-sized Twix bar constitutes 50 grams of carbohydrate, but so do 5 cups of diced watermelon. This difference in volume is due to the fiber and water content of watermelon.

Realistically, a person is likely to eat a whole king-sized Twix bar or one cup of diced watermelon in one sitting. Adjusting for actual serving sizes and assuming linearity, the Twix bar has what is now called a glycemic load (GL) of 58.7 and watermelon a GL of 14.4. As a more realistic application of GI values, glycemic load emphasizes controlling portion sizes when eating carbohydrates. The GI of whole-grain wheat is generally lower than that of refined wheat, but the difference is small enough that one cup of refined-flour pasta might be better than two cups of whole-wheat pasta for preventing Type 2 diabetes.
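To make the scaling explicit, here is a minimal sketch in Python of the linear adjustment described above. It assumes, as the numbers quoted here imply, that the load of a real portion is the GI multiplied by the ratio of carbohydrates actually eaten to the 50 grams used in GI testing; the gram figures for the Twix bar and watermelon are illustrative back-calculations, not measured values.

```python
def glycemic_load(gi, carbs_eaten_g, carbs_tested_g=50):
    """Scale a food's glycemic index by the portion actually eaten.

    GI values are measured on 50 g of carbohydrate, so (assuming linearity)
    the load of a real serving is the GI times the fraction of that test
    quantity actually consumed.
    """
    return gi * carbs_eaten_g / carbs_tested_g

# A whole king-sized Twix bar: GI 44, roughly 66.7 g of carbohydrate
# (three-quarters of the bar is the 50 g test portion).
print(round(glycemic_load(44, 66.7), 1))  # ~58.7

# One cup of diced watermelon: GI 72, roughly 10 g of carbohydrate.
print(round(glycemic_load(72, 10), 1))    # 14.4
```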

Watch Out for Accompanying Ingredients

The final factor to consider is that wheat is rarely eaten alone. In the processing and cooking that make it edible, wheat is nearly always mixed with other ingredients that are potentially harmful. Most breads, pastas, pancakes, cereals, and other wheat products list at least five ingredients after the primary wheat ingredient, broadly classified as preservatives, sweeteners, emulsifiers, leavening agents, flavor enhancers, and dough conditioners. These additives affect both how one feels shortly after eating and the long-term health of the body. In particular, one should avoid partially hydrogenated oils and moderate intake of high-fructose corn syrup.

Added as dough conditioners and preservatives, partially hydrogenated oils are considerable contributors to coronary artery disease, which causes at least 30,000 premature American deaths per year.9 They contain trans fats, which have been unequivocally linked to lowering “good” high-density lipoprotein (HDL) cholesterol and raising “bad” low-density lipoprotein (LDL) cholesterol. Although large companies have removed trans fats, including partially hydrogenated oils, from foods such as Kraft’s Oreos in response to mounting criticism beginning in 2005, numerous food companies still include partially hydrogenated oils in their wheat products. For example, cake mixes, packaged baked goods, and peanut butter are regularly made with partially hydrogenated oils because they simplify manufacturing and reduce costs while increasing the final product’s shelf life. Manufacturers obfuscate this addition by stating the trans fat content as “0g” on nutrition labels, which is allowed whenever a serving contains less than 0.5 grams of trans fat. However, those sub-0.5-gram amounts add up when one eats multiple servings of foods such as chips or crackers. Instead, check for the words “partially hydrogenated” or “shortening” in the ingredients list.
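A quick back-of-the-envelope sketch (with hypothetical numbers, since labels round these amounts down to zero) shows how fast those “0g” servings can add up:

```python
# Hypothetical snack: just under the 0.5 g labeling threshold per serving.
trans_fat_per_serving_g = 0.4   # printed on the label as "0g"
servings_eaten = 5              # easy to reach with chips or crackers

total_g = trans_fat_per_serving_g * servings_eaten
print(f"Trans fat actually consumed: {total_g:.1f} g")  # 2.0 g, despite a "0g" label
```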

While partially hydrogenated oils are conclusively harmful, high-fructose corn syrup (HFCS) is a more controversial additive. Manufacturers favor HFCS as a sweetener in wheat products because of its lower cost, sweeter taste, and higher miscibility. Scientists hypothesize that this corn-derived sugar has endocrine effects that lead to obesity, Type 2 diabetes, and metabolic syndrome.8 Insulin and leptin are key hormonal signals that regulate a person’s sense of hunger, and consumption of HFCS blunts these internal signals, weakening their control over calorie intake. Another consequence of foods sweetened with HFCS is plaque buildup inside the arteries.10 Nearly any sweet good made from wheat is likely to contain HFCS. Although data about its health effects are still inconclusive, it is prudent to limit its consumption.

Being a health-conscious consumer of wheat can mean significant changes in daily choices of which foods to eat and how to eat them. Whole grains provide more fiber and health-promoting nutrients than refined grains, but the accompanying ingredients in available food choices need to be considered as well. Just as importantly, wheat’s impact on blood sugar needs to be controlled by keeping portions moderate and balancing them with fruits and vegetables. Awareness and application of these principles are the main steps to avoiding the bad and getting the good of wheat.

References

  1. Nielsen Homescan Facts, The Nielsen Company. http://www.marketingcharts.com/television/nielsen-issues-us-top-10-lists-for-2007-2700/nielsen-2007-top-10-cpg-purchased-us-homes.jpg (Accessed Jan. 15, 2013).
  2. Slavin, J. L. Amer. J. Clin. Nutr. 1999, 70, 459S–63S.
  3. Cleveland, L. E. J. Amer. Coll. Nutr. 2000, 19, 331–8.
  4. Anderson, J. W. Nutr. Rev. 2009, 67, 188–205.
  5. Swinburn, B. Public Health Nutr. 2007, 7, 123–46.
  6. Liu, S. J. Amer. Coll. Nutr. 2002, 21, 298–306.
  7. World Health Organization: Diabetes Fact Sheet, Media Centre, 2012. http://www.who.int/mediacentre/factsheets/fs312/en/index.html (Accessed Jan. 15, 2013).
  8. Foster-Powell, K. Amer. J. Clin. Nutr. 2002, 76, 5–56.
  9. Ascherio, A. Amer. J. Clin. Nutr. 1997, 66, 1006S–10S.
  10. Stanhope, K. Amer. J. Clin. Nutr. 2008, 88, 1733S–7S.
  11. General Mills. What is Whole Grain, Anyway? Demystifying Whole Grains. http://wholegrainnation.eatbetteramerica.com/images/content/facts_seed.jpg (Accessed Jan. 15, 2013).
  12. Thompson, L. U. Contemp. Nutr. 1992, 17.

The Illusion of Race

Race is one of the most pervasive features of American social life; neglecting the concept of race would be like questioning the existence of gravity. Though we would like to consider our nation a post-racial society, we still place great importance on race by asking for it on forms ranging from voter registration to the PSAT. However, many would be surprised to realize that race does not have a biological basis – there is no single defining characteristic or gene that can be unequivocally used to distinguish one race from another.1 Rather, it is a manmade concept used to describe differences in physical appearance. Yet, we have internalized the social construct of race to such a degree that it seems to have genetic significance, masking the fact that race is actually something we are raised with. That a simple internalized ideology creates disparities in contemporary American society, from socioeconomic status to healthcare accessibility, illustrates the urgency of exposing this myth of race.

Throughout American history, racial connotations have been fluid, with different ethnographic groups falling in and out of favor based upon societal views at a given time. Race was originally conceived as a way to justify colonialism. European colonizers institutionalized their ethnocentric attitudes by creating the concept of race in order to differentiate between the civilized and the savage, the Christians and the heathens. This dichotomy facilitated mercantilism, the economic policy designed to accrue astronomical profits for the European countries through the exploitation of “inferior” races. Scholars of Critical Race Theory show, more generally, that the boundaries of racial categories shift to accommodate political realities and conventional wisdom of a given time and place.2

This definition of race has changed in the United States over the centuries. For example, when the Irish and Italians first immigrated in the early 20th century, they were seen as “swarthy drunkards” – clearly not part of the white “elite.” Within two generations, however, these same people were able to assimilate into the Caucasian-dominated culture while African-Americans were still considered a separate entity. Similarly, during the era of the Jim Crow laws, courts had the power to determine who was black and who was not: in Virginia, a person was considered black if he or she was at least 1/16th African-American; in Florida, a black person was at least 1/8th African-American; and in Alabama, having even the smallest sliver of African-American heritage made a person black.3 Thus, a person could literally change race simply by moving from one state to another. Today, the commonly defined race classifications, as specified by the US Census, include White, Black, Asian, American Indian or Alaska Native, Pacific Islander or Native Hawaiian, Other, and Multiracial. Because there are no scientific cut-offs to determine what race a person is, racial data are largely based on self-identification, which points to the concept’s lack of biological legitimacy. For example, 30% of Americans who label themselves as White do not have at least 90% European ancestry.4

We may think our conceptualization of race is based upon biological makeup, but it is actually an expression of actions, attitudes, and social patterns. When examining the science behind race, most scholars across various disciplines, including evolutionary biology, sociology, and anthropology, have come to the consensus that distinctions made by race are not genetically discrete, cannot be reliably measured, and are not meaningful in the scientific sense.5

Some argue that race is a genetic concept, pointing to the higher incidence of particular diseases in certain races. However, purely hereditary diseases are extremely rare: cystic fibrosis occurs in roughly 1 in 2,300 births, Huntington’s disease in 1 in 10,000, and Duchenne muscular dystrophy in 1 in 3,000.6 Rather, diseases often reflect shared lifestyles and environments instead of shared genes, because factors such as poverty and malnutrition are also often “inherited” through family lines. Even genetic polymorphisms in hemoglobin, which lead to populations with lower susceptibility to malaria, can be partly explained by environmental factors.6-8 Thus, diseases traditionally tied to certain races cannot be explicitly attributed to genes, discrediting the idea that races are genetically disparate. Genetic differences are better described as percentages of people with a particular gene polymorphism, which change according to the environment.6

Racial groupings actually reflect little of the genetic variation existing in humans. Studies have shown that about 90% of human genetic variation is present within a population on a single continent, while only around 10% occurs between continental populations.1 Variation in physical characteristics, the traditional basis for determining race, does not imply underlying genetic differences. When we internalize the false ideology that race is genetic, we are mistakenly implying that there are human subspecies.

Although race is a social construct, it has a widespread influence on society, especially in the United States. In particular, minorities face disadvantages in numerous areas ranging from healthcare to education.7,8 Reports about Mitt Romney’s rumored adoption of a darker skin tone when addressing Latino voters, or statistics indicating that the median household wealth of whites is 20 times that of blacks, reinforce the existence of a racialized society.5 This is shocking and disturbing; race may not be real, but its effects contribute to real inequality. Only once everyone understands this racial illusion can we begin to make effective change.

References

  1. Bamshad, M. J.; Olson, S. E. Does Race Exist? Scientific American, New York City, Nov. 10, 2003, p. 78-85.
  2. Calavita, K. Invitation to Law and Society: An Introduction to the Study of Real Law. University of Chicago Press: Boston, MA, 2007.
  3. Rothenberg, P. S. Race, Class, and Gender in the United States, 7th ed.; Worth Publishers: New York, NY, 2007.
  4. Lorber, J.; Hess, B. B.; Ferree, M. M.; Eds. Revisioning Gender; AltaMira Press: Walnut Creek, CA, 2000.
  5. Costantini, C. ABC News. http://abcnews.go.com/ABC_Univision/News/mitt-romneys-tan-draws-media-fire-makeup-artist/story?id=17290303 (Accessed Oct. 26, 2012).
  6. Pearce, N. BMJ. 2004, 328, 1070-2.
  7. Stone, J. Theor. Med. Bioethics. 2002, 23, 499-518.
  8. Witzig, R. The Medicalization of Race: Scientific Legitimization of a Flawed Social Construct. Ann. Intern. Med. 1996, 125, 675-9.
  9. Tavernise, S. The New York Times. http://www.nytimes.com/2011/07/26/us/26hispanics.html?_r=0&pagewan (Accessed Oct. 26, 2012).

Nature vs Nurture of Politics

If you voted in last year’s election, what made you choose the candidate for whom you voted? Was it the platform, the party, or perhaps your genes? Since Mendel and his peas, the idea that genes affect physical traits has greatly influenced science. However, their role may be greater than we thought. Aristotle first posed the question of nature vs. nurture, now the debate over the relative importance of one’s genes (nature) and one’s upbringing (nurture) in determining physical and behavioral characteristics. For instance, is one’s intelligence an innate quality or one built over years of education? If genes are involved in political ideology, does that mean political freedom is limited? Do we have a choice in voting? The answer to this age-old question is complex, but nowadays more people recognize that both nature and nurture are involved in determining our traits.

Family values, education, and the media were originally thought to determine an individual’s political behavior. The scientific community has gradually come to embrace political views as a legitimate subject of the nature vs. nurture debate. In fact, the study of genetic influence in political decisions has a name: genopolitics.1 For instance, an early study of the topic found that identical twins show more similarities than non-identical twins in voting behavior.2

Furthermore, although many still believe environmental influences alone determine political attitudes, a recent article suggests that genes also influence political preferences.3 This harks back to the notion that both nature and nurture are important in the development of an individual’s behavior. Whether your vote is liberal or conservative is not determined by a single gene, but rather by a combination of genes and regulatory pathways interacting with environmental factors to shape one’s political principles.2

However, genetics does not play a role in all political traits. In particular, because political parties are transient and vary between countries and across time, only nurture plays a role in party identification. A liberal in American politics is not the same as a liberal in European politics. On the other hand, most genopolitics researchers agree that genetics does influence ideology (e.g., conservatism or liberalism), a relatively timeless matter of how groups should be organized.4 For example, your heritage might make you favor a powerful government over a weak one, but it would not directly influence you to vote for a Democratic nominee over a Republican nominee.

How exactly genetics could influence your vote is difficult to understand. Hatemi and McDermott conceptualize that our prehistoric ancestors faced issues with out-groups (immigration), security (foreign policy), and welfare (resource management), among others. Through evolutionary processes occurring over thousands of years, responses to these issues became polarizing, heritable traits.2 This by no means signifies that one ideology is more “fit” than another simply because it has dominated in the past.

These natural factors incline a person toward conservative or liberal preferences, but environment may play a larger role in directing most individual political choices. More research is necessary to show that the influence of genes on political ideology cannot be explained by purely environmental influences.6

References

  1. Stafford, T. BBC Future. http://www.bbc.com/future/story/20121204-is-politics-really-in-our-genes (Accessed Dec. 4, 2012).
  2. Biuso, E. Genopolitics. New York Times, New York, Dec. 14, 2008, p. MM57.
  3. Hatemi, P. K. Trends in Genetics. 2012, 28(10), 525–33.
  4. Alford, J. R. Ann. Rev. Pol. Sci. 2008, 15(2), 183–203.
  5. Body politic: The genetics of politics. The Economist, London, Oct. 6, 2012.
  6. Hatemi, P. K. JOP 2011, 73(1), 271–85.
