Telomeres: Ways to Prolong Life

Two hundred years ago, the average life expectancy oscillated between 30 and 40 years, as it had for centuries before. Medical knowledge was largely limited to superstition and folk cures, and the science of what actually caused disease and death was lacking. Since then, the average lifespan of human beings has skyrocketed due to scientific advancements in health care, such as an understanding of bacteria and infections. Today, new discoveries are being made in cellular biology which, in theory, could lead us to the next revolutionary leap in lifespan. Most promising among these recent discoveries are the manipulation of telomeres to slow the aging process and the use of telomerase to identify cancerous cells.

Before understanding how telomeres can be utilized to increase the average lifespan of humans, it is essential to understand what a telomere is. When cells divide, their DNA must be copied so that all of the cells share an identical DNA sequence. However, the DNA cannot be copied all the way to the end of the strand, resulting in the loss of some DNA at the end of the sequence with every single replication.1 To prevent valuable genetic code from being cut off during cell division, our DNA contains telomeres: repetitive, non-coding sequences of nucleotides at the ends of our chromosomes that can be trimmed away without consequence to the meaningful part of the DNA. Repeated cell replication causes these protective telomeres to become shorter and shorter, until valuable genetic code is eventually cut off, causing the cell to malfunction and ultimately die.1 The enzyme telomerase functions in cells to rebuild these constantly degrading telomeres, but its activity is relatively low in normal cells compared to cancer cells.2
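To make the arithmetic of this attrition concrete, the short Python sketch below simulates end-replication loss until a telomere reaches a critical length. The starting length, loss per division, and critical threshold are round, illustrative numbers, not measured values for any particular cell type.

```python
# Illustrative simulation of telomere shortening across cell divisions.
# All three constants are assumed, round numbers chosen for illustration.

STARTING_TELOMERE_BP = 10_000   # assumed initial telomere length (base pairs)
LOSS_PER_DIVISION_BP = 100      # assumed base pairs lost at each replication
CRITICAL_LENGTH_BP = 3_000      # assumed length below which division stops

def divisions_until_senescence(start_bp, loss_bp, critical_bp):
    """Count divisions before the telomere would drop below the critical length."""
    length = start_bp
    divisions = 0
    while length - loss_bp >= critical_bp:
        length -= loss_bp
        divisions += 1
    return divisions

if __name__ == "__main__":
    n = divisions_until_senescence(
        STARTING_TELOMERE_BP, LOSS_PER_DIVISION_BP, CRITICAL_LENGTH_BP
    )
    print(f"With these assumptions, the cell divides about {n} times "
          "before reaching the critical telomere length.")
```

Under these assumed numbers the cell stops dividing after roughly 70 divisions; rebuilding the telomere, as telomerase does, or slowing the per-division loss would push that limit out.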

The applications of telomerase manipulation have emerged only fairly recently, following the discovery of the function of both telomeres and telomerase in the mid-1980s by Nobel Prize winners Elizabeth Blackburn, Carol Greider, and Jack Szostak.3 Blackburn discovered a sequence at the end of chromosomes that was repeated several times, but could not determine its purpose. At the same time, Szostak was observing the degradation of minichromosomes, chromatin-like structures that replicate during cell division when introduced into a yeast cell. Together, they combined their work by isolating Blackburn’s repeating DNA sequences, attaching them to Szostak’s minichromosomes, and then placing the minichromosomes back inside yeast cells. With the new addition to their DNA sequence, the minichromosomes did not degrade as they had before, demonstrating that the purpose of the repeating DNA sequence, dubbed the telomere, was to protect the chromosome and delay cellular aging.

Because of the relationship between telomeres and cellular aging, many scientists theorize that cell longevity could be enhanced by finding a way to control telomere degradation and keep protective caps on the end of cell DNA indefinitely.1 Were this to be accomplished, the cells would be able to divide an infinite number of times before they started to lose valuable genetic code, which would theoretically extend the life of the organism as a whole.

In addition, studies into telomeres have revealed new ways of combating cancer. Although there are many subtypes of cancer, all variations involve the uncontrollable, rapid division of cells. Despite this rapid division, the telomeres of cancer cells do not shorten the way those of normal cells do; otherwise, such rapid division would be impossible. Cancer cells are likely able to maintain their telomeres because of their higher levels of telomerase.3 This knowledge allows scientists to use telomerase levels as an indicator of cancerous cells, and then proceed to target these cells. Vaccines that target telomerase production have the potential to be the newest weapon in combating cancer.2 Cancerous cells continue to proliferate at an uncontrollable rate even when telomerase production is interrupted. However, without telomerase to protect their telomeres from degradation, these cells eventually die.

As the scientific community advances its ability to control telomeres, it comes closer to controlling the process of cellular reproduction, one of the many factors associated with human aging and cancerous cells. With knowledge in these areas continuing to develop, the possibility of completely eradicating cancer and slowing the aging process is becoming more and more realistic.

References

  1. Genetic Science Learning Center. Learn.Genetics. http://learn.genetics.utah.edu (accessed Oct. 5, 2016).
  2. Shay, J. W.; Wright, W. E. Nat. Rev. Drug Discov. [Online] 2006, 5. http://www.nature.com/nrd/journal/v5/n7/full/nrd2081.html (accessed Oct. 16, 2016).
  3. The 2009 Nobel Prize in Physiology or Medicine - Press Release. The Nobel Prize. https://www.nobelprize.org/nobel_prizes/medicine/laureates/2009/press.html (accessed Oct. 4, 2016).

The Health of Healthcare Providers

A car crash. A heart attack. A drug overdose. No matter what time of day, where you are, or what your problem is, emergency medical technicians (EMTs) will be on call and ready to come to your aid. These health care providers are charged with providing quality care to maintain or improve patient health in the field, and their efforts have saved the lives of many who could not otherwise find care on their own. While these EMTs deserve praise and respect for their line of work, what they deserve even more is consideration for the health issues that they themselves face. Emergency medical technicians suffer from a host of long-term health issues, including weight gain, burnout, and psychological changes.

The daily "schedule" of an EMT is probably most characterized by its variability and unpredictability. The entirety of their day is a summation of what everyone in their area is doing, those people's health issues, and the uncertainty of life itself. While there are start and end times to their shifts, even these are not hard and fast--shifts have the potential to start early or end late based on when people call 911. An EMT can spend their entire shift on the ambulance, without time to eat a proper meal or to get any sleep. These healthcare providers learn to catch a few minutes of sleep here and there when possible. Their yearly schedules are also unpredictable, with lottery systems in place to ensure that someone is working every day, at all hours of the day, while maintaining some fairness. Most services will have either 12- or 24-hour shifts, and this lottery system can result in EMTs having stacked shifts that are either back to back or at least in close proximity to one another. This only increases the likelihood of sleep disorders, with 70 percent of EMTs reporting at least one sleep problem.1 While many people have experienced the effects of exhaustion and burnout due to a lack of sleep, few can say that their entire professional career has been characterized by these feelings. EMTs have been shown to be more than twice as likely as control groups to have moderate to high scores on the Epworth Sleepiness Scale (ESS), which is correlated with a greater likelihood of falling asleep during daily activities such as conversing, sitting in public places, and driving.1 The restriction and outright deprivation of sleep in EMTs has been shown to cause a large variety of health problems, and seems to be the main factor in the decline of both physical and mental health for EMTs.

A regular amount of sleep is essential in maintaining a healthy body. Reduced sleep has been associated with an increase in weight gain, cardiovascular disease, and weakened immune system function. Studies have shown that, at least in men, short sleep durations are linked to weight gain and obesity, potentially due to alterations in the hormones that regulate appetite.2,3 Given this trend, it is no surprise that a 2009 study found that sleep durations deviating from an ideal 7-8 hours, as well as frequent insomnia, increased the risk of cardiovascular disease. The fact that EMTs often have poor diets compounds that risk. An EMT needs to be ready to respond around the clock, which means there really isn’t any time to sit down and have a proper meal. Fast food becomes the meal of choice due to its convenience, both in availability and speed. Some hospitals have attempted to improve upon this shortcoming in the emergency medical service (EMS) world by providing snacks and drinks at the hospital. This, however, creates a different issue due to the high-calorie nature of these snacks. The body generally knows when it is full by detecting stretch in the stomach and signaling the brain that enough food has been consumed. In a balanced diet, most of this space should be filled with fruits, vegetables, and other low-calorie foods, unless the person is an athlete with much higher energy needs. By eating smaller, high-calorie items, an EMT will need to eat more in order to feel full, which results in the person exceeding their recommended daily calories. The extra energy often gets stored as fat, compounding the weight gain due to sleep deprivation. Studies involving the effects of restricted sleep on the immune system are less common, but one experiment demonstrated increased markers of systemic inflammation, which could, again, lead to cardiovascular disease and obesity.2
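As a rough back-of-the-envelope illustration of why calorie density matters, the Python sketch below compares the calories consumed in reaching the same "full" volume of food on a produce-heavy diet versus a snack-and-fast-food diet. Every number in it is an assumed, illustrative value, not nutritional data.

```python
# Back-of-the-envelope comparison: calories consumed at the same "full" volume.
# All figures below are rough, assumed values used only for illustration.

FULL_VOLUME_G = 1200          # assumed grams of food per day needed to feel full
DAILY_CALORIE_TARGET = 2500   # assumed recommended daily calories

CALORIE_DENSITY_KCAL_PER_G = {   # assumed average calorie densities
    "balanced meals (produce, grains, lean protein)": 1.2,
    "vending snacks and fast food": 3.0,
}

for diet, kcal_per_g in CALORIE_DENSITY_KCAL_PER_G.items():
    total_kcal = FULL_VOLUME_G * kcal_per_g
    surplus = total_kcal - DAILY_CALORIE_TARGET
    print(f"{diet}: {total_kcal:.0f} kcal to feel full "
          f"({surplus:+.0f} kcal versus the daily target)")
```

With these assumed densities, filling up on calorie-dense snacks overshoots the daily target by well over a thousand kilocalories, while the produce-heavy diet stays under it--the point the paragraph above makes qualitatively.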

Mental health is not spared from complications due to long waking periods with minimal sleep. A study was conducted to test the cognitive abilities of subjects experiencing varying amounts of sleep restriction; the results showed that less sleep led to cognitive deficits, and that being awake for more than 16 hours led to deficits regardless of how much sleep the subject had gotten.4 This finding affects both the EMTs, who can injure themselves, and the patients, who may suffer due to more errors being made in the field. First-year physicians, who can similarly work shifts of over 24 hours, are subject to an increased risk of automobile crashes and percutaneous (skin) injuries when sleep deprived.5 These injuries often happen when leaving a shift. A typical EMT shift lasts from one morning to the next, and the EMT will leave his or her shift during rush hour on little to no sleep, increasing the dangerous possibility of falling asleep or dozing at the wheel. A similar study examined extended-duration work in critical-care units and found that long shifts increased the risk of medical errors and lapses in attention.6 In addition to the more direct mental health problems posed by the continuous strain, EMTs and others in the healthcare field also face more personal issues, including burnout and changes in behavior. A study on pediatric residents, who face similar amounts of stress and similar workloads, established that 20% of participants were suffering from depression and 75% met the criteria for burnout, both of which led to medical errors made during work.7 A separate study found that emergency physicians suffering from burnout also faced high emotional exhaustion, depersonalization, and a low sense of accomplishment.8 While many go into the healthcare field to help others, exhaustion and desensitization create a sort of cynicism that defends against the enormous emotional burden that comes with treating patients day in and day out.

Sleep deprivation, long work duration, and the stress that comes with the job contribute to a poor environment for the physical and mental health of emergency medical technicians and other healthcare providers. However, a recent study has shown that downtime, especially after dealing with critical patients, led to lower rates of depression and acute stress in EMTs.9 While this does not necessarily ameliorate post-traumatic stress or burnout, it is a start to addressing the situation. Other possible interventions would include providing more balanced meals at hospitals that are readily available to EMTs, as well as an improved scheduling system that prevents or limits back to back shifts. These concepts can apply to others facing high workloads with abnormal sleeping schedules as well, including college students, who are also at risk for mood disorders and a poorer quality of life due to the rigors of college life.10

References

  1. Pirrallo, R. G. et al. Sleep Breath. 2012, 16, 149-162.
  2. Banks, S. et al. J. Clin. Sleep Med. 2007, 3(5), 519-528.
  3. Watanabe, M. et al. Sleep  2010, 33(2), 161-167.
  4. Van Dongen, H. P. et al. Sleep 2003, 26(2), 117-126.
  5. Ayas, N. T. et al. JAMA 2006, 296(9), 1055-1062.
  6. Barger, L. K. et al. PLoS Med. [Online] 2006, 3(12), e487. https://dx.doi.org/10.1371%2Fjournal.pmed.0030487 (accessed Oct. 3, 2016)
  7. Fahrenkopf, A. M. et al. BMJ [Online] 2008, 336, 488. http://dx.doi.org/10.1136/bmj.39469.763218.BE (accessed Oct. 3, 2016)
  8. Ben-Itzhak, S. et al. Clin. Exp. Emerg. Med. 2015, 2(4), 217-225.
  9. Halpern, J. et al. Biomed. Res. Int. [Online] 2014, 2014. http://dx.doi.org/10.1155/2014/483140 (accessed Oct. 3, 2016)
  10. Singh, R. et al. J. Clin. Diagn. Res. [Online] 2016, 10(5), JC01-JC05. https://dx.doi.org/10.7860%2FJCDR%2F2016%2F19140.7878 (accessed Oct 3, 2016)

The Creation of Successful Scaffolds for Tissue Engineering

Abstract

Tissue engineering is a broad field with applications ranging from pharmaceutical testing to total organ replacement. Recently, there has been extensive research on creating tissue that is able to replace or repair natural human tissue. Much of this research focuses on the creation of scaffolds that can both support cell growth and successfully integrate with the surrounding tissue. This article will introduce the concept of a scaffold for tissue engineering; discuss key areas of research including biomolecule use, vascularization, mechanical strength, and tissue attachment; and introduce some important recent advancements in these areas.

Introduction

Tissue engineering relies on four main factors: the growth of appropriate cells, the introduction of the proper biomolecules to these cells, the attachment of the cells to an appropriate scaffold, and the application of specific mechanical and biological forces to develop the completed tissue.1

Successful cell culture has been possible since the 1960s, but these early methods lacked the adaptability necessary to make functioning tissues. With the introduction of induced pluripotent stem cells in 2006, however, researchers have not faced the same resource limitations previously encountered. As a result, the growth of cells of a desired type has not been limiting to researchers in tissue engineering and thus warrants less concern than other factors in contemporary tissue engineering.2,3

Similarly, the introduction of essential biomolecules (such as growth factors) to the developing tissue has generally not restricted modern tissue engineering efforts. Extensive research and knowledge of biomolecule function as well as relatively reliable methods of obtaining important biomolecules have allowed researchers to make engineered tissues more successfully emulate functional human tissue using biomolecules.4,5 Despite these advancements in information and procurement methods, however, the ability of biomolecules to improve engineered tissue often relies on the structure and chemical composition of the scaffold material.6

Cellular attachment has also been a heavily explored field of research. This refers specifically to the ability of the engineered tissue to seamlessly integrate into the surrounding tissue. Studies in cellular attachment often focus on qualities of scaffolds such as porosity as well as the introduction of biomolecules to encourage tissue union on the cellular level. Like biomolecule effectiveness, successful cellular attachment depends on the material and structure of the tissue scaffolding.7

Also critical to developing functional tissue is exposing it to the right environment. This development of tissue properties via the application of mechanical and biological forces depends strongly on finding materials that can withstand the required forces while supplying cells with the necessary environment and nutrients. Previous research in this area has focused on several scaffold materials for various reasons. However, improvements to the materials or the specific methods of development are still greatly needed in order to create functional implantable tissue. Because of the difficulty of conducting research in this area, devoted efforts to improving these methods remain critical to successful tissue engineering.

In order for a scaffold to be capable of supporting cells until the formation of a functioning tissue, it is necessary to satisfy several key requirements, principally introduction of helpful biomolecules, vascularization, mechanical function, appropriate chemical and physical environment, and compatibility with surrounding biological tissue.8,9 Great progress has been made towards satisfying many of these conditions, but further research in the field of tissue engineering must address challenges with existing scaffolds and improve their utility for replacing or repairing human tissue.

Key Research Areas of Scaffolding Design

Biomolecules

Throughout most early tissue engineering projects, researchers focused on simple cell culture surrounding specific material scaffolds.10 Promising developments such as the creation of engineered cartilage motivated further funding and interest in research. However, these early efforts overlooked several factors crucial to tissue engineering that allow implantable tissue to take on more complex functional roles. In order to create tissue that is functional and able to direct biological processes alongside nearby natural tissue, it is important to understand the interactions of biomolecules with engineered tissue.

Because the ultimate goal of tissue engineering is to create functional, implantable tissue that mimics biological systems, most important biomolecules have been explored by researchers in the medical field outside of tissue engineering. As a result, a solid body of research exists describing the functions and interactions of various biomolecules. Because of this existing information, understanding their potential uses in tissue engineering relies mainly on studying the interactions of biomolecules with materials which are not native to the body; most commonly, these non-biological materials are used as scaffolding. To complicate the topic further, biomolecules are a considerably large category encompassing everything from DNA to glucose to proteins. As such, it is most necessary to focus on those that interact closely with engineered tissue.

One type of biomolecule that is subject to much research and speculation in current tissue engineering is the growth factor.11 Specific growth factors can have a variety of functions from general cell proliferation to the formation of blood cells and vessels.12-14 They can also be responsible for disease, especially the unchecked cell generation of cancer.15 Many of the positive roles have direct applications to tissue engineering. For example, Transforming Growth Factor-beta (TGF-β) regulates normal growth and development in humans.16 One study found that while addition of ligands to engineered tissue could increase cellular adhesion to nearby cells, the addition also decreased the generation of the extracellular matrix, a key structure in functional tissue.17 To remedy this, the researchers then tested the same method with the addition of TGF-β. They saw a significant increase in the generation of the extracellular matrix, improving their engineered tissue’s ability to become functional faster and more effectively. Clearly, a combination of growth factors and other tissue engineering methods can lead to better outcomes for functional tissue engineering.

With the utility of growth factors established, delivery methods become very important. Several methods have been shown to be effective, including delivery in a gelatin carrier.18 However, some of the most promising procedures rely on the scaffolding’s properties. One set of studies mimicked the natural release of growth factors through the extracellular matrix by creating a nanofiber scaffold containing growth factors for delayed release.19 The study saw a positive influence on the behavior of cells as a result of the release of growth factor. Other methods vary physical properties of the scaffold, such as pore size, to trigger immune pathways that release regenerative growth factors, as will be discussed later. The use of biomolecules, and specifically growth factors, is heavily linked to the choice of scaffolding material and can be critical to the success of an engineered tissue.

Vascularization

Because almost no tissue can survive without proper oxygenation, engineered tissue vascularization has been a focus of many researchers in recent years to optimize the chances of engineered tissue success.20 For many of the areas of advancement, this process depends on the scaffold.21 The actual requirements for the level and complexity of vasculature vary greatly based on the type of tissue; the requirements for blood flow in the highly vascularized lungs are different from those for cortical bone.22,23 Therefore, it is more appropriate for this topic to address the methods which have been developed for creating vascularized tissue rather than the actual designs of specific tissues.

One method that has shown great promise is the use of modified 3D printers to cast vascularized tissue.24 This method uses the relatively new printing technology to create carbohydrate glass networks in the form of the desired vascular network. The network is then coated with a hydrogel scaffold to allow cells to grow. The carbohydrate glass is then dissolved from inside of the hydrogel, leaving an open vasculature in a specific shape. This method has been successful in achieving cell growth in areas of engineered tissue that would normally undergo necrosis. Even more remarkably, the created vasculature showed the ability to branch into a more complex system when coated with endothelial cells.24

However, this method is not always applicable. Many tissue types require scaffolds that are more rigid or have different properties than hydrogels. In this case, researchers have focused on the effect of a material’s porosity on angiogenesis.7,25 Several key factors have been identified for blood vessel growth, including pore size, surface area, and endothelial cell seeding similar to that which was successful in 3D printed hydrogels. Of course, many other methods are currently being researched based on a variety of scaffolds. Improvements on these methods, combined with better research into the interactions of vascularization with biomaterial attachment, show great promise for engineering complex, differentiated tissue.

Mechanical Strength

Research has consistently demonstrated that large-scale cell culture is not limiting to bioengineering. With the introduction of technology like bioreactors or three-dimensional cell culture plates, growing cells of the desired qualities and in the appropriate form continues to become easier for researchers; this in turn allows for a focus on factors beyond simply gathering the proper types of cells.2 This is important because most applications in tissue engineering require more than just the ability to create groupings of cells—the cells must have a certain degree of mechanical strength in order to functionally replace tissue that experiences physical pressure.

The mechanical strength of a tissue is a result of many developmental factors and can be classified in different ways, often based on the type of force applied to the tissue or the amount of force the tissue is able to withstand. Regardless, mechanical strength of a tissue primarily relies on the physical strength of the tissue and its ability for its cells to function under an applied pressure; these are both products of the material and fabrication methods of the scaffolding used. For example, scaffolds in bone tissue engineering are often measured for compressive strength. Studies have found that certain techniques, such as cooking in a vacuum oven, may increase compressive strength.26 One group found that they were able to match the higher end of the possible strength of cancellous (spongy) bone via 3D printing by using specific molecules within the binding layers.27 This simple change resulted in scaffolding that displayed ten times the mechanical strength of scaffolding with traditional materials, a value within the range for natural bone. Additionally, the use of specific binding agents between layers of scaffold resulted in increased cellular attachment, the implications of which will be discussed later.27 These changes result in tissue that is more able to meet the functional requirements and therefore to be easily used as a replacement for bone. Thus, simple changes in materials and methods used can drastically increase the mechanical usability of scaffolds and often have positive effects on other important qualities for certain types of tissue.

Clearly, not all designed tissues require the mechanical strength of bone; for contrast, the brain experiences pressures of less than one kPa, compared with the roughly 10⁶ kPa that bone experiences.28 Thus, not all scaffolds must support the same amount of pressure, and scaffolds must be designed accordingly to accommodate these structural differences. Additionally, other tissues might experience forces such as tension or torsion based on their locations within the body. This means that mechanical properties must be considered on a tissue-by-tissue basis in order to determine the corresponding scaffolding structures. However, mechanical limitations are a primary factor only in engineered bone, cartilage, and cardiovascular tissue, the last of which has significantly more complicated mechanical requirements.29

Research in the past few years has investigated increasingly complex aspects of scaffold design and their effects on macroscopic physical properties. For example, it is generally accepted that pore size and related surface area within engineered bone replacements are key to cellular attachment. However, recent advances in scaffold fabrication techniques have allowed researchers to investigate very specific properties of these pores such as their individual geometry. In one recent study, it was found that using an inverse opal geometry--an architecture known for its high strength in materials engineering--for pores led to a doubling of mineralization within a bone engineering scaffold.30 Mineralization is a crucial quality of bone because of its contribution to compressive strength.31 This result is so important because it demonstrates the recent ability of researchers to alter scaffolds on a microscopic level in order to affect macroscopic changes in tissue properties.

Attachment to Nearby Tissue

Even with an ideal design, a tissue’s success as an implant relies on its ability to integrate with the surrounding tissue. For some types of tissue, this is simply a matter of avoiding rejection by the host through an immune response.32 In these cases, it is important to choose materials with a specific consideration for reducing this immune response. Over the past several decades, it has been shown that the key requirement for biocompatibility is the use of materials that are nearly biologically inert and thus do not trigger a negative response from natural tissue.33 This is based on the strategy which focuses on minimizing the immune response of tissue surrounding the implant in order to avoid issues such as inflammation which might be detrimental to the patient undergoing the procedure. This method has been relatively effective for implants ranging from total joint replacements to heart valves.

Avoiding a negative immune response has proven successful for some medical fields. However, more complex solutions involving a guided immune response might be necessary for engineered tissue implants to survive and take on the intended function. This issue of balancing biochemical inertness and tissue survival has led researchers to investigate the possibility of using the host immune response in an advantageous way for the success of the implant.34 This method of intentionally triggering surrounding natural tissue relies on the understanding that immune response is actually essential to tissue repair. While an inert biomaterial may be able to avoid a negative reaction, it will also discourage a positive reaction. Without provoking some sort of response to the new tissue, an implant will remain foreign to bordering tissue; this means that the cells cannot take on important functions, limiting the success of any biomaterial that has more than a mechanical use.

Current studies have focused primarily on modifying surface topography and chemistry to target a positive immune reaction in the cells surrounding the new tissue. One example is the grafting of oligopeptides onto the surface of an implant to stimulate macrophage response. This method ultimately leads to the release of growth factors and greater levels of cellular attachment because of the chemical signals involved in the natural immune response.35 Another study found that the use of a certain pore size in the scaffold material led to faster and more complete healing in an in vivo study using rabbits. Upon further investigation, it was found that the smaller pore size was interacting with macrophages involved in the triggered immune response; this interaction ultimately led more macrophages to differentiate into a regenerative pathway, leading to better and faster healing of the implant with the surrounding tissue.36 Similar studies have investigated the effect of methods such as attaching surface proteins with similarly enlightening results. These and other promising studies have led to an increased awareness of chemical signaling as a method to enhance biomaterial integration with larger implications including faster healing time and greater functionality.

Conclusion

The use of scaffolds for tissue engineering has been the subject of much research because of its potential for extensive utilization in the medical field. Recent advancements have focused on several areas, particularly the use of biomolecules, improved vascularization, increases in mechanical strength, and attachment to existing tissue. Advancements in each of these fields have been closely related to the use of scaffolding. Several biomolecules, especially growth factors, have led to a greater ability for tissue to adapt as an integrated part of the body after implantation. These growth factors rely on efficient means of delivery, notably through inclusion in the scaffold, in order to have an effect on the tissue. The development of new methods and refinement of existing ones has allowed researchers to successfully vascularize tissue on multiple types of scaffolds. Likewise, better methods of strengthening engineered tissue scaffolds before cell growth and implantation have allowed for improved functionality, especially under mechanical forces. Modifications to scaffolding and the addition of special molecules have allowed for increased cellular attachment, improving the efficacy of engineered tissue for implantation. Further advancement in each of these areas could lead to more effective scaffolds and the ability to successfully use engineered tissue for functional implants in medical treatments.

References

  1. “Tissue Engineering and Regenerative Medicine.” National Institute of Biomedical Imaging and Bioengineering. N.p., 22 July 2013. Web. 29 Oct. 2016.
  2. Haycock, John W. “3D Cell Culture: A Review of Current Approaches and Techniques.” 3D Cell Culture: Methods and Protocols. Ed. John W. Haycock. Totowa, NJ: Humana Press, 2011. 1–15. Web.
  3. Takahashi, Kazutoshi, and Shinya Yamanaka. “Induction of Pluripotent Stem Cells from Mouse Embryonic and Adult Fibroblast Cultures by Defined Factors.” Cell 126.4 (2006): 663–676. ScienceDirect. Web.
  4. Richardson, Thomas P. et al. “Polymeric System for Dual Growth Factor Delivery.” Nat Biotech 19.11 (2001): 1029–1034. Web.
  5. Liao, IC, SY Chew, and KW Leong. “Aligned Core–shell Nanofibers Delivering Bioactive Proteins.” Nanomedicine 1.4 (2006): 465–471. Print.
  6. Elliott Donaghue, Irja et al. “Cell and Biomolecule Delivery for Tissue Repair and Regeneration in the Central Nervous System.” Journal of Controlled Release: Official Journal of the Controlled Release Society 190 (2014): 219–227. PubMed. Web.
  7. Murphy, Ciara M., Matthew G. Haugh, and Fergal J. O’Brien. “The Effect of Mean Pore Size on Cell Attachment, Proliferation and Migration in Collagen–glycosaminoglycan Scaffolds for Bone Tissue Engineering.” Biomaterials 31.3 (2010): 461–466. Web.
  8. Sachlos, E., and J. T. Czernuszka. “Making Tissue Engineering Scaffolds Work. Review: The Application of Solid Freeform Fabrication Technology to the Production of Tissue Engineering Scaffolds.” European Cells & Materials 5 (2003): 29–40. Print.
  9. Chen, Guoping, Takashi Ushida, and Tetsuya Tateishi. “Scaffold Design for Tissue Engineering.” Macromolecular Bioscience 2.2 (2002): 67–77. Wiley Online Library. Web.
  10. Vacanti, Charles A. 2006. “The history of tissue engineering.” Journal of Cellular and Molecular Medicine 10 (3): 569-576.
  11. Depprich, Rita A. “Biomolecule Use in Tissue Engineering.” Fundamentals of Tissue Engineering and Regenerative Medicine. Ed. Ulrich Meyer et al. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. 121–135. Web.
  12. Laiho, Marikki, and Jorma Keski-Oja. “Growth Factors in the Regulation of Pericellular Proteolysis: A Review.” Cancer Research 49.10 (1989): 2533. Print.
  13. Morstyn, George, and Antony W. Burgess. “Hemopoietic Growth Factors: A Review.” Cancer Research 48.20 (1988): 5624. Print.
  14. Yancopoulos, George D. et al. “Vascular-Specific Growth Factors and Blood Vessel Formation.” Nature 407.6801 (2000): 242–248. Web.
  15. Aaronson, SA. “Growth Factors and Cancer.” Science 254.5035 (1991): 1146. Web.
  16. Lawrence, DA. “Transforming Growth Factor-Beta: A General Review.” European cytokine network 7.3 (1996): 363–374. Print.
  17. Mann, Brenda K, Rachael H Schmedlen, and Jennifer L West. “Tethered-TGF-β Increases Extracellular Matrix Production of Vascular Smooth Muscle Cells.” Biomaterials 22.5 (2001): 439–444. Web.
  18. Malafaya, Patrícia B., Gabriela A. Silva, and Rui L. Reis. “Natural–origin Polymers as Carriers and Scaffolds for Biomolecules and Cell Delivery in Tissue Engineering Applications.” Matrices and Scaffolds for Drug Delivery in Tissue Engineering 59.4–5 (2007): 207–233. Web.
  19. Sahoo, Sambit et al. “Growth Factor Delivery through Electrospun Nanofibers in Scaffolds for Tissue Engineering Applications.” Journal of Biomedical Materials Research Part A 93A.4 (2010): 1539–1550. Web.
  20. Novosel, Esther C., Claudia Kleinhans, and Petra J. Kluger. “Vascularization Is the Key Challenge in Tissue Engineering.” From Tissue Engineering to Regenerative Medicine- The Potential and the Pitfalls 63.4–5 (2011): 300–311. Web.
  21. Drury, Jeanie L., and David J. Mooney. “Hydrogels for Tissue Engineering: Scaffold Design Variables and Applications.” Synthesis of Biomimetic Polymers 24.24 (2003): 4337–4351. Web.
  22. Lafage-Proust, Marie-Helene et al. “Assessment of Bone Vascularization and Its Role in Bone Remodeling.” BoneKEy Rep 4 (2015): n. pag. Web.
  23. Türkvatan, Aysel et al. “Multidetector CT Angiography of Renal Vasculature: Normal Anatomy and Variants.” European Radiology 19.1 (2009): 236–244. Web.
  24. Miller, Jordan S. et al. “Rapid Casting of Patterned Vascular Networks for Perfusable Engineered Three-Dimensional Tissues.” Nature Materials 11.9 (2012): 768–774. www.nature.com. Web.
  25. Lovett, Michael et al. “Vascularization Strategies for Tissue Engineering.” Tissue Engineering. Part B, Reviews 15.3 (2009): 353–370. Web.
  26. Cox, Sophie C. et al. “3D Printing of Porous Hydroxyapatite Scaffolds Intended for Use in Bone Tissue Engineering Applications.” Materials Science and Engineering: C 47 (2015): 237–247. ScienceDirect. Web.
  27. Fielding, Gary A., Amit Bandyopadhyay, and Susmita Bose. “Effects of Silica and Zinc Oxide Doping on Mechanical and Biological Properties of 3D Printed Tricalcium Phosphate Tissue Engineering Scaffolds.” Dental Materials 28.2 (2012): 113–122. ScienceDirect. Web.
  28. Engler, Adam J. et al. “Matrix Elasticity Directs Stem Cell Lineage Specification.” Cell 126.4 (2006): 677–689. ScienceDirect. Web.
  29. Bilodeau, Katia, and Diego Mantovani. “Bioreactors for Tissue Engineering: Focus on Mechanical Constraints. A Comparative Review.” Tissue Engineering 12.8 (2006): 2367–2383. online.liebertpub.com (Atypon). Web.
  30. Sommer, Marianne R. et al. “Silk Fibroin Scaffolds with Inverse Opal Structure for Bone Tissue Engineering.” Journal of Biomedical Materials Research Part B: Applied Biomaterials (2016): n/a-n/a. Wiley Online Library. Web.
  31. Sapir-Koren, Rony, and Gregory Livshits. “Bone Mineralization and Regulation of Phosphate Homeostasis.” IBMS BoneKEy 8.6 (2011): 286–300. www.nature.com. Web.
  32. Boehler, Ryan M., John G. Graham, and Lonnie D. Shea. “Tissue Engineering Tools for Modulation of the Immune Response.” BioTechniques 51.4 (2011): 239–passim. PubMed Central. Web.
  33. Follet, H. et al. “The Degree of Mineralization Is a Determinant of Bone Strength: A Study on Human Calcanei.” Bone 34.5 (2004): 783–789. PubMed. Web.
  34. Franz, Sandra et al. “Immune Responses to Implants – A Review of the Implications for the Design of Immunomodulatory Biomaterials.” Biomaterials 32.28 (2011): 6692–6709. ScienceDirect. Web.
  35. Kao, Weiyuan John, and Damian Lee. “In Vivo Modulation of Host Response and Macrophage Behavior by Polymer Networks Grafted with Fibronectin-Derived Biomimetic Oligopeptides: The Role of RGD and PHSRN Domains.” Biomaterials 22.21 (2001): 2901–2909. ScienceDirect. Web.
  36. Bryers, James D, Cecilia M Giachelli, and Buddy D Ratner. “Engineering Biomaterials to Integrate and Heal: The Biocompatibility Paradigm Shifts.” Biotechnology and bioengineering 109.8 (2012): 1898–1911. Web.

Transplanting Time

Nowadays, it is possible for patients with organ failure to live for decades after receiving an organ transplant. Since the first successful kidney transplant in the 1950s,1,2 advances in the procedure, including the improvement of drugs that facilitate acceptance of the foreign body parts,3 have allowed surgeons to transplant a wider variety of organs, such as the heart, lungs, liver, and pancreas.2,4 Over 750,000 lives have been saved and extended through the use of organ transplants, an unthinkable feat just over 50 years ago.2 Limitations to organ transplantation, such as the lack of available organs, and the development of new advancements that can improve the process promote ongoing discussion regarding the ethics of transplants.

The idea behind an organ transplant is simple. When both the recipient and the new organ are ready, surgeons detach the blood vessels attached to the failing organ before putting the new one in its place by reattaching the patient’s blood vessels to the functioning organ. To prevent rejection of the new organ, the recipient will continue to take immunosuppressant drugs.3 In exchange for this lifelong commitment, the patient often receives a longer, more enjoyable life.2

The organs used in transplants usually originate from a cadaver or a living donor.1-3 Some individuals are deterred from becoming an organ donor because they are concerned that doctors will not do their best to save them if their organs are needed. This concern is further complicated by blurred definitions of “dead”; in one ethically ambiguous situation, dying patients who are brain dead may be taken off of life support so that their organs may be donated.1-3 Stories of patients who reawaken from comas after being pronounced “dead” may give some encouragement, but a patient’s family and doctors must decide when to give up that hope. Aside from organs received from the deceased, living donors, who may be family, friends, or strangers to the recipient, may donate organs that they can live without, such as a lung or a kidney.1-3 However, potentially injuring a healthy person for the sake of another may contradict the oath that doctors take, which instructs physicians to help, not harm, their patients.1

One of the most pressing issues today stems from the following question: who receives the organs? The transplant waiting list is constantly growing because the number of organs needed greatly exceeds the number of organs that are available.1-3 Unfortunately, 22 patients die every day while they are waiting for a new organ.4 Because the issue of receiving a transplant is time-sensitive, medical officials must decide who receives a transplant first. Should the person who needs a transplant the most have greater priority over another who has been on the waiting list longer? Should a child be eligible before a senior? Should a lifelong smoker be able to obtain a new lung? Currently, national policy takes different factors into account depending on the organ to be transplanted. For example, other than compatibility requirements, patients on the waiting list for liver transplants are ranked solely on their medical need and distance from the donor hospital.4 On the other hand, people waiting for kidneys are further considered based on whether they have donated a kidney previously, their age, and their time spent on the waiting list.4
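As a purely hypothetical illustration of how a multi-factor ranking can work in principle, the short Python sketch below sorts a toy waiting list by urgency, then time waited, then distance. The fields, ordering rules, and numbers are invented for illustration and do not represent the actual OPTN/UNOS allocation policies for any organ.

```python
# Toy illustration of multi-factor ranking on a hypothetical waiting list.
# Fields, ordering rules, and values are invented; this is NOT the real
# OPTN/UNOS allocation algorithm.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    medical_urgency: int   # higher = more urgent, on a made-up scale
    days_waiting: int      # time already spent on the list
    distance_km: float     # distance from the donor hospital

def ranking_key(c: Candidate):
    # Most urgent first; ties broken by longer wait, then shorter distance.
    return (-c.medical_urgency, -c.days_waiting, c.distance_km)

waitlist = [
    Candidate("Patient A", medical_urgency=3, days_waiting=400, distance_km=120),
    Candidate("Patient B", medical_urgency=5, days_waiting=90,  distance_km=300),
    Candidate("Patient C", medical_urgency=5, days_waiting=150, distance_km=80),
]

for c in sorted(waitlist, key=ranking_key):
    print(f"{c.name}: urgency={c.medical_urgency}, "
          f"waited={c.days_waiting} d, distance={c.distance_km} km")
# Under these invented rules, Patient C is offered the organ first.
```

Real allocation policies also weigh factors noted above, such as compatibility requirements and, for kidneys, prior living donation, which is why the rules differ from organ to organ.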

Despite various efforts to increase the number of organ donors through education and legislation, the supply of organs does not meet the current and increasing need for them.1-3 As a result, other methods of obtaining these precious resources are currently being developed, one of which is the use of animal organs, a process known as xenotransplantation. Different animal cells, tissues, and organs are being researched for use in humans, giving some hope to those on the waiting list or those who do not quite qualify for a transplant.2,3 In the past, surgeons have attempted to use a primate’s heart and liver for transplantation, but the surgical outcomes were poor.2 Other applications of animal tissue are more promising, such as the use of pigs’ islet cells, which can produce insulin, in humans.2 However, a considerable risk of using these animal parts is that new diseases may be passed from animal to human. Additionally, animal rights groups have protested the use of primates as a source of whole organs.2

Another possible solution to the deficit of organs is the use of stem cells, which have the potential to grow and specialize. Embryonic stem cells can repair and regenerate damaged organs, but harvesting them destroys the source embryo.2,3 Although the embryos are created outside of humans, there are objections to their use. What differentiates a mass of cells from a living person? Fortunately, adult stem cells can be used for treatment as well.2 Researchers have developed a new method that causes adult stem cells to return to a state similar to that of the embryonic stem cells, although the efficacy of the induced adult stem cells compared to the embryonic stem cells is still unclear.7

Regardless of the continuous controversy over the ethics of transplantation, the boundaries for organ transplants are being pushed further and further. Head transplants have been attempted for over a century in other animals, such as dogs,5 but several doctors want to move on to work with humans. To attach a head to a new body, the surgeon would need to connect the old and new nerves in the spinal cord so that the patient’s brain could interact with the host body. Progress is already being made in repairing severe spinal cord injuries. In China, Dr. Ren Xiaoping plans to attempt a complete body transplant, believed by some to be currently impossible.6 There is not much information about the amount of pain that the recipient of a body transplant must endure,5 so it may ultimately decrease, rather than increase, the patient’s quality of life. Overall, most agree that it would be unethical to continue, considering the limited success of such projects and the high chance of failure and death.

Organ transplants and new developments in the field have raised many interesting questions about the ethics of the organ transplantation process. As a society, we should determine how to address these problems and set boundaries to decide what is “right.”

References

  1. Jonsen, A. R. Virtual Mentor. 2012, 14, 264-268.
  2. Abouna, G. M. Med. Princ. Prac. 2002, 12, 54-69.
  3. Paul, B. et al. Ethics of Organ Transplantation. University of Minnesota Center for Bioethics [Online], February 2004 http://www.ahc.umn.edu/img/assets/26104/Organ_Transplantation.pdf (accessed Nov. 4, 2016)
  4. Organ Procurement and Transplantation Network. https://optn.transplant.hrsa.gov/ (accessed Nov. 4 2016)
  5. Lamba, N. et al. Acta Neurochirurgica. 2016.
  6. Tatlow, D. K. Doctor’s Plan for Full-Body Transplants Raises Doubts Even in Daring China. The New York Times. http://www.nytimes.com/2016/06/12/world/asia/china-body-transplant.html?_r=0 (accessed Nov. 4, 2016)
  7. National Institutes of Health. stemcells.nih.gov/info/basics/6.htm (accessed Jan. 23, 2017)

The Fight Against Neurodegeneration

“You know that it will be a big change, but you really don’t have a clue about your future.” A 34-year-old postdoctoral researcher at the Telethon Institute of Genetics and Medicine in Italy at the time, Dr. Sardiello had made a discovery that would change his life forever. Eight years later, Dr. Sardiello is now the principal investigator of a lab in the Jan and Dan Duncan Neurological Research Institute (NRI), where he continues the work that brought him and his lab to America.

Throughout his undergraduate career, Sardiello knew he wanted to be involved in some manner with biology and genetics research, but his passion was truly revealed in 2000: the year he began his doctoral studies. It was during this year that the full DNA sequence of the common fruit fly was released, which constituted the first ever complete genome of a complex organism. At the time, Sardiello was working in a lab that used fruit flies as a model, and this discovery served to spur his interest in genetics. As the golden age of genetics began, so did Sardiello’s love for the subject, leading to his completion of a PhD in Genetic and Molecular Evolution at the Telethon Institute of Genetics and Medicine. It was at this institute that his team made the discovery that would bring him to America: the function of Transcription Factor EB, colloquially known as TFEB.

Many knew of the existence of TFEB, but no one knew of its function. Dr. Sardiello and his team changed that. In 2009, they discovered that the gene is the master regulator for lysosomal biogenesis and function. In other words, TFEB works as a genetic switch that turns on the production of new lysosomes, an exciting discovery.1 Before the discovery of TFEB’s function, lysosomes were commonly known as the incinerator or the garbage can of the cell, as these organelles were thought to be essentially specialized containers that get rid of cellular waste. However, with the discovery of TFEB’s function, we now know that lysosomes have a much more active role in catabolic pathways and the maintenance of cell homeostasis. Sardiello’s groundbreaking findings were published in Science, one of the most prestigious peer reviewed journals in the scientific world. Speaking about his success, Sardiello said, “The bottom line was that there was some sort of feeling that a big change was about to come, but we didn’t have a clue what. There was just no possible measure at the time.”

Riding the success of his paper, Sardiello moved to the United States and established his own lab with the purpose of defeating the family of diseases known as Neuronal Ceroid Lipofuscinosis (NCLs). NCLs are genetic diseases caused by the malfunction of lysosomes. This malfunction causes waste to accumulate in the cell and eventually block cell function, leading to cell death. While NCLs cause cell death throughout the body, certain specialized cells such as neurons do not regenerate. Therefore, NCLs are generally neurodegenerative diseases. While there are many variants of NCLs, they all result in premature death after loss of neural functions such as sight, motor ability, and memory.

“With current technology,” Sardiello said, “the disease is incurable, since it is genetic. In order to cure a genetic disease, you have to somehow bring the correct gene into every single cell of the body.” With our current understanding of biology, this is impossible. Instead, doctors can work to treat the disease, and halt the progress of the symptoms. Essentially, his lab has found a way using TFEB to enhance the function of the lysosomes in order to fight the progress of the NCL diseases.

In addition to genetic enhancement, Sardiello is also focusing on finding drugs that will activate TFEB and thereby increase lysosomal function. To test these new methods, the Sardiello lab uses mouse models that encapsulate most of the symptoms in NCL patients. “Our current results indicate that drug therapy for NCLs is viable, and we are working to incorporate these strategies into clinical therapy,” Sardiello said. So far the lab has identified three different drugs or drug combinations that may be viable for treatment of this incurable disease.

While it might be easy to talk about NCLs and other diseases in terms of their definitions and effects, it is important to realize that behind every disease are real people and real patients. The goal of the Sardiello Lab is not just to do science and advance humanity, but also to help patients and give them hope. One such patient is a boy named Will Herndon. Will was diagnosed with NCL type 3, and his story is one of resilience, strength, and hope.

When Will was diagnosed with Batten disease at the age of six, the doctors informed him and his family that there was little they could do. At the time, there was little to no viable research being done in the field. However, despite being faced with a terminal illness, Will and his parents never lost sight of what was most important: hope. While others might have given up, Missy and Wayne Herndon instead founded The Will Herndon Research Fund - also known as HOPE - in 2009, playing a large role in bringing Dr. Sardiello and his lab to the United States. Each year, the foundation holds a fundraiser to raise awareness and money that goes towards defeating the NCL diseases. Upon its inception, the fundraiser had only a couple hundred attendees; now, only half a decade later, thousands of like-minded people arrive each year to support Will and others with the same disease. “Failure is not an option,” Missy Herndon said forcefully during the 2016 banquet. “Not for Will, and not for any other child with Batten disease.” It was clear from the strength of her words that she believed in the science, and that she believed in the research.

“I have a newborn son,” Sardiello said, recalling the speech. “I can’t imagine going through what Missy and Wayne had to. I felt involved and I felt empathy, but most of all, I felt respect for Will’s parents. They are truly exceptional people who go far beyond what anyone could expect of them. In the face of adversity, they are tireless, they won’t stop, and their commitment is amazing.”

When one hears about science and labs, it usually brings to mind arrays of test tubes and flasks or the futuristic possibilities of science. In all of this, one tends to forget about the people behind the test bench: the scientists that conduct the experiments and uncover the next step in the collective knowledge of humanity, people like Dr. Sardiello. However, Sardiello isn’t alone in his endeavors, as he is supported by the members of his lab.

Each and every one of the researchers in Marco’s lab is an international citizen, hailing from at least four different countries to work towards a common cause: Parisa Lombardi from Iran; Lakshya Bajaj, Jaiprakash Sharma, and Pal Rituraj from India; Abdallah Amawi from Jordan; and, of course, Marco Sardiello and Alberto di Ronza from Italy. Despite the vast distances in both geography and culture, the chemistry among the team was palpable, and while how each of them got to America varied, the conviction that they had a responsibility to help other people and defeat disease was always the same.

Humans have always been predisposed to move forwards. It is because of this propensity that humans have been able to eradicate disease and change the environments that surround us. However, behind all of our achievements lies scientific advancement, and behind it are the people that we so often forget. Science shouldn’t be detached from the humans working to advance it, but rather integrated with the men and women working to make the world a better place. Dr. Sardiello and his lab represent the constant innovation and curiosity of the research community, ideals that are validated in the courage of Will Herndon and his family. In many ways, the Sardiello lab embodies what science truly represents: humans working for something far greater than themselves.

References

  1. Sardiello, M.; Palmieri, M.; di Ronza, A.; Medina, D.L.; Valenza, M.; Alessandro, V. Science. 2009, 325, 473-477.

Tactile Literacy: The Lasting Importance of Braille

On June 27th, 1880, a baby girl was born. At nineteen months old, the little girl contracted a severe fever, and once the fever dissipated, she woke up to a world of darkness and silence. This little girl was Helen Keller. By the age of two, Helen Keller had completely lost her sense of sight and hearing.

Over a century later, it is estimated that 285 million people are visually impaired worldwide, of whom 39 million are blind.1 Blindness is defined as having corrected vision of 20/200 or worse, up to and including the complete inability to see.2 For Keller to absorb the information around her, she relied on the sensation of touch. The invention of the braille alphabet by Frenchman Louis Braille in the early 1800s allowed Keller to learn about the world and to communicate with others. Like Keller, the majority of the visually impaired today rely on braille as their main method of reading.

The technological advances of smartphones, artificial intelligence, and synthetic speech dictations have opened a whole new world for blind readers. With the advent of the electronic information age, it’s easy to think that blind people don’t need to rely on braille anymore to access information. In fact, braille literacy rates for school-age blind children have already declined from 50 percent 40 years ago to only 12 percent today.3 While current low literacy rates may be in part due to the inclusion of students with multiple disabilities that inhibit language acquisition, these statistics still reveal a major concern about literacy amongst the visually impaired. To substitute synthetic speech for reading and writing devalues the importance of learning braille.

“There are many misunderstandings and stereotypes of braille readers,” says Dr. Robert Englebretson, Professor of Linguistics at Rice University. “When a person reads, they learn about spelling and punctuation, and it’s the exact same for tactile readers. Humans better process information when they actively process it through reading instead of passively listening.”

Dr. Englebretson is also blind, and one part of his research agenda is a collaborative project with Dr. Simon Fischer-Baum in Psychology that pertains to understanding the cognitive and linguistic importance of braille to braille readers. He explores questions surrounding the nature of perception and reading, and the ways the mind groups the input of touch into larger pieces to form words.

In order to understand how written language is processed by tactile readers compared to visual readers, Dr. Englebretson conducted experiments to find out if braille readers exhibit an understanding of sublexical structures, or parts of words, similar to that of visual readers. An understanding of sublexical structures is crucial in recognizing letter groupings and acquiring reading fluency. Visual readers recognize sublexical structures automatically as the eye scans over words, whereas tactile readers rely on serially scanning fingers across a line of text.

To explore whether blind readers have this understanding, Dr. Englebretson measured braille readers' reaction times as an index of their sensitivity to word structure. Subjects were asked to judge whether each stimulus was a real word or a pseudoword, and the time taken to make each decision was recorded. The first experiment tested braille readers' ability to identify digraphs, or multi-letter combinations within words; the second tested their ability to identify morphemes, the smallest units of meaning or grammatical function in a word. Dr. Englebretson and his team developed a foot-pedal system that let braille readers indicate their answers without pausing to click a screen, as the visual readers did, so that their hands could stay on the text while reading. The reaction times recorded when braille readers encountered morphologically complex words provide evidence that they process the meaning of words and recognize these digraphs and morphemes.4
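
The logic of this real-word-versus-pseudoword task can be made concrete with a small sketch. The code below is purely illustrative: the trial data, field names, and timing values are hypothetical, not the study's actual materials or analysis pipeline. It simply compares mean reaction times for correctly answered real words versus pseudowords.

```python
# Illustrative sketch only: hypothetical lexical-decision trials,
# not the materials or analysis used in the study.
from statistics import mean

trials = [
    {"stimulus": "unhappy", "is_real_word": True,  "correct": True, "rt_ms": 612},
    {"stimulus": "relark",  "is_real_word": False, "correct": True, "rt_ms": 745},
    {"stimulus": "teacher", "is_real_word": True,  "correct": True, "rt_ms": 598},
    {"stimulus": "plonter", "is_real_word": False, "correct": True, "rt_ms": 801},
]

def mean_rt(trials, real_word):
    """Average reaction time (ms) over correct responses for one condition."""
    times = [t["rt_ms"] for t in trials
             if t["correct"] and t["is_real_word"] == real_word]
    return mean(times) if times else float("nan")

# Faster responses to real words than to pseudowords would suggest that
# readers are recognizing familiar sublexical structure, not just letters.
print("real words:  ", mean_rt(trials, True), "ms")
print("pseudowords: ", mean_rt(trials, False), "ms")
```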

“What we discovered was that tactile readers do rely on sublexical structures and have similar cognitive processes to print readers,” says Dr. Englebretson. “The belief that braille is old-fashioned and not needed anymore is far from the truth. Tactile reading provides an advantage in learning just as visual reading does.”

Dr. Englebretson also gathered a large sample of braille readers and videotaped them reading using a finger-tracking system. Much as an eye-tracking system follows eye movements, the finger-tracking system uses a camera to follow LED lights placed on the backs of the fingernails, and the LEDs' x-y coordinates are plotted over time. The system can track where each finger is, how fast it is moving, and what it does during regressions, the right-to-left movements a finger makes to re-read part of a line.5 While this test was independent of the experiment on sublexical structures, the data collected offer researchers a paradigm for studying braille reading.
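
To illustrate what such tracking data might look like, here is a minimal sketch. It assumes a simplified data format, a single LED's x-coordinates sampled over time, rather than the lab's actual software, and flags regressions as spans where the finger moves back toward the left by more than a chosen threshold.

```python
# Illustrative sketch only: a simplified stand-in for finger-tracking output,
# not the lab's actual tracking software or data format.
def find_regressions(x_positions, threshold=5.0):
    """Return (start, end) index pairs where x decreases by more than
    `threshold` units, i.e., the finger moves back toward the left."""
    spans = []
    start = None
    for i in range(1, len(x_positions)):
        moving_left = x_positions[i] < x_positions[i - 1]
        if moving_left and start is None:
            start = i - 1                       # a leftward run begins
        elif not moving_left and start is not None:
            if x_positions[start] - x_positions[i - 1] > threshold:
                spans.append((start, i - 1))    # large enough to count
            start = None
    if start is not None and x_positions[start] - x_positions[-1] > threshold:
        spans.append((start, len(x_positions) - 1))
    return spans

# Hypothetical trace: steady left-to-right reading with one re-reading movement.
trace = [0, 4, 8, 12, 16, 20, 14, 9, 13, 18, 24, 30]
print(find_regressions(trace))  # [(5, 7)]
```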

The outcome of these studies has not only scientific and academic implications, but also important social implications. “At the scientific level, we now better understand how perception [of written language] works, how the brain organizes and processes written language, and how reading works for tactile and visual readers,” says Dr. Englebretson. “Through understanding how tactile readers read, we will hopefully be able to implement policy on how teachers of blind and visually impaired students teach, and on how to guide the people who are working on updating and maintaining braille.”

With braille literacy rates declining, an evidence-based approach to teaching braille is as critical as continuing to implement braille literacy programs. With an understanding of braille, someone who is blind can not only access almost infinite pages of literature, but also make better sense of their language and world.

References

  1. World Health Organization. http://www.who.int/mediacentre/factsheets/fs282/en/ (accessed Jan. 9, 2017).
  2. National Federation of the Blind. https://nfb.org/blindness-statistics (accessed Jan. 9, 2017).
  3. National Braille Press. https://www.nbp.org/ic/nbp/braille/needforbraille.html (accessed Jan. 10, 2017).
  4. Fischer-Baum, S.; Englebretson, R. Cognition. 2016. http://www.sciencedirect.com/science/article/pii/S0010027716300762 (accessed Jan. 10, 2017).
  5. Ulusoy, M.; Sipahi, R. PLoS ONE. 2016, 11. http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0148356 (accessed Jan. 10, 2017).

 

The Secret Behind Social Stigma

How do you accurately quantify something as subjective and controversial as discrimination? What about stigma, a superficial mark imposed upon a particular group of individuals? How do you attempt to validate what is seemingly invisible? Dr. Michelle “Mikki” Hebl and her team in the Industrial/Organizational (I/O) branch of social psychology at Rice University attempt to answer these questions.

In the world of social psychology, where human interactions are often unpredictable, researchers must get creative to control variables as much as possible while still mimicking real-life situations. Dr. Hebl integrates laboratory procedures with field studies that use standardized materials. “My research is fairly novel,” she notes. Unlike the majority of existing stigma and discrimination research, which depends on self-reported assessments, her studies examine real, non-simulated social interactions. Although her approach provides more realistic and less biased settings, “it’s messier,” she adds, laughing about the many trials discarded due to uncontrollable circumstances. That attitude of optimism, determination, and creativity is one Dr. Hebl holds proudly, and it is clear that her lab’s overall mission, to reduce discrimination and increase equity, is worth undertaking.

Dr. Hebl and her team focus on a form of behavior they call “interpersonal discrimination,” a type of discrimination that occurs implicitly while still shaping the impressions we form and the decisions we make.1 This kind of bias, rooted in stereotypes and negative social stigma, is far more subtle than the more well-known, explicit forms of discrimination. For example, in a field study evaluating bias against homosexual applicants in Texas, Dr. Hebl found that members of the experimental and control groups, who wore hats reading “Gay and Proud” and “Texan and Proud” respectively, did not experience formal bias when entering stores to seek employment; none of the subjects, for instance, were denied job applications. What she did find was a pattern of interpersonal reactions against the experimental group. Discreet recording devices worn by the subjects revealed shorter interactions and fewer words per sentence directed at the stigmatized group, and their self-reports indicated, on average, higher perceived negativity and lower perceived employer interest.1 In another study evaluating obesity-related stigma, results showed that obese individuals, in this case subjects wearing obese prosthetic suits, experienced similarly negative interactions.2
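
One of the measures mentioned above, words per sentence, is simple enough to sketch. The snippet below is purely illustrative: the transcripts are invented and the sentence-splitting rule is far cruder than the study's actual coding, but it shows how a recorded interaction could be reduced to that kind of number.

```python
# Illustrative sketch only: invented transcripts and a crude splitting rule,
# not the coding scheme used in the published studies.
import re

def words_per_sentence(transcript: str) -> float:
    """Split on sentence-ending punctuation and average the word counts."""
    sentences = [s for s in re.split(r"[.!?]+", transcript) if s.strip()]
    if not sentences:
        return 0.0
    return sum(len(s.split()) for s in sentences) / len(sentences)

# Hypothetical exchanges between a store employee and an applicant.
control = "Sure, here is an application. Let me know if you have any questions about the role."
experimental = "Here. We're not hiring."

print(words_per_sentence(control))       # longer, warmer replies
print(words_per_sentence(experimental))  # terser replies
```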

While many of her studies evaluated biases in seeking employment, Dr. Hebl has also explored the presence of interpersonal discrimination against lesser-known stigmatized groups. One surprising finding indicated negative stigmatization of cancer survivors.3 In other studies, the team found patterns relating to stereotypicality; this relatively new line of research examines the lessened interpersonal discrimination experienced by those who deviate from the stereotypical prototype of their minority group, e.g., a lighter-skinned Hispanic man.4 A holistic review of her research reveals a pattern of implicit discrimination against stigmatized groups. Once researchers like Dr. Hebl find these patterns, they can investigate them in the lab by further isolating variables to develop more refined and widely applicable conclusions.

What can make more subtle forms of bias so detrimental is the ambiguity surrounding them. When someone discriminates against another in a clear and explicit form, one can easily attribute the behavior to the person’s biases. On the other hand, when this bias is perceived in the form of qualitative behavior, such as shortened conversations and body language, it raises questions regarding the person’s intentions. In these cases, the victim often internalizes the negative treatment, questioning the effect of traits that they cannot control—be it race, sexual orientation, or physical appearance. This degree of uncertainty raises conflict and tension between differing groups, thus potentially hindering progress in today’s increasingly diverse workplaces, schools, and universities.5

Dr. Hebl knew that exploring the presence of this tension between individuals was only the first step. “One of the most exciting aspects of social psychology is that just learning about these things makes you inoculated against them,” she said. Thus emerges the search for practical solutions involving education and the reform of conventional workplace practices. Her current work looks at three primary methods. The first is acknowledging biases on an individual level; this strategy involves individuation, or the recognition of one’s own stigma and subsequent compensation for it.6 The second involves implementing organizational measures in the workplace, such as providing support for stigmatized groups and awareness training.7 The third, which has the most transformative potential, is the use of research to support reform of policies that could protect these individuals.

“I won't rest…until we have equity,” she affirmed when asked about the future of her work. For Dr. Hebl, the ultimate goal is education and change. Human interactions are incredibly complex, unpredictable, and difficult to quantify, but they influence our daily decisions and actions, ultimately shaping how we view ourselves and others. Social psychology research suggests that biases, whether we realize it or not, are involved in the choices we make every day, from whom we decide to speak with to whom we decide to work with. Dr. Hebl saw this and decided to do something about it. Her work traces these disparities to their complex source and suggests that understanding their foundations can lead to real, desirable change.

References

  1. Hebl, M. R.; Foster, J. B.; Mannix, L. M.; Dovidio, J. F. Pers. Soc. Psychol. B. 2002, 28 (6), 815–825.
  2. Hebl, M. R.; Mannix, L. M. Pers. Soc. Psychol. B. 2003, 29 (1), 28–38.
  3. Martinez, L. R.; White, C. D.; Shapiro, J. R.; Hebl, M. R. J. Appl. Psychol. 2016, 101 (1), 122–128.
  4. Hebl, M. R.; Williams, M. J.; Sundermann, J. M.; Kell, H. J.; Davies, P. G. J. Exp. Soc. Psychol. 2012, 48 (6), 1329–1335.
  5. Szymanski, D. M.; Gupta, A. J. Couns. Psychol. 2009, 56 (2), 300–300.
  6. Singletary, S. L.; Hebl, M. R. J. Appl. Psychol. 2009, 94 (3), 797–805.
  7. Martinez, L. R.; Ruggs, E. N.; Sabat, I. E.; Hebl, M. R.; Binggeli, S. J. Bus. Psychol. 2013, 28 (4), 455–466.

    

 

The Depressive Aftermath of Brain Injury

One intuitively knows that experiencing a brain injury is often painful and terrifying; that it can also lead to the onset of depression is a lesser known but equally serious concern. Dr. Roberta Diddel, a clinical psychologist and member of the adjunct faculty in the Psychology Department at Rice University, focuses on treating individuals with mental health issues and cognitive disorders. In particular, she cares for patients with cognitive disorders due to traumatic brain injury (TBI). Dr. Diddel earned a PhD in clinical psychology from Boston University and currently runs a private practice in Houston, Texas. Because patients with TBI often experience depression, she uses her understanding of how the disorder arises to create and administer potential treatments.

Traumatic brain injury affects each patient differently depending on which region of the brain is damaged. If a patient has a cerebellar stroke, affecting the region of the brain that regulates voluntary motor movements, he or she might experience dizziness and have trouble walking. That same patient, however, would still be able to take a written test, because the injury has not affected higher-order cognitive functions such as language processing and critical reasoning.

Dr. Diddel said, “Where you see depression the most is when there is a more global injury, meaning it has affected a lot of the brain. For example, if you hit your forehead in a car accident or playing a sport, you’re going to have an injury to the front and back parts of your brain because your brain is sitting in cerebrospinal fluid, causing a whiplash of sorts. In turn, this injury will cause damage to your frontal cortex, responsible for thought processing and problem solving, and your visual cortex, located in the back of your brain. When your brain is bouncing around like that, you often have swelling which creates intracranial pressure. Too much of this pressure prevents the flow of oxygen-rich blood to the brain. That can cause more diffuse brain injury.”

In cases of severe brain injury, such as head trauma from an explosion or a bullet, surgeons may remove blood clots that have formed in order to relieve intracranial pressure, and may repair skull fractures.4 They may also remove a section of the skull for weeks or months at a time to let the brain swell without being constrained by the small cranial cavity. That procedure alone significantly reduces the damage from such injuries and is especially useful on the battlefield, where urgent care trauma centers may not be available.

Depression is a common result of TBI. The Diagnostic and Statistical Manual of Mental Disorders (DSM) defines depression as a loss of interest or pleasure in daily activities lasting more than two weeks, accompanied by a change in mood and impaired function in society.1 These symptoms arise from biochemical deficiencies in the brain that disrupt the nervous system. Depression usually involves physical changes in the prefrontal cortex, the area of the brain associated with decision-making, social behavior, and personality. People with depression feel overwhelmed and anxious, lose their appetite, and lack energy, often because of depleted serotonin levels. The disorder is a mixture of chemical imbalance and state of mind; if the brain is not functioning correctly, a depressed state of mind will follow.

Dr. Diddel noted that many of her depressed patients lack the motivation to address and improve their toxic mindset. “If you’re really feeling bad about your current situation, you have to be able to say ‘I can’t give in to this. I have to get up and better myself and my surroundings.’ People that are depressed are struggling to do that,” she said.

The causes of depression vary from patient to patient and often depend on genetic predisposition to the disease. Depression can arise from physical changes in the brain, such as alterations in the levels of catecholamines, neurotransmitters that act throughout the sympathetic and central nervous systems. The catecholamines include dopamine, norepinephrine, and epinephrine; these, along with the related neurotransmitter serotonin, are released during times of positive stimulation and help increase activity in specific parts of the brain. A decrease in these chemicals after an injury can affect emotion and thought processes. Emotionally, the patient might have a hard time dealing with a new disability or a change in societal role brought on by the trauma. Additionally, patients who carried genes predisposing them to depression before the injury are more prone to the disorder afterward.2,3

Depression is usually treated with some form of therapy or antidepressant medication. In cognitive behavioral therapy (CBT), the psychologist tries to change the perceptions and behaviors that exacerbate a patient’s depression. Generally, the doctor starts by attempting to change the patient’s behavior, because it is the aspect of his or her current situation that can be most directly changed. Dr. Diddel suggests such practices to her patients, saying things like “I know you don’t feel like it, but I want you to go out and walk everyday.” Walking or any other form of exercise increases catecholamine and serotonin activity in the brain, which improves the patient’s mood. People who exercise as part of their treatment regimen are also less likely to experience another episode of depression.

The efficacy of antidepressant medication varies from patient to patient depending on the severity of the depression. People with mild to moderate depression generally respond better to CBT, because the treatment aims to change their mindset and how they perceive the world around them. CBT can gradually resolve a patient’s depression as he or she perceives surrounding stimuli differently, gets out and moves more, and pursues healthy endeavors. Psychologists usually begin with CBT and add medication if the patient does not respond well. Some medications increase serotonin levels, while others target serotonin, dopamine, and norepinephrine together; in either case, they boost the levels of neurotransmitters that increase arousal and dampen negative emotions. Patients with moderate to severe depression usually respond better to antidepressant medication, which can restore ideal levels of neurotransmitters and, in turn, encourage the patient to practice healthier behavior.

According to the Centers for Disease Control and Prevention, the US saw about 2.5 million cases of traumatic brain injury in 2010 alone.5 That number rises every year, and with it the number of patients who suffer from depression in the aftermath.5 Though the disorder has been studied for decades and treatment options and medications are available, depression remains an enigma to physicians and researchers alike. No two brains are wired the same way, making it very difficult to devise a treatment plan with a guaranteed success rate. The work of researchers and clinical psychologists like Dr. Diddel, however, aims to improve the treatments currently available. While no two patients are the same, understanding each individual’s depression and tailoring treatment to the specific case can vastly improve the patient’s outcome.

References

  1. American Psychiatric Association. Diagnostic and statistical manual of mental disorders (5th ed.). Washington, DC, 2013.
  2. Fann, J. Depression After Traumatic Brain Injury. Model Systems Knowledge Translation Center [Online]. http://www.msktc.org/tbi/factsheets/Depression-After-Traumatic-Brain-Injury (accessed Dec. 28, 2016).
  3. Fann, J. R.; Hart, T.; Schomer, K. G. J. Neurotrauma. 2009, 26, 2383-2402.
  4. Mayo Clinic Staff. Traumatic Brain Injury. Mayo Clinic, May 15, 2014. http://www.mayoclinic.org/diseases-conditions/traumatic-brain-injury/basics/treatment/con-20029302 (accessed Dec. 29, 2016).
  5. Injury Prevention and Control. Centers for Disease Control and Prevention. https://www.cdc.gov/traumaticbraininjury/get_the_facts.html (accessed Dec. 29, 2016).
