A Fourth Neutrino? Explaining the Anomalies of Particle Physics

Abstract

Early neutrino experiments established that neutrinos exist in three flavors and can oscillate between those flavors as they travel through space. However, many recent experiments have collected anomalous data that contradict the three-flavor hypothesis, suggesting instead that there may exist a fourth neutrino, called the sterile neutrino, which interacts solely through the gravitational force. While there is no conclusive evidence proving the existence of a fourth neutrino flavor, scientists designed the IceCube laboratory at the South Pole to search for this newly hypothesized particle. Due to its immense size and sensitivity, IceCube stands as the neutrino laboratory most capable of corroborating or refuting the existence of these particles.

Introduction

Neutrinos are ubiquitous, subatomic elementary particles that are produced in a variety of ways. Some are produced by collisions between particles in the atmosphere, while others result from the radioactive decay of heavier nuclei.1,3 Neutrinos are thought to play a role in the interactions between matter and antimatter; furthermore, they are thought to have significantly influenced the formation of the universe.3 Thus, neutrinos are of paramount interest in particle physics, with the potential to expand our understanding of the universe. When they were first posited, neutrinos were thought to have no mass because they have so little impact on the matter around them. Decades later, however, it was determined that they do have mass but interact with other matter only through the weak nuclear force and gravity.2

Early neutrino experiments found that the measured flux of neutrinos from the sun was roughly one third of the predicted value. Coupled with other neutrino experiments, these observations gave rise to the notions of neutrino flavors and neutrino flavor oscillations. There are three flavors of the standard neutrino: electron (νe), muon (νμ), and tau (ντ). Each neutrino is a decay product that is produced alongside its namesake particle; for example, a νe is produced alongside an electron during the decay process.9 Neutrino oscillation was proposed to explain these results: if a neutrino of a given flavor is produced during decay, then at a certain distance from that spot, the chance of observing a neutrino with the properties of a different flavor becomes non-zero.2 Essentially, if a νe is produced, then at a sufficient distance it may be detected as a νμ or a ντ. This behavior is caused by a mismatch between the flavor and mass eigenstates of neutrinos.

In addition to the three flavor states, there are also three mass eigenstates, or states in which neutrinos have definite mass. Experimental evidence shows that these two sets of states represent two distinct properties of neutrinos; as a result, neutrinos of the same flavor need not be in the same mass state. For example, two electron neutrinos have the same definite flavor but not necessarily the same definite mass state. It is this mismatch that gives neutrinos the ability to oscillate between flavors, with probability given by P(a→b) = sin²(2θ) sin²(1.27 Δm² L / E), where a and b are the two flavors, θ is the mixing angle, Δm² is the difference of the squares of the masses of the two mass eigenstates involved (in eV²), L is the distance from source to detector (in km), and E is the energy of the neutrino (in GeV).6 Thus, each flavor is a different linear combination of the three states of definite mass.
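To make the formula concrete, here is a minimal Python sketch of the two-flavor oscillation probability (the function and parameter names are our own, and the conventional units of eV² for Δm², km for L, and GeV for E are assumed):

    import math

    def oscillation_probability(theta, delta_m2, L, E):
        """Two-flavor oscillation probability P(a -> b).

        theta    : mixing angle (radians)
        delta_m2 : mass-squared difference (eV^2)
        L        : source-to-detector distance (km)
        E        : neutrino energy (GeV)
        """
        return math.sin(2 * theta) ** 2 * math.sin(1.27 * delta_m2 * L / E) ** 2

    # theta = 0 gives no oscillation; theta = pi/4 gives maximal mixing.
    print(oscillation_probability(0.0, 2.5e-3, 500, 1))          # 0.0
    print(oscillation_probability(math.pi / 4, 2.5e-3, 500, 1))  # close to 1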

The equation introduces the important concept of the mixing angle, which quantifies the difference between flavor and mass states and accounts for neutrino flavor oscillations. If the mixing angle were zero, the mass states and flavor states would coincide and no oscillations could occur: all muon neutrinos produced at a source would still be muon neutrinos, since P(μ→b) = 0 for any other flavor b. On the other hand, at a mixing angle of π/4, where P(μ→b) can reach 1, all muon neutrinos would oscillate into the other flavor in the probability function.9

Anomalous Data

Some experimental data have challenged the three-flavor oscillation picture.3 If the experimental interpretation is correct, it would point to the existence of a fourth, or even a fifth, mass state, opening up the possibility of additional mass states occupied by the hypothesized sterile neutrino. The most compelling anomalous data come from the Liquid Scintillator Neutrino Detector (LSND) Collaboration and MiniBooNE. The LSND experiment at Los Alamos National Laboratory looked for oscillations between ν̄μ neutrinos produced in muon decay and ν̄e neutrinos, and observed an excess of ν̄e-like events over the expected background.6 These results strongly suggest oscillation into an additional neutrino flavor. A subsequent experiment at Fermilab, the mini Booster Neutrino Experiment (MiniBooNE), again saw a discrepancy between the predicted and observed rates of νe appearance, with an excess of νe events.7 All of these results fit the standard model of particle physics poorly, lending plausibility to the hypothesis that more than three neutrino flavors exist.

GALLEX, an experiment measuring neutrinos from the sun and from chromium-51 neutrino sources, as well as several reactor neutrino experiments, also produced data inconsistent with the standard model’s predictions for neutrinos. This evidence suggests the presence of these new particles but does not conclusively establish their existence.4,5 Thus, scientists designed a new project at the South Pole to search specifically for the newly hypothesized sterile neutrinos.

IceCube Studies

IceCube, a particle physics laboratory, was designed specifically for collecting data concerning sterile neutrinos. Its vast resources and acute precision allow it to detect and register a large number of events quickly. The neutrinos that reach IceCube’s detectors from below are upgoing atmospheric neutrinos that have already traversed the Earth, a fraction of them passing through the Earth’s core. If sterile neutrinos exist, then the dense matter of the Earth’s core should cause some muon neutrinos that traverse it to oscillate into sterile neutrinos, resulting in fewer muon neutrinos detected than a model containing only the three standard mass states would predict, and thereby indicating the existence of a fourth flavor.3

For particles that pass upward through IceCube’s detectors, the Earth itself filters out the background of charged subatomic particles, so that only muons from neutrino interactions are detected. The small fraction of upgoing atmospheric neutrinos that reach the detector site undergo reactions with the bedrock and ice to produce muons. These newly created muons then traverse the ice, producing Cherenkov light, a type of electromagnetic radiation that is detected by IceCube’s Digital Optical Modules (DOMs). Cherenkov radiation is produced when a particle with mass passes through a substance faster than light can pass through that same substance.8

In 2011-2012, a study was conducted using data from the full array of DOMs rather than just a portion.8 These data, along with earlier data, were examined for conclusive evidence of sterile neutrino oscillations in samples of atmospheric neutrinos. The experimental data were compared to Monte Carlo simulations: for each hypothesis about the makeup of the sterile neutrino, the Poissonian log likelihood, a probability function that measures how well the experimental data match a hypothetical model, was calculated. The results showed no evidence for sterile neutrinos.8
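As an illustration of the statistic involved, the Poissonian log likelihood over binned event counts can be sketched in a few lines of Python (the bin values below are invented; the actual IceCube analysis is far more involved):

    import math

    def poisson_log_likelihood(observed, expected):
        """Log likelihood of observed bin counts given model predictions."""
        total = 0.0
        for n, mu in zip(observed, expected):
            # log of the Poisson probability of n events given mean mu:
            # n*log(mu) - mu - log(n!)
            total += n * math.log(mu) - mu - math.lgamma(n + 1)
        return total

    data  = [12, 30, 45, 28, 9]   # observed counts per bin (invented)
    model = [10, 33, 44, 25, 11]  # Monte Carlo prediction (invented)
    print(poisson_log_likelihood(data, model))

The sterile-neutrino hypothesis whose Monte Carlo prediction maximizes this likelihood is the one best supported by the data.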

Conclusion

Other studies conducted at IceCube have likewise found no indication of sterile neutrinos. Although this amounts to strong evidence against the existence of sterile neutrinos, it does not completely rule them out: the experiments so far have probed only certain mixing angles and may yield different results at others. Moreover, if IceCube conclusively rules out sterile neutrinos, the question remains why the anomalous data appeared at LSND and MiniBooNE. Thus, IceCube will continue its sterile neutrino searches across a range of mixing angles in the hope of explaining the anomalies observed in previous neutrino experiments.

References

  1. Fukuda, Y. et al. Evidence for Oscillation of Atmospheric Neutrinos. Phys. Rev. Lett. 1998, 81, 1562.
  2. Beringer, J. et al. Review of Particle Physics. Phys. Rev. D. 2012, 86, 010001.
  3. Schmitz, D. W. Viewpoint: Hunting the Sterile Neutrino. Physics. [Online] 2016, 9, 94. https://physics.aps.org/articles/pdf/10.1103/Physics.9.94
  4. Hampel, W. et al. Final Results of the 51Cr Neutrino Source Experiments in GALLEX. Phys. Lett. B 1998, 420, 114.
  5. Mention, G. et al. Reactor Antineutrino Anomaly. Phys. Rev. D. 2011, 83, 073006.
  6. Aguilar, A. et al. Evidence for Neutrino Oscillations from the Observation of ν̄e Appearance in a ν̄μ Beam. Phys. Rev. D 2001, 64, 112007.
  7. Aguilar-Arevalo, A. A. et al. Improved Search for ν̄μ → ν̄e Oscillations in the MiniBooNE Experiment. Phys. Rev. Lett. 2013, 110, 161801.
  8. Aartsen, M. G. et al. Searches for Sterile Neutrinos with the IceCube Detector. Phys. Rev. Lett. 2016, 117, 071801.

 

Fire the Lasers

Imagine a giant solar harvester in geosynchronous orbit that uses solar energy to beam radiation to a single point 36,000 km away. It would look like a space weapon straight out of Star Wars. Surprisingly, this concept might be the next so-called “moonshot” project that humanity needs to move forward. In space-based solar power generation, a solar harvester like the one described above would generate DC current from solar radiation using photovoltaic cells and then convert it into microwaves. These microwaves would be beamed to a rectifying antenna (or rectenna) on the ground, which would convert them back into direct current (DC). Finally, a converter would change the DC energy to AC to be supplied to the grid.1

With ever-increasing global energy consumption and rising concerns about climate change due to the burning of fossil fuels, there has been increasing interest in alternative energy sources. Although renewable energy technology is improving every year, its current capacity is not enough to obviate the need for fossil fuels. Currently, wind and solar sources have capacity factors (the ratio of an energy source’s actual output over a period of time to its potential output) of around 34 and 26 percent, respectively; in comparison, nuclear and coal sources have capacity factors of about 90 and 70 percent, respectively.2 Generation of energy using space solar power satellites (SSPSs) could pave the path toward a cleaner future. Unlike traditional solar power, which relies on favorable weather conditions and daylight, SSPSs would allow continuous, green energy generation.
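As a concrete illustration of the capacity factor definition (the plant figures below are hypothetical):

    def capacity_factor(actual_mwh, rated_mw, hours):
        """Ratio of energy actually produced to the maximum possible output."""
        return actual_mwh / (rated_mw * hours)

    # A hypothetical 100 MW solar farm producing 228,000 MWh over one year:
    print(capacity_factor(228_000, 100, 8760))  # ~0.26, i.e. about 26 percent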

Although space-based solar power (SBSP) might sound pioneering, scientists have been flirting with the idea since Dr. Peter Glaser introduced the concept in 1968. Essentially, SBSP systems can be characterized by three elements: a large solar collector in geostationary orbit fitted with reflective mirrors, wireless transmission via microwave or laser, and a receiving station on Earth armed with rectennas.3 Such an implementation would require complete proficiency in reliable space transportation, efficient power generation and capture, practical wireless transmission of power, economical satellite design, and precise satellite-antenna calibration systems. Collectively, these goals might seem insurmountable, but taken separately, they are actually feasible. Using the principles of optics, scientists are optimizing space station design to maximize energy collection.4 There have been advances in rectennas that allow the capture of even weak, ambient microwaves.5 With the pace of advancement quickening every year, it is easy to feel that the future of renewable energy is rapidly approaching. However, these advances will remain confined to the literature unless there are global movements to utilize SBSP.

The Japan Aerospace Exploration Agency (JAXA) has taken the lead in translating SBSP from the page to the launch pad. Lacking fossil fuel resources and shaken by the 2011 incident at the Fukushima Daiichi nuclear plant, Japan is in desperate need of alternative energy sources and has proposed a 25-year technological roadmap toward a one-gigawatt SSPS station. To accomplish this incredible feat, Japan plans to deploy a 10,000 metric ton solar collector in geostationary orbit around Earth.6 Surprisingly, the difficult aspect is not building and launching the giant solar collector; it is the technical challenge of transmitting the energy back to Earth both accurately and efficiently. This is where JAXA has focused its research.

Historically, wireless power transmission has been accomplished via laser or microwave transmissions. Laser and microwave radiation are similar in many ways, but when it comes to choosing one for SBSP, microwaves are the clear winner. Microwaves have longer wavelengths (usually between five and ten centimeters) than lasers (often around one micrometer) and are thus better able to penetrate Earth’s atmosphere.7 Accordingly, JAXA has focused on optimizing powerful and accurate microwave generation, developing kW-class high-power microwave transmission using phased, synchronized power-transmitting antenna panels. Due to current limitations in communication technologies, JAXA has also developed advanced retrodirective systems, which allow high-accuracy beam pointing.8 In 2015, JAXA accurately delivered 1.8 kilowatts to a rectenna 55 meters away, which, according to JAXA, is the first time that so much power has been transmitted with any appreciable precision. Although this may seem insignificant compared to the 36,000 km transmission required for a satellite in geosynchronous orbit, it is a huge achievement: it demonstrates that large-scale wireless transmission is a realistic option to power electric cars, transmission towers, and even satellites. JAXA, continuing along its roadmap, plans to conduct the first microwave power transmission in space by 2018.

Although the challenges ahead for space-based solar power generation are enormous in both economic and technical terms, the results could be revolutionary. In a manner similar to the introduction of coal and oil, practical SBSP systems would profoundly alter human civilization. With continuous green energy generation, SBSP systems could resolve our energy conflicts and allow progression to the next phase of civilization. If everything goes well, air pollution and oil spills may one day be mere bygones.

References

  1. Sasaki, S. IEEE Spec. 2014, 51, 46-51.
  2. EIA (U.S. Energy Information Administration). www.eia.gov/electricity/monthly (accessed Oct. 29, 2016).
  3. Wolfgang, S. Acta Astro. 2004, 55, 389-399.
  4. Yang, Y. et al. Acta Astro. 2016, 121, 51-58.
  5. Wang, R. et al. IEEE Trans. Micro. Theo. Tech. 2014, 62, 1080-1089.
  6. Sasaki, S. Japan Demoes Wireless Power Transmission for Space-Based Solar Farms. IEEE Spectrum [Online], March 16, 2015. http://spectrum.ieee.org/ (accessed Oct. 29, 2016).
  7. Summerer, L. et al. Concepts for Wireless Energy Transmission via Laser. European Space Agency (ESA)-Advanced Concepts Team [Online], 2009. https://www.researchgate.net/profile/Leopold_Summerer/ (accessed Oct. 29, 2016).
  8. Japan Space Exploration Agency. Research on Microwave Wireless Power Transmission Technology. http://www.ard.jaxa.jp/eng/research/ssps/hmi-mssps.html (accessed Oct. 29, 2016).

 

GMO: How Safe is Our Food?

For thousands of years, humans have genetically enhanced other living beings through the practice of selective breeding. Sweet corn and seedless watermelons at local grocery stores as well as purebred dogs at the park are all examples of how humans have selectively enhanced desirable traits in other living creatures. In his 1859 book On the Origin of Species, Charles Darwin discussed how selective breeding by humans had been successful in producing change over time. As technology improves, our ability to manipulate plants and other organisms by introducing new genes promises both new innovations and potential risks.

Genetically modified organisms (GMOs) are plants, animals, or microorganisms whose genetic material has been artificially manipulated to produce a desired trait or product. Recombinant genetic engineering allows chosen genes, even those from unrelated species, to be transplanted from one organism into another.1 Genetically modified crops are usually engineered to increase crop production and to introduce resistance against disease. Virus resistance, for example, makes plants less susceptible to viral diseases spread by insects, resulting in higher crop yields.

Genetic enhancement has advanced beyond selective breeding as gene transfer technology has become capable of directly altering genomic sequences. Using a “cut and paste” mechanism, a desired gene can be excised from a donor organism with restriction enzymes and then inserted into a bacterial host using DNA ligase. Once the new gene is introduced, the cells carrying the inserted (“recombinant”) DNA can be bred to generate a strain that reliably produces the desired gene product.1 Through this genetic engineering process, researchers have produced insect-resistant tomatoes, corn, and potatoes. Our ability to modify crops has improved yields and nutrient content in a given environment, becoming a keystone of modern agriculture.2 Despite these positive developments, skepticism persists regarding the safety and societal impact of GMOs.
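As a toy illustration of this cut-and-paste idea (the sequences below are invented, and real cloning relies on sticky-end ligation rather than simple string splicing), the insertion step might be sketched in Python as:

    ECORI_SITE = "GAATTC"  # recognition sequence of the EcoRI restriction enzyme

    def insert_gene(plasmid, gene):
        """Splice a gene into a plasmid at the first EcoRI site."""
        cut = plasmid.find(ECORI_SITE)
        if cut == -1:
            raise ValueError("no EcoRI recognition site found")
        cut += 1  # EcoRI cuts between the G and the AATTC
        return plasmid[:cut] + gene + plasmid[cut:]

    plasmid = "ATGCCGAATTCGGCTA"  # invented vector sequence
    gene    = "AAATTTGGG"         # invented gene of interest
    print(insert_gene(plasmid, gene))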

The technological advancement from selective breeding to genetic engineering has opened up a plethora of possibilities for the future of food. As scientific capabilities expand, ethical concerns about the invasive nature of GMO production have given rise to questions about safety and long-term impacts. According to the Center for Food Safety, GMO seeds are used in 90 percent of the corn, soybeans, and cotton grown in the United States.2 Because GMO crops are so prevalent, any negative ecological interaction involving a GMO product could prove devastating for the environment.

While the dangers of genetic modification are still being weighed, genetic engineering has proven benefits for human health and the farming industry. Genetically modified foods maintain a longer shelf life, which allows for the safe transport of surplus foodstuffs to people in countries without access to nutrient-rich foods. Genetic engineering has also supplemented staple crops with vital minerals and nutrients, helping fight worldwide malnutrition. For example, golden rice is a genetically modified variant of rice that biosynthesizes beta-carotene, a precursor of vitamin A.3 This rice is intended to be produced and consumed in areas with a shortage of dietary vitamin A, a deficiency that kills 670,000 children each year. Despite the controversial risks, genetic engineering of crops promises to continually increase the availability and durability of food.

References

  1. Learn.Genetics. http://learn.genetics.utah.edu/content/science/gmfoods/ (accessed Sep 20, 2016).
  2. Fernandez-Cornejo, J.; Wechsler, S. J. USDA ERS – Adoption of Genetically Engineered Crops in the U.S.: Recent Trends in GE Adoption. https://www.ers.usda.gov/data-products/adoption-of-genetically-engineered-crops-in-the-us/recent-trends-in-ge-adoption.aspx (accessed Sep 30, 2016).
  3. Charles, D. In A Grain Of Golden Rice, A World Of Controversy Over GMO Foods. http://www.npr.org/sections/thesalt/2013/03/07/173611461/in-a-grain-of-golden-rice-a-world-of-controversy-over-gmo-foods (accessed Sep 24, 2016).

Green Sea Turtles: A Shell of What They Once Were

Sea turtles appear in many cultures and myths, often as beloved symbols of longevity and wisdom. In spite of this cultural respect, however, green sea turtles have gradually become endangered due to factors such as nesting habitat loss, pollution, egg harvesting, climate change, and boat strikes. Now there’s a new, even more dangerous threat on the block: herpes. And no, it’s not the herpes you’re thinking of - this kind, known as fibropapillomatosis (FP), is much, much worse.

FP has been observed across all species of sea turtles for years, but it has recently become especially widespread among green sea turtles (Chelonia mydas). The alarming incidence of FP is exacerbating the decline of this already vulnerable population. Among green sea turtles, the number of cases of FP increased 6000% from the 1980s to the mid-1990s, with FP becoming so globally pervasive that the outbreak has been classified as “panzootic,” the animal equivalent of “pandemic.” Now, you might think, “That sounds bad, but why are these turtles dying?” In humans, herpes is unpleasant, but it is seldom life-threatening. Unfortunately, in green sea turtles, the outlook isn’t nearly as optimistic. FP causes the development of tumors on the soft tissues, the shells, and even the eyes of infected turtles. When these growths are left untreated, they can grow to immense sizes, impairing the animal's vital activities, such as breathing and swallowing. So, while the tumors aren’t directly lethal, they invite hordes of secondary infections and pathogens that ultimately result in death.

To make matters worse, treatment for FP is still in development. A landmark study identified the specific pathogen responsible for FP as Chelonid herpesvirus 5 (ChHV5), a close relative of human genital herpes.1 This discovery was the first step to a cure, but it raised an important question - how had this variant of herpesvirus become so prevalent? Until recently, the answer to that question was elusive.

Fortunately, several recent discoveries offered new explanations for FP’s rise. One study reported a significant positive correlation between serum concentrations of heavy metals and the severity of FP, as well as a significant negative correlation between serum cholesterol concentrations and FP.2 In a related find, a team at the University of São Paulo discovered that many green sea turtles have been exposed to organochlorine compounds, which are known to have carcinogenic effects.3 Further research could potentially determine a direct causal relationship between the development of FP and exposure to heavy metals or organochlorine compounds. If such a relationship were found, projects that strive to decrease the prevalence of said compounds in the turtles’ habitats could prove effective in mitigating the spread of FP.

So what’s the prognosis for the green sea turtle? Unfortunately, even knowing what we now know, it may not be good. A study by Jones et al. found that almost all of the infected turtles are juveniles, potentially creating a big problem for the population.4 Jones believes the most optimistic explanation for this trend is that current adults and hatchlings have never been exposed to the disease, so only one generation (the juveniles) has been infected. Another optimistic possibility is that once infected turtles recover from the disease, they will simply acquire immunity as adults. However, there is another, devastating possibility: all of the affected juveniles will perish before they reach adulthood, leaving only the unaffected alive and dooming the species. In a heartbreaking aside, Jones reported that FP “grows on their [the turtles’] eyes, they can't see predators, they can't catch food, so sometimes they slowly starve to death — it's not a nice thing for the turtles to experience. Severely affected turtles are quite skinny and have other pathogens affecting them – that’s why they die.”

Eradicating such a devastating disease will no doubt take many more years of specialized research, and significant efforts are needed immediately to rehabilitate the green sea turtle population. Luckily, conservation groups such as The Turtle Hospital, located in the Florida Keys, are making an active effort to save infected sea turtles. They perform surgeries that remove FP tumors, rehabilitate the turtles, and then release them back into the wild. In addition, they collaborate with universities to study the virus and educate the public on sea turtle conservation. To date, the Turtle Hospital has successfully treated and released over 1,500 sea turtles. Through the hard work of conservation organizations and researchers across the globe, we may still be able to save the green sea turtle.

References

  1. Jacobson, E. R. et al. Dis. Aquat. Organ. 1991, 12.
  2. Carneiro da Silva, C., et al. Aquat. Toxicol. 2016, 170, 42-51.
  3. Sánchez-Sarmiento, A. M. et al. J. Mar. Biol. Assoc. U. K. 2016, 1-9.
  4. Jones, K., et al. Vet. J. 2016, 212, 48-57.
  5. Borrowman, K. Electronic Theses and Dissertations. 2008.
  6. Monezi, T. A. et al. Vet. Microbiol. 2016, 186, 150-156.
  7. Herbst, L. H. et al. Dis. Aquat. Organ. 1995, 22.
  8. The Turtle Hospital. Rescue, Rehab, Release. http://www.turtlehospital.org/about-us/ (accessed Oct. 4, 2016).

 

Engineering Eden: Terraforming a Second Earth

Today’s world faces thousands of complex problems that seem insurmountable. One of the most pressing is the environment and the question of how our overworked planet can sustain an ever-growing society. Our major sources of energy are finite and rapidly depleting, carbon dioxide emissions have passed the “irreversibility” threshold, and our oceans and atmosphere are polluted; scientists predict a grim future for Mother Earth if humans do not change our wasteful ways. A future similar to the scenes of “Interstellar” or “Wall-E” is becoming increasingly less fictitious. While most of the science world is turning to alternative fuels and public activism as vehicles for change, some radical experts in climate change and astronomy suggest relocation to a different planet: Mars. The Mars rover Curiosity has found evidence that Mars has building blocks of a potential human colony, such as heavy metals and nutrients nestled in its iconic red surface. This planet, comparable to Earth in location, temperature, and size, seems to have the groundwork to be our next home. Now we must ponder: perhaps our Earth was not meant to sustain human life for eternity. Perhaps we are living at the tail end of our time on Earth.

Colonizing Mars would be a project beyond any in human history, and the rate-limiting step would be developing an atmosphere that could sustain human, animal, and plant life. The future of mankind on Mars is contingent on a breathable atmosphere, so that humans and animals could thrive without the assistance of oxygen tanks and vegetation could grow without the assistance of a greenhouse. The Martian atmosphere has little oxygen, being roughly 95.7 percent carbon dioxide. It is also only about one percent as dense as Earth’s atmosphere, so it provides almost no protection from the Sun’s radiation. Our atmosphere, armed with a thick layer of ozone, absorbs or deflects the majority of radiation before it reaches the surface; even if a human could breathe on the surface of Mars, he or she would die from radiation poisoning or cancer. Fascinating ways to address this have been discussed, one being mass hydrogen bombing across the entire surface of the planet, creating an atmosphere of dust and debris thick enough to block ultraviolet radiation. A similar effect could be achieved by physically harnessing nearby asteroids and catapulting them into the surface. A final popular idea is the use of mega-mirrors to focus the energy of the sun onto the surface, warming it enough to release greenhouse gases trapped deep within the soil.1

However, bioengineers have suggested another way of colonizing Mars--a way that does not require factories or asteroids, or even human action for that matter. Instead, we would use genetically modified plants and algae to build the Martian atmosphere. The Defense Advanced Research Projects Agency (DARPA) is pursuing research into developing these completely new life forms.2 Such organisms would not need oxygen or water to survive; instead, they would synthesize a new atmosphere from the materials already on Mars. The bioengineering lab at DARPA has developed software called DTA GView, which has been described as a “Google Maps of genomes.” It acts as a library of genes, and DARPA has identified genes that could be inserted into extremophile organisms. A bacterium called Chroococcidiopsis is resistant to wide temperature swings and hypersalinity, two conditions found on Mars.3 Carnobacterium spp. have proven to thrive under low pressure and in the absence of oxygen. These two organisms could potentially be genetically engineered to live on Mars and add vital life-sustaining molecules to the atmosphere.

Other scientific developments must occur before these organisms are ready to pioneer the human future on Mars. Curiosity must send Earth more data about the materials present in Martian soil, and we must study how to choose, build, and transport the ideal candidate organisms to Mars. Moreover, many argue that our scientific research should focus on healing our current home instead of building a new one: if we are willing to invest the immense scientific capital required to terraform another planet, we could likely also remediate the problem of Earthly pollution. However, in such a challenging time, we must venture to new frontiers, and the bioengineers at DARPA have given us an alternative way to go where no man or woman has gone before.

References

  1. The Ethics of Terraforming Mars: A Review. iGEM Valencia Team, 2010, 1-12 (accessed November 2, 2016).
  2. Terraforming Mars With Microbes. http://schaechter.asmblog.org/schaechter/2013/06/terraforming-mars-with-microbes.html (accessed November 4, 2016).
  3. We Are Engineering the Organisms That Will Terraform Mars. http://motherboard.vice.com/read/darpa-we-are-engineering-the-organisms-that-will-terraform-mars (accessed November 4, 2016).

Biological Bloodhounds: Sniffing Out Cancer

Fifty years ago, doctors needed to see cancer to diagnose it - and by then, it was usually too late to do anything about it. Newer tests have made cancer detection easier and more precise, but preventable cases continue to slip through the cracks, often with fatal consequences. However, a new test has the potential to stop these missed diagnoses: it can detect cancer from a single drop of blood, and it may finally allow us to ensure patients receive care when they need it.

Blood platelets are a major component of blood, best known for their ability to stop bleeding by clotting injured blood vessels. However, blood platelets are far more versatile than previously understood. When cancer forms in the human body, the tumors shed molecules such as proteins and RNA directly into the bloodstream. Blood platelets that come into contact with these shed molecules absorb them, altering the platelets’ own RNA. People with cancer therefore have blood platelets that contain information about the specific cancer present. These “educated” blood platelets are called tumor-educated platelets, or TEPs. Recently, TEPs have been used to aid in the detection of specific cancers, and even to identify their locations.1

In a recent study, a group of scientists investigated how TEPs could be used to diagnose cancer. The scientists took blood platelets from healthy individuals and from patients with either advanced or early stages of six different types of cancer and compared their blood platelet RNA. They found that those with cancer had different amounts of certain platelet RNA molecules; for example, the levels of dozens of specific non-protein-coding RNAs were altered in patients who had TEPs. Further analysis of hundreds of different RNA levels, from the nearly 300 patients in the study, enabled the scientists to distinguish a cancer-associated RNA profile from a healthy one. Using these results, the team created an algorithm that could classify whether or not someone had cancer with up to 96% accuracy.1
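To give a feel for this kind of classification, here is a minimal sketch using synthetic “platelet RNA profiles” in place of real sequencing data (a support vector machine is used purely as an example; the study’s actual pipeline is more elaborate):

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n_samples, n_transcripts = 300, 100

    # Synthetic RNA levels: "cancer" samples get a shift in a few transcripts.
    X = rng.normal(size=(n_samples, n_transcripts))
    y = rng.integers(0, 2, size=n_samples)  # 0 = healthy, 1 = cancer
    X[y == 1, :10] += 1.0                   # altered transcript levels

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = SVC().fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))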

Not only could the TEPs distinguish between healthy individuals and those with a specific type of cancer, but they could also identify the location of the cancer. The patients in the study had one of six types of cancer: non-small-cell lung cancer, breast cancer, pancreatic cancer, colorectal cancer, glioblastoma, or hepatobiliary cancer. The scientists analyzed the specific TEPs associated with the specific types of cancer and created an algorithm to predict tumor locations. The TEP-trained algorithm correctly identified the location of these six types of cancer 71% of the time.1

The authors of the study noted that this is the first bloodborne factor shown to both diagnose cancer and pinpoint the location of primary tumors. It is possible that in the near future, TEP-based tests could lead to a new class of extremely accurate liquid biopsies. Today, many cancer tests are costly, invasive, or painful. For example, lung cancer tests require an X-ray, sputum cytology examination, or tissue sample biopsy. X-rays and sputum cytology must be performed after symptoms present and can often give misleading results; biopsies are more accurate but highly painful and relatively dangerous. TEP-based blood tests have the potential both to obviate the need for these techniques and to provide more granular, clinically useful information. They could be performed before symptoms appear, at low cost, and with minimal patient discomfort, making them an ideal tool for catching a growing tumor early.

The information that TEPs have revealed has opened a gate to many potential breakthroughs in the detection of cancer. With high accuracy and an early detection time, cancer blood tests have the potential to save many lives in the future.

References

  1. Best, M. et al. Cancer Cell 2015, 28, 676.
  2. Marquedant, K. Tumor RNA within Platelets May Help Diagnose and Classify Cancer, Identify Treatment Strategies. Massachusetts General Hospital.

Who Says Time Travel Isn't Possible?

There are several very real challenges that must be overcome when attempting to travel to another star, let alone another galaxy. However, with today’s technology and understanding of physics, we can envision potential ways to make interstellar travel, and even time travel, a reality, raising the question of why other civilizations have not already invented and made use of it. This is especially puzzling because, considering the immensity of the universe, there are bound to be other intelligent civilizations, begging the famous question of the Fermi Paradox: “So where is everybody?” Its answer could enable us to evolve into an interstellar or intergalactic species, while failing to find it could spell our demise.

Einstein’s theory of special relativity is where the cosmic speed limit (the speed of light) was first introduced. His theory also gives rise to the concept of time dilation, which states that time runs slower for those traveling extremely fast than it does for those on Earth, and that distances shrink when travelling at high speeds.1 So, when a spaceship is travelling close to the speed of light, time measured aboard runs slower than it would on clocks at rest. This can play an important role in interstellar travel, because it allows travelers moving close to the cosmic speed limit to age more slowly than those on Earth. For example, if a spaceship left Earth in the year 2100 and made a roundtrip to the star Vega at 90% of the speed of light, it would return in Earth year 2156, but only 24 years would have passed for the crew of the ship.2 Because of time dilation, journeys could be made to very distant places while the crew ages very little. Due to this amazing effect, one could theoretically travel to the black hole at the center of the Milky Way Galaxy, 28,000 light years away, and age only 21 years, if travelling fast enough.2 At a high enough percentage of the speed of light, you would be able to reach Andromeda (2.5 million light years away) and return to Earth only 60 years older, while 5 million years have passed on Earth.2 Clearly, time dilation is a real form of time travel to the future, assuming relativistic speeds are achievable. The main obstacle for this method is therefore reaching a percentage of the speed of light at which time dilation becomes significant, which requires enormous amounts of energy.
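A small calculation shows how these numbers arise; the sketch below assumes a constant cruising speed and ignores acceleration and deceleration phases:

    import math

    def travel_times(distance_ly, beta):
        """Earth time and shipboard time for a trip at constant speed.

        distance_ly : distance in light-years
        beta        : speed as a fraction of the speed of light
        """
        earth_years = distance_ly / beta
        ship_years = earth_years * math.sqrt(1 - beta ** 2)
        return earth_years, ship_years

    # Round trip to Vega (about 25 light-years each way) at 90% of c:
    earth, ship = travel_times(2 * 25, 0.9)
    print(f"Earth clock: {earth:.0f} yr, ship clock: {ship:.0f} yr")
    # -> roughly 56 years on Earth, but only about 24 for the crew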

Though obtaining the energy required for interstellar travel may seem a far-off goal, the science and technology we have today suggest that travelling great distances at great speeds may well become possible. The first of these technologies is the nuclear-powered rocket. According to Albert Einstein’s equation E = mc², even a small amount of mass can release a very large amount of energy. In fact, the conversion of only about 0.6 grams of matter (less than the weight of an M&M) was sufficient to level Hiroshima during World War II.3 Nuclear fission harnesses the mass lost when an atomic nucleus splits in two. Nuclear fusion, on the other hand, involves two atomic nuclei fusing into one, and releases several times the energy of fission per unit mass. This process is the source of the sun’s energy, occurring at its core. Nuclear fusion, if controlled, is a viable source of energy for the future: just the hydrogen present in the water coming out of one faucet could provide enough energy for the United States’ current needs, a staggering 2,850,000,000,000 joules per second (2.85 terawatts).2
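A quick back-of-the-envelope check of the Hiroshima figure with E = mc²:

    c = 3.0e8          # speed of light, m/s
    mass = 0.6e-3      # 0.6 grams of converted mass, in kilograms
    energy = mass * c ** 2
    print(energy)      # ~5.4e13 J, on the order of 13 kilotons of TNT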

In order to achieve nuclear fusion, an environment similar to the center of the sun must be created, with comparable temperatures and pressures, and gas must be converted into a highly ionized state known as plasma. Recently, MIT used the extreme magnetic fields of its Alcator C-Mod tokamak reactor to create the highest plasma pressure ever recorded.4 In addition, a seven-story reactor in southern France, 800 times larger than MIT’s, is set to be completed in 2025 and will have magnets that are each as heavy as a Boeing 747.5 Nuclear fusion could one day provide nearly limitless energy for a spacecraft, accelerating it to relativistic speeds. Enormous scoops could be attached to the spacecraft to collect interstellar hydrogen gas throughout the journey, allowing travel to the distant corners of the galaxy.

Another technological development that could facilitate interstellar travel is the EM drive, a purported electromagnetic thruster. This form of propulsion is highly controversial because it seemingly violates Newton’s third law and thus the law of conservation of momentum, which together require that for something to accelerate in one direction, an equal and opposite force must be exerted in the other direction. The EM drive is claimed to use electromagnetic waves as fuel, creating thrust as microwaves bounce within the engine cavity and push on its interior, accelerating the thruster in the opposite direction.6 Simply put, the EM drive would move in one direction without a propellant or an opposite force pushing it. It has been tested multiple times, most notably by NASA’s Eagleworks Lab, and has repeatedly been measured to produce a small amount of thrust, making it difficult for scientists to dismiss the possibility that it works.7 The thrust cannot yet be explained by our current understanding of physics, but the Eagleworks Lab has nevertheless submitted its results for publication in the American Institute of Aeronautics and Astronautics’ Journal of Propulsion and Power. The Eagleworks experiment, if shown to be reproducible, would open up opportunities for researchers around the world to conduct further experimentation. In August, plans were announced to test the EM drive in space, which would be the most robust test of its efficacy to date. The EM drive could one day provide an essentially limitless supply of thrust without the need for propellant, allowing a spacecraft to accelerate constantly until it reaches relativistic speeds.

These are only two examples of technologies that could make interstellar travel possible. In the next few decades, we can look forward to more innovative research that will push the boundaries of science and redefine interplanetary, interstellar, and intergalactic travel. If relativistic speeds are achieved, humans could travel thousands, if not millions of years into the future by aging much slower than the rate at which time would actually pass on Earth. So who says we can’t time travel? Certainly not science!

References

  1. Bennett, J. The Cosmic Perspective; Pearson: Boston, 2014.
  2. Bennett, J. Life In the Universe; Pearson: San Francisco, 2012.
  3. Glasstone, S.; Dolan, P. The Effects of Nuclear Weapons, 3rd ed.; United States Department of Defense and United States Department of Energy: 1977.
  4. Plasma Science and Fusion Center. https://www.psfc.mit.edu/research/magnetic-fusion-energy (accessed Nov. 11, 2016).
  5. ITER. https://www.iter.org/mach (accessed Nov. 05, 2016).
  6. Shawyer, R. New Scientist [Online] 2006, https://www.newscientist.com/data/images/ns/av/shawyertheory.pdf (accessed Nov. 10, 2016).
  7. Wang, B. NASA Emdrive experiments have force measurements while the device is in a hard vacuum. NextBigFuture, Feb. 07, 2015. http://www.nextbigfuture.com/2015/02/more-emdrive-experiment-information.html (accessed Nov 7, 2016).

 

 

 

Graphene Nanoribbons and Spinal Cord Repair

The same technology that has been used to strengthen polymers1, de-ice helicopter blades2, and create more efficient batteries3 may one day help those with damaged or even severed spinal cords walk again. The Tour Lab at Rice University, headed by Dr. James Tour, is harnessing the power of graphene nanoribbons to create a new material called Texas-PEG that may revolutionize the way we treat spinal cord injuries; one day, it may even make whole-body transplants a reality.

Dr. Tour, the T.T. and W.F. Chao Professor of Chemistry, Professor of Materials Science and NanoEngineering, and Professor of Computer Science at Rice University, is a synthetic organic chemist who focuses mainly on nanotechnology. He holds over 120 patents, has published over 600 papers, and was inducted into the National Academy of Inventors in 2015.4 His lab is currently working on several projects, such as investigating applications of graphene, creating and testing nanomachines, and synthesizing and imaging nanocars. The Tour Lab first came upon graphene nanoribbons while working with carbon nanotubes back in 2009.5 The team found a way to “unzip” carbon nanotubes into flat strips called graphene nanoribbons by injecting sodium and potassium atoms between nanotube layers in a nanotube stack until the tubes split open. “We fell upon the graphene nanoribbons,” says Dr. Tour. “I had seen it a few years ago in my lab but I didn’t believe it could be done because there wasn’t enough evidence. When I realized what we had, I knew it was enormous.”

This discovery was monumental: graphene nanoribbons have since been used in a variety of applications because of their novel characteristics. Less than 50 nm wide (about the width of a virus), graphene nanoribbons are 200 times stronger than steel and are excellent conductors of heat and electricity. They can be used to make materials significantly stronger or electrically conductive without adding much additional weight. It wasn’t until many years after their initial discovery, however, that the lab found that graphene nanoribbons could be used to help heal severed spinal cords.

The idea began after one of Dr. Tour’s students read on Reddit about European research on head and whole-body transplants. This research focused on taking a brain-dead patient with a healthy body and pairing them with someone who has brain activity but has lost bodily function. The biggest challenge, however, was fusing the two parts of the spinal cord together: the neurons in the separated parts could not communicate with one another, and as a result, the animals involved in whole-body and head transplant experiments regained only about 10% of their original motor function. The graduate student contacted the European researchers, who then proposed using the Tour Lab’s graphene nanoribbons in their research, as Dr. Tour’s team had already shown that neurons grow very well along graphene.

“When a spinal cord is severed, the neurons grow from the bottom up and the top down, but they pass like ships in the night; they never connect. But if they connect, they will be fused together and start working again. So the idea was to put very thin nanoribbons in the gap between the two parts of the spinal cord to get them to align,” explains Dr. Tour. Nanoribbons are extremely conductive, so when their edges are activated with polyethylene glycol, or PEG, they form an active network that allows the spinal cord to reconnect. This material is called Texas-PEG, and although it is only about 1% graphene nanoribbons, this is still enough to create an electric network through which the neurons in the spinal cord can connect and communicate with one another.

The Tour Lab tested this material on rats by severing their spinal cords and then applying Texas-PEG to see how much mobility was recovered. The rats scored about 19 out of 21 on a mobility scale after only three weeks, a remarkable advance over the 10% recovery in previous European trials. “It was just phenomenal. There were rats running away after 3 weeks with a totally severed spinal cord! We knew immediately that something was happening because one day they would touch their foot and their brain was detecting it,” says Dr. Tour. The first human trials will begin overseas in 2017. Due to FDA regulations, it may be a while before we see trials in the United States, but the FDA will accept data from successful trials in other countries. Graphene nanoribbons may one day become a viable treatment option for spinal injuries.

This isn’t the end of Dr. Tour’s research with graphene nanoribbons. “We’ve combined our research with neurons and graphene nanoribbons with antioxidants: we inject antioxidants into the bloodstream to minimize swelling. All of this is being tested in Korea on animals. We will decide on an optimal formulation this year, and it will be tried on a human this year,” Dr. Tour explained. Most of all, Dr. Tour and his lab would like to see their research with graphene nanoribbons used in the United States to help quadriplegics who suffer from limited mobility due to spinal cord damage. What began as a lucky discovery now has the potential to change the lives of thousands.

References

  1. Wijeratne, Sithara S., et al. Sci. Rep. 2016, 6.
  2. Raji, Abdul-Rahman O., et al. ACS Appl. Mater. Interfaces. 2016, 8 (5), 3551-3556.
  3. Salvatierra, Rodrigo V., et al. Adv. Energy Mater. 2016, 6 (24).
  4. National Academy of Inventors. http://www.academyofinventors.org/ (accessed Feb. 1, 2017).
  5. Zehtab Yazdi, Alireza, et al. ACS Nano. 2015, 9 (6), 5833-5845.

Microbes: Partners in Cancer Research

To millions around the world, the word ‘cancer’ evokes sorrow and fear. For decades, scientists have been trying to combat this disease, yet a cure remains elusive: despite the best efforts of modern medicine, about 46% of patients diagnosed with cancer still pass away as a direct result of the disease.1 However, the research performed by Dr. Michael Gustin at Rice University may change the field of oncology forever.

Cancer is a complex and multifaceted disease that is currently not fully understood by medical doctors and scientists. Tumors vary considerably between different types of cancers and from patient to patient, further complicating the problem. Understanding how cancer develops and responds to stimuli is essential to producing a viable cure, or even an individualized treatment.

Dr. Gustin’s research delves into the heart of this problem. The complexity of the human body and its component cells is currently beyond the scope of any one unifying model, so beginning basic research with human subjects would be impractical. Researchers turn instead to simpler eukaryotes in order to understand the signaling pathways involved in the cell cycle and how they respond to stress.2 Through years of hard work, Dr. Gustin’s studies have made major contributions to the field of oncology.

Dr. Gustin studied a species of yeast, Saccharomyces cerevisiae, and its response to osmotic stress. His research uncovered the high osmolarity glycerol (HOG) pathway and its mitogen-activated protein kinase (MAPK) cascade, which work together to maintain cellular homeostasis. The HOG pathway acts much like a “switchboard [that] control[s] cellular behavior and survival within a cell, which is regulated by the MAPK cascade through the sequential phosphorylation of a series of protein kinases that mediates the stress response.”3 These combined processes allow the cell to respond to extracellular stress by regulating gene expression, cell proliferation, and cell survival and apoptosis. To activate the transduction pathway, the sensor protein Sln1 recognizes a stressor, and a receiver protein that mediates the cellular response is subsequently phosphorylated, or activated. This signal transduction pathway leads to the many responses that protect a cell against external stressors. These same protective processes, however, allow cancer cells to shield themselves from the body’s immune system, making them much more difficult to attack.

Dr. Gustin has used this understanding of the HOG pathway to expand his research into similar pathways in other organisms. Fascinatingly, expressing human orthologs of HOG1 proteins within yeast cells stimulated the same pathway despite the vast evolutionary distance between yeast and mammals. Beyond its evolutionary implications, this result illustrates that the “[HOG] pathway defines a central stress response signaling network for all eukaryotic organisms.”3 Much has already been learned through studies on Saccharomyces cerevisiae, and yet researchers have recently identified an even more representative organism. This fungus, Candida albicans, is the new model under study by Dr. Gustin and serves as the next step towards producing a working model of cancer and its responses to stressors; its more complex responses to signaling make it a better working model than Saccharomyces cerevisiae.4 Research on Candida albicans has already contributed to the research community’s wealth of information, taking great strides towards eventual human applications in medicine. For example, biological therapeutics designed to combat breast cancer cells have already been tested on both Candida albicans biofilms and breast cancer cells with great success.5

This research could eventually be applied to improving current chemotherapy techniques for cancer treatment. Current chemotherapy utilizes cytotoxic chemicals that damage and kill cancerous cells, thereby controlling the size and spread of tumors. Many of these drugs disrupt the cell cycle, preventing the cancerous cell from proliferating efficiently; alternatively, a more aggressive treatment can induce apoptosis, or programmed cell death, within the cancerous cell.6 In both methods, the chemotherapy targets the signaling pathways that control the vital processes of the cancer cell. Dr. Gustin’s research thus plays a vital role in future chemotherapy technologies and the struggle against mutant cancer cells.

According to Dr. Gustin, current chemotherapy is only effective locally; it often fails to completely incapacitate cancer cells farther from the site of drug administration, where drug toxicity is highest. As a result, distant cancer cells are given the opportunity to develop cytoprotective mechanisms that increase their resistance to the drug.7 Currently, a major goal of Dr. Gustin’s research is to discover how and why certain cancer cells are more resistant to chemotherapy. The long-term goal is to understand the major pathways involved in cancer cells’ resistance to apoptosis, and eventually to produce a therapeutic product that targets the crucial pathways and inhibitors. With its specificity, such a drug would vastly increase treatment efficacy and provide humanity with a vital tool with which to combat cancer, saving countless lives.

References   

  1. American Cancer Society. https://www.cancer.org/latest-new/cancer-facts-and-figures-death-rate-down-25-since-1991.html (accessed February 3, 2017).
  2. Radmaneshfar, E.; Kaloriti, D.; Gustin, M.; Gow, N.; Brown, A.; Grebogi, C.; Thiel, M. PLoS ONE 2013, 8, e86067.
  3. Brewster, J.; Gustin, M. Sci. Signal. 2014, 7, re7.
  4. Rocha, C.R.; Schröppel, K.; Harcus, D.; Marcil, A.; Dignard, D.; Taylor, B.N.; Thomas, D.Y.; Whiteway, M.; Leberer, E. Mol. Biol. Cell 2001, 12, 3631-3643.
  5. Malaikozhundan, B.; Vaseeharan, B.; Vijayakumar, S.; Pandiselvi, K.; Kalanjiam, R.; Murugan, K.; Benelli, G. Microbial Pathogenesis 2017, 102, n.p. Manuscript in progress.
  6. Shapiro, G.; Harper, J. J. Clin. Invest. 1999, 104, 1645-1653.
  7. Das, B.; Yeger, H.; Baruchel, H.; Freedman, M.H.; Koren, G.; Baruchel, S. Eur. J. Cancer 2003, 39, 2556-2565.

Visualizing the Future of Medicine

What do you do when you get sick? Most likely you schedule a doctor’s appointment, show up, and spend ten to fifteen minutes with the doctor. The physician quickly scans your chart and combines your narrative of your illness with your medical history and his or her observations so that you can leave with diagnosis and prescription in hand. While few give this seemingly routine process a second thought, the very way in which healthcare providers approach the doctor-patient experience is evolving. There is a growing interest in the medical humanities, a more interdisciplinary study of illness. According to Baylor College of Medicine, the aim of the medical humanities is “understanding the profound effects of illness and disease on patients, health professionals, and the social worlds in which they live and work.”1 Yet medical humanities is something of a catch-all term, encompassing disciplines from literature, anthropology, sociology, and philosophy to the fine arts and even “science and technology studies.”1 This nuanced approach to medicine is exactly what Dr. Kirsten Ostherr, one of the developers of Rice University’s medical humanities program, promotes.

Dr. Ostherr uses this interdisciplinary approach to study the intersection of technology and medicine. She has conducted research on historical medical visualizations in media such as art and film and on their application to medicine today. Dr. Ostherr received her PhD in American Studies and Media Studies at Brown University; her interest in medicine and media was sparked while working at the Department of Public Health at Oregon Health Sciences University, where researchers were using the humanities as a lens through which to analyze health data. “I noticed that the epidemiologists there used narrative to make sense of data, and that intrigued me,” she said. This inspired Dr. Ostherr to use her background in media and public health to explore how film and media in general have affected medicine and to predict where the future of medical media lies.

While the integration of medicine and media may seem revolutionary, it is not a new concept. In her book, Medical Visions, Dr. Ostherr says that “We know we have become a patient when we are subjected to a doctor’s clinical gaze,” a gaze that is powerfully humanizing and can “transform subjects into patients.”2 With the integration of technology and medicine, this “gaze” has extended to include the visualizations vital to understanding the patient and decoding disease. Visualizations have been a part of the doctor-patient experience for longer than one might think, from X-rays in 1912 to the electronic medical records used by physicians today.3

In her book, Dr. Ostherr traces and analyzes a series of different types of medical visualizations throughout history. Her research begins with the scientific films of the early twentieth century and their attempt to bridge the gap between scientific knowledge and the general public.2 The use of film in medical education was also significant in the 20th century; these technical films helped facilitate the globalization of health and media in the postwar era. Another form of medical visualization emerged with the advent of medicine on television. At the intersection of entertainment and education, the medical documentary evolved into “health information programming” in the 1980s, which in turn gave rise to medical reality television.2 The history of this diverse and expanding media, she says, proves that the use of visualizations in healthcare and our daily lives has made medicine “a visual science.”

One of the main takeaways from Dr. Ostherr’s historical analysis is the deep-rooted role visualizations have played in spreading medical knowledge to the average person. While skeptics may argue against this characterization, “this is a broad social change that is taking place,” Dr. Ostherr said, citing new scientific research on human-centered design and the use of visual arts in medical training. “It’s the future of medicine,” she said. There is already evidence that such a change is taking place: the method of recording patient information using health records has begun to change. In recent years there has been a movement to adopt electronic health records because of their potential to save the healthcare industry millions of dollars and improve efficiency.4 Yet recent studies show that the current systems in place are not as effective as predicted.5 Online patient portals allow patients to keep up with their health information, view test results, and even communicate with their health care providers; but while these portals can involve patients as active participants in their care, they can also be quite technical.6 As a result, there is a push to develop electronic health records written in more readily understandable language.

To conduct further research in the field, including projects such as the development of better, easier-to-understand electronic health records, Dr. Ostherr co-founded and directs the Medical Futures Lab (MFL). The lab draws resources from Baylor College of Medicine, the University of Texas Health Science Center, and Rice University, and its diverse team ranges from humanist scholars to doctors to computer scientists.7 The use of technology in medicine has continued to develop rapidly alongside the increasing demand for personalized, humanizing care. While there may seem to be an inherent conflict between the two, Dr. Ostherr believes medicine needs the “right balance of high tech and high touch,” which is what her team at the MFL works to find. The team’s projects focus heavily on deconstructing and reconstructing the role of the patient in education and diagnosis.7

The increasingly integrated humanistic and scientific approach to medicine is revolutionizing healthcare. As the Medical Futures Lab explores the relationship between personal care and technology, the world of healthcare is undergoing a broad cultural shift. Early in their medical education, physicians are now taught the value of incorporating the humanities and social sciences into their training, and that science can only teach one so much about the doctor-patient relationship. For Dr. Ostherr, the question moving forward will be “what is it that is uniquely human about healing?” What are the limitations of technology in healing, and what parts of the healing process can be accomplished only by the human body? According to Dr. Ostherr, the history of visualizations in medicine can serve as a roadmap and an inspiration for the evolution and implementation of new media and technology in transforming the medical subject into the patient.

References

  1. Baylor University Medical Humanities. http://www.baylor.edu/medical_humanities/ (accessed Nov. 27, 2017).
  2. Ostherr, K. Medical Visions: Producing the Patient through Film, Television, and Imaging Technologies; Oxford University Press: Oxford, 2013.
  3. History of Radiography. https://www.nde-ed.org/EducationResources/CommunityCollege/Radiography/Introduction/history.htm (accessed Jan. 2017).
  4. Abelson, R.; Creswell, J. In Second Look, Few Savings From Digital Health Records. New York Times [Online], January 11, 2013. http://www.nytimes.com/2013/01/11/business/electronic-records-systems-have-not-reduced-health-costs-report-says.html (accessed Jan. 2017).
  5. Abrams, L. The Future of Medical Records. The Atlantic [Online], January 17, 2013. http://www.theatlantic.com/health/archive/2013/01/the-future-of-medical-records/267202/ (accessed Jan. 25, 2017).
  6. Rosen, M. D. L. High Tech, High Touch: Why Technology Enhances Patient-Centered Care. Huffington Post [Online], December 13, 2012. http://www.huffingtonpost.com/lawrence-rosen-md/health-care-technology_b_2285712.html (accessed Jan. 2017).
  7. Medical Futures Lab. http://www.medicalfutureslab.org/ (accessed Dec. 2017).

The Fight Against Neurodegeneration

“You know that it will be a big change, but you really don’t have a clue about your future.” A 34-year-old postdoctoral researcher at the Telethon Institute of Genetics and Medicine in Italy at the time, Dr. Sardiello had made a discovery that would change his life forever. Eight years later, Dr. Sardiello is now the principal investigator of a lab at the Jan and Dan Duncan Neurological Research Institute (NRI), where he continues the work that brought him and his lab to America.

Throughout his undergraduate career, Sardiello knew he wanted to be involved in biology and genetics research, but his passion was truly revealed in 2000, the year he began his doctoral studies. That year, the full DNA sequence of the common fruit fly was released, constituting the first ever complete genome of a complex organism. At the time, Sardiello was working in a lab that used fruit flies as a model, and this milestone served to spur his interest in genetics. As the golden age of genetics began, so did Sardiello’s love for the subject, leading to his completion of a PhD in Genetic and Molecular Evolution at the Telethon Institute of Genetics and Medicine. It was at this institute that his team made the discovery that would bring him to America: the function of Transcription Factor EB, known as TFEB.

Many knew of the existence of TFEB, but no one knew its function. Dr. Sardiello and his team changed that. In 2009, they discovered that the gene is the master regulator of lysosomal biogenesis and function; in other words, TFEB works as a genetic switch that turns on the production of new lysosomes.1 Before this discovery, lysosomes were commonly regarded as the incinerator or garbage can of the cell, essentially specialized containers that get rid of cellular waste. With TFEB’s function known, we now understand that lysosomes play a much more active role in catabolic pathways and the maintenance of cell homeostasis. Sardiello’s groundbreaking findings were published in Science, one of the most prestigious peer-reviewed journals in the scientific world. Speaking about his success, Sardiello said, “The bottom line was that there was some sort of feeling that a big change was about to come, but we didn’t have a clue what. There was just no possible measure at the time.”

Riding the success of his paper, Sardiello moved to the United States and established his own lab with the purpose of defeating the family of diseases known as the neuronal ceroid lipofuscinoses (NCLs). NCLs are genetic diseases caused by the malfunction of lysosomes. This malfunction causes waste to accumulate in the cell, eventually blocking cell function and leading to cell death. While NCLs cause cell death throughout the body, certain specialized cells such as neurons do not regenerate, so the damage to them is permanent; NCLs are therefore primarily neurodegenerative diseases. While there are many variants of NCLs, they all result in premature death after loss of neural functions such as sight, motor ability, and memory.

“With current technology,” Sardiello said, “the disease is incurable, since it is genetic. In order to cure a genetic disease, you have to somehow bring the correct gene into every single cell of the body.” With our current understanding of biology, this is impossible. Instead, doctors can work to treat the disease and halt the progress of its symptoms. Essentially, his lab has found a way to use TFEB to enhance the function of lysosomes in order to fight the progression of the NCL diseases.

In addition to genetic enhancement, Sardiello is also focusing on finding drugs that will activate TFEB and thereby increase lysosomal function. To test these new methods, the Sardiello lab uses mouse models that replicate most of the symptoms seen in NCL patients. “Our current results indicate that drug therapy for NCLs is viable, and we are working to incorporate these strategies into clinical therapy,” Sardiello said. So far the lab has identified three different drugs or drug combinations that may be viable for treatment of this currently incurable disease.

While it might be easy to talk about NCLs and other diseases in terms of their definitions and effects, it is important to realize that behind every disease are real people and real patients. The goal of the Sardiello Lab is not just to do science and advance humanity, but also to help patients and give them hope. One such patient is a boy named Will Herndon. Will was diagnosed with NCL type 3, and his story is one of resilience, strength, and hope.

When Will was diagnosed with Batten disease at the age of six, the doctors informed him and his family that there was little they could do; at the time, there was almost no viable research in the field. However, despite being faced with terminal illness, Will and his parents never lost sight of what was most important: hope. While others might have given up, Missy and Wayne Herndon instead founded The Will Herndon Research Fund, also known as HOPE, in 2009, playing a large role in bringing Dr. Sardiello and his lab to the United States. Each year, the foundation holds a fundraiser to raise awareness and money toward defeating the NCL diseases. At its inception, the fundraiser had only a couple hundred attendees; now, only half a decade later, thousands of like-minded people arrive each year to support Will and others with the same disease. “Failure is not an option,” Missy Herndon said forcefully during the 2016 banquet. “Not for Will, and not for any other child with Batten disease.” It was clear from the strength of her words that she believed in the science, and that she believed in the research.

“I have a newborn son,” Sardiello said, recalling the speech. “I can’t imagine going through what Missy and Wayne had to. I felt involved and I felt empathy, but most of all, I felt respect for Will’s parents. They are truly exceptional people and go far above and beyond what anyone could expect of them. In the face of adversity, they are tireless, they won’t stop, and their commitment is amazing.”

When one hears about science and labs, it usually brings to mind arrays of test tubes and flasks or the futuristic possibilities of research. In all of this, one tends to forget about the people behind the test bench: the scientists who conduct the experiments and uncover the next step in the collective knowledge of humanity, people like Dr. Sardiello. However, Sardiello isn’t alone in his endeavors; he is supported by the members of his lab.

Each and every one of the researchers in Sardiello’s lab is an international citizen, hailing from at least four different countries to work toward a common cause: Parisa Lombardi from Iran; Lakshya Bajaj, Jaiprakash Sharma, and Rituraj Pal from India; Abdallah Amawi from Jordan; and, of course, Marco Sardiello and Alberto di Ronza from Italy. Despite the vast distances in both geography and culture, the chemistry among the team is palpable, and while how they got to America varied, their conviction that they had a responsibility to help other people and defeat disease was always the same.

Humans have always been predisposed to move forward. It is because of this propensity that we have been able to eradicate diseases and change the environments that surround us. Behind all of our achievements lies scientific advancement, and behind it are the people we so often forget. Science shouldn’t be detached from the humans working to advance it, but rather integrated with the men and women working to make the world a better place. Dr. Sardiello and his lab represent the constant innovation and curiosity of the research community, ideals that are validated in the courage of Will Herndon and his family. In many ways, the Sardiello lab embodies what science truly represents: humans working for something far greater than themselves.

References

  1. Sardiello, M.; Palmieri, M.; di Ronza, A.; Medina, D.L.; Valenza, M.; Alessandro, V. Science. 2009, 325, 473-477.


Cognitive Neuroscience: A Glimpse of the Future

Cognitive neuroscience is a branch of science that addresses the processes in the brain that occur during cognitive activity. The discipline examines how psychological and cognitive activities are caused by and correlated with the neural connections in our brain; it bridges psychology and neuroscience.

Dr. Simon Fischer-Baum, an assistant professor and researcher at Rice University, co-directs the neuroplasticity lab at the BioScience Research Collaborative. He received his B.A. in Neuroscience and Behavior from Columbia University in 2003 and his Ph.D. in Cognitive Sciences from Johns Hopkins University in 2010.

Dr. Fischer-Baum describes his research as the “intersection of psychology and neuroscience and computer science to some extent.” He is interested in how we understand and pronounce a word once we see it, and he also studies memory and how information is encoded in the brain. In his opinion, functional magnetic resonance imaging (fMRI) and the other tools of cognitive neuroscience are extremely relevant to cognitive psychology, despite a perception to the contrary: he sees a “serious disconnect” arising from the belief that the methods and findings of cognitive neuroscience do not apply to cognitive psychology. Cognitive psychologists have long been attempting to discover how information is represented at different levels of processing and how it travels between those levels, and cognitive neuroscience can help achieve these goals through the use of fMRI.

fMRI shows which parts of the brain are active while the subject is performing a task. During any task, multiple regions of the brain are involved, with each region processing different types of information. For example, reading a word involves processing both visual information and meaning, so multiple regions of the brain are active at once. However, one problem with fMRI is that while it demonstrates which regions of the brain are active, it does not convey what function each region is carrying out. One of the main objectives of Dr. Fischer-Baum’s work is to pioneer new methods, akin to computer algorithms, that decode what fMRI data tells us about the tasks the brain is performing. “I want to be able to take patterns of activity and decode and relate it back to the levels of representation that cognitive psychologists think are going on in research,” Dr. Fischer-Baum explains.
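
To make the idea of decoding concrete, here is a minimal sketch of the general approach, assuming (hypothetically) that each fMRI scan has already been reduced to a vector of voxel activations with a known task label. It illustrates pattern classification in general, not Dr. Fischer-Baum’s actual pipeline.

```python
# Minimal pattern-decoding sketch with simulated "fMRI" data.
# Hypothetical setup: 120 scans x 500 voxels, two task conditions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 500))   # voxel activation vectors
y = np.repeat([0, 1], 60)         # task labels (e.g., word vs. shape)
X[y == 1, :25] += 0.4             # weak signal in 25 voxels for condition 1

# If held-out accuracy is reliably above chance (0.5), the activity
# pattern carries information about which task was being performed.
decoder = LogisticRegression(max_iter=1000)
scores = cross_val_score(decoder, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```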

Recently, Dr. Fischer-Baum published a study of a patient who suffered severe written language impairments after experiencing a hemorrhagic stroke. Although this patient’s reading of familiar words improved over the years, he still had difficulty processing abstract letter identity information for individual letters. Someone who can use abstract letter representations recognizes letters independent of case or font; in other words, they can identify a letter whether it is upper case, lower case, or printed in an unfamiliar typeface. In the studied patient, Dr. Fischer-Baum’s team observed contralesional reorganization: orthographic processing (the processing of the set of conventions for writing a language), normally supported by the compromised regions of the left hemisphere, had shifted to homologous regions in the right hemisphere. Through the use of fMRI, the research team determined that the patient’s residual reading ability was supported by functional take-over, in which the functions of injured areas are taken over by healthy brain regions. These results were found by scanning the brain of the patient as he read and comparing the data with that of a control group of young, healthy adults with normal brain function.

While Dr. Fischer-Baum has made substantial progress in this project, the research has not been without challenges. The project began in 2013 and took three years to complete, a long time for his field of study. Because of this span, students rotated in and out while working on various parts of the project, so the co-authors from Rice University never worked on it at the same time and largely do not know one another. In addition, the project’s interdisciplinary approach required the input of many collaborators with different abilities. The Rice undergraduate students who worked on the project came from several majors, though most were from the Cognitive Sciences Department and the Statistics Department. Because the students came from different backgrounds, they had different approaches to solving problems, which at times led to miscommunication and friction during many aspects of the project.

Another major setback occurred in bringing ideas to fruition. “You realize quickly when you begin a project that there are a million different ways to solve the problem that you are researching, and trying to decide which is the right or best way can sometimes be difficult,” Dr. Fischer-Baum said. As a result, there have been a lot of false starts, and it has taken a long time to get the work off the ground. How did Dr. Fischer-Baum get past this problem? “Time, thinking, discussion, and brute force,” he chuckled. “You realize relatively quickly that you need to grind it out and put in effort in order to get the job done.”

Alongside these challenges, Dr. Fischer-Baum has undertaken other projects to keep his mind busy. In one, he works with stroke patients with either reading or writing deficits to understand how written language is broken down in the mind, studying specific patterns in the patients’ brain activity to investigate how reading and writing ability differ from each other. In another project, he works with Dr. Robert Englebretson of the Linguistics Department to study the brain activity of blind people as they read braille. “There is a lot of work on how the reading system works, but a lot of it is based on the perspective of reading by sight,” Dr. Fischer-Baum acknowledged. “I am very interested to see how the way we read is affected by properties of our visual system. Comparing sight and touch can show how much senses are a factor in reading.”

Ultimately, Dr. Fischer-Baum conducts his research with several goals in mind. The first is to build an approach to cognitive neuroscience that is relevant to the kinds of theories found in the other cognitive sciences, especially cognitive psychology. “While it feels like studying the mind and studying the brain are two sides of the same coin and that all of this data should be relevant for understanding how the human mind works, there is still a disconnect between the two disciplines,” Dr. Fischer-Baum remarked. He works on building methods to bridge this disconnect.

Beyond these goals for advancing the field of cognitive neuroscience, Dr. Fischer-Baum’s research has clinical implications as well. Insight into brain plasticity following strokes can be used to build better treatment and recovery programs. Although the research requires further development, the similarity between different regions and their adaptations following injury can lead to a better understanding of the behavioral and neural differences in patterns of recovery. Additionally, Dr. Fischer-Baum aims to understand the relationship between spontaneous and treatment-induced recovery, and how patterns of language recovery differ as a result of the type and location of the initial brain injury. Through the combined use of cognitive psychology and fMRI data, the brains of different stroke patients can be mapped, and the data can be used to create more successful treatment-induced methods of language recovery. By virtue of Dr. Fischer-Baum’s research, not only can cognitive neuroscience be applied to many other disciplines, but it can also significantly improve the lives of millions of people around the world.


Tactile Literacy: The Lasting Importance of Braille

On June 27th, 1880, a baby girl was born. At nineteen months old, the little girl contracted a severe fever, and once the fever dissipated, she woke up to a world of darkness and silence. This little girl was Helen Keller, who before the age of two had completely lost her senses of sight and hearing.

Over a century later, an estimated 285 million people worldwide are visually impaired, of whom 39 million are blind.1 Blindness ranges from legal blindness, corrected vision of 20/200 or worse, to the complete inability to see.2 For Keller to absorb the information around her, she relied on the sensation of touch. The invention of the braille alphabet by Frenchman Louis Braille in the early 1800s allowed Keller to learn about the world and to communicate with others. Like Keller, many of the visually impaired today rely on braille as their main method of reading.

The technological advances of smartphones, artificial intelligence, and synthetic speech dictations have opened a whole new world for blind readers. With the advent of the electronic information age, it’s easy to think that blind people don’t need to rely on braille anymore to access information. In fact, braille literacy rates for school-age blind children have already declined from 50 percent 40 years ago to only 12 percent today.3 While current low literacy rates may be in part due to the inclusion of students with multiple disabilities that inhibit language acquisition, these statistics still reveal a major concern about literacy amongst the visually impaired. To substitute synthetic speech for reading and writing devalues the importance of learning braille.

“There are many misunderstandings and stereotypes of braille readers,” says Dr. Robert Englebretson, Professor of Linguistics at Rice University. “When a person reads, they learn about spelling and punctuation, and it’s the exact same for tactile readers. Humans better process information when they actively process it through reading instead of passively listening.”

Dr. Englebretson is himself blind, and one part of his research agenda is a collaborative project with Dr. Simon Fischer-Baum in Psychology on the cognitive and linguistic importance of braille to braille readers. He explores questions surrounding the nature of perception and reading, and the ways the mind groups the input of touch into larger pieces to form words.

In order to understand how written language is processed by tactile readers compared to visual readers, Dr. Englebretson conducted experiments to find out if braille readers exhibit an understanding of sublexical structures, or parts of words, similar to that of visual readers. An understanding of sublexical structures is crucial in recognizing letter groupings and acquiring reading fluency. Visual readers recognize sublexical structures automatically as the eye scans over words, whereas tactile readers rely on serially scanning fingers across a line of text.

To explore whether blind readers have such an understanding of sublexical structures, Dr. Englebretson measured the reaction times of braille readers. Subjects were asked to judge whether presented letter strings were real words or pseudowords, and the time taken to respond was recorded. The first experiment tested the ability of braille readers to identify digraphs, or letter pairs within words; the second tested their ability to identify morphemes, the smallest units of meaning or grammatical function in a word. Dr. Englebretson and his team developed a foot pedal system that enabled the braille readers to indicate their answers without pausing to click a screen as the visual readers did, allowing them to keep their hands continuously on the text while reading. The reaction times recorded when readers were presented with morphologically complex words showed evidence that braille readers process the meaning of words and recognize these digraphs and morphemes.4
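
As a rough illustration of how such reaction-time data can be analyzed, the sketch below compares simulated response times for morphologically complex words against matched control words. The numbers and the direction of the effect are invented for illustration; they are not the study’s data or results.

```python
# Simulated lexical-decision reaction times (milliseconds).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
rt_complex = rng.normal(loc=780, scale=90, size=40)  # e.g., "teach" + "-er"
rt_control = rng.normal(loc=820, scale=90, size=40)  # matched whole words

# A systematic difference between conditions would suggest that
# readers decompose words into sublexical units such as morphemes.
t, p = stats.ttest_ind(rt_complex, rt_control)
print(f"complex: {rt_complex.mean():.0f} ms, control: {rt_control.mean():.0f} ms, "
      f"t = {t:.2f}, p = {p:.3f}")
```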

“What we discovered was that tactile readers do rely on sublexical structures and have similar cognitive processes to print readers,” says Dr. Englebretson. “The belief that braille is old-fashioned and not needed anymore is far from the truth. Tactile reading provides an advantage in learning just as visual reading does.”

Dr. Englebretson also gathered a large sample of braille readers and videotaped them reading using a finger-tracking system. Much as an eye-tracking system follows eye movements, the finger-tracking system uses a camera to follow LED lights attached to the backs of the fingernails, and the x-y coordinates of each LED are plotted over time. The system can track where each finger is, how fast it is moving, and the movements made during regressions, the right-to-left re-reading movements of the finger.5 While this test was independent of the experiment on sublexical structures, the data collected offers researchers a new paradigm for studying braille reading.
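
To give a sense of what can be done with such coordinate data, here is a minimal sketch that detects regressions from a tracked x-coordinate, assuming (for illustration only) a camera sampling at 60 Hz; regressions show up as spans of negative horizontal velocity.

```python
# Detecting right-to-left "regression" movements in simulated
# finger-tracking data (x-position in pixels, sampled at 60 Hz).
import numpy as np

fps = 60.0
x = np.cumsum(np.full(300, 2.0))        # steady left-to-right reading
x[150:170] -= np.linspace(0, 100, 20)   # one simulated regression

velocity = np.gradient(x) * fps         # pixels per second
regressing = velocity < 0               # right-to-left movement
print(f"regression frames: {regressing.sum()} of {len(x)}")
```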

The outcome of these studies has not only scientific and academic implications, but also important social implications. “At the scientific level, we now better understand how perception [of written language] works, how the brain organizes and processes written language, and how reading works for tactile and visual readers,” says Dr. Englebretson. “Through understanding how tactile readers read, we will hopefully be able to implement policy on how teachers of blind and visually impaired students teach, and on how to guide the people who are working on updating and maintaining braille.”

With decreasing literacy rates among braille readers, an evidence-based approach to the teaching of braille is as critical as continuing to implement braille literacy programs. With an understanding of braille, someone who is blind can not only access almost infinite pages of literature, but also make better sense of their language and world.

References

  1. World Health Organization. http://www.who.int/mediacentre/factsheets/fs282/en/ (accessed Jan. 9, 2017).
  2. National Federation of the Blind. https://nfb.org/blindness-statistics (accessed Jan. 9, 2017).
  3. National Braille Press. https://www.nbp.org/ic/nbp/braille/needforbraille.html (accessed Jan. 10, 2017).
  4. Fischer-Baum, S.; Englebretson, R. ScienceDirect 2016. http://www.sciencedirect.com/science/article/pii/S0010027716300762 (accessed Jan. 10, 2017).
  5. Ulusoy, M.; Sipahi, R. PLoS ONE 2016, 11. http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0148356 (accessed Jan. 10, 2017).


The Secret Behind Social Stigma

How do you accurately quantify something as subjective and controversial as discrimination? What about stigma, a superficial mark imposed upon a prototypical group of individuals? How do you attempt to validate what is seemingly invisible? Dr. Michelle “Mikki” Hebl and her team in the Industrial/Organizational (I/O) area of social psychology at Rice University attempt to answer these questions.

In the world of social psychology, where human interactions are often unpredictable, researchers must get creative to control variables as much as possible while still mimicking real-life situations. Dr. Hebl integrates both laboratory procedures and field studies that involve standardized materials. “My research is fairly novel,” she notes. Unlike the majority of existing stigma and discrimination research, which depends on self-reported assessments, her studies examine real, non-simulated social interactions. Although her approach provides more realistic and unbiased settings, “it’s messier,” she adds, laughing about the many trials discarded due to uncontrollable circumstances. That attitude, at once optimistic, determined, and creative, is one Dr. Hebl holds proudly, and it is clear that her lab’s overall mission, to reduce discrimination and increase equity, is worth undertaking.

Dr. Hebl and her team focus on a form of behavior they call “interpersonal discrimination,” a type of discrimination that occurs implicitly while still shaping the impressions we form and the decisions we make.1 This kind of bias, rooted in stereotypes and negative social stigma, is far more subtle than the more well-known, explicit forms of discrimination. For example, in a field study evaluating bias against homosexual applicants in Texas, Dr. Hebl found that members of both the experimental and control groups, who wore hats reading “Gay and Proud” and “Texan and Proud” respectively, did not experience formal bias when entering stores to seek employment; none of the subjects were denied job applications. What she did find, however, was a pattern of interpersonal reactions against the experimental group. Discreet recording devices worn by the subjects revealed shorter interactions and fewer words per sentence directed at the stigmatized group, and the subjects’ self-reports further indicated, on average, a higher perceived negativity and lower perceived employer interest.1 In another study evaluating obesity-related stigma, results showed that obese individuals, in this case subjects wearing obese prosthetic suits, experience similarly negative interactions.2

While many of her studies evaluated biases in seeking employment, Dr. Hebl has also explored the presence of interpersonal discrimination against lesser-known groups that experience bias. One surprising finding indicated negative stigmatization of cancer survivors.3 In other studies, the team found patterns relating to stereotypicality; this relatively new line of work explores the lessened interpersonal discrimination faced by those who deviate from the stereotypical prototype of their minority group, e.g., a lighter-skinned Hispanic male.4 A holistic review of her research reveals a pattern of discrimination against stigmatized groups at an implicit level. Once researchers like Dr. Hebl find these patterns, they can investigate them in the lab by further isolating variables to develop more refined and widely applicable conclusions.

What can make more subtle forms of bias so detrimental is the ambiguity surrounding them. When someone discriminates against another in a clear and explicit form, one can easily attribute the behavior to the person’s biases. On the other hand, when this bias is perceived in the form of qualitative behavior, such as shortened conversations and body language, it raises questions regarding the person’s intentions. In these cases, the victim often internalizes the negative treatment, questioning the effect of traits that they cannot control—be it race, sexual orientation, or physical appearance. This degree of uncertainty raises conflict and tension between differing groups, thus potentially hindering progress in today’s increasingly diverse workplaces, schools, and universities.5

Dr. Hebl knew that exploring the presence of this tension between individuals was only the first step. “One of the most exciting aspects of social psychology is that just learning about these things makes you inoculated against them,” she said. Thus emerges the search for practical solutions involving education and the reformation of conventional practices in the workplace. Her current work looks at three primary methods. The first is acknowledging biases on an individual level; this strategy involves individuation, or the recognition of one’s own stigma and subsequent compensation for it.6 The second involves implementing organizational methods in the workplace, such as providing support for stigmatized groups and awareness training.7 The third, which has the most transformative potential, is the use of research to support the reformation of policies that could protect these individuals.

“I won’t rest…until we have equity,” she affirmed when asked about the future of her work. For Dr. Hebl, the ultimate goal is education and change. Human interactions are incredibly complex, unpredictable, and difficult to quantify, but they influence our daily decisions and actions, ultimately impacting how we view ourselves and others. Social psychology research suggests that biases, whether we realize it or not, are involved in the choices we make every day, from whom we speak to, to whom we work with. Dr. Hebl saw this and decided to do something about it. Her work brings us to the complex source of these disparities and suggests that understanding their foundations can lead to real, desirable change.

References

  1. Hebl, M. R.; Foster, J. B.; Mannix, L. M.; Dovidio, J. F. Pers. Soc. Psychol. B. 2002, 28 (6), 815–825.
  2. Hebl, M. R.; Mannix, L. M. Pers. Soc. Psychol. B. 2003, 29 (1), 28–38.
  3. Martinez, L. R.; White, C. D.; Shapiro, J. R.; Hebl, M. R. J. Appl. Psychol. 2016, 101 (1), 122–128.
  4. Hebl, M. R.; Williams, M. J.; Sundermann, J. M.; Kell, H. J.; Davies, P. G. J. Exp. Soc. Psychol. 2012, 48 (6), 1329–1335.
  5. Szymanski, D. M.; Gupta, A. J. Couns. Psychol. 2009, 56 (2), 300–300.
  6. Singletary, S. L.; Hebl, M. R. J. Appl. Psychol. 2009, 94 (3), 797–805.
  7. Martinez, L. R.; Ruggs, E. N.; Sabat, I. E.; Hebl, M. R.; Binggeli, S. J. Bus. Psychol. 2013, 28 (4), 455–466.


Haptics: Touching Lives

Every day you use a device that has haptic feedback: your phone. Every little buzz for a notification, key press, or failed unlock is an example of haptic feedback. Haptics is essentially tactile feedback, a form of physical feedback that uses vibrations. The field is undergoing massive development, and applications of haptic technology are expanding rapidly. Some of the up-and-coming uses for haptics include navigational cues while driving, video games, virtual reality, robotics, and, as in Dr. O’Malley’s case, the medical field, with prostheses and medical training tools.

Dr. Marcia O’Malley has been involved in the biomedical field ever since working in an artificial knee implant research lab as an undergraduate at Purdue University. While in graduate school at Vanderbilt University, she worked in a lab focused on human-robot interfaces, where she spent her time designing haptic feedback devices. Dr. O’Malley currently runs the Mechatronics and Haptic Interfaces (MAHI) Lab at Rice University, and she was recently awarded a million-dollar National Robotics Initiative grant for one of her projects. The MAHI Lab “focuses on the design, manufacture, and evaluation of mechatronic or robotic systems to model, rehabilitate, enhance or augment the human sensorimotor control system.”1 Her current research is focused on prosthetics and rehabilitation with an effort to include haptic feedback. One device she is working on is the MAHI EXO-II. “It’s a force feedback exoskeleton, so it can provide forces, it can move your limb, or it can work with you,” she said. The primary project involving this exoskeleton is focused on “using electrical activity from the brain captured with EEG… and looking for certain patterns of activation of different areas of the brain as a trigger to move the robot.” In other words, Dr. O’Malley is attempting to enable exoskeleton users to control the device through brain activity.

Dr. O’Malley is also conducting another project, funded by the National Robotics Initiative grant, to develop a haptic cueing system to aid medical students training for endovascular surgeries. The idea for this system came from two different sources. The first was her prior research with joysticks: she worked on a project that used a force-feedback joystick to swing a ball to hit targets.2 Through this research, Dr. O’Malley found that “we could measure people’s performance, we could measure how they used the joystick, how they manipulated the ball, and just from different measures about the characteristics of the ball movement, we could determine whether you were an expert or a novice at the task… If we use quantitative measures that tell us about the quality of how they’re controlling the tools, those same measures correlate with the experience they have.” After talking to surgeons, Dr. O’Malley realized that these techniques for measuring movement could work well for training surgeons.
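
The article does not specify which quantitative measures the lab used, but a standard smoothness measure in the motor-control literature is jerk, the rate of change of acceleration: smooth, expert-like movements accumulate far less squared jerk than hesitant, corrective ones. A minimal sketch, purely for illustration:

```python
# Comparing movement smoothness via mean squared jerk.
import numpy as np

def mean_squared_jerk(position, dt):
    """Lower values indicate smoother movement."""
    velocity = np.gradient(position, dt)
    acceleration = np.gradient(velocity, dt)
    jerk = np.gradient(acceleration, dt)
    return float(np.mean(jerk ** 2))

dt = 0.01                                # 100 Hz sampling
t = np.arange(0, 2, dt)
smooth = np.sin(np.pi * t / 2)           # one smooth stroke
rng = np.random.default_rng(2)
jerky = smooth + 0.02 * rng.normal(size=t.size)  # same path, noisy control

print(mean_squared_jerk(smooth, dt))     # small
print(mean_squared_jerk(jerky, dt))      # orders of magnitude larger
```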

The second impetus for this research came from an annual conference about haptics and force feedback. At the conference she noticed that more and more people were moving towards wearable haptics, such as the Fitbit, which vibrates on your wrist. She also saw that everyone was using these vibrational cues to give directional information. However, “nobody was really using it as a feedback channel about performance,” she said. These realizations led to the idea of the vibrotactile feedback system.

Although the project is still in its infancy, the currently anticipated product is a virtual reality simulator that will track the movements of the tool. According to Dr. O’Malley, the technology would provide feedback through a single vibrotactile disk worn on the upper limb. The disk would use a voice coil actuator that moves perpendicular to the wearer’s skin. Dr. O’Malley is currently working with Rice psychologist Dr. Michael Byrne to determine which frequency and amplitude to use for the actuator, as well as the timing of the feedback, to avoid interrupting or distracting the user.
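
For a sense of the design space, a voice coil actuator is typically driven with a short sinusoidal burst, so the tunable parameters are essentially frequency, amplitude, and duration. The values below are illustrative assumptions, not the choices made by Dr. O’Malley’s team.

```python
# Generating the drive signal for one vibrotactile cue.
import numpy as np

def vibration_burst(freq_hz=250.0, amplitude=0.8, duration_s=0.15,
                    sample_rate=8000):
    """Sinusoidal burst; ~250 Hz sits near peak skin sensitivity."""
    t = np.arange(0, duration_s, 1.0 / sample_rate)
    return amplitude * np.sin(2 * np.pi * freq_hz * t)

cue = vibration_burst()
print(cue.shape, round(float(cue.max()), 2))
```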

Ultimately, this project would measure the medical students’ smoothness and precision while using tools and give them feedback on their performance. In the future, it could also be used during actual surgeries in which a doctor operates a robot and receives force feedback through similar haptics. During current endovascular surgery, a surgeon works from screens that project a 2D image of the tools inside the patient; incorporating 3D views would require further FDA approval and could distract and confuse surgeons given the number of screens they would have to monitor. This project would offer surgeons a simpler way to operate. From exoskeletons to medical training, there is huge potential for haptic technologies, and Dr. O’Malley is making this potential a reality.

References

  1. Mechatronics and Haptic Interfaces Lab Home Page. http://mahilab.rice.edu (accessed Nov. 7, 2016).
  2. O’Malley, M. K. et al. J. Dyn. Sys., Meas., Control. 2005, 128 (1), 75-85.

The Depressive Aftermath of Brain Injury

One intuitively knows that experiencing a brain injury is often painful and terrifying; the fact that it can lead to the onset of depression, however, is a lesser-known but equally serious concern. Dr. Roberta Diddel, a clinical psychologist and member of the adjunct faculty in the Psychology Department at Rice University, focuses on the treatment of individuals with mental health issues and cognitive disorders. In particular, she administers care to patients with cognitive disorders due to traumatic brain injury (TBI). Dr. Diddel earned a PhD in clinical psychology from Boston University and currently runs a private practice in Houston, Texas. Because patients who experience TBI often develop depression, Dr. Diddel uses her understanding of how the disorder arises to create and administer potential treatments.

TBI affects each patient differently based on which region of the brain is damaged. If a patient has a cerebellar stroke, affecting the region of the brain that regulates voluntary motor movements, he or she might experience dizziness and have trouble walking. However, that patient would still be able to take a written test, because the injury has not affected higher-order cognitive functions such as language processing and critical reasoning.

Dr. Diddel said, “Where you see depression the most is when there is a more global injury, meaning it has affected a lot of the brain. For example, if you hit your forehead in a car accident or playing a sport, you’re going to have an injury to the front and back parts of your brain because your brain is sitting in cerebrospinal fluid, causing a whiplash of sorts. In turn, this injury will cause damage to your frontal cortex, responsible for thought processing and problem solving, and your visual cortex, located in the back of your brain. When your brain is bouncing around like that, you often have swelling which creates intracranial pressure. Too much of this pressure prevents the flow of oxygen-rich blood to the brain. That can cause more diffuse brain injury.”

In cases of severe brain injury, such as head trauma due to an explosion or a bullet, surgeons may remove blood clots that have formed, relieve intracranial pressure, and repair skull fractures.4 They may also remove a section of the skull for weeks or months at a time to let the brain swell unrestricted by the small cranial cavity. That procedure alone significantly reduces the damage from such injuries and is especially useful on the battlefield, where urgent care trauma centers may not be available.

Depression is a common result of TBI. The Diagnostic and Statistical Manual of Mental Disorders (DSM) defines depression as a loss of interest or pleasure in daily activities for more than two weeks, accompanied by a change in mood and impaired function in society.1 These symptoms stem from biochemical deficiencies in the brain that disrupt the nervous system. Usually, depression occurs due to physical changes in the prefrontal cortex, the area of the brain associated with decision-making, social behavior, and personality. People with depression feel overwhelmed and anxious, lose their appetite, and lack energy, often because of depleted serotonin levels. The disorder is a mixture of chemical imbalance and state of mind; if the brain is not functioning correctly, a depressed state of mind will follow.

Dr. Diddel mentioned that in many of her depressed patients, their lack of motivation prevents them from addressing and improving their toxic mindset. “If you’re really feeling bad about your current situation, you have to be able to say ‘I can’t give in to this. I have to get up and better myself and my surroundings.’ People that are depressed are struggling to do that,” she said.

The causes of depression vary from patient to patient and often depend on genetic predisposition to the disease. Depression can arise from physical changes in the brain, such as alterations in the levels of catecholamines, neurotransmitters that work throughout the sympathetic and central nervous systems. The catecholamines include dopamine, norepinephrine, and epinephrine, which are released during times of positive stimulation and help increase activity in specific parts of the brain. A decrease in these chemicals after an injury can affect emotion and thought processes. Emotionally, the patient might have a hard time dealing with a new disability or a change in societal role due to the trauma. Additionally, patients who carried genes predisposing them to depression before the injury are more prone to suffering from the disorder after the injury.2,3

Depression is usually treated with some form of therapy or antidepressant medication. In cognitive behavioral therapy (CBT), the psychologist tries to change the perceptions and behaviors that exacerbate a patient’s depression. Generally, the doctor starts by attempting to change the patient’s behavior, because it is the aspect of his or her current situation that can most directly be addressed. Dr. Diddel suggests such practices to her patients, saying things like “I know you don’t feel like it, but I want you to go out and walk every day.” Walking, or any form of exercise, increases catecholamines, which in turn increases the activity of serotonin in the brain and improves the patient’s mood. People who exercise as part of their treatment regimen are also less likely to experience another episode of depression.

The efficacy of antidepressant medication varies from patient to patient depending on the severity of the depression. People with mild to moderate depression generally respond better to CBT, because the treatment aims to change their mindset and how they perceive the world around them. CBT can lead a patient’s depression to gradually resolve as he or she perceives surrounding stimuli differently, gets out and moves more, and pursues healthy endeavors. Psychologists usually begin with CBT, and if the patient does not respond well, medication is added. Some medications increase serotonin levels, while others target serotonin, dopamine, and norepinephrine together; in either case, they boost the levels of neurotransmitters that increase arousal and dampen negative emotions. Patients with moderate to severe depression usually respond better to antidepressant medication, which can restore ideal levels of neurotransmitters and in turn encourage the patient to practice healthier behavior.

According to the Centers for Disease Control and Prevention, the US saw about 2.5 million cases of traumatic brain injury in 2010 alone.5 That number rises every year, and with it the number of patients who suffer from depression in the aftermath.5 Though the disorder has been studied for decades, and treatment options and medications are available, depression is still an enigma to physicians and researchers alike. No two brains are wired the same, making it very difficult to devise a treatment plan with a guaranteed success rate. The work of researchers and clinical psychologists like Dr. Diddel, however, aims to improve the treatments currently available. While no two patients are the same, understanding each individual’s depression and tailoring treatment to the specific case can vastly improve the patient’s outcome.

References

  1. American Psychiatric Association. Diagnostic and statistical manual of mental disorders (5th ed.). Washington, DC, 2013.
  2. Fann, J. Depression After Traumatic Brain Injury. Model Systems Knowledge Translation Center [Online]. http://www.msktc.org/tbi/factsheets/Depression-After-Traumatic-Brain-Injury (accessed Dec. 28, 2016).
  3. Fann, J.R.; Hart, T.; Schomer, K.G. J. Neurotrauma. 2009, 26, 2383-2402.
  4. Mayo Clinic Staff. Traumatic Brain Injury. Mayo Clinic, May 15, 2014. http://www.mayoclinic.org/diseases-conditions/traumatic-brain-injury/basics/treatment/con-20029302 (accessed Dec. 29, 2016).
  5. Injury Prevention and Control. Centers for Disease Control and Prevention. https://www.cdc.gov/traumaticbraininjury/get_the_facts.html (accessed Dec. 29, 2016).
