Every day you use a device with haptic feedback: your phone. The little buzzes for notifications, key presses, and failed unlocks are all examples of haptic feedback. Haptics is tactile feedback, a form of physical feedback delivered through vibration. The field is developing rapidly, and applications of haptic technology are expanding with it. Up-and-coming uses include navigational cues while driving, video games, virtual reality, robotics, and, as in Dr. O’Malley’s case, the medical field, with prostheses and medical training tools.

Dr. Marcia O’Malley has been involved in the biomedical field ever since working in an artificial knee implant research lab as an undergraduate at Purdue University. While in graduate school at Vanderbilt University, she worked in a lab focused on human-robot interfaces, where she spent her time designing haptic feedback devices. Dr. O’Malley currently runs the Mechatronics and Haptic Interfaces (MAHI) Lab at Rice University, and she was recently awarded a million-dollar National Robotics Initiative grant for one of her projects. The MAHI Lab “focuses on the design, manufacture, and evaluation of mechatronic or robotic systems to model, rehabilitate, enhance or augment the human sensorimotor control system.”1 Her current research focuses on prosthetics and rehabilitation that incorporate haptic feedback. One such project is the MAHI EXO-II. “It’s a force feedback exoskeleton, so it can provide forces, it can move your limb, or it can work with you,” she said. The primary project involving this exoskeleton is focused on “using electrical activity from the brain captured with EEG… and looking for certain patterns of activation of different areas of the brain as a trigger to move the robot.” In other words, Dr. O’Malley is attempting to enable exoskeleton users to control the device through brain activity.
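Her description of brain activity as a trigger is broadly similar to motor-imagery brain-computer interfaces, in which a drop in mu-band (8–12 Hz) EEG power over motor cortex signals movement intent. The Python sketch below illustrates that general idea; the channel, band, threshold, and windowing are hypothetical, and it is not the MAHI Lab’s actual detection pipeline.

```python
import numpy as np
from scipy.signal import welch

def mu_band_power(eeg_window, fs=250):
    """Average power in the 8-12 Hz (mu) band for one EEG channel window
    (the window should span at least one second of data)."""
    freqs, psd = welch(eeg_window, fs=fs, nperseg=fs)
    band = (freqs >= 8) & (freqs <= 12)
    return psd[band].mean()

def movement_intent_detected(eeg_window, resting_mu_power, fs=250, drop_ratio=0.6):
    """Trigger when mu power falls well below the resting baseline
    (event-related desynchronization), a common motor-imagery cue."""
    return mu_band_power(eeg_window, fs) < drop_ratio * resting_mu_power
```

In a system like the one she describes, a trigger of this kind would be what tells the exoskeleton to begin assisting the movement.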

Dr. O’Malley is also using the National Robotics Initiative grant for another project: a haptic cueing system to aid medical students training for endovascular surgeries. The idea for this haptic cueing system came from two sources. The first was her prior research with joysticks: she worked on a project in which a force-feedback joystick was used to swing a ball to hit targets.2 Through this research, Dr. O’Malley found that “we could measure people’s performance, we could measure how they used the joystick, how they manipulated the ball, and just from different measures about the characteristics of the ball movement, we could determine whether you were an expert or a novice at the task… If we use quantitative measures that tell us about the quality of how they’re controlling the tools, those same measures correlate with the experience they have.” After talking with surgeons, Dr. O’Malley realized that these movement-measurement techniques could also work well for surgical training.
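As a rough illustration of the kind of quantitative measure she describes, movement smoothness can be scored directly from a recorded joystick or tool trajectory. The sketch below computes the log dimensionless jerk, one widely used smoothness metric; it is a stand-in for illustration, not the specific measure used in her joystick study.

```python
import numpy as np

def log_dimensionless_jerk(position, dt):
    """Log dimensionless jerk: values near zero mean smooth, expert-like
    movement; large negative values mean jerky, novice-like movement."""
    vel = np.gradient(position, dt)    # velocity
    acc = np.gradient(vel, dt)         # acceleration
    jerk = np.gradient(acc, dt)        # jerk (rate of change of acceleration)
    duration = dt * (len(position) - 1)
    peak_speed = np.max(np.abs(vel))
    squared_jerk_integral = np.sum(jerk ** 2) * dt  # rectangle-rule integral
    return -np.log((duration ** 3 / peak_speed ** 2) * squared_jerk_integral)

# Example: a smooth minimum-jerk reach vs. the same reach with 8 Hz tremor
dt = 0.01
t = np.arange(0, 1 + dt, dt)
smooth = 10 * t**3 - 15 * t**4 + 6 * t**5
shaky = smooth + 0.01 * np.sin(2 * np.pi * 8 * t)
print(log_dimensionless_jerk(smooth, dt))  # closer to zero
print(log_dimensionless_jerk(shaky, dt))   # much more negative
```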

The second impetus for this research came from an annual conference about haptics and force feedback. At the conference she noticed that more and more people were moving towards wearable haptics, such as the Fitbit, which vibrates on your wrist. She also saw that everyone was using these vibrational cues to give directional information. However, “nobody was really using it as a feedback channel about performance,” she said. These realizations led to the idea of the vibrotactile feedback system.

Although the project is still in its infancy, the anticipated product is a virtual reality simulator that tracks the movements of the surgical tool. According to Dr. O’Malley, the technology would provide feedback through a single vibrotactile disk worn on the upper limb. The disk would use a voice coil actuator that moves perpendicular to the wearer’s skin. Dr. O’Malley is currently working with Rice psychologist Dr. Michael Byrne to determine which frequency and amplitude to use for the actuator, as well as the timing of the feedback, so that the cues do not interrupt or distract the user.
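To make the cue itself concrete, here is a minimal sketch, assuming a sinusoidal drive signal for the voice coil actuator. The frequency, amplitude, and pulse duration below are placeholder values; choosing those parameters, and when to deliver the cue, is exactly what the study with Dr. Byrne is meant to settle.

```python
import numpy as np

def vibrotactile_pulse(freq_hz=250.0, amplitude=0.5, duration_s=0.1,
                       sample_rate=8000):
    """Generate a short sine burst to drive a voice coil actuator.
    250 Hz is near the peak vibration sensitivity of human skin, but the
    real frequency, amplitude, and timing are open questions here."""
    t = np.arange(0, duration_s, 1.0 / sample_rate)
    burst = amplitude * np.sin(2 * np.pi * freq_hz * t)
    # 5 ms linear ramp at each end so the cue starts and stops gently
    ramp = np.clip(np.minimum(t, duration_s - t) / 0.005, 0.0, 1.0)
    return burst * ramp

# A pulse like this would be sent to the actuator driver whenever the
# simulator decides the trainee needs a performance cue.
pulse = vibrotactile_pulse()
```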

Ultimately, this project would measure the medical students’ smoothness and precision while using tools and give them feedback on their performance. In the future, it could also be used in surgeries during which a doctor operates a robot and receives force feedback through similar haptics. In current endovascular surgery, a surgeon relies on screens that project a 2D image of the tools inside the patient. Incorporating 3D views would require further FDA approval and could distract and confuse surgeons, given the number of screens they would have to monitor. This project would offer surgeons a simpler way to operate. From exoskeletons to medical training, there is huge potential for haptic technologies, and Dr. O’Malley is making that potential a reality.

References

  1. Mechatronics and Haptic Interfaces Lab Home Page. http://mahilab.rice.edu (accessed Nov. 7, 2016).
  2. O’Malley, M. K. et al. J. Dyn. Sys., Meas., Control. 2005, 128 (1), 75-85.
