Welcome to The Neuromorphic Engineer

Sensory feedback for body awareness in prosthetic applications


Alejandro Hernandez Arieta, Konstantinos Dermitzakis, and Dana Damian

19 April 2010

With the ability to feel through artificial limbs, users regain more function and increasingly see the prosthetic as part of their own body.

Every time we perform an action, our brain receives a large amount of sensory information about both the environment and our own bodies. All this information is used to continuously update our model of the world and act accordingly. Sensory-motor coordination—literally the coordination of our sensory and motor systems—allows us to create a mental representation of our own bodies. When this coordination is disrupted, as is the case with amputees, our ability to interact with the environment is reduced. The lack of sensory feedback from the missing limb means our representation of our bodies is no longer complete, which can produce a ‘phantom limb’ effect. This is generally experienced as a false sensation in the missing limb, which can include severe pain.1

Though motor control can, to an extent, be restored through myoelectric-controlled prosthetic devices,3 providing rich feedback to the user is still an open challenge. The major problem for amputees using such devices is a lack of awareness of the artificial limb: it has not been ‘incorporated’ into their body representation, or body schema.4 A side-effect is that users experience an increased cognitive load while using the prosthesis, which in turn leads many to abandon it.5 The principal goal of our research is to find ways to include rich sensory feedback in prosthetic devices that would aid their incorporation into the user's body representation or schema. This should reduce the cognitive load and increase the device's functional repertoire.

There have been several attempts to provide tactile feedback in prosthetic applications.6 Our studies build on two approaches for conveying information to prosthesis users: electrical stimulation of the skin, and audio feedback modulation. For the former, we developed a high-frequency electrical stimulator2 to convey somatosensory feedback. Depending on the waveform and intensity provided to the stimulator, we can generate a wide range of sensations. Skin stimulation faces the challenge of habituation, where the perceived intensity of the stimuli is reduced over time. However, by using high-frequency electrical stimulation, we can reduce these habituation effects and transmit information for more extended periods of time,2 as can be seen in Figure 1.

We also explored the use of audio feedback to convey proprioceptive information. That study found similar performance between groups receiving audio feedback and groups receiving visual feedback.7


Figure 1. A comparison of stimulation methods tested to determine how best to overcome habituation. We measured the recognition rate of five stimulation frequencies over five trials with two stimulation methods: (a) shows a drop in the recognition rate, whereas (b) shows a more stable recognition rate over the same period.2

To evaluate the effects of the interaction between sensory and motor modalities, we performed experiments using an electromyography- (EMG-) controlled prosthetic hand with an amputee patient. Using an fMRI scanner, we recorded the activation of the motor and somatosensory cortices during her interaction with the robot hand. The participant controlled it using the EMG signals from her right arm (stump), while the electrical stimulation was administered to the contralateral upper arm.8,9
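A typical myoelectric control loop rectifies and smooths the raw EMG signal into an envelope, then maps that envelope to a discrete hand command. The sketch below shows this idea with hysteresis between the open and close thresholds; it is not the controller used in our study, and the window length and threshold values are illustrative assumptions.

```python
import numpy as np

def emg_envelope(emg, sample_rate=1000, window_ms=200):
    """Rectify a raw EMG trace and smooth it with a moving average."""
    rectified = np.abs(emg)
    win = int(sample_rate * window_ms / 1000)
    kernel = np.ones(win) / win
    return np.convolve(rectified, kernel, mode="same")

def hand_command(envelope, close_thresh=0.4, open_thresh=0.1):
    """Map an EMG envelope to discrete open/close commands.

    Hysteresis (close_thresh > open_thresh) prevents the hand from
    chattering when the envelope hovers near a single threshold.
    """
    commands = []
    state = "open"
    for level in envelope:
        if state == "open" and level > close_thresh:
            state = "close"
        elif state == "close" and level < open_thresh:
            state = "open"
        commands.append(state)
    return commands
```

In a closed-loop setup such as the one described above, each commanded grasp would additionally trigger the electrical stimulation that provides the haptic feedback.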

Our studies revealed an interesting brain response when the stimulation occurs as an isolated event. In that case, the parietal lobe perceives the stimulus presented to the body as an uncorrelated event, without a strong mapping to the somatosensory area, as can be seen in Figure 2. However, when electrical stimulation was used to provide haptic feedback while grasping an object with the robot hand—even though the stimulus was provided to the arm opposite the one used to control the robot hand—we detected activation in the somatosensory area of the left hemisphere. This can be seen in Figure 3.


Figure 2. With passive electrical stimulation, we detected diffuse activation over the parietal and occipital lobes of the patient.


Figure 3. With active electrical stimulation, the motor and somatosensory cortices showed increased activity in the left hemisphere of the amputee's brain.

The results presented here provide insights into the mechanisms responsible for generating body-image representations. The production of this tactile illusion points to a correlative behavior of the brain in processing active motor commands and their corresponding feedback. This behavior can be used to shape the interaction between an amputee and a prosthetic device, promoting the ‘incorporation’ of the latter into the amputee's mental body representation. In future work, we will explore the benefits of using multisensory feedback to promote this incorporation, which we believe will reduce the currently constant requirement of conscious mental activity in controlling a prosthetic device.




Authors

Alejandro Hernandez Arieta
Artificial Intelligence Laboratory, University of Zurich

Alejandro Hernandez Arieta received his PhD in precision engineering from the University of Tokyo in 2007. He has been a research fellow at the AI Lab, U. Zurich, since November 2007 and his research involves functional electrical stimulation, prosthetic devices, and sensory neuroprostheses. He is a member of the IEEE Engineering in Medicine and Biology Society.

Konstantinos Dermitzakis
Artificial Intelligence Laboratory, University of Zurich

Konstantinos Dermitzakis received his BSc in Computer Science and MSc in AI from the University of Edinburgh and is currently a PhD candidate at the AI Lab, UZH. His research interests are in prosthetics, bionics, sensorimotor control, machine learning and neurosciences in general.

Dana Damian
Artificial Intelligence Laboratory, University of Zurich

Dana D. Damian received her BS and MS degrees in Computer Science from the Technical University Timisoara, Romania, in 2007. Since then, she has been a PhD student in the Artificial Intelligence Laboratory, under the supervision of Prof. Rolf Pfeifer. Her research interests include human-robot interaction, sensory substitution, and prosthetic adaptation metrics.


References
  1. H. Flor, T. Elbert, W. Mühlnickel, C. Pantev, C. Wienbruch and E. Taub, Cortical reorganization and phantom phenomena in congenital and traumatic upper-extremity amputees, Exp. Brain Res. 119 (2), pp. 205-212, March 1998.

  2. A. Hernandez Arieta, Development of a multi-channel functional electrical stimulation system for prosthetic applications of limbs, PhD thesis, The University of Tokyo, October 2007.

  3. M. Asghari Oskoei and H. Hu, Myoelectric control systems—a survey, Biomed. Sig. Proc. and Control 2 (4), pp. 275-294, 2007. http://www.sciencedirect.com/science/article/B7XMN-4PKP4MN-1/2/7f6f6653d2da796b0325e86cdfd50e1f.

  4. A. Karl, N. Birbaumer, W. Lutzenberger, L. G. Cohen and H. Flor, Reorganization of motor and somatosensory cortex in upper extremity amputees with phantom limb pain, J. Neurosci. 21 (10), pp. 3609-3618, May 2001.

  5. E. A. Biddiss and T. T. Chau, Upper limb prosthesis use and abandonment: A survey of the last 25 years, Prosthetics and Orthotics Int'l 31 (3), pp. 236-257, 2007.

  6. Y. Visell, Tactile sensory substitution: Models for enaction in HCI, Interacting with Computers 21 (1–2), pp. 38-53, 2009. Special issue: Enactive Interfaces http://www.sciencedirect.com/science/article/B6V0D-4TB7792-1/2/e62be0c137b2cdf45735f54a7110c683.

  7. J. González, W. Yu and A. Hernandez-Arieta, Multichannel audio biofeedback for dynamical coupling between prosthetic hands and their users, Industrial Robot: An Int'l J. 37 (2), pp. 148-156, 2010.

  8. A. Hernandez Arieta, K. Dermitzakis, D. Damian, M. Lungarella and R. Pfeifer, Sensory-motor coupling in rehabilitation robotics, Handbook of Service Robotics, pp. 21-36, I-Tech Education and Publishing, 2008.

  9. R. Kato, H. Yokoi, A. Hernandez Arieta, W. Yu and T. Arai, Mutual adaptation among man and machine by using f-MRI analysis, Robot. Auton. Syst. 57 (2), pp. 161-166, 2009.


 
DOI:  10.2417/1201004.002880



