Smart Hand – robotic hand gives amputees a sense of touch

10 02 2014

Developed by EU researchers, the Smart Hand is a complex prosthesis with four motors and forty sensors designed to provide realistic motion and sensation to the user. The sensors made it the first device of its kind to send signals back to the wearer, allowing them to feel what they touch. Phantom limb syndrome is the sensation amputees have that their missing body part is still there. The brain remains open to receiving input from the severed nerves long after the amputation, and impulses from the brain intended to control the missing limb still travel down the neurons toward the site of amputation.

Scientists can use electronic sensors to pick up these control signals and relay them to a mechanical device. We have seen this technology in the HAL exoskeleton from Cyberdyne and in the i-Limb prostheses. Smart Hand is unique because it also takes advantage of the phantom limb pathways that remain open: doctors connect the sensors in the hand to the nerves in the stump of the arm, so patients can feel, as well as control, the artificial limb.




The goal of the Smart Hand project is to create a replacement limb that is almost identical to the lost one. In both respects, motion and sensation, the Smart Hand is still far from that goal. Four motors, although providing an impressive range of motion, offer neither the full degrees of freedom nor the variation in applied force of a human hand. It is remarkable that the forty sensors can communicate with the human brain at all, but they provide nowhere near as much sensation as the millions of nerve endings in a biological hand. Yet, as mentioned in the video, the current Smart Hand prototype represents more than ten years of dedicated work.




Robin af Ekenstam, the first amputee to try the robotic hand, said it was just like using his real hand. His hand had been amputated to stop a tumor from spreading. He said: “It’s a feeling I have not had in a long time. When I grab something tightly I can feel it in the fingertips. It’s strange, since I don’t have them any more! It’s amazing.”

Even at this stage of development, he is able to use the prosthesis to pick up objects, with feedback delivered as a sense of touch at its fingertips. His involvement in the project makes clear that this level of capability is well worth the time and effort: an imperfect Smart Hand is still a very desirable hand, and can perform remarkable tasks.

Besides limb replacement, it is thought the hand could also help eliminate the ‘phantom pains’ that amputees suffer, and so improve their quality of life. Beyond that, if artificial limbs one day match human ones, there is no reason they could not be improved further. We would then see bionic limbs, or perhaps entirely bionic bodies, that exceed human limitations. Could these mechanical bodies be accepted as authentically human? The Olympic Committee has already decided that some athletes with prostheses have an unfair advantage and are ineligible to compete. In the years to come we will see how society reacts when “good enough” becomes “good as new” and finally “better than ever”. [online] Novi Sad (SRB), February 10, 2014 [ref. November 11, 2009] Available online:

The Role of Exponential Technologies in Medical Innovation

27 06 2013

An introduction to exponential technologies, with examples of how they can help us innovate and solve specific problems using imagination and creativity.



By Christian Assad: interventional cardiologist deeply interested in the incorporation of exponential technologies into medicine. Singularity University alumnus, FutureMed e-Magazine editor. [online] San Diego, CA (USA), June 27, 2013 [ref. June 2013] Available online:

Woman with Quadriplegia Feeds Herself Using Mind-Controlled Robot Arm

27 12 2012

PITTSBURGH, Dec. 16, 2012 – Reaching out to “high five” someone, grasping and moving objects of different shapes and sizes, feeding herself dark chocolate. For Jan Scheuermann and a team of researchers from the University of Pittsburgh School of Medicine and UPMC, accomplishing these seemingly ordinary tasks demonstrated for the first time that a person with longstanding quadriplegia can maneuver a mind-controlled, human-like robot arm in seven dimensions (7D) to consistently perform many of the natural and complex motions of everyday life.

In a study published in the online version of The Lancet, the researchers described the brain-computer interface (BCI) technology and training programs that allowed Ms. Scheuermann, 53, of Whitehall Borough in Pittsburgh, Pa. to intentionally move an arm, turn and bend a wrist, and close a hand for the first time in nine years.

Less than a year after she told the research team, “I’m going to feed myself chocolate before this is over,” Ms. Scheuermann savored its taste and announced as they applauded her feat, “One small nibble for a woman, one giant bite for BCI.”

Jan Scheuermann, who has quadriplegia, brings a chocolate bar to her mouth using a robot arm she is guiding with her thoughts. Researcher Elke Brown, M.D., watches in the background. Photo credit: UPMC

“This is a spectacular leap toward greater function and independence for people who are unable to move their own arms,” agreed senior investigator Andrew B. Schwartz, Ph.D., professor, Department of Neurobiology, Pitt School of Medicine. “This technology, which interprets brain signals to guide a robot arm, has enormous potential that we are continuing to explore. Our study has shown us that it is technically feasible to restore ability; the participants have told us that BCI gives them hope for the future.”

In 1996, Ms. Scheuermann was a 36-year-old mother of two young children, running a successful business planning parties with murder-mystery themes and living in California when one day she noticed her legs seemed to drag behind her. Within two years, her legs and arms progressively weakened to the point that she required a wheelchair, as well as an attendant to assist her with dressing, eating, bathing and other day-to-day activities. After returning home to Pittsburgh in 1998 for support from her extended family, she was diagnosed with spinocerebellar degeneration, in which the connections between the brain and muscles slowly, and inexplicably, deteriorate.

“Now I can’t move my arms and legs at all. I can’t even shrug my shoulders,” she said. “But I have come to the conclusion that worrying about something is experiencing it twice. I try to dwell on the good things that I have.”

A friend pointed out an October 2011 video about another Pitt/UPMC BCI research study in which Tim Hemmes, a Butler, Pa., man who sustained a spinal cord injury that left him with quadriplegia, moved objects on a computer screen and ultimately reached out with a robot arm to touch his girlfriend.

“Wow, it’s so neat that he can do that,” Ms. Scheuermann thought as she watched him. “I wish I could do something like that.” She had her attendant call the trial coordinator immediately, and said, “I’m a quadriplegic. Hook me up, sign me up! I want to do that!”

On Feb. 10, 2012, after screening tests to confirm that she was eligible for the study, co-investigator and UPMC neurosurgeon Elizabeth Tyler-Kabara, M.D., Ph.D., assistant professor, Department of Neurological Surgery, Pitt School of Medicine, placed two quarter-inch square electrode grids with 96 tiny contact points each in the regions of Ms. Scheuermann’s brain that would normally control right arm and hand movement.

“Prior to surgery, we conducted functional imaging tests of the brain to determine exactly where to put the two grids,” she said. “Then we used imaging technology in the operating room to guide placement of the grids, which have points that penetrate the brain’s surface by about one-sixteenth of an inch.”

The electrode points pick up signals from individual neurons and computer algorithms are used to identify the firing patterns associated with particular observed or imagined movements, such as raising or lowering the arm, or turning the wrist, explained lead investigator Jennifer Collinger, Ph.D., assistant professor, Department of Physical Medicine and Rehabilitation (PM&R), and research scientist for the VA Pittsburgh Healthcare System. That intent to move is then translated into actual movement of the robot arm, which was developed by Johns Hopkins University’s Applied Physics Lab.
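As a rough illustration of the decoding step described above, a linear decoder can map per-channel firing rates to velocity commands. This is a minimal sketch, not the study's actual model: the channel count matches the two 96-contact grids mentioned in the article, but the weights, rates, and the `decode_intent` helper are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

n_channels = 2 * 96          # two grids, 96 tiny contact points each
n_dims = 7                   # "7D" control: 3D reach, 3D wrist, grasp

# In the real study these would be learned during training sessions;
# here they are random placeholders.
decoder_weights = rng.normal(scale=0.01, size=(n_dims, n_channels))
baseline_rates = rng.uniform(5, 20, size=n_channels)   # spikes/sec at rest

def decode_intent(firing_rates: np.ndarray) -> np.ndarray:
    """Turn a vector of per-channel firing rates into a 7D velocity command."""
    modulation = firing_rates - baseline_rates   # deviation from rest
    return decoder_weights @ modulation

# One simulated time step: rates shift as a movement is imagined.
rates_now = baseline_rates + rng.normal(scale=2.0, size=n_channels)
command = decode_intent(rates_now)
print(command.shape)   # one velocity per controlled dimension
```

In practice such weights are fit during training, by regressing the recorded firing patterns against the movements the participant observes or imagines.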

Two days after the operation, the team hooked up the two terminals that protrude from Ms. Scheuermann’s skull to the computer. “We could actually see the neurons fire on the computer screen when she thought about closing her hand,” Dr. Collinger said. “When she stopped, they stopped firing. So we thought, ‘This is really going to work.’”

Within a week, Ms. Scheuermann could reach in and out, left and right, and up and down with the arm, which she named Hector, giving her 3-dimensional control that had her high-fiving with the researchers. “What we did in the first week they thought we’d be stuck on for a month,” she noted.

Before three months had passed, she also could flex the wrist back and forth, move it from side to side and rotate it clockwise and counter-clockwise, as well as grip objects, adding up to what scientists call 7D control. In a study task called the Action Research Arm Test, Ms. Scheuermann guided the arm from a position four inches above a table to pick up blocks and tubes of different sizes, a ball and a stone and put them down on a nearby tray. She also picked up cones from one base to restack them on another a foot away, another task requiring grasping, transporting and positioning of objects with precision.

“Our findings indicate that by a variety of measures, she was able to improve her performance consistently over many days,” Dr. Schwartz explained. “The training methods and algorithms that we used in monkey models of this technology also worked for Jan, suggesting that it’s possible for people with long-term paralysis to recover natural, intuitive command signals to orient a prosthetic hand and arm to allow meaningful interaction with the environment.”

In a separate study, researchers also continue to investigate BCI technology that uses an electrocorticography (ECoG) grid, which sits on the surface of the brain rather than slightly penetrating the tissue, as the grids used for Ms. Scheuermann do.

In both studies, “we’re recording electrical activity in the brain, and the goal is to try to decode what that activity means and then use that code to control an arm,” said senior investigator Michael Boninger, M.D., professor and chair, PM&R, and director of UPMC Rehabilitation Institute. “We are learning so much about how the brain controls motor activity, thanks to the hard work and dedication of our trial participants. Perhaps in five to 10 years, we will have a device that can be used in the day-to-day lives of people who are not able to use their own arms.”

The next step for BCI technology will likely use a two-way electrode system that can not only capture the intention to move, but in addition, will stimulate the brain to generate sensation, potentially allowing a user to adjust grip strength to firmly grasp a doorknob or gently cradle an egg.

After that, “we’re hoping this can become a fully implanted, wireless system that people can actually use in their homes without our supervision,” Dr. Collinger said. “It might even be possible to combine brain control with a device that directly stimulates muscles to restore movement of the individual’s own limb.”

For now, Ms. Scheuermann is expected to continue to put the BCI technology through its paces for two more months, and then the implants will be removed in another operation.

“This is the ride of my life,” she said. “This is the rollercoaster. This is skydiving. It’s just fabulous, and I’m enjoying every second of it.”

In addition to Drs. Collinger, Tyler-Kabara, Boninger and Schwartz, study co-authors include Brian Wodlinger, Ph.D., John E. Downey, Wei Wang, Ph.D., and Doug Weber, Ph.D., all of PM&R; and Angus J. McMorland, Ph.D., and Meel Velliste, Ph.D., of the Department of Neurobiology, Pitt School of Medicine.

One Giant Bite: Woman with Quadriplegia Feeds Herself Using Mind-Controlled Robot Arm VIDEO

The BCI projects are funded by the Defense Advanced Research Projects Agency, National Institutes of Health grant 8KL2TR000146-07, the U.S. Department of Veterans Affairs, the UPMC Rehabilitation Institute and the University of Pittsburgh Clinical and Translational Science Institute.

For more information about participating in the trials, call 412-383-1355. [online] Pittsburgh (USA), December 27, 2012 [ref. December 16, 2012] Available online:

Pioneering medical project: using ICT in the treatment of people with severe cognitive impairment

27 09 2012

ICT in severe cognitive impairment: this project, implemented by Casta Salud in Langreo, is based on harnessing the benefits that animal therapy can produce in this type of patient.


Casta Salud, part of the Eptisa group and the leading private company in the mental health and psychogeriatric services sector in Spain, has launched at its center in Langreo (Asturias) a pioneering medical project, funded by the European Union, whose objective is to investigate the use of new ICT technologies in the treatment of people with severe cognitive impairment. This ranges from minimal dementia, characterized by a limited and variable deficit in acquiring new information, to severe dementia, which involves a very significant loss of memory processes (with the gaps filled by confabulation), a total inability to solve problems, and even failure to recognize one's own family.

This project, implemented by Casta Salud in Langreo, is based on harnessing the psychological, physiological and social benefits that animal therapy can produce in these patients. To that end, Casta Salud has incorporated into its therapy NUKA, a robot shaped like a baby seal, with which it aims to apply animal-interaction techniques with these groups.

Nuka simulates animal interaction using five types of sensor (touch, light, hearing, temperature and posture). In this way, the robot can perceive people and its surroundings and offer a specific response to each situation. Residents do not distinguish the robot from a live animal, which for the therapeutic community amounts to a simulation exercise with an object that is real and basic to daily life.
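As a toy illustration of the stimulus-and-response behavior described above (invented for illustration, not NUKA's real control software), a simple lookup from sensor channel to behavior:

```python
# Hypothetical stimulus -> response table; the behaviors are invented
# examples of the kind of situation-specific reactions the article describes.
RESPONSES = {
    "touch": "turn head toward contact and purr",
    "light": "open eyes",
    "sound": "orient toward the voice",
    "cold": "snuggle and reduce movement",
    "upside_down": "cry until repositioned",
}

def react(stimulus: str) -> str:
    """Return the behavior for a sensed stimulus, with a default idle state."""
    return RESPONSES.get(stimulus, "idle blinking")

print(react("touch"))
```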


Main benefits

Among the main benefits pursued through the patient's interaction with NUKA are: a psychological effect, manifested as relaxation and motivation; a physiological effect, from which an improvement in vital signs follows; and a social effect, since communication is activated between patients and the professionals who care for them.

NUKA's reactions to the stimuli it receives bring a smile back to a depressed person, stimulate their sociability, ease emotional blocks, help control violent impulses, and provide activity and entertainment, reducing patients' dependence on their caregivers and fostering their autonomy. All of this leads to an improvement in the overall picture: cognition, mood, behavioral disturbances and patient communication.

Olga Ginés, General Director of Casta Salud, notes: “Since the start of the project, residents have performed better in therapy sessions, showing a greater degree of interaction with the resident community, together with an increased level of alertness and communicative ability.”


The institution received a grant of close to 70,000 euros to carry out this research, whose most expensive element is the robot itself, a sophisticated technological development from Japanese industry. Once patients' behavior with the mascot has been tested, this new form of therapy is expected to be rolled out at the Casta Salud centers in Guadarrama, Langreo, Arévalo and Ontiveros, which are currently home to more than 1,000 patients with mental illness.

Alongside Casta Salud, this project, a first in Spain, involves the Center for Research on Neurological Diseases, part of the Reina Sofía Foundation, and Alzheimer León, both specialized in the treatment of dementias, brain injuries and pathologies related to old age.

To date, only seven countries (Japan, the United Kingdom, Sweden, Italy, Korea, Brunei and the United States) have run trials with robots in the treatment of mental illness, which underlines the pioneering character of this research in Spain. The project was launched last May, and a first evaluation has already been carried out, with results the Casta Salud clinicians consider very positive. [online] Torrelodones (ESP), September 27, 2012 [ref. September 11, 2012] Available online:

Robots Get a Feel for the World at USC Viterbi

9 07 2012

Robots equipped with a tactile sensor are better able than humans to identify materials through touch, enabling more lifelike prosthetics.

A robot hand equipped with SynTouch’s BioTac sensors.


What does a robot feel when it touches something? Little or nothing until now. But with the right sensors, actuators and software, robots can be given the sense of feel – or at least the ability to identify materials by touch.

Researchers at the University of Southern California’s Viterbi School of Engineering published a study today in Frontiers in Neurorobotics showing that a specially designed robot can outperform humans in identifying a wide range of natural materials according to their textures, paving the way for advancements in prostheses, personal assistive robots and consumer product testing.

The robot was equipped with a new type of tactile sensor built to mimic the human fingertip. It also used a newly designed algorithm to make decisions about how to explore the outside world by imitating human strategies. The sensor captures other human sensations as well: it can tell where and in which direction forces are applied to the fingertip, and even the thermal properties of an object being touched.

Like the human finger, the group’s BioTac® sensor has a soft, flexible skin over a liquid filling. The skin even has fingerprints on its surface, greatly enhancing its sensitivity to vibration. As the finger slides over a textured surface, the skin vibrates in characteristic ways. These vibrations are detected by a hydrophone inside the bone-like core of the finger. The human finger uses similar vibrations to identify textures, but the BioTac is even more sensitive.
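A minimal sketch of how such vibrations could separate textures (an assumed approach for illustration, not SynTouch's actual pipeline): synthesize two vibration traces with different dominant frequencies, as a coarse and a fine texture might produce, and compare a simple FFT feature. The sample rate and texture frequencies are invented.

```python
import numpy as np

fs = 2000.0                      # sample rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)    # one second of sliding contact

def hydrophone_signal(texture_freq_hz: float) -> np.ndarray:
    """Fake vibration trace: one dominant frequency plus broadband noise."""
    rng = np.random.default_rng(1)
    return np.sin(2 * np.pi * texture_freq_hz * t) + 0.2 * rng.normal(size=t.size)

def dominant_frequency(signal: np.ndarray) -> float:
    """Simplest possible spectral feature: the strongest frequency bin."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
    return float(freqs[spectrum.argmax()])

coarse = hydrophone_signal(60.0)    # e.g. a coarse weave: slow vibration
fine = hydrophone_signal(400.0)     # e.g. a fine weave: faster vibration
print(dominant_frequency(coarse), dominant_frequency(fine))
```

A real classifier would use the whole spectrum rather than a single peak, but the principle is the same: different textures leave different vibration signatures.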

When humans try to identify an object by touch, they use a wide range of exploratory movements based on their prior experience with similar objects. A famous theorem by 18th century mathematician Thomas Bayes describes how decisions might be made from the information obtained during these movements. Until now, however, there was no way to decide which exploratory movement to make next. The article, authored by Professor of Biomedical Engineering Gerald Loeb and recently graduated doctoral student Jeremy Fishel, describes their new theorem for this general problem as “Bayesian Exploration.”

Built by Fishel, the specialized robot was trained on 117 common materials gathered from fabric, stationery and hardware stores. When confronted with one material at random, the robot could correctly identify the material 95% of the time, after intelligently selecting and making an average of five exploratory movements. It was only rarely confused by a pair of similar textures that human subjects making their own exploratory movements could not distinguish at all.
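The loop the article describes (pick an exploratory movement, take a reading, update a belief over candidate materials with Bayes' rule, stop when confident) can be sketched as a toy example. Everything here is invented for illustration: the materials, the per-movement sensor models, and the simple "separate the top two candidates" heuristic standing in for the published information-gain criterion.

```python
import numpy as np

rng = np.random.default_rng(2)

# Mean sensor reading for each (movement, material) pair; in a real system
# these would be learned during training on the 117 materials.
means = np.array([[0.1, 0.9, 0.5, 0.3, 0.7],
                  [0.8, 0.2, 0.6, 0.4, 0.1],
                  [0.5, 0.5, 0.1, 0.9, 0.3]])
noise = 0.05                      # assumed Gaussian sensor noise

def likelihood(movement: int, reading: float) -> np.ndarray:
    """P(reading | material) for every candidate material."""
    return np.exp(-0.5 * ((reading - means[movement]) / noise) ** 2)

def next_movement(belief: np.ndarray) -> int:
    """Crude exploration rule: best separate the two most likely candidates."""
    a, b = belief.argsort()[-2:]
    return int(np.abs(means[:, a] - means[:, b]).argmax())

true_material = 3
belief = np.full(means.shape[1], 1 / means.shape[1])   # uniform prior
while belief.max() < 0.99:
    m = next_movement(belief)
    reading = means[m, true_material] + rng.normal(scale=noise)
    belief *= likelihood(m, reading)                   # Bayes update
    belief /= belief.sum()

print(int(belief.argmax()))
```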

So, is touch another task that humans will outsource to robots? Fishel and Loeb point out that while their robot is very good at identifying which textures are similar to each other, it has no way to tell what textures people will prefer. Instead, they say this robot touch technology could be used in human prostheses or to assist companies who employ experts to judge the feel of consumer products and even human skin.

Robots Get A Feel For The World from USC Viterbi on Vimeo.

Loeb and Fishel are partners in SynTouch LLC, which develops and manufactures tactile sensors for mechatronic systems that mimic the human hand. Founded in 2008 by researchers from USC’s Medical Device Development Facility, the start-up is now selling its BioTac sensors to other researchers and to manufacturers of industrial robots and prosthetic hands.

Another paper from this research group in the same issue of Frontiers in Neurorobotics describes the use of their BioTac sensor to identify the hardness of materials like rubber.

Original funding for development of the sensor was provided by the Keck Futures Initiative of the National Academy of Sciences to develop a better prosthetic hand for amputees. SynTouch also received a grant from the National Institutes of Health to integrate BioTac sensors with such prostheses. The texture discrimination project was funded by the U.S. Defense Advanced Research Projects Agency (DARPA) and the material hardness study by the National Science Foundation.

Fishel just completed his doctoral dissertation in biomedical engineering based on the texture research. Loeb, also Director of the USC Medical Device Development Facility, holds 54 U.S. patents and has published over 200 journal articles on topics ranging from cochlear implants for the deaf to fundamental studies of muscles and nerves. [online] Los Angeles (USA), July 9, 2012 [ref. June 18, 2012] Available online:

Major medical advances from the convergence of biology and technology

5 01 2012

In healthcare, biology and technology are converging more and more. Can you imagine a future in which your doctor is a machine? Or in which replacement kidneys or bones can be printed? Robotic prostheses, phones that monitor our blood sugar levels, or augmented reality that can detect, say, a skin cancer?

Although many of these technologies are still in their infancy, you would be surprised how close we are to using some of these resources in medicine, and how they will revolutionize medical treatment in the coming decade.

Goodbye to (human) doctors?

Some will remember the machine that scanned the crew's health in Star Trek. It was precisely watching this science fiction series that inspired Walter Brouwer, one of the founders of the company Scanadu, to propose building the Medical Tricorder.

We are talking about artificial intelligence: programs that reason and reach conclusions by processing and cross-checking data.

The concept is to build a device capable of collecting a variety of patient data (such as blood pressure, or the presence of infections detected from blood or saliva samples) and, from that data, producing a diagnosis and designing a treatment.

The X-PRIZE Foundation has offered a US$10 million prize to whoever develops this technology. A dozen companies are already working on a model, and the goal is for it to be commercially available within three to five years.

Meanwhile, other intelligent programs such as Apple's Siri and IBM's Watson are already entering the world of medicine.

Combined with cloud computing systems, they could turn our phones into personal doctors in the making.

Printing kidneys

3D printers, increasingly affordable, will make waves in many areas, but in medicine they could be particularly revolutionary. If a patient has lost a leg, these machines could scan the severed limb and produce a prosthesis matched to the patient's size and skin color.

But the concept goes much further, and could put an end to the ordeal of patients who need an organ transplant.

The idea is to replace the “ink” these machines use with stem cells in order to build kidneys, livers or hearts using the patient's own DNA, which would avoid rejection. This is bioprinting, and it is not a new concept.

The possibility has been discussed for years, but only recently has it begun, quite literally, to materialize.

In March 2011, researcher Anthony Atala of the Wake Forest Institute for Regenerative Medicine in the United States stunned an entire auditorium at a conference by printing a kidney live on stage.

Granted, the kidney was not functional, but it was made of human tissue.

The liquid hospital

Jorge Juan Fernández, director of the eHealth and Health 2.0 area at Hospital Sant Joan de Déu in Barcelona, is the driving force behind Spain's first “liquid hospital”, a project that aims to have hospitals virtually transcend their walls to interact both with patients and with the rest of the medical community.

It does this through a variety of internet resources: Facebook accounts offering health advice, pages compiling recent publications in medicine, and Twitter accounts linking to conference videos aimed at parents concerned about their children's health.

The digital information offered by Hospital Sant Joan de Déu is extensive and spans continents.

In 2010 it began broadcasting to both Spain and Latin America a webcast (a live transmission over the internet) of courses and training sessions for doctors and nurses, which can be followed in real time or on demand.

As for patients, Fernández notes, social networks turn them into a kind of “health correspondent”, contributing information or opinions and interacting with the doctor in a different way.

The liquid hospital concept has been spreading in the United States for years.

There, 575 centers already have a YouTube account, 1,068 have a Facebook account, 814 are on Twitter and 149 publish blogs, according to a page that tracks social media use by health centers.

Social networks such as PatientsLikeMe or CureTogether, where patients with similar conditions share their experiences and knowledge, and where crowdsourcing campaigns can even be launched, have operated there successfully for years.

Sensors and apps

The past year has seen a boom in smartphone applications, and medicine is no exception.

Very soon we will be able to monitor our health using our phones.

Imagine the situation: a phone that accesses your medical record, monitors your heart rate and sends the data to the cloud for your doctor to see.

Applications such as Fitbit or Jawbone UP, which help us stay in shape, are already very popular, especially in the United States.

In the end, these applications become what we described at the start of this article: a kind of machine that scans the state of our health and automatically produces a diagnosis or treatment.

It is a resource that could have a major impact above all in developing countries, where access to health services is scarce.

Beyond the phone, electronics experts are also developing all kinds of sensors: ever smaller, cheaper devices that will be able to measure our temperature or blood pressure and transmit the data over the internet.
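As a sketch of what such a sensor transmission might look like (the field names and patient identifier are invented for illustration; no real device or API is implied), a small JSON payload that a phone or gateway could forward to a clinician's cloud service:

```python
import json
from datetime import datetime, timezone

def make_payload(patient_id: str, temp_c: float, pulse_bpm: int) -> str:
    """Serialize one hypothetical sensor reading as a JSON string."""
    reading = {
        "patient": patient_id,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
        "temperature_c": temp_c,
        "pulse_bpm": pulse_bpm,
    }
    return json.dumps(reading)

payload = make_payload("demo-001", 36.8, 72)
print(json.loads(payload)["temperature_c"])   # 36.8
```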

The thin line between human and robotic

“These are technologies that blend with biology. This will only increase, and that is where the debate over the boundary between human and machine will come from,” explains Fernández.

He is referring to bionic orthopedic implants, such as those worn by South African athlete Oscar Pistorius, a sprinter who competes on curved, blade-shaped prostheses.

He also highlights the appearance of the first exoskeletons, which might one day allow people with paraplegia to walk.

These are robotic exoskeletons that detect the nerve impulses the brain sends to the muscles.

In fact, beyond medicine, the United States is investing millions in developing mechanical exoskeletons for Marine units, with the aim of boosting their performance. [online] London (United Kingdom), January 5, 2012 [ref. January 4, 2012] Available online: