Mind-Controlled Arm Boasts Fine Precision

Posted in Research and Development by Kristopher Sturgis on December 18, 2014
[Image: This robotic arm, used at the University of Pittsburgh, offers ten-dimensional control of hand movements.]

Jan Scheuermann, a patient with longstanding quadriplegia, has managed to control a robotic arm through a range of complex human hand movements using a brain–machine interface, according to a story from the Institute of Physics. The maneuverability of the mind-controlled robotic arm has recently increased from seven dimensions to ten (comprising 3-D translation, 3-D orientation, and 4-D hand shaping, according to an abstract describing the research). This gives Scheuermann considerably more flexibility than she had in 2012, when she could use the robotic arm to feed herself and give researchers at the University of Pittsburgh "high fives" and a "thumbs up" sign. The study, also conducted at the University of Pittsburgh, showed that high-dimensional control of a prosthetic arm can be achieved with relatively simple algorithms.

The extra dimensions come from four distinct hand movements: finger abduction, a scoop, thumb extension, and a pinch. These new movements have enabled Scheuermann to pick up, grasp, and move a range of objects with much more precision than before; previously, grasping control was limited to a single dimension. The hope is that these functions can help move us closer to a world where prosthetics move and feel with as much ease as normally functioning limbs.

Scheuermann was approved for the study in 2012 and underwent surgery soon after to be fitted with two quarter-inch electrode grids, each carrying 96 tiny contact points, in the regions of her brain responsible for right arm and hand movements. The electrode grids were then connected to a computer, creating a brain–machine interface through which the 96 contact points on each grid could pick up the pulses of electricity fired between neurons. Computer algorithms then decoded these firing signals, identifying the patterns associated with particular arm movements, such as raising the arm or turning the wrist.
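The decoding step described above can be illustrated with a minimal sketch: a linear decoder that maps per-channel firing rates to a movement command. The channel count (96 contacts per grid) and the ten output dimensions come from the article; the linear form, the function names, and all weights and rates below are illustrative assumptions, not the actual algorithm used in the study.

```python
# Hypothetical sketch of a linear neural decoder: firing rates from 96
# electrode contacts are mapped to a 10-D movement command (3-D translation,
# 3-D orientation, 4-D hand shaping). Weights and rates are made up for
# illustration; the real study's decoding method may differ.

N_CHANNELS = 96   # contact points per grid, per the article
N_DIMS = 10       # 3 translation + 3 orientation + 4 hand shaping

def decode(firing_rates, weights, baseline):
    """Map per-channel firing rates to a movement command.

    command[d] = sum_c weights[d][c] * (firing_rates[c] - baseline[c])
    """
    assert len(firing_rates) == N_CHANNELS
    return [
        sum(w * (r - b) for w, r, b in zip(weights[d], firing_rates, baseline))
        for d in range(N_DIMS)
    ]

# Toy example: a weight matrix in which only channel 0 drives dimension 0.
weights = [[0.0] * N_CHANNELS for _ in range(N_DIMS)]
weights[0][0] = 1.0
baseline = [10.0] * N_CHANNELS

rates = [10.0] * N_CHANNELS
rates[0] = 25.0  # channel 0 fires above baseline, driving dimension 0

command = decode(rates, weights, baseline)
print(command[0])  # 15.0: (25 - 10) scaled by the weight of 1.0
```

In practice the weight matrix would be fitted during a calibration phase, with the patient imagining prescribed movements while the firing patterns are recorded; the sketch only shows the mapping step itself.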

This allowed Scheuermann simply to think of a movement and watch the robotic arm carry out the command. She could direct the arm to reach for objects, as well as move it in a number of directions while flexing and rotating the wrist.

The technology still faces various roadblocks, and researchers continue to refine it to work as seamlessly with the user as possible. A recent study from North Carolina State University examined the challenges of powered bionic limbs in an effort to identify the major issues facing the technology. The hope is that the data could provide useful feedback for people like Scheuermann, who depend on powered bionic limbs that work both safely and efficiently.

As for Scheuermann, two years on from the beginning of her journey, she can now successfully maneuver the robotic arm in more dimensions through a number of hand movements, allowing for more detailed interaction with objects.

Researchers hope that, with more time and additional participants, they can continue to improve the level of control and make the system more robust and beneficial. It may not be long before countless patients benefit from this emerging technology, an advanced brain–machine interface prosthesis that can mimic the movements of real human limbs like never before.

Kristopher Sturgis is a contributor to Qmed and MPMN.
