Amputees can control a robotic hand with their thoughts — plus machine learning.

What’s new: University of Michigan researchers developed a system that uses signals from an amputee’s nervous system to control a prosthetic hand.

How it works: The researchers grafted bits of muscle onto the severed nerve bundles at the ends of amputees’ forearms, then implanted electrodes into the muscle. They amplified and recorded the electrical signals the nerves carried when the recipients thought about, say, making a fist, pointing a finger, or rotating a thumb. Then they trained a pair of models to match the signals with the corresponding hand motions.

  • A naive Bayes classifier learned to associate nerve signal patterns with common hand shapes.
  • The researchers asked the subjects to mimic a virtual thumb as it made back-and-forth and side-to-side motions on a computer screen. A Kalman filter took in the electrical signals and the position and velocity of the avatar and learned to control the digit.
  • Once trained, the software enabled the subjects to pick up and move objects and play Rock, Paper, Scissors.
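The two models above can be illustrated with a toy sketch. This is not the researchers’ code: the scalar “signal feature,” the gesture labels, the noise parameters, and all function names are hypothetical stand-ins, and both models are reduced to one dimension for clarity.

```python
import math

# --- Gaussian naive Bayes: map a signal feature to a hand shape ---
# Toy version: each gesture is summarized by the mean and variance of a
# single synthetic "nerve signal" feature (real systems use many channels).
def fit_nb(samples):
    """samples: {gesture: [feature values]} -> {gesture: (mean, variance)}."""
    params = {}
    for label, xs in samples.items():
        mean = sum(xs) / len(xs)
        var = sum((x - mean) ** 2 for x in xs) / len(xs) or 1e-9
        params[label] = (mean, var)
    return params

def predict_nb(params, x):
    """Pick the gesture with the highest Gaussian log-likelihood for x."""
    def log_lik(mean, var):
        return -0.5 * math.log(2 * math.pi * var) - (x - mean) ** 2 / (2 * var)
    return max(params, key=lambda label: log_lik(*params[label]))

# --- 1-D Kalman filter: smooth noisy thumb-position measurements ---
def kalman_1d(measurements, q=0.01, r=0.25):
    """q: process noise, r: measurement noise (hypothetical values)."""
    x, p = 0.0, 1.0              # state estimate and its variance
    estimates = []
    for z in measurements:
        p += q                   # predict: uncertainty grows between steps
        k = p / (p + r)          # Kalman gain: trust in the new measurement
        x += k * (z - x)         # update estimate toward measurement z
        p *= (1 - k)             # uncertainty shrinks after the update
        estimates.append(x)
    return estimates

# Hypothetical training data: signal feature per imagined gesture.
train = {"fist": [0.9, 1.1, 1.0], "point": [0.1, 0.2, 0.15]}
params = fit_nb(train)
print(predict_nb(params, 0.95))            # → fist

smoothed = kalman_1d([1.2, 0.8, 1.1, 0.9, 1.05])
print(round(smoothed[-1], 2))              # estimate settles near 1.0
```

In the actual system the classifier picks discrete hand shapes while the Kalman filter tracks continuous position and velocity of a digit; the two play complementary roles, which is why a pair of models was needed.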

Behind the news: Other research groups are using similar methods to control robotic prostheses. Some promising approaches:

  • Epineural electrodes wrap around nerves like a cuff to track signals from the brain.
  • Intraneural electrodes tap into nerves using needles, so researchers can target brain signals more precisely.
  • Targeted muscle reinnervation re-routes nerves from a severed limb into a nearby muscle. Sensors attached to the skin pick up the signals and transmit them to a prosthesis.

Why it matters: Nearly two million Americans, and millions more people worldwide, have lost a limb. More responsive prostheses could dramatically improve their quality of life.

We’re thinking: Will they train robotic hands to do COVID-19-safe, palms-together namaste greetings?
