MIT Engineers Develop Ultrasound Wristband for High-Precision Wireless Control of Dexterous Robotic Hands

MIT engineers have created an AI-powered ultrasound wristband that tracks internal muscle movements to control robotic hands and virtual objects in real time.

By: AXL Media

Published: Mar 25, 2026, 6:55 AM EDT

Source: Massachusetts Institute of Technology


Decoding the Mechanics of Human Dexterity

The human hand is a biological marvel governed by a complex network of 34 muscles and over 100 tendons, making it the most dexterous part of the human body. Capturing this nuance has long been a hurdle for engineers attempting to bridge the gap between human intent and robotic execution. According to Xuanhe Zhao, a professor of mechanical engineering at MIT, previous attempts using cameras or sensor-laden gloves often restricted natural movement or suffered from environmental interference. By shifting the focus from external tracking to internal imaging, the MIT team has developed a way to "see" the mechanical triggers of hand movement before they result in a full gesture.

Ultrasound Imaging as a Puppet Master's Map

The core of the technology lies in a wearable wristband that produces continuous ultrasound images of the wearer's internal wrist structure. These images capture the shifting positions of tendons and ligaments, which the researchers liken to the strings of a puppet. According to Gengxi Lu, a lead researcher on the project, every finger movement corresponds to a specific "state" of these internal strings. By documenting these internal shifts, the device creates a comprehensive map of the hand's 22 degrees of freedom, allowing for a level of tracking sensitivity that far exceeds traditional electrical muscle sensors.
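To make the "22 degrees of freedom" concrete, a hand pose can be represented as a vector of 22 joint angles. The joint names below are purely illustrative (the article does not specify the study's joint breakdown); this sketch only shows what such a representation might look like.

```python
import numpy as np

# Hypothetical 22-degree-of-freedom hand pose: joint angles in radians.
# The finger/joint breakdown here is an assumption, not the MIT study's.
JOINTS = (
    [f"{finger}_{joint}"
     for finger in ("thumb", "index", "middle", "ring", "pinky")
     for joint in ("mcp_flex", "mcp_abduct", "pip", "dip")]  # 5 x 4 = 20
    + ["wrist_flex", "wrist_deviation"]                      # + 2 = 22
)

def make_pose(**angles):
    """Build a 22-element pose vector; unspecified joints default to 0."""
    pose = np.zeros(len(JOINTS))
    for name, value in angles.items():
        pose[JOINTS.index(name)] = value
    return pose

# A pointing gesture: index finger extended, the other fingers curled.
pointing = make_pose(middle_pip=1.4, ring_pip=1.4, pinky_pip=1.4,
                     thumb_mcp_flex=0.8)
print(len(JOINTS), pointing.shape)
```

Tracking all 22 values simultaneously is what distinguishes this approach from coarser electrical muscle sensors, which typically resolve only a handful of gross gestures.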

Artificial Intelligence and Motion Translation

To translate raw, black-and-white ultrasound data into fluid digital movement, the team used an AI algorithm trained on a large library of hand gestures. During the training phase, volunteers wore the wristband while being recorded by a 360-degree camera system, allowing the AI to correlate specific ultrasound patterns with exact finger positions. According to the study published in Nature Electronics, the algorithm can now predict complex motions—such as the 26 letter signs of American Sign Language—with high precision. This pairing of imaging and machine learning allows the system to operate in real time, providing a seamless link between the user's arm and a remote machine.
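The training setup described above—ultrasound frames paired with camera-derived finger positions—is a standard supervised-learning problem. The sketch below illustrates that structure with a simple ridge-regularized linear map on synthetic stand-in data; the actual study uses a far more sophisticated model, and all array sizes and data here are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training pairs: flattened ultrasound frames and the
# camera-derived 22-joint poses recorded at the same instant.
n_train, n_pixels, n_joints = 200, 64, 22
frames = rng.normal(size=(n_train, n_pixels))
poses = rng.uniform(0.0, 1.5, size=(n_train, n_joints))

# Minimal regressor: ridge-regularized least squares from frame to pose.
# This stands in for the study's learned model purely to show the
# image-in, joint-angles-out mapping.
lam = 1e-2
X = np.hstack([frames, np.ones((n_train, 1))])  # append a bias column
W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ poses)

def predict_pose(frame):
    """Map one flattened ultrasound frame to 22 estimated joint angles."""
    return np.append(frame, 1.0) @ W

estimate = predict_pose(frames[0])
print(estimate.shape)  # (22,)
```

Running such a predictor on every incoming frame is what lets the system drive a robotic hand or virtual object in real time: each new ultrasound image yields a fresh 22-value pose estimate.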
