Tuesday, December 2, 2025

Researchers Develop AI-Powered Wearables

Researchers have developed a wearable system that reads your gestures accurately enough to control a robot while sprinting, bouncing in a car, or drifting through choppy ocean waves.

And for the first time, the motion noise that usually ruins these signals no longer matters.

Engineers have built a next-generation human–machine interface that works in real-world conditions. The breakthrough brings gesture-based control closer to everyday use, from medical rehab to underwater robotics.

Developed at the University of California San Diego, the system pairs soft, stretchable sensors with a deep-learning engine that cleans noisy data in real time, yielding a reliable interface that interprets natural arm gestures under nearly any disturbance.
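
The article doesn't describe the model's internals, so the sketch below is only a hypothetical illustration of what a denoise-then-classify pipeline can look like, written in Python with PyTorch. The channel counts, layer sizes, and number of gesture classes are placeholder assumptions, not details of the UC San Diego system.

    # Hypothetical sketch of a denoise-then-classify gesture pipeline.
    # The team's actual architecture is not described in this article;
    # channel counts, layer sizes, and gesture classes are assumptions.
    import torch
    import torch.nn as nn

    class GesturePipeline(nn.Module):
        def __init__(self, n_channels=8, n_gestures=6):
            super().__init__()
            # Denoising stage: a small 1D conv stack that learns to
            # suppress motion artifacts in the raw sensor stream.
            self.denoiser = nn.Sequential(
                nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),
                nn.ReLU(),
                nn.Conv1d(32, n_channels, kernel_size=7, padding=3),
            )
            # Classification stage: conv features -> gesture logits.
            self.classifier = nn.Sequential(
                nn.Conv1d(n_channels, 64, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),
                nn.Flatten(),
                nn.Linear(64, n_gestures),
            )

        def forward(self, x):  # x: (batch, channels, time)
            cleaned = self.denoiser(x)       # strip interference first
            return self.classifier(cleaned)  # then identify the gesture

    # One 500-sample window of 8-channel sensor data (random stand-in).
    model = GesturePipeline()
    logits = model(torch.randn(1, 8, 500))
    print(logits.argmax(dim=1))  # predicted gesture index

In a deployed system, a model along these lines would run continuously over short streaming windows from the armband; here a random tensor stands in for one such window.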

Wearable gesture sensors typically fail when the user moves too much.

"Wearable technologies with gesture sensors work fine when a user is sitting still, but the signals start to fall apart under excessive motion noise," said co-first author Xiangjun Chen. "Our system overcomes this limitation."

This technology could transform how people interact with machines in high-motion or unpredictable environments.

Patients with limited mobility could use simple gestures to control robotic aids without precise finger movement.

Industrial workers and first responders could operate tools or robots hands-free in hazardous settings. Even divers or remote operators might command underwater robots despite turbulent currents.

Consumer gadgets could also benefit, enabling gesture controls that stay reliable during everyday motion, including walking, riding in a car, or exercising.

The project is the result of collaboration between the labs of professors Sheng Xu and Joseph Wang at UC San Diego.

According to the researchers, this is the first wearable human–machine interface that consistently performs across such a broad range of motion disturbances.

The soft electronic patch, glued onto a cloth armband, integrates motion sensors, muscle sensors, a Bluetooth microcontroller, and a stretchable battery into a thin, multilayered package.

It collects signals from the arm and feeds them to a specialized deep-learning model that strips out interference and identifies the intended gesture.
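
The article doesn't explain how the model separates interference from intent. One textbook approach, sketched below as an assumption rather than the team's stated method, is to treat the motion-sensor channels as a noise reference and regress the motion artifact out of the muscle signal. All signals in this toy example are synthetic.

    # Hypothetical illustration: treating motion-sensor (IMU) channels as
    # a noise reference and regressing the motion artifact out of a muscle
    # (EMG) channel. This is a classic adaptive-noise-cancellation idea,
    # not necessarily the method used in the UC San Diego system.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 1000)

    emg_true = np.sin(2 * np.pi * 40 * t)        # stand-in for muscle activity
    imu = rng.standard_normal((1000, 3))         # 3-axis motion reference
    artifact = imu @ np.array([0.8, -0.5, 0.3])  # motion leaking into the EMG
    emg_measured = emg_true + artifact

    # Fit coefficients mapping IMU channels to the artifact, then subtract.
    coef, *_ = np.linalg.lstsq(imu, emg_measured, rcond=None)
    emg_cleaned = emg_measured - imu @ coef

    print("residual artifact power:", np.mean((emg_cleaned - emg_true) ** 2))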

"This advancement brings us closer to intuitive and robust human-machine interfaces that can be deployed in daily life," Chen said.

The team stress-tested the system in extreme conditions. Participants used it to control a robotic arm while running, while exposed to high-frequency vibrations, and under combinations of disruptive motions.

To push the limits further, the researchers validated it in simulated ocean scenarios inside the Scripps Ocean-Atmosphere Research Simulator.

The tank recreated both lab-generated wave conditions and real ocean motion, and the wearable still delivered accurate, low-latency performance.
