Imagine a world where people who cannot see can still move freely, confidently navigating their surroundings without fear. That world is becoming a reality.
In a study published this week in Nature Machine Intelligence, researchers from China unveiled a new wearable AI system that empowers blind and visually impaired individuals to move independently. Using a combination of video, vibrations, and audio prompts, the system provides real-time guidance to users.
How the AI wearable helps blind users navigate with ease
The setup includes a camera, an AI processor, and bone conduction headphones. Mounted between the user’s eyebrows, the camera captures live footage, which the AI analyses instantly. Short, simple audio cues are then delivered directly through the headphones — without blocking ambient sounds.
Adding another layer of support, thin sensors worn on the wrists detect nearby obstacles. If a wall or object is close, the corresponding wrist vibrates, alerting the user to change course.
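The logic described above is simple to picture. The researchers have not published their code, so the following is only an illustrative sketch: the threshold, the `Obstacle` structure, and the function names are all assumptions, not details from the paper.

```python
# Illustrative sketch only: the threshold, Obstacle structure, and function
# names below are assumptions, not details from the published system.
from dataclasses import dataclass

VIBRATE_DISTANCE_M = 0.5  # assumed proximity threshold for a wrist alert


@dataclass
class Obstacle:
    side: str         # "left" or "right" relative to the user
    distance_m: float


def wrist_alerts(obstacles):
    """Return which wrists should vibrate, given nearby obstacles.

    The real system pairs thin skin-worn sensors with vibration motors;
    here we only model the decision rule: an obstacle close on one side
    triggers the vibration on that side's wrist."""
    alerts = {"left": False, "right": False}
    for ob in obstacles:
        if ob.distance_m <= VIBRATE_DISTANCE_M:
            alerts[ob.side] = True
    return alerts


print(wrist_alerts([Obstacle("left", 0.3), Obstacle("right", 1.2)]))
# {'left': True, 'right': False}
```

The point of keeping the rule this blunt is the same one the researchers make about audio: a cue the user feels instantly beats a detailed description they have to parse.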
Developed by a team from Shanghai Jiao Tong University, Shanghai Artificial Intelligence Laboratory, East China Normal University, Hong Kong University of Science and Technology, and the State Key Laboratory of Medical Neurobiology at Fudan University, the system marks a significant leap forward.
“This research paves the way for user-friendly visual assistance systems, offering alternative avenues to enhance the quality of life for people with visual impairment,” the team wrote.
AI system uses simple audio and haptic cues for real-time navigation
Lead researcher Gu Leilei, an associate professor at Shanghai Jiao Tong University, emphasised the team’s commitment to making the system as practical and easy to use as possible.
“Lengthy audio descriptions of the environment can overwhelm and tire users, making them reluctant to use such systems,” Gu told the South China Morning Post.
“Unlike a car navigation system with detailed directions, our work aims to minimise AI system output, communicating information key for navigation in a way that the brain can easily absorb,” Gu said.
Lightweight and compact, the equipment is designed for all-day comfort, allowing users to move naturally without feeling burdened.
Tested with visually impaired users and designed for daily use
The system was tested indoors with 20 visually impaired volunteers. After just 10 to 20 minutes of practice, most users could operate it smoothly, the study said.
Setting a destination is simple — users issue a voice command, and the AI charts a safe route, offering only essential prompts along the way.
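Gu’s point about minimising output suggests a filter between the route planner and the user’s ears: stay silent while the user is roughly on course, and speak only when a correction is needed. The sketch below is hypothetical; the tolerance value and function names are assumptions, not part of the published system.

```python
# Hypothetical sketch of the "minimal prompts" idea: only speak when the
# user's heading drifts far enough from the planned route. The tolerance
# and names are assumptions, not details from the published system.

HEADING_TOLERANCE_DEG = 15  # assumed: no prompt while roughly on course


def prompt_for(heading_error_deg):
    """Map heading error (planned bearing minus current bearing, degrees)
    to a short audio cue, or None while the user is on course."""
    if abs(heading_error_deg) <= HEADING_TOLERANCE_DEG:
        return None  # stay quiet rather than overwhelm the user
    return "turn left" if heading_error_deg > 0 else "turn right"


for err in (5, 40, -30):
    print(err, "->", prompt_for(err))
```

A car navigation system would narrate continuously; suppressing the on-course case is what keeps the cues, in Gu’s words, easy for the brain to absorb.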
Currently, the system can recognise 21 common objects, including beds, tables, chairs, sinks, televisions, and food items. Researchers plan to expand its recognition capabilities further.
Researchers plan to expand object recognition and outdoor use
Looking ahead, Gu said the team’s next focus is to refine the system for outdoor environments, where navigation challenges are far more complex. Enhancements could include improved object detection, dynamic route adaptation, and integration with real-world GPS systems.
With further development, this AI-powered wearable may offer a new level of autonomy and confidence to millions of visually impaired people worldwide.
