At around 23:00 on April 14th, the international academic journal Nature Machine Intelligence published a paper online reporting a wearable system that provides navigation assistance for blind and partially sighted people. The system uses AI algorithms as its core: cameras capture visual information such as images, and the algorithms recognize the scene and select an accessible route for the user. Navigation cues are then delivered through a combination of auditory and tactile signals to the skin of both hands. For example, bone-conduction headphones transmit brief sound cues, while stretchable artificial skin worn on the wrists delivers vibration signals that guide the direction of movement and warn of objects on either side. In tests, the system helped visually impaired users pass through mazes, avoid obstacles, and grasp certain objects.
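To make the described pipeline concrete, the sketch below is a minimal, hypothetical illustration of how a camera frame might flow through obstacle detection, route selection, and multimodal feedback. It is not the authors' published implementation: all names (detect_obstacles, choose_heading, FeedbackCue, and so on) and the simple steering rule are assumptions for illustration only.

```python
# Hypothetical camera -> AI -> audio/haptic feedback loop.
# Names and logic are illustrative assumptions, not the published system.
from dataclasses import dataclass
from typing import List


@dataclass
class Obstacle:
    """A detected object: horizontal position in [-1, 1] (left to right) and distance."""
    position: float
    distance_m: float


@dataclass
class FeedbackCue:
    """Multimodal cue: a short audio hint plus vibration strength per wrist."""
    audio_hint: str          # e.g. "veer_left", "veer_right", "straight"
    left_vibration: float    # 0.0 (off) to 1.0 (strong), left-wrist artificial skin
    right_vibration: float   # 0.0 (off) to 1.0 (strong), right-wrist artificial skin


def detect_obstacles(frame) -> List[Obstacle]:
    """Stand-in for the AI vision step; a real system would run a neural
    detector on the camera frame. Here we return canned detections."""
    return [Obstacle(position=0.4, distance_m=1.2),   # something ahead and to the right
            Obstacle(position=-0.8, distance_m=3.0)]  # something far off to the left


def choose_heading(obstacles: List[Obstacle]) -> str:
    """Pick a coarse heading by steering away from the nearest close obstacle."""
    if not obstacles:
        return "straight"
    nearest = min(obstacles, key=lambda o: o.distance_m)
    if nearest.distance_m > 2.0:
        return "straight"
    return "veer_left" if nearest.position > 0 else "veer_right"


def make_cue(heading: str) -> FeedbackCue:
    """Map the chosen heading onto an audio hint plus left/right wrist vibration."""
    if heading == "veer_left":
        return FeedbackCue("veer_left", left_vibration=0.8, right_vibration=0.0)
    if heading == "veer_right":
        return FeedbackCue("veer_right", left_vibration=0.0, right_vibration=0.8)
    return FeedbackCue("straight", left_vibration=0.0, right_vibration=0.0)


if __name__ == "__main__":
    frame = None  # placeholder for a camera image
    cue = make_cue(choose_heading(detect_obstacles(frame)))
    print(cue)    # e.g. FeedbackCue(audio_hint='veer_left', left_vibration=0.8, ...)
```

The split into detection, route decision, and feedback mirrors the pipeline described in the report, where the AI vision stage is separate from the audio and wrist-vibration cueing.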