Acoustic Touch Lets the Blind ‘See’ Using Sound

Researchers in Australia are developing smart glasses for blind people that use “acoustic touch” to turn images into sounds. The research suggests that wearable spatial-audio technology could help people who are blind or have significantly impaired vision locate nearby objects.

The development of smart glasses for this purpose is becoming viable thanks to improvements in augmented reality, practical wearable camera technology, and deep learning-based computer vision. The smart glasses incorporate cameras, GPS, a microphone, and inertial-measurement and depth-sensing units to deliver navigation, voice-recognition control, and the rendering of objects, text, or surroundings as computer-synthesized speech.

Researchers at the University of Technology Sydney (UTS) and the University of Sydney investigated adding acoustic touch to smart glasses: the wearer scans the scene with head movements, and the system activates auditory icons as objects enter a defined field of view (FOV).
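A minimal sketch of that head-scanning idea, written in Python, might gate each object's auditory icon on whether its bearing falls inside an assumed auditory field of view; the object names, icon files, and 30-degree window below are illustrative assumptions, not details from the study:

    # Hypothetical gating of auditory icons on head direction (not the study's code).
    FOV_DEG = 30.0  # assumed width of the auditory field of view

    def angular_offset(head_yaw_deg, object_bearing_deg):
        """Smallest signed angle between the head direction and an object's bearing."""
        return (object_bearing_deg - head_yaw_deg + 180.0) % 360.0 - 180.0

    def objects_in_view(head_yaw_deg, objects):
        """Return the objects whose bearing falls inside the auditory field of view."""
        return [o for o in objects
                if abs(angular_offset(head_yaw_deg, o["bearing_deg"])) <= FOV_DEG / 2]

    objects = [
        {"name": "book",   "bearing_deg": 15.0, "icon": "page_turn.wav"},
        {"name": "bottle", "bearing_deg": 80.0, "icon": "liquid_pour.wav"},
    ]
    # With the head turned to 10 degrees, only the book is "touched."
    for obj in objects_in_view(10.0, objects):
        print(f"play {obj['icon']} for {obj['name']}")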

The researchers claim that acoustic touch offers several advantages over existing approaches, including easier integration with smart-glasses technology and more intuitive use than computer-synthesized speech. The team created a foveated audio device (FAD) to test their assumptions on seven blind or low-vision volunteers and seven blindfolded sighted participants. The FAD comprises a smartphone and NREAL augmented-reality glasses. The team attached reflective motion-capture markers to enable tracking of participants' head movements.

The FAD recognizes objects and estimates their distance using the stereo cameras on the glasses, and it assigns each object an appropriate auditory icon, such as a page-turning sound for a book. The volunteers participated in seated and standing exercises and compared FAD performance with two idealized baseline cues: clock-face verbal directions and auditory icons played sequentially from speakers co-located with each item. For blind or low-vision participants, using the FAD was comparable to the two idealized conditions. The blindfolded sighted group, however, performed worse when using the FAD.
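To illustrate that recognition-to-icon step, one could imagine a small Python routine that maps each recognized class to an icon and scales its loudness by the stereo-depth estimate; the class names, icon files, and inverse-distance gain below are assumptions made for the sketch, not details of the FAD implementation:

    from dataclasses import dataclass

    # Hypothetical mapping from recognized object classes to auditory icons,
    # with loudness attenuated by the stereo-camera depth estimate.
    ICONS = {
        "book":   "page_turn.wav",
        "bottle": "liquid_pour.wav",
        "cup":    "ceramic_clink.wav",
    }

    @dataclass
    class Detection:
        label: str         # class name from the object recognizer
        distance_m: float  # depth estimate from the stereo cameras

    def icon_for(det):
        """Pick an auditory icon and a distance-scaled gain for a detection."""
        icon = ICONS.get(det.label)
        if icon is None:
            return None  # no icon assigned to this class
        # Simple inverse-distance gain, clamped to [0.1, 1.0] (assumed law).
        gain = max(0.1, min(1.0, 1.0 / max(det.distance_m, 1.0)))
        return icon, gain

    print(icon_for(Detection("book", 2.0)))    # ('page_turn.wav', 0.5)
    print(icon_for(Detection("bottle", 0.5)))  # ('liquid_pour.wav', 1.0)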

A standing reaching task required participants to use the FAD to search for and reach a target item situated among multiple distractor items. Participants were asked to find objects placed on three tables and surrounded by four bottles of different shapes, a setup that assessed the functional performance of the system and human behavior when full-body movement is used during searching.
