Drawing inspiration from bats and their echolocation abilities, scientists have designed smart glasses that convert visual cues into distinct auditory icons, offering people who are blind or have low vision a novel way to perceive their environment. This innovation holds the promise of transforming accessibility for those with visual challenges.
Assistive technology aims to empower people with sensory impairments to overcome everyday challenges. Blindness and low vision (BLV) in particular can significantly impede daily activities and social engagement.
A major focus of assistive technology research is supplementing sensory perception with visual, tactile, and auditory feedback. A team at the University of Technology Sydney (UTS) has now developed smart glasses that convert visual data into distinct sound markers, a technique termed “acoustic touch,” offering BLV individuals a fresh way to perceive their surroundings.
Mirroring the way bats use echolocation, emitting sound waves that reflect off objects and convey their size and distance, the researchers engineered these innovative glasses, naming them the Foveated Audio Device (FAD).
The FAD pairs augmented reality glasses with an OPPO Find X3 Pro Android smartphone. The Unity Game Engine 2022 coordinates audio output with the glasses’ camera and head-tracking features. Together, this allows the FAD to represent objects with distinct auditory icons once they enter the device’s field of view.
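The core idea, mapping objects inside a head-directed field of view to distinct sounds, can be illustrated with a short sketch. Everything here is hypothetical: the field-of-view angle, the sound names, and the pan/volume formulas are illustrative assumptions, not the FAD's actual implementation.

```python
# Illustrative mapping from object classes to auditory icons; the FAD's
# real sound design is not described here.
AUDIO_ICONS = {"bowl": "wood_block", "book": "page_flip",
               "cup": "ceramic_tap", "bottle": "glass_ping"}

FOV_DEGREES = 30.0  # hypothetical "foveated" window around the head direction


def angular_offset(head_yaw_deg, object_bearing_deg):
    """Smallest signed angle from the head direction to the object."""
    return (object_bearing_deg - head_yaw_deg + 180.0) % 360.0 - 180.0


def sonify(detections, head_yaw_deg):
    """Return (icon, pan, volume) cues for objects inside the field of view.

    `detections` is a list of (label, bearing_deg, distance_m) tuples,
    e.g. from an object detector combined with a depth estimate.
    """
    cues = []
    for label, bearing, distance in detections:
        offset = angular_offset(head_yaw_deg, bearing)
        if abs(offset) > FOV_DEGREES / 2 or label not in AUDIO_ICONS:
            continue  # only sonify objects the wearer is "looking" at
        pan = offset / (FOV_DEGREES / 2)      # -1.0 (left) .. +1.0 (right)
        volume = 1.0 / (1.0 + distance)       # nearer objects sound louder
        cues.append((AUDIO_ICONS[label], round(pan, 2), round(volume, 2)))
    return cues


# A cup slightly right of centre is sonified; a bottle at 60 degrees
# falls outside the field of view and stays silent.
print(sonify([("cup", 10.0, 0.5), ("bottle", 60.0, 1.0)], head_yaw_deg=0.0))
# → [('ceramic_tap', 0.67, 0.67)]
```

The head-tracking gate is what makes the approach "foveated": rather than sonifying everything the camera sees, only objects near the wearer's current head direction produce sound, which keeps the auditory channel uncluttered.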
In their trial, the team tested the glasses on 14 adults, comprising seven BLV participants and seven sighted individuals who wore blindfolds as a control group.
The trial included a training phase, a seated task in which participants scanned and sonified objects on a table using the FAD, and a standing task in which they navigated a cluttered room while searching for specific items. The objects used in the study were a bowl, a book, a cup, and a bottle.
The results revealed that the device considerably boosted the BLV participants’ capacity to identify and interact with objects without excessive cognitive strain.
With further refinement, this acoustic touch mechanism might become a cornerstone in assistive devices, enabling BLV individuals to perceive their surroundings in a more enriched manner.