Digital navigation tools for people with visual impairments have become increasingly popular in recent years. While conventional navigation solutions give routing instructions to the user, systems such as Google Maps, BlindSquare, or Soundscape offer additional information about the surroundings and thereby improve the orientation of people with visual impairments. However, these systems only provide information about static environments; dynamic scenes comprising objects such as bikes, dogs, and persons are not considered. In addition, both the routing and the information about the environment are usually conveyed by speech.
We address this gap and implement a mobile system that combines object identification with a sonification interface. Our system can be used in three different scenarios of macro and micro navigation: orientation, obstacle avoidance, and exploration of known and unknown routes. Our proposed system leverages popular computer vision methods to localize 18 static and dynamic object classes in real-time. At the heart of our system is a mixed reality sonification interface that is adaptable to the user’s needs and transmits the recognized semantic information to the user. The system was designed following a user-centered approach. An exploratory user study showed that our object-to-sound mapping with auditory icons is intuitive. On average, users perceived our system as useful and indicated that they want to know more about their environment, beyond wayfinding and points of interest.
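To make the idea of an object-to-sound mapping with spatially rendered auditory icons concrete, the following is a minimal illustrative sketch, not the authors' implementation: each detected object class is assigned a short auditory icon, and the icon's playback parameters are derived from the object's direction and distance relative to the user. All names (`Detection`, `AUDITORY_ICONS`, the icon file paths) are hypothetical.

```python
import math
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # one of the recognized object classes
    azimuth_deg: float  # horizontal angle relative to the user, -90 (left) to +90 (right)
    distance_m: float   # estimated distance to the object

# Hypothetical object-to-icon table: each class maps to a short auditory icon.
AUDITORY_ICONS = {
    "person": "icons/person.wav",
    "bike": "icons/bike_bell.wav",
    "dog": "icons/dog_bark.wav",
}

def pan_gains(azimuth_deg):
    """Constant-power stereo panning: map azimuth to (left, right) gains."""
    # Normalize -90..+90 degrees to a pan angle between 0 and pi/2.
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2)
    return math.cos(theta), math.sin(theta)

def sonify(det):
    """Turn one detection into playback parameters for its auditory icon."""
    left, right = pan_gains(det.azimuth_deg)
    # Attenuate with distance so nearer objects sound louder (simple 1/d law).
    volume = min(1.0, 1.0 / max(det.distance_m, 1.0))
    return {"icon": AUDITORY_ICONS[det.label],
            "left": left * volume, "right": right * volume}

# A bike detected 45 degrees to the right at 2 m is rendered louder on the
# right channel at half volume.
params = sonify(Detection("bike", azimuth_deg=45.0, distance_m=2.0))
```

A real system would instead render the icon through a binaural (HRTF-based) audio engine to place the sound in 3D space, but the core idea is the same: the semantic class selects *what* is played, and the object's pose controls *where* and *how loudly* it is heard.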