WonderVision

New ways for the blind

How can blind people move around public spaces without barriers and make full use of mobility services? A technology solution from IAV shows how the smartphone opens up a completely new world for visually impaired people at busy public locations, including when it comes to freedom of movement.

WonderVision is a smart assistance system that supports blind people and navigates them to their destination in public places.

In Germany alone, there are more than 80,000 blind people and several hundred thousand people with visual impairments, and the numbers are rising. Yet their independence in public spaces is often restricted by inadequate access to information such as road signs and information boards.

Precise environment and object recognition is central to the future of autonomous driving, as is the artificial intelligence (AI) required for image acquisition and processing.

IAV has been developing system and software solutions for autonomous driving for years and wants to use its know-how in methods and technology to help people and the environment benefit from digital progress.

With “WonderVision”, IAV has developed a system that makes it easier for people with visual impairments to orient themselves via computer vision in busy indoor spaces, such as train stations, airports, hospitals or shopping centers. In the current use case, IAV focuses on railway stations, where users are guided to their destination via smartphone.

“For both the visually impaired and the blind, mobility is a central aspect of quality of life,” says Dr. Ahmed Hussein, Project Manager for “WonderVision” and Concept Developer for Autonomous Driving at IAV. “As a technology provider, we want to bring our expertise to other fields of application and, for example, make the environment more accessible to people with visual impairments.”

The application works as follows: the “WonderVision” technology uses image recognition as well as distance and depth estimation to capture a wide variety of direction indicators, such as signs, written notices and pictograms. From these, the system filters out the relevant navigation cues and makes them available to the user as voice output through a smartphone app.
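The article does not name the underlying libraries or models. Purely as an illustration, a minimal sketch of such a pipeline could look like the following, assuming off-the-shelf OCR (pytesseract) for reading sign text, a stubbed depth estimate and a local text-to-speech engine (pyttsx3) for the voice output:

```python
# Minimal sketch of a sign-to-speech pipeline.
# Assumptions: OpenCV for image handling, pytesseract for OCR,
# pyttsx3 for voice output; depth estimation is only a placeholder.
import cv2
import pytesseract
import pyttsx3

NAVIGATION_KEYWORDS = ("exit", "platform", "track", "elevator", "stairs", "toilet")

def read_signs(frame):
    """Run OCR on a camera frame and return the recognized text lines."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    text = pytesseract.image_to_string(gray)
    return [line.strip() for line in text.splitlines() if line.strip()]

def estimate_distance(frame):
    """Placeholder for distance/depth estimation; a real system would
    run a trained depth model here."""
    return None  # hypothetical: distance in meters, unknown in this sketch

def filter_navigation_cues(lines):
    """Keep only lines that look like navigation-relevant information."""
    return [l for l in lines if any(k in l.lower() for k in NAVIGATION_KEYWORDS)]

def speak(cues, distance=None):
    """Read the filtered cues aloud via text-to-speech."""
    engine = pyttsx3.init()
    for cue in cues:
        message = cue if distance is None else f"{cue}, about {distance:.0f} meters ahead"
        engine.say(message)
    engine.runAndWait()

if __name__ == "__main__":
    frame = cv2.imread("station_frame.jpg")  # hypothetical sample image
    cues = filter_navigation_cues(read_signs(frame))
    speak(cues, estimate_distance(frame))
```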

Development lead in the backend

IAV focuses on evaluating the data in the AI-supported backend, with the aim of integrating the object and environment detection features into the existing app of a mobility or service provider. IAV has already shown in a proof of concept (PoC) that innovative computer vision methods can be implemented in the backend and their detection features accessed via smartphone.
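The interface between the smartphone app and the backend is not described in detail. As a hypothetical illustration only, a backend detection endpoint could be sketched as follows; FastAPI, pytesseract and the /detect route are assumptions, not the actual implementation:

```python
# Hypothetical sketch of a backend detection endpoint for the smartphone app.
# The framework (FastAPI) and OCR engine (pytesseract) are assumptions.
import cv2
import numpy as np
import pytesseract
from fastapi import FastAPI, File, UploadFile

app = FastAPI()

@app.post("/detect")
async def detect(image: UploadFile = File(...)):
    """Accept a camera frame uploaded by the app and return recognized sign
    text as JSON, so the client only handles capture and voice output."""
    data = np.frombuffer(await image.read(), dtype=np.uint8)
    frame = cv2.imdecode(data, cv2.IMREAD_COLOR)
    if frame is None:
        return {"cues": []}
    text = pytesseract.image_to_string(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    return {"cues": [line.strip() for line in text.splitlines() if line.strip()]}
```

A client app would then upload a camera frame, for example with curl -F "image=@frame.jpg" http://backend-host/detect, and read the returned cues aloud on the device.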

For 2023, the company aims to develop the PoC into the initial version of a product, a minimum viable product (MVP). For almost a year, IAV has been driving the development of the backend control center of the “WonderVision” technology and has built up extensive know-how.

In addition to a running backend server with working sign and text recognition for stations, IAV has set up a signaling server in the cloud. It handles the communication between the customer frontend and the backend and is an important lever for scaling the “WonderVision” technology.
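The signaling protocol itself is not described in the article. A minimal sketch, assuming a WebSocket-based relay in which each peer first announces its role ("frontend" or "backend") and the server then forwards messages between the two, might look like this:

```python
# Minimal sketch of a cloud signaling relay between frontend and backend.
# WebSockets and the message format are assumptions, not the actual protocol.
import asyncio
import json
import websockets

peers = {}  # role ("frontend" / "backend") -> websocket connection

async def relay(websocket, path=None):
    """Register a peer by its announced role and forward its messages
    to the peer on the other side."""
    role = json.loads(await websocket.recv())["role"]  # first message: {"role": ...}
    peers[role] = websocket
    try:
        async for message in websocket:
            other = "backend" if role == "frontend" else "frontend"
            if other in peers:
                await peers[other].send(message)
    finally:
        peers.pop(role, None)

async def main():
    async with websockets.serve(relay, "0.0.0.0", 8765):
        await asyncio.Future()  # run until cancelled

if __name__ == "__main__":
    asyncio.run(main())
```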

Unlimited potential for detection features

As an additional quality gate, IAV has set up a debugging app connected to the backend server to test its own algorithms before they are integrated into a customer app.

Whether it is destination navigation, location determination, person recognition or simply a product search in the supermarket: the possibilities for adding further features to “WonderVision” are almost unlimited.

“By focusing on backend development, we remain flexible and able to serve a wide range of use cases and different customers according to their needs,” explains Christian März, Data Scientist in the Analytics & AI Methods team at IAV. “Railway stations are only the beginning for us.”

Contact: 
ahmed.hussein@iav.de

marko.gustke@iav.de