This app uses AR, ‘3D sound’ and a camera to guide blind people through big cities

When blind or visually impaired people try to get around a city like Paris, they can easily get lost the moment they exit a metro station.

A pair of young French engineering students is working to address this problem by developing an augmented reality (AR) navigation app that identifies the most practical route and uses “spatial sound,” also known as 3D sound, to guide blind users along the right path.

“We’re trying to do something very simple where you get the 3D sound from the right direction. You turn in the direction the sound is coming from and you’re ready to go,” SonarVision co-founder and CEO Nathan Daix told Euronews Next.
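The idea of "turning towards the sound" can be sketched in a few lines. The helper below is a hypothetical illustration, not SonarVision's code: it computes the bearing of the next waypoint relative to the user's heading, then derives a crude left/right stereo pan (a real spatial-audio engine would use head-related transfer functions rather than simple panning).

```python
import math

def relative_bearing(user_pos, user_heading_deg, target_pos):
    """Bearing from the user to the target, relative to where the user faces.

    Positions are (x, y) in metres on a local flat grid; heading is in
    degrees, clockwise from north. Returns degrees in [-180, 180):
    negative means the target is to the left, positive to the right.
    """
    dx = target_pos[0] - user_pos[0]
    dy = target_pos[1] - user_pos[1]
    absolute = math.degrees(math.atan2(dx, dy))  # clockwise from north
    return (absolute - user_heading_deg + 180) % 360 - 180

def stereo_pan(rel_bearing_deg):
    """Constant-power left/right gains for a sound at rel_bearing_deg.

    Only conveys the left/right cue; clamped to +/-90 degrees.
    """
    angle = max(-90.0, min(90.0, rel_bearing_deg))
    theta = math.radians(angle + 90) / 2  # map [-90, 90] onto [0, pi/2]
    return math.cos(theta), math.sin(theta)  # (left_gain, right_gain)
```

With the waypoint straight ahead, both ears get equal gain; as it drifts right, the right channel dominates, so the user simply rotates until the sound is centred.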

The app is currently in development and testing, but the young startup aims to have it available in 2023. The prototype works in Paris, but it can be rolled out “fairly easily” in other big European capitals, he said.

There are already apps that alert the user to surrounding points of interest, such as Blindsquare and Soundscape, but SonarVision’s added value, he said, is guiding users from point A to point B like a “super-high-precision GPS” that is also very intuitive.

Major wayfinding apps, such as Google Maps and Apple Maps, have not been designed to accommodate the needs of the visually impaired and are difficult to use with screen readers, he said.

“One of the most frustrating issues with many of these products is accuracy,” Daix explained.

“In cities, GPS can be accurate to 4 or 5 metres at the best of times. But at the worst times, which is at least 30 per cent of the time, you get an inaccuracy of more than 10 metres.

“That means the GPS will tell you that you’ve arrived at your bus stop, but it’s actually across the street, and you still need to figure out how to get to your actual bus stop and have no idea where it is.”

Scanning buildings and city streets

To address this, SonarVision uses the phone’s camera to scan buildings using AR technology and compares them to Apple’s database of scanned buildings for a given city.

“This allows us to perform precise geographic tracking of our users with an accuracy of between 20 centimetres and one metre,” Daix said, adding that this lets the app keep the user on crosswalks and sidewalks while avoiding, as far as possible, stairs and construction areas.
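One way to think about combining the two signals is a simple fusion rule: trust the camera-based relocalisation (sub-metre error) whenever it is available, and fall back to raw GPS (often 5 to 10 metres off) otherwise. The sketch below is a hypothetical illustration of that idea; `best_position_fix` and the `(x, y, accuracy_m)` tuples are invented for this example.

```python
def best_position_fix(gps_fix, visual_fix):
    """Pick the more trustworthy of two position estimates.

    Each fix is an (x, y, accuracy_m) tuple or None. A visual fix
    against a database of scanned buildings should win over raw GPS
    whenever it is available, simply because its error radius is
    smaller. Returns None if neither source has a fix.
    """
    candidates = [f for f in (gps_fix, visual_fix) if f is not None]
    if not candidates:
        return None
    return min(candidates, key=lambda f: f[2])  # smallest error radius wins
```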

For the technology to work, all the user needs is a headset and an iPhone with a camera pointed at the road, although in the future, the camera could be part of the AR goggles for added convenience.

“Your phone would be in your pocket and you would have the glasses on your face doing the seeing part, with the 3D sound coming out of the temples,” Daix said.

There is no replacement for a white cane

However, the app does not detect obstacles in real time and is only designed to be an “add-on” to a white cane, guide dog, or other devices that visually impaired people use to get around.

The technology exists to do that, though: LiDAR, or light detection and ranging, technology has the potential to “really help visually impaired people,” he said.

“What it allows you to do is scan the depth in the environment. In fact, we started working with LiDAR on an iPhone 12 Pro and were able to develop a prototype that essentially replaces the white cane.

“It allows us to detect obstacles, but not just obstacles on the ground, obstacles that are at head level, at body level… It’s really powerful stuff.”
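The key point in Daix's description is that LiDAR reports depth at multiple elevations, so hazards above cane height become detectable. The function below is an illustrative sketch of that idea, not SonarVision's implementation: it assumes the depth map has already been bucketed by elevation into named zones.

```python
def obstacles_ahead(depth_samples, warn_within_m=2.0):
    """Flag nearby obstacles from a straight-ahead depth scan.

    depth_samples maps a body zone ("ground", "body", "head") to the
    distance in metres to the nearest surface in that zone, as a LiDAR
    depth map might report after bucketing points by elevation angle.
    Returns only the zones with a surface closer than warn_within_m.
    """
    return {zone: dist for zone, dist in depth_samples.items()
            if dist <= warn_within_m}
```

A white cane would only ever catch the "ground" zone; the head-level reading is what LiDAR adds.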

The main reason SonarVision isn’t focusing on using LiDAR yet, Daix said, is that smartphones that include it (such as the iPhone 12 Pro) are expensive, and SonarVision aims to make its technology as accessible as possible.

“Today, the biggest feature we can work on is guidance, accurate and affordable guidance, which is really one of the biggest unresolved issues for visually impaired people,” he said.
