The American Printing House for the Blind (APH), located on Frankfort Ave., recorded the first audio book in the world in its basement. Now it is helping create technology that helps the visually impaired navigate indoors and outdoors safely.
GoodMaps Explore is an accessible navigation application created by GoodMaps. Louisville Future sat down with CEO Jose Gaztambide to discuss the tech behind the product.
In a nutshell, how does the tech work?
Gaztambide: We are using lidar (light detection and ranging) with image recognition to map the indoors and outdoors to help people who are blind and visually impaired navigate the world around them. Lidar has become more mainstream because it’s the technology that’s being used in self-driving vehicles.
How did the company come about?
Gaztambide: We were born out of the APH. They funded the creation of this company out of their endowment.
We’ve been around for about a year and a half. We looked around and wondered why indoor mapping and indoor navigation hadn’t taken off. We identified three issues that we needed to concentrate on and we’re now working on solving those things and are in the process of rolling out and raising some funds.
What were the issues you identified?
Gaztambide: The first one was the speed of mapping the indoors. It’s historically been a pretty manual process. For example, a local library did their own mapping about three years ago. It took their team two and a half weeks to map the library.
We’re using lidar to conduct indoor mapping and have created a mobile backpack device. We conducted the same scan in about 17 or 18 minutes. So we’ve just drastically reduced the time that it takes to map.
Another issue is that, while having a map is great, you have to know where you are within that map.
Is no one else doing this?
Gaztambide: Historically, people have tried a whole bunch of different solutions like Wi-Fi fingerprinting, Bluetooth beacons, and ultra-wideband beacons. In the best-case scenario, you’re getting within about 5 meters of accuracy.
We’ve developed an approach that we call camera-based positioning, which basically uses images from an individual’s phone to identify where they are within the building. If you’re standing still, we can get to about 10 centimeters of accuracy. And if you’re moving we can get to about a meter of accuracy.
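GoodMaps hasn’t published the details of its camera-based positioning, but the core idea Gaztambide describes, matching what the phone camera sees against imagery captured during the building scan, can be sketched in miniature. In this toy Python example, every name, feature vector, and coordinate is invented for illustration: each reference frame from the scan is reduced to a feature vector tagged with a known indoor position, and a query frame is located by finding its closest match.

```python
import math

# Hypothetical sketch of camera-based positioning. In a real system the
# "features" would come from a computer-vision model applied to camera
# frames; here they are tiny made-up vectors so the idea stays visible.

def cosine_similarity(a, b):
    """Similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def locate(query_features, reference_db):
    """Return the (x, y) position of the best-matching reference frame."""
    best = max(reference_db,
               key=lambda ref: cosine_similarity(query_features, ref["features"]))
    return best["position"]

# Toy reference database built during the scan: feature vectors with
# known indoor coordinates (meters). All values are illustrative.
reference_db = [
    {"features": [0.9, 0.1, 0.0], "position": (0.0, 0.0)},   # lobby
    {"features": [0.1, 0.8, 0.1], "position": (12.5, 3.0)},  # hallway
    {"features": [0.0, 0.2, 0.9], "position": (25.0, 7.5)},  # office
]

# A query frame most similar to the office reference frame.
print(locate([0.05, 0.15, 0.95], reference_db))  # prints (25.0, 7.5)
```

A production system would match thousands of learned image features per frame and fuse successive estimates as the user moves, which is how sub-meter accuracy becomes plausible; this sketch only shows the lookup step.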
Problem number three is how do you achieve number two without installing any kind of hardware? Nobody wants to put beacons up on their wall. You’ve got to pay for them, you’ve got to maintain them, and they leave holes in your walls.
Our solution requires no infrastructure whatsoever. We just come in and we do the scan with our camera backpack and that’s the end of it. There’s no maintenance.
The tech is completely auditory, as you would expect, and while there are a number of ways that people can engage with it, the one that’s most intuitive is the turn-by-turn navigation. You tell the app where you want to go, either because you received a link with that address or because you just want to go, say, to Steve’s office. Then it gives you turn-by-turn directions. The camera-based positioning can work both indoors and outdoors.