The Technology Helping Blind People “See Through Their Ears”

Here at Tech & Innovation Daily, our M.O. is pretty simple: to bring you the sharpest insights into the hottest tech trends, profile the fastest-growing and most promising companies, and highlight the most exciting tech innovations and breakthroughs.

This column falls squarely into the latter category – and adds to our existing articles on the subject. Namely, breakthroughs in the field of vision. Or more specifically, vision impairment.

Back in February, for example, we profiled the Argus II glasses. Created by California-based firm Second Sight, they're designed to help people with a degenerative, incurable eye disease called retinitis pigmentosa.

And in August, we wrote about Israel’s OrCam – an ingenious set of glasses that identifies text and objects and relays the information to the wearer via audio.

And now, researchers at the University of Bath in England have thrown their hat into the ring…

A New “Sound Language” for the Blind

Along the same lines as OrCam, the Bath team has developed software technology called The vOICe, which essentially lets blind people “see through their ears.”

In other words, it turns camera images of a person's surroundings and the objects in them into sounds. Then, much like learning a new language, the software trains the brain to translate the sounds the person hears into the corresponding objects.

Originally designed by Dutch scientist Peter Meijer in the 1990s, the vOICe technology at Bath is now under the guidance of psychology professor Dr. Michael Proulx. And he highlights that, like both Argus II and OrCam, the vOICe offers a non-invasive alternative to eye surgery.

In fact, the latest model under development at Bath boasts a notable accolade: in tests on both sighted and blind people, the system showed better recognition levels than those of post-stem cell implant patients in a 2012 U.S. study.

As Dr. Proulx explains, “It’s true that if you have a retinal implant, you do have some experience of vision. However, what’s really surprising is that the ability to actually use those invasive devices still requires a lot of learning and in a lot of ways, it’s very similar to the amount of learning involved in learning to use a device like this.”

So how does it work?

The vOICe in Action

Simply put, the vOICe software takes regular two-dimensional images and creates a scale of sounds and notes corresponding to an object's height and density.
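
For the technically curious, here's a rough sense of what such an image-to-sound mapping can look like in code. This is a minimal, illustrative sketch of the general sonification idea (scanning an image left to right, with a pixel's vertical position setting pitch and its brightness setting loudness), not the actual vOICe software, and every name and parameter in it is hypothetical:

```python
# Toy image-to-sound ("sonification") sketch. NOT the vOICe source code;
# just an illustration of the general idea: scan the image column by column
# from left to right, let a pixel's vertical position set the pitch of a
# sine tone, and let its brightness set the loudness of that tone.
import numpy as np
import wave

SAMPLE_RATE = 22050                    # audio samples per second
COLUMN_DURATION = 0.02                 # seconds of sound per image column
FREQ_LOW, FREQ_HIGH = 400.0, 4000.0    # pitch range in Hz

def sonify(image: np.ndarray) -> np.ndarray:
    """Convert a 2-D grayscale image (values 0..1, row 0 at the top) to audio."""
    rows, cols = image.shape
    # Pixels nearer the top of the image get higher frequencies.
    freqs = np.geomspace(FREQ_HIGH, FREQ_LOW, rows)
    t = np.arange(int(SAMPLE_RATE * COLUMN_DURATION)) / SAMPLE_RATE
    chunks = []
    for c in range(cols):
        column = image[:, c]                             # brightness per pixel
        tones = np.sin(2 * np.pi * np.outer(freqs, t))   # one sine wave per row
        chunks.append(column @ tones)                    # brightness weights loudness
    audio = np.concatenate(chunks)
    peak = np.max(np.abs(audio)) or 1.0
    return audio / peak                                  # normalize to [-1, 1]

if __name__ == "__main__":
    # A simple test image: a bright diagonal line, which should sweep upward in pitch.
    img = np.zeros((64, 64))
    np.fill_diagonal(np.fliplr(img), 1.0)
    audio = sonify(img)
    with wave.open("sweep.wav", "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(SAMPLE_RATE)
        f.writeframes((audio * 32767).astype(np.int16).tobytes())
```

Running the sketch on that diagonal-line image produces a tone that rises steadily in pitch as the scan moves across the frame, which gives a feel for how users can learn to "hear" shapes.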

After training with the system to differentiate between sounds and identify objects, users put their knowledge to practical use by navigating a room and finding real 3-D objects.

Check out the video below, as one of the vOICe research assistants demonstrates how it works – both quickly and successfully…

The team’s Dave Brown, who trains the volunteers to use the vOICe system, says it’s “like learning a new language – the more you practice, the better you get.”

Right now, it’s best suited to indoor use, where navigation and objects are more controlled, self-contained – and thus easier to learn. Ultimately, Proulx plans to publish an open-source training manual for the system, so that it can be used either in place of eye surgery or as a complement to it.

Ahead of the tape,

Martin Denholm
