[home]
Enabling Blind People to See with Sound

Goal: Working with Peter Meijer, the original inventor of The vOICe, commercialize a sensory substitution device that enables blind people to "see with sound".

See our company website, MetaModal, LLC.

An article about us appeared in the Pasadena Star-News (text copied below in case the Star-News link breaks). More pictures are available there!


Two Pasadena researchers seek integrated system to help the blind "see"


By Beige Luciano-Adams, Staff Writer, Posted: 11/07/2010 06:04:11 AM PST
Founders of MetaModal, LLC Enrico Di Bernardo, left, and Luis Goncalves, center, work with blind participants Bruce Benefiel, of Los Angeles, second left, and Gloria Broderick, of Pasadena, right, at MetaModal's office in Pasadena on Thursday, Nov. 4, 2010. The vOICe vision glasses, by MetaModal, help totally blind users experience live camera views through sophisticated image-to-sound renderings. (SGVN/Staff photo by Watchara Phomicinda)

PASADENA - Bruce Benefiel sits facing five white strips arranged on a black wall. He tries to decipher the shape, tracing the air with his fingers.

"The slanted line cuts across the horizontal," he ventures. A voice from off-camera asks him if he sees the lines overlapping - "have another look," it urges.

"Oh, I see," says Bruce. He struggles for a minute, then nails it: "Is that supposed to be a house?"

Benefiel has been blind since birth.

But with a pair of camera-equipped glasses and a hand-held computer that translates images into raw sounds - all fed to him via earbuds - he is seeing the world around him.

For the last 10 months, Pasadena-based partners Luis Goncalves and Enrico Di Bernardo have been working with trainees like Benefiel in their MetaModal project, developing a prototype for what they envision will be an integrated platform for high-tech blindness aids.

Or, as Di Bernardo terms it, an iPhone for the blind.

The two researchers are currently funded by a National Science Foundation grant, but note that their mandate is more social than scientific.

"Of course there are a lot of people working in the area of restoring sight," Di Bernardo said, explaining that MetaModal is using mostly extant technologies, like the sensory substitution behind the "vOICevision technology" glasses Benefiel wears.

"We're not so much looking at restoring sight. We're working on providing independent living for people who are blind," he said.

The goal is to help blind people enhance their awareness of the world around them and give them confidence to enter it.

"A lot of blind people don't leave their bedroom. They just don't take on life," said Di Bernardo.

Benefiel, MetaModal's irrepressible first trainee, would be an exception to that rule.

"I can see you," he declared on a recent morning. "I can see Rico, Luis - I knew his head was bald as soon as I met him. And I can tell if Rico shaved, which he did today."

Goncalves and Di Bernardo murmured in agreement.

Benefiel's impairment stems from being born three months early.

"In the '50s, that was a death sentence," he said. "I never really had vision. I faked it like all blind people do - 'Oh! I see shadows,'" he mocked.

But the glasses, says Benefiel, have "added so much to my world, because now I have an idea of what sight is... I could never quite grasp the concept. Now it's like having invisible fingers that come out of my head."

The Goodwill employee says he uses the prototype to help clean his house.

"I can do everything but chase my cat around, because my cat is black. It doesn't show up on this thing," he said.

Lighter objects make louder sounds, objects higher in the frame of vision make higher sounds, and sound increases on one side or the other if you look off-center from an object.
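The mapping the article describes - brightness to loudness, vertical position to pitch, horizontal position to stereo balance over a left-to-right scan - can be sketched roughly as follows. The frequency range, scan duration, and event representation here are illustrative assumptions for the sketch, not the actual parameters of The vOICe.

```python
SCAN_SECONDS = 1.0             # assumed time to sweep one frame, left to right
F_LOW, F_HIGH = 500.0, 5000.0  # assumed pitch range in Hz

def image_to_events(image):
    """Convert a grayscale image (list of rows, values 0.0-1.0, row 0 at
    the top) into (time, frequency_hz, amplitude, pan) tuples."""
    rows, cols = len(image), len(image[0])
    events = []
    for c in range(cols):
        t = SCAN_SECONDS * c / cols  # later columns sound later in the sweep
        # horizontal position -> stereo pan, -1.0 (left) to +1.0 (right)
        pan = 2.0 * c / (cols - 1) - 1.0 if cols > 1 else 0.0
        for r in range(rows):
            # row 0 (top of the frame) gets the highest frequency
            freq = F_HIGH - (F_HIGH - F_LOW) * r / (rows - 1) if rows > 1 else F_LOW
            amp = image[r][c]  # brighter pixel -> louder tone
            if amp > 0.0:
                events.append((t, freq, amp, pan))
    return events

# A single bright dot in the top-left corner of a 4x4 frame: it should
# sound early in the sweep, high-pitched, at full volume, panned left.
frame = [[0.0] * 4 for _ in range(4)]
frame[0][0] = 1.0
print(image_to_events(frame))  # -> [(0.0, 5000.0, 1.0, -1.0)]
```

Rendering each event as a sine tone of the given frequency, amplitude, and pan would produce the kind of one-second soundscape sweep the trainees learn to interpret.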

Trainees start off in front of a black felt wall with simple, white felt shapes - circles, horizontal and vertical lines that they can touch as they learn to recognize the sounds.

The idea is that they'll fine-tune their sense of perception through repeated exercises, eventually using the device to map spatial environments.

Once they learn how the sound works and they get a sense of centering objects, trainees can start to imagine where things are in space - and reach for them, or walk towards them as they look down at the ground to see where they're going, explains Goncalves.

He says this virtual hand-eye coordination catches on fairly quickly.

"At the beginning it can even be hard to center things, but they get better at reaching for it. After three months of trying, they're pretty confident," Goncalves said.

Vince Fazzi, an orientation and mobility specialist who works with MetaModal trainees, compares the company's image-to-sound device to its sound-only precursors, noting this is the first time a camera has been integrated.

While the system is still complex, Fazzi says he can see it as a complement to canes or dogs, especially if the barrage of sounds coming in could be refined.

Sonic guides, he recalls, fell out of favor.

"People realized they could travel using their cane and get to where they wanted to go and really didn't need to know that much about the world around them," he said.

Goncalves said they've been experimenting with this particular sound-based sensory substitution, developed by partner Peter Meijer in Holland, because it was available. But their vision is bigger.

For their next project, which they hope to fund either with another NSF grant or private investment, they want to try something different, maybe another kind of sensory substitution, or something that taps their backgrounds as Caltech-educated computer vision experts.

The technology they have in mind would allow people to recognize objects and faces, read text, detect things moving in space.

In practice, this could mean the glasses scan bar codes in supermarkets, or items at home in your fridge, or recognize faces or landmarks when you're out walking.

Ultimately, they said, it would have to be "all in the glasses" to make sense.

"When you look at common electronic aids to assist the independent living of people with blindness," Di Bernardo said, "they would have to carry a backpack full of different gadgets and take out the one they need at any particular time."

Instead, theirs would be a single, integrated device, and function as an open platform where third parties could program new apps, explained Goncalves.

He predicts that speech-based commands and other computer interpretations of the environment could mingle with sensory substitution schemes to give the user raw information, thereby maintaining the experiential component that MetaModal's trainees are currently struggling to master.

"The key is to make the user interface as seamless, easy-to-use, and functional as possible," said Goncalves. "No need to carry a bunch of gadgets that do one thing each, no need to fumble with your phone, no need to hold anything in your hands, which is important, since often one hand will already be occupied holding your cane or guide dog."

beige.luciano-adams@sgvn.com

626-578-6300, ext. 4444
