15 September 2015 | 08:16 AM America/Los_Angeles

Technologies For Understanding Human Sign Language

A report from Intel Free Press on the challenges in developing communications technologies for deaf people...

Adam Munder, a lithography process engineer at Intel, has been 100 percent deaf since he was 1 year old. He doesn't read lips, nor does he speak. He communicates primarily through American Sign Language (ASL) with the help of two interpreters: one listens to conversations and converts the dialog into ASL, and the other reads his responses in ASL and voices them to his audience.

Munder seldom uses technology to communicate, despite being immersed in an extremely technical environment within Intel. And it is these technical conversations about lithography, a process for etching geometric shapes onto silicon wafers, that pose his biggest communication challenge.


Spinal meningitis took away Munder's hearing at an early age. He was mainstreamed with other deaf students and has been fully immersed in deaf culture since he was young. He doesn't rely on hearing aids, and the only technology he uses to communicate outside of Intel is a video phone that connects to a video relay service.

"Outside of Intel, I use video conferencing to a relay operator who interprets," explains Munder through one of his interpreters, Christian Hansen. "For instance, if I want to order a pizza, I'll call the relay service, they'll see me on the video phone, I'll see them, and then they'll place the call to the pizza place, and they can interpret the call for me."

Being deaf inside Intel...

Within Intel, it's a different story: Munder frequently sits in highly technical meetings with groups of engineers who often speak simultaneously, use Intel-specific acronyms and may speak with accents, all of which can be challenging for interpreters to understand and convey.

"Inside Intel, it's kind of challenging," says Munder. "External technology isn't really applicable here. If I use video relay to call someone here at Intel, that interpreter is not going to know any jargon or lingo plus IP [intellectual property] happening here at Intel, so we just have the interpreter team here."

Intel has contracted with an outside agency to provide interpreting services for deaf employees.

"There were a lot of different challenges when I first joined Intel," says Munder. "The first six months, I was completely lost. I didn't understand Intel's language, it was very difficult for me to figure out the Intel culture."

Over time, Munder and his interpreters developed ways to share and understand acronyms and other technical terms within Intel.

"The interpreters had to listen to the Intel language and they would have to figure out how to translate it to English so that it made sense and understand the concept behind the science," says Munder. "And then they would have to interpret it and translate it into the American Sign Language for me."

American Sign Language has its own grammar, syntax and structure, which differ from those of English. ASL also incorporates body language and facial expression. For example, a question is indicated through a facial expression: raising one's eyebrows. The simple spoken question "What is your name?" translates into essentially "name, what" in combination with raised eyebrows.

"When you write ASL down, it doesn't make sense in English," explains Munder. "The actual grammar structure is more similar to Romance languages, so it's very different than in English."

"The actual interpreting process that happens here at Intel is very unique to interpreting in general," explains one of the interpreters, Bekki Mullenaux. "This is a highly contextual, fast-paced environment. In general, as interpreters, we try to fade into the background and try not to be conspicuous...but not at Intel." Mullenaux adds that she had to force her way into the middle of groups of Intel engineers, shout over the noise of loud equipment and engage in multiple, simultaneous conversations.

Another challenge that Mullenaux recounts is having to sign while fully suited up in an Intel "bunny suit," the protective suit used in Intel's extremely clean manufacturing environments, with only her eyes visible.

"Sign language is a very visual language depending on small hand movements, facial expression and body language," says Mullenaux. "It was like trying to talk to someone while being gagged."

The coming of signing technology...

Technology that can fully understand ASL, and even read lips, has been promised, but it still seems to be a ways off.

"In order to fully interpret American Sign Language grammar and structures into spoken English, a computer needs to read all facial expressions, body language and signs in parallel," explains Munder. "It requires a quite complicated algorithm for that."

Even within Intel, there are commercially available technologies that map faces and gestures, like RealSense and Pocket Avatars (now called YAP). But despite the complex algorithms powering these products, much more would need to be done.

"Facial expressions, body languages and signs are all related," says Munder. "The major challenge is that we need to create a range of innovative tools to allow the recording and the processing of motion to capture American Sign Language context. It is not that easy to design a complete workflow from the movement capture, including all upper body parts articulations, facial expression and gaze direction."

But efforts are being made to bridge technology and human gesture, from monitoring movement and facial expressions with cameras and sensor-laden gloves to a developer working with the RealSense SDK to create rudimentary sign-language interpretation.
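As an illustration of the capture side only, here is a minimal depth-and-color loop written against the open-source librealsense Python bindings (pyrealsense2), which post-date the 2015-era RealSense SDK mentioned above. The gesture interpretation itself, the hard part, is deliberately left out:

    import pyrealsense2 as rs  # librealsense bindings; newer than the SDK above

    # Stream 640x480 depth and color at 30 frames per second.
    pipeline = rs.pipeline()
    config = rs.config()
    config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
    config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
    pipeline.start(config)

    try:
        for _ in range(300):  # roughly 10 seconds of frames
            frames = pipeline.wait_for_frames()
            depth = frames.get_depth_frame()
            color = frames.get_color_frame()
            if not depth or not color:
                continue
            # Depth at the image center, in meters. A sign-language prototype
            # would instead segment the hands and face from the depth map and
            # track them frame to frame.
            print("distance to center pixel: %.2f m" % depth.get_distance(320, 240))
    finally:
        pipeline.stop()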

Until that day arrives, Munder is just fine with the way things are. "Being deaf is a difference in human experience rather than a disability."