
New App can Help Speak Language of Eyes

by Bidita Debnath on Feb 19 2017 9:53 PM

Current eye-tracking input systems for people with ALS or other motor impairments are expensive, not robust under sunlight, and require frequent re-calibration.

Scientists have developed an app that could help people speak the language of eyes -- literally.
The smartphone app, developed by researchers working with Microsoft, can interpret eye gestures in real time, decode them into predicted utterances, and facilitate communication.

Called GazeSpeak, the app is designed to help people with amyotrophic lateral sclerosis (ALS), a condition in which individuals gradually lose their strength and the ability to speak, eat or move.

As part of the Enable team at Microsoft Research, the scientists developed GazeSpeak to help people with ALS who can move their eyes but cannot speak.

ALS also causes other motor impairments that affect voluntary muscle movement.

According to the researchers, current eye-tracking input systems for people with ALS or other motor impairments are expensive, not robust under sunlight, and require frequent re-calibration and substantial, relatively immobile setups.

Eye-gaze transfer (e-tran) boards, a low-tech alternative, are challenging to master and offer slow communication rates.

"To mitigate the drawbacks of these two status quo approaches, we created GazeSpeak, an eye gesture communication system that runs on a smartphone, and is designed to be low-cost, robust, portable, and easy-to-learn, with a higher communication bandwidth than an e-tran board," the researchers wrote in the abstract of a paper scheduled to be presented in May at the Conference on Human Factors in Computing Systems in Colorado, US.

"GazeSpeak can interpret eye gestures in real time, decode these gestures into predicted utterances, and facilitate communication, with different user interfaces for speakers and interpreters," the researchers said.

GazeSpeak uses artificial intelligence to convert eye movements into speech, and it runs on the listener's device so that they can follow what is being said in real time, New Scientist reported.
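
To give a sense of what "decoding gestures into predicted utterances" can mean in practice, here is a rough illustrative sketch only, not the actual GazeSpeak implementation: it assumes an e-tran-style scheme in which each gaze direction selects a group of letters, and a small word list stands in for the app's prediction model. The letter grouping, vocabulary and function names are all assumptions for illustration.

```python
# Toy gesture-to-word decoder (hypothetical; not GazeSpeak's real algorithm).
# Each gaze direction selects a group of letters, and candidate words are
# predicted by matching the gesture sequence against a vocabulary.

# Assumed letter layout: one group of letters per gaze direction.
GROUPS = {
    "up":    set("abcdef"),
    "down":  set("ghijkl"),
    "left":  set("mnopqr"),
    "right": set("stuvwxyz"),
}

# Tiny stand-in vocabulary; a real system would rank a large dictionary.
VOCABULARY = ["hello", "help", "water", "thanks", "yes", "no"]


def matches(word: str, gestures: list[str]) -> bool:
    """True if every letter of `word` lies in the letter group selected
    by the corresponding gaze gesture."""
    if len(word) != len(gestures):
        return False
    return all(letter in GROUPS[g] for letter, g in zip(word, gestures))


def predict(gestures: list[str]) -> list[str]:
    """Decode a sequence of gaze gestures into candidate words."""
    return [word for word in VOCABULARY if matches(word, gestures)]


if __name__ == "__main__":
    # "help": h -> down, e -> up, l -> down, p -> left
    print(predict(["down", "up", "down", "left"]))  # ['help']
```

Because several letters would share a gaze direction in a scheme like this, a single gesture sequence can be ambiguous, which is why decoding into predicted utterances, rather than spelling one unambiguous letter at a time, is what makes such a system faster than a plain e-tran board.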

The app will be available on the Apple App Store before the conference in May, according to the report.

Source: IANS

