Scientists working with Microsoft have developed a smartphone app that could help people speak with their eyes -- literally. The app interprets eye gestures in real time, decodes them into predicted utterances, and facilitates communication.
Called GazeSpeak, the app is designed to help people with amyotrophic lateral sclerosis (ALS), a condition in which individuals gradually lose their strength and the ability to speak, eat or move. The scientists, part of the Enable team at Microsoft Research, developed GazeSpeak for people with ALS who can move their eyes but cannot speak.
ALS also causes other motor impairments that affect voluntary muscle movement.
According to the researchers, current eye-tracking input systems for people with ALS or other motor impairments are expensive, not robust under sunlight, and require frequent re-calibration and substantial, relatively immobile setups.
Eye-gaze transfer (e-tran) boards, a low-tech alternative, are challenging to master and offer slow communication rates.
"GazeSpeak can interpret eye gestures in real time, decode these gestures into predicted utterances, and facilitate communication, with different user interfaces for speakers and interpreters," the researchers said.
The app will be available on the Apple App Store before the conference in May, according to the report.
Source: IANS