The project is based on Tobii eye-gaze technology, which provides communication support for people with speech impairments. The system precisely tracks users’ eye movements through a dedicated camera device, enabling them to compose spoken messages during interactive conversations. Users can select among four English phonemes, each assigned to a different gaze direction, which are then played and blended into spoken words. Instead of receiving visual feedback through a screen, the user looks directly at the person he or she is talking to, while the selected phonemes and words are played through speakers.
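The direction-to-phoneme interaction described above can be sketched as a simple mapping and blending step. This is an illustrative sketch only: the actual phoneme set, direction names, and blending rule are assumptions, and the real system would drive this from Tobii gaze coordinates and audio playback rather than plain strings.

```python
# Hypothetical sketch: four gaze directions, each mapped to one
# English phoneme; selected phonemes are collected and "blended"
# (here, simply concatenated) into a word string to be spoken.
# The phoneme assignments below are illustrative assumptions.

GAZE_PHONEMES = {
    "up": "a",
    "down": "s",
    "left": "t",
    "right": "n",
}

def select_phoneme(direction):
    """Return the phoneme assigned to a gaze direction, or None."""
    return GAZE_PHONEMES.get(direction)

def blend(phonemes):
    """Blend a sequence of selected phonemes into one word string."""
    return "".join(p for p in phonemes if p)

word = blend(select_phoneme(d) for d in ["left", "up", "right"])
print(word)  # "tan"
```

In the actual system the resulting word would be sent to a speech synthesizer and played through speakers rather than printed.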
The aim of this study is to implement a basic eye-gaze speech system and, more importantly, an auxiliary settings function that helps researchers conduct experiments. The efficiency of communication through eye gaze may be affected by various factors, such as the number of accessible phonemes and the execution modes for different actions. These variables are extracted from different design schemes and made adjustable in the ‘Settings’ form. Researchers, regardless of programming experience, can use the research tool in experiments to investigate the best system configuration for non-speaking users.
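One way the adjustable ‘Settings’ form could be backed is by a small persisted configuration object, so that researchers change experimental variables without touching code. The field names below (phoneme count, dwell time, execution mode) are assumptions standing in for the variables the study makes changeable; they are not the system’s actual parameter names.

```python
# Minimal sketch of a persisted settings object for experiments.
# All field names are illustrative assumptions; saving them as JSON
# lets a researcher reconfigure the system without editing code.
import json
from dataclasses import dataclass, asdict

@dataclass
class Settings:
    num_phonemes: int = 4          # how many phonemes are selectable
    dwell_time_ms: int = 800       # gaze dwell time to trigger a selection
    execution_mode: str = "dwell"  # e.g. "dwell" or "blink" to confirm

def save_settings(settings, path):
    """Write the current settings to a JSON file."""
    with open(path, "w") as f:
        json.dump(asdict(settings), f)

def load_settings(path):
    """Read settings back from a JSON file."""
    with open(path) as f:
        return Settings(**json.load(f))

s = Settings(num_phonemes=6, dwell_time_ms=600)
save_settings(s, "settings.json")
print(load_settings("settings.json"))
```

The round-trip through a plain file is the design point: each experimental condition becomes a saved settings file that can be reloaded between sessions.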