Research Article: Electrooculography-based continuous eye-writing recognition system for efficient assistive communication systems

Date Published: February 9, 2018

Publisher: Public Library of Science

Author(s): Fuming Fang, Takahiro Shinozaki, Stefano Federici.

http://doi.org/10.1371/journal.pone.0192684

Abstract

Human-computer interface systems whose input is based on eye movements can serve as a means of communication for patients with locked-in syndrome. Eye-writing is one such system; users can input characters by moving their eyes to follow the lines of the strokes corresponding to characters. Although this input method makes it easy for patients to get started because of their familiarity with handwriting, existing eye-writing systems suffer from slow input rates because they require a pause between input characters to simplify the automatic recognition process. In this paper, we propose a continuous eye-writing recognition system that achieves a rapid input rate because it accepts characters eye-written continuously, with no pauses. For recognition purposes, the proposed system first detects eye movements using electrooculography (EOG), and then a hidden Markov model (HMM) is applied to model the EOG signals and recognize the eye-written characters. Additionally, this paper investigates an EOG adaptation that uses a deep neural network (DNN)-based HMM. Experiments with six participants showed an average input speed of 27.9 character/min using Japanese Katakana as the input target characters. A Katakana character-recognition error rate of only 5.0% was achieved using 13.8 minutes of adaptation data.

Partial Text

Eye movement-based communication is extremely important for people such as patients with amyotrophic lateral sclerosis (ALS) who have lost nearly all their ability to control voluntary movements, including loss of speech and handwriting but not eye movement [1, 2]. For these patients, the most common means of communication is to have a caregiver face the patient through a transparent character board and then identify which character the patient is looking at [3]. Instead of using a caregiver, this study investigates human-computer interface systems whose input is based on eye movements. Based on whether these systems require a computer screen, they can be split into two groups. Among those that use a screen, Kate et al. [4] and Majaranta et al. [5] designed an on-screen keyboard selection system in which a user could select a key by either stopping a moving cursor using a triggering eye movement or by gazing continuously at a particular key for a fixed duration (this duration is termed “dwell time” [6]). Urbina and Huckauf developed a two-level hierarchical menu selection method [7], in which the user could select a character group from the bottom menu and then select a character from a corresponding pop-up menu by simply glancing through the relevant menu items. Ward and MacKay proposed a continuous selection method named Dasher [8, 9], in which a language model was used to predict the next likely characters. The predicted characters were displayed near the current character and then the user could simply glance from character to character without stopping for selection. The average text entry rate of the on-screen keyboard system was slow—approximately 7.0 words/min [10]—but faster rates of 13.0 and 17.3 words/min were obtained for the hierarchical selection and Dasher, respectively, because those systems did not require any dwell time. 
This suggests that a rapid input rate is a primary concern for these assistive communication systems and that it can be achieved without dwell time.
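The dwell-time selection mechanism described above can be sketched in a few lines. This is a hypothetical illustration, not code from the cited systems: it assumes gaze samples arrive as `(timestamp_seconds, key_under_gaze)` pairs, and a key is "typed" once the gaze rests on it continuously for at least the dwell duration.

```python
# Hypothetical sketch of dwell-time key selection: a key is selected once the
# gaze stays on it continuously for at least `dwell` seconds. The sample
# format (timestamp_seconds, key_under_gaze) is assumed for illustration.

def dwell_select(samples, dwell=1.0):
    typed = []
    current, since = None, None
    for t, key in samples:
        if key != current:
            current, since = key, t      # gaze moved: restart the dwell timer
        elif key is not None and t - since >= dwell:
            typed.append(key)
            current, since = None, None  # require the gaze to leave and return

    return typed

# Gaze dwells on "H" for 1.2 s, grazes "E" briefly, then dwells on "I".
samples = [(0.0, "H"), (0.5, "H"), (1.2, "H"),
           (1.4, "E"),
           (1.6, "I"), (2.1, "I"), (2.7, "I")]
print(dwell_select(samples))  # → ['H', 'I']
```

The brief pass over "E" never accumulates the required dwell, which is exactly why dwell-based systems are robust to stray glances but slow: every intended key costs at least one full dwell period.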

In this paper, we propose a continuous eye-writing recognition system for patients with locked-in syndrome. People using this system can input eye-written characters continuously. Because the system requires no waiting time between characters, it achieves input rates higher than those of conventional isolated eye-writing systems, in which input characters must be separated by a pause to simplify the automatic recognition process.
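To make the HMM-based recognition step concrete, the toy sketch below scores a quantized EOG observation sequence against one discrete HMM per character with the forward algorithm and picks the most likely model. This is a simplified stand-in for the paper's approach (which uses continuous EOG features, continuous decoding, and DNN-based models): the two-state models, the three-symbol alphabet standing for quantized gaze directions, and all probabilities here are invented for illustration.

```python
import math

# Hypothetical sketch (not the authors' implementation): isolated-character
# recognition by scoring a quantized EOG observation sequence against one
# discrete HMM per character and choosing the highest-likelihood model.

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of an observation sequence under a discrete HMM.

    obs : list of observation symbol indices
    pi  : initial state probabilities, length N
    A   : N x N state transition matrix
    B   : N x M emission matrix (state -> symbol probability)
    """
    N = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(N)]
    log_scale = 0.0
    for t in range(1, len(obs)):
        alpha = [sum(alpha[i] * A[i][j] for i in range(N)) * B[j][obs[t]]
                 for j in range(N)]
        s = sum(alpha)                  # rescale to avoid numeric underflow
        log_scale += math.log(s)
        alpha = [a / s for a in alpha]
    return log_scale + math.log(sum(alpha))

def recognize(obs, models):
    """Return the character whose HMM gives the highest likelihood."""
    return max(models, key=lambda c: forward_loglik(obs, *models[c]))

# Toy 2-state, 3-symbol models for two hypothetical eye-writing strokes;
# symbols 0/1/2 stand for quantized EOG-derived gaze directions.
models = {
    "A": ([0.9, 0.1],
          [[0.8, 0.2], [0.0, 1.0]],
          [[0.7, 0.2, 0.1], [0.1, 0.1, 0.8]]),
    "B": ([0.9, 0.1],
          [[0.8, 0.2], [0.0, 1.0]],
          [[0.1, 0.8, 0.1], [0.8, 0.1, 0.1]]),
}

print(recognize([0, 0, 2, 2], models))  # → A
```

Continuous recognition, as proposed in the paper, additionally connects the per-character models so that the decoder can move from one character's HMM to the next without an explicit pause marking the boundary.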

