
      A Novel Wearable Forehead EOG Measurement System for Human Computer Interfaces

      research-article


          Abstract

          Amyotrophic lateral sclerosis (ALS) patients whose voluntary muscles are paralyzed commonly communicate with the outside world using eye movements, and there have been many efforts to support this method of communication by tracking or detecting those movements. An electrooculogram (EOG) is an electrophysiological signal generated by eye movements that can be measured with electrodes placed around the eye. In this study, we proposed a new, practical electrode placement on the forehead for measuring EOG signals and developed a wearable forehead EOG measurement system for use in human-computer/machine interfaces (HCIs/HMIs). Four electrodes, including the ground electrode, were placed on the forehead; the two channels were arranged vertically and horizontally, sharing a positive electrode. Additionally, a real-time eye movement classification algorithm was developed based on the characteristics of the forehead EOG. Three applications were used to evaluate the proposed system: two virtual keyboards, one based on a modified Bremen brain-computer interface (BCI) speller and one on an automatic sequential row-column scanner, and a drivable power wheelchair. The mean typing speeds of the modified Bremen BCI speller and the automatic row-column scanner were 10.81 and 7.74 letters per minute, and their mean classification accuracies were 91.25% and 95.12%, respectively. In the power wheelchair demonstration, the user drove the wheelchair through a figure-eight course without colliding with obstacles.
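
          The abstract does not give the details of the real-time classification algorithm, so the following sketch is only an illustrative stand-in under stated assumptions: a simple peak-deflection threshold rule applied to the horizontal and vertical forehead EOG channels. The sampling rate, threshold value, function names, and blink handling are assumptions for illustration, not the authors' method.

          import numpy as np

          # Illustrative only: assumed sampling rate and deflection threshold.
          FS = 250              # Hz (assumed)
          THRESHOLD_UV = 150.0  # microvolts (assumed)

          def classify_eye_movement(h_channel, v_channel, threshold=THRESHOLD_UV):
              """Label one epoch of two-channel forehead EOG as left/right/up/down/none
              by comparing the signed peak deflection on each channel to a threshold."""
              h = np.asarray(h_channel, dtype=float)
              v = np.asarray(v_channel, dtype=float)
              h_peak = h[np.argmax(np.abs(h))]  # signed peak, horizontal channel
              v_peak = v[np.argmax(np.abs(v))]  # signed peak, vertical channel
              if max(abs(h_peak), abs(v_peak)) < threshold:
                  return "none"
              if abs(h_peak) >= abs(v_peak):
                  return "right" if h_peak > 0 else "left"
              # Blinks usually appear as a large, brief positive spike on the vertical
              # channel; this toy rule lumps them in with "up".
              return "up" if v_peak > 0 else "down"

          # Usage example with synthetic data: a rightward saccade-like deflection.
          t = np.arange(0, 0.5, 1.0 / FS)
          h_demo = 300.0 * np.exp(-((t - 0.25) ** 2) / 0.002)  # large horizontal swing
          v_demo = 20.0 * np.random.randn(t.size)              # vertical noise only
          print(classify_eye_movement(h_demo, v_demo))          # prints "right"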

          Most cited references (23)


          Talking off the top of your head: toward a mental prosthesis utilizing event-related brain potentials

          This paper describes the development and testing of a system whereby one can communicate through a computer by using the P300 component of the event-related brain potential (ERP). Such a system may be used as a communication aid by individuals who cannot use any motor system for communication (e.g., 'locked-in' patients). The 26 letters of the alphabet, together with several other symbols and commands, are displayed on a computer screen which serves as the keyboard or prosthetic device. The subject focuses attention successively on the characters he wishes to communicate. The computer detects the chosen character on-line and in real time. This detection is achieved by repeatedly flashing rows and columns of the matrix. When the elements containing the chosen character are flashed, a P300 is elicited, and it is this P300 that is detected by the computer. We report an analysis of the operating characteristics of the system when used with normal volunteers, who took part in 2 experimental sessions. In the first session (the pilot study/training session) subjects attempted to spell a word and convey it to a voice synthesizer for production. In the second session (the analysis of the operating characteristics of the system) subjects were required simply to attend to individual letters of a word for a specific number of trials while data were recorded for off-line analysis. The analyses suggest that this communication channel can be operated accurately at the rate of 0.20 bits/sec. In other words, under the conditions we used, subjects can communicate 12.0 bits, or 2.3 characters, per min.
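
            As a quick check on the quoted figures (assuming the display holds roughly 36 items, i.e., the 26 letters plus about ten other symbols and commands): 0.20 bits/s x 60 s = 12.0 bits/min, and each selection from a 36-item matrix carries log2(36) ≈ 5.17 bits, so 12.0 / 5.17 ≈ 2.3 characters/min, which matches the rates reported above.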

            System for assisted mobility using eye movements based on electrooculography.

            This paper describes an eye-control method based on electrooculography (EOG) to develop a system for assisted mobility. One of its most important features is its modularity, making it adaptable to the particular needs of each user according to the type and degree of handicap involved. An eye model based on the electrooculographic signal is proposed and its validity is studied. Several EOG-based human-machine interfaces (HMIs) are discussed, with our study focusing on guiding and controlling a wheelchair for disabled people, where control is effected by eye movements within the socket. Different techniques and guidance strategies are then shown, with comments on the advantages and disadvantages of each. The system consists of a standard electric wheelchair with an on-board computer, sensors and a graphical user interface run by the computer. This eye-control method can also be applied to graphical interfaces, where the eye is used as a computer mouse. The results obtained show that this control technique could be useful in multiple applications, such as mobility and communication aids for handicapped persons.

              Design of a Novel Efficient Human–Computer Interface: An Electrooculagram Based Virtual Keyboard


                Author and article information

                Journal
                Sensors (Basel, Switzerland)
                Publisher: MDPI
                ISSN: 1424-8220
                Published online: 23 June 2017
                Issue date: July 2017
                Volume: 17
                Issue: 7
                Article number: 1485
                Affiliations
                [1] Interdisciplinary Program of Bioengineering, Seoul National University, Seoul 03080, Korea; hjeong20@bmsil.snu.ac.kr (J.H.); hnyoon@bmsil.snu.ac.kr (H.Y.)
                [2] Department of Biomedical Engineering, College of Medicine, Seoul National University, Seoul 03080, Korea
                Author notes
                [*] Correspondence: pks@bmsil.snu.ac.kr; Tel.: +82-2-2072-3135
                Article
                Article ID: sensors-17-01485
                DOI: 10.3390/s17071485
                PMCID: PMC5539556
                PMID: 28644398
                © 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license ( http://creativecommons.org/licenses/by/4.0/).

                History
                Received: 02 May 2017
                Accepted: 20 June 2017
                Categories
                Article

                Subject: Biomedical engineering
                Keywords: human computer interface, HCI, electrooculogram, EOG, forehead, eye movement
