
      Comparison of Four Control Methods for a Five-Choice Assistive Technology


          Abstract

Severe motor impairments can affect the ability to communicate. The ability to see has a decisive influence on which augmentative and alternative communication (AAC) systems are available to the user. To better understand the initial impressions users have of AAC systems, we asked naïve healthy participants to compare two visual systems (a visual P300 brain-computer interface (BCI) and an eye-tracker) and two non-visual systems (an auditory and a tactile P300 BCI). Eleven healthy participants performed 20 selections in a five-choice task with each system. The visual P300 BCI used face stimuli, the auditory P300 BCI used Japanese Hiragana syllables, and the tactile P300 BCI used stimulators on the left little finger, left middle finger, right thumb, right middle finger and right little finger. The eye-tracker required a dwell time of 3 s on the target for selection. We calculated accuracies and information-transfer rates (ITRs) for each control method, using, for each system, the selection time that yielded the highest ITR while keeping accuracy above 70%. Accuracies of 88% were achieved with the visual P300 BCI (4.8 s selection time, 20.9 bits/min), 70% with the auditory BCI (19.9 s, 3.3 bits/min), 71% with the tactile BCI (18 s, 3.4 bits/min) and 100% with the eye-tracker (5.1 s, 28.2 bits/min). Performance with the eye-tracker and the visual BCI correlated strongly, whereas the correlation between tactile and auditory BCI performance was lower. Our data showed no ITR advantage for either non-visual system, but the lower correlation of performance suggests that matching the system to a particular user is more important for non-visual systems than for visual ones.
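
The abstract reports information-transfer rates but does not spell out the formula behind them. As a rough cross-check, the sketch below recomputes bits/min under the standard Wolpaw ITR definition, assuming N = 5 equiprobable choices and errors spread uniformly over the wrong choices; the helper name wolpaw_itr is ours, and small deviations from the published figures are expected because the reported accuracies and selection times are rounded (and the paper's exact time base may differ).

    import math

    def wolpaw_itr(n_choices: int, accuracy: float, selection_time_s: float) -> float:
        """Wolpaw information transfer rate in bits/min.

        Assumes all N choices are equally likely and errors are distributed
        uniformly over the N - 1 incorrect choices.
        """
        n, p = n_choices, accuracy
        bits = math.log2(n)  # maximum information per selection
        if 0.0 < p < 1.0:    # the entropy terms vanish at p = 0 or p = 1
            bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
        return bits * 60.0 / selection_time_s

    # Operating points reported in the abstract (accuracy, selection time in s)
    systems = [("visual P300 BCI", 0.88, 4.8),
               ("auditory P300 BCI", 0.70, 19.9),
               ("tactile P300 BCI", 0.71, 18.0),
               ("eye-tracker", 1.00, 5.1)]
    for name, acc, t in systems:
        print(f"{name}: {wolpaw_itr(5, acc, t):.1f} bits/min")

With these rounded inputs the sketch yields roughly 19.4, 2.5, 2.9 and 27.3 bits/min respectively, in the same ballpark as the published 20.9, 3.3, 3.4 and 28.2 bits/min.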


                Author and article information

Journal: Frontiers in Human Neuroscience (Front. Hum. Neurosci.)
Publisher: Frontiers Media S.A.
ISSN: 1662-5161
Published: 06 June 2018
Volume: 12, Article: 228
                Affiliations
                [1] 1Systems Neuroscience Section, Department of Rehabilitation for Brain Functions, Research Institute of National Rehabilitation Center for Persons with Disabilities , Tokorozawa, Saitama, Japan
                [2] 2Department of Molecular Medicine, University of Oslo , Oslo, Norway
                [3] 3Brain Science Inspired Life Support Research Center, The University of Electro-Communications , Tokyo, Japan
                [4] 4Department of Physiology and Biological Information, Dokkyo Medical University School of Medicine , Tochigi, Japan
                Author notes

                Edited by: Jan Babic, Jožef Stefan Institute (IJS), Slovenia

                Reviewed by: Emanuel Donchin, University of South Florida, United States; Rifai Chai, University of Technology Sydney, Australia

*Correspondence: Sebastian Halder sebastian.halder@medisin.uio.no
Article
DOI: 10.3389/fnhum.2018.00228
PMCID: 5997833
                Copyright © 2018 Halder, Takano and Kansaku.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

History
Received: 22 February 2018
Accepted: 16 May 2018
                Page count
                Figures: 9, Tables: 1, Equations: 0, References: 59, Pages: 14, Words: 10045
                Funding
Funded by: Japan Society for the Promotion of Science (DOI: 10.13039/501100001691)
Award IDs: 15H03126, 15H05880, 16K13113, 16H05583
Categories
Neuroscience, Original Research

Keywords
BCI, EEG/ERP, assistive technology, eye-tracking, visual stimulation, auditory stimulation, tactile stimulation
