

      An Eye-gaze Oriented Context Based Interaction Paradigm Design

      Published
      proceedings-article
      Proceedings of the 32nd International BCS Human Computer Interaction Conference (HCI)
      Human Computer Interaction Conference
      4 - 6 July 2018
      Human-Computer Interaction, Eye-gaze Interaction, Machine Learning, Cognitive Computing

            Abstract

            The human eye's state of motion and content of interest can express a person's cognitive status in a given situation. When observing the surroundings, the human eyes make different movements to interact with the observed objects, which reflects people's attention and interest intentions. Currently, most eye-gaze interactions lack context information from the environment. To investigate people's cognition awareness when they perform eye-gaze interactions with their surroundings, we analyse the composition of the environment and divide its essential factors into interactive subject, interactive object and context. The eye-object movement attention model and the eye-object feature preference model are constructed to understand the diversity of people's attention and preferences through eye-gaze interaction with different interactive objects in different contexts, and furthermore to predict their behavioural intentions. Then, an eye-gaze oriented context based interaction paradigm is designed to explain the relationships among eye movement, eye-gaze interaction and people's behavioural intentions when they are involved in and performing eye-gaze interactions in different environments. The paradigm shows the eye-gaze interaction patterns and people's cognitive behavioural intentions in different context based environments; it can dynamically adapt the intention prediction results to interact properly with multiple interfaces, such as games, PC systems, social robots and other HCI and HRI applications, and serve as one of the computable modals of cognitive computing.
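
            The abstract's decomposition into interactive subject, interactive object and context, with dwell-based attention driving intention prediction, can be illustrated with a minimal sketch. This is a hypothetical toy model, not code from the paper: the `GazeEvent` structure, the `min_dwell_ms` threshold and the dwell-time aggregation are all illustrative assumptions about how such a paradigm might be wired together.

            ```python
            from dataclasses import dataclass
            from collections import Counter

            @dataclass
            class GazeEvent:
                """One fixation by the interactive subject on an interactive object."""
                object_id: str      # the interactive object being fixated
                duration_ms: float  # fixation duration
                context: str        # environment label, e.g. "game", "desktop", "robot"

            def predict_intention(events, min_dwell_ms=300.0):
                """Toy attention model (illustrative only): the object with the
                longest total dwell time, per context, is taken as the predicted
                behavioural-intention target."""
                dwell = Counter()
                for e in events:
                    if e.duration_ms >= min_dwell_ms:  # ignore brief saccadic glances
                        dwell[(e.context, e.object_id)] += e.duration_ms
                if not dwell:
                    return None
                (context, obj), _ = dwell.most_common(1)[0]
                return {"context": context, "target": obj}

            events = [
                GazeEvent("icon_settings", 150.0, "desktop"),
                GazeEvent("icon_browser", 450.0, "desktop"),
                GazeEvent("icon_browser", 600.0, "desktop"),
            ]
            print(predict_intention(events))  # → {'context': 'desktop', 'target': 'icon_browser'}
            ```

            Because the context label is part of the aggregation key, the same gaze pattern can map to different intention targets in different environments, which is the adaptivity the paradigm describes.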

            Content

            Author and article information

            Contributors
            Conference
            July 2018
            July 2018
            : 1-4
            Affiliations
            [0001]Software School, Xiamen University
            [0002]Quanzhou Institute of Equipment Manufacturing CAS
            Article
            10.14236/ewic/HCI2018.229
            591c3f21-2d76-486a-93a3-200db4f130a6
            © He et al. Published by BCS Learning and Development Ltd. Proceedings of British HCI 2018. Belfast, UK.

            This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

            Proceedings of the 32nd International BCS Human Computer Interaction Conference
            HCI
            32
            Belfast, UK
            4 - 6 July 2018
            Electronic Workshops in Computing (eWiC)
            Human Computer Interaction Conference
            History
            Product

            ISSN 1477-9358, BCS Learning & Development

            Self URI (article page): https://www.scienceopen.com/hosted-document?doi=10.14236/ewic/HCI2018.229
            Self URI (journal page): https://ewic.bcs.org/
            Categories
            Electronic Workshops in Computing

            Applied computer science, Computer science, Security & Cryptology, Graphics & Multimedia design, General computer science, Human-computer interaction
            Human-Computer Interaction, Eye-gaze Interaction, Machine Learning, Cognitive Computing

