Is Open Access

An Eye-gaze Oriented Context Based Interaction Paradigm Design

Proceedings of the 32nd International BCS Human Computer Interaction Conference (HCI)

Human Computer Interaction Conference

4 - 6 July 2018

Keywords: Human-Computer Interaction, Eye-gaze Interaction, Machine Learning, Cognitive Computing


      Abstract

The motion state of the human eye and the content it attends to can express a person's cognitive status in a given situation. When observing the surroundings, the eyes make different movements to interact with the observed objects, reflecting the observer's attention and interests. Currently, most eye-gaze interactions lack context information from the environment. To investigate people's cognitive awareness when they perform eye-gaze interactions with their surroundings, we analyse the composition of the environment and divide its essential factors into interactive subject, interactive object and context. An eye-object movement attention model and an eye-object feature preference model are constructed to capture differences in people's attention and preferences as they interact through eye gaze with different objects in different contexts, and thereby to predict their behavioural intentions. An eye-gaze oriented, context based interaction paradigm is then designed to explain the relationships among eye movement, eye-gaze interaction and behavioural intention when people perform eye-gaze interactions in different environments. The paradigm describes eye-gaze interaction patterns and people's cognitive behavioural intentions in different context based environments, and can dynamically adapt its intention predictions to interact appropriately with multiple interfaces, such as games, PC systems, social robots and other HCI and HRI applications, serving as one computable modality of cognitive computing.
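The paper does not publish its model code, but the core idea of the attention model — aggregating gaze fixations per interactive object and inferring the intended target — can be sketched minimally. The function names, the `(object_id, duration_ms)` fixation format and the dominance threshold below are illustrative assumptions, not the authors' implementation:

```python
from collections import defaultdict

def attention_scores(fixations):
    """Aggregate gaze fixation durations per interactive object.

    fixations: list of (object_id, duration_ms) samples, e.g. from an
    eye tracker. Returns each object's share of total gaze time.
    (Hypothetical data format; the paper's models also use context
    and object features, which are omitted here.)
    """
    totals = defaultdict(float)
    for obj, duration in fixations:
        totals[obj] += duration
    grand = sum(totals.values()) or 1.0
    return {obj: t / grand for obj, t in totals.items()}

def predict_intent(fixations, threshold=0.5):
    """Return the dominant object as the predicted behavioural
    intention target, or None if no object clearly dominates."""
    scores = attention_scores(fixations)
    obj, share = max(scores.items(), key=lambda kv: kv[1])
    return obj if share >= threshold else None
```

For example, a user whose gaze dwells mostly on one interface element would be predicted to intend interaction with it: `predict_intent([("button", 800), ("menu", 150), ("button", 400), ("text", 100)])` returns `"button"`.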


            Author and article information

            Affiliations
            Software School, Xiamen University
            Quanzhou Institute of Equipment Manufacturing CAS
Conference
July 2018
Pages: 1-4
DOI: 10.14236/ewic/HCI2018.229
© He et al. Published by BCS Learning and Development Ltd. Proceedings of British HCI 2018, Belfast, UK.

This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

Proceedings of the 32nd International BCS Human Computer Interaction Conference (HCI 32)
Belfast, UK, 4-6 July 2018
Electronic Workshops in Computing (eWiC), ISSN 1477-9358
BCS Learning & Development
Journal page: https://ewic.bcs.org/
Categories: Electronic Workshops in Computing
