      Is Open Access

      An Eye-gaze Oriented Context Based Interaction Paradigm Design


      Proceedings of the 32nd International BCS Human Computer Interaction Conference (HCI)

      Human Computer Interaction Conference

      4 - 6 July 2018

      Human-Computer Interaction, Eye-gaze Interaction, Machine Learning, Cognitive Computing

      Abstract


          The human eye's state of motion and content of interest can express people's cognitive status in a given situation. When observing the surroundings, the human eyes make different movements to interact with the observed objects, which reflects people's attention and intentions of interest. Currently, most eye-gaze interactions lack context information from the environment. To investigate people's cognitive awareness when they perform eye-gaze interactions with their surroundings, we analyse the composition of the environment and divide its essential factors into interactive subject, interactive object and context. An eye-object movement attention model and an eye-object feature preference model are constructed to capture people's attention and preference diversities through eye-gaze interaction with different interactive objects in different contexts, and furthermore to predict their behavioural intentions. Then, an eye-gaze oriented, context based interaction paradigm is designed to explain the relationships among eye movement, eye-gaze interaction and people's behavioural intentions when they perform eye-gaze interactions in different environments. The paradigm captures the eye-gaze interaction patterns and people's cognitive behavioural intentions in different context based environments, and can dynamically adapt the intention prediction results to interact with multiple interfaces, such as games, PC systems, social robots and other HCI and HRI applications, serving as one of the computable modals of cognitive computing.
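          The abstract's two models can be pictured with a minimal sketch. This is an illustrative reading, not the paper's implementation: the `GazeEvent` record, the per-object fixation aggregation standing in for the eye-object movement attention model, and the per-context selection standing in for the eye-object feature preference model are all hypothetical names and simplifications.

```python
from dataclasses import dataclass

@dataclass
class GazeEvent:
    """One fixation on an interactive object within a context (hypothetical schema)."""
    object_id: str   # the interactive object being looked at
    duration: float  # fixation duration in seconds
    context: str     # environment context, e.g. "driving" or "gaming"

def attention_scores(events):
    """Attention-model stand-in: total fixation time per interactive object."""
    scores = {}
    for e in events:
        scores[e.object_id] = scores.get(e.object_id, 0.0) + e.duration
    return scores

def preferred_object(events, context):
    """Preference-model stand-in: the most-attended object within one context."""
    in_context = [e for e in events if e.context == context]
    if not in_context:
        return None
    scores = attention_scores(in_context)
    return max(scores, key=scores.get)

events = [
    GazeEvent("dashboard", 0.4, "driving"),
    GazeEvent("mirror", 0.9, "driving"),
    GazeEvent("avatar", 1.2, "gaming"),
]
print(preferred_object(events, "driving"))  # → mirror
```

          In this reading, an intention predictor would consume the per-context preferences and map them to interface actions (game, PC system, social robot); that mapping is the part the paradigm leaves to the target application.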

          Related collections

          Most cited references: 8


          A Gaze-Contingent Adaptive Virtual Reality Driving Environment for Intervention in Individuals with Autism Spectrum Disorders


            Eye-Gaze Tracking Analysis of Driver Behavior While Interacting With Navigation Systems in an Urban Area


              “Behavior-intention analysis and human-aware computing: Case study and discussion,”


                Author and article information

                July 2018
                Pages: 1-4
                Software School, Xiamen University
                Quanzhou Institute of Equipment Manufacturing CAS
                © He et al. Published by BCS Learning and Development Ltd. Proceedings of British HCI 2018. Belfast, UK.

                This work is licensed under a Creative Commons Attribution 4.0 Unported License.

                Proceedings of the 32nd International BCS Human Computer Interaction Conference
                Belfast, UK
                4 - 6 July 2018
                Electronic Workshops in Computing (eWiC)
                Human Computer Interaction Conference
                Product Information: ISSN 1477-9358, BCS Learning & Development

