      Assumptions about the positioning of virtual stimuli affect gaze direction estimates during Augmented Reality based interactions

      research-article


          Abstract

          We investigated gaze direction determination in dyadic interactions mediated by an Augmented Reality (AR) head-mounted display. With AR, virtual content is overlaid on top of the real-world scene, offering unique data visualization and interaction opportunities. A drawback of AR, however, is the uncertainty it introduces regarding the AR user’s focus of attention in social-collaborative settings: an AR user looking in our direction might be paying attention either to us or to augmentations positioned somewhere in between. In two psychophysical experiments, we assessed what impact assumptions concerning the positioning of virtual content attended by an AR user have on other people’s sensitivity to their gaze direction. In the first experiment, we found that gaze discrimination was better when the participant was aware that the AR user was focusing on stimuli positioned on the participant’s own depth plane, as opposed to stimuli positioned halfway between the AR user and the participant. In the second experiment, we found that this modulatory effect was explained by participants’ assumptions concerning which plane the AR user was focusing on, irrespective of whether those assumptions were correct. We discuss the significance of reduced gaze determination in AR-mediated social-collaborative settings, as well as theoretical implications regarding the impact of this technology on social behaviour.
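
          (Illustrative note: the article’s own analysis code and data are not reproduced on this page. As a minimal sketch of how “sensitivity to gaze direction” is typically quantified in a psychophysical discrimination task, the Python snippet below fits a cumulative-Gaussian psychometric function to response proportions; every angle, data value and parameter name in it is hypothetical and is not taken from this study.)

          # Hedged sketch: generic psychometric-function fit for a gaze-discrimination task.
          # All numbers below are invented for illustration; they are not the article's data.
          import numpy as np
          from scipy.optimize import curve_fit
          from scipy.stats import norm

          # Hypothetical gaze angles (degrees; negative = to the observer's left) and the
          # proportion of "gaze directed to my right" responses at each angle.
          angles = np.array([-8.0, -4.0, -2.0, 0.0, 2.0, 4.0, 8.0])
          p_right = np.array([0.02, 0.10, 0.30, 0.55, 0.75, 0.92, 0.99])

          def psychometric(x, mu, sigma):
              # Cumulative Gaussian: mu = bias (point of subjective "direct gaze"),
              # sigma = discrimination threshold (smaller sigma = higher sensitivity).
              return norm.cdf(x, loc=mu, scale=sigma)

          (mu, sigma), _ = curve_fit(psychometric, angles, p_right, p0=[0.0, 3.0])
          print(f"bias = {mu:.2f} deg, threshold = {sigma:.2f} deg")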

          Most cited references (17)


          Neural basis of eye gaze processing deficits in autism.

          Impairments in using eye gaze to establish joint attention and to comprehend the mental states and intentions of other people are striking features of autism. Here, using event-related functional MRI (fMRI), we show that in autism, brain regions involved in gaze processing, including the superior temporal sulcus (STS) region, are not sensitive to intentions conveyed by observed gaze shifts. On congruent trials, subjects watched as a virtual actor looked towards a checkerboard that appeared in her visual field, confirming the subject's expectation regarding what the actor 'ought to do' in this context. On incongruent trials, she looked towards empty space, violating the subject's expectation. Consistent with a prior report from our laboratory that used this task in neurologically normal subjects, 'errors' (incongruent trials) evoked more activity in the STS and other brain regions linked to social cognition, indicating a strong effect of intention in typically developing subjects (n = 9). The same brain regions were activated during observation of gaze shifts in subjects with autism (n = 10), but did not differentiate congruent and incongruent trials, indicating that activity in these regions was not modulated by the context of the perceived gaze shift. These results demonstrate a difference in the response of brain regions underlying eye gaze processing in autism. We conclude that lack of modulation of the STS region by gaze shifts that convey different intentions contributes to the eye gaze processing deficits associated with autism.

            Dynamic Sounds Capture the Boundaries of Peripersonal Space Representation in Humans

            Background: We physically interact with external stimuli when they occur within a limited space immediately surrounding the body, i.e., Peripersonal Space (PPS). In the primate brain, specific fronto-parietal areas are responsible for the multisensory representation of PPS, by integrating tactile, visual and auditory information occurring on and near the body. Dynamic stimuli are particularly relevant for PPS representation, as they might refer to potential harms approaching the body. However, behavioural tasks for studying PPS representation with moving stimuli are lacking. Here we propose a new dynamic audio-tactile interaction task in order to assess the extension of PPS in a more functionally and ecologically valid condition.

            Methodology/Principal Findings: Participants vocally responded to a tactile stimulus administered at the hand at different delays from the onset of task-irrelevant dynamic sounds which gave the impression of a sound source either approaching or receding from the subject’s hand. Results showed that a moving auditory stimulus speeded up the processing of a tactile stimulus at the hand as long as it was perceived at a limited distance from the hand, that is within the boundaries of PPS representation. The audio-tactile interaction effect was stronger when sounds were approaching compared to when sounds were receding.

            Conclusion/Significance: This study provides a new method to dynamically assess PPS representation: the function describing the relationship between tactile processing and the position of sounds in space can be used to estimate the location of PPS boundaries, along a spatial continuum between far and near space, in a valuable and ecologically significant way.
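
            (Illustrative note: the abstract above refers to a function relating tactile processing speed to the position of sounds in space, whose central point can serve as a PPS boundary estimate. One common way to do this, sketched below under the assumption of a sigmoidal relationship, is to fit a sigmoid to tactile reaction times as a function of sound distance from the hand and read off its inflection point; all distances, reaction times and parameter names are hypothetical and are not taken from the cited study.)

            # Hedged sketch: estimating a peripersonal-space boundary from a sigmoid fit of
            # tactile reaction time vs. sound distance. All values are invented for illustration.
            import numpy as np
            from scipy.optimize import curve_fit

            distance_cm = np.array([10.0, 25.0, 40.0, 55.0, 70.0, 85.0, 100.0])   # sound distance from the hand (cm)
            mean_rt_ms = np.array([395.0, 400.0, 412.0, 435.0, 455.0, 460.0, 462.0])  # mean tactile RT at each distance (ms)

            def sigmoid(d, rt_near, rt_far, d_central, slope):
                # RT rises from rt_near (sound close to the hand) to rt_far (sound far away);
                # d_central, the inflection point, is taken as the PPS boundary estimate.
                return rt_near + (rt_far - rt_near) / (1.0 + np.exp(-(d - d_central) / slope))

            params, _ = curve_fit(sigmoid, distance_cm, mean_rt_ms, p0=[395.0, 460.0, 50.0, 10.0])
            print(f"estimated PPS boundary ≈ {params[2]:.1f} cm from the hand")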

              Do the eyes have it? Cues to the direction of social attention


                Author and article information

                Contributors
                nicolabinetti@gmail.com
                Journal
                Scientific Reports (Sci Rep)
                Publisher: Nature Publishing Group UK (London)
                ISSN: 2045-2322
                Published: 22 February 2019
                Volume: 9
                Article number: 2566
                Affiliations
                [1] UCL Interaction Centre, University College London, London, UK
                [2] School of Biological and Chemical Sciences, Psychology, Queen Mary University of London, London, UK
                [3] Department of Computer Science, University College London, London, UK
                Author information
                ORCID: http://orcid.org/0000-0003-2846-2592
                Article
                Article ID: 39311
                DOI: 10.1038/s41598-019-39311-1
                PMC: 6384932
                PMID: 30796287
                © The Author(s) 2019

                Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.

                History
                Received: 22 June 2018
                Accepted: 21 December 2018
                Funding
                Funded by: EC | European Research Council (723737)
                Categories
                Article

