
      Help! I Need a Remote Guide in My Mixed Reality Collaborative Environment

      research-article


          Abstract

          The help of a remote expert in performing a maintenance task can be useful in many situations, and can save time as well as money. In this context, augmented reality (AR) technologies can improve remote guidance thanks to the direct overlay of 3D information onto the real world. Furthermore, virtual reality (VR) enables a remote expert to virtually share the place in which the physical maintenance is being carried out. In a traditional local collaboration, collaborators are face-to-face and are observing the same artifact, while being able to communicate verbally and use body language, such as gaze direction or facial expression. These interpersonal communication cues are usually limited in remote collaborative maintenance scenarios, in which the agent uses an AR setup while the remote expert uses VR. Providing users with adapted interaction and awareness features to compensate for the lack of essential communication signals is therefore a real challenge for remote MR collaboration. However, this context offers new opportunities for augmenting collaborative abilities, such as sharing an identical point of view, which is not possible in real life. Based on the current task of the maintenance procedure, such as navigation to the correct location or physical manipulation, the remote expert may choose to freely control his/her own viewpoint of the distant workspace, or instead may need to share the viewpoint of the agent in order to better understand the current situation. In this work, we first focus on the navigation task, which is essential to complete the diagnostic phase and to begin the maintenance task in the correct location. We then present a novel interaction paradigm, implemented in an early prototype, in which the guide can show the operator the manipulation gestures required to achieve a physical task that is necessary to perform the maintenance procedure. These concepts are evaluated, allowing us to provide guidelines for future systems targeting efficient remote collaboration in MR environments.
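As a rough illustration of the viewpoint-control idea described in the abstract, the following sketch shows how a VR expert could either drive a free camera through the shared workspace or lock the camera to the AR agent's head pose. This is not the authors' implementation; all class and parameter names are hypothetical.

```python
from dataclasses import dataclass
from enum import Enum, auto


class ViewpointMode(Enum):
    FREE = auto()    # the expert navigates the reconstructed workspace independently
    SHARED = auto()  # the expert's camera follows the agent's AR head pose


@dataclass
class Pose:
    position: tuple  # (x, y, z) in the shared workspace frame
    rotation: tuple  # unit quaternion (x, y, z, w)


class ExpertCamera:
    """Resolves the remote expert's viewpoint each frame (illustrative only)."""

    def __init__(self) -> None:
        self.mode = ViewpointMode.FREE
        self.free_pose = Pose((0.0, 1.6, 2.0), (0.0, 0.0, 0.0, 1.0))

    def update(self, agent_head_pose: Pose, expert_input_pose: Pose) -> Pose:
        if self.mode is ViewpointMode.SHARED:
            # Share the agent's point of view to better understand the current situation.
            return agent_head_pose
        # Otherwise the expert freely controls his/her own viewpoint.
        self.free_pose = expert_input_pose
        return self.free_pose
```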


Most cited references: 31


          A taxonomy of mixed reality visual displays


            A Descriptive Framework of Workspace Awareness for Real-Time Groupware


              Real-time inverse kinematics techniques for anthropomorphic limbs.

In this paper we develop a set of inverse kinematics algorithms suitable for an anthropomorphic arm or leg. We use a combination of analytical and numerical methods to solve generalized inverse kinematics problems including position, orientation, and aiming constraints. Our combination of analytical and numerical methods results in faster and more reliable algorithms than conventional inverse Jacobian and optimization-based techniques. Additionally, unlike conventional numerical algorithms, our methods allow the user to interactively explore all possible solutions using an intuitive set of parameters that define the redundancy of the system. © 2000 Academic Press.
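The cited work combines analytical and numerical solvers for full anthropomorphic limbs with position, orientation, and aiming constraints. As a much simpler, purely illustrative sketch of the analytical flavor of inverse kinematics (not the paper's algorithm), the closed-form solution for a planar two-link arm can be written as follows:

```python
import math


def two_link_ik(x: float, y: float, l1: float, l2: float, elbow_up: bool = True):
    """Closed-form IK for a planar 2-link arm reaching target (x, y).

    Returns joint angles (theta1, theta2) in radians, or None if the target
    is out of reach. Illustrative only; the cited work handles the full
    anthropomorphic case with additional constraints and redundancy control.
    """
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle.
    cos_t2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= cos_t2 <= 1.0:
        return None  # target unreachable
    t2 = math.acos(cos_t2)
    if not elbow_up:
        t2 = -t2
    # Shoulder angle: direction to the target minus the offset caused by the elbow bend.
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2), l1 + l2 * math.cos(t2))
    return t1, t2
```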

Author and article information

Journal: Frontiers in Robotics and AI (Front. Robot. AI)
Publisher: Frontiers Media S.A.
eISSN: 2296-9144
Published: 15 November 2019
Volume: 6
Article: 106
                Affiliations
                [1] 1IRT b<>com , Rennes, France
                [2] 2IMT Atlantique , Brest, France
                [3] 3Lab-STICC, UMR CNRS 6285 , Brest, France
                [4] 4Univ Rennes, INSA Rennes, Inria, CNRS, IRISA , Rennes, France
                Author notes

                Edited by: Bernd Froehlich, Bauhaus-Universität Weimar, Germany

                Reviewed by: Jonathan M. Aitken, University of Sheffield, United Kingdom; Marta Fairén, Universitat Politecnica de Catalunya, Spain

*Correspondence: Morgan Le Chénéchal, morgan.le.chenechal@gmail.com

                This article was submitted to Virtual Environments, a section of the journal Frontiers in Robotics and AI

Article identifiers
DOI: 10.3389/frobt.2019.00106
PMCID: PMC7805679
PMID: 33501121
                Copyright © 2019 Le Chénéchal, Duval, Gouranton, Royan and Arnaldi.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

History
Received: 28 September 2018
Accepted: 10 October 2019
Counts
                Figures: 17, Tables: 0, Equations: 0, References: 31, Pages: 16, Words: 10548
                Categories
                Robotics and AI
                Original Research

Keywords: 3D user interaction, collaborative virtual environments, virtual reality, augmented reality, mixed reality
