
      Exodex Adam—A Reconfigurable Dexterous Haptic User Interface for the Whole Hand




          Applications for dexterous robot teleoperation and immersive virtual reality are growing. Haptic user input devices need to allow the user to intuitively command and seamlessly “feel” the environment they work in, whether virtual or a remote site through an avatar. We introduce the DLR Exodex Adam, a reconfigurable, dexterous, whole-hand haptic input device. The device comprises multiple modular, three degrees of freedom (3-DOF) robotic fingers, whose placement on the device can be adjusted to optimize manipulability for different user hand sizes. Additionally, the device is mounted on a 7-DOF robot arm to increase the user’s workspace. Exodex Adam uses a front-facing interface, with robotic fingers coupled to two of the user’s fingertips, the thumb, and two points on the palm. Including the palm, as opposed to only the fingertips as is common in existing devices, enables accurate tracking of the whole hand without additional sensors such as a data glove or motion capture. By providing “whole-hand” interaction with omnidirectional force-feedback at the attachment points, we enable the user to experience the environment with the complete hand instead of only the fingertips, thus realizing deeper immersion. Interaction using Exodex Adam can range from palpation of objects and surfaces to manipulation using both power and precision grasps, all while receiving haptic feedback. This article details the concept and design of the Exodex Adam, as well as use cases where it is deployed with different command modalities. These include mixed-media interaction in a virtual environment, gesture-based telemanipulation, and robotic hand–arm teleoperation using adaptive model-mediated teleoperation. Finally, we share the insights gained during our development process and use case deployments.

          Most cited references (70)


          Manipulability of Robotic Mechanisms


            The GRASP Taxonomy of Human Grasp Types


              Four channels mediate the mechanical aspects of touch.

              Although previous physiological and anatomical experiments have identified four afferent fiber types (PC, RA, SA II, and SA I) in glabrous (nonhairy) skin of the human somatosensory periphery, only three have been shown to mediate tactile (mechanoreceptive) sensation. Psychophysical evidence that four channels (P, NP I, NP II, and NP III) do, indeed, participate in the perceptual process is presented. In a series of experiments involving selective masking of the various channels, modification of the skin-surface temperature, and testing cutaneous sensitivity down to very low-vibratory frequencies, the fourth psychophysical channel (NP III) is defined. Based on these experiments and previous work from our laboratory, it is concluded that the four channels work in conjunction at threshold to create an operating range for the perception of vibration that extends from at least 0.4 to greater than 500 Hz. Each of the four channels appears to mediate specific portions of the overall threshold-frequency characteristic. Selection of appropriate neural-response criteria from previously published physiological data and correlation of their derived frequency characteristics with the four psychophysical channels indicates that each channel has its own physiological substrate: P channel and PC fibers, NP I channel and RA fibers, NP II channel and SA II fibers, and NP III channel and SA I fibers. These channels partially overlap in their absolute sensitivities, making it likely that suprathreshold stimuli may activate two or more of the channels at the same time. Thus the perceptual qualities of touch may be determined by the combined inputs from four channels.

                Author and article information

                Frontiers in Robotics and AI
                Frontiers Media S.A.
                03 March 2022
                Volume 8, Article 716598
                1. Institute of Robotics and Mechatronics, German Aerospace Center (DLR), Wessling, Germany
                2. Faculty of Mechanical Engineering, Munich University of Applied Science, Munich, Germany
                3. Faculty of Informatics, Technical University of Munich, Munich, Germany
                4. Department of Informatics, Faculty of Mathematics, Informatics and Natural Science, University of Hamburg, Hamburg, Germany
                5. Department of Electrical Engineering, Chalmers University of Technology, Göteborg, Sweden
                6. Department of Computer Science and Electrical Engineering, Stanford University, Stanford, CA, United States
                Author notes

                Edited by: Manivannan Muniyandi, Indian Institute of Technology Madras, India

                Reviewed by: Abhishek Gupta, Indian Institute of Technology Bombay, India

                Abhijit Biswas, Indian Institute of Science (IISc), India

                *Correspondence: Neal Y. Lii, neal.lii@dlr.de
                † Present address: Georg Stillfried, Agile Robots AG, Wessling, Germany; Hadi Beik-Mohammadi, Bosch Center for Artificial Intelligence, Renningen, Germany

                This article was submitted to Haptics, a section of the journal Frontiers in Robotics and AI

                Copyright © 2022 Lii, Pereira, Dietl, Stillfried, Schmidt, Beik-Mohammadi, Baker, Maier, Pleintinger, Chen, Elawad, Mentzer , Pineault, Reisich and Albu-Schäffer.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

                Received: 10 June 2021
                Accepted: 24 December 2021
                Robotics and AI
                Original Research

                haptic user interface, hand exoskeletons, human–machine interface (HMI), human–robot interface (HRI), teleoperation

