
      SARAFun, Smart Assembly Robot with Advanced FUNctionalities, H2020




          Abstract

While industrial robots are very successful in many areas of industrial manufacturing, assembly automation still suffers from complex, time-consuming programming and the need for dedicated hardware. ABB has developed YuMi, a collaborative, inherently safe assembly robot that is expected to reduce integration costs significantly by offering a standardized hardware setup and simple fitting of the robot into existing workplaces. Internal pilot testing at ABB has, however, shown that when YuMi is programmed with traditional methods, the programming time even for simple assembly tasks remains very long.

The SARAFun project has been formed to enable a non-expert user to integrate a new bi-manual assembly task on a YuMi robot in less than a day. This will be accomplished by augmenting the YuMi robot with cutting-edge sensory and cognitive abilities, as well as the reasoning abilities required to plan and execute an assembly task. The overall conceptual approach is that the robot should be capable of learning and executing assembly tasks in a human-like manner. Studies will be made to understand how human assembly workers learn and perform assembly tasks. The human performance will be modelled and transferred to the YuMi robot as assembly skills. The robot will learn assembly tasks, such as insertion or folding, by observing the task being performed by a human instructor. The robot will then analyze the task and generate an assembly program, including exception handling, and design 3D-printable fingers tailored for gripping the parts at hand. Aided by the human instructor, the robot will finally learn to perform the actual assembly task, relying on sensory feedback from vision, force and tactile sensing, as well as physical human-robot interaction. During this phase the robot will gradually improve its understanding of the assembly at hand until it is capable of performing the assembly in a fast and robust manner.
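The pipeline described in the abstract (observe a demonstration, segment it into assembly skills such as insertion, then refine execution against sensory feedback) can be illustrated with a minimal sketch. This is a hypothetical toy illustration, not SARAFun's actual method: all class names, the force-threshold segmentation, and the feedback-refinement rule are assumptions made for the example.

```python
# Hypothetical sketch of a learn-by-demonstration loop: segment a
# demonstrated force trace into contact-rich "insertion" skills, then
# refine a learned target force with measured sensor feedback.
# Names and logic are illustrative only, not part of SARAFun.
from dataclasses import dataclass, field


@dataclass
class Skill:
    name: str            # e.g. "insertion" or "folding"
    target_force: float  # contact force (N) learned from the demonstration


@dataclass
class AssemblyProgram:
    skills: list = field(default_factory=list)


def segment_demonstration(force_trace, threshold=5.0):
    """Naively split a demonstrated force trace into free motion and
    contact-rich segments; each contact onset becomes one skill."""
    program = AssemblyProgram()
    in_contact = False
    for f in force_trace:
        if f > threshold and not in_contact:
            program.skills.append(Skill("insertion", target_force=f))
            in_contact = True
        elif f <= threshold:
            in_contact = False
    return program


def refine(skill, measured_force, gain=0.5):
    """One feedback step: move the learned target force toward what
    the force/tactile sensors actually measured during execution."""
    skill.target_force += gain * (measured_force - skill.target_force)
    return skill.target_force


# A toy force profile with two contact phases -> two insertion skills.
program = segment_demonstration([0.1, 0.2, 8.0, 9.0, 0.3, 7.5, 0.2])
print(len(program.skills))              # -> 2
print(refine(program.skills[0], 10.0))  # -> 9.0
```

In this toy model, repeated executions with `refine` would gradually pull the learned parameters toward the sensed reality, loosely mirroring the abstract's "gradually improve its understanding" phase.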


          Author and article information

Journal: Impact
Publisher: Science Impact, Ltd.
ISSN: 2398-7073
Published: June 14, 2017
Volume: 2017
Issue: 5
Pages: 67-69
DOI: 10.21820/23987073.2017.5.67
© 2017

This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

          Earth & Environmental sciences, Medicine, Computer science, Agriculture, Engineering
