      Open Access

      Planning for robotic exploration based on forward simulation

      Preprint


          Abstract

          We address the problem of controlling a mobile robot to explore a partially known environment. The robot's objective is to maximize the amount of information collected about the environment. We formulate the problem as a partially observable Markov decision process (POMDP) with an information-theoretic objective function, and solve it by applying forward simulation algorithms with an open-loop approximation. We present a new sample-based approximation for mutual information that is useful in mobile robotics and can be seamlessly integrated with forward simulation planning algorithms. We investigate the usefulness of POMDP-based planning for exploration and, to alleviate some of its weaknesses, propose a combination with frontier-based exploration. Experimental results in simulated and real environments show that, depending on the environment, applying POMDP-based planning for exploration can improve performance over frontier exploration.
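          The open-loop forward simulation idea in the abstract can be sketched roughly as follows. This is an illustrative Python sketch, not the authors' implementation: the occupancy grid model, the square sensor footprint, and the entropy-sum score (a crude stand-in for the paper's sample-based mutual information approximation) are all simplifying assumptions. Each candidate action sequence is rolled forward through a motion and sensor model, scored by the uncertainty of the cells it would observe, and the best-scoring sequence is executed.

```python
import math

def cell_entropy(p):
    """Shannon entropy (bits) of a Bernoulli occupancy cell with P(occupied) = p."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -(p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))

def simulate_visible_cells(pose, actions, sensor_range=2):
    """Hypothetical forward model: roll an open-loop action sequence forward
    from `pose` on a grid and collect the cells each step would sense
    (here, a square footprint of half-width `sensor_range`)."""
    x, y = pose
    seen = set()
    for dx, dy in actions:
        x, y = x + dx, y + dy
        for ox in range(-sensor_range, sensor_range + 1):
            for oy in range(-sensor_range, sensor_range + 1):
                seen.add((x + ox, y + oy))
    return seen

def score_sequence(grid, pose, actions):
    """Open-loop score: total entropy of the cells the sequence would observe.
    Unknown cells default to p = 0.5 (maximum entropy), so sequences that
    sweep unexplored space score highest."""
    return sum(cell_entropy(grid.get(c, 0.5))
               for c in simulate_visible_cells(pose, actions))

def plan(grid, pose, candidates):
    """Evaluate candidate open-loop action sequences by forward simulation
    and return the highest-scoring one."""
    return max(candidates, key=lambda a: score_sequence(grid, pose, a))
```

          For example, with a grid where the cells to the robot's right are already known free, `plan` prefers a candidate sequence that heads left into unexplored space, since the cells it would observe still carry entropy.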


          Author and article information

          Article
          Dates: 2015-02-09, 2016-06-29
          DOI: 10.1016/j.robot.2016.06.008
          arXiv: 1502.02474
          ScienceOpen ID: e4102bc0-f5f6-4635-b8ab-72337d38f949

          License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/

          Custom metadata
          MSC class: 90C40
          Comments: 19 pages, 11 figures; in Robotics and Autonomous Systems (2016)
          Subjects: cs.RO, cs.SY
          Keywords: Performance, Systems & Control, Robotics
