
      Integration of egocentric and allocentric information during memory-guided reaching to images of a natural environment


          Abstract

When interacting with our environment, we generally make use of egocentric and allocentric object information by coding object positions relative to the observer or relative to the environment, respectively. Bayesian theories suggest that the brain integrates both sources of information optimally for perception and action. However, experimental evidence for egocentric and allocentric integration is sparse, and the question has only been studied with abstract stimuli lacking ecological relevance. Here, we investigated the use of egocentric and allocentric information during memory-guided reaching to images of naturalistic scenes. Participants encoded a breakfast scene containing six objects on a table (local objects) and three objects in the environment (global objects). After a 2 s delay, a visual test scene reappeared for 1 s in which one local object was missing (the reach target) and, of the remaining objects, one, three, or five local objects or one of the global objects were shifted to the left or to the right. The offset of the test scene prompted participants to reach to the remembered target as precisely as possible. Only local objects served as potential reach targets and were thus task-relevant. If participants relied solely on egocentric coding of object position, we predicted accurate reaching despite the shifts; if allocentric information was used for movement planning, we predicted systematic shifts of reach endpoints. We found that reaching movements were strongly affected by allocentric shifts: endpoint errors deviated in the direction of the object shifts and increased with the number of local objects shifted. No effect occurred when only one local or one global object was shifted. Our findings suggest that allocentric cues are indeed used by the brain for memory-guided reaching towards targets in naturalistic visual scenes. Moreover, the integration of egocentric and allocentric object information seems to depend on the extent of changes in the scene.
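
The Bayesian integration idea above has a simple quantitative reading: if the remembered egocentric target position and the allocentrically coded position are combined with weights w_ego and w_allo (summing to 1), then shifting the allocentric landmarks by Δ should shift reach endpoints by roughly w_allo · Δ, while a purely egocentric strategy (w_allo = 0) predicts no endpoint error. The Python sketch below only illustrates this weighted-combination logic; the weight and shift values are hypothetical and are not parameters reported in the study.

```python
def predicted_endpoint(target_ego, landmark_shift, w_allo):
    """Predicted reach endpoint under a weighted combination of egocentric
    and allocentric target coding (illustrative model, not the fitted one).

    target_ego     -- remembered target position in egocentric coordinates (cm)
    landmark_shift -- horizontal shift applied to the allocentric landmarks (cm)
    w_allo         -- weight given to allocentric information (0 = purely egocentric)
    """
    # Pure egocentric coding ignores the landmark shift entirely;
    # any allocentric contribution pulls the endpoint in the shift direction.
    return (1.0 - w_allo) * target_ego + w_allo * (target_ego + landmark_shift)

# Hypothetical example: landmarks shifted 2 cm to the right.
for w_allo in (0.0, 0.4, 1.0):
    print(w_allo, predicted_endpoint(target_ego=0.0, landmark_shift=2.0, w_allo=w_allo))
```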


Most cited references (37)

          Flexible strategies for sensory integration during motor planning.

          When planning target-directed reaching movements, human subjects combine visual and proprioceptive feedback to form two estimates of the arm's position: one to plan the reach direction, and another to convert that direction into a motor command. These position estimates are based on the same sensory signals but rely on different combinations of visual and proprioceptive input, suggesting that the brain weights sensory inputs differently depending on the computation being performed. Here we show that the relative weighting of vision and proprioception depends both on the sensory modality of the target and on the information content of the visual feedback, and that these factors affect the two stages of planning independently. The observed diversity of weightings demonstrates the flexibility of sensory integration and suggests a unifying principle by which the brain chooses sensory inputs so as to minimize errors arising from the transformation of sensory signals between coordinate frames.
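
The weighting of vision and proprioception described in this reference is commonly formalized as inverse-variance (maximum-likelihood) cue combination: each cue is weighted by its reliability, and the combined estimate is never less reliable than either cue alone. Below is a minimal sketch with hypothetical noise values; it is not the model fitted in that paper.

```python
def combine_estimates(x_vis, var_vis, x_prop, var_prop):
    """Reliability-weighted combination of a visual and a proprioceptive
    estimate of hand position (standard inverse-variance weighting)."""
    w_vis = (1.0 / var_vis) / (1.0 / var_vis + 1.0 / var_prop)
    x_hat = w_vis * x_vis + (1.0 - w_vis) * x_prop
    # The combined variance is smaller than either cue's variance.
    var_hat = 1.0 / (1.0 / var_vis + 1.0 / var_prop)
    return x_hat, var_hat

# Hypothetical numbers: vision is more reliable than proprioception,
# so the combined estimate lies closer to the visual one.
print(combine_estimates(x_vis=10.0, var_vis=1.0, x_prop=14.0, var_prop=4.0))
```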

            Direct visuomotor transformations for reaching.

            The posterior parietal cortex (PPC) is thought to have a function in the sensorimotor transformations that underlie visually guided reaching, as damage to the PPC can result in difficulty reaching to visual targets in the absence of specific visual or motor deficits. This function is supported by findings that PPC neurons in monkeys are modulated by the direction of hand movement, as well as by visual, eye position and limb position signals. The PPC could transform visual target locations from retinal coordinates to hand-centred coordinates by combining sensory signals in a serial manner to yield a body-centred representation of target location, and then subtracting the body-centred location of the hand. We report here that in dorsal area 5 of the PPC, remembered target locations are coded with respect to both the eye and hand. This suggests that the PPC transforms target locations directly between these two reference frames. Data obtained in the adjacent parietal reach region (PRR) indicate that this transformation may be achieved by vectorially subtracting hand location from target location, with both locations represented in eye-centred coordinates.
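
The proposed PRR computation amounts to a single vector operation: the reach vector is the target location minus the hand location, with both expressed in eye-centred coordinates. A minimal sketch with made-up coordinates:

```python
def movement_vector(target_eye, hand_eye):
    """Eye-centred vector subtraction: reach vector = target - hand,
    with both locations expressed relative to current gaze direction."""
    return tuple(t - h for t, h in zip(target_eye, hand_eye))

# Hypothetical eye-centred positions (degrees of visual angle):
print(movement_vector(target_eye=(5.0, 2.0), hand_eye=(-3.0, 0.0)))  # -> (8.0, 2.0)
```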

              In what ways do eye movements contribute to everyday activities?

              Two recent studies have investigated the relations of eye and hand movements in extended food preparation tasks, and here the results are compared. The tasks could be divided into a series of actions performed on objects. The eyes usually reached the next object in the sequence before any sign of manipulative action, indicating that eye movements are planned into the motor pattern and lead each action. The eyes usually fixated the same object throughout the action upon it, although they often moved on to the next object in the sequence before completion of the preceding action. The specific roles of individual fixations could be identified as locating (establishing the locations of objects for future use), directing (establishing target direction prior to contact), guiding (supervising the relative movements of two or three objects) and checking (establishing whether some particular condition is met, prior to the termination of an action). It is argued that, at the beginning of each action, the oculomotor system is supplied with the identity of the required object, information about its location, and instructions about the nature of the monitoring required during the action. The eye movements during this kind of task are nearly all to task-relevant objects, and thus their control is seen as primarily 'top-down', and influenced very little by the 'intrinsic salience' of objects.

                Author and article information

                Contributors
Journal
Frontiers in Human Neuroscience (Front. Hum. Neurosci.)
Frontiers Media S.A.
ISSN: 1662-5161
25 August 2014, Volume 8, Article 636
                Affiliations
1. Department of Experimental Psychology, Justus-Liebig-University Giessen, Germany
2. Canadian Action and Perception Network (CAPnet), Centre for Neuroscience Studies, Queen's University, Kingston, ON, Canada
                Author notes

                Edited by: Simona Monaco, York University, Canada

                Reviewed by: Alexander Gail, German Primate Center, Germany; Constanze Hesse, University of Aberdeen, UK

*Correspondence: Katja Fiehler, Experimental Psychology, Justus-Liebig-University, Otto-Behaghel-Str. 10F, D-35394 Giessen, Germany; e-mail: Katja.fiehler@psychol.uni-giessen.de

                This article was submitted to the journal Frontiers in Human Neuroscience.

Article
DOI: 10.3389/fnhum.2014.00636
PMCID: PMC4141549
PMID: 25202252
                Copyright © 2014 Fiehler, Wolf, Klinghammer and Blohm.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

History
Received: 09 May 2014
Accepted: 30 July 2014
                Page count
                Figures: 8, Tables: 2, Equations: 0, References: 45, Pages: 12, Words: 8889
                Categories
                Neuroscience
                Original Research Article

                Neurosciences
reference frame, reaching, natural scene, allocentric information, egocentric information, human
