      Perception of Body Ownership Is Driven by Bayesian Sensory Inference

      PLoS ONE
      Public Library of Science



Recent studies have shown that human perception of body ownership is highly malleable. A well-known example is the rubber hand illusion (RHI), in which ownership of a dummy hand is experienced; the illusion is generally believed to require synchronized stroking of the real and dummy hands. Our goal was to elucidate the computational principles governing this phenomenon. We adopted the Bayesian causal inference model of multisensory perception and applied it to visual, proprioceptive, and tactile stimuli. The model reproduced the RHI, predicted that it can occur without tactile stimulation, and predicted that synchronous stroking would enhance it. Various measures of ownership across two experiments confirmed these predictions: a large percentage of individuals experienced the illusion in the absence of any tactile stimulation, and synchronous stroking strengthened the illusion. Altogether, these findings suggest that perception of body ownership is governed by Bayesian causal inference, i.e., by the same rule that appears to govern perception of the outside world.
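The Bayesian causal inference rule named in the abstract can be illustrated with a small numerical sketch. This is not the paper's implementation: it is a generic two-cue version with Gaussian likelihoods, a zero-centred Gaussian prior over hand position, and made-up noise parameters (`sig_v`, `sig_p`, `sig_0`, and `prior_c` are all illustrative assumptions). It returns the posterior probability that a seen and a felt hand position share a single common cause, the quantity the model links to the sense of ownership.

```python
import math

def gauss(x, mu, var):
    """Gaussian density N(x; mu, var)."""
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2 * math.pi * var)

def p_common(x_v, x_p, sig_v=1.0, sig_p=1.0, sig_0=10.0, prior_c=0.5):
    """Posterior probability that visual and proprioceptive position cues
    x_v and x_p were generated by a single common cause (toy parameters)."""
    var_v, var_p, var_0 = sig_v ** 2, sig_p ** 2, sig_0 ** 2
    # Common cause: integrate the true position out of the product of the
    # two likelihoods and the prior N(0, var_0); closed form for Gaussians.
    denom = var_v * var_p + var_v * var_0 + var_p * var_0
    like_c1 = math.exp(-0.5 * ((x_v - x_p) ** 2 * var_0
                               + x_v ** 2 * var_p
                               + x_p ** 2 * var_v) / denom) \
              / (2 * math.pi * math.sqrt(denom))
    # Independent causes: each cue is marginalized separately.
    like_c2 = gauss(x_v, 0.0, var_v + var_0) * gauss(x_p, 0.0, var_p + var_0)
    return like_c1 * prior_c / (like_c1 * prior_c + like_c2 * (1 - prior_c))

# Nearby cues favour a common cause (the hand would feel "owned")...
print(p_common(0.5, 0.0) > 0.5)   # True
# ...while widely separated cues favour independent causes.
print(p_common(8.0, 0.0) < 0.5)   # True
```

With these toy settings no tactile input is needed for the common-cause posterior to dominate, which mirrors the paper's prediction that the illusion can arise from vision and proprioception alone.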

Most cited references (18)


          Touching a rubber hand: feeling of body ownership is associated with activity in multisensory brain areas.

In the "rubber-hand illusion," the sight of brushing of a rubber hand at the same time as brushing of the person's own hidden hand is sufficient to produce a feeling of ownership of the fake hand. We have shown previously that this illusion is associated with activity in multisensory areas, most notably the ventral premotor cortex (Ehrsson et al., 2004). However, it remains to be demonstrated that this illusion does not simply reflect the dominant role of vision and that the premotor activity does not reflect a visual representation of an object near the hand. To address these issues, we introduce a somatic rubber-hand illusion. The experimenter moved the blindfolded participant's left index finger so that it touched the fake hand, and simultaneously, he touched the participant's real right hand, synchronizing the touches as perfectly as possible. After approximately 9.7 s, this stimulation elicited an illusion that one was touching one's own hand. We scanned brain activity during this illusion and two control conditions, using functional magnetic resonance imaging. Activity in the ventral premotor cortices, intraparietal cortices, and the cerebellum was associated with the illusion of touching one's own hand. Furthermore, the rated strength of the illusion correlated with the degree of premotor and cerebellar activity. This finding suggests that the activity in these areas reflects the detection of congruent multisensory signals from one's own body, rather than of visual representations. We propose that this could be the mechanism for the feeling of body ownership.

            On the other hand: dummy hands and peripersonal space.

            Where are my hands? The brain can answer this question using sensory information arising from vision, proprioception, or touch. Other sources of information about the position of our hands can be derived from multisensory interactions (or potential interactions) with our close environment, such as when we grasp or avoid objects. The pioneering study of multisensory representations of peripersonal space was published in Behavioural Brain Research almost 30 years ago [Rizzolatti G, Scandolara C, Matelli M, Gentilucci M. Afferent properties of periarcuate neurons in macaque monkeys. II. Visual responses. Behav Brain Res 1981;2:147-63]. More recently, neurophysiological, neuroimaging, neuropsychological, and behavioural studies have contributed a wealth of evidence concerning hand-centred representations of objects in peripersonal space. This evidence is examined here in detail. In particular, we focus on the use of artificial dummy hands as powerful instruments to manipulate the brain's representation of hand position, peripersonal space, and of hand ownership. We also review recent studies of the 'rubber hand illusion' and related phenomena, such as the visual capture of touch, and the recalibration of hand position sense, and discuss their findings in the light of research on peripersonal space. Finally, we propose a simple model that situates the 'rubber hand illusion' in the neurophysiological framework of multisensory hand-centred representations of space.

              Integration of proprioceptive and visual position-information: An experimentally supported model.

              To localize one's hand, i.e., to find out its position with respect to the body, humans may use proprioceptive information or visual information or both. It is still not known how the CNS combines simultaneous proprioceptive and visual information. In this study, we investigate in what position in a horizontal plane a hand is localized on the basis of simultaneous proprioceptive and visual information and compare this to the positions in which it is localized on the basis of proprioception only and vision only. Seated at a table, subjects matched target positions on the table top with their unseen left hand under the table. The experiment consisted of three series. In each of these series, the target positions were presented in three conditions: by vision only, by proprioception only, or by both vision and proprioception. In one of the three series, the visual information was veridical. In the other two, it was modified by prisms that displaced the visual field to the left and to the right, respectively. The results show that the mean of the positions indicated in the condition with both vision and proprioception generally lies off the straight line through the means of the other two conditions. In most cases the mean lies on the side predicted by a model describing the integration of multisensory information. According to this model, the visual information and the proprioceptive information are weighted with direction-dependent weights, the weights being related to the direction-dependent precision of the information in such a way that the available information is used very efficiently. Because the proposed model also can explain the unexpectedly small sizes of the variable errors in the localization of a seen hand that were reported earlier, there is strong evidence to support this model. The results imply that the CNS has knowledge about the direction-dependent precision of the proprioceptive and visual information.
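The weighting scheme this abstract describes is the standard precision-weighted (minimum-variance) combination of cues. A minimal one-dimensional sketch, assuming a single fixed variance per cue (the paper's weights are direction-dependent, which this toy version omits):

```python
def fuse(x_v, x_p, var_v, var_p):
    """Minimum-variance fusion of a visual estimate x_v and a proprioceptive
    estimate x_p of hand position: each cue is weighted by its precision
    (inverse variance), and the fused variance is never larger than either
    input variance."""
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_p)
    x_hat = w_v * x_v + (1.0 - w_v) * x_p
    var_hat = 1.0 / (1.0 / var_v + 1.0 / var_p)
    return x_hat, var_hat

# Equally reliable cues: the fused estimate is the midpoint.
print(fuse(0.0, 2.0, 1.0, 1.0))   # (1.0, 0.5)
# Vision more reliable: the estimate is pulled toward the visual cue.
print(fuse(0.0, 2.0, 1.0, 3.0))   # ≈ (0.5, 0.75)
```

The drop in fused variance below either single-cue variance is what the abstract means by the available information being "used very efficiently."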

                Author and article information

                Role: Academic Editor
PLoS ONE
Public Library of Science (San Francisco, CA, USA)
6 February 2015
Volume 10, Issue 2
                [1 ]Department of Psychology, University of California, Los Angeles, CA, USA
                [2 ]Department of Bioengineering, University of California, Los Angeles, CA, USA
Duke University, United States
                Author notes

                Competing Interests: The authors have declared that no competing interests exist.

                Conceived and designed the experiments: MS AJC LS. Performed the experiments: MS AJC. Analyzed the data: MS AJC. Contributed reagents/materials/analysis tools: MS AJC LS. Wrote the paper: MS AJC LS. Developed the computational model: MS LS. Programmed the experiment and performed simulations: MS.


This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

                Page count
                Figures: 11, Tables: 0, Pages: 23
                The authors have no support or funding to report.
                Research Article
                Custom metadata
                All relevant data are within the paper and its Supporting Information files.


