      Peripersonal Space: An Index of Multisensory Body–Environment Interactions in Real, Virtual, and Mixed Realities


Most cited references (39)


          Coding of peripersonal space in inferior premotor cortex (area F4).

1. We studied the functional properties of neurons in the caudal part of inferior area 6 (area F4) in awake monkeys. In agreement with previous reports, we found that the large majority (87%) of neurons responded to sensory stimuli. The responsive neurons fell into three categories: somatosensory neurons (30%); visual neurons (14%); and bimodal, visual and somatosensory neurons (56%). Both somatosensory and bimodal neurons typically responded to light touch of the skin. Their receptive fields (RFs) were located on the face, neck, trunk, and arms. Approaching objects were the most effective visual stimuli. Visual RFs were mostly located in the space near the monkey (peripersonal space). Typically they extended in the space adjacent to the tactile RFs.

2. The coordinate system in which visual RFs were coded was studied in 110 neurons. In 94 neurons the RF location was independent of eye position, remaining in the same position in peripersonal space regardless of eye deviation. The RF location with respect to the monkey was not modified by changing the monkey's position in the recording room. In 10 neurons the RF location followed the eye movements, remaining in the same retinal position (retinocentric RFs). For the remaining six neurons the RF organization was not clear. We will refer to F4 neurons with RFs independent of eye position as somatocentered neurons.

3. In most somatocentered neurons (43 of 60) the background level of activity and the response to visual stimuli were not modified by changes in eye position, whereas they were modulated in the remaining 17. It is important to note that eye deviations were constantly accompanied by a synergic increase in the activity of the ipsilateral neck muscles. It is therefore not clear whether the modulation of neuron discharge depended on eye position or was a consequence of changes in neck muscle activity.

4. The effect of stimulus velocity (20-80 cm/s) on response intensity and on RF extent in depth was studied in 34 somatocentered neurons. In most neurons, increasing stimulus velocity produced an expansion in depth of the RF.

5. We conclude that space is coded differently in areas that control somatic and eye movements. We suggest that space coding in different cortical areas depends on the computational requirements of the effectors they control.
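The eye-position test described in point 2 is easy to make concrete. Below is a minimal Python sketch, purely illustrative and not the authors' analysis: the function name, the one-dimensional geometry, and the 1.0 cm tolerance are all assumptions. It classifies a visual RF as somatocentered or retinocentric by asking whether the RF center stays fixed in body coordinates, or at a fixed retinal locus, as the fixation point changes.

# Illustrative sketch (not from the paper): classify a visual RF as
# somatocentered vs. retinocentric from RF centers measured at several
# fixation points, mirroring the F4 eye-position test described above.
def classify_rf(fixations_cm, rf_centers_cm, tol_cm=1.0):
    """fixations_cm: horizontal gaze positions; rf_centers_cm: measured
    RF centers in body coordinates, one per fixation (hypothetical units)."""
    # Somatocentered: the RF center stays put in body coordinates across gaze shifts.
    spread_body = max(rf_centers_cm) - min(rf_centers_cm)
    # Retinocentric: RF center minus gaze position stays put (fixed retinal locus).
    retinal = [c - f for c, f in zip(rf_centers_cm, fixations_cm)]
    spread_retina = max(retinal) - min(retinal)
    if spread_body <= tol_cm < spread_retina:
        return "somatocentered"
    if spread_retina <= tol_cm < spread_body:
        return "retinocentric"
    return "unclear"

# Example: the RF stays near 12 cm in body coordinates while gaze shifts.
print(classify_rf([-10.0, 0.0, 10.0], [12.1, 12.0, 11.8]))  # somatocentered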

            Visuospatial properties of ventral premotor cortex.

In macaque ventral premotor cortex, we recorded the activity of neurons that responded to both visual and tactile stimuli. For these bimodal cells, the visual receptive field extended from the tactile receptive field into the adjacent space. Their tactile receptive fields were organized topographically, with the arms represented medially, the face represented in the middle, and the inside of the mouth represented laterally. For many neurons, both the visual and tactile responses were directionally selective, although many neurons also responded to stationary stimuli.

In the awake monkeys, for 70% of bimodal neurons with a tactile response on the arm, the visual receptive field moved when the arm was moved. In contrast, for 0% the visual receptive field moved when the eye or head moved. Thus the visual receptive fields of most "arm + visual" cells were anchored to the arm, not to the eye or head. In the anesthetized monkey, the effect of arm position was similar. For 95% of bimodal neurons with a tactile response on the face, the visual receptive field moved as the head was rotated. In contrast, for 15% the visual receptive field moved with the eye and for 0% it moved with the arm. Thus the visual receptive fields of most "face + visual" cells were anchored to the head, not to the eye or arm.

To construct a visual receptive field anchored to the arm, it is necessary to integrate the position of the arm, head, and eye. For arm + visual cells, the spontaneous activity, the magnitude of the visual response, and sometimes both were modulated by the position of the arm (37%), the head (75%), and the eye (58%). In contrast, to construct a visual receptive field that is anchored to the head, it is necessary to use the position of the eye, but not of the head or the arm. For face + visual cells, the spontaneous activity and/or response magnitude was modulated by the position of the eyes (88%), but not of the head or the arm (0%).

Visual receptive fields anchored to the arm can encode stimulus location in "arm-centered" coordinates, and would be useful for guiding arm movements. Visual receptive fields anchored to the head can likewise encode stimuli in "head-centered" coordinates, useful for guiding head movements. Sixty-three percent of face + visual neurons responded during voluntary movements of the head. We suggest that "body-part-centered" coordinates provide a general solution to a problem of sensory-motor integration: sensory stimuli are located in a coordinate system anchored to a particular body part.
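The coordinate bookkeeping this abstract describes, turning an eye-centered stimulus location into an arm-centered one by combining eye, head, and arm posture signals, can be sketched compactly. The 2D geometry, the sign conventions, and every name below are assumptions made for illustration; this is not the authors' model.

import math

def rot(theta, p):
    """Rotate a 2D point p = (x, y) by angle theta (radians)."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1], s * p[0] + c * p[1])

def eye_to_arm(stim_eye, eye_in_head, head_on_trunk, hand_in_trunk):
    """Re-express a stimulus given in eye-centered coordinates in
    hand-centered coordinates (all quantities hypothetical)."""
    stim_head = rot(eye_in_head, stim_eye)        # account for gaze direction
    stim_trunk = rot(head_on_trunk, stim_head)    # account for head rotation
    return (stim_trunk[0] - hand_in_trunk[0],     # shift the origin to the hand
            stim_trunk[1] - hand_in_trunk[1])

# Example: stimulus 20 cm ahead on the fovea, gaze 10 deg left, head 5 deg
# right, hand 15 cm to the right of the trunk midline.
print(eye_to_arm((0.0, 20.0), math.radians(-10), math.radians(5), (15.0, 0.0)))

The sketch mirrors the abstract's accounting: an arm-anchored field needs eye, head, and arm signals, whereas a head-anchored field can drop the last step and needs only the eye signal.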

              Reference frames for representing visual and tactile locations in parietal cortex.

              The ventral intraparietal area (VIP) receives converging inputs from visual, somatosensory, auditory and vestibular systems that use diverse reference frames to encode sensory information. A key issue is how VIP combines those inputs together. We mapped the visual and tactile receptive fields of multimodal VIP neurons in macaque monkeys trained to gaze at three different stationary targets. Tactile receptive fields were found to be encoded into a single somatotopic, or head-centered, reference frame, whereas visual receptive fields were widely distributed between eye- to head-centered coordinates. These findings are inconsistent with a remapping of all sensory modalities in a common frame of reference. Instead, they support an alternative model of multisensory integration based on multidirectional sensory predictions (such as predicting the location of a visual stimulus given where it is felt on the skin and vice versa). This approach can also explain related findings in other multimodal areas.
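The multidirectional-prediction idea can be caricatured as a pair of mappings, each predicting a stimulus location in one modality's reference frame from the other's. The toy sketch below assumes, for illustration only, that touch is coded head-centered and vision eye-centered, so the two predictions differ by the current gaze offset; the function names and numbers are invented, not taken from the paper.

def tactile_to_visual(skin_loc_cm, gaze_cm):
    # Predict where a felt stimulus should appear in eye-centered coordinates:
    # the (head-centered) skin location shifted by the current gaze offset.
    return skin_loc_cm - gaze_cm

def visual_to_tactile(visual_loc_cm, gaze_cm):
    # The reverse prediction: eye-centered sight back to head-centered skin space.
    return visual_loc_cm + gaze_cm

# Round trip: a touch at +5 cm on the face, gaze deviated 8 cm to the right.
v = tactile_to_visual(5.0, 8.0)
print(v, visual_to_tactile(v, 8.0))  # -3.0 5.0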

                Author and article information

Journal: Frontiers in ICT (Front. ICT)
Publisher: Frontiers Media SA
ISSN: 2297-198X
Published: January 22, 2018
Volume: 4
DOI: 10.3389/fict.2017.00031
Copyright: © 2018
License: Free to read; CC BY 4.0 (https://creativecommons.org/licenses/by/4.0/)
