
      Biological Action Identification Does Not Require Early Visual Input for Development


          Abstract

          Visual input during the first years of life is vital for the development of numerous visual functions. While normal development of global motion perception seems to require visual input during an early sensitive period, the detection of biological motion (BM) does not seem to do so. A more complex form of BM processing is the identification of human actions. Here, we tested whether identification, rather than detection, of BM is experience dependent. A group of human participants who had been treated for congenital cataracts (of up to 18 years in duration; CC group) had to identify ten actions performed by human line figures. In addition, they performed a coherent motion (CM) detection task, which required identifying the direction of CM amid the movement of random dots. As controls, developmental cataract reversal individuals (DC group) who had undergone the same surgical treatment as the CC group were included. Moreover, normally sighted controls were tested both with vision blurred to match the visual acuity (VA) of CC individuals [vision matched (VM) group] and with full sight [sighted control (SC) group]. The CC group identified biological actions with an extraordinarily high accuracy (on average ∼85% correct) and was indistinguishable from the VM control group. By contrast, CM processing impairments of the CC group persisted even after controlling for VA. These results in the same individuals demonstrate an impressive resilience of BM processing to aberrant early visual experience and, at the same time, a sensitive period for the development of CM processing.


          Most cited references


          A predisposition for biological motion in the newborn baby.

          An inborn predisposition to attend to biological motion has long been theorized, but has so far been demonstrated only in one animal species (the domestic chicken). In particular, no preference for biological motion was reported for human infants of <3 months of age. We tested 2-day-old babies' discrimination after familiarization and their spontaneous preferences for biological vs. nonbiological point-light animations. Newborns were shown to be able to discriminate between two different patterns of motion (Exp. 1) and, when first exposed to them, selectively preferred to look at the biological motion display (Exp. 2). This preference was also orientation-dependent: newborns looked longer at upright displays than upside-down displays (Exp. 3). These data support the hypothesis that detection of biological motion is an intrinsic capacity of the visual system, which is presumably part of an evolutionarily ancient and nonspecies-specific system predisposing animals to preferentially attend to other animals.

            Neural mechanisms for the recognition of biological movements.

            The visual recognition of complex movements and actions is crucial for the survival of many species. It is important not only for communication and recognition at a distance, but also for the learning of complex motor actions by imitation. Movement recognition has been studied in psychophysical, neurophysiological and imaging experiments, and several cortical areas involved in it have been identified. We use a neurophysiologically plausible and quantitative model as a tool for organizing and making sense of the experimental data, despite their growing size and complexity. We review the main experimental findings and discuss possible neural mechanisms, and show that a learning-based, feedforward model provides a neurophysiologically plausible and consistent summary of many key experimental results.

              The Freiburg Visual Acuity test – automatic measurement of visual acuity.

               M. Bach (1995)
              The Freiburg Visual Acuity test is an automated procedure for self-administered measurement of visual acuity. Landolt-Cs are presented on a monitor in one of eight orientations. The subject presses one of eight buttons, which are spatially arranged on a response box according to the eight possible positions of the Landolt-Cs' gap. To estimate the acuity threshold, a best PEST (best Parameter Estimation by Sequential Testing) procedure is used in which a psychometric function having a constant slope on a logarithmic acuity scale is assumed. Measurement terminates after a fixed number of trials. With computer monitors, pixel-discreteness artifacts limit the presentation of small stimuli. By using anti-aliasing, i.e., smoothing of contours by multiple gray levels, the spatial resolution was improved by a factor of four. Thus, even the shape of small Landolt-Cs with oblique gaps is adequate and visual acuities from 5/80 (0.06) up to 5/1.4 (3.6) can be tested at a distance of 5 m.
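                The adaptive procedure described above can be sketched in a few lines. The following is an illustrative toy version of a best-PEST staircase, not Bach's actual implementation: the slope value, trial count, and threshold grid are assumptions, and a logistic function with a 1/8 guessing rate stands in for the constant-slope psychometric function on the log acuity scale.

```python
import math
import random

GUESS = 1.0 / 8.0   # eight Landolt-C orientations -> 12.5% chance level
SLOPE = 6.0         # assumed constant slope on the log acuity scale
N_TRIALS = 24       # fixed number of trials, as in the original procedure

def p_correct(log_size, log_thresh):
    """Probability of a correct response to a stimulus of a given log size,
    assuming a logistic psychometric function with a fixed slope and a
    1/8 guessing rate (hypothetical parameter values)."""
    p = 1.0 / (1.0 + math.exp(-SLOPE * (log_size - log_thresh)))
    return GUESS + (1.0 - GUESS) * p

def best_pest(respond, lo=-1.3, hi=0.6, steps=96):
    """Estimate the log acuity threshold: each trial is presented at the
    current maximum-likelihood threshold estimate, and the likelihood of
    every candidate threshold is updated from the observed response."""
    grid = [lo + (hi - lo) * i / (steps - 1) for i in range(steps)]
    loglik = [0.0] * steps
    estimate = grid[steps // 2]          # start in the middle of the range
    for _ in range(N_TRIALS):
        correct = respond(estimate)      # present a stimulus at `estimate`
        for i, t in enumerate(grid):
            p = p_correct(estimate, t)
            loglik[i] += math.log(p if correct else 1.0 - p)
        estimate = grid[max(range(steps), key=loglik.__getitem__)]
    return estimate

# Simulated observer with a "true" log threshold of -0.5
random.seed(1)
def simulated(log_size):
    return random.random() < p_correct(log_size, -0.5)

print(round(best_pest(simulated), 2))
```

Placing each trial at the running maximum-likelihood estimate is what distinguishes best PEST from a simple up-down staircase: under the constant-slope assumption, that placement is the most informative stimulus level available.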

                Author and article information

                Journal
                eNeuro
                Society for Neuroscience
                2373-2822
                15 October 2020
                27 October 2020
                Sep-Oct 2020
                Volume: 7
                Issue: 5
                Affiliations
                [1 ]Biological Psychology and Neuropsychology, University of Hamburg , Hamburg, Germany
                [2 ]Jasti V Ramanamma Children’s Eye Care Center, LV Prasad Eye Institute , Hyderabad, India
                [3 ]IMT School for Advanced Studies Lucca , Lucca, Italy
                [4 ]Department of Biology, Centre for Vision Research, York University , Toronto, Ontario, Canada
                Author notes

                The authors declare no competing financial interests.

                Author contributions: S.S.R., D.B., S.S., N.F.T., and B.R. designed research; S.S.R., D.B., I.S., K.P., S.S., R.K., and B.R. performed research; S.S.R. analyzed data; S.S.R. wrote the paper.

                The work was supported by the European Research Council Grant ERC-2009-AdG 249425-CriticalBrainChanges and the Deutsche Forschungsgemeinschaft Grant Ro 2625/10-1 (to B.R.).

                Correspondence should be addressed to Siddhart S. Rajendran at siddhart.srivatsav.rajendran@uni-hamburg.de.
                Article
                eN-NWR-0534-19
                DOI: 10.1523/ENEURO.0534-19.2020
                PMCID: 7598910
                PMID: 33060179
                Copyright © 2020 Rajendran et al.

                This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International license, which permits unrestricted use, distribution and reproduction in any medium provided that the original work is properly attributed.

                Page count
                Figures: 6, Tables: 2, Equations: 0, References: 58, Pages: 10
                Funding
                Funded by: EC | European Research Council (ERC), http://doi.org/10.13039/501100000781
                Award ID: ERC-2009-AdG 249425-CriticalBrainChanges
                Funded by: Deutsche Forschungsgemeinschaft (DFG), http://doi.org/10.13039/501100001659
                Award ID: DFG RO 2625/10-1
                Categories
                Research Article: New Research
                Cognition and Behavior
                Custom metadata
                September/October 2020
