
      Exogenous attention facilitates perceptual learning in visual acuity to untrained stimulus locations and features

      research-article


          Abstract

          Visual perceptual learning (VPL) refers to the improvement in performance on a visual task due to practice. A hallmark of VPL is specificity, as improvements are often confined to the trained retinal locations or stimulus features. We have previously found that exogenous (involuntary, stimulus-driven) and endogenous (voluntary, goal-driven) spatial attention can facilitate the transfer of VPL across locations in orientation discrimination tasks mediated by contrast sensitivity. Here, we investigated whether exogenous spatial attention can facilitate such transfer in acuity tasks that have been associated with higher specificity. We trained observers for 3 days (days 2–4) in a Landolt acuity task (Experiment 1) or a Vernier hyperacuity task (Experiment 2), with either exogenous precues (attention group) or neutral precues (neutral group). Importantly, during pre-tests (day 1) and post-tests (day 5), all observers were tested with neutral precues; thus, groups differed only in their attentional allocation during training. For the Landolt acuity task, we found evidence of location transfer in both the neutral and attention groups, suggesting weak location specificity of VPL. For the Vernier hyperacuity task, we found evidence of location and feature specificity in the neutral group, and learning transfer in the attention group—similar improvement at trained and untrained locations and features. Our results reveal that, when there is specificity in a perceptual acuity task, exogenous spatial attention can overcome that specificity and facilitate learning transfer to both untrained locations and features simultaneously with the same training. Thus, in addition to improving performance, exogenous attention generalizes perceptual learning across locations and features.


          Most cited references (111)


          Visual attention: the past 25 years.

          This review focuses on covert attention and how it alters early vision. I explain why attention is considered a selective process, the constructs of covert attention, spatial endogenous and exogenous attention, and feature-based attention. I explain how in the last 25 years research on attention has characterized the effects of covert attention on spatial filters and how attention influences the selection of stimuli of interest. This review includes the effects of spatial attention on discriminability and appearance in tasks mediated by contrast sensitivity and spatial resolution; the effects of feature-based attention on basic visual processes, and a comparison of the effects of spatial and feature-based attention. The emphasis of this review is on psychophysical studies, but relevant electrophysiological and neuroimaging studies and models regarding how and where neuronal responses are modulated are also discussed.

            A tutorial on a practical Bayesian alternative to null-hypothesis significance testing.

            Null-hypothesis significance testing remains the standard inferential tool in cognitive science despite its serious disadvantages. Primary among these is the fact that the resulting probability value does not tell the researcher what he or she usually wants to know: How probable is a hypothesis, given the obtained data? Inspired by developments presented by Wagenmakers (Psychonomic Bulletin & Review, 14, 779-804, 2007), I provide a tutorial on a Bayesian model selection approach that requires only a simple transformation of sum-of-squares values generated by the standard analysis of variance. This approach generates a graded level of evidence regarding which model (e.g., effect absent [null hypothesis] vs. effect present [alternative hypothesis]) is more strongly supported by the data. This method also obviates admonitions never to speak of accepting the null hypothesis. An Excel worksheet for computing the Bayesian analysis is provided as supplemental material.
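            For readers who want to try the approach this tutorial describes, a minimal sketch follows. It is not taken from the tutorial itself; it illustrates the standard BIC-based approximation to the Bayes factor (in the spirit of Wagenmakers, 2007), which turns the ANOVA error sums of squares of the null and alternative models into an approximate posterior probability of the null hypothesis under equal prior odds. Function and variable names are hypothetical, and the tutorial's Excel worksheet may differ in detail.

            import math

            def bic_posterior_h0(sse_null, sse_alt, n_obs, extra_params=1):
                # sse_null:     error sum of squares with the effect absent (null model, H0)
                # sse_alt:      error sum of squares with the effect present (alternative model, H1)
                # n_obs:        number of independent observations
                # extra_params: extra free parameters in the alternative model (k1 - k0)
                #
                # BIC difference between the alternative and null models:
                #   delta_BIC = n * ln(SSE_alt / SSE_null) + (k1 - k0) * ln(n)
                delta_bic = n_obs * math.log(sse_alt / sse_null) + extra_params * math.log(n_obs)
                # Approximate Bayes factor in favour of H0, assuming equal prior odds
                bf01 = math.exp(delta_bic / 2.0)
                # Posterior probability of H0 given the data
                return bf01 / (1.0 + bf01)

            # Hypothetical example: 24 observations, adding the effect reduces SSE from 120 to 90
            print(round(bic_posterior_h0(sse_null=120.0, sse_alt=90.0, n_obs=24), 3))  # about 0.134

            Values near 0 favour the effect-present model and values near 1 favour the null, giving the graded level of evidence the abstract refers to.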

              The reverse hierarchy theory of visual perceptual learning.

              Perceptual learning can be defined as practice-induced improvement in the ability to perform specific perceptual tasks. We previously proposed the Reverse Hierarchy Theory as a unifying concept that links behavioral findings of visual learning with physiological and anatomical data. Essentially, it asserts that learning is a top-down guided process, which begins at high-level areas of the visual system, and when these do not suffice, progresses backwards to the input levels, which have a better signal-to-noise ratio. This simple concept has proved powerful in explaining a broad range of findings, including seemingly contradicting data. We now extend this concept to describe the dynamics of skill acquisition and interpret recent behavioral and electrophysiological findings.

                Author and article information

                Journal
                Journal of Vision (J Vis), The Association for Research in Vision and Ophthalmology
                ISSN: 1534-7362
                Published: 27 April 2020
                Volume 20, Issue 4, Article 18
                Affiliations
                Department of Psychology, New York University, New York, NY, USA
                Center for Neural Science, New York University, New York, NY, USA
                Article
                Publisher ID: JOV-07122-2019
                DOI: 10.1167/jov.20.4.18
                PMCID: PMC7405812
                PMID: 32340029
                Copyright 2020 The Authors

                This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

                History
                Received: 03 October 2019
                Accepted: 08 January 2020
                Pages: 19
                Categories
                Article

                Keywords: perceptual learning, covert attention, visual acuity, location specificity, location generalization, feature specificity, feature generalization
