
      Comparison of deep neural networks to spatio-temporal cortical dynamics of human visual object recognition reveals hierarchical correspondence

      Research article


          Abstract

          The complex multi-stage architecture of cortical visual pathways provides the neural basis for efficient visual object recognition in humans. However, the stage-wise computations therein remain poorly understood. Here, we compared temporal (magnetoencephalography) and spatial (functional MRI) visual brain representations with representations in an artificial deep neural network (DNN) tuned to the statistics of real-world visual recognition. We showed that the DNN captured the stages of human visual processing in both time and space from early visual areas towards the dorsal and ventral streams. Further investigation of crucial DNN parameters revealed that while model architecture was important, training on real-world categorization was necessary to enforce spatio-temporal hierarchical relationships with the brain. Together our results provide an algorithmically informed view on the spatio-temporal dynamics of visual object recognition in the human visual brain.
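          One common way to carry out the kind of comparison the abstract describes is representational similarity analysis: dissimilarity matrices (RDMs) are computed over the same image set from DNN layer activations and from MEG or fMRI response patterns, and the two RDMs are then correlated. The Python sketch below illustrates only that logic; the array names, sizes, and random stand-in data are assumptions for illustration, not the authors' actual pipeline.

    import numpy as np
    from scipy.spatial.distance import pdist, squareform
    from scipy.stats import spearmanr

    def rdm(patterns):
        # Representational dissimilarity matrix: pairwise (1 - Pearson r) between
        # image-wise patterns; `patterns` is (n_images, n_features).
        return squareform(pdist(patterns, metric="correlation"))

    def rdm_similarity(rdm_a, rdm_b):
        # Spearman correlation of the two RDMs' upper triangles.
        iu = np.triu_indices_from(rdm_a, k=1)
        return spearmanr(rdm_a[iu], rdm_b[iu]).correlation

    # Random stand-in data: activations of one DNN layer and one MEG sensor
    # pattern per image (sizes are arbitrary, for illustration only).
    n_images = 100
    dnn_layer = np.random.randn(n_images, 4096)
    meg_pattern = np.random.randn(n_images, 306)
    print(rdm_similarity(rdm(dnn_layer), rdm(meg_pattern)))

          Repeating such a comparison for every DNN layer, MEG time point, and fMRI region is what yields the spatio-temporal correspondence the abstract reports.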


          Most cited references (35)


          Speed of processing in the human visual system.

          How long does it take for the human visual system to process a complex natural image? Subjectively, recognition of familiar objects and scenes appears to be virtually instantaneous, but measuring this processing time experimentally has proved difficult. Behavioural measures such as reaction times can be used, but these include not only visual processing but also the time required for response execution. However, event-related potentials (ERPs) can sometimes reveal signs of neural processing well before the motor output. Here we use a go/no-go categorization task in which subjects have to decide whether a previously unseen photograph, flashed on for just 20 ms, contains an animal. ERP analysis revealed a frontal negativity specific to no-go trials that develops roughly 150 ms after stimulus onset. We conclude that the visual processing needed to perform this highly demanding task can be achieved in under 150 ms.
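            As a rough illustration of the logic in this reference (not its actual analysis), the sketch below forms the no-go minus go ERP difference wave for a single frontal channel and reads off when it first exceeds an amplitude criterion; the data, sampling rate, and threshold are all made up.

    import numpy as np

    sfreq = 1000.0                            # assumed sampling rate in Hz
    times = np.arange(-100, 500) / sfreq      # -100 ms to +499 ms around stimulus onset

    # Hypothetical single-channel epochs in microvolts: (n_trials, n_times)
    go_epochs = np.random.randn(200, times.size)
    nogo_epochs = np.random.randn(200, times.size)

    # No-go minus go difference wave (trial-averaged ERPs)
    difference_wave = nogo_epochs.mean(axis=0) - go_epochs.mean(axis=0)

    # First post-stimulus sample where the no-go-specific negativity exceeds an
    # arbitrary criterion (real studies test this statistically across subjects).
    post = times > 0
    threshold = -1.0                          # microvolts, illustrative only
    onset = np.flatnonzero(post & (difference_wave < threshold))
    if onset.size:
        print(f"Difference wave crosses criterion at ~{times[onset[0]] * 1e3:.0f} ms")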

            Information-based functional brain mapping.

            The development of high-resolution neuroimaging and multielectrode electrophysiological recording provides neuroscientists with huge amounts of multivariate data. The complexity of the data creates a need for statistical summary, but the local averaging standardly applied to this end may obscure the effects of greatest neuroscientific interest. In neuroimaging, for example, brain mapping analysis has focused on the discovery of activation, i.e., of extended brain regions whose average activity changes across experimental conditions. Here we propose to ask a more general question of the data: Where in the brain does the activity pattern contain information about the experimental condition? To address this question, we propose scanning the imaged volume with a "searchlight," whose contents are analyzed multivariately at each location in the brain.
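            Below is a minimal Python sketch of the searchlight idea described above, assuming volumetric data already aligned across conditions; the sphere radius, the linear classifier, and the data shapes are illustrative assumptions, not details from the reference.

    import numpy as np
    from sklearn.svm import LinearSVC
    from sklearn.model_selection import cross_val_score

    def searchlight_map(data, labels, mask, radius=2):
        # data: (n_samples, x, y, z) volumes; labels: (n_samples,) condition codes;
        # mask: (x, y, z) boolean brain mask. Returns a map of decoding accuracies.
        grid = np.stack(np.meshgrid(*[np.arange(d) for d in mask.shape],
                                    indexing="ij"), axis=-1)
        out = np.zeros(mask.shape)
        for centre in np.argwhere(mask):
            # voxels inside the sphere around this centre, restricted to the mask
            sphere = (np.linalg.norm(grid - centre, axis=-1) <= radius) & mask
            patterns = data[:, sphere]        # (n_samples, n_sphere_voxels)
            # how much information about the condition the local pattern carries,
            # here measured as cross-validated classification accuracy
            out[tuple(centre)] = cross_val_score(LinearSVC(), patterns,
                                                 labels, cv=4).mean()
        return out

    # Toy usage: 40 volumes of 10x10x10 voxels, two conditions
    # acc_map = searchlight_map(np.random.randn(40, 10, 10, 10),
    #                           np.repeat([0, 1], 20), np.ones((10, 10, 10), bool))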

              Spatiotemporal signal space separation method for rejecting nearby interference in MEG measurements.

              Limitations of traditional magnetoencephalography (MEG) exclude some important patient groups from MEG examinations, such as epilepsy patients with a vagus nerve stimulator, patients with magnetic particles on the head or having magnetic dental materials that cause severe movement-related artefact signals. Conventional interference rejection methods are not able to remove the artefacts originating this close to the MEG sensor array. For example, the reference array method is unable to suppress interference generated by sources closer to the sensors than the reference array, about 20-40 cm. The spatiotemporal signal space separation method proposed in this paper recognizes and removes both external interference and the artefacts produced by these nearby sources, even on the scalp. First, the basic separation into brain-related and external interference signals is accomplished with signal space separation based on sensor geometry and Maxwell's equations only. After this, the artefacts from nearby sources are extracted by a simple statistical analysis in the time domain, and projected out. Practical examples with artificial current dipoles and interference sources as well as data from real patients demonstrate that the method removes the artefacts without altering the field patterns of the brain signals.
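              For readers who want to apply spatiotemporal signal space separation in practice, one widely used open-source implementation is MNE-Python's maxwell_filter; the file name and parameter values in the sketch below are placeholders, not settings from this reference.

    import mne

    # Hypothetical raw MEG recording; replace with an actual .fif file.
    raw = mne.io.read_raw_fif("sample_raw.fif", preload=True)

    # int_order/ext_order set the SSS bases derived from sensor geometry and
    # Maxwell's equations; st_duration enables the temporal (tSSS) stage that
    # detects and projects out artefacts from sources close to, or on, the head.
    raw_tsss = mne.preprocessing.maxwell_filter(
        raw,
        int_order=8,           # internal (brain-space) expansion order
        ext_order=3,           # external (interference-space) expansion order
        st_duration=10.0,      # length in seconds of each tSSS data chunk
        st_correlation=0.98,   # subspace-correlation limit for nearby-artefact rejection
    )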

                Author and article information

                Journal
                Sci Rep (Scientific Reports)
                Publisher: Nature Publishing Group
                ISSN: 2045-2322
                Published: 10 June 2016
                Volume: 6
                Article number: 27755
                Affiliations
                [1] Computer Science and Artificial Intelligence Laboratory, MIT, Cambridge, MA, USA
                [2] Department of Education and Psychology, Free University Berlin, Berlin, Germany
                [3] McGovern Institute for Brain Research, MIT, Cambridge, MA, USA
                Article
                Article ID: srep27755
                DOI: 10.1038/srep27755
                PMCID: PMC4901271
                PMID: 27282108
                Record ID: 51740c5b-aa1c-49b6-bce8-2f2acfbd1961
                Copyright © 2016, Macmillan Publishers Limited

                This work is licensed under a Creative Commons Attribution 4.0 International License. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in the credit line; if the material is not included under the Creative Commons license, users will need to obtain permission from the license holder to reproduce the material. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

                History
                Received: 26 January 2016
                Accepted: 23 May 2016
                Categories
                Article

