
      XDream: Finding preferred stimuli for visual neurons using generative networks and gradient-free optimization

      research-article
      PLoS Computational Biology
      Public Library of Science


          Abstract

          A longstanding question in sensory neuroscience is what types of stimuli drive neurons to fire. The characterization of effective stimuli has traditionally been based on a combination of intuition, insights from previous studies, and luck. A new method termed XDream (EXtending DeepDream with real-time evolution for activation maximization) combined a generative neural network and a genetic algorithm in a closed loop to create strong stimuli for neurons in the macaque visual cortex. Here we extensively and systematically evaluate the performance of XDream. We used ConvNet units as in silico models of neurons, enabling experiments that would be prohibitive with biological neurons. We evaluated how the method compares to brute-force search, and how well the method generalizes to different neurons and processing stages. We also explored design and parameter choices. XDream can efficiently find preferred features for visual units without any prior knowledge about them. XDream extrapolates to different layers, architectures, and developmental regimes, performing better than brute-force search, and often better than exhaustive sampling of >1 million images. Furthermore, XDream is robust to choices of multiple image generators, optimization algorithms, and hyperparameters, suggesting that its performance is locally near-optimal. Lastly, we found no significant advantage to problem-specific parameter tuning. These results establish expectations and provide practical recommendations for using XDream to investigate neural coding in biological preparations. Overall, XDream is an efficient, general, and robust algorithm for uncovering neuronal tuning preferences using a vast and diverse stimulus space. XDream is implemented in Python, released under the MIT License, and works on Linux, Windows, and MacOS.
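          The closed loop the abstract describes (generate candidate stimuli, measure a unit's response, select and breed the best latent codes, repeat) can be sketched with toy stand-ins. This is an illustrative assumption-laden sketch, not XDream's actual implementation: the "generator" here is a fixed random linear map rather than a deep generative network, and the "neuron" is a simple linear unit rather than a ConvNet model; all names (`generate`, `neuron`, `evolve`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins (hypothetical, for illustration only):
#  - the "generator" is a fixed random linear map from latent code to "image"
#  - the "neuron" responds in proportion to the image's match with a hidden
#    preferred pattern, which the optimizer never sees directly
CODE_DIM, IMG_DIM = 32, 256
G = rng.standard_normal((IMG_DIM, CODE_DIM)) / np.sqrt(CODE_DIM)
preferred = rng.standard_normal(IMG_DIM)

def generate(codes):
    """Map a batch of latent codes (pop, CODE_DIM) to images (pop, IMG_DIM)."""
    return codes @ G.T

def neuron(images):
    """In-silico unit: one scalar response per image (a black box to the optimizer)."""
    return images @ preferred

def evolve(pop=20, gens=100, n_parents=10, sigma=0.3):
    """Gradient-free closed loop: generate, measure responses, select, breed."""
    codes = rng.standard_normal((pop, CODE_DIM))
    for _ in range(gens):
        scores = neuron(generate(codes))
        parents = codes[np.argsort(scores)[-n_parents:]]   # truncation selection
        pa = parents[rng.integers(n_parents, size=pop)]    # pick two parents
        pb = parents[rng.integers(n_parents, size=pop)]    # per offspring
        mask = rng.random((pop, CODE_DIM)) < 0.5           # uniform crossover
        codes = np.where(mask, pa, pb) + sigma * rng.standard_normal((pop, CODE_DIM))
    return codes[np.argmax(neuron(generate(codes)))]       # best latent code found

best_code = evolve()
best_score = neuron(generate(best_code[None]))[0]
```

          In this toy setting the evolved code drives the unit far above what random codes achieve, which is the same comparison the paper makes against brute-force sampling; the real method replaces the linear maps with a deep image generator and ConvNet or biological units.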

          Related collections

          Most cited references (17)


          Identity Mappings in Deep Residual Networks

            A cortical region consisting entirely of face-selective cells.

            Face perception is a skill crucial to primates. In both humans and macaque monkeys, functional magnetic resonance imaging (fMRI) reveals a system of cortical regions that show increased blood flow when the subject views images of faces, compared with images of objects. However, the stimulus selectivity of single neurons within these fMRI-identified regions has not been studied. We used fMRI to identify and target the largest face-selective region in two macaques for single-unit recording. Almost all (97%) of the visually responsive neurons in this region were strongly face selective, indicating that a dedicated cortical area exists to support face processing in the macaque.

              Places: A 10 million Image Database for Scene Recognition

              The rise of multi-million-item dataset initiatives has enabled data-hungry machine learning algorithms to reach near-human semantic classification performance at tasks such as visual object and scene recognition. Here we describe the Places Database, a repository of 10 million scene photographs, labeled with scene semantic categories, comprising a large and diverse list of the types of environments encountered in the world. Using state-of-the-art convolutional neural networks (CNNs), we provide scene classification CNNs (Places-CNNs) as baselines that significantly outperform previous approaches. Visualization of the CNNs trained on Places shows that object detectors emerge as an intermediate representation of scene classification. With its high coverage and high diversity of exemplars, the Places Database along with the Places-CNNs offers a novel resource to guide future progress on scene recognition problems.

                Author and article information

                Contributors
                Roles: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Software, Visualization, Writing – original draft, Writing – review & editing
                Roles: Conceptualization, Funding acquisition, Methodology, Project administration, Resources, Supervision, Visualization, Writing – original draft, Writing – review & editing
                Role: Editor
                Journal
                PLoS Comput Biol
                PLoS Comput. Biol
                plos
                ploscomp
                PLoS Computational Biology
                Public Library of Science (San Francisco, CA, USA)
                1553-734X
                1553-7358
                15 June 2020 (June 2020 issue)
                Volume 16, Issue 6, e1007973
                Affiliations
                [1 ] Department of Molecular and Cellular Biology, Harvard University, Cambridge, Massachusetts, United States of America
                [2 ] Center for Brains, Minds, and Machines, Boston, Massachusetts, United States of America
                [3 ] Department of Ophthalmology, Boston Children’s Hospital, Boston, Massachusetts, United States of America
                University of Alberta, CANADA
                Author notes

                The authors have declared that no competing interests exist.

                Author information
                http://orcid.org/0000-0001-5555-3217
                http://orcid.org/0000-0003-3505-8475
                Article
                Manuscript ID: PCOMPBIOL-D-19-01642
                DOI: 10.1371/journal.pcbi.1007973
                PMCID: PMC7316361
                PMID: 32542056
                Record ID: 739e5407-cc86-4cf5-85f7-0eafabdee458
                © 2020 Xiao, Kreiman

                This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

                History
                Received: 24 September 2019
                Accepted: 21 May 2020
                Page count
                Figures: 4, Tables: 1, Pages: 15
                Funding
                Funded by: funder-id http://dx.doi.org/10.13039/100000002, National Institutes of Health;
                Award ID: R01EY026025
                Award Recipient :
                Funded by: funder-id http://dx.doi.org/10.13039/100000001, National Science Foundation;
                Award ID: STC CCF-1231216
                W.X. and G.K. are supported by the Center for Brains, Minds and Machines, funded by NSF STC award CCF-1231216, and also by NIH R01EY026025. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. https://www.nsf.gov https://www.nih.gov
                Categories
                Research Article
                Biology and Life Sciences > Cell Biology > Cellular Types > Animal Cells > Neurons
                Biology and Life Sciences > Neuroscience > Cellular Neuroscience > Neurons
                Physical Sciences > Mathematics > Applied Mathematics > Algorithms > Genetic Algorithms
                Research and Analysis Methods > Simulation and Modeling > Algorithms > Genetic Algorithms
                Physical Sciences > Mathematics > Optimization
                Physical Sciences > Mathematics > Applied Mathematics > Algorithms
                Research and Analysis Methods > Simulation and Modeling > Algorithms
                Biology and Life Sciences > Neuroscience > Sensory Perception > Vision
                Biology and Life Sciences > Psychology > Sensory Perception > Vision
                Social Sciences > Psychology > Sensory Perception > Vision
                Biology and Life Sciences > Neuroscience > Neuronal Tuning
                Research and Analysis Methods > Imaging Techniques
                Computer and Information Sciences > Neural Networks
                Biology and Life Sciences > Neuroscience > Neural Networks
                Custom metadata
                vor-update-to-uncorrected-proof
                2020-06-25
                All data underlying the findings described in the manuscript have been made available in the public GitHub repository at https://github.com/willwx/XDream.

                Quantitative & Systems biology
