
      Brain Extraction Using Label Propagation and Group Agreement: Pincram

      research-article


          Abstract

          Accurately delineating the brain on magnetic resonance (MR) images of the head is a prerequisite for many neuroimaging methods. Most existing methods exhibit disadvantages in that they are laborious, yield inconsistent results, and/or require training data to closely match the data to be processed. Here, we present pincram, an automatic, versatile method for accurately labelling the adult brain on T1-weighted 3D MR head images. The method uses an iterative refinement approach to propagate labels from multiple atlases to a given target image using image registration. At each refinement level, a consensus label is generated. At the subsequent level, the search for the brain boundary is constrained to the neighbourhood of the boundary of this consensus label. The method achieves high accuracy (Jaccard coefficient > 0.95 on typical data, corresponding to a Dice similarity coefficient of > 0.97) and performs better than many state-of-the-art methods as evidenced by independent evaluation on the Segmentation Validation Engine. Via a novel self-monitoring feature, the program generates the "success index," a scalar metadatum indicative of the accuracy of the output label. Pincram is available as open source software.
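As a rough illustration of the ideas described above (not the pincram implementation itself), the sketch below fuses hypothetical propagated brain masks by per-voxel majority vote and relates the Jaccard and Dice overlap measures quoted in the abstract; the array shapes, function names, and voting threshold are assumptions chosen for illustration.

```python
# Illustrative sketch only, not the pincram code: majority-vote consensus over
# propagated atlas labels, plus the Jaccard/Dice overlap measures quoted above.
import numpy as np

def consensus_label(propagated_masks, threshold=0.5):
    """Fuse binary brain masks (one per atlas) by per-voxel majority vote."""
    stacked = np.stack(propagated_masks, axis=0).astype(float)
    return (stacked.mean(axis=0) >= threshold).astype(np.uint8)

def jaccard(a, b):
    """Jaccard coefficient: intersection over union of two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()

def dice_from_jaccard(j):
    # Dice = 2J / (1 + J); e.g. J = 0.95 corresponds to Dice ~ 0.974
    return 2 * j / (1 + j)

# Toy example with three hypothetical propagated masks
masks = [np.random.rand(32, 32, 32) > 0.4 for _ in range(3)]
fused = consensus_label(masks)
print(dice_from_jaccard(jaccard(fused, masks[0])))
```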


Most cited references (13)


          Construction of a 3D probabilistic atlas of human cortical structures.

We describe the construction of a digital brain atlas composed of data from manually delineated MRI data. A total of 56 structures were labeled in MRI of 40 healthy, normal volunteers. This labeling was performed according to a set of protocols developed for this project. Pairs of raters were assigned to each structure and trained on the protocol for that structure. Each rater pair was tested for concordance on 6 of the 40 brains; once they had achieved reliability standards, they divided the task of delineating the remaining 34 brains. The data were then spatially normalized to well-known templates using 3 popular algorithms: AIR5.2.5's nonlinear warp (Woods et al., 1998) paired with the ICBM452 Warp 5 atlas (Rex et al., 2003); FSL's FLIRT (Smith et al., 2004) paired with its own template, a skull-stripped version of the ICBM152 T1 average; and SPM5's unified segmentation method (Ashburner and Friston, 2005) paired with its canonical brain, the whole head ICBM152 T1 average. We thus produced 3 variants of our atlas, where each was constructed from 40 representative samples of a data processing stream that one might use for analysis. For each normalization algorithm, the individual structure delineations were then resampled according to the computed transformations. We next computed averages at each voxel location to estimate the probability of that voxel belonging to each of the 56 structures. Each version of the atlas contains, for every voxel, probability densities for each region, thus providing a resource for automated probabilistic labeling of external data types registered into standard spaces; we also computed average intensity images and tissue density maps based on the three methods and target spaces. These atlases will serve as a resource for diverse applications including meta-analysis of functional and structural imaging data and other bioinformatics applications where display of arbitrary labels in probabilistically defined anatomic space will facilitate both knowledge-based development and visualization of findings from multiple disciplines.
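The per-voxel probability estimation described above can be sketched in a few lines. This is not the published atlas-construction pipeline; the inputs (binary label volumes already normalized to a common template) and the function name are assumptions for illustration.

```python
# Minimal sketch, not the original pipeline: estimate a per-voxel probability
# map for one structure by averaging binary delineations that have already
# been spatially normalized to a common template grid.
import numpy as np

def probabilistic_map(normalized_label_volumes):
    """Average aligned binary label volumes: P(voxel belongs to structure)."""
    stacked = np.stack(normalized_label_volumes, axis=0).astype(float)
    return stacked.mean(axis=0)  # values in [0, 1]

# Toy example: 40 hypothetical subjects, one structure, 64^3 template grid
subjects = [np.random.rand(64, 64, 64) > 0.7 for _ in range(40)]
prob = probabilistic_map(subjects)
print(prob.min(), prob.max())
```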

            BEaST: brain extraction based on nonlocal segmentation technique.

Brain extraction is an important step in the analysis of brain images. The variability in brain morphology and the difference in intensity characteristics due to imaging sequences make the development of a general purpose brain extraction algorithm challenging. To address this issue, we propose a new robust method (BEaST) dedicated to produce consistent and accurate brain extraction. This method is based on nonlocal segmentation embedded in a multi-resolution framework. A library of 80 priors is semi-automatically constructed from the NIH-sponsored MRI study of normal brain development, the International Consortium for Brain Mapping, and the Alzheimer's Disease Neuroimaging Initiative databases. In testing, a mean Dice similarity coefficient of 0.9834±0.0053 was obtained when performing leave-one-out cross validation selecting only 20 priors from the library. Validation using the online Segmentation Validation Engine resulted in a top ranking position with a mean Dice coefficient of 0.9781±0.0047. Robustness of BEaST is demonstrated on all baseline ADNI data, resulting in a very low failure rate. The segmentation accuracy of the method is better than two widely used publicly available methods and recent state-of-the-art hybrid approaches. BEaST provides results comparable to a recent label fusion approach, while being 40 times faster and requiring a much smaller library of priors.
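A minimal sketch in the spirit of nonlocal, similarity-weighted label fusion follows. It is not the BEaST code: it simplifies patch comparison to single-voxel intensity differences, and the bandwidth parameter h and all names are illustrative assumptions.

```python
# Illustrative sketch of similarity-weighted nonlocal label fusion, not BEaST:
# each target voxel's label is a weighted average of labels at corresponding
# voxels in library (prior) images, weighted by intensity similarity.
import numpy as np

def nonlocal_fusion(target, library_images, library_labels, h=0.1):
    """Weight each prior voxelwise by exp(-(I_target - I_prior)^2 / h^2)."""
    target = target.astype(float)
    weights, accum = np.zeros_like(target), np.zeros_like(target)
    for img, lab in zip(library_images, library_labels):
        w = np.exp(-((target - img.astype(float)) ** 2) / h**2)
        weights += w
        accum += w * lab.astype(float)
    return (accum / np.maximum(weights, 1e-12)) >= 0.5

# Toy data: intensities in [0, 1], binary library labels
tgt = np.random.rand(16, 16, 16)
lib = [np.random.rand(16, 16, 16) for _ in range(5)]
labs = [np.random.rand(16, 16, 16) > 0.5 for _ in range(5)]
print(nonlocal_fusion(tgt, lib, labs).sum())
```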

              Automatic segmentation of brain MRIs of 2-year-olds into 83 regions of interest.

Three-dimensional atlases and databases of the brain at different ages facilitate the description of neuroanatomy and the monitoring of cerebral growth and development. Brain segmentation is challenging in young children due to structural differences compared to adults. We have developed a method, based on established algorithms, for automatic segmentation of young children's brains into 83 regions of interest (ROIs), and applied this to an exemplar group of 33 2-year-old subjects who had been born prematurely. The algorithm uses prior information from 30 normal adult brain magnetic resonance (MR) images, which had been manually segmented to create 30 atlases, each labeling 83 anatomical structures. Each of these adult atlases was registered to each 2-year-old target MR image using non-rigid registration based on free-form deformations. Label propagation from each adult atlas yielded a segmentation of each 2-year-old brain into 83 ROIs. The final segmentation was obtained by combination of the 30 propagated adult atlases using decision fusion, improving accuracy over individual propagations. We validated this algorithm by comparing the automatic approach with three representative manually segmented volumetric regions (the subcortical caudate nucleus, the neocortical pre-central gyrus and the archicortical hippocampus) using similarity indices (SI), a measure of spatial overlap (intersection over average). SI results for automatic versus manual segmentations for these three structures were 0.90±0.01, 0.90±0.01 and 0.88±0.03 respectively. This registration approach allows the rapid construction of automatically labelled age-specific brain atlases for children at the age of 2 years.
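The similarity index (intersection over average) used for validation above is equivalent to the Dice coefficient; a small sketch with hypothetical masks is shown below.

```python
# Sketch of the similarity index (SI): intersection over the average of the
# two volumes, equivalent to the Dice coefficient. Masks are hypothetical.
import numpy as np

def similarity_index(auto_mask, manual_mask):
    a, m = auto_mask.astype(bool), manual_mask.astype(bool)
    return np.logical_and(a, m).sum() / ((a.sum() + m.sum()) / 2.0)

# Toy example: two slightly offset segmentations of the same structure
auto = np.zeros((32, 32, 32), dtype=bool); auto[8:24, 8:24, 8:24] = True
manual = np.zeros_like(auto); manual[9:25, 8:24, 8:24] = True
print(round(similarity_index(auto, manual), 3))
```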

                Author and article information

                Contributors
                Role: Editor
Journal
PLoS ONE
Public Library of Science (San Francisco, CA, USA)
ISSN: 1932-6203
Published: 10 July 2015
Volume 10, Issue 7: e0129211
                Affiliations
[1] MedTech West at Sahlgrenska University Hospital, Gothenburg, Sweden
[2] Institute of Neuroscience and Physiology, Gothenburg University, Gothenburg, Sweden
[3] Centre for Brain Sciences, Imperial College London, United Kingdom
[4] The Neurodis Foundation, Lyon, France
[5] Department of Computing, Imperial College London, United Kingdom
[6] Imaging Sciences and Biomedical Engineering, King’s College London, United Kingdom
Nanjing University of Aeronautics and Astronautics, China
                Author notes

Competing Interests: CL and KRG are employees of IXICO PLC, UK, a provider of medical image analysis services. DR and JVH are founders and scientific advisors of IXICO PLC. RAH, PA, and AH declare that they have no competing interests. This does not alter the authors' adherence to PLOS ONE policies on sharing data and materials.

                Conceived and designed the experiments: RAH JVH DR AH PA. Performed the experiments: RAH CL KRG. Analyzed the data: RAH AH CL KRG. Contributed reagents/materials/analysis tools: RAH PA DR. Wrote the paper: RAH CL KRG PA DR JVH AH.

                Article
Manuscript number: PONE-D-14-54814
DOI: 10.1371/journal.pone.0129211
PMC: PMC4498771
PMID: 26161961
Copyright © 2015

This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

                History
Received: 7 December 2014
Accepted: 6 May 2015
                Page count
                Figures: 7, Tables: 3, Pages: 18
                Funding
                The authors have no support or funding to report.
                Categories
                Research Article
                Custom metadata
                All relevant data not accessible from previously published resources will be available from http://soundray.org/pincram/ upon publication.

