
      Non-Rigid Event-by-Event Continuous Respiratory Motion Compensated List-Mode Reconstruction for PET


          Abstract

Respiratory motion during PET/CT imaging can cause significant image blurring and underestimation of tracer concentration in both static and dynamic studies. In this study, aiming to eliminate both intra-cycle and inter-cycle motion and to support dynamic imaging, we developed a non-rigid event-by-event (NR-EBE) respiratory motion-compensated list-mode reconstruction algorithm. The proposed method consists of two components. The first estimates a continuous non-rigid motion field of the internal organs using internal-external motion correlation (NR-INTEX). This continuous motion field is then incorporated into the second component, the non-rigid MOLAR (NR-MOLAR) reconstruction algorithm, to deform the system matrix to the reference location where the attenuation CT is acquired. Because the point spread function (PSF) and time-of-flight (TOF) kernels in NR-MOLAR are incorporated in the system-matrix calculation, they are also deformed according to the motion. We first validated NR-MOLAR using an XCAT phantom with simulated respiratory motion. NR-EBE motion-compensated reconstruction using both components was then validated on three human studies with 18F-FPDTBZ and one with 18F-FDG. The human results were compared with conventional non-rigid motion correction using a discrete motion field (NR-Discrete, one motion field per gate) and with a previously proposed rigid EBE motion-compensated reconstruction (R-EBE) designed to correct rigid motion of a target lesion or organ. The XCAT results demonstrated that NR-MOLAR incorporating both PSF and TOF kernels effectively corrects for non-rigid motion. The 18F-FPDTBZ studies showed that NR-EBE outperformed NR-Discrete and yielded results comparable to R-EBE on the target organs while giving superior image quality in other regions. The FDG study showed that NR-EBE clearly improved the visibility of multiple moving liver lesions, some of which could not be discerned in the other reconstructions, in addition to improving quantification. These results suggest that NR-EBE motion-compensated reconstruction is a promising tool for lesion detection and quantification in PET imaging of the thoracic and abdominal regions.
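To make the event-by-event idea concrete, the following deliberately simplified 1-D sketch shows why correcting each event with a continuous, per-event displacement differs from gated correction. It is not the NR-INTEX/NR-MOLAR implementation: the "scanner" reports a single blurred position per event, the internal displacement is modelled as a fixed gain times an external respiratory trace (a crude stand-in for the internal-external correlation model), plain list-mode MLEM replaces MOLAR, and the correction is applied to the event coordinate rather than by deforming the full system matrix. All names and parameter values are invented for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Toy ground truth: two hot spots on a 1-D, 128-voxel grid
n_vox = 128
x_grid = np.arange(n_vox, dtype=float)
truth = np.exp(-0.5 * ((x_grid - 40.0) / 2.0) ** 2) \
        + 0.7 * np.exp(-0.5 * ((x_grid - 85.0) / 2.0) ** 2)

# Simulate list-mode events under quasi-periodic respiratory motion
n_events = 5000
t = rng.uniform(0.0, 60.0, n_events)                  # event time stamps [s]
external = np.sin(2.0 * np.pi * t / 4.0)              # external respiratory trace
disp = 6.0 * external                                 # assumed internal displacement = gain * external trace

probs = truth / truth.sum()
emit_ref = rng.choice(x_grid, size=n_events, p=probs)          # emission sites in the reference frame
measured = emit_ref + disp + rng.normal(0.0, 2.0, n_events)    # detected positions: moved + resolution blur

sigma_psf = 2.0

def psf_row(center):
    # Gaussian "system-matrix row" for an event localised at `center`
    return np.exp(-0.5 * ((x_grid - center) / sigma_psf) ** 2)

def lm_mlem(event_pos, n_iter=20):
    # Plain list-mode MLEM; sensitivity is taken as uniform, so the absolute
    # scale of the reconstruction is arbitrary in this sketch.
    lam = np.ones(n_vox)
    for _ in range(n_iter):
        back = np.zeros(n_vox)
        for xe in event_pos:
            p = psf_row(xe)
            back += p / max(float(p @ lam), 1e-12)
        lam = lam * back / len(event_pos)
    return lam

blurred = lm_mlem(measured)            # no motion correction: counts smeared by breathing
corrected = lm_mlem(measured - disp)   # event-by-event: each event warped to the reference frame

print("peak value, no MoCo: %.3f   with EBE MoCo: %.3f" % (blurred.max(), corrected.max()))

Because every event carries its own displacement, the compensation is continuous in time; a gated (discrete) correction would instead apply one average displacement to all events in a bin.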


Most cited references (29)


          Unified framework for development, deployment and robust testing of neuroimaging algorithms.

Developing both graphical and command-line user interfaces for neuroimaging algorithms requires considerable effort. Neuroimaging algorithms can meet their potential only if they can be easily and frequently used by their intended users. Deployment of a large suite of such algorithms on multiple platforms requires consistency of user interface controls, consistent results across platforms, and thorough testing. We present the design and implementation of a novel object-oriented framework that allows for rapid development of complex image analysis algorithms with many reusable components and the ability to easily add graphical user interface controls. Our framework also allows for simplified yet robust nightly testing of the algorithms to ensure stability and cross-platform interoperability. All of the functionality is encapsulated into a software object requiring no separate source code for user interfaces, testing or deployment. This formulation makes our framework ideal for developing novel, stable and easy-to-use algorithms for medical image analysis and computer-assisted interventions. The framework has been both deployed at Yale and released for public use in the open-source, multi-platform image analysis software BioImage Suite (bioimagesuite.org).
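As a loose illustration of the "single software object" idea, the sketch below shows one class that declares its parameters once and drives both a command-line interface and a regression test from that declaration. This is a generic pattern only, not the BioImage Suite API; the class and method names (SmoothImage, cli, self_test) are hypothetical.

import argparse
import numpy as np
from scipy.ndimage import gaussian_filter

class SmoothImage:
    """Toy algorithm module: Gaussian smoothing of a 3-D volume."""

    # Single parameter declaration reused by the algorithm, the CLI and the test
    PARAMS = {"sigma": (float, 2.0, "Gaussian kernel width in voxels")}

    def run(self, volume, sigma=2.0):
        return gaussian_filter(volume, sigma=sigma)

    def cli(self, argv=None):
        # Command-line front end generated from the parameter declaration
        ap = argparse.ArgumentParser(description=self.__doc__)
        for name, (typ, default, help_txt) in self.PARAMS.items():
            ap.add_argument("--" + name, type=typ, default=default, help=help_txt)
        args = ap.parse_args(argv)
        volume = np.random.rand(16, 16, 16)        # placeholder input volume
        return self.run(volume, **vars(args))

    def self_test(self):
        # Regression test driven by the same object (e.g. for nightly testing)
        vol = np.zeros((8, 8, 8))
        vol[4, 4, 4] = 1.0
        out = self.run(vol, sigma=1.0)
        assert abs(out.sum() - 1.0) < 1e-6         # smoothing preserves total intensity

if __name__ == "__main__":
    SmoothImage().self_test()
    print(SmoothImage().cli([]).shape)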

            Image similarity and tissue overlaps as surrogates for image registration accuracy: widely used but unreliable.

            The accuracy of nonrigid image registrations is commonly approximated using surrogate measures such as tissue label overlap scores, image similarity, image difference, or transformation inverse consistency error. This paper provides experimental evidence that these measures, even when used in combination, cannot distinguish accurate from inaccurate registrations. To this end, we introduce a "registration" algorithm that generates highly inaccurate image transformations, yet performs extremely well in terms of the surrogate measures. Of the tested criteria, only overlap scores of localized anatomical regions reliably distinguish reasonable from inaccurate registrations, whereas image similarity and tissue overlap do not. We conclude that tissue overlap and image similarity, whether used alone or together, do not provide valid evidence for accurate registrations and should thus not be reported or accepted as such.
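For reference, two of the surrogate measures named above can be written down in a few lines. The definitions below are the standard formulas, not code from the cited paper; as the paper argues, high values of either do not by themselves establish registration accuracy.

import numpy as np

def dice(label_a, label_b):
    # Dice overlap of two binary label masks
    a = np.asarray(label_a, dtype=bool)
    b = np.asarray(label_b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def ncc(img_a, img_b):
    # Normalised cross-correlation of two images of equal shape
    a = np.asarray(img_a, dtype=float)
    b = np.asarray(img_b, dtype=float)
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))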

              Respiratory gating in positron emission tomography: a quantitative comparison of different gating schemes.

Respiratory gating is used to reduce the effects of breathing motion in a wide range of applications, from radiotherapy treatment to diagnostic imaging. Different methods are feasible for respiratory gating. In this study, seven gating methods were developed and tested on positron emission tomography (PET) list-mode data, and the results of seven patient studies were compared quantitatively with respect to motion and noise. (1) Equal and (2) variable time-based gating methods use only the timing of the breathing cycle to define respiratory gates. (3) Equal and (4) variable amplitude-based gating approaches use the amplitude of the respiratory signal. (5) Cycle-based amplitude gating is a combination of the time- and amplitude-based techniques. A baseline correction applied to methods (3) and (4) yields two further approaches: baseline-corrected (6) equal and (7) variable amplitude-based gating. List-mode PET data from seven patients were acquired together with a respiratory signal, and images were reconstructed using the seven gating methods. Two parameters were used to quantify the results: motion was measured as the displacement of the heart due to respiration, and noise was defined as the standard deviation of pixel intensities in a background region. The amplitude-based approaches (3) and (4) were superior to the time-based methods (1) and (2); the improvement in capturing the motion was more than 30% (up to 130%) in all subjects. The variable time (2) and amplitude (4) methods gave a more uniform noise distribution across the respiratory gates than the equal time (1) and amplitude (3) methods. Baseline correction did not improve the results. Of the seven respiratory gating approaches, the variable amplitude method (4) captures the respiratory motion best while keeping a constant noise level across all respiratory phases.
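A minimal sketch of two of these schemes, equal time-based gating (method 1) and equal amplitude-based gating (method 3), applied to a synthetic respiratory trace is shown below. It is an illustrative reimplementation under simplified assumptions, not the authors' code; the variable (equal-count) variants would replace the equal-width bins with quantiles of phase or amplitude.

import numpy as np

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 300.0, 50000))           # list-mode event times [s]
period = 4.0                                          # assumed breathing period [s]
amp = 0.5 * (1.0 - np.cos(2.0 * np.pi * t / period))  # respiratory amplitude in [0, 1]

n_gates = 5

# (1) Equal time-based gating: split each breathing cycle into equal time bins
phase = (t % period) / period                         # phase within the cycle, in [0, 1)
time_gate = np.minimum((phase * n_gates).astype(int), n_gates - 1)

# (3) Equal amplitude-based gating: split the amplitude range into equal bins
edges = np.linspace(amp.min(), amp.max(), n_gates + 1)
amp_gate = np.clip(np.digitize(amp, edges) - 1, 0, n_gates - 1)

# Amplitude gating groups events by displacement, so residual motion within a
# gate tends to be smaller; time gating gives more uniform counts per gate here.
for g in range(n_gates):
    print(g, int(np.sum(time_gate == g)), int(np.sum(amp_gate == g)))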

                Author and article information

Journal
IEEE Transactions on Medical Imaging (IEEE Trans. Med. Imaging)
Institute of Electrical and Electronics Engineers (IEEE)
ISSN: 0278-0062 (print); 1558-254X (electronic)
February 2018
Volume 37, Issue 2, Pages 504-515

Article
DOI: 10.1109/TMI.2017.2761756
PMC: 7304524
PMID: 29028189
ScienceOpen ID: 6055d15b-acce-49a3-ba3a-b27447f5f047
© 2018
