      Medical Image Computing and Computer Assisted Intervention – MICCAI 2020: 23rd International Conference, Lima, Peru, October 4–8, 2020, Proceedings, Part III 

      Interacting with Medical Volume Data in Projective Augmented Reality

      Springer International Publishing

Most cited references (20)

          Augmented reality for anatomical education.

          The use of Virtual Environments has been widely reported as a method of teaching anatomy. Generally such environments only convey the shape of the anatomy to the student. We present the Bangor Augmented Reality Education Tool for Anatomy (BARETA), a system that combines Augmented Reality (AR) technology with models produced using Rapid Prototyping (RP) technology, to provide the student with stimulation for touch as well as sight. The principal aims of this work were to provide an interface more intuitive than a mouse and keyboard, and to evaluate such a system as a viable supplement to traditional cadaver based education.

            Advanced 3D visualization in student-centred medical education.

Healthcare students have difficulties achieving a conceptual understanding of 3D anatomy, and misconceptions about physiological phenomena are persistent and hard to address. 3D visualization has improved the possibilities of facilitating understanding of complex phenomena. A project was carried out in which high-quality 3D visualizations using high-resolution CT and MR images from clinical research were developed for educational use. Instead of standard stacks of slices (original or multiplanar reformatted), volume-rendering images in the QuickTime VR format, which enables students to interact intuitively, were included. Based on learning theories underpinning problem-based learning, the 3D visualizations were implemented in the existing curricula of the medical and physiotherapy programs. The images/films were used in lectures, demonstrations and tutorial sessions, and self-study material was also developed. The aim was to support learning efficacy by developing and using 3D datasets in regular health care curricula and to enhance knowledge about the possible educational value of 3D visualizations in learning anatomy and physiology. Questionnaires were used to investigate the medical and physiotherapy students' opinions about the different formats of visualization and their learning experiences. The 3D images/films stimulated the students' will to understand more and helped them gain insight into biological variation and the different organs' size, spatial extent and relation to each other. The virtual dissections gave a clearer picture than ordinary dissections, and the possibility to turn structures around was instructive. 3D visualizations based on authentic, viable material point towards a new dimension of learning material in anatomy, physiology and probably also pathophysiology. It was successful to implement 3D images in already existing themes in the educational programs. The results show that deeper knowledge is required about students' interpretation of images/films in relation to learning outcomes, and there is also a need for preparation and facilitation principles connected to the use of 3D visualizations.

              Augmented reality in laparoscopic surgical oncology.

Minimally invasive surgery represents one of the main evolutions of surgical techniques aimed at providing a greater benefit to the patient. However, minimally invasive surgery increases the operative difficulty, since depth perception is usually dramatically reduced, the field of view is limited and the sense of touch is transmitted through an instrument. These drawbacks can currently be reduced by computer technology guiding the surgical gesture. Indeed, from a patient's medical image (US, CT or MRI), Augmented Reality (AR) can increase the surgeon's intra-operative vision by providing a virtual transparency of the patient. AR is based on two main processes: the 3D visualization of the anatomical or pathological structures appearing in the medical image, and the registration of this visualization on the real patient. 3D visualization can be performed directly from the medical image without the need for a pre-processing step thanks to volume rendering, but better results are obtained with surface rendering after organ and pathology delineation and 3D modelling. Registration can be performed interactively or automatically. Several interactive systems have been developed and applied to humans, demonstrating the benefit of AR in surgical oncology; they also show the currently limited interactivity due to soft-organ movement and interaction between surgical instruments and organs. Although current automatic AR systems show the feasibility of such an approach, they still rely on specific and expensive equipment that is not available in clinical routine. Moreover, they are not robust enough, owing to the high complexity of developing a real-time registration that takes organ deformation and human movement into account. However, the latest results of automatic AR systems are extremely encouraging and show that AR will become a standard requirement for future computer-assisted surgical oncology. In this article, we will explain the concept of AR and its principles. Then, we will review the existing interactive and automatic AR systems in digestive surgical oncology, highlighting their benefits and limitations. Finally, we will discuss the future evolutions and the issues that still have to be tackled so that this technology can be seamlessly integrated into the operating room.
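The two-step process described in this abstract (3D visualization of structures from the pre-operative image, followed by registration of that visualization onto the real patient) can be illustrated with a small sketch. The code below is not from the chapter or from the cited review; it is a minimal, assumed example of the registration step only, using rigid point-based alignment (the Kabsch/SVD method), and it presumes that corresponding landmarks have already been identified in the CT volume and measured on the patient. All function and variable names are hypothetical.

# Illustrative sketch only: rigid point-based registration of pre-operative
# image landmarks onto patient-space landmarks (the "registration" step the
# abstract refers to). Not from the source; all names are hypothetical.
import numpy as np

def rigid_register(source_pts: np.ndarray, target_pts: np.ndarray):
    """Return R, t such that R @ p + t maps source points onto target points.

    source_pts, target_pts: (N, 3) arrays of corresponding landmarks,
    e.g. fiducials picked in the CT volume and the same fiducials
    tracked on the patient.
    """
    src_centroid = source_pts.mean(axis=0)
    tgt_centroid = target_pts.mean(axis=0)
    P = source_pts - src_centroid
    Q = target_pts - tgt_centroid

    # Cross-covariance and SVD give the least-squares optimal rotation (Kabsch).
    H = P.T @ Q
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_centroid - R @ src_centroid
    return R, t

# Toy check: recover a known 30-degree rotation plus translation.
rng = np.random.default_rng(0)
pts = rng.normal(size=(10, 3))
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
moved = pts @ R_true.T + np.array([5.0, -2.0, 1.0])
R_est, t_est = rigid_register(pts, moved)
assert np.allclose(R_est, R_true) and np.allclose(t_est, [5.0, -2.0, 1.0])

In practice, the clinical systems discussed in the abstract go well beyond such a rigid landmark alignment: they must handle deformable organs and real-time tracking, which is exactly the robustness problem highlighted above.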

                Author and book information

Book Chapter
Published: September 29, 2020
Pages: 429–439
DOI: 10.1007/978-3-030-59716-0_41
