
      The Importance of Human–Computer Interaction in Radiology E-learning

Research article


          Abstract

With the development of cross-sectional imaging techniques and the transition to digital reading of radiological images, e-learning might be a promising tool in undergraduate radiology education. In this systematic review of the literature, we evaluate the emergence of image interaction possibilities in radiology e-learning programs and the evidence for effects of radiology e-learning on learning outcomes, as well as the perspectives of medical students and teachers. A systematic search in PubMed, EMBASE, Cochrane, ERIC, and PsycInfo was performed. Articles were screened by two authors and included when they concerned the evaluation of radiological e-learning tools for undergraduate medical students. Nineteen articles were included. Seven studies evaluated e-learning programs with image interaction possibilities. Students perceived e-learning with image interaction possibilities as a useful addition to learning with hard-copy images and as effective for learning 3D anatomy. E-learning programs both with and without image interaction possibilities were found to improve radiological knowledge and skills. In general, students found e-learning programs easy to use, rated image quality highly, and found the difficulty level of the courses appropriate. Furthermore, they felt that their knowledge and understanding of radiology improved by using e-learning. In conclusion, adding radiology e-learning to undergraduate medical education can improve radiological knowledge and image interpretation skills. Differences between the effects of e-learning with and without image interaction possibilities on learning outcomes are unknown and should be the subject of future research.


Most cited references (27)


          Association between funding and quality of published medical education research.

Methodological shortcomings in medical education research are often attributed to insufficient funding, yet an association between funding and study quality has not been established. The objective was to develop and evaluate an instrument for measuring the quality of education research studies and to assess the relationship between funding and study quality. Internal consistency, interrater and intrarater reliability, and criterion validity were determined for a 10-item medical education research study quality instrument (MERSQI), which was applied to 210 medical education research studies published in 13 peer-reviewed journals between September 1, 2002, and December 31, 2003. The amount of funding obtained per study and the publication record of the first author were determined by survey. The main outcome measures were study quality as measured by the MERSQI (potential maximum total score, 18; maximum domain score, 3), the amount of funding per study, and previous publications by the first author. The mean MERSQI score was 9.95 (SD, 2.34; range, 5-16). Mean domain scores were highest for data analysis (2.58) and lowest for validity (0.69). Intraclass correlation coefficient ranges for interrater and intrarater reliability were 0.72 to 0.98 and 0.78 to 0.998, respectively. Total MERSQI scores were associated with expert quality ratings (Spearman rho, 0.73; 95% confidence interval [CI], 0.56-0.84; P < .001), 3-year citation rate (0.8 increase in score per 10 citations; 95% CI, 0.03-1.30; P = .003), and journal impact factor (1.0 increase in score per 6-unit increase in impact factor; 95% CI, 0.34-1.56; P = .003). In multivariate analysis, MERSQI scores were independently associated with study funding of $20,000 or more (0.95 increase in score; 95% CI, 0.22-1.86; P = .045) and previous medical education publications by the first author (1.07 increase in score per 20 publications; 95% CI, 0.15-2.23; P = .047). In conclusion, the quality of published medical education research is associated with study funding.

            eLearning: a review of Internet-based continuing medical education.

The objective was to review the effect of Internet-based continuing medical education (CME) interventions on physician performance and health care outcomes. Data sources included searches of MEDLINE (1966 to January 2004), CINAHL (1982 to December 2003), ACP Journal Club (1991 to July/August 2003), and the Cochrane Database of Systematic Reviews (third quarter, 2003). Studies were included in the analyses if they were randomized controlled trials of Internet-based education in which participants were practicing health care professionals or health professionals in training. CME interventions were categorized according to the nature of the intervention, sample size, and other information about educational content and format. Sixteen studies met the eligibility criteria. Six studies generated positive changes in participant knowledge over traditional formats; only three studies showed a positive change in practices. The remainder of the studies showed no difference in knowledge levels between Internet-based interventions and traditional formats for CME. The results demonstrate that Internet-based CME programs are just as effective in imparting knowledge as traditional formats of CME. Little is known as to whether these positive changes in knowledge are translated into changes in practice. Subjective reports of change in physician behavior should be confirmed through chart review or other objective measures. Additional studies need to be performed to assess how long these newly learned behaviors can be sustained. eLearning will continue to evolve as new innovations and more interactive modes are incorporated into learning.

              The Kirkpatrick model: A useful tool for evaluating training outcomes.

              Services employing staff to support people with disability usually provide training in a range of areas including communication and managing challenging behaviour. Given that such training can be costly and time-consuming, it is important to evaluate the evidence presented in support of such programs. Efficacy in clinical practice is measured using evidence-based practice. However, there is currently no model that is widely used to compare and evaluate training programs despite the large number of training programs reported each year.

                Author and article information

                Contributors
+31 88 755 6687, a.m.denharder@umcutrecht.nl
Journal
Journal of Digital Imaging (J Digit Imaging)
Springer International Publishing (Cham)
ISSN (print): 0897-1889
ISSN (electronic): 1618-727X
Published online: 13 October 2015
Issue date: April 2016
Volume 29, Issue 2, pages 195-205
                Affiliations
Department of Radiology, Utrecht University Medical Center, P.O. Box 85500, E01.132, 3508 GA Utrecht, The Netherlands
Center for Research and Development of Education, University Medical Center Utrecht, Utrecht, The Netherlands
Article
DOI: 10.1007/s10278-015-9828-y
PMCID: PMC4788615
PMID: 26464115
                © The Author(s) 2015

                Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

                Categories
                Article
                Custom metadata
                © Society for Imaging Informatics in Medicine 2016

                Radiology & Imaging
                human–computer interaction, e-learning, radiology, education
