      Open Access

      Expediting systematic reviews: methods and implications of rapid reviews

      Implementation Science
      Springer Science and Business Media LLC


          Abstract

          Background: Policy makers and others often require synthesis of knowledge in an area within six months or less. Traditional systematic reviews typically take at least 12 months to conduct. Rapid reviews streamline traditional systematic review methods to synthesize evidence within a shortened timeframe, but there is great variation in how they are conducted. This review sought to examine the methods used for rapid reviews, as well as the implications of methodological streamlining in terms of rigour, bias, and results.

          Methods: A comprehensive search strategy, including five electronic databases, grey literature, hand searching of relevant journals, and contacting key informants, was undertaken. All titles and abstracts (n = 1,989) were reviewed independently by two reviewers. Relevance criteria included articles published between 1995 and 2009 about conducting rapid reviews or comparing rapid reviews with traditional reviews. Full articles were retrieved for any titles deemed relevant by either reviewer (n = 70). Data were extracted from all relevant methodological articles (n = 45) and from exemplars of rapid review methods (n = 25).

          Results: Rapid reviews took from three weeks to six months to complete, and various methods for speeding up the process were employed. Some limited searching by years, databases, language, and sources beyond electronic searches. Several used a single reviewer for the title and abstract screening, full-text review, methodological quality assessment, and/or data extraction phases. Within rapid review studies, accelerating data extraction may lead to missing relevant information, and biases may be introduced by shortened timeframes for literature searching, article retrieval, and appraisal.

          Conclusions: This review examined the continuum between diverse rapid review methods and traditional systematic reviews, along with the potential implications of streamlined review methods. More rapid reviews need to be published in the peer-reviewed literature, with an emphasis on articulating the methods employed. While one consistent methodological approach may not be optimal or appropriate, it is important that researchers undertaking reviews along the rapid-to-systematic continuum provide detailed descriptions of the methods used and discuss the implications of their chosen methods in terms of the bias that may be introduced. Further research comparing full systematic reviews with rapid reviews will enhance understanding of the limitations of these methods.
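          The screening step described in the abstract (retrieve the full text whenever either of the two independent reviewers judges a title or abstract relevant) can be sketched as a short program. This is purely illustrative: the record fields, relevance checks, and sample data below are hypothetical and are not taken from the review.

```python
# Illustrative sketch of the "either reviewer" screening rule described in the
# abstract; records, field names, and relevance checks are hypothetical.

def screen(records, reviewer_a, reviewer_b):
    """Retain a record for full-text review if either independent reviewer marks it relevant."""
    return [r for r in records if reviewer_a(r) or reviewer_b(r)]

def reviewer_a(record):
    # Hypothetical relevance criteria mirroring the abstract: published 1995-2009
    # and about rapid-review methods.
    return 1995 <= record["year"] <= 2009 and "rapid review" in record["topic"]

def reviewer_b(record):
    # A slightly different (also hypothetical) judgement, to show that only one
    # reviewer needs to flag a record for it to move to full-text review.
    return 1995 <= record["year"] <= 2009 and record["topic"] in {
        "rapid review methods", "rapid versus systematic reviews"}

if __name__ == "__main__":
    sample = [
        {"year": 2005, "topic": "rapid review methods"},    # retained
        {"year": 1992, "topic": "rapid review methods"},    # excluded: outside 1995-2009
        {"year": 2008, "topic": "cohort study design"},     # excluded: judged irrelevant
    ]
    print(len(screen(sample, reviewer_a, reviewer_b)))      # -> 1
```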


          Most cited references (76)


          Systematic reviews in health care: Assessing the quality of controlled clinical trials.


            Does quality of reports of randomised trials affect estimates of intervention efficacy reported in meta-analyses?

             Few meta-analyses of randomised trials assess the quality of the studies included. Yet there is increasing evidence that trial quality can affect estimates of intervention efficacy. We investigated whether different methods of quality assessment provide different estimates of intervention efficacy evaluated in randomised controlled trials (RCTs). We randomly selected 11 meta-analyses that involved 127 RCTs on the efficacy of interventions used for circulatory and digestive diseases, mental health, and pregnancy and childbirth. We replicated all the meta-analyses using published data from the primary studies. The quality of reporting of all 127 clinical trials was assessed by means of component and scale approaches. To explore the effects of quality on the quantitative results, we examined the effects of different methods of incorporating quality scores (sensitivity analysis and quality weights) on the results of the meta-analyses. The quality of trials was low. Masked assessments provided significantly higher scores than unmasked assessments (mean 2.74 [SD 1.10] vs 2.55 [1.20]). Low-quality trials (score ≤2), compared with high-quality trials (score >2), were associated with an increased estimate of benefit of 34% (ratio of odds ratios [ROR] 0.66 [95% CI 0.52-0.83]). Trials that used inadequate allocation concealment, compared with those that used adequate methods, were also associated with an increased estimate of benefit (37%; ROR=0.63 [0.45-0.88]). The average treatment benefit was 39% (odds ratio [OR] 0.61 [0.57-0.65]) for all trials, 52% (OR 0.48 [0.43-0.54]) for low-quality trials, and 29% (OR 0.71 [0.65-0.77]) for high-quality trials. Use of all the trial scores as quality weights reduced the effects to 35% (OR 0.65 [0.59-0.71]) and resulted in the least statistical heterogeneity. Studies of low methodological quality in which the estimate of quality is incorporated into the meta-analyses can alter the interpretation of the benefit of intervention, whether a scale or component approach is used in the assessment of trial quality.
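             As a rough illustration of the ratio-of-odds-ratios (ROR) arithmetic reported above (not the authors' actual analysis), a pooled odds ratio of about 0.47 for low-quality trials against about 0.72 for high-quality trials gives ROR ≈ 0.65, i.e. low-quality trials overstate the apparent benefit by roughly a third. The sketch below uses made-up study data and simple fixed-effect inverse-variance pooling, with an optional quality-weighted variant; all numbers and the weighting scheme are assumptions for illustration only.

```python
import math

def pooled_odds_ratio(studies, quality_weighted=False):
    """Fixed-effect inverse-variance pooling of log odds ratios.

    Each study is (odds_ratio, se_of_log_or, quality_score). If
    quality_weighted is True, each inverse-variance weight is multiplied
    by the study's quality score (one simple way to incorporate quality).
    """
    num = den = 0.0
    for odds_ratio, se, quality in studies:
        weight = 1.0 / se ** 2
        if quality_weighted:
            weight *= quality
        num += weight * math.log(odds_ratio)
        den += weight
    return math.exp(num / den)

# Hypothetical trials: (odds ratio, SE of log OR, quality score out of 5)
low_quality = [(0.45, 0.20, 2), (0.50, 0.25, 1)]
high_quality = [(0.70, 0.15, 4), (0.75, 0.18, 5)]

or_low = pooled_odds_ratio(low_quality)
or_high = pooled_odds_ratio(high_quality)
ror = or_low / or_high  # ratio of odds ratios: <1 means low-quality trials show more apparent benefit
print(round(or_low, 2), round(or_high, 2), round(ror, 2))  # -> 0.47 0.72 0.65
```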

              Bias in location and selection of studies.


                Author and article information

                Journal: Implementation Science (Implementation Sci)
                Publisher: Springer Science and Business Media LLC
                ISSN: 1748-5908
                Publication dates: 19 July 2010 (online); December 2010 (issue)
                Volume: 5, Issue: 1
                DOI: 10.1186/1748-5908-5-56
                Copyright: © 2010
                Text and data mining policy: http://www.springer.com/tdm

