How to Do a Systematic Review: A Best Practice Guide for Conducting and Reporting Narrative Reviews, Meta-Analyses, and Meta-Syntheses

Andy P. Siddaway 1, Alex M. Wood 2, Larry V. Hedges 3

Annual Review of Psychology

Annual Reviews


          Abstract

          Systematic reviews are characterized by a methodical and replicable methodology and presentation. They involve a comprehensive search to locate all relevant published and unpublished work on a subject; a systematic integration of search results; and a critique of the extent, nature, and quality of evidence in relation to a particular research question. The best reviews synthesize studies to draw broad theoretical conclusions about what a literature means, linking theory to evidence and evidence to theory. This guide describes how to plan, conduct, organize, and present a systematic review of quantitative (meta-analysis) or qualitative (narrative review, meta-synthesis) information. We outline core standards and principles and describe commonly encountered problems. Although this guide targets psychological scientists, its high level of abstraction makes it potentially relevant to any subject area or discipline. We argue that systematic reviews are a key methodology for clarifying whether and how research findings replicate and for explaining possible inconsistencies, and we call for researchers to conduct systematic reviews to help elucidate whether there is a replication crisis.
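As a minimal illustration of the quantitative (meta-analytic) synthesis the abstract refers to, the sketch below pools hypothetical study-level effect sizes with fixed-effect, inverse-variance weights. The numbers and variable names are placeholders for illustration, not material from the article.

```python
import numpy as np
from scipy import stats

# Hypothetical study-level effect sizes (e.g., standardized mean differences)
# and their standard errors; illustrative placeholders only.
effects = np.array([0.30, 0.45, 0.12, 0.50])
se = np.array([0.12, 0.15, 0.10, 0.20])

# Fixed-effect (inverse-variance) pooling.
weights = 1.0 / se**2
pooled = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))

# 95% confidence interval and two-sided z-test for the pooled effect.
z = pooled / pooled_se
ci_low, ci_high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
p_value = 2 * (1 - stats.norm.cdf(abs(z)))

print(f"Pooled effect = {pooled:.3f}, 95% CI [{ci_low:.3f}, {ci_high:.3f}], p = {p_value:.4f}")
```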

Most cited references


          Rothstein HR


            The hazards of scoring the quality of clinical trials for meta-analysis.

             Peter Jüni (1999)
Although it is widely recommended that clinical trials undergo some type of quality review, the number and variety of quality assessment scales that exist make it unclear how to achieve the best assessment. The objective was to determine whether the type of quality assessment scale used affects the conclusions of meta-analytic studies. The study was a meta-analysis of 17 trials comparing low-molecular-weight heparin (LMWH) with standard heparin for prevention of postoperative thrombosis, using 25 different scales to identify high-quality trials; the associations between treatment effect and summary quality scores, and with 3 key domains (concealment of treatment allocation, blinding of outcome assessment, and handling of withdrawals), were examined in regression models. The main outcome was the pooled relative risk of deep vein thrombosis with LMWH vs standard heparin in high-quality vs low-quality trials as determined by the 25 quality scales. Pooled relative risks from high-quality trials ranged from 0.63 (95% confidence interval [CI], 0.44-0.90) to 0.90 (95% CI, 0.67-1.21), vs 0.52 (95% CI, 0.24-1.09) to 1.13 (95% CI, 0.70-1.82) for low-quality trials. For 6 scales, relative risks of high-quality trials were close to unity, indicating that LMWH was not significantly superior to standard heparin, whereas low-quality trials showed better protection with LMWH (P < .05). Seven scales showed the opposite pattern: high-quality trials showed an effect whereas low-quality trials did not. For the remaining 12 scales, effect estimates were similar in the 2 quality strata. In regression analysis, summary quality scores were not significantly associated with treatment effects, and there was no significant association of treatment effects with allocation concealment or handling of withdrawals. Open outcome assessment, however, influenced effect size, with the effect of LMWH being exaggerated by 35% on average (95% CI, 1%-57%; P = .046). Our data indicate that the use of summary scores to identify trials of high quality is problematic; relevant methodological aspects should be assessed individually and their influence on effect sizes explored.
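To make the pooled relative risks discussed above concrete, here is a small sketch of one standard way such stratified summaries are computed: per-trial relative risks are pooled on the log scale with inverse-variance weights. The counts are hypothetical placeholders, not data from the 17 heparin trials.

```python
import numpy as np

def pooled_rr(events_t, n_t, events_c, n_c):
    """Inverse-variance pooled relative risk on the log scale.

    Inputs are arrays of per-trial counts: events and sample size in the
    treatment (LMWH) and control (standard heparin) arms. The values passed
    below are hypothetical placeholders."""
    rr = (events_t / n_t) / (events_c / n_c)
    log_rr = np.log(rr)
    # Approximate variance of log(RR) for each trial.
    var_log_rr = 1 / events_t - 1 / n_t + 1 / events_c - 1 / n_c
    w = 1 / var_log_rr
    pooled = np.exp(np.sum(w * log_rr) / np.sum(w))
    se = np.sqrt(1 / np.sum(w))
    ci = np.exp(np.log(pooled) + np.array([-1.0, 1.0]) * 1.96 * se)
    return pooled, ci

# Pool one hypothetical 'high-quality' stratum of two trials.
print(pooled_rr(np.array([8.0, 12.0]), np.array([100.0, 150.0]),
                np.array([15.0, 20.0]), np.array([100.0, 150.0])))
```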

              The power of statistical tests in meta-analysis.

              Calculations of the power of statistical tests are important in planning research studies (including meta-analyses) and in interpreting situations in which a result has not proven to be statistically significant. The authors describe procedures to compute statistical power of fixed- and random-effects tests of the mean effect size, tests for heterogeneity (or variation) of effect size parameters across studies, and tests for contrasts among effect sizes of different studies. Examples are given using 2 published meta-analyses. The examples illustrate that statistical power is not always high in meta-analysis.
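As a rough sketch of the first procedure described here, the power of the fixed-effect test of the mean effect size, the code below uses the usual noncentral z approximation. The per-study variances and the postulated true effect are assumptions for illustration, not the article's published examples.

```python
import numpy as np
from scipy import stats

def power_fixed_effect(true_mean_effect, variances, alpha=0.05):
    """Power of the two-sided fixed-effect z-test that the mean effect size is zero.

    `variances` are hypothetical sampling variances of each study's effect size;
    the numbers used below are illustrative only."""
    weights = 1.0 / np.asarray(variances, dtype=float)
    se_pooled = np.sqrt(1.0 / weights.sum())   # standard error of the weighted mean effect
    ncp = true_mean_effect / se_pooled         # noncentrality parameter of the z statistic
    z_crit = stats.norm.ppf(1 - alpha / 2)
    return (1 - stats.norm.cdf(z_crit - ncp)) + stats.norm.cdf(-z_crit - ncp)

# Example: 10 studies, each with sampling variance 0.04, postulated true effect 0.2.
print(round(power_fixed_effect(0.2, [0.04] * 10), 3))
```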

                Author and article information

Journal: Annual Review of Psychology (Annu. Rev. Psychol.)
Publisher: Annual Reviews
ISSN: 0066-4308 (print); 1545-2085 (electronic)
Publication date: January 04 2019
Volume: 70, Issue: 1, Pages: 747-770
Affiliations:
[1] Behavioural Science Centre, Stirling Management School, University of Stirling, Stirling FK9 4LA, United Kingdom
[2] Department of Psychological and Behavioural Science, London School of Economics and Political Science, London WC2A 2AE, United Kingdom
[3] Department of Statistics, Northwestern University, Evanston, Illinois 60208, USA
DOI: 10.1146/annurev-psych-010418-102803
© 2019
