
      An experimental search strategy retrieves more precise results than PubMed and Google for questions about medical interventions

Research article


          Abstract

          Objective. We compared the precision of a search strategy designed specifically to retrieve randomized controlled trials (RCTs) and systematic reviews of RCTs with search strategies designed for broader purposes.

Methods. We designed an experimental search strategy that automatically revised searches up to five times by using increasingly restrictive queries as long as at least 50 citations were retrieved. We compared the ability of the experimental and alternative strategies to retrieve studies relevant to 312 test questions. The primary outcome, search precision, was defined for each strategy as the proportion of relevant, high quality citations among the first 50 citations retrieved.
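The revision loop and the precision outcome described above can be sketched in a few lines of code. The snippet below is a minimal illustration only, not the authors' implementation: `run_query` is a hypothetical stand-in for whatever search backend was used, and the ordering of `queries` from broadest to most restrictive is an assumption. Only the stopping rule (revise up to five times, and only while at least 50 citations are returned) and the precision definition come from the abstract.

```python
from typing import Callable, List, Set

def iterative_search(queries: List[str],
                     run_query: Callable[[str], List[str]],
                     min_hits: int = 50,
                     max_revisions: int = 5) -> List[str]:
    """Sketch of the experimental strategy's revision loop: start with the
    broadest query and move to increasingly restrictive ones, revising at
    most `max_revisions` times and only while the current result set still
    holds at least `min_hits` citations. Returns the last result set run."""
    results = run_query(queries[0])
    revisions = 0
    for query in queries[1:]:
        if revisions >= max_revisions or len(results) < min_hits:
            break
        results = run_query(query)
        revisions += 1
    return results

def precision_at_50(results: List[str], relevant_high_quality: Set[str]) -> float:
    """Primary outcome: proportion of relevant, high quality citations
    among the first 50 citations retrieved."""
    top = results[:50]
    if not top:
        return 0.0
    return sum(1 for cid in top if cid in relevant_high_quality) / len(top)
```

In practice, `run_query` would wrap a bibliographic search API and the narrower queries would add methodological filters, but neither detail is specified in the abstract.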

          Results. The experimental strategy had the highest median precision (5.5%; interquartile range [IQR]: 0%–12%) followed by the narrow strategy of the PubMed Clinical Queries (4.0%; IQR: 0%–10%). The experimental strategy found the most high quality citations (median 2; IQR: 0–6) and was the strategy most likely to find at least one high quality citation (73% of searches; 95% confidence interval 68%–78%). All comparisons were statistically significant.
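As a quick arithmetic check on the last figure, a simple normal-approximation (Wald) interval around 73% of the 312 searches reproduces the reported 68%–78% band. The abstract does not state which interval method the authors used, so this is only an illustrative calculation.

```python
import math

# Illustrative check of the reported 95% CI for the proportion of searches
# (73% of 312 test questions) that found at least one high quality citation.
# A Wald interval is assumed here; the authors' exact method is not stated.
n = 312
p = 0.73
se = math.sqrt(p * (1 - p) / n)
print(f"95% CI: {p - 1.96 * se:.1%} to {p + 1.96 * se:.1%}")  # ~68.1% to 77.9%
```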

Conclusions. The experimental strategy performed the best in all outcomes, although all strategies had low precision.


Most cited references (38)


          R: A Language and Environment for Statistical Computing.


            Rationale for systematic reviews.

            C D Mulrow (1994)
            Systematic literature reviews including meta-analyses are invaluable scientific activities. The rationale for such reviews is well established. Health care providers, researchers, and policy makers are inundated with unmanageable amounts of information; they need systematic reviews to efficiently integrate existing information and provide data for rational decision making. Systematic reviews establish whether scientific findings are consistent and can be generalised across populations, settings, and treatment variations, or whether findings vary significantly by particular subsets. Meta-analyses in particular can increase power and precision of estimates of treatment effects and exposure risks. Finally, explicit methods used in systematic reviews limit bias and, hopefully, will improve reliability and accuracy of conclusions.

              Optimal search strategies for retrieving scientifically strong studies of treatment from Medline: analytical survey.

Objective. To develop and test optimal Medline search strategies for retrieving sound clinical studies on prevention or treatment of health disorders. Design. Analytical survey. Setting. 161 clinical journals indexed in Medline for the year 2000. Main outcome measures. Sensitivity, specificity, precision, and accuracy of 4862 unique terms in 18 404 combinations. Results. Only 1587 (24.2%) of 6568 articles on treatment met criteria for testing clinical interventions. Combinations of search terms reached peak sensitivities of 99.3% (95% confidence interval 98.7% to 99.8%) at a specificity of 70.4% (69.8% to 70.9%). Compared with best single terms, best multiple terms increased sensitivity for sound studies by 4.1% (absolute increase), but with substantial loss of specificity (absolute difference 23.7%) when sensitivity was maximised. When terms were combined to maximise specificity, 97.4% (97.3% to 97.6%) was achieved, about the same as that achieved by the best single term (97.6%, 97.4% to 97.7%). The strategies newly reported in this paper outperformed other validated search strategies except for two strategies that had slightly higher specificity (98.1% and 97.6% v 97.4%) but lower sensitivity (42.0% and 92.8% v 93.1%). Conclusions. New empirical search strategies have been validated to optimise retrieval from Medline of articles reporting high quality clinical studies on prevention or treatment of health disorders.
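For readers unfamiliar with how such search filters are scored, the sketch below gives the standard definitions of sensitivity, specificity, and precision for a strategy run against a set of articles whose methodological soundness is already known. The function and variable names are illustrative and not taken from the study.

```python
from typing import Dict, Set

def filter_performance(retrieved: Set[str], sound: Set[str], corpus: Set[str]) -> Dict[str, float]:
    """Standard retrieval metrics for a Medline search filter:
    sensitivity = sound articles retrieved / all sound articles
    specificity = unsound articles excluded / all unsound articles
    precision   = sound articles retrieved / all articles retrieved
    """
    unsound = corpus - sound
    true_pos = retrieved & sound
    true_neg = unsound - retrieved
    return {
        "sensitivity": len(true_pos) / len(sound) if sound else 0.0,
        "specificity": len(true_neg) / len(unsound) if unsound else 0.0,
        "precision": len(true_pos) / len(retrieved) if retrieved else 0.0,
    }
```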

                Author and article information

                Contributors
Journal: PeerJ
Publisher: PeerJ Inc. (San Francisco, USA)
ISSN: 2167-8359
Published: 23 April 2015
Volume: 3
Article: e913
Affiliations
[1] Department of Internal Medicine, Kansas University School of Medicine - Wichita, Wichita, KS, USA
[2] Katy Campus Library, Houston Community College, Houston, TX, USA
[3] No institutional affiliation, Birmingham, AL, USA
[4] School of Information, University of Texas at Austin, Austin, TX, USA
Article
DOI: 10.7717/peerj.913
PMCID: PMC4411517
PMID: 25922798
                © 2015 Badgett et al.

                This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, reproduction and adaptation in any medium and for any purpose provided that it is properly attributed. For attribution, the original author(s), title, publication source (PeerJ) and either DOI or URL of the article must be cited.

History
Received: 21 November 2014
Accepted: 5 April 2015
                Funding
                The authors declare there was no funding for this work.
                Categories
                Evidence Based Medicine
                Science and Medical Education
                Statistics
                Computational Science

Keywords: information retrieval, evidence-based medicine, Google, PubMed
