      Retrieving Clinical Evidence: A Comparison of PubMed and Google Scholar for Quick Clinical Searches


          Abstract

          Background

          Physicians frequently search PubMed for information to guide patient care. More recently, Google Scholar has gained popularity as another freely accessible bibliographic database.

          Objective

          To compare the performance of searches in PubMed and Google Scholar.

          Methods

We surveyed nephrologists (kidney specialists) and provided each with a unique clinical question derived from 100 renal therapy systematic reviews. Each physician provided the search terms they would type into a bibliographic database to locate evidence to answer the clinical question. We executed each of these searches in PubMed and Google Scholar and compared results for the first 40 records retrieved (equivalent to 2 default search pages in PubMed). We evaluated the recall (proportion of relevant articles found) and precision (proportion of retrieved articles that were relevant) of the searches performed in PubMed and Google Scholar. Primary studies included in the systematic reviews served as the reference standard for relevant articles. We further documented whether relevant articles were available as free full text.
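To make the recall and precision calculations concrete, here is a minimal Python sketch of the evaluation described above. The function and identifier names are illustrative assumptions, not the study's actual code.

```python
# Illustrative sketch (not the authors' code): recall and precision for
# the first 40 records returned by a search, using the systematic
# review's included primary studies as the reference standard.

def evaluate_search(retrieved_ids, reference_standard_ids, cutoff=40):
    """Return (recall, precision) for the top-`cutoff` retrieved records.

    retrieved_ids: ranked list of record identifiers returned by the search.
    reference_standard_ids: set of identifiers of the primary studies
        included in the systematic review (the "relevant" articles).
    """
    top = retrieved_ids[:cutoff]
    relevant_found = [r for r in top if r in reference_standard_ids]
    recall = len(relevant_found) / len(reference_standard_ids)   # proportion of relevant articles found
    precision = len(relevant_found) / len(top) if top else 0.0   # proportion of retrieved articles that are relevant
    return recall, precision

# Hypothetical example: a review with 10 included studies, 3 of which
# appear among the first 40 records of a search.
retrieved = ["pmid_%d" % i for i in range(1, 41)]
reference = {"pmid_2", "pmid_15", "pmid_33"} | {"x_%d" % i for i in range(7)}
print(evaluate_search(retrieved, reference))  # -> (0.3, 0.075)
```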

          Results

          Compared with PubMed, the average search in Google Scholar retrieved twice as many relevant articles (PubMed: 11%; Google Scholar: 22%; P<.001). Precision was similar in both databases (PubMed: 6%; Google Scholar: 8%; P=.07). Google Scholar provided significantly greater access to free full-text publications (PubMed: 5%; Google Scholar: 14%; P<.001).
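The abstract reports P values for these comparisons but does not name the statistical test. One plausible approach, shown purely as an assumed illustration, is a paired nonparametric comparison of per-search recall across the 100 searches; all numbers below are simulated, not the study's data.

```python
# Assumed analysis sketch (the actual test is not stated in this
# abstract): a Wilcoxon signed-rank test comparing per-search recall in
# PubMed vs Google Scholar across 100 paired searches. Data are simulated.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
recall_pubmed = rng.beta(2, 16, size=100)   # simulated recalls, mean ~= 0.11
recall_scholar = rng.beta(2, 7, size=100)   # simulated recalls, mean ~= 0.22

stat, p = wilcoxon(recall_pubmed, recall_scholar)
print(f"Wilcoxon signed-rank statistic = {stat:.1f}, P = {p:.2g}")
```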

          Conclusions

          For quick clinical searches, Google Scholar returns twice as many relevant articles as PubMed and provides greater access to free full-text articles.

Most cited references (51)


          Optimal search strategies for retrieving scientifically strong studies of treatment from Medline: analytical survey.

Objective: To develop and test optimal Medline search strategies for retrieving sound clinical studies on prevention or treatment of health disorders. Design: Analytical survey. Data sources: 161 clinical journals indexed in Medline for the year 2000. Main outcome measures: Sensitivity, specificity, precision, and accuracy of 4862 unique terms in 18 404 combinations. Results: Only 1587 (24.2%) of 6568 articles on treatment met criteria for testing clinical interventions. Combinations of search terms reached peak sensitivities of 99.3% (95% confidence interval 98.7% to 99.8%) at a specificity of 70.4% (69.8% to 70.9%). Compared with best single terms, best multiple terms increased sensitivity for sound studies by 4.1% (absolute increase), but with substantial loss of specificity (absolute difference 23.7%) when sensitivity was maximised. When terms were combined to maximise specificity, 97.4% (97.3% to 97.6%) was achieved, about the same as that achieved by the best single term (97.6%, 97.4% to 97.7%). The strategies newly reported in this paper outperformed other validated search strategies except for two strategies that had slightly higher specificity (98.1% and 97.6% v 97.4%) but lower sensitivity (42.0% and 92.8% v 93.1%). Conclusions: New empirical search strategies have been validated to optimise retrieval from Medline of articles reporting high quality clinical studies on prevention or treatment of health disorders.
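For readers less familiar with these retrieval metrics, the following is a minimal sketch, with invented names, of how sensitivity, specificity, and precision are computed for a search filter over a labelled set of articles. It illustrates the definitions only, not the authors' implementation.

```python
# Minimal sketch of the retrieval metrics used above, assuming each
# article is labelled as methodologically sound ("relevant") or not, and
# a search strategy either retrieves it or doesn't. Names are illustrative.

def filter_metrics(retrieved, relevant, total):
    """Sensitivity, specificity, and precision of a search filter.

    retrieved: set of article ids the strategy returns.
    relevant:  set of article ids meeting the methodological criteria.
    total:     set of all article ids screened.
    """
    tp = len(retrieved & relevant)            # relevant and retrieved
    fn = len(relevant - retrieved)            # relevant but missed
    fp = len(retrieved - relevant)            # retrieved but not relevant
    tn = len(total - retrieved - relevant)    # neither retrieved nor relevant
    sensitivity = tp / (tp + fn)   # share of relevant articles the filter catches
    specificity = tn / (tn + fp)   # share of non-relevant articles it excludes
    precision = tp / (tp + fp)     # share of retrieved articles that are relevant
    return sensitivity, specificity, precision
```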

            Analysis of questions asked by family doctors regarding patient care.

Objective: To characterise the information needs of family doctors by collecting the questions they asked about patient care during consultations and to classify these in ways that would be useful to developers of knowledge bases. Design: Observational study in which investigators visited doctors for two half days and collected their questions. Taxonomies were developed to characterise the clinical topic and generic type of information sought for each question. Setting: Eastern Iowa. Participants: Random sample of 103 family doctors. Main outcome measures: Number of questions posed, pursued, and answered; topic and generic type of information sought for each question; time spent pursuing answers; information resources used. Results: Participants asked a total of 1101 questions. Questions about drug prescribing, obstetrics and gynaecology, and adult infectious disease were most common and comprised 36% of all questions. The taxonomy of generic questions included 69 categories; the three most common types, comprising 24% of all questions, were "What is the cause of symptom X?", "What is the dose of drug X?", and "How should I manage disease or finding X?" Answers to most questions (702, 64%) were not immediately pursued, but, of those pursued, most (318, 80%) were answered. Doctors spent an average of less than 2 minutes pursuing an answer, and they used readily available print and human resources. Only two questions led to a formal literature search. Conclusions: Family doctors in this study did not pursue answers to most of their questions. Questions about patient care can be organised into a limited number of generic types, which could help guide the efforts of knowledge base developers.

              Obstacles to answering doctors' questions about patient care with evidence: qualitative study.

Objective: To describe the obstacles encountered when attempting to answer doctors' questions with evidence. Design: Qualitative study. Setting: General practices in Iowa. Participants: 9 academic generalist doctors, 14 family doctors, and 2 medical librarians. Main outcome measure: A taxonomy of obstacles encountered while searching for evidence based answers to doctors' questions. Results: 59 obstacles were encountered and organised according to the five steps in asking and answering questions: recognise a gap in knowledge, formulate a question, search for relevant information, formulate an answer, and use the answer to direct patient care. Six obstacles were considered particularly salient by the investigators and practising doctors: the excessive time required to find information; difficulty modifying the original question, which was often vague and open to interpretation; difficulty selecting an optimal strategy to search for information; failure of a seemingly appropriate resource to cover the topic; uncertainty about how to know when all the relevant evidence has been found so that the search can stop; and inadequate synthesis of multiple bits of evidence into a clinically useful statement. Conclusions: Many obstacles are encountered when asking and answering questions about how to care for patients. Addressing these obstacles could lead to better patient care by improving clinically oriented information resources.

                Author and article information

Journal
Journal of Medical Internet Research (J Med Internet Res; JMIR)
Publisher: JMIR Publications Inc. (Toronto, Canada)
ISSN: 1439-4456 (print); 1438-8871 (electronic)
Publication date: 15 August 2013 (August 2013 issue)
Volume 15, Issue 8, Article e164
                Affiliations
                [1] 1Kidney Clinical Research Unit Division of Nephrology Western University London, ONCanada
                [2] 2Department of Epidemiology and Biostatistics Western University London, ONCanada
                [3] 3McMaster University Department of Clinical Epidemiology and Biostatistics Hamilton, ONCanada
                [4] 4Department of Medicine McMaster University Hamilton, ONCanada
                Author notes
Corresponding Author: Salimah Z Shariff, salimah.shariff@lhsc.on.ca
Article
Publisher ID: v15i8e164
DOI: 10.2196/jmir.2624
PMCID: PMC3757915
PMID: 23948488
© Salimah Z Shariff, Shayna AD Bejaimal, Jessica M Sontrop, Arthur V Iansavichus, R Brian Haynes, Matthew A Weir, Amit X Garg. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 15.08.2013.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.

History
Received: 18 March 2013
Revision requested: 04 May 2013
Revised: 16 May 2013
Accepted: 11 June 2013
                Categories
                Original Paper

Subject: Medicine
Keywords: information dissemination/methods; information storage and retrieval; medical; library science; PubMed; Google Scholar; nephrology
