
Is Open Access

ClinicalCodes: An Online Clinical Codes Repository to Improve the Validity and Reproducibility of Research Using Electronic Medical Records


      Abstract

      Lists of clinical codes are the foundation for research undertaken using electronic medical records (EMRs). If clinical code lists are not available, reviewers are unable to determine the validity of research, full study replication is impossible, researchers are unable to make effective comparisons between studies, and the construction of new code lists is subject to much duplication of effort. Despite this, the publication of clinical codes is rarely if ever a requirement for obtaining grants, validating protocols, or publishing research. In a representative sample of 450 EMR primary research articles indexed on PubMed, we found that only 19 (5.1%) were accompanied by a full set of published clinical codes and 32 (8.6%) stated that code lists were available on request. To help address these problems, we have built an online repository where researchers using EMRs can upload and download lists of clinical codes. The repository will enable clinical researchers to better validate EMR studies, build on previous code lists and compare disease definitions across studies. It will also assist health informaticians in replicating database studies, tracking changes in disease definitions or clinical coding practice through time and sharing clinical code information across platforms and data sources as research objects.

      Related collections

      Most cited references (19)


      Validation and validity of diagnoses in the General Practice Research Database: a systematic review

      AIMS: To investigate the range of methods used to validate diagnoses in the General Practice Research Database (GPRD), to summarize findings and to assess the quality of these validations.

      METHODS: A systematic literature review was performed by searching PubMed and Embase for publications using GPRD data published between 1987 and April 2008. Additional publications were identified from conference proceedings, back issues of relevant journals, bibliographies of retrieved publications and relevant websites. Publications that reported attempts to validate disease diagnoses recorded in the GPRD were included.

      RESULTS: We identified 212 publications, often validating more than one diagnosis. In total, 357 validations investigating 183 different diagnoses met our inclusion criteria. Of these, 303 (85%) utilized data from outside the GPRD to validate diagnoses. The remainder utilized only data recorded in the database. The median proportion of cases with a confirmed diagnosis was 89% (range 24–100%). Details of validation methods and results were often incomplete.

      CONCLUSIONS: A number of methods have been used to assess validity. Overall, estimates of validity were high. However, the quality of reporting of the validations was often inadequate to permit a clear interpretation. Not all methods provided a quantitative estimate of validity and most methods considered only the positive predictive value of a set of diagnostic codes in a highly selected group of cases. We make recommendations for methodology and reporting to strengthen further the use of the GPRD in research.

        The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies.

        Much biomedical research is observational. The reporting of such research is often inadequate, which hampers the assessment of its strengths and weaknesses and of a study's generalisability. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Initiative developed recommendations on what should be included in an accurate and complete report of an observational study. We defined the scope of the recommendations to cover three main study designs: cohort, case-control and cross-sectional studies. We convened a 2-day workshop in September 2004, with methodologists, researchers, and journal editors to draft a checklist of items. This list was subsequently revised during several meetings of the coordinating group and in e-mail discussions with the larger group of STROBE contributors, taking into account empirical evidence and methodological considerations. The workshop and the subsequent iterative process of consultation and revision resulted in a checklist of 22 items (the STROBE Statement) that relate to the title, abstract, introduction, methods, results, and discussion sections of articles. 18 items are common to all three study designs and four are specific for cohort, case-control, or cross-sectional studies. A detailed Explanation and Elaboration document is published separately and is freely available on the websites of PLoS Medicine, Annals of Internal Medicine and Epidemiology. We hope that the STROBE Statement will contribute to improving the quality of reporting of observational studies.

          The inevitable application of big data to health care.


            Author and article information

            Affiliations
            [1 ]Centre for Primary Care, Institute for Population Health, University of Manchester, Manchester, United Kingdom
            [2 ]Centre for Biostatistics, Institute for Population Health, University of Manchester, Manchester, United Kingdom
            [3 ]Centre for Health Informatics, Institute for Population Health, University of Manchester, Manchester, United Kingdom
            [4 ]Centre for Pharmacoepidemiology and Drug Safety Research, Manchester Pharmacy School, University of Manchester, Manchester, United Kingdom
            [5 ]Manchester Institute for Biotechnology, University of Manchester, Manchester, United Kingdom
            UCL, United Kingdom
            Author notes

            Competing Interests: The authors have declared that no competing interests exist.

            Conceived and designed the experiments: DAS. Performed the experiments: DAS DR EK IO RP DA EC. Analyzed the data: DAS. Contributed to the writing of the manuscript: DAS DR EK IO RP DA EC.

            Contributors
            Role: Editor
            Journal: PLoS ONE
            Publisher: Public Library of Science (San Francisco, USA)
            ISSN: 1932-6203
            Published: 18 June 2014
            Volume: 9, Issue: 6
            PMID: 24941260
            PMCID: PMC4062485
            Manuscript ID: PONE-D-14-13144
            DOI: 10.1371/journal.pone.0099825

            This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

            Counts
            Pages: 6
            Funding
            The project “How valid are evaluations of effectiveness based on Primary Care Databases” was funded by the National Institute for Health Research (NIHR) School for Primary Care Research (SPCR, http://www.nihr.ac.uk/research/Pages/programmes_primary_care_research.aspx). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
            Categories
            Research Article
            Medicine and Health Sciences
            Epidemiology
            Disease Informatics
            Epidemiological Methods and Statistics
            Pharmacoepidemiology
            Research and Analysis Methods
            Database and Informatics Methods
            Health Informatics
            Research Assessment
            Publication Practices
            Open Access
            Peer Review
            Reproducibility
            Research Validity
            Research Design
            Observational Studies
            Custom metadata
            The authors confirm that all data underlying the findings are fully available without restriction. All data are held in the Github repository for the project at https://github.com/rOpenHealth/ClinicalCodes/tree/master/paper and on Figshare: http://figshare.com/account/projects/1286.

