
What can we learn from a decade of database audits? The Duke Clinical Research Institute experience, 1997–2006.

      Clinical Trials (London, England)
      Clinical Trials as Topic, Databases as Topic, Hospitals, University, Humans, Information Management, methods, Management Audit, Research Design, Sample Size, Selection Bias


          Abstract

Despite a pressing and well-documented need for better sharing of information on clinical trials data quality assurance methods, many research organizations remain reluctant to publish descriptions of and results from their internal auditing and quality assessment methods. We present findings from a review of a decade of internal data quality audits performed at the Duke Clinical Research Institute (DCRI), a large academic research organization that conducts data management for a diverse array of clinical studies, both academic and industry-sponsored. In so doing, we hope to stimulate discussions that could benefit the wider clinical research enterprise by providing insight into methods of optimizing data collection and cleaning, ultimately helping patients and furthering essential research.

We present our audit methodologies, including sampling methods, audit logistics, sample sizes, counting rules used for error rate calculations, and characteristics of audited trials. We also present database error rates as computed according to two analytical methods, which we address in detail, and discuss the advantages and drawbacks of two auditing methods used during this 10-year period.

Our review of the DCRI audit program indicates that higher data quality may be achieved from a series of small audits conducted throughout the trial rather than from a single large database audit at database lock. We found that error rates trended upward from year to year in the period characterized by traditional audits performed at database lock (1997–2000), but consistently trended downward after periodic statistical process control (SPC) type audits were instituted (2001–2006). These increases in data quality were also associated with cost savings in auditing, estimated at 1000 hours per year, or the efforts of one-half of a full-time equivalent (FTE).

Our findings are drawn from retrospective analyses rather than controlled experiments, and may therefore be subject to unanticipated confounding. In addition, the scope and type of audits we examine here are specific to our institution, and our results may not be broadly generalizable. Use of statistical process control methodologies may afford advantages over more traditional auditing methods, and further research will be necessary to confirm the reliability and usability of such techniques. We believe that open and candid discussion of data quality assurance issues among academic and clinical research organizations will ultimately benefit the entire research community in the coming era of increased data sharing and re-use.
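To illustrate the kind of periodic auditing the abstract describes, the sketch below applies a standard statistical process control tool, the p-chart, to per-audit error counts. The abstract does not publish the DCRI's actual formulas or data, so the function names, the 3-sigma binomial limits, and the sample figures are all illustrative assumptions, not the institute's method.

```python
# Hypothetical p-chart check for periodic database audits.
# The DCRI paper does not disclose its exact counting rules, so the
# data and 3-sigma binomial limits here are illustrative only.
from math import sqrt

def p_chart_limits(samples):
    """Given (errors, fields_audited) pairs from periodic audits,
    return the overall error rate and per-audit 3-sigma limits."""
    total_errors = sum(e for e, _ in samples)
    total_fields = sum(n for _, n in samples)
    p_bar = total_errors / total_fields           # pooled error rate
    limits = []
    for _, n in samples:
        sigma = sqrt(p_bar * (1 - p_bar) / n)     # binomial std. error
        limits.append((max(0.0, p_bar - 3 * sigma), p_bar + 3 * sigma))
    return p_bar, limits

def out_of_control(samples):
    """Flag audits whose observed error rate falls outside the limits."""
    _, limits = p_chart_limits(samples)
    return [not (lo <= e / n <= hi)
            for (e, n), (lo, hi) in zip(samples, limits)]

# Illustrative data: (errors found, fields audited) per periodic audit
audits = [(12, 4000), (9, 3800), (15, 4200), (40, 4000)]
print(out_of_control(audits))  # → [False, False, False, True]
```

Under this scheme, a small audit whose error rate exceeds the upper control limit (the fourth audit above) would prompt investigation mid-trial, rather than being discovered in a single large audit at database lock.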


          Author and article information

Journal
PMID: 19342467
PMCID: 3494997
DOI: 10.1177/1740774509102590

