
      Evidenced-Based Claims About Evidence

      Editorial
      David H. Howard, PhD
      MDM Policy & Practice
      SAGE Publications


          Abstract

          Presentations on evidence translation in health care often begin by citing the statistic that it takes 17 years for evidence to diffuse into practice. The source of the 17-year estimate is a paper by Balas and Boren published in 2000 in the Yearbook of Medical Informatics. 1 The authors identified nine mostly primary care services supported by evidence from randomized trials and calculated, based on the trials' publication dates and current levels of use, that it takes 15.6 years for use to reach 50%. Including the time from submission to publication, 1.4 years, brings the implementation lag up to the familiar 17-year figure. Although the paper was published in a relatively obscure venue, it was mentioned in the Institute of Medicine's Crossing the Quality Chasm report 2 and has been widely cited in the literature on evidence translation ever since. The paper itself has been cited more than 800 times, 3 but this figure understates its influence because some authors reference the Institute of Medicine report instead. The 17-year figure is cited by policy makers (e.g., Agency for Healthcare Research and Quality director Andrew Bindman, 4 former White House Office of Management and Budget director Peter Orszag 5 ) and in the popular press. 6

          "Evidence" is a somewhat amorphous term, but Balas and Boren focus on the adoption of treatments following the publication of "landmark trials." Some treatments are tested in multiple high-quality trials that produce conflicting results, but these cases are rare. The evidence for many treatments rests on observational studies or decision analyses, but if trials do not affect practice, it is unlikely that physicians will respond to studies that rank lower in the hierarchy of evidence. The publication of landmark trials therefore provides the cleanest test of the impact of evidence on practice.

          Understanding how evidence from trials affects practice is important, not only for identifying opportunities to improve care but also, more broadly, for assessing the performance of the health system. Balas and Boren's finding suggests that the system is deeply flawed: physicians routinely ignore evidence that could benefit their patients. However, despite its troubling implications, their work has received little scrutiny.

          It turns out there are a number of problems with the 17-year estimate and how it is used. Some of the underlying utilization estimates are low and not well referenced. For example, Balas and Boren assumed that only 20% of diabetic patients had foot exams in 1998; according to the Centers for Disease Control and Prevention, the actual rate was closer to 60%. 7 They use an estimate of mammography rates from 1997, which, combined with their assumption that rates increase linearly, implies that the mammography rate reached 50% in 1993, when in fact rates reached that level in 1990 8 or earlier. 9
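          The arithmetic behind these estimates is simple linear extrapolation, which makes clear how strongly the choice of inputs drives the answer. The sketch below (in Python) illustrates the logic only; it is not Balas and Boren's actual calculation, and the trial publication year and 1997 utilization rate in the mammography-style example are hypothetical placeholders, since their exact inputs are not reported here.

              # Sketch of the linear-uptake logic behind the 17-year figure.
              # Assumption: utilization rises linearly from 0 in the year the
              # landmark trial is published to the observed rate in a later year.

              def year_rate_reached(pub_year, obs_year, obs_rate, target=0.50):
                  """Year the target utilization rate is crossed under linear uptake."""
                  slope = obs_rate / (obs_year - pub_year)  # utilization gained per year
                  return pub_year + target / slope

              # Headline estimate: 15.6 years to 50% use, plus 1.4 years from
              # manuscript submission to publication.
              print(15.6 + 1.4)  # 17.0

              # Mammography-style back-extrapolation with hypothetical inputs
              # (trial published in 1980, 68% utilization observed in 1997): linear
              # uptake implies the 50% threshold was crossed well before 1997.
              print(year_rate_reached(1980, 1997, 0.68))  # 1992.5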
          Physicians can apply evidence only to patients who show up. Most of the utilization measures are calculated using broad-based denominators (e.g., all adults, 10 adult health plan members with diabetes 11 ) that include patients who do not have regular check-ups. Consequently, the measures partly reflect the behavior of patients and do not isolate the impact of evidence on physicians' decisions. Also, wide-scale delivery of some services requires an extensive infrastructure, such as manufacturing capacity and distribution systems for vaccines. Diffusion rates for these services do not tell us much about how physicians respond to evidence generally. Yet the 17-year estimate is typically quoted without caveat.

          Dissemination and implementation science researchers have latched on to and amplified the 17-year implementation lag estimate to highlight the relevance of their field. 12 Their counterparts in behavioral economics have also embraced the notion that practice changes slowly to illustrate the applicability of behavioral economics to medicine. 13 They claim that physicians display belief persistence and confirmation bias: physicians are slow to let go of long-standing beliefs in response to new evidence and irrationally overweight evidence that is consistent with their preconceptions. However, discussions of this phenomenon are often long on theory and short on real-world examples. The examples commonly cited to illustrate how physicians discount new evidence, such as prostate-specific antigen screening and breast cancer screening for women under age 50, are notable mainly because the changes in screening recommendations were prompted by a reassessment of an existing scant and conflicting evidence base rather than by the release of new evidence from a randomized trial.

          Counterexamples are discounted or ignored, though there are cases where evidence has led to rapid changes in practice. The Z0011 trial, 14 which randomly assigned breast cancer patients with one to two positive sentinel lymph nodes to undergo a complete axillary dissection or no additional treatment following lumpectomy, is a case in point. There was a large decline in the proportion of breast cancer patients meeting the Z0011 trial inclusion criteria who had eight or more nodes examined (a proxy for the receipt of an axillary dissection). 15 The share of patients who had eight or more nodes examined decreased from over 50% to below 30% in a matter of months. In fact, treatment began changing shortly after the results of the trial were presented at a medical conference, even before the trial was published.

          There are characteristics of the procedure that may have facilitated the translation of evidence into practice: the audience for the evidence was a close-knit community of specialists, physicians who performed fewer axillary dissections did not pay a price in terms of fame or fortune, and the procedure is associated with side effects, so patients faced tangible harms from overtreatment. However, axillary dissection is no more atypical than the treatments and tests commonly used to illustrate the theory that physicians are slow to respond to evidence. In fact, there was concern that inertia and a pro-intervention bias in oncology would lead physicians to ignore the Z0011 trial results. 16

          The case of the Z0011 trial suggests that physicians are not always locked in to established therapies. Sometimes high-quality evidence is sufficient to change practice. There are, of course, cases where high-quality evidence has failed to change how physicians treat patients. For example, use of tight glycemic control in patients admitted to intensive care units did not decline after a trial found that it increased mortality. 17 However, these cases must be considered alongside those where evidence affected practice.
          Rather than trying to divine a one-size-fits-all rule about the speed at which results are translated into practice, researchers should instead try to build a more robust knowledge base about the factors that influence how physicians respond to evidence, through observational analyses and interventional dissemination studies. How does the quality of evidence affect the uptake of findings? Are there characteristics of trials that predict rapid uptake? How does uptake vary with physicians' training, compensation scheme, practice environment, and other modifiable factors? Are studies that highlight patient harms more influential than ones that emphasize equivalence of a primary endpoint? The answers will help investigators and funders design better trials and dissemination strategies, and help policy makers design better health systems. In the meantime, the 17-year implementation lag estimate should be laid to rest. There is still a lot of work to be done to make evidence-based medicine a reality. Acknowledging that evidence sometimes leads to rapid change does not negate that fact.

          Most cited references

          Effect of published scientific evidence on glycemic control in adult intensive care units.
          Little is known about the deadoption of ineffective or harmful clinical practices. A large clinical trial (the Normoglycemia in Intensive Care Evaluation and Survival Using Glucose Algorithm Regulation [NICE-SUGAR] trial) demonstrated that strict blood glucose control (tight glycemic control) in patients admitted to adult intensive care units (ICUs) should be deadopted; however, it is unknown whether deadoption occurred and how it compared with the initial adoption.

          Faded promises: the challenge of deadopting low-value care.

          Breast cancer screening practices among users of county-funded health centers vs women in the entire community.
          Breast cancer screening rates tend to be lower among women with lower income and/or education.

                Author and article information

                Journal: MDM Policy & Practice (MDM Policy Pract), SAGE Publications (Sage CA: Los Angeles, CA)
                ISSN: 2381-4683
                Publication date: 12 October 2017 (July-December 2017)
                Volume: 2; Issue: 2; eLocation ID: 2381468317734527
                Affiliations
                [1]Winship Cancer Center, Department of Health Policy and Management, Emory University School of Public Health, Atlanta, Georgia
                Author notes
                [*]David H. Howard, PhD, Winship Cancer Center, Department of Health Policy and Management, Emory University School of Public Health, 1518 Clifton Rd NE, Atlanta, GA 30322; telephone: (404) 727-3907; e-mail: david.howard@emory.edu
                Article
                DOI: 10.1177/2381468317734527
                PMCID: 6125044
                © The Author(s) 2017

                This article is distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 License (http://www.creativecommons.org/licenses/by-nc/4.0/), which permits non-commercial use, reproduction, and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access page (https://us.sagepub.com/en-us/nam/open-access-at-sage).

                History: 8 September 2017
                Categories: Editorial
