
      Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead

      Nature Machine Intelligence
      Springer Science and Business Media LLC


          Abstract

Black box machine learning models are currently being used for high stakes decision-making throughout society, causing problems throughout healthcare, criminal justice, and in other domains. People have hoped that creating methods for explaining these black box models will alleviate some of these problems, but trying to explain black box models, rather than creating models that are interpretable in the first place, is likely to perpetuate bad practices and can potentially cause catastrophic harm to society. There is a way forward - it is to design models that are inherently interpretable. This manuscript clarifies the chasm between explaining black boxes and using inherently interpretable models, outlines several key reasons why explainable black boxes should be avoided in high-stakes decisions, identifies challenges to interpretable machine learning, and provides several example applications where interpretable models could potentially replace black box models in criminal justice, healthcare, and computer vision.


Author and article information

Journal
Nature Machine Intelligence (Nat Mach Intell)
Publisher: Springer Science and Business Media LLC
ISSN: 2522-5839
Publication date: May 2019 (published online 13 May 2019)
Volume: 1
Issue: 5
Pages: 206-215

Article
DOI: 10.1038/s42256-019-0048-x
PMCID: PMC9122117
PMID: 35603010
          © 2019

License: http://www.springer.com/tdm
