
Interpretability of machine learning-based prediction models in healthcare



Most cited references (78)


Hastie T, Tibshirani R, Friedman J. The Elements of Statistical Learning. Springer.

Rudin C. Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead. Nature Machine Intelligence, 2019.

            Black box machine learning models are currently being used for high stakes decision-making throughout society, causing problems throughout healthcare, criminal justice, and in other domains. People have hoped that creating methods for explaining these black box models will alleviate some of these problems, but trying to explain black box models, rather than creating models that are interpretable in the first place, is likely to perpetuate bad practices and can potentially cause catastrophic harm to society. There is a way forward - it is to design models that are inherently interpretable. This manuscript clarifies the chasm between explaining black boxes and using inherently interpretable models, outlines several key reasons why explainable black boxes should be avoided in high-stakes decisions, identifies challenges to interpretable machine learning, and provides several example applications where interpretable models could potentially replace black box models in criminal justice, healthcare, and computer vision.
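As a concrete illustration of the paper's distinction, the sketch below fits a small, inherently interpretable model whose entire decision logic can be printed and audited, instead of explaining a black box after the fact. This is a minimal sketch, not code from the paper: the dataset is synthetic and the feature names (age_z, bp_z, glucose_z) are hypothetical.

import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Synthetic stand-in data; in a clinical setting these would be patient features.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                    # columns: age_z, bp_z, glucose_z (hypothetical)
y = (X[:, 1] + 0.5 * X[:, 2] > 0.3).astype(int)  # toy "high-risk" label

# A depth-limited tree is inherently interpretable: the whole model is a
# handful of readable if/then rules, with nothing left to "explain" post hoc.
clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(clf, feature_names=["age_z", "bp_z", "glucose_z"]))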

Obermeyer Z, Powers B, Vogeli C, Mullainathan S. Dissecting racial bias in an algorithm used to manage the health of populations. Science, 2019.

              Health systems rely on commercial prediction algorithms to identify and help patients with complex health needs. We show that a widely used algorithm, typical of this industry-wide approach and affecting millions of patients, exhibits significant racial bias: At a given risk score, Black patients are considerably sicker than White patients, as evidenced by signs of uncontrolled illnesses. Remedying this disparity would increase the percentage of Black patients receiving additional help from 17.7 to 46.5%. The bias arises because the algorithm predicts health care costs rather than illness, but unequal access to care means that we spend less money caring for Black patients than for White patients. Thus, despite health care cost appearing to be an effective proxy for health by some measures of predictive accuracy, large racial biases arise. We suggest that the choice of convenient, seemingly effective proxies for ground truth can be an important source of algorithmic bias in many contexts.
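The mechanism the abstract describes (a cost proxy standing in for health) can be reproduced in a toy simulation. The sketch below is not the study's data or model; it assumes a hypothetical two-group population in which the same level of illness generates less spending for one group, then flags the highest-cost patients, mirroring how the audited algorithm was used.

import numpy as np

rng = np.random.default_rng(42)
n = 100_000
group_b = rng.random(n) < 0.5                      # hypothetical group indicator
illness = rng.gamma(shape=2.0, scale=1.0, size=n)  # latent health need

# Unequal access: at the same illness level, less money is spent on group B.
spend_rate = np.where(group_b, 0.7, 1.0)
cost = illness * spend_rate + rng.normal(0.0, 0.2, n)

# A "risk score" that perfectly predicts cost; flag the top 3% for extra help.
flagged = cost >= np.quantile(cost, 0.97)

print("mean illness among flagged, group B:", illness[flagged & group_b].mean())
print("mean illness among flagged, group A:", illness[flagged & ~group_b].mean())
# Group B patients who cross the same score threshold are sicker on average:
# the cost proxy under-identifies their need, which is the bias reported.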

                Author and article information

Journal
WIREs Data Mining and Knowledge Discovery (WIREs Data Mining Knowl Discov)
Wiley
ISSN: 1942-4787, 1942-4795
Published online: June 29, 2020; issue date: September 2020
Volume 10, Issue 5
Affiliations
[1] Faculty of Health Sciences, University of Maribor, Maribor, Slovenia
[2] Faculty of Electrical Engineering and Computer Science, University of Maribor, Maribor, Slovenia
[3] Department of Biomedical Informatics, Harvard University, Cambridge, Massachusetts, USA
[4] Department of Computer Science, KU Leuven, Leuven, Belgium
Article
DOI: 10.1002/widm.1379
                © 2020

Terms and conditions: http://onlinelibrary.wiley.com/termsAndConditions#vor
TDM license: http://doi.wiley.com/10.1002/tdm_license_1.1

