
      A translational perspective towards clinical AI fairness


          Abstract

          Artificial intelligence (AI) has demonstrated the ability to extract insights from data, but the fairness of such data-driven insights remains a concern in high-stakes fields. Despite extensive developments, issues of AI fairness in clinical contexts have not been adequately addressed. A fair model is normally expected to perform equally across subgroups defined by sensitive variables (e.g., age, gender/sex, race/ethnicity, socio-economic status, etc.). Various fairness measurements have been developed to detect differences between subgroups as evidence of bias, and bias mitigation methods are designed to reduce the differences detected. This perspective of fairness, however, is misaligned with some key considerations in clinical contexts. The set of sensitive variables used in healthcare applications must be carefully examined for relevance and justified by clear clinical motivations. In addition, clinical AI fairness should closely investigate the ethical implications of fairness measurements (e.g., potential conflicts between group- and individual-level fairness) to select suitable and objective metrics. Generally defining AI fairness as “equality” is not necessarily reasonable in clinical settings, as differences may have clinical justifications and do not indicate biases. Instead, “equity” would be an appropriate objective of clinical AI fairness. Moreover, clinical feedback is essential to developing fair and well-performing AI models, and efforts should be made to actively involve clinicians in the process. The adaptation of AI fairness towards healthcare is not self-evident due to misalignments between technical developments and clinical considerations. Multidisciplinary collaboration between AI researchers, clinicians, and ethicists is necessary to bridge the gap and translate AI fairness into real-life benefits.
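As an illustration of the group-level fairness measurements the abstract refers to, the sketch below computes an "equal opportunity" gap, the difference in true-positive rate between subgroups defined by a sensitive variable. All data and variable names are synthetic and hypothetical, not from the article; this is a minimal sketch of the general technique, not any specific clinical pipeline.

```python
# Synthetic illustration of a group fairness metric: the gap in
# true-positive rate (TPR) between two subgroups of patients.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
group = rng.integers(0, 2, n)     # hypothetical sensitive attribute: 0 or 1
y_true = rng.integers(0, 2, n)    # true outcome label

# A deliberately biased score: positives in group 1 receive lower scores,
# so the model misses more of them (higher false-negative rate).
score = y_true * (0.7 - 0.3 * group) + rng.uniform(0, 0.5, n)
y_pred = (score > 0.5).astype(int)

def tpr(y_true, y_pred, mask):
    """True-positive rate within the subgroup selected by `mask`."""
    pos = (y_true == 1) & mask
    return (y_pred[pos] == 1).mean()

# "Equal opportunity" difference: TPR gap between the two subgroups.
gap = abs(tpr(y_true, y_pred, group == 0) - tpr(y_true, y_pred, group == 1))
print(f"TPR gap between subgroups: {gap:.3f}")
```

As the abstract argues, a nonzero gap like this is evidence worth investigating, but whether it constitutes unfair bias, rather than a clinically justified difference, requires clinical judgment.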

Most cited references (56)


Deep learning (LeCun, Bengio & Hinton, Nature, 2015)

          Deep learning allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction. These methods have dramatically improved the state-of-the-art in speech recognition, visual object recognition, object detection and many other domains such as drug discovery and genomics. Deep learning discovers intricate structure in large data sets by using the backpropagation algorithm to indicate how a machine should change its internal parameters that are used to compute the representation in each layer from the representation in the previous layer. Deep convolutional nets have brought about breakthroughs in processing images, video, speech and audio, whereas recurrent nets have shone light on sequential data such as text and speech.
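The mechanism this abstract describes, stacked layers whose internal parameters are adjusted by backpropagating an error signal, can be sketched in a few lines. The network, data, and hyperparameters below are toy assumptions chosen only to make the idea concrete, not code from the paper.

```python
# Toy two-layer network trained by backpropagation on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(64, 3))                   # synthetic inputs
y = (X.sum(axis=1, keepdims=True) > 0) * 1.0   # synthetic binary targets

W1 = rng.normal(scale=0.5, size=(3, 8))        # first-layer parameters
W2 = rng.normal(scale=0.5, size=(8, 1))        # second-layer parameters
lr = 0.5

for _ in range(500):
    h = np.tanh(X @ W1)                        # hidden representation
    p = 1 / (1 + np.exp(-(h @ W2)))            # output probability
    # Backpropagation: chain rule from the output loss back through
    # each layer to its parameters.
    dlogits = (p - y) / len(X)                 # grad of BCE w.r.t. logits
    dW2 = h.T @ dlogits
    dh = dlogits @ W2.T * (1 - h ** 2)         # back through tanh
    dW1 = X.T @ dh
    W2 -= lr * dW2
    W1 -= lr * dW1

h = np.tanh(X @ W1)
p = 1 / (1 + np.exp(-(h @ W2)))
acc = ((p > 0.5) == (y > 0.5)).mean()
print(f"training accuracy: {acc:.2f}")
```

Each layer's representation is computed from the previous layer's, and each gradient update tells the machine how to change the parameters that produce it, which is the core loop the abstract summarizes.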

Dissecting racial bias in an algorithm used to manage the health of populations (Obermeyer et al., Science, 2019)

            Health systems rely on commercial prediction algorithms to identify and help patients with complex health needs. We show that a widely used algorithm, typical of this industry-wide approach and affecting millions of patients, exhibits significant racial bias: At a given risk score, Black patients are considerably sicker than White patients, as evidenced by signs of uncontrolled illnesses. Remedying this disparity would increase the percentage of Black patients receiving additional help from 17.7 to 46.5%. The bias arises because the algorithm predicts health care costs rather than illness, but unequal access to care means that we spend less money caring for Black patients than for White patients. Thus, despite health care cost appearing to be an effective proxy for health by some measures of predictive accuracy, large racial biases arise. We suggest that the choice of convenient, seemingly effective proxies for ground truth can be an important source of algorithmic bias in many contexts.
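The proxy-label mechanism this abstract describes can be reproduced in a simple simulation: if the risk score predicts cost, and one group accrues less cost at the same level of illness, then at any score cutoff that group's flagged patients are sicker. The numbers below are entirely synthetic assumptions, not the study's data.

```python
# Synthetic demonstration of label-choice bias: training on cost as a
# proxy for illness when one group receives less care per unit illness.
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
group = rng.integers(0, 2, n)            # 1 = group with reduced access to care
illness = rng.gamma(2.0, 1.0, n)         # latent health need
# Cost tracks illness, but the disadvantaged group accrues less cost
# for the same illness (assumed 0.6x, an arbitrary illustrative factor).
cost = illness * np.where(group == 1, 0.6, 1.0) + rng.normal(0, 0.1, n)

score = cost                             # a "perfect" predictor of cost
cutoff = np.quantile(score, 0.9)         # top 10% flagged for extra help
flagged = score >= cutoff

# At the same score cutoff, flagged patients in group 1 are sicker.
mean_ill_0 = illness[flagged & (group == 0)].mean()
mean_ill_1 = illness[flagged & (group == 1)].mean()
print(f"mean illness among flagged: group 0 = {mean_ill_0:.2f}, "
      f"group 1 = {mean_ill_1:.2f}")
```

The score is perfectly accurate for its target (cost) yet systematically under-serves the disadvantaged group on the quantity that matters (illness), which is the paper's central point about choosing proxies for ground truth.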

Computing Machinery and Intelligence

A. M. Turing (1950)

                Author and article information

                Contributors
                liu.nan@duke-nus.edu.sg
                Journal
                NPJ Digit Med
                NPJ Digit Med
                NPJ Digital Medicine
                Nature Publishing Group UK (London )
                2398-6352
Published: 14 September 2023
Volume: 6
Article number: 172
                Affiliations
                [1 ]Centre for Quantitative Medicine, Duke-NUS Medical School, ( https://ror.org/02j1m6098) Singapore, Singapore
                [2 ]Centre for Ethics, Department of Philosophy, University of Antwerp, ( https://ror.org/008x57b05) Antwerp, Belgium
                [3 ]Antwerp Center on Responsible AI, University of Antwerp, ( https://ror.org/008x57b05) Antwerp, Belgium
                [4 ]Department of Health Outcomes and Biomedical Informatics, University of Florida, ( https://ror.org/02y3ad647) Gainesville, FL USA
[5 ]Singapore Eye Research Institute, Singapore National Eye Centre, Singapore, Singapore
                [6 ]SingHealth AI Office, Singapore Health Services, ( https://ror.org/04me94w47) Singapore, Singapore
                [7 ]Department of Diagnostic Radiology, Singapore General Hospital, ( https://ror.org/036j6sg82) Singapore, Singapore
                [8 ]Department of Pharmacy, Singapore General Hospital, ( https://ror.org/036j6sg82) Singapore, Singapore
                [9 ]Department of Population Health Sciences, Weill Cornell Medicine, ( https://ror.org/02r109517) New York, NY USA
                [10 ]Laboratory for Computational Physiology, Massachusetts Institute of Technology, ( https://ror.org/042nb2s44) Cambridge, MA USA
                [11 ]Division of Pulmonary, Critical Care and Sleep Medicine, Beth Israel Deaconess Medical Center, ( https://ror.org/04drvxt59) Boston, MA USA
[12 ]Department of Biostatistics, Harvard T.H. Chan School of Public Health, Boston, MA USA
                [13 ]Programme in Health Services and Systems Research, Duke-NUS Medical School, ( https://ror.org/02j1m6098) Singapore, Singapore
                [14 ]Department of Emergency Medicine, Singapore General Hospital, ( https://ror.org/036j6sg82) Singapore, Singapore
                [15 ]Institute of Data Science, National University of Singapore, ( https://ror.org/01tgyzw49) Singapore, Singapore
                Author information
                http://orcid.org/0000-0002-6758-4472
                http://orcid.org/0000-0002-9883-9167
                http://orcid.org/0000-0002-1068-7868
                http://orcid.org/0000-0001-6916-5960
                http://orcid.org/0000-0001-7365-0053
                http://orcid.org/0000-0001-9459-9461
                http://orcid.org/0000-0001-6712-6626
                http://orcid.org/0000-0003-3610-4883
Article
DOI: 10.1038/s41746-023-00918-4
PMCID: PMC10502051
PMID: 37709945
                © Springer Nature Limited 2023

                Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.

History
Received: 16 June 2023
Accepted: 4 September 2023
                Funding
                Funded by: Duke-NUS Medical School
                Funded by: Estate of Tan Sri Khoo Teck Puat
                Categories
                Perspective

Keywords: medical research, health care
