Authors: Pot, Mirjam; Kieusseyan, Nathalie; Prainsack, Barbara
Title: Not all biases are bad: equitable and inequitable biases in machine learning and radiology
Cord-id: yrag4t2n
Document date: 2021-02-10
Document: The application of machine learning (ML) technologies in medicine generally, and in radiology specifically, is hoped to improve clinical processes and the provision of healthcare. A central motivation is to advance patient treatment by reducing human error and increasing the accuracy of prognosis, diagnosis, and therapy decisions. There is, however, also increasing awareness of bias in ML technologies and its potentially harmful consequences. Biases are systematic distortions of datasets, algorithms, or human decision making, and they are understood to degrade the quality of an outcome in terms of accuracy, fairness, or transparency. But biases are not only a technical problem that requires a technical solution. Because they often also have a social dimension, the ‘distorted’ outcomes they yield often have implications for equity. This paper assesses the different types of bias that can emerge in applications of ML in radiology and discusses in which cases such biases are problematic. Drawing upon theories of equity in healthcare, we argue that while some biases are harmful and should be acted upon, others may be unproblematic and even desirable, precisely because they can contribute to overcoming inequities.
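
The abstract's point that systematic distortions can degrade accuracy and fairness can be made concrete with a minimal sketch. The Python example below is hypothetical and not taken from the paper: all data, group labels, and the function name per_group_accuracy are invented for illustration. It computes a classifier's accuracy broken down by patient subgroup; a large gap between subgroups is one simple, measurable form of the equity-relevant bias the authors discuss.

    from collections import defaultdict

    def per_group_accuracy(y_true, y_pred, groups):
        """Accuracy of predictions broken down by patient subgroup.

        A systematic gap between subgroups (e.g., by income or
        socioeconomic status) can signal an equity-relevant bias,
        even when overall accuracy looks acceptable.
        """
        correct = defaultdict(int)
        total = defaultdict(int)
        for truth, pred, group in zip(y_true, y_pred, groups):
            total[group] += 1
            correct[group] += int(truth == pred)
        return {g: correct[g] / total[g] for g in total}

    # Invented toy data: true labels, model predictions, subgroup tags.
    y_true = [1, 0, 1, 1, 0, 1, 0, 0]
    y_pred = [1, 0, 1, 0, 0, 0, 0, 1]
    groups = ["A", "A", "A", "A", "B", "B", "B", "B"]

    print(per_group_accuracy(y_true, y_pred, groups))
    # {'A': 0.75, 'B': 0.5} -> group B is systematically worse served

Whether such a gap is harmful or, as the authors argue, possibly desirable depends on its direction and the inequity it reflects or corrects; the metric only makes the distortion visible.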
Search related documents (co-phrase search; each phrase pair is followed by its count of matching documents):
- accurate diagnosis and low income: 8
- accurate diagnosis and low prevalence: 2
- accurate diagnosis and low quality: 6
- accurate diagnosis and machine human: 1
- accurate diagnosis and machine learning: 43
- accurate diagnosis and machine produce: 1
- long history and low income: 6
- long history and low prevalence: 1
- long history and low quality: 1
- long history and low socioeconomic status: 1
- long history and machine learning: 3
- low income and machine bias: 1
- low income and machine human: 1
- low income and machine learning: 25
- low income and machine learning technology: 1
- low prevalence and machine learning: 5
- low quality and machine crucial: 1
- low quality and machine human: 1
- low quality and machine learning: 12