by Florian Wüstholz
Many medical algorithms require the race of the patient to be included in the calculation. The American Heart Association, for instance, recommends using an algorithm
to calculate the risk of heart failure. People who are categorized as “non-black” automatically receive three additional points, out of 100 possible points. The algorithm rates their risk higher, and they might be treated faster as a result. The STONE algorithm
, which estimates the probability of having kidney stones, works in a very similar way. Here, too, “non-black” people receive three additional points (out of 13) - and thus better treatment.
A new study in the New England Journal of Medicine
analyzes the systematic discrimination of People of Color (PoC) by medical algorithms. It adds to a growing body of literature criticizing racially distorted algorithms, mostly because there is no reason to use a social concept (race) to assess biological functions. In both cases described above, there seems to be no justification for downgrading the scores of people who are “black”. Other racially biased algorithms are based on outdated data, or on data that uses race as a proxy for other factors - a logic akin to concluding that carrying a lighter in your pocket causes lung cancer.
In Switzerland too
The same applies to the estimation formulas for kidney function that are widely used in Switzerland. Because it is very time-consuming to measure the functionality of kidneys directly and precisely, the blood concentration of creatinine (a molecule that kidneys filter out) is normally used as a proxy. Together with other variables, the formulas then estimate the actual kidney function. The variables include gender, age and “race” - where a distinction is only made between “black” and “non-black”. The algorithm provides the “estimated glomerular filtration rate”, or eGFR. That value, in turn, plays a key role in deciding further treatment. It can, sometimes, influence a person’s position on the waiting list for a kidney transplant (transplant decisions are based on many other factors, including more precise measurements of kidney function).
The two most widely used formulas for eGFR - MDRD and CKD-EPI
- automatically compute a better kidney function for people who are categorized as “black”. MDRD increases their score by 21 percent and CKD-EPI by 16 percent. A large difference, which can influence the care the patient receives.
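The published 2009 CKD-EPI creatinine equation makes this adjustment explicit: it is a flat multiplier of 1.159 (about 16 percent) applied whenever the patient is recorded as “black”, regardless of any measured value. A minimal sketch in Python (the coefficients are those of the published 2009 equation; variable names are ours):

```python
def ckd_epi_2009(creatinine_mg_dl, age, female, black):
    """Estimate GFR (mL/min/1.73 m^2) with the 2009 CKD-EPI creatinine equation."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = creatinine_mg_dl / kappa
    egfr = 141.0 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159  # the contested race coefficient: a flat +15.9%
    return egfr

# Identical blood values, age and gender - only the "race" flag differs:
same_patient = dict(creatinine_mg_dl=1.0, age=50, female=False)
egfr_non_black = ckd_epi_2009(black=False, **same_patient)
egfr_black = ckd_epi_2009(black=True, **same_patient)
# The two estimates differ by exactly the race coefficient, 1.159.
```

As the last two lines show, the “race” input changes nothing about the biology being measured; it simply scales the final estimate up for one group.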
Research by AlgorithmWatch shows that all five university hospitals in Switzerland use CKD-EPI, including its racial component. Daniel Sidler of the Inselspital in Berne explained that the “ethnicity factor” is manually determined by the health professional conducting the examination. Thomas Müller, of the University Hospital Zurich (USZ), also confirmed that they use CKD-EPI. A nephrologist, or kidney specialist, from the Lausanne hospital said that their algorithm was geared towards “Caucasian” patients but that doctors could adapt the formula based on the patient’s “ethnicity”.
Representatives from the Geneva University Hospitals, the country’s largest, told AlgorithmWatch that, while the patients’ race is never stored in their file, doctors are advised to use the racial component of CKD-EPI when making treatment decisions. The hospital encourages the use of an online tool
which asks whether the patient is “African American” or belongs to “all other races”. The hospital’s representatives did not answer when asked whether patients are made aware of the usage of the tool.
The doctors at Inselspital and USZ stood by the algorithm’s discrimination of “black” people. They told AlgorithmWatch that people with darker skin color produced more creatinine. There is no evidence that this is true. Rather, creatinine production depends on muscle mass. The study
on which the estimation formula CKD-EPI is based justifies the adjustment for “blacks” on the unproven assumption that “blacks” have higher muscular mass and therefore produce more creatinine.
Such discriminatory methods are coming under increasing criticism
. There is resistance in Switzerland too. “The idea that Black people can endure more pain because they are physically stronger than white people is absolutely racist,” said Li Owzar from Diversum, an organization that provides safe discussion spaces to PoC. “A categorization based on a perceived skin color is just as racist and therefore completely untenable.”
In the case of eGFR, several studies have already cast doubt on the algorithms’ validity. Studies from Japan, Pakistan and India showed that, depending on the population, the formulas can be systematically wrong. Some US hospitals went as far as banning the algorithm
. Not so in Switzerland, where they are still considered state-of-the-art in many publications
that claim that the algorithms are “tested on different populations and against different clinical backgrounds”.
Given the broad criticism, this conclusion is questionable. In Switzerland in particular, very few studies have researched discrimination against PoC. Systematic and independent studies would be needed, because medical algorithms reflect the racism that crept into the data they were built upon. What applies to algorithms in general is of course also true for the eGFR estimation formulas: if the data basis is poor, the result is an unusable algorithm - in this case, an algorithm that systematically discriminates against PoC, sometimes resulting in worse treatment.
Nicolas Kayser-Bril contributed to this report.