Healthcare algorithm biased against black patients

Racial bias has been uncovered in a widely used health algorithm, negatively affecting black patients, according to new research published in Science.

The findings are significant as the healthcare field increasingly relies on machine learning and AI algorithms to guide care decisions.

One of the biggest challenges in analyzing such algorithms is that they are often proprietary, meaning outside researchers may have a hard time evaluating them. To audit proprietary algorithms, independent researchers have to work from the outside, which makes it even more difficult to find disparities and to know what to do about them.

“Without an algorithm’s training data, objective function, and prediction methodology, we can only guess as to the actual mechanisms for the important algorithmic disparities that arise,” wrote corresponding author Sendhil Mullainathan of the Booth School of Business at the University of Chicago and colleagues.

Mullainathan and his fellow researchers looked at one typical algorithm using a dataset rich enough to include both the algorithm's predictions and the data behind them: inputs, outputs and outcomes. The main sample consisted of 6,079 black patients and 43,539 white patients, with about 71% enrolled in commercial insurance and 29% in Medicare.

The researchers obtained the risk scores for each patient-year and found that the algorithm used healthcare costs as a proxy for health when ranking patients. The problem they found was that healthier white patients received the same risk scores as black patients who were sicker, carrying at least one more chronic illness. This meant black patients were less likely to be flagged as patients who could benefit from increased healthcare, with more workers monitoring their health and condition.

The disparity likely stems from the fact that less money is spent on black patients than on white patients with similar conditions, so cost understates black patients' actual health needs.

Fortunately, the researchers also found that the algorithm can be improved by using different labels, such as replacing cost with an index variable that combines a health prediction with a cost prediction.
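To make the label-swapping idea concrete, here is a minimal, hypothetical sketch in Python. It is not the study's actual model or data: it trains one risk model on a cost-only label and another on a blended cost-and-health label, then compares which patients each model flags for extra care. The feature names, the 50/50 blending weight, the 3% flagging cutoff and the synthetic data are all illustrative assumptions.

```python
# Sketch: contrast a cost-only label with a combined cost-and-health label
# when training a risk model used to rank patients for extra care programs.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000

# Synthetic patient-year records (features only; labels built below).
X = pd.DataFrame({
    "age": rng.integers(18, 90, n),
    "prior_visits": rng.poisson(3, n),
    "num_chronic_conditions": rng.poisson(1.5, n),
})
cost = 500 * X["prior_visits"] + 1_000 * X["num_chronic_conditions"] + rng.gamma(2, 500, n)
health_need = X["num_chronic_conditions"] + rng.normal(0, 0.5, n)

# Combined label: blend standardized cost with standardized health need,
# rather than relying on cost alone as the proxy for who needs extra care.
z = lambda s: (s - s.mean()) / s.std()
combined_label = 0.5 * z(cost) + 0.5 * z(health_need)

X_tr, X_te, cost_tr, cost_te, comb_tr, comb_te = train_test_split(
    X, cost, combined_label, random_state=0
)

cost_model = GradientBoostingRegressor(random_state=0).fit(X_tr, cost_tr)
combined_model = GradientBoostingRegressor(random_state=0).fit(X_tr, comb_tr)

# Flag the top 3% of predicted risk for extra care under each label choice.
def top_flags(scores, frac=0.03):
    return scores >= np.quantile(scores, 1 - frac)

flags_cost = top_flags(cost_model.predict(X_te))
flags_combined = top_flags(combined_model.predict(X_te))
print("Patients flagged under both labels:", int((flags_cost & flags_combined).sum()))
```

Because the only change is the label the model is trained on, a comparison like this isolates how much the choice of target, rather than the learning algorithm itself, drives who gets flagged for additional care.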

“These results suggest that label biases are fixable,” Mullainathan et al. wrote. “We must change the data we feed the algorithm—specifically, the labels we give it.”

Amy Baxter

Amy joined TriMed Media as a Senior Writer for HealthExec after covering home care for three years. When not writing about all things healthcare, she fulfills her lifelong dream of becoming a pirate by sailing in regattas and enjoying rum. Fun fact: she sailed 333 miles across Lake Michigan in the Chicago Yacht Club "Race to Mackinac."
