Computers can do many jobs more quickly than humans, but they tend to copy our mistakes—after all, they learned it from watching us. Scientists have found that a major algorithm used in American hospitals is biased against black patients, because it’s reflecting racial health disparities that already existed.
The researchers didn’t want to name-and-shame the particular software or the health system that uses it, so we can’t tell you a specific hospital to avoid. But they note that it’s “a widely used algorithm, typical of this industry-wide approach and affecting millions of patients.”
Hospitals use this algorithm to identify the sickest patients and prioritise them for care, including dedicated nurses and quicker access to primary care appointments. But it turns out that the main number it uses to decide who is sickest is simply how much each patient's health care has cost in the recent past.
So this means that if you’re sick but you haven’t been able to take time off work to visit the doctor when you really should have, you’ll appear to the system as if you’re less sick. Or if you’re a black patient and your doctor doesn’t take your complaints seriously, you’ll likewise appear less sick to the algorithm. The computer assumes that people who get the most care are the people who need the most care, but in reality those numbers simply don’t match up. The researchers found that a black patient had to be significantly sicker than a white peer to get a score that would recommend them for the coordinated care program.
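The study's authors didn't publish the real algorithm, but the proxy problem they describe can be sketched with a toy model. Every function, number, and threshold below is made up for illustration: past cost is modelled as illness scaled by how much care a patient actually received, and the programme enrols whoever clears a cost cutoff.

```python
# Toy sketch of the cost-as-proxy bias described in the study.
# All names and numbers here are illustrative assumptions, not the
# real hospital algorithm (which the researchers did not name).

def cost_proxy_score(illness, access_factor):
    """Past cost rises with how sick you are, scaled by how much care
    you actually managed to get (access_factor < 1 = under-served)."""
    return illness * access_factor

# Two hypothetical patients who are equally sick (illness = 80),
# one of whom receives only half the care at the same illness level.
well_served = cost_proxy_score(illness=80, access_factor=1.0)   # cost 80.0
under_served = cost_proxy_score(illness=80, access_factor=0.5)  # cost 40.0

# An illustrative cost cutoff for the coordinated-care programme.
THRESHOLD = 60
print(well_served >= THRESHOLD)   # True: enrolled
print(under_served >= THRESHOLD)  # False: passed over, despite equal illness

# How sick must the under-served patient be before the score catches up?
min_illness_needed = THRESHOLD / 0.5
print(min_illness_needed)  # 120.0, versus 60 for the well-served patient
```

The toy numbers make the study's finding concrete: because the score tracks care received rather than care needed, the under-served patient must be twice as sick before the algorithm notices.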
There’s no immediate takeaway for patients, but the study has brought the problems with these algorithms to light. The researchers are working with the makers of the algorithm to reprogram it in a way that doesn’t make biased assumptions; so far, they say, they have reduced the bias by 84%.
After all, the authors point out, when you have enough data to work with, you can program an algorithm to counteract some of the bias created by humans. The problem is just that programmers haven’t usually bothered to do that, and so the computers learn to copy the world they live in.
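One way to counteract the bias, and roughly the kind of fix the researchers pursued, is to retarget the model: predict a direct measure of health rather than past cost. The sketch below reuses the same made-up toy model (illness scaled by access to care); nothing in it comes from the actual study.

```python
# Toy sketch of retargeting the label: predicting health directly
# instead of past cost. All numbers are illustrative assumptions.

def cost_based_score(illness, access_factor):
    # Biased proxy: cost reflects care received, not care needed.
    return illness * access_factor

def health_based_score(illness, access_factor):
    # Retargeted label: predict illness itself; access no longer enters.
    return illness

# Two equally sick hypothetical patients with unequal access to care.
equally_sick = [("well served", 1.0), ("under served", 0.5)]
for name, access in equally_sick:
    print(name,
          cost_based_score(80, access),    # 80.0 vs 40.0: biased
          health_based_score(80, access))  # 80 for both: gap gone
```

In this toy version the gap vanishes entirely; in practice the researchers report closing most, not all, of it, which is consistent with their 84% figure.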