A report published by the Center for Applied Artificial Intelligence at the University of Chicago Booth School of Business has found that algorithms used to inform healthcare delivery and planning across the US are reinforcing racial and economic biases.

The Algorithmic Bias Playbook details how biased algorithms are influencing how patients are treated by hospitals, insurers and other businesses.

The playbook sets out a plan of action healthcare organisations can take: create an inventory of their algorithms, screen them for bias, adjust or scrap those whose bias cannot be fixed, and set up structures to prevent future bias.

Berkeley School of Public Health associate professor Ziad Obermeyer, who co-authored the report, told STAT: “These algorithms are in very widespread use and affecting decisions for millions and millions of people, and nobody is catching it.”

The researchers found that bias was common in both simple clinical calculators and checklists and in more complex, artificial intelligence (AI)-informed algorithms.

The report flags biases in algorithms that: determine the severity of knee osteoarthritis; measure mobility; predict the onset of serious illness; and identify which patients may fail to attend appointments or may benefit from additional outreach to manage their health.

The researchers also found that the Emergency Severity Index, which is used in about 80% of US hospitals to group patients based on the urgency of their medical needs in emergency departments, performed poorly in assessing black patients.

Work started on the Algorithmic Bias Playbook after a high-profile 2019 study found that a prominent AI software that determined which patients get access to high-risk healthcare management programmes routinely prioritised healthier white people over less healthy black people.

This was because the algorithm factored historic healthcare costs into its final decision. Because of structural inequalities in the US healthcare system, black people tend to generate lower healthcare costs than white people at the same level of health. This meant black patients were much sicker than white ones for a given level of predicted risk.
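To make the mechanism concrete, the following is a minimal simulation sketch in Python. It is not the actual algorithm or data from the 2019 study; the group labels, cost gap and enrolment cut-off are invented for illustration. It shows how a risk score trained on healthcare costs, rather than health needs, disadvantages a group that generates lower costs at the same level of need.

```python
# Hypothetical illustration only: a "risk score" based on predicted cost
# under-ranks any group that, for structural reasons, generates lower
# costs at the same level of underlying health need.

import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Simulated underlying health need (higher = sicker), identical across groups.
need = rng.gamma(shape=2.0, scale=1.0, size=n)
group = rng.choice(["A", "B"], size=n)  # "B" stands in for the under-served group

# Assumption for illustration: group B generates ~30% lower costs
# at the same level of need, mimicking unequal access to care.
access = np.where(group == "B", 0.7, 1.0)
cost = need * access * rng.lognormal(mean=0.0, sigma=0.3, size=n)

# Use cost as the risk score and enrol the top 20% in the care programme.
threshold = np.quantile(cost, 0.80)
enrolled = cost >= threshold

for g in ("A", "B"):
    mask = enrolled & (group == g)
    print(
        f"group {g}: share of enrolees = {mask.sum() / enrolled.sum():.0%}, "
        f"mean need among enrolees = {need[mask].mean():.2f}"
    )
# Typical result: group B receives far fewer programme slots, and the
# group B patients who do get in are sicker than group A patients
# admitted at the same score cut-off.
```

Ranking patients by a direct measure of health need instead of cost removes this particular distortion, which is in line with the correction described below.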

By correcting the algorithmic disparities between black and white patients, the percentage of black people enrolled in the programmes leapt from 18% to 47%.