Improving health equity one algorithm at a time
Perspective: Insights-driven outcomes | Capability: Behavioral Health | By: Beacon Health
In healthcare, testing algorithms for unintended bias is imperative, but it's only the first step. Going beyond testing to resolve the bias is what matters most. Beacon Health Options, a Carelon business, did just that to better support homeless Medicaid members.
Data and algorithms are frequently used in the healthcare industry to identify populations that may benefit from specialty care management. Data-driven programs built on algorithms can improve disease management and health outcomes and reduce the cost of care. They also have the potential to remove bias from human decisions about access to care.
But what happens when the algorithm itself is biased? Recent research has shown that algorithms in healthcare[1] and other fields[2] can show bias against certain populations due to systemic racism that is reflected in the data used to construct these computer-based calculations.
In healthcare, for example, data on the cost and utilization of care is often relied upon as an indicator of problem severity. However, studies show that Black, Indigenous, and People of Color (BIPOC) typically consume healthcare at lower rates than non-Hispanic Whites despite having similar health status. In this case, over-reliance on utilization- or cost-based indicators can perpetuate bias by under-recognizing health issues in BIPOC populations.
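To make that failure mode concrete, here is a minimal, hypothetical sketch: the members, field names, and numbers below are invented for illustration and do not represent Beacon's actual model.

```python
# Hypothetical illustration of proxy bias, using invented data.
# A score that treats past spending as a stand-in for health need ranks
# a member who faced barriers to care as "lower need", even when a
# direct indicator (here, a chronic-condition count) says otherwise.

from dataclasses import dataclass

@dataclass
class Member:
    member_id: str
    chronic_conditions: int  # direct (if crude) indicator of health need
    annual_cost: float       # historical spending: a biased proxy for need

members = [
    Member("A", chronic_conditions=3, annual_cost=12_000.0),
    # Same health status, but less care consumed:
    Member("B", chronic_conditions=3, annual_cost=5_000.0),
]

# Cost-based targeting: member B drops to the bottom of the outreach
# list and may never be flagged for specialty care management.
by_cost = sorted(members, key=lambda m: m.annual_cost, reverse=True)
print([m.member_id for m in by_cost])  # ['A', 'B']

# Need-based targeting: the two members are, correctly, tied.
by_need = sorted(members, key=lambda m: m.chronic_conditions, reverse=True)
print([(m.member_id, m.chronic_conditions) for m in by_need])  # both score 3
```

The remedy described in the research cited below [1] was of this kind: changing the label the algorithm predicts from cost to a more direct measure of health substantially reduced the disparity.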
*Claims were pulled for one year, allowing for a four-month gap, starting from the month the Homeless Management Information System (HMIS) data was loaded. Data was pulled regardless of the number of days the member had been eligible for Medicaid during that year.
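For illustration only, one way to read that pull window is sketched below. The function names are hypothetical, and treating the four-month gap as a claims run-out allowance (extracting claims four months after the window closes so late-arriving claims are captured) is an assumption, not Beacon's documented method.

```python
# Hypothetical sketch of the claims pull window in the note above.
# Assumption: the four-month gap is a claims run-out allowance.
# No minimum Medicaid-eligibility duration is applied, per the note.

from datetime import date

def add_months(d: date, n: int) -> date:
    # Simple month arithmetic, pinned to the first of the month.
    total = d.year * 12 + (d.month - 1) + n
    return date(total // 12, total % 12 + 1, 1)

def claims_window(hmis_load_date: date) -> tuple[date, date, date]:
    """Return (window_start, window_end, earliest_pull_date)."""
    start = hmis_load_date.replace(day=1)  # month the HMIS data was loaded
    end = add_months(start, 12)            # one year of claims
    pull = add_months(end, 4)              # four-month run-out gap
    return start, end, pull

start, end, pull = claims_window(date(2021, 3, 15))
print(start, end, pull)  # 2021-03-01 2022-03-01 2022-07-01
```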
Sources:
1. Science: https://www.science.org/doi/10.1126/science.aax2342
2. The New York Times: https://www.nytimes.com/2017/12/20/upshot/algorithms-bail-criminal-justice-system.html