Identifying and mitigating algorithmic bias in the safety net

Abstract: Algorithmic bias occurs when predictive model performance varies meaningfully across sociodemographic classes, exacerbating systemic healthcare disparities. NYC Health + Hospitals, an urban safety-net system, assessed bias in two binary classification models in our electronic medical record...


Bibliographic Details
Main Authors: Shaina Mackin, Vincent J. Major, Rumi Chunara, Remle Newton-Dame
Format: Article
Language: English
Published: Nature Portfolio 2025-06-01
Series: npj Digital Medicine
Online Access: https://doi.org/10.1038/s41746-025-01732-w