Section 1557 Final Rule

Your practice may not yet use AI-driven predictive analytics or clinical algorithms, but if it doesn't, it likely will soon. New federal guidelines, backed by fines, target the discrimination baked into many AI-based tools, and you need to know about them.

A few weeks ago, a small federal agency expanded the nondiscrimination provision of the Affordable Care Act (ACA) to apply to technology used in health care settings. That provision, Section 1557, was enacted in 2010 as part of the ACA to ensure that all individuals have equal access to health care services. It prohibits discrimination based on race, color, national origin, sex, age, or disability in health programs or activities that receive federal financial assistance from HHS, as well as in any health program established under the ACA. Covered entities include hospitals, health clinics, physician practices, health insurance issuers, and state Medicaid agencies.

Since the provision's enactment, the HHS Office for Civil Rights (OCR) has enforced Section 1557 by investigating complaints of discrimination and taking enforcement actions, including monetary penalties, against those found in violation. In a recent final rule, OCR updated Section 1557 to codify expanded protections, clarifying that its nondiscrimination principles apply to the use of AI, clinical algorithms, predictive analytics, and other such tools.