The U.S. Department of Health and Human Services (HHS) Office for Civil Rights (OCR) and Centers for Medicare and Medicaid Services (CMS) released a final rule updating nondiscrimination requirements for health care providers under Section 1557 of the Affordable Care Act. Among its various Sec. 1557 implementation updates, the rule includes a new requirement, first proposed by HHS in 2022, to avoid biased decision-making during the use of clinical algorithms, including artificial intelligence (AI)-enabled software.
HHS’ final rule made several substantive changes to the proposed version of this requirement, including replacing the term “clinical algorithms” with “patient care decision support tools” to describe an array of relevant tools, including but not limited to AI-enabled medical device software functions. The rule requires providers and other covered entities to make reasonable efforts to identify uses of these tools in their programs or activities that employ input variables or factors measuring race, color, national origin, sex, age, or disability. Providers must also make reasonable efforts to mitigate the risk of discrimination during the use of these tools.
The American College of Radiology® (ACR®) filed comments in 2022 on the proposed rule version of this language. The College noted that while nondiscrimination is a critical objective, providers may face barriers to accessing the AI training and testing data needed to identify risk of bias. In response to concerns from the ACR and other public commenters, HHS OCR and CMS modified the language to recognize reasonable efforts by providers to identify and mitigate possible bias. HHS OCR and CMS also added a buffer period to give covered entities time to consider compliance approaches.
The ACR Data Science Institute® (DSI®) will provide additional informational resources in the coming weeks. Certain ACR DSI programs, such as , can assist radiology providers with identifying risk of AI bias. Members can contact Michael Peters, ACR Senior Government Affairs Director, with questions about these new requirements.