30 April 2024

Non-Discrimination in Healthcare Final Rule imposes diligence obligations on Medicare providers and suppliers using patient care decision support tools

Two federal agencies under the US Department of Health and Human Services (HHS) – the Office for Civil Rights (OCR) and the Centers for Medicare and Medicaid Services (CMS) – jointly released a comprehensive Final Rule on Friday, April 26, 2024, designed to bolster protections for patients against discrimination in healthcare access and service delivery. The Nondiscrimination in Health Programs and Activities Final Rule was issued under Section 1557 of the Affordable Care Act and prohibits discrimination on the basis of race, color, national origin, sex, age, or disability in certain health programs and activities, any part of which is receiving Federal financial assistance.

This Final Rule follows publication of OCR’s proposed rule on August 4, 2022, which received substantial public comment. In addition to other extensive requirements addressing access to and receipt of healthcare services for vulnerable populations, the Final Rule imposes obligations on covered entities to conduct reasonable diligence to identify potentially discriminatory outputs arising from their use of clinical algorithms. Specifically, OCR solicited comment under the proposed rule about “what types of clinical algorithms are being used in covered health programs and activities; how such algorithms are being used by covered entities; [and] whether they are more prevalent in certain health settings …” Following review of comments, OCR and CMS sought to clarify application of the non-discrimination provisions to clinical algorithms by adopting a new term, “patient care decision support tool”, which is defined to mean “any automated or non-automated tool, mechanism, method, technology, or combination thereof used by a covered entity to support clinical decision-making in its health programs or activities.” The term is thus intended to capture broadly the tools covered entities use in clinical decision-making impacting patient care, including, but not limited to, AI algorithms and other clinical decision support.

Importantly, for the first time, receipt of Medicare Part B payments is considered a form of federal financial assistance for purposes of triggering civil rights laws. This means that any provider or supplier receiving Medicare payments is a covered entity under the Final Rule and is subject to its non-discrimination provisions.

Under Section 92.210, covered entities are required to make reasonable efforts to identify patient care decision support tools used in their health programs and activities that employ input variables or factors that measure race, color, national origin, sex, age, or disability. Further, for each patient care decision support tool identified, a covered entity must make reasonable efforts to mitigate the risk of discrimination resulting from the tool’s use in its health programs or activities. Notably, the Final Rule did not directly address concepts of implicit bias, which account for much of the AI bias observed in healthcare, ie, where proxy variables lead to a discriminatory result. Recognizing the many possible indirect measures of race, color, national origin, sex, age, and disability, OCR did not require covered entities to identify all patient care decision support tools with input variables or factors that indirectly measure these protected bases. Importantly, Section 92.210 does not require covered entities to obtain datasets or other attribute information from developers when purchasing or using patient care decision support tools. However, if a covered entity has reason to believe that the tool uses variables or factors that measure race, color, national origin, sex, age, or disability, or the covered entity otherwise knows or should know that the tool could result in discrimination, the covered entity has an obligation to further investigate to determine if mitigation efforts are needed. This “know or should know” standard may lay the foundation for OCR investigations into implicit bias in patient care decision support tools, in addition to the more overt forms of explicit bias discussed in the Final Rule.
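By way of illustration only, the first step of this diligence, inventorying tools and flagging those whose documented inputs measure a protected basis, could begin with something as simple as the following sketch. The tool names, variable names, and keyword lists are hypothetical assumptions for the example, not anything prescribed by the Final Rule, and a real review would rest on developer documentation and clinical judgment rather than string matching.

```python
# Hypothetical sketch: inventory patient care decision support tools and flag
# those whose documented input variables may measure a protected basis under
# Section 1557. Tool names, variables, and keyword lists are illustrative
# assumptions, not terms drawn from the Final Rule.

PROTECTED_BASES = {
    "race": {"race", "ethnicity"},
    "color": {"skin_color"},
    "national origin": {"national_origin", "country_of_birth"},
    "sex": {"sex", "gender"},
    "age": {"age", "date_of_birth"},
    "disability": {"disability_status"},
}

def flag_tools(tool_inventory: dict[str, list[str]]) -> dict[str, list[str]]:
    """Return, for each tool, the protected bases its inputs appear to measure."""
    flags: dict[str, list[str]] = {}
    for tool, variables in tool_inventory.items():
        normalized = {v.lower() for v in variables}
        hits = [basis for basis, keywords in PROTECTED_BASES.items()
                if keywords & normalized]
        if hits:
            flags[tool] = hits  # candidates for further review and mitigation
    return flags

if __name__ == "__main__":
    inventory = {
        "sepsis_risk_model": ["age", "heart_rate", "lactate"],
        "kidney_function_calc": ["creatinine", "race", "sex"],
    }
    print(flag_tools(inventory))
    # {'sepsis_risk_model': ['age'], 'kidney_function_calc': ['race', 'sex']}
```

A flagged tool is only a starting point; under § 92.210’s “reason to believe” standard, the covered entity would then investigate further and, if needed, mitigate.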

The Final Rule references the Office of the National Coordinator for Health Information Technology’s (ONC) recently published final rule for “Health Data, Technology, and Interoperability: Certification Program Updates, Algorithm Transparency, and Information Sharing” for predictive decision support interventions (Predictive DSI), which DLA Piper previously reported here. OCR clarifies that the Final Rule is not duplicative of the ONC rule regarding Predictive DSIs because ONC’s rule applies to and includes requirements for health information technology (IT) developers, whereas § 92.210 applies to and includes requirements for Section 1557 covered entity users of patient care decision support tools (including Predictive DSIs). However, OCR noted its intention that the Final Rule work in tandem with ONC’s final rule. Specifically, the ONC Final Rule requires developers with Health IT Modules certified to § 170.315(b)(11) to disclose information to the DSI users about a DSI’s source attributes relevant to health equity. OCR intends for this disclosure requirement to enable a covered entity that uses Health IT Modules certified to § 170.315(b)(11) to learn from the developer whether a specific DSI relies on attributes that measure race, color, national origin, sex, age, or disability. Covered entities must therefore exercise due diligence when acquiring and using such tools to ensure compliance.

OCR expressed its view that the scope of patient care decision support tools used by covered entities such as hospitals, providers, and payers (health insurance issuers) in their health programs and activities will be interpreted broadly to encompass “screening, risk prediction, diagnosis, prognosis, clinical decision-making, treatment planning, health care operations, and allocation of resources” as applied to the patient. In particular, tools used to assess health status, recommend care, provide disease management guidance, determine eligibility, and conduct utilization review related to provider-directed patient care all impact clinical decision-making and are covered under the Final Rule.

While OCR declined to require specific mitigations for covered entities with respect to their evaluation of these tools, it noted some safeguards that covered entities could use to mitigate discrimination, including, among other things, (i) establishing written policies and procedures governing how clinical algorithms will be used in decision-making, (ii) adopting governance measures, (iii) monitoring any potential impacts and developing ways to address complaints, and (iv) training staff on the proper use of such systems in clinical decision-making.
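As a hedged illustration of safeguard (iii), monitoring potential impacts, a covered entity might periodically compare a tool’s positive-recommendation rates across demographic groups. The record layout, field names, and the 0.8 review threshold (borrowed from the familiar “four-fifths” rule of thumb in disparate-impact analysis) are assumptions of this sketch; the Final Rule prescribes no particular statistical test.

```python
# Hypothetical sketch of safeguard (iii): monitor a tool's recommendations for
# group-level disparities. Field names and the 0.8 review threshold are
# assumptions of this example; the Final Rule prescribes no particular test.
from collections import defaultdict

def selection_rates(records: list[dict], group_key: str) -> dict[str, float]:
    """Rate at which the tool issued a positive recommendation, per group."""
    totals: dict[str, int] = defaultdict(int)
    positives: dict[str, int] = defaultdict(int)
    for record in records:
        group = record[group_key]
        totals[group] += 1
        positives[group] += int(record["recommended"])
    return {group: positives[group] / totals[group] for group in totals}

def disparity_review(records: list[dict], group_key: str,
                     threshold: float = 0.8) -> dict[str, float]:
    """Flag groups whose rate falls below `threshold` times the highest rate."""
    rates = selection_rates(records, group_key)
    best = max(rates.values())
    return {g: rate for g, rate in rates.items() if best and rate < threshold * best}

# Toy usage: group B's referral rate is well below 80 percent of group A's.
records = [
    {"group": "A", "recommended": True},  {"group": "A", "recommended": True},
    {"group": "A", "recommended": False},
    {"group": "B", "recommended": True},  {"group": "B", "recommended": False},
    {"group": "B", "recommended": False}, {"group": "B", "recommended": False},
]
print(disparity_review(records, "group"))  # {'B': 0.25}
```

A flagged group would not itself establish discrimination, but it is the kind of signal that, under the “know or should know” standard discussed above, would warrant further investigation.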

OCR noted that certain input variables, such as race, may draw greater scrutiny than others that can have a clinically relevant basis, such as age. While OCR appears to make allowances for input variables that take a particular protected basis into consideration (eg, age), it does not overtly discuss the ways in which race could itself be a clinically relevant factor, such that use of race as an input could guard against proxy bias and increase minority populations’ access to needed healthcare services. The industry should guard against unintended consequences, where regulatory burdens on algorithms incorporating race information incentivize facially neutral algorithms that implicitly increase bias (a dynamic illustrated in the sketch following the list below). It is unclear how OCR would ultimately view the use of patient care decision support tools that use certain input variables to improve access to care for vulnerable populations; however, this may be something OCR considers in its case-by-case evaluation of noncompliance with § 92.210. The Final Rule laid out some of the factors that OCR will consider when investigating covered entities’ compliance, including:

  1. The covered entity’s size and resources (eg, a large hospital with an IT department and a health equity officer would likely be expected to make greater efforts to identify tools than a smaller provider without such resources);
  2. Whether the covered entity used the tool in the manner or under the conditions intended by the developer and approved by regulators, if applicable, or whether the covered entity has adapted or customized the tool;
  3. Whether the covered entity received product information from the developer of the tool regarding the potential for discrimination or identified that the tool’s input variables include race, color, national origin, sex, age, or disability; and
  4. Whether the covered entity has a methodology or process in place for evaluating the patient care decision support tools it adopts or uses, which may include seeking information from the developer, reviewing relevant medical journals and literature, obtaining information from membership in relevant medical associations, or analyzing comments or complaints received about patient care decision support tools.
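To make the proxy-bias concern noted above concrete, the following sketch uses entirely synthetic data (our construction, not anything from the Final Rule) to show how a facially neutral input can reproduce a group-level disparity even when no protected attribute appears among a tool’s inputs.

```python
# Hypothetical illustration of proxy bias: a facially neutral input (a
# synthetic "neighborhood" code) stands in for group membership, so removing
# the protected variable does not remove the disparity. Entirely synthetic.
import random

random.seed(0)

# Synthetic population in which neighborhood correlates 90% with group.
people = []
for _ in range(10_000):
    group = random.choice(["A", "B"])
    same = random.random() < 0.9
    neighborhood = group if same else ("B" if group == "A" else "A")
    people.append({"group": group, "neighborhood": neighborhood})

def score(person: dict) -> int:
    """'Facially neutral' rule: looks only at neighborhood, never at group."""
    return 1 if person["neighborhood"] == "A" else 0

for g in ("A", "B"):
    members = [p for p in people if p["group"] == g]
    rate = sum(score(p) for p in members) / len(members)
    print(f"group {g}: positive-score rate = {rate:.2f}")
# Prints roughly 0.90 for group A and 0.10 for group B: a large disparity
# produced without any protected attribute among the model's inputs.
```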

OCR has been granted enforcement authority over Section 1557, including the authority to handle complaints; initiate and conduct compliance reviews; conduct investigations; supervise and coordinate compliance within HHS; make enforcement referrals to the Department of Justice, in coordination with the Office of the General Counsel and the relevant component or components of HHS; and take other appropriate remedial action as deemed necessary, in coordination with the relevant component or components of HHS, and as allowed by law. However, Section 1557 does not remove or replace any enforcement mechanisms under Title VI of the Civil Rights Act of 1964 (42 U.S.C. 2000d et seq.), Title IX of the Education Amendments of 1972 (20 U.S.C. 1681 et seq.), the Age Discrimination Act of 1975 (42 U.S.C. 6101 et seq.), or Section 504 of the Rehabilitation Act of 1973, including any private right of action under any of the foregoing. It therefore remains to be seen how Section 1557 will ultimately be enforced against covered entities.

Examples of algorithmic bias, particularly involving the prevalence of ethnic and racial bias in clinical algorithms resulting in fewer health care services provided to minority patients, were also cited in the Final Rule. Despite these risks, OCR expressed its support for covered entities’ continued development and use of patient care decision support tools to improve patient care but cautioned regarding the importance of testing and recalibration to avoid discriminatory application of such tools in patient care.

OCR further indicated that as it implements Section 1557 and other civil rights laws, it will continue to consider additional actions to support covered entities in their implementation of these tools, including through guidance or the provision of technical assistance. To this end, OCR pointed to recent federal guidance, including (1) the Biden Administration’s Blueprint for an AI Bill of Rights, which includes a principle for protecting the public from algorithmic discrimination; (2) E.O. 14091, Further Advancing Racial Equity and Support for Underserved Communities Through the Federal Government, which includes a section requiring agencies to consider opportunities to “prevent and remedy discrimination, including by protecting the public from algorithmic discrimination;” and (3) E.O. 14110, Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence, which sets forth numerous executive actions designed to ensure the equitable, safe, and secure use of AI. OCR declined to require specific transparency disclosures regarding patient care decision support tools at this time; however, the agency noted that it would be a best practice for covered entities to disclose information to patients about the patient care decision support tools used in their health programs and activities. OCR further noted that it will continue to partner with other government agencies and covered entities to address best practices and may release guidance in the future.

While the Final Rule is effective 60 days after publication, OCR has determined that it is reasonable to allow additional time for covered entities to comply with certain procedural requirements. Covered entities must comply with the provisions pertaining to the use of patient care decision support tools within 300 days of the Final Rule’s effective date. The additional time is intended to give covered entities the opportunity to properly designate a Section 1557 Coordinator, develop their internal processes to evaluate their use of patient care decision support tools, and implement mitigations to address any identified discriminatory effect.

OCR is further seeking comment on the uses of patient care decision support tools addressed in this Final Rule, as well as other uses that may result in unlawful discrimination in violation of Section 1557, including whether § 92.210 should be expanded to cover additional tools, demonstrating the agency’s intention to continue monitoring and developing guidance in this area.

With a sector-focused AI practice, supported by a team of data scientists, DLA Piper is well positioned to assist covered entities with their AI governance policies and procedures, including their adoption, implementation and testing of patient care decision support tools.

Please reach out to the authors of this alert or to your DLA Piper relationship partner for more information.
