Inside the effort to limit legal exposure while going all-in on AI

Providers are increasingly using artificial intelligence tools to bridge staff shortages in skilled nursing facilities, but legal experts warn that taking shortcuts during implementation can increase long-term risks. Providers must establish strong governance frameworks to ensure privacy and confidentiality, and be prepared for regulatory scrutiny.
Legal experts warn, however, that launching these tools without proper understanding, oversight, and auditing can increase long-term risk. Attorney Greg Smith notes that AI can streamline processes, improve clinical decision-making, and enhance quality of care, but providers must prepare for the risks that come with it. Sheppard partner Carolyn Metnick emphasizes that AI governance is not just IT oversight but enterprise risk management.

The Department of Health and Human Services has issued a request for information on accelerating AI adoption in clinical care but has yet to propose a rule. In the meantime, existing laws such as HIPAA may already apply to AI tools, and a growing patchwork of state AI laws demands extra attention from large organizations.