White Paper

AI Governance in Health Systems: Aligning Innovation, Accountability, and Trust

Published date

October 28, 2024

Summary

Tools enabled by artificial intelligence (AI) have the potential to transform patient outcomes and health system operations and are already having significant effects. AI applications have facilitated faster triage and diagnosis, enabled the anticipation of patient outcomes to create personalized treatment plans, and streamlined clinical operations, patient communication, and resource allocation. While the integration of AI tools into healthcare systems offers immense potential, the use of AI in such a sensitive and critical sector also raises significant ethical, legal, and practical concerns.

A comprehensive governance system has multiple advantages: it helps ensure patient safety, maintain ethical standards, support regulatory compliance, foster trust through transparency and accountability, and manage privacy concerns and other legal issues. But AI governance is a relatively new concept for health systems, many of which have integrated only a limited number of AI tools into their workflows.

This project convened a working group of six health systems (see below) located across the United States that have established AI governance systems in the past several years, and conducted informational interviews with multiple other health systems to learn about AI governance scope, goals, and processes. We found important commonalities in the components of governance processes, but different approaches to accomplishing these tasks. This paper walks through the main components of health system AI governance, explores how different health systems approach these components, and discusses how health systems can begin to set up their own governance systems. We offer recommendations for policy makers, health systems, and other stakeholders on how they can standardize and simplify these processes to democratize access to AI-enabled health tools. All of the health systems told us that governance is a resource-intensive task, and that more technical expertise, training, and tools are needed to help under-resourced health systems realize the benefits that AI tools may provide.

The details of this paper will be discussed during a webinar on November 18, 2024.

Duke-Margolis Authors

Valerie Parker, MS

Policy Research Associate

Christina Silcox, PhD

Research Director, Digital Health
Adjunct Assistant Professor
Senior Team Member
Margolis Core Faculty

External Authors

Duke Health
Nicoleta J. Economou, PhD

Director of Governance and Evaluation of Health AI Systems

Working Group

The University of Chicago Medicine
Karen Habercross

VP, Chief Information Security and Privacy Officer

Mayo Clinic
Hailey Hildahl

Senior Digital Product Manager

Mark Lifson

Director, AI Systems Engineering

Lauren Rost

Senior AI/ML Engineer

David Vidal

Vice Chair, AI Enablement

Stanford Healthcare
Nikesh Kotecha

Director, Head of Data Science

Anurang Revri

Vice President, Chief Enterprise Architect

UNC Health
Michael Plesh

Executive Director of Technology

Ram Rimal

Manager of Data Science and AI

Kaiser Permanente
Matthew D. Solomon

Augmented Clinical Intelligence Program

Kaiser Permanente Northern California

Ellen Woo

Executive Director of AI and Emerging Technologies

Daniel Yang

Vice President of AI and Emerging Technologies