Kaiser Permanente's new head of AI on 'two fundamental shifts' the technology will enable
An increasing number of health systems are hiring executives, usually physicians, to head their artificial intelligence efforts – often as chief AI officers at the C-suite level. The complex and promising technology is in the midst of explosive growth in healthcare and many other industries.
Kaiser Permanente, with its 40 hospitals and 600 medical offices, created the role of vice president of artificial intelligence and emerging technologies when it hired Dr. Daniel Yang in late 2023.
Yang, who is not a chief, reports to Dr. Andrew Bindman, the chief medical officer. In his role, he establishes quality oversight for all AI applications across the organization, including those used in clinical operations, research, education and related administrative functions.
Yang recently published a framework of seven principles that guides the organization in helping to ensure the AI tools it deploys are safe and reliable. He also has called on policymakers to help make sure AI is used responsibly.
Healthcare IT News sat down with Yang to discuss his role at Kaiser Permanente, the most important healthcare issues that AI can affect, and the deployment of a new AI-enabled clinical documentation tool for the health system's physicians and other caregivers.
Q. Kaiser Permanente executives created the role of vice president of artificial intelligence and emerging technologies late in 2023 when they hired you. What were they looking for in an executive, and what did you bring to the table?
A. I joined Kaiser Permanente from the Gordon and Betty Moore Foundation, where I supported AI governance efforts at health systems to ensure a thoughtful and judicious approach to procuring and deploying AI tools.
While at the Moore Foundation, I began having discussions with Kaiser Permanente Chief Medical Officer Dr. Andrew Bindman about setting up a responsible AI program for Kaiser Permanente. It was clear Kaiser Permanente wanted someone who wakes up and goes to sleep thinking about the ways AI and other emerging technologies can help address challenges facing the U.S. healthcare system.
These challenges might include access to quality care, affordability and patient satisfaction.
What ultimately drew me to Kaiser Permanente was the opportunity to continue shaping the intersection of patient safety, healthcare quality and emerging technologies. I saw this role as an irresistible opportunity to build a responsible AI culture and program at Kaiser Permanente and, by working for one of the largest integrated health systems, an opportunity to influence the responsible adoption of health AI across the United States.
It also allowed me to return to my roots in internal medicine. I am still a practicing clinician, and I continue to see patients in urgent care at Kaiser Permanente and the Department of Veterans Affairs. This hands-on experience helps me understand and have empathy for the daily challenges clinicians face. I bring that empathy to my role every day as we consider how to deploy AI and new technologies in ways that enhance patient care and minimize risks.
Q. How would you describe your job to peers interested in a role like yours and to executives considering creating such a position?
A. When I look at other AI leaders or similar executive roles being created across different health systems, I see several common denominators in these positions.
One, these leaders often are practicing clinicians.
Two, many of these leaders are not technologists at their core. They have come to value the importance of technology, whether their backgrounds are in quality, safety or research.
Three, these roles require some operational experience. In most cases, these leaders must stand up new processes and programs that didn't exist before. They need to imagine and trailblaze new programs without pre-existing playbooks and find the resources to do it, which can be particularly challenging given the macro pressures of the rising costs in healthcare.
And four, strong communication skills are crucial to ensure AI technologies are implemented in a way that is scientifically sound and practically useful. I consider myself an AI translator, bridging the gap between technical teams, clinicians and the public. While I may not have a PhD in artificial intelligence or computer science, my strength is in aligning AI solutions to meet the practical needs of our clinicians and members.
Q. What do you think are the most important AI issues in healthcare today?
A. I think the real question is: "What are the most important issues in healthcare today?" This is where I feel like many leaders get it wrong. You don't have an AI problem. AI is just one of many tools or technologies we can use to help solve real problems the U.S. healthcare system faces today.
Some of these important, overarching challenges in healthcare include increasing delays in patient access, rising costs to deliver healthcare services, and provider burnout. These challenges are the consequences of a mismatch between the growing complexity of care delivery and the availability of trained clinicians to provide care.
AI can help enable two fundamental shifts needed to address this supply-demand mismatch.
First, AI can help unlock opportunities for patients to better self-manage their care.
Second, AI can help the healthcare system evolve from a 1:1 model of care (1 clinician:1 patient) to a 1:many model of care while maintaining or even enhancing the quality of care and patient experience. The use of generative AI for personalized tutoring in the education sector is an interesting model for us to learn from.
And finally, the healthcare industry must address the concerns around safety and quality when it comes to AI in healthcare. This requires a robust responsible AI program that prioritizes patient safety and quality of care above all else.
It also will accelerate an organization's capabilities in addressing the ethical considerations and potential biases in AI deployments. Ensuring AI tools do not perpetuate existing inequalities in healthcare is paramount. Continuous monitoring and evaluation are essential to achieve these goals.
Q. What is an AI project you have overseen in your first year at Kaiser? How did the process go and what have you and your team accomplished? What have been the outcomes?
A. The COVID-19 pandemic pushed U.S. physician burnout to an all-time high. We aimed to better support our doctors and clinicians by reducing the administrative work they encounter in their day-to-day lives, especially the time spent on documenting clinical notes during patient visits. We saw that the rapid advancements in generative AI could help with this issue.
I helped lead the deployment of a new AI-enabled clinical documentation tool for our doctors and other clinicians at Kaiser Permanente's 40 hospitals and more than 600 medical offices. We believe this was the largest implementation of ambient listening technology in the United States.
The tool, which requires patient consent, helps doctors and other clinicians securely draft clinical notes during patient visits. Most importantly, it liberates our doctors and clinicians from their keyboards so they can refocus their attention on patients instead of computer screens.
Our belief at Kaiser Permanente is that AI should never replace the judgment or expertise of our doctors and clinicians. To succeed in this, we must assess any AI tool before deploying it to ensure we understand how to safely and effectively use it.
For example, when we first rolled out the tool, I co-led our quality assurance testing to assess how well it performed in different clinical specialties and in the messiness of real-world environments. These insights helped guide our responsible use of the tool, including our trainings for doctors and clinicians on how to use it effectively while minimizing risk.
As of September 2024, doctors and clinicians have used the tool to record more than 2 million patient interactions. Doctors and clinicians have shared that the tool allows for more meaningful interactions with their patients. Patients also have shared that the tool is creating greater transparency during the visit.
For example, many physicians are now dictating aloud their results during a physical exam whereas before they may not have said anything. To me, this is a great example of how emerging technologies can help support our care teams in delivering superior care experiences for our members and patients.