Effective healthcare AI from data to deployment
CHICAGO – AI has the potential to foster early and accurate disease detection, precision and personalized medicine, population health and care access, clinical decision support, operational efficiencies and improved clinician and patient engagement.
But the reality is, "We are not seeing these come to fruition," said Mayo Clinic Platform's Sonya Makhni.
Speaking to an energized forum on AI and machine learning today at HIMSS23, the AI medical director highlighted the challenges of data standardization, drift in machine learning, accountability and feasibility in clinical practice.
Due to inconsistent efficacy, bias and AI's other challenges, point solutions have seen limited adoption in the real world, she said. Suboptimal practices and assumptions result in systematic errors that bias outcomes for certain patient groups.
"If we don't understand what these are, we can't really address the risks bias creates," said Makhni.
But at the same time, even in the case of the most effective algorithm that has minimal bias and works beautifully, "that algorithm will not be adopted unless it is integrated in the workflow in a smart, sustainable, efficient way."
Healthcare AI must make sense in clinical practice. It's "unacceptable and unsustainable" to expect clinicians to deal with excessive pop-ups or have to navigate away from their workflows, Makhni said.
Changing dynamics also lead to monitoring and tracking issues for ML models. "There are also clinical and operational changes that need to be accounted for – like clinical guidelines changes … How is that impacting the algorithm?"
Meanwhile, the way a model's underlying mathematics changes over time – machine learning drift – needs mechanisms for monitoring and accountability, she said.
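For readers curious what that kind of drift monitoring can look like in practice, here is a minimal, illustrative sketch – not Mayo Clinic Platform code – that flags when a feature's live distribution diverges from its training distribution using a Population Stability Index. The function, threshold and simulated lab values are assumptions made for the example.

```python
# Illustrative only: a simple Population Stability Index (PSI) check for ML drift.
import numpy as np

def population_stability_index(expected: np.ndarray, actual: np.ndarray, bins: int = 10) -> float:
    """Compare a feature's production distribution against its training distribution."""
    # Bin edges come from the training ("expected") data.
    edges = np.percentile(expected, np.linspace(0, 100, bins + 1))
    # Widen the outer edges so out-of-range production values still fall into a bin.
    edges[0] = min(edges[0], actual.min())
    edges[-1] = max(edges[-1], actual.max())

    expected_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    actual_pct = np.histogram(actual, bins=edges)[0] / len(actual)

    # Clip empty bins to avoid division by zero and log(0).
    expected_pct = np.clip(expected_pct, 1e-6, None)
    actual_pct = np.clip(actual_pct, 1e-6, None)

    return float(np.sum((actual_pct - expected_pct) * np.log(actual_pct / expected_pct)))

# Hypothetical example: a lab-value feature whose live distribution has shifted.
rng = np.random.default_rng(0)
training_values = rng.normal(loc=100, scale=15, size=5000)  # data the model was trained on
live_values = rng.normal(loc=110, scale=20, size=5000)      # recent production data

psi = population_stability_index(training_values, live_values)
if psi > 0.2:  # a common rule-of-thumb threshold for "significant" drift
    print(f"PSI = {psi:.3f}: significant drift - trigger a review of the model")
```

A check like this only covers statistical drift; the clinical and operational changes Makhni mentions, such as updated guidelines, still require human review.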
Deploying validated algorithms in clinical practice
Makhni described how platform-based collaboration at Mayo Clinic is developing AI data standards, building ecosystems for training and validating models, ensuring cybersecurity and deploying algorithms in real time.
On the platform, collaborators agree to data standards and retain ownership of their data as they partner to train and validate models, she explained.
With access to 10.4 million de-identified longitudinal patient records, 12 billion vitals, 1.3 billion lab results, 662 million clinical notes and more, Mayo Clinic facilitates external innovation at a much faster pace without jeopardizing patient data.
The platform aims to democratize data. With no-code, low-code and high-code tools, innovators and clinicians can use their professional knowledge and create their own data, Makhni said.
By using shared frameworks to validate algorithms against disparate data sets on the platform, developers can make their AI relevant to a clinical audience.
To tackle bias, the platform surfaces areas of potential bias in representations that are easier to understand and more translatable into practice, which helps build intuition.
Makhni said the platform is deploying validated algorithms – such as EKG algorithms – and integrating them into workflows. She noted that the deployed algorithms provide real-time insights and benchmarking and present a pathway for building a registry that monitors ML drift.
Regardless of the development approach, there are a few key considerations to improve the chance that algorithms are adopted, Makhni concluded.
While ML collaborators should commit to aligning on values and protecting patient privacy, it's equally critical to equip the workforce and empower clinicians to use the algorithms correctly and "upskill without overburdening."
Her final advice was to instill trust through transparency and to collaborate and co-create with stakeholders and decision-makers – "even if there is tension."
"And never forget about the patient. We're doing this to improve their lives. So we need to find ways to involve them in this process," Makhni said.
Andrea Fox is senior editor of Healthcare IT News.
Email: afox@himss.org
Healthcare IT News is a HIMSS Media publication.