AWS on how AI, machine learning and interoperability improve patient outcomes

A leader at Amazon Web Services goes in-depth into the technologies and tactics that can lead to strong personalized care.
By Bill Siwicki
11:52 AM

Phoebe Yang, general manager for nonprofit healthcare at AWS

Photo: AWS

As the country moves toward value-based care, artificial intelligence and machine learning – paired with data interoperability – have the potential to improve patient outcomes while driving operational efficiency to lower the overall cost of care.

When interoperability is enabled securely and healthcare providers are supported with predictive machine learning models and insights afforded by genomic research, clinicians will be able to seamlessly forecast clinical events – such as strokes, cancer or heart attacks – and intervene early with personalized care and access to curated information that supports a superior patient experience.

Further powering these predictive capabilities with location-agnostic, voice-enabled, accessible modalities of providing care advances the practice of medicine to align with what is most convenient, affordable and targeted for the specific needs of patients.

Healthcare IT News sat down with Phoebe Yang, general manager for nonprofit healthcare at Amazon Web Services, to discuss these subjects, offering healthcare CIOs and other health IT leaders lessons in state-of-the-art technologies.

Q. How can AI and machine learning combined with data interoperability enhance patient outcomes and operational efficiency to lower care costs?

A. Interoperability among medical information systems is foundational – or should be – because without it, physicians don’t have ready access to their patients’ complete medical histories. 

Failing to bring all information together undermines clinical teams’ diagnoses and treatment plans. It opens the door to poor outcomes and readmissions that harm both the patient and the overall cost of running the system.

But there’s more to interoperability and AI/machine learning than avoiding mistakes and omissions. Healthcare is more clinically effective and cost-effective when it prevents problems instead of reacting to them.

Machine learning fueled by comprehensive data can help physicians predict health issues across populations and in individual patients. A timely view of the whole patient can drive breakthroughs in personalized medicine and close gaps in care, including disparities for the most vulnerable people. Data and AI/machine learning are the keys to making that happen.

Between the industry’s own work in promoting open standards and application programming interfaces (APIs), and the contributions of third-party services that specialize in interoperability, we have seen progress, but challenges remain.

At the core of these challenges are incomplete and unstructured data, the rate of adoption of interoperability standards, and getting EHR systems to see beyond clinic walls. Clinical data comes in so many forms, formats and systems.

The onerous task of manually translating data into common formats would be nearly impossible, especially given the exponential growth in the volume, types and sources of data. However, AI and machine learning can automate the process to create standard formats that every participant in the system can access and use.
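As a toy illustration of what that automated translation amounts to (a minimal sketch – the field names, date formats and record shapes below are invented for illustration and do not come from any specific EHR system; production pipelines also use machine learning to handle unstructured text, which this rule-based sketch does not attempt):

```python
from datetime import datetime

# Two hypothetical source systems record the same clinical facts
# in different shapes (all field names and values are invented).
record_a = {"pt_name": "DOE, JANE", "dob": "03/14/1985", "dx": "I10"}
record_b = {"patient": {"family": "Doe", "given": "Jane"},
            "birthDate": "1985-03-14", "diagnosisCode": "I10"}

def normalize_a(rec):
    """Map system A's flat record into the common format."""
    last, first = [part.strip().title() for part in rec["pt_name"].split(",")]
    born = datetime.strptime(rec["dob"], "%m/%d/%Y").date().isoformat()
    return {"family": last, "given": first,
            "birth_date": born, "diagnosis": rec["dx"]}

def normalize_b(rec):
    """Map system B's nested record into the common format."""
    return {"family": rec["patient"]["family"],
            "given": rec["patient"]["given"],
            "birth_date": rec["birthDate"],
            "diagnosis": rec["diagnosisCode"]}

# Once normalized, records from both systems compare equal,
# so every participant can consume one shared representation.
assert normalize_a(record_a) == normalize_b(record_b)
```

The value of the common format is that downstream analytics and models only ever see one schema, no matter how many source systems feed it.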

The cloud is playing a valuable role in driving data interoperability. While the Fast Healthcare Interoperability Resources (FHIR) standard enables clinical data to be presented in a more commonly readable way, cloud-based AI and machine learning help de-identify patient data – reconciling the tension between the value of large-scale centralized data lakes and the need to uphold privacy. Once-disparate, siloed data thus becomes a powerful tool for managing population health, personalizing care, and driving value and efficiency across the system.
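To make the de-identification step concrete, here is a minimal sketch operating on a simplified FHIR-style Patient resource. The values are invented, the schema is heavily reduced, and real de-identification must follow a formal method (e.g., HIPAA Safe Harbor or Expert Determination), not this toy field list:

```python
import copy

# A simplified FHIR-style Patient resource (illustrative values only).
patient = {
    "resourceType": "Patient",
    "id": "example-123",
    "identifier": [{"system": "urn:mrn", "value": "00042"}],
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "telecom": [{"system": "phone", "value": "555-0100"}],
    "gender": "female",
    "birthDate": "1985-03-14",
}

# Fields that directly identify the patient in this toy schema.
DIRECT_IDENTIFIERS = ("id", "identifier", "name", "telecom")

def deidentify(resource):
    """Strip direct identifiers and coarsen the birth date to a year."""
    out = copy.deepcopy(resource)
    for field in DIRECT_IDENTIFIERS:
        out.pop(field, None)
    if "birthDate" in out:
        out["birthDate"] = out["birthDate"][:4]  # keep year only
    return out

safe = deidentify(patient)
assert "name" not in safe and safe["birthDate"] == "1985"
```

Records treated this way retain the clinical attributes that population-scale analytics need (here, gender and birth year) while dropping the fields that point back to an individual.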

A key element of why the cloud is such a powerful enabler is its ability to support data at volume. Case in point: Pathology sits at the center of cancer research and diagnosis, yet the practice has struggled to adopt technology that can handle the digitized images pathologists use, which can be 10 times the size of radiology images.

Maintaining, indexing and using repositories of those images can overwhelm traditional image management systems, causing delays and potential disruption. Through a collaboration with the U.S. government's Joint Pathology Center (JPC) and digital pathology platform developer Proscia, we're deploying AI and machine learning to automate and streamline the digitization of the world's largest repository of human pathology specimens – approximately 55 million slides.

Reducing the cost and complexity associated with managing large pathology-image volumes is making critical data readily available to researchers and clinicians, all with the goal of accelerating research and improving diagnoses.

Q. You've said that by enabling interoperability and supporting clinicians with predictive machine learning via genomics, the clinicians will be able to forecast clinical events and intervene early with personalized care. Please elaborate on how this will be possible.

A. The key to predictive, personalized medicine is data liquidity. Syntactic interoperability (a common structure) and semantic interoperability (a common language) can help practitioners combine an individual patient's own data, at new levels of completeness and detail, with anonymized large-population patient data, including genomics. AI/machine learning models can use that large, detailed view to predict specific, individual health threats with unprecedented accuracy.

It’s like being able to see all the pieces of a large puzzle for the first time – and also being able to assemble them at superhuman speed. There are algorithms that can detect problems like congestive heart failure months before they would normally present clinically.

Sensing technologies can observe subtle behavioral changes that might flag a risk of depression. Or to take another example, the Pittsburgh Health Data Alliance and other researchers we serve are using machine learning to look more closely at mammograms to advance the detection of short-term breast cancer risks.

Making predictions is only part of the challenge. The same data interoperability and machine learning models that deliver the predictions where and when they’re needed can also develop evidence-based treatment plans that apply precisely to the patient and condition at hand. This is a frontier of care that can’t work if the flow of data is interrupted or disjointed anywhere in the process.

Because cloud-based machine learning and AI technologies accelerate the digitization and use of healthcare and life sciences data, clinicians can produce more precisely targeted therapies that wouldn’t be possible using classical methods.

And they can make more accurate predictions of adverse events on the individual level. The information to do this has always existed, but human hands and eyes have never been able to catalog it, sift through it, and find the critical connections to put it to use at the scale required for accuracy and adaptability as new data comes in. Machines can.

It’s interesting to note that the better we understand large populations and gather detailed, longitudinal data from them, the better we can serve individuals by putting that aggregated knowledge to case-by-case use.

For example, Cambia Health Solutions offers a digital and mobile health guide called Journi that uses AI/machine learning. It drives decision engines that derive insights from the specific patient’s history – but also draws on similar profiles in the larger population to inform clinical decisions and even predict patient attitudes and behaviors.

What’s encouraging is that, in addition to technology, there’s a second force propelling these advances: consumer desire. One colleague of mine said recently that consumers are “beautifully and wonderfully dissatisfied.”

Their banks, stores and hotels do a great job knowing them, predicting their needs and satisfying them in surprising ways. So why can’t healthcare do the same? Our answer needs to be: Actually, we can.

Q. What role in these predictive capabilities will location-agnostic, voice-enabled, accessible modalities of providing care play?

A. In this complex array of population-scale data lakes, analytics, AI and machine learning, the “front door” is acquiring structured and unstructured data at speed and volume, and with integrity, so it’s there to use.

Note-taking on paper can’t meet today’s needs. Neither can waiting for someone to call with an issue. We’ve had “location-agnostic voice technology” for a century and a half: the telephone. But to play a role in modern healthcare, we need technologies that are secure and compliant, while remaining convenient enough to fit into an individual person’s lifestyle.

The information that an algorithm uses to predict a heart attack may originate in a smartphone app that has collected data from a person’s fitness-tracker watch. Someone may get timely help for an emotional illness because machine learning allowed a sentiment-sensing application to pick up cues during a conversation. The same principles apply on the population level, where machine learning can help detect clusters or track external health factors.

At Accountable Community of Health (ACH), Innovaccer Health Cloud’s integrated application suite and developer toolkit made it possible to create interoperable applications that helped unify patient records, track the patient journey and analyze patient needs. They were able to close referral loops, improve care management, and provide a Medicaid population of more than 2,000 patients with a connected care experience.

It’s important to fit the technology to the care model, and not the other way around. For example, when we built voice-enabled technology solutions for Houston Methodist, we didn’t start with what Amazon Lex could do. We met with staff members and captured an understanding of how the hospital worked from moment to moment, then designed solutions that would address people’s established needs.

Now, in Houston Methodist’s operating rooms, surgeons and staff will be able to interact with a digital voice assistant in the middle of surgery – to query a patient’s health record, complete safety checklists, or start and stop process timers.

In patient exam rooms, where patients and clinicians so choose, we’ve enabled client devices such as a tablet, phone or PC to capture dialogue between patients and clinicians in a secure, compliant manner. The doctor and the patient each automatically receive a generated summary of their interaction, and the information becomes indexed and searchable as part of the patient’s health record.

In all, interoperability of healthcare data is key to being able to identify the unique needs of each individual, which is essential to creating a frictionless and more personalized patient experience. Personalization presents itself in countless ways: developing personalized cancer treatments, diagnosing rare diseases, simply allowing caregivers to actually “give care,” and ways we can only begin to imagine today.

It is inspiring to see visionary organizations across the continuum of care viewing health through a very patient-centric lens and innovating to make the healthcare experience more personal for patients. It is a promising future we already live in. But we’re just scratching the surface of the opportunity – and the work to be done – to deliver truly personalized health.

Twitter: @SiwickiHealthIT
Email the writer: bsiwicki@himss.org
Healthcare IT News is a HIMSS Media publication.
