Data interoperability, knowledge interoperability and the learning health system

Dr. Blackford Middleton, physician expert in interoperability and standards, offers a deep dive into how provider organizations can gain more value from their EHR and IT investments.
By Bill Siwicki
12:16 PM

Dr. Blackford Middleton, chief informatics and innovation officer at Apervita.

Photo: Apervita

Progress has been made in healthcare on improving data interoperability, or "data liquidity" – the ability of data to flow securely from one place to another, as needed and appropriate, for patient care and patient use. But there is still more work to do to reach a fully interoperable system.

To get an in-depth look at interoperability and the healthcare knowledge that can stem from it, Healthcare IT News interviewed Dr. Blackford Middleton, chief informatics and innovation officer at Apervita, a health IT vendor of quality measurement, value optimization, clinical intelligence and interoperability solutions. Middleton also is a member of the advisory council to the standards body HL7.

Q: What is the state of the interoperability of data in healthcare today, in your view?

A: True data interoperability implies both syntactic and semantic interoperability. Simply put, syntactic interoperability refers to the ability to move data from one EHR to another, for example, using a common transport method.

Think of the U.S. mail system. We can send a letter from point A to point B because we share a common envelope format, address label and so on, but there is no guarantee the recipient can read the content. When we add the notion of semantic interoperability, we not only receive the letter, we also understand the meaning of its content, because the information inside is expressed in a common language.

We have estimated that the value of true semantic interoperability in healthcare would be approximately $78 billion per year in steady state after a 10-year rollout.

Recent regulation and resulting policies also are driving the healthcare industry toward improved data interoperability. The 21st Century Cures Act and the resulting final rules on information blocking, and Interoperability and Patient Access, from ONC and CMS, respectively, drive the adoption of key standards for improved data interoperability.

They both promote and require use of FHIR (Fast Healthcare Interoperability Resources) standards. FHIR is a more modern technology standard that supports bidirectional data exchange (read/write) and more granular data types as well as documents, and is extensible for plug-and-play apps (SMART on FHIR).

And it supports creation of computable knowledge artifacts (EBM on FHIR), digital measures (eCQM on FHIR), eCase Reports, and computable practice guidelines (CPG on FHIR), among other elements.

Taken together with the evolving USCDI (U.S. Core Data for Interoperability) standard, which progressively and incrementally grows the standard terminology and definitions for clinical data, we have a clear path toward requisite semantic and syntactic data interoperability – true data liquidity at scale.
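To make the data-liquidity piece concrete, here is a minimal sketch of what exchange over FHIR's RESTful API looks like from a client application, written in Python with the requests library. The server base URL and patient ID are hypothetical placeholders, not endpoints named in this article.

```python
# Minimal sketch: read a patient's coded conditions from a FHIR R4 server.
# The base URL and patient ID below are hypothetical placeholders.
import requests

FHIR_BASE = "https://fhir.example-hospital.org/R4"  # hypothetical FHIR R4 endpoint


def fetch_patient_conditions(patient_id: str) -> list:
    """Search Condition resources for one patient and return their codes."""
    resp = requests.get(
        f"{FHIR_BASE}/Condition",
        params={"patient": patient_id},
        headers={"Accept": "application/fhir+json"},  # syntactic layer: common format
        timeout=30,
    )
    resp.raise_for_status()
    bundle = resp.json()

    # Semantic layer: each condition carries standard codes (e.g., SNOMED CT),
    # so the receiving system can interpret meaning, not just parse structure.
    conditions = []
    for entry in bundle.get("entry", []):
        coding = entry["resource"].get("code", {}).get("coding", [])
        conditions.extend(
            {"system": c.get("system"), "code": c.get("code"), "display": c.get("display")}
            for c in coding
        )
    return conditions


if __name__ == "__main__":
    for condition in fetch_patient_conditions("example-patient-id"):
        print(condition)
```

The envelope analogy maps directly: the HTTP request and FHIR JSON structure are the shared envelope, while the standard code systems inside each resource carry the shared meaning.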

Q: How can healthcare organizations improve the interoperability of data and, as you say, "knowledge"?

A: While we have made progress on the exchange and interoperability of data as I just described, we have further to go on establishing true interoperability for knowledge. What I mean by this is that if the world's greatest sepsis detection algorithm and care management pathway or guideline has been implemented at a large, leading health system, how do we make that same algorithm available at a small, rural hospital?

One may safely assume that this requires data interoperability at the outset. The lack of knowledge interoperability, which may also be thought of as shareable or computable biomedical knowledge usable across disparate EHRs, is what is now preventing us from achieving the full value proposition many predicted to result from the adoption of EHRs in the U.S., and the anticipated transformation of U.S. healthcare.

If each and every implementation of EHRs requires re-implementation or, even worse, re-discovery, of an evidence-based computable best practice guideline, transformation of our healthcare system simply will not occur because we cannot scale the rapid creation, implementation and ongoing evolution of new knowledge in computable form. Knowledge interoperability is the cornerstone of creating the "learning health system."

A related issue is that modern clinical practice and cognitive support with EHRs requires use of an array of knowledge components: from standards for billing and coding, to clinical data standards for quality measurement and reporting, to value sets and reusable logic components, and ideally common clinical workflows for effective use.

I like to think of all of these parts as components of a "knowledge ecosystem," or a "knowledge supply chain," where the reasonable expectation is that they all work together effectively. 

Analogously, the automotive parts supply chain allows engines, transmissions, wheels, windshields, nuts and bolts, and every other part from different manufacturers to be used together to build an automobile in Detroit, or anywhere for that matter.

We have a knowledge ecosystem in place, and we are moving toward a fully synchronized and coordinated ecosystem that guarantees that all of the component parts of a computable practice guideline, also known as an e-pathway, will work together or, conversely, lets us understand when one part or another is not working quite right, learn from that and modify it for use at scale.

The HL7 Computable Practice Guideline on FHIR and the Clinical Quality Language are exciting newer standards that allow best-practice care guidelines to be implemented in computable form, both to keep EHRs current with best practices and to share those guidelines across multiple disparate EHRs.

They allow us to use common data models such as QDM or FHIR, common (standardized) value sets, reusable expression logic, and ultimately a common set of methods for presentation into the clinical workflow.

Taken together, this allows the clinician end user at any healthcare organization to use the most current, best-evidence computable practice guideline from anywhere to optimize her clinical practice.
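As an illustration of how such a guideline might be invoked once it is published in computable form, here is a minimal Python sketch, assuming a hypothetical FHIR R4 server that hosts a guideline as a PlanDefinition with its logic in referenced CQL Library resources. The endpoint and guideline ID are placeholders, and a real deployment would also handle authentication and workflow integration.

```python
# Sketch: invoke a computable practice guideline published as a FHIR
# PlanDefinition, using the standard $apply operation.
# The server URL and guideline ID are hypothetical placeholders.
import requests

FHIR_BASE = "https://fhir.example-hospital.org/R4"   # hypothetical FHIR R4 endpoint
GUIDELINE_ID = "sepsis-management-pathway"           # hypothetical PlanDefinition id


def apply_guideline(patient_id: str) -> dict:
    """Ask the server to evaluate the guideline's logic for one patient and
    return the resulting plan of recommended next actions (a CarePlan in R4)."""
    resp = requests.get(
        f"{FHIR_BASE}/PlanDefinition/{GUIDELINE_ID}/$apply",
        params={"subject": f"Patient/{patient_id}"},
        headers={"Accept": "application/fhir+json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    plan = apply_guideline("example-patient-id")
    print(plan.get("resourceType"), plan.get("status"))
```

Because the guideline, its value sets and its logic all live in standard FHIR artifacts, the same call could in principle be made from any conformant EHR, which is the point of knowledge interoperability.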

Q: Describe what you call the "learning health system" – and how does the interoperability of "knowledge" help healthcare organizations deliver on the value potential in EHR investments?

A: I think of the "learning health system" as the technology-enabled system of health and care delivery that creates and uses interoperable healthcare data, monitors and measures all important aspects of care, uses shareable computable knowledge, and has built-in feedback loops to improve care, fine-tune computable practice guidelines and refine the machine learning algorithms that serve as cognitive assistants (AI) in care delivery.

This continuous learning system is often thought of as pursuing the Quadruple Aim: improving quality, lowering cost, and improving both the patient and the provider experience. It has been a key focus area for the AHRQ evidence-based Care Transformation Support (ACTS) initiative.

The learning health system is not limited, however, to one patient, one doctor, one health system or one EHR. It must transcend the whole healthcare continuum – as our patients do – and capture the patient experience throughout to understand how to optimize the patient journey. The interoperability of data and of knowledge as described above are cornerstones of this vision.

With a learning health system, we can capture a true picture of the patient experience and progress toward shared goals, and improve our understanding of best practices and public health at the local, community and national levels.

Q: How can healthcare organizations use artificial intelligence and machine learning to improve the ways in which they can deliver what you label "clinical intelligence" at scale into the clinical workflow, across disparate EHRs?

A: The core technologies of the learning health system I have described are now sufficiently implemented to allow us to deliver the most up-to-date, best-evidence (synthesized with real-world data and experience) computable practice guidelines into the clinical workflow, and ultimately to patients and their providers together.

We have the installed base of EHRs, the secure cloud is pervasive, and the standards for both data and knowledge representation are in hand. As I mentioned, feedback loops carrying both data describing real-world use (efficiency and efficacy) and practical, pragmatic concerns can be delivered to traditional knowledge authors, such as those creating best practice guidelines, as well as to those using machine learning to create AI/cognitive-assistance tools.

Different computable practice guidelines and other knowledge artifacts can now be implemented at scale across disparate EHRs and, equally importantly, tested before use and monitored in practice. Machine learning can be used in disease surveillance to identify previously unknown disease correlates – for example, Vioxx and cardiovascular disease – as well as to optimize predictive analytics, quality measures and e-pathways.
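As a rough, illustrative sketch of the predictive-analytics side – not a production surveillance method – the snippet below fits a simple risk model on invented, de-identified tabular features and surfaces the strongest learned associations for clinical review. The feature names and data are placeholders; real surveillance would add confounder adjustment and validation.

```python
# Toy sketch: flag candidate exposure-outcome associations in tabular data.
# All data and feature names here are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Placeholder feature matrix: rows are patients, columns are coded exposures
# and observations (e.g., medication classes, lab flags) extracted from EHR data.
X = rng.integers(0, 2, size=(500, 4)).astype(float)
feature_names = ["exposure_a", "exposure_b", "lab_flag_c", "comorbidity_d"]

# Placeholder outcome: 1 if the adverse event of interest occurred.
y = (X[:, 0] + rng.normal(0, 0.5, 500) > 0.8).astype(int)

model = LogisticRegression().fit(X, y)

# Coefficients far from zero suggest candidate correlates worth clinical review.
for name, coef in sorted(zip(feature_names, model.coef_[0]), key=lambda t: -abs(t[1])):
    print(f"{name}: {coef:+.2f}")
```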

Further, we can now truly approach precision medicine at scale – analogous to the "mass customization" ideas of the early World Wide Web and internet: e-pathways, measures and even value-based contracts can be increasingly individualized as we better understand patient phenotypes, pharmacogenomics (correlations of genetic variants and optimal therapeutic strategies), social determinants of health, and patient behavior and activation.

Critical to this vision is that we can understand how each component of a composite knowledge artifact like a computable practice guideline performs in practice, provide feedback assessments (quantitative and qualitative), and rapidly evolve the tool, whether it is a measurement specification, an e-pathway, a value-based contract or a SMART on FHIR application.

Thus far, we have not truly been able to assemble a whole composite knowledge artifact built upon standard data-model choices, terminology choices, value-set choices, logic choices, and workflow and implementation choices, and see how the whole performs across diverse settings. This knowledge ecosystem and vision of frictionless exchange of data and knowledge is truly the way we transform patient health and healthcare at scale.

Twitter: @SiwickiHealthIT
Email the writer: bsiwicki@himss.org
Healthcare IT News is a HIMSS Media publication.
