Data normalization: A foundational step to achieving Triple Aim goals

EHR systems tend to represent data differently, leading to semantic interoperability issues
By Brian Levy, MD

Actionable data is critical to advancing healthcare's Triple Aim: improving the patient experience, improving population health and reducing costs. Accomplishing these goals requires accurate data for analytics and information sharing.

Interoperability is a focal point of national efforts to advance information sharing and data analytics, but the lack of a common clinical vocabulary is currently a primary roadblock. Until a foundation for semantic interoperability is laid that standardizes clinical terminology across disparate systems, success with care delivery models such as population health management, health information exchange and accountable care organizations will be limited.

Industry standards such as SNOMED CT, LOINC and RxNorm are important steps toward achieving this goal, but no single existing standard addresses all clinical terminology. Patient data shared between health information systems must be "cleaned" before data warehousing and analytics strategies can be realized.

Simply put, data must be normalized to remove semantic ambiguity.

The growing need for semantic interoperability
Terminology is core to healthcare data. From procedures to results to diagnoses, clinical concepts are represented across health IT platforms in various coded clinical terminologies or free text.

In the past, codified healthcare data was predominantly claims data, which did not sufficiently capture the patient's true diagnosis. As the health IT movement evolves, the use of clinical data--often derived from EMRs--to support data analytics and accurate information sharing across disparate systems is becoming a requirement. The Healthcare Information and Management Systems Society defines the highest level of interoperability as semantic interoperability, characterized as the ability of two or more systems to not only exchange data but also use the information that has been exchanged.

EHR systems tend to represent data differently, leading to semantic interoperability issues. A recent paper published in the Journal of the American Medical Informatics Association points to the challenge: after analyzing 21 EHR technologies, researchers found 615 observation errors and data variations in the CDA (Clinical Document Architecture), which is used to share patient information between systems.

Across internal and external EHRs, legacy clinical systems and the multiple laboratories within a hospital continuum, for example, there may be several hundred representations of lab data alone. All of these clinical concepts must be normalized into one standard terminology for high-level data sharing and analytics initiatives; otherwise, the negative fallout could include:

  • misrepresentation of patients within Clinical Quality Measures, population health reports and predictive analytics
  • lost reimbursement as patients aren't placed into the proper risk stratification
  • inaccurate real-time care alerts or decision support rules
  • inefficient use of care management and patient engagement resources and tools
  • safety issues around drug interactions/doses, allergies and medical history
  • inability to perform population health analytics.

Fundamentals of managing terminology
Terminology management strategies help eliminate data ambiguity and make sense of previously uncoded and non-standard data. When data is normalized, a foundation exists that enables healthcare organizations to answer critical questions, report accurately to registries and regulatory bodies and elevate patient care.

Terminology management can transform the wealth of stored data that exists across a healthcare continuum into an integrated, focused repository accessible to users across the enterprise. Three components should be addressed within a terminology strategy: 1) standardization of local content; 2) translation between standards; and 3) code groups.

In terms of standardizing local content, it is not uncommon for a health system to manage 40 or more separate IT systems, each with its own terminology infrastructure. Thus, the first step is laying a foundation, or a single source of truth, for standardizing data between these systems internally.
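
To make the idea concrete, the sketch below (in Python) shows one way local lab codes from several source systems might be normalized to LOINC as a single source of truth. The source-system names and local codes are made up for illustration; the two LOINC codes are real published concepts. A production mapping would be far larger and curated by terminologists.

    # A minimal sketch of local-content standardization: mapping each source
    # system's local lab codes to LOINC. Local system names and codes are
    # hypothetical; the LOINC codes are real published concepts.

    LOCAL_TO_LOINC = {
        ("LAB_SYS_A", "GLU"):   "2345-7",  # Glucose [Mass/volume] in Serum or Plasma
        ("LAB_SYS_B", "GLUC1"): "2345-7",  # same concept, different local code
        ("EHR_X", "K+"):        "2823-3",  # Potassium [Moles/volume] in Serum or Plasma
    }

    def normalize_lab_code(source_system, local_code):
        """Return the LOINC code for a local lab code, or None if unmapped."""
        return LOCAL_TO_LOINC.get((source_system, local_code))

    print(normalize_lab_code("LAB_SYS_B", "GLUC1"))  # -> 2345-7
    print(normalize_lab_code("EHR_X", "NA+"))        # -> None (queue for terminologist review)

Unmapped codes should be flagged for review rather than silently dropped, so the single source of truth keeps pace with new local content.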

In addition to mapping from local terminologies to standards, there is a need to map between standards. For example, problem lists in the EMR will be coded in SNOMED CT, but ICD-10-CM codes will be needed for billing. A proprietary pharmacy terminology such as Medi-Span may be used for ordering drugs in the EMR, but RxNorm is required for sharing the patient's medication list with other systems.
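
As an illustration only, the sketch below shows how such a standard-to-standard crosswalk might look in code. The SNOMED CT and ICD-10-CM pairs are real published codes, but a production map (such as the NLM's SNOMED CT to ICD-10-CM map) carries one-to-many and rule-based translations that this simplification omits.

    # Hedged sketch: a tiny SNOMED CT to ICD-10-CM crosswalk for billing.
    # Real maps are rule-based and one-to-many; this dictionary is a stand-in.

    SNOMED_TO_ICD10CM = {
        "84114007": ["I50.9"],   # Heart failure -> Heart failure, unspecified
        "44054006": ["E11.9"],   # Type 2 diabetes mellitus -> without complications
    }

    def to_icd10cm(snomed_code):
        """Return candidate ICD-10-CM codes for a SNOMED CT concept (may be empty)."""
        return SNOMED_TO_ICD10CM.get(snomed_code, [])

    print(to_icd10cm("84114007"))  # -> ['I50.9']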

While standardization of local content and maps between standards are critical first steps to addressing interoperability challenges, they don't address the broader terminology management issues that arise with population health and analytics initiatives. When managing patient populations, healthcare organizations often need to build code groups: lists of codes drawn from standardized terminologies that are used to construct quality measures, decision support rules or population cohorts.

The following example illustrates where content, maps and code groups come into play. A data analyst needs to create a cohort of patients who have either been diagnosed with heart failure or are on an ACE Inhibitor. For simplicity, this definition could consist of two code groups--one capturing all diagnosis codes for heart failure and another consisting of ACE Inhibitors. If the data warehouse contains drug data consisting of RxNorm, NDC, Medi-Span and free-text entries, the analyst would likely map the free-text entries to a standard terminology, normalize all of the coded drug data to RxNorm, and then create a code group consisting of all the RxNorm codes for ACE Inhibitors.
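
A simplified sketch of that cohort logic appears below. The NDC and Medi-Span codes shown are placeholders, and the RxNorm ingredient codes and ICD-10-CM codes, while intended to be accurate, are included only for illustration; a real code group for ACE Inhibitors or heart failure would be much larger.

    # Illustrative cohort sketch: normalize mixed drug entries to RxNorm, then
    # apply two code groups (heart failure diagnoses, ACE Inhibitor ingredients).
    # Source-system codes below are placeholders, not real product identifiers.

    DRUG_TO_RXNORM = {
        ("FREE_TEXT", "lisinopril 10 mg"): "29046",   # lisinopril (ingredient RxCUI)
        ("NDC", "00000-0000-00"): "29046",            # placeholder NDC for a lisinopril product
        ("MEDISPAN", "PLACEHOLDER1"): "3827",         # placeholder Medi-Span code, enalapril
    }

    ACE_INHIBITOR_RXNORM = {"29046", "3827"}     # code group: lisinopril, enalapril
    HEART_FAILURE_ICD10CM = {"I50.9", "I50.22"}  # code group: heart failure diagnoses

    def in_cohort(diagnosis_codes, drug_entries):
        """True if the patient has a heart failure diagnosis or is on an ACE Inhibitor."""
        meds = {DRUG_TO_RXNORM.get(entry) for entry in drug_entries} - {None}
        return bool(set(diagnosis_codes) & HEART_FAILURE_ICD10CM) or bool(meds & ACE_INHIBITOR_RXNORM)

    print(in_cohort(["E11.9"], [("FREE_TEXT", "lisinopril 10 mg")]))  # -> True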

Code groups provide an additional layer to help abstract the complexity related to the underlying standardized terminologies of a particular population cohort, ensuring greater accuracy with any subsequent reporting, analytics or clinical initiative.

Enterprise terminology management
Large healthcare organizations without an enterprise terminology management strategy face inevitable conflicts regarding terminology intake, management and distribution. Attempts to solve this problem are often characterized by a combination of spreadsheets and error-prone manual processes involving numerous interdepartmental coordination meetings between informaticists and individual business owners.  Furthermore, these processes need to be repeated each time a local business rule or standard code set is updated.

Leveraging a terminology management platform can be an important step toward achieving the goals of data normalization and aligning with national quality initiatives. When a platform of software, content and consulting solutions exists to map, translate, update and manage standard and enhanced clinical terminology, the guesswork is essentially removed from complex terminology management processes.
