Lingering obstacles block the path to interoperability
Acknowledgement of the value of interoperability – and the desire to implement it – are seemingly widespread in healthcare. So why is the industry still so far from achieving it? For many reasons – technological, financial and logistical obstacles, a lack of standardization, fear of new procedures, and data gaps in EHR systems – the goal of easily and securely exchanging accurate patient data across healthcare providers remains elusive.
Technology isn't enough
Getting a handle on advancing interoperability requires that technical and business process/policy challenges be addressed together rather than in isolation, so that technology and policy workflows can be integrated and scaled. "Simply putting the technology in people's hands isn't enough," said Steven Posnack, director of the Office of Standards and Technology for the Office of the National Coordinator in the U.S. Department of Health and Human Services.
"There need to be business agreements in place and, in many cases, a business model around exchanging information that impacts the delivery of care." Whatever the intention is – e.g., sending a patient for a referral, requesting information from a patient or sending an electronic prescription -- the training and workflow implementation involved with interoperability technology must make it a more usable and seamless part of the health information technology and patient care delivery infrastructure.
Last-mile problems: the failure to communicate
There's also a gap – one of what David C. Kibbe, MD, president and CEO of DirectTrust, calls the "last-mile problems" delaying full-scale interoperability adoption – between networks' fairly robust and reliable ability to move health data from point A to point B and the ability to use that data for clinical decision-making.
That's because not all of the endpoints – the sending or receiving EHRs – can readily send or receive the information. He likens it to making a phone call where the connection is strong but the cell phone you're calling "only receives messages in French. So if you send a message where you happen to be speaking in English or German or Spanish, that particular party at the end of that phone call won't understand it."
A related problem is the lack of uniform standardization for CCDAs, the formatted data messages for clinical summaries that electronic health records can generate and digest.
"One of the problems with interoperability is that the CCDA is still interpretable in different ways," observes Kibbe. "So not every electronic health record can understand every other electronic health record's CCDA." The result can be the transmission of copious amounts of extraneous data content that the receiving provider doesn't need and can't use, instead of the core data requested for care coordination. However, efforts by ONC and private industry players are under way to obtain standardization for efficient and reliable content exchange.
Getting connectivity with existing EMRs
Interoperability faces an inherent challenge from the simple fact that hundreds of competing electronic medical records have proliferated, and they aren't built to match up with one another. "They don't have interchangeable parts," said Rich Parker, MD, chief medical officer for Arcadia Medical Solutions, a major aggregator of EMR data from disparate systems on behalf of healthcare provider organizations. "It would be like saying a Honda and a Ford have interchangeable parts."
Nevertheless, this challenge is being met in a couple of big ways – through federal rules promulgated in recent years that require electronic medical records to share some interoperability features in order to be certified; and by what companies like Arcadia do. "Say you're a group of 1,000 doctors operating on eight different EMRs," Parker said.
"Instead of trying to figure out how to plug them into each other, which you really can't do, or spending millions of dollars to convert them all to one system, which usually is too expensive, you let a company like us come in and connect them." That, however, can take several months.
That dovetails with a suggestion from Erin Sparnon, engineering manager in the health devices group at the ECRI Institute, that "it would be more fruitful" if hospitals focused less on new technologies and more on getting support from their vendors to make their existing health information systems – into which they've sunk huge amounts of money – interoperable.
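As a rough illustration of what that connection work involves – a hypothetical sketch, not Arcadia's or any vendor's actual pipeline – each EMR feed typically gets its own adapter that maps that vendor's export format onto a single canonical record. Every field name below is made up for the example.

```python
# Illustrative only: two hypothetical EMR export formats mapped onto one
# canonical patient record. Real feeds involve far more fields and far more
# edge cases, which is part of why connecting them takes months.
from datetime import date

def from_vendor_a(row: dict) -> dict:
    """Vendor A exports 'PatientID', 'LName', 'FName', 'DOB' as MM/DD/YYYY."""
    m, d, y = row["DOB"].split("/")
    return {"mrn": row["PatientID"],
            "family_name": row["LName"],
            "given_name": row["FName"],
            "birth_date": date(int(y), int(m), int(d)).isoformat()}

def from_vendor_b(row: dict) -> dict:
    """Vendor B exports 'mrn', 'name' as 'Last, First', 'dob' as YYYYMMDD."""
    last, first = [p.strip() for p in row["name"].split(",", 1)]
    dob = row["dob"]
    return {"mrn": row["mrn"],
            "family_name": last,
            "given_name": first,
            "birth_date": f"{dob[:4]}-{dob[4:6]}-{dob[6:]}"}

# Downstream analytics only ever see the canonical shape.
records = [from_vendor_a({"PatientID": "A-100", "LName": "Smith",
                          "FName": "Jane", "DOB": "01/02/1970"}),
           from_vendor_b({"mrn": "B-200", "name": "Doe, John",
                          "dob": "19651123"})]
print(records)
```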
A key to innovation
According to Leigh Anderson, chief information officer at Premier, Inc., the core data that providers need to integrate from multiple sources is financial, or claims information, and – most importantly – clinical. "The reason why clinical data is important is for population health management," he said. "You must understand the sickest people across the continuum so that you can effectively target your resources to make sure they stay well."
In Anderson's view, one way to use that data innovatively is through an HL7 standard that could deploy it for analytic visibility or workflow purposes and really make a difference. The hoped-for result is interoperability at a deeper level than a traditional HL7 solution.
"That's how I think you start to get innovation in health care, which is what I think the purpose of interoperability is, so that you're not just doing interoperability for it's own sake," Anderson said. "It's probably the most exciting thing to come along from an interoperability perspective in awhile."
The financial impetus
Fee-for-service arrangements, in Kibbe's view, tend to be the fundamental impediment to interoperability adoption, because they neither incentivize care coordination nor discourage duplication of services.
But with value-based, or risk-based, payments, it pays for providers to avoid such duplication "and to be more careful about surveying the information that comes in about the patient from somewhere else, particularly if it's recent," he said. Were the method-of-payment balance tipped more in favor of value-based arrangements in the U.S., and paired with quality and cost control incentives, "I think we would see these issues of interoperability disappear over a period of five to six years," Kibbe predicts.
Often the biggest impetus toward interoperability is financial: in markets that are shifting from fee-for-service to global payment models, actors such as state Medicaid agencies or commercial plans are insisting on it. In Parker's view, organizations that "are feeling that financial threat will be more prone to move forward with interoperability because that's the only way they'll be able to get all of their patients in one system, to be able to do population health."
The missing link
There's an elephant in the room here, too: Federal law prohibits the Department of Health and Human Services from setting a standard for a unique patient identifier. Beyond initiatives in which ONC, CHIME and HIMSS are involved – separately or in collaboration – to match patients with their correct data, industry organizations, including CHIME and the American Medical Association, see the adoption of such an identifier as critical to addressing many, if not most, of the problems blocking interoperability.
"The prohibition on establishing a national patient identifier, as you might imagine, hasn't helped us meet the challenge of patient identification," said Sparnon.