The healthcare industry is naturally rich with data -- clinical, patient, claims, hospital system, financial, pharmacy and, most recently, data from wearable technology.
It’s clear that analyzing this data collectively can drastically improve patient care and both clinical and financial outcomes, but the process of actually collecting, reading, integrating, understanding and leveraging the data remains broken.
From a technology perspective, data is sourced from a myriad of systems with varied levels of sophistication, accessibility, transparency and quality. Systems designed decades ago prior to the advent of Big Data are still prevalent, and pulling data from them can range from merely difficult to downright arcane. On top of that, between payers, providers and patients, the opportunities to combine data sets can far exceed the willingness or ability of all parties to collaborate. Add to that the poor state of healthcare data integration tools, and you have quite a challenge to make sense of the healthcare puzzle.
The industry is faced with the challenge of enabling these vast and varied systems to talk to one another in a meaningful way that generates actionable value. So, where do we stand when it comes to facing data integration demands, and where can we improve?
We’re still sorely lacking when it comes to addressing integration. Large legacy software systems and the practice of manually collecting information are just the tip of the iceberg. Sadly, much of the healthcare analytics story still remains buried in hidden spreadsheet formulas. To really solve the data dilemma, we need to rethink our approach to integrating data by first integrating teams, integrating concepts and integrating technologies. Only then can we meaningfully integrate data.
Integrating Teams – Team design is one of the largest hindrances to quality data integration. Too often, IT teams are tasked with collecting data for an entirely separate analytics team, which then provides reports to drive a separate clinical transformation team. Those who use the data are too far removed from the data collection process, while those tasked with collecting the data frequently have a poor understanding of the business need or even the source of the data itself. There are definite silos throughout the data lifecycle. Not only are teams operating under singular mindsets, but the points of data transfer or handoff can be sloppy, and important details can be missed. Skill sets for data retrieval, organization, interpretation and action must become intertwined for data integration to improve. Crossover of team members can also help mitigate these lost efficiencies.
Without the full picture or people available to connect the dots, there is a huge margin for misinterpretation or missed opportunities. Building teams that include skilled professionals who understand and have access to the full picture will result in quicker and more effective advancements. We need good, accurate, timely data from all different parts of the business.
Integrating Concepts – Teams of data professionals will universally agree that systems don’t talk to one another well because they model data differently and lack solid relational keys to tie similar concepts together. As a common example, each data system will contain its own definition of what constitutes a person, an eligible member and a patient. And each system represents these concepts with, at best, its own internally created unique keys or, at worst, no meaningful key at all. Either way, important concepts don’t map cleanly across systems. There are techniques for data unification across systems, but they often require system experts, external key lookups, a sophisticated data integration team and constant grooming. Proper data warehousing techniques can help, but frequently the grander promises of a full-on Enterprise Data Warehouse have overshadowed simpler, smaller utilitarian wins such as this.
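To make the key problem concrete, here is a minimal sketch of deterministic patient matching between two systems that share no common identifier. All field names (a hypothetical claims system and a hypothetical EHR) are illustrative assumptions, not a real schema, and real-world linkage additionally needs fuzzy matching, external key lookups and constant grooming.

```python
def composite_key(record):
    """Build a deterministic match key from demographic fields.

    This shows only the simplest normalization step: trim and
    lowercase the surname, assume an ISO-format date of birth,
    and truncate ZIP+4 codes to five digits.
    """
    name = record["last_name"].strip().lower()
    dob = record["dob"]  # assumed to already be YYYY-MM-DD
    zip5 = record["zip"][:5]
    return (name, dob, zip5)

# Hypothetical records: each system carries its own internal key
# ("member_id" vs "mrn") that means nothing to the other system.
claims_members = [
    {"member_id": "M-001", "last_name": "Smith ", "dob": "1980-04-02", "zip": "90210-1234"},
]
ehr_patients = [
    {"mrn": "7734", "last_name": "smith", "dob": "1980-04-02", "zip": "90210"},
]

# Index one system by composite key, then probe with the other.
index = {composite_key(r): r["member_id"] for r in claims_members}
links = {p["mrn"]: index.get(composite_key(p)) for p in ehr_patients}
```

Even this toy version shows why the work is fragile: a single typo in a surname or date of birth breaks the link, which is exactly why these pipelines demand ongoing expert grooming.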
Integrating Technology – Even with more comprehensive teams and data model concepts, we still need technology for these vast data sets to talk with one another. Currently, we’re dealing with legacy systems that can’t handle the magnitude of data being generated or collaborate effectively with new software. On top of these outdated systems, the integration tools the industry has adopted serve general-purpose data integration needs, requiring custom build-outs to address the specific demands of the healthcare system. Tooling that naturally understands and validates industry coding, provides meaningful data profiling, componentizes processing for reuse, and can handle the sheer volume of healthcare data is a must. Off-the-shelf general-purpose data integration tools may be appealing at the time of purchase, but they require a huge investment in building up the missing library of established healthcare data expertise, making the hard path from data, to knowledge, to wisdom that much longer.
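As a small illustration of the "understands and validates industry coding" gap, here is a sketch of the kind of check a general-purpose tool lacks out of the box: profiling a batch of incoming diagnosis codes against the ICD-10-CM code shape. This is a simplified structural check only; a production tool must validate against the official code set, which is updated annually.

```python
import re

# Simplified ICD-10-CM shape: a letter, a digit, an alphanumeric,
# then an optional dot followed by one to four more characters.
# Shape-checking alone does NOT prove a code exists in the code set.
ICD10_SHAPE = re.compile(r"^[A-Z][0-9][0-9A-Z](\.[0-9A-Z]{1,4})?$")

def profile_codes(codes):
    """Split a batch of incoming codes into well-formed and malformed.

    A hypothetical profiling step: normalize whitespace and case,
    then bucket each code by whether it matches the expected shape.
    """
    ok, bad = [], []
    for code in codes:
        bucket = ok if ICD10_SHAPE.match(code.strip().upper()) else bad
        bucket.append(code)
    return ok, bad
```

Healthcare-aware tooling bakes hundreds of checks like this in, along with the maintained code sets behind them, which is precisely the library of expertise a generic integration tool forces each buyer to rebuild.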
Data integration demands can’t be solved in one fell swoop, but if the cornerstones of people, processes and technology are each properly advanced, we can effectively begin to see more immediate, effective and impactful outcomes from healthcare data analysis.