A beginner's guide to data analytics
This is Part I of our three-part June 2015 print cover story on healthcare analytics. Part I focuses on the first steps of launching an analytics program. Part II focuses on intermediate strategies, and Part III focuses on the advanced stages of analytics use.
This first part may sting a bit: If your healthcare organization is in the beginning stages of rolling out a data analytics program, chances are you're going to do it completely and utterly wrong.
At least that's according to Eugene Kolker, chief data scientist at Seattle Children's Hospital, who has been working in data analysis for the past 25 years. When it comes to the initial metrics work, "The majority of places, whether they're small or large, they're going to do it wrong," he tells Healthcare IT News. And when you're dealing with people's lives, that's hardly something to take lightly.
Kolker would much prefer that not be the case, but from his experience and what he's seen transpire in the analytics arena across other industries, there are some unfortunate implications for healthcare beginners.
"What's the worst that can happen if Amazon screws up (with analytics)?...It's not life and death like in healthcare."
But it doesn't have to be this way. Careful, methodical planning can position an organization for success, he said. There are, however, more than a few things you have to pay serious attention to.
First, you need to get executive buy-in. Data analytics can help the organization improve performance in myriad arenas. It can save money in the changing value-based reimbursement world. Better yet, it can save lives. And if you're trying to meet strategic objectives, it may be a significant part of the equation there too.
As Kolker pointed out in a presentation given at the April 2015 CDO Summit in San Jose, California, data and analytics should be considered a "core service," similar to that of finance, HR and IT.
Once you get your executive buy-in, it's on to the most important part of it all: the people. If you don't have people who have empathy, if you don't have a team that communicates and manages well, you can count on a failed program, said Kolker, who explained that this process took him years to finally get right. People. Process. Technology – in that order of importance.
"Usually data scientists are data scientists not because they like to work with people but because they like to work with data and computers, so it's a very different mindset," he said. It's important, however, "to have those kind of people who can be compassionate," who can do analysis without bias.
And why is that? "What's the worst that can happen if Amazon screws up (with analytics)?" Kolker asked. "It's not life and death like in healthcare," where "it's about very complex issues about very complex people. ... The pressure for innovation is much much higher."
[Part II: Clinical & business intelligence: the right stuff]
[Part III: Advanced analytics: All systems go]
When in the beginning stages of any analytics effort, the aim is to start slow but not necessarily to start easy, wrote Steven Escaravage and Joachim Roski, principals at Booz Allen, in a 2014 Health Affairs piece on data analytics. Both have worked on 30 big data projects with various federal health agencies and put forth their lessons learned for those ready to take a similar path.
One of those lessons?
Make sure you get the right data that addresses the strategic healthcare problem you're trying to measure or compare, not just the data that's easiest to obtain.
"While this can speed up a project, the analytic results are likely to have only limited value," they explained. "We have found that when organizations develop a 'weighted data wish list' and allocate their resources toward acquiring high-impact data sources as well as easy-to-acquire sources, they discover greater returns on their big data investment."
So this may lead one to ask: What exactly is the right data? What metrics do you want? Don't expect a clear-cut answer here; it varies by organization.
First, "you need to know the strategic goals for your business," added Kolker. "Then you start working on them, how are you going to get data from your systems, how are you going to compare yourself outside?"
In his presentation at the CDO Summit this April, Kolker described one of Seattle Children's data analytics projects that sought to evaluate the effectiveness of a vendor tool that predicted severe clinical deterioration, or SCD, of a child's health versus the performance of a home-grown internal tool that had been used by the hospital since 2009.
After looking at cost, performance, development and maintenance, utility, EHR integration and algorithms, Kolker and his team found in the buy-versus-build comparison that the external vendor tool was not usable for predicting SCD, though it could be tested for something else. Furthermore, the home-grown tool needed to be integrated into the EHR.
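A buy-versus-build comparison like this is often organized as a weighted scoring matrix. The sketch below is a generic illustration only: the criteria mirror those named above, but the weights and scores are invented, not Seattle Children's actual evaluation.

```python
# Hypothetical buy-vs-build scoring matrix; weights and scores are invented for illustration.

criteria_weights = {
    "cost": 0.20,
    "performance": 0.25,
    "development_and_maintenance": 0.15,
    "utility": 0.15,
    "ehr_integration": 0.15,
    "algorithms": 0.10,
}

options = {
    "vendor_tool": {"cost": 5, "performance": 4, "development_and_maintenance": 7,
                    "utility": 4, "ehr_integration": 6, "algorithms": 5},
    "home_grown_tool": {"cost": 7, "performance": 8, "development_and_maintenance": 5,
                        "utility": 8, "ehr_integration": 4, "algorithms": 7},
}

for name, scores in options.items():
    total = sum(criteria_weights[c] * scores[c] for c in criteria_weights)
    print(f"{name}: weighted score = {total:.2f}")
```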
Kolker and his team have also helped develop a metric to identify medically complex patients, who typically have high readmission rates and consume considerable hospital resources. The hospital's chief medical officer had come to them wanting to improve outcomes for this group without increasing costs for the hospital.
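The article doesn't detail how SCH's metric works. As a purely hypothetical illustration, a first-pass "medical complexity" flag is often built from a handful of utilization signals already in the data, such as chronic condition counts and recent readmissions.

```python
# Purely hypothetical complexity flag; the fields and thresholds are invented,
# not Seattle Children's actual metric.

def is_medically_complex(chronic_conditions: int,
                         readmissions_12mo: int,
                         inpatient_days_12mo: int) -> bool:
    """Flag a patient as medically complex if multiple utilization signals are high."""
    signals = [
        chronic_conditions >= 3,
        readmissions_12mo >= 2,
        inpatient_days_12mo >= 14,
    ]
    return sum(signals) >= 2

print(is_medically_complex(chronic_conditions=4, readmissions_12mo=1, inpatient_days_12mo=20))
```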
For folks at the Nebraska Methodist Health System, utilizing a population risk management application with a variety of built-in metrics was a big help, explained Linda Burt, the health system's chief financial officer, in a webinar this past April hosted by Healthcare IT News' sister publication Healthcare Finance.
"The common ones you often hear of such as admissions per 1,000, ED visits per 1,000, high-risk high end imaging per 1,000," she said. Using the application, the health system was able to identity that a specific cancer presented the biggest opportunity for cost alignment.
And health system CFO Katrina Belt's advice? "We like to put a toe in the water and not do a cannon ball off the high dive," she said. Belt, the CFO at Baptist Health in Montgomery, Alabama, said a predictive analytics tool is sifting through various clinical and financial data to identify opportunities for improvement.
In a Healthcare Finance webinar this April, Belt said Baptist Health started by looking at its self-pay population and discovered that although its ER visits were declining, intensive care visits by patients with acute care conditions were on an upward trend.
Belt recommended starting with claims data.
"We found that with our particular analytics company, we could give them so much billing data that was complete and so much that we could glean from just the 835 and 837 file that it was a great place for us to start," she said. Do something you can get from your billing data, Belt continued, and once you learn "to slice and dice it," share with your physicians. "Once they see the power in it," she said, "that's when we started bringing in the clinical data," such as tackling CAUTIs.
But some argue an organization shouldn't start with an analytics platform. Rather, as Booz Allen's Escaravage and Roski wrote, start with the problem; then go to a data scientist for help with it.
One federal health agency they worked with on an analytics project, for instance, failed to give the data experts "free rein" to identify new patterns and insights, instead providing generic BI reports to end users. Ultimately, the results were disappointing.
"We strongly encouraged the agency to make sure subject matter experts could have direct access to the data to develop their own queries and analytics," Escaravage and Roski continued. Overall, when in the beginning phases of any analytics project, one thing to keep in mind, as Kolker reinforced: "Don't do it yourself." If you do, "you're going to fail," he said. Instead, "do your homework; talk to people who did it."