Health reform to spur waves of data

Medicare, Medicaid prepare systems for terabytes of claims info
By Mary Mosquera

The Centers for Medicare and Medicaid Services will have to manage and analyze double the volume of Medicare data and triple the terabytes of Medicaid data after health reform is fully in place. 

[See also: CMS awards up to $15B for data center.]

By 2015, the waves of Medicare claims data will explode from 370 terabytes to 700 terabytes. For Medicaid, 30 terabytes of data will multiply to 100 terabytes, according to a CMS official.

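For a sense of scale, here is a minimal back-of-the-envelope sketch of the growth implied by those figures, assuming the 370-terabyte and 30-terabyte counts are the current baselines:

    # Illustrative arithmetic only, based on the terabyte figures cited above
    # (370 -> 700 TB for Medicare claims, 30 -> 100 TB for Medicaid by 2015).
    volumes_tb = {
        "Medicare claims": (370, 700),
        "Medicaid": (30, 100),
    }

    for name, (current, projected) in volumes_tb.items():
        growth = projected / current
        print(f"{name}: {current} TB -> {projected} TB ({growth:.1f}x growth)")

That works out to roughly 1.9x growth for Medicare claims data and 3.3x for Medicaid data.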
CMS has been upended by health reform just as providers have, and the agency is transforming how it operates and communicates with physicians and hospitals to be ready for the rollout of the health reform law in 2014.

The Patient Protection and Affordable Care Act (ACA) requires a huge IT infrastructure so that CMS can manage the shift to paying healthcare providers based on quality and analyze the data that supports providers' improvements in patient care, according to Tony Trenkle, CMS CIO.

CMS has historically been a decentralized, stove-piped agency with mushrooming volumes of data and separate data centers springing up around specific programs. Now CMS is concentrating on adopting enterprise and shared services and establishing the capabilities to collect, analyze and use real-time data.

“We have very little time to change. And we’re already feeling the first winds of the tsunami,” he said at a recent conference sponsored by the Bethesda, Md., chapter of AFCEA, which promotes industry and federal agency partnership and IT innovation.

“The linchpins are data and IT,” he added. “When you talk about Big Data, it’s not only the volume, but the volume and complexity together.” And there will be volumes more data as CMS migrates from fee-for-service and traditional capitation payment models to a variety of programs that will pay based on quality, performance and shared savings.

Previously, the data that CMS collected and held was related only to claims; the agency then moved into quality data. More recently, CMS has built up encounter data from Medicare Advantage (Part C) and prescription drug plans, along with some clinical data from the HITECH Act programs starting in 2011. A large amount of the data is unstructured and not machine-readable.

CMS has created an Office of Information Products and Data Analysis to work across the enterprise and with the IT infrastructure in a coordinated effort to make data and tools more available and usable.

[See also: CMS adds infection data to Hospital Compare website.]

Data analytics is beginning to be used for fraud prevention and for scrutinizing geographic variations, and it will be used to evaluate whether payment models improve patient care or bend the cost curve.

“We need to internally enable the business use of data without having to be a programmer or expert user to utilize it,” he said.

Trenkle wants to get more of the data, and the online tools that support it, pushed down to end users, such as accountable care organization (ACO) communities, and not just power users, so they can understand how well they’re performing. States are also looking for more Medicare data to understand the healthcare gaps and needs in their counties and regions.

“The idea is to provide community users with pools of information and get that information out quicker,” he said.
