Big data and public health, part 2: Reducing unwarranted services

By Roger Foster

Each of these questions can use large population datasets to predict a specific risk or outcome based on an individual's specific history and the past behavior of the population. This is similar to how Amazon makes a book recommendation for you based on both your past purchases and the purchasing patterns of its entire population of book buyers. Predictive analytics can tell you what you are likely to buy and, what's more, help inform those buying decisions.
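To make the analogy concrete, here is a minimal sketch, in Python with entirely synthetic data, of the "patients like you" idea: an individual's risk is estimated from the outcomes of the most similar members of a population, much as a retailer scores likely purchases from similar buyers. The function name, feature layout and parameters are illustrative assumptions, not any agency's actual model.

```python
import numpy as np

def predict_risk(patient, histories, outcomes, k=10):
    """Estimate an individual's risk from the k most similar members
    of a population (a nearest-neighbor 'patients like you' score)."""
    # Cosine similarity between the patient's history vector and every
    # population member's history vector.
    norms = np.linalg.norm(histories, axis=1) * np.linalg.norm(patient)
    sims = histories @ patient / np.where(norms == 0.0, 1.0, norms)
    nearest = np.argsort(sims)[-k:]   # indices of the k most similar patients
    return outcomes[nearest].mean()   # share of similar patients with the outcome

# Synthetic population: rows are patients, columns are history features
# (prior diagnoses, utilization counts, etc.); values here are random.
rng = np.random.default_rng(seed=0)
histories = rng.random((1000, 5))
outcomes = (histories[:, 0] > 0.5).astype(float)  # toy outcome tied to feature 0

new_patient = rng.random(5)
print(f"Estimated risk: {predict_risk(new_patient, histories, outcomes):.2f}")
```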

Predictive and prescriptive analysis in government health

Large government healthcare organizations like the Department of Veterans Affairs (VA) and the Military Health System (MHS) have the opportunity to use their unique patient data sets, covering about 18 million lives, to draw new insights from clinical, operational and (where connected) financial data elements. These data, coupled with analytical models, support data-driven decisions about care delivery.

[See also: Johns Hopkins' David Bodycombe on how the ACA will bolster population health.]

Similarly sized, statistically significant data sets are now being used in the private sector to match patient profiles with population data to monitor disease progression and medical outcomes. The commercial data sets now available are patient de-identified, assuring privacy protection and HIPAA compliance. The VA and MHS should consider additional ways to use their big data to improve quality and medical outcomes while reducing the overall cost of care.
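The de-identification step mentioned above is often implemented along the lines of the HIPAA Safe Harbor method: direct identifiers are removed, quasi-identifiers are generalized, and a salted one-way hash stands in for the record key so a patient's records can still be linked over time. The sketch below is a simplified illustration with hypothetical field names, not a compliance-grade implementation.

```python
import hashlib

# Direct identifiers to drop, in the spirit of HIPAA Safe Harbor.
# Field names are hypothetical, not from any specific VA or MHS system.
DIRECT_IDENTIFIERS = {"name", "ssn", "address", "phone", "email", "mrn"}

def deidentify(record: dict, salt: str) -> dict:
    """Remove direct identifiers, generalize the birth date to a year,
    and replace the record key with a salted one-way hash."""
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    clean["patient_key"] = hashlib.sha256(
        (salt + record["mrn"]).encode("utf-8")
    ).hexdigest()[:16]
    if "birth_date" in clean:
        clean["birth_year"] = clean.pop("birth_date")[:4]
    return clean

record = {"mrn": "A12345", "name": "Jane Doe", "birth_date": "1958-07-14",
          "diagnosis": "I10", "region": "Northeast"}
print(deidentify(record, salt="study-42"))
```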

Medicare and Medicaid cover more than 90 million lives. Analysis of Medicare Current Beneficiary Survey data from the Centers for Medicare & Medicaid Services (CMS) shows that regional variation in healthcare spending is the single largest contributor to healthcare costs with no discernible improvement in patient outcomes, according to a 2009 report in the New England Journal of Medicine, "Getting Past Denial: The High Cost of Health Care in the United States." This means certain areas of the U.S. deliver healthcare with the same outcomes at much lower cost.
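A toy calculation shows the shape of that analysis: compare each region's spending per beneficiary against its outcomes and flag regions that spend above average without better-than-average results. The figures below are invented for illustration, not CMS data.

```python
import statistics

# Synthetic per-region figures (illustrative only): average annual spending
# per beneficiary and an outcome score on a common quality scale.
regions = {
    "Region A": {"spend": 7200,  "outcome": 0.81},
    "Region B": {"spend": 9800,  "outcome": 0.80},
    "Region C": {"spend": 7500,  "outcome": 0.82},
    "Region D": {"spend": 11400, "outcome": 0.79},
}

mean_spend = statistics.mean(r["spend"] for r in regions.values())
mean_outcome = statistics.mean(r["outcome"] for r in regions.values())

# Flag regions spending above the mean without a better-than-average outcome:
# candidates for the "same outcome, lower cost" comparison in the text.
for name, r in regions.items():
    if r["spend"] > mean_spend and r["outcome"] <= mean_outcome:
        excess = r["spend"] - mean_spend
        print(f"{name}: ${excess:,.0f} above-average spend, no outcome gain")
```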

The big data challenge will be to systematically identify these performance differences and root out unwarranted services. One example is the practice of ordering both high-cost CT and MRI imaging tests together. While each test has its merits, and together they might provide a more complete picture of the medical problem, the critical clinical assessment can often be made with a single test. Population data tailored to individual patients can help sort out when multiple tests are unwarranted, as the sketch below illustrates.
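A screen for this pattern can be run directly against claims data: group imaging claims by patient and flag anyone with both a CT and an MRI inside the same episode window. The sketch below uses toy rows and a hypothetical 30-day window; a real screen would key on CPT codes and account for clinical context.

```python
from collections import defaultdict
from datetime import date, timedelta

# Toy claims rows: (patient_id, procedure, date_of_service). Real claims
# would carry CPT codes; plain labels are used here for readability.
claims = [
    ("p1", "CT",  date(2012, 3, 1)),
    ("p1", "MRI", date(2012, 3, 4)),
    ("p2", "MRI", date(2012, 3, 2)),
    ("p3", "CT",  date(2012, 1, 5)),
    ("p3", "MRI", date(2012, 6, 20)),
]

def flag_dual_imaging(claims, window_days=30):
    """Return patients with both a CT and an MRI inside the same episode
    window: a screen for potentially duplicative high-cost imaging."""
    by_patient = defaultdict(list)
    for patient, proc, when in claims:
        by_patient[patient].append((proc, when))
    window = timedelta(days=window_days)
    flagged = []
    for patient, rows in by_patient.items():
        cts = [d for p, d in rows if p == "CT"]
        mris = [d for p, d in rows if p == "MRI"]
        if any(abs(c - m) <= window for c in cts for m in mris):
            flagged.append(patient)
    return flagged

print(flag_dual_imaging(claims))  # -> ['p1']
```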

[Related: Kaufmann reports on leveraging big data to contain care costs.]

The Agency for Healthcare Research and Quality (AHRQ) is a small agency within the Department of Health and Human Services that supports research to help people make more informed decisions and to improve the quality of healthcare services. In 2010, AHRQ made the largest federal investment to date connecting medical liability to quality, funding states' efforts to implement and evaluate patient safety approaches and medical liability reform. This population-based research is most effective when researchers can access and review huge patient record databases and registries for comparative effectiveness studies. The AHRQ mission is directly focused on making use of the collective datasets of patient populations, and the agency could benefit greatly from scientific computing and big data management support and best practices.

Multiple agencies across the government need to use the tools of predictive analytics and big data-scale population health datasets to identify unwarranted healthcare services. Implementing analysis that supports the systematic review of vast stores of de-identified electronic patient records will improve the quality of patient care and decrease the overall cost of healthcare services.

The next article in this series will address how big data can be used to combat waste, fraud and abuse in healthcare.
 

Roger Foster is a Senior Director at DRC’s High Performance Technologies Group and advisory board member of the Technology Management program at George Mason University. He has over 20 years of leadership experience in strategy, technology management and operations support for government agencies and commercial businesses. He has worked on big data problems for scientific computing in fields ranging from large astrophysical data sets to health information technology. He has a master’s degree in Management of Technology from the Massachusetts Institute of Technology and a doctorate in Astronomy from the University of California, Berkeley. He can be reached at rfoster@drc.com, and followed on Twitter at @foster_roger.
