Q&A: How a health 'data spill' could be more damaging than what BP did to the Gulf

By Tom Sullivan
11:12 AM

Q: One thing we’ve all learned from the VA, not just from its 2006 data loss but also from this past October, when the agency reported the loss of an iPad, albeit an empty one, is just how easily hardware can disappear.

RK: Oh, absolutely. And in the context of healthcare, protected health information and financial information are used to drive medical ID theft, which seems to be one of the growing issues, and not only in the U.S. Medical identity theft is one consequence of healthcare data breaches being up 26 percent.

LP: The basic issue is data theft, not data loss, because it’s hard to know whether lost data ultimately ends up in the hands of a cybercriminal and all of these bad things occur. In the case of identity theft, the end goal has historically been to steal a person’s identity, and just like a financial record, a health record probably has your credit card, debit card, and payment information contained in it. Financial records are lucrative for the bad guy, but the health record is a much, much more valuable item because it not only gives you the financial information, it also contains the health credential, and medical identity theft is very hard to detect. What we’ve found in our studies is that medical identity theft is likely on the rise and, of course, the healthcare organizations that participate in our study are starting to see this as more of a medical identity theft crime. It’s not just about stealing credit cards and buying goodies, it’s about stealing who you are, possibly getting medical treatment and, in the process, messing up your medical record. The victim may not know about it until he or she stumbles on something that reveals their medical identity was stolen. Our respondents definitely recognized this as an emerging threat they believed was affecting patients in their organizations.

Q: Taking that into consideration, is there really even such a thing as a secure mobile device?

LP: I might get into trouble here. The answer is that mobile devices, by their very nature, can never be completely secure. The reason is that the whole idea is to allow the user maximum convenience in terms of communication, which means staying connected. You can use the device for so many purposes, from mobile payments to medical records to whatever else. The other part of that reality is that the security industry is spending lots of resources, and really talented companies have come up with better security solutions that don’t necessarily reduce the convenience but create a much better security environment around the devices. As you probably know, those solutions have to be invisible to the end-user to be acceptable. We’re seeing more technologies being built with this kind of convenience in mind. Right now mobile devices are a source of great insecurity. They’ll never be fully secure, but over time they’ll become much less insecure through the development of new technology.

Q: All of this makes plenty of patients and providers wary about sharing personal information. But for healthcare to really improve in this country – and I’m talking with or without federal health reform legislation – patients need to be willing to share their data in a protected fashion, so it can be compiled and analyzed for better outcomes. How can we get over this patient consent hump?

RK: I would call it practical or pragmatic tactics: providers taking very simple steps to understand where their protected health information is, having a prepared response so that when they do have an issue they can deal with the privacy and security problems quickly, and, probably most important, making sure all of their business associates and their ecosystem of contractors and providers are focused on keeping the information protected and secure. Beyond that, as Larry said, it’s going to take the security and health industries making improvements in technology, training, and just general awareness that this data has value both for good and for evil, and as an important asset it needs to be protected.

LP: Ditto for my response, and I’d just add a couple of things. First, I think the issue of consent is very, very difficult. That’s because people will give their consent without really giving their consent. In other words, they understand it just well enough to check the box that everything’s okay, and then down the road, when something happens, they’ll say, ‘How did my data end up there?’ So we know that with consent, you could talk to 100 people and get 99 different definitions of what they consider consent to be. But as the public gets smarter about this whole issue – hopefully that occurs – they are going to be a bit more careful about letting providers share their data with third parties. The other issue is the emergence of large health databases. It’s possible you could build in certain controls to give patients a sense of control over the data that’s collected and used about them. But it also creates a potential firestorm for cybercrime, because instead of hacking into one hospital, you’re hacking into a database of the United States government, and that has huge infrastructure implications.

Q: Is there anything that came out of the study that I didn’t ask about?

RK: The only thing I’d add in closing, Tom, is to think about the $6.5 billion figure and imagine that money being put toward good rather than being spent on responding to data breaches. With that $6.5 billion you could hire 81,250 registered nurses nationwide to help patients.
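For context, a quick back-of-the-envelope check of that comparison, assuming the roughly $80,000 annual cost per nurse that the two figures imply (an assumption, not a number stated in the interview):

$$ \frac{\$6{,}500{,}000{,}000}{\$80{,}000 \text{ per nurse per year}} = 81{,}250 \text{ nurses} $$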

LP: A sobering thought, there.
