Q&A: Security failings 'a cultural issue,' says expert
Mac McMillan, CEO of Austin, Texas-based IT security firm CynergisTek and chair of the HIMSS Privacy & Security Policy Task Force, has some strong opinions about privacy protections in healthcare nowadays. The short version? Things could be a lot better.
You’ve said that if any other industry had this many privacy and security breaches, “heads would be rolling.” What is wrong with healthcare? Is the information landscape just too complicated? Or is it a matter of culture?
It definitely is not that it’s too complicated. And I think a lot of people try to hide behind that. They say, ‘We can’t do it, because our industry is so unique.’ Well, it’s not any more unique than the financial industry or the energy sector, or things that go on in the federal government. It’s just a matter of really figuring out how to do it, and solving the problem. Quite frankly, I think the problem is still that it’s a cultural issue. Look at the number of breaches. Insider snooping is still rampant in healthcare. And that’s a cultural issue. It’s a cultural issue that says, ‘It’s OK for me to be looking at things that I’m not supposed to be looking at.’ Which goes right to the core of how the industry views security and confidentiality. Providers do not see security as an imperative yet. That’s not across the board, obviously. There are some folks out there who are actually getting it and trying to do a good job. And they’re making the investment, and I think they’re seeing the benefit of doing that now. They’re realizing that there is a benefit to doing these things correctly. But that’s not the majority.
[See also: Top 5 most common gaps in healthcare data security and privacy.]
You just took part in a webinar on HIPAA security risk analyses. What should organizations keep in mind when undertaking them?
One of the things they really need to focus on is understanding and appreciating where their personal health information is – even more than before. HIPAA has always had a requirement for organizations to map where their personal health information is, and to build their programs around that and understand what the risks are to that data, whether it’s at rest or in transit. But with the requirements being levied under the HITECH rules, it’s getting more and more specific. And there’s more and more emphasis being placed on really knowing where that data is, who’s touching it, where it’s being sent, the relevance or the appropriateness of where it’s going, and where it’s residing. And also whether or not they really, truly assessed the risk to that information properly – “reasonably” is the term the government uses – and then took appropriate measures to protect it. More and more, they’re looking at these breaches that are occurring. They’re going to conduct 150 audits between now and next October, spread out among providers and payers and business associates. Organizations now have to be ready to receive either an audit or an investigation, depending on the circumstances, and they can’t just sit there and hide behind the fact that they’ve done a cursory risk assessment.
What do you suspect most providers will discover after those analyses? Robust security, or flaws they need to fix?
I suspect almost all of them are going to still have areas that they need to address. That’s been our experience all along. When I look back at the risk assessments that our company has conducted over the last year, I’m just absolutely amazed at the amount of remediation that a lot of organizations are still having to do. And part of it is because they just really have not invested in security yet. A classic example is that we still have hospitals out there that don’t have staff dedicated to the security function. They don’t have all of their policies and procedures documented. Many of them have not invested in the technologies that are necessary for them to put those controls in place. We still have organizations that are wrestling with whether to encrypt e-mail! You would think that would be a no-brainer. But when you look at the requirements being talked about for meaningful use Stage 2, they’re recommending even more security requirements be baked in, because there are incentives and penalties tied to that. And that, quite frankly, has gotten people’s attention and gotten them to spend money on security.
[See also: CynergisTek, Diebold partner on security.]