
CIO John Halamka answers reader questions on writing BYOD policy

SearchHealthIT's virtual seminar inspired viewers to ask about their own thorny BYOD policy issues in health care.

After Beth Israel Deaconess Medical Center (BIDMC) CIO John Halamka, M.D., concluded his SearchHealthIT virtual trade show presentation "BYOD for Healthcare: The Good, the Bad and the Ugly," attendees from hospitals throughout the country submitted questions on bolstering HIPAA compliance with BYOD policy. Read Halamka's insights and suggestions from that exchange.

Q: How did you get buy-in from the physicians for the BYOD policy, and who had to give them the news?

Halamka: It was relatively quick. I think it was Rahm Emanuel who said, "Never let any crisis go unused," or something to that effect. When you have a sentinel event, such as a laptop theft, it is a catalyst for change. That's really what happened with us: A physician's personal device was stolen, and the entire physician community became aware of the consequences of losing such a device -- cost, patient notification, institutional reputation, etc.

There was broad support at the medical executive committee level, across all the senior clinician leadership, to move forward with this initiative. I think the person who asked the question is right: If you didn't have a sentinel event and people didn't understand the risk, it would be a harder sell. So your choice is to have something bad happen and then mitigate, or to try an aggressive education program to get buy-in.

Q: How do you "sell" that policy of auto-wiping personal devices after 10 failed password attempts? It must be a hard pill for employees to swallow.

Halamka: Yes, it isn't "Oh, that's wonderful, you're going to erase all my personal data!" It has to be sold in terms of risk. If you look at the regulatory and compliance environment today, the cost of a stolen mobile device can run $300,000 or $500,000 when you're looking at legal work, forensics and patient notification.

So you say to the person: "You are ultimately responsible for your personal device. There could be hundreds of thousands of dollars in penalties, and some could even accrue to you personally. Or we could just ensure that we auto-wipe the device after 10 tries, because if you put a four-digit passcode on it, all somebody has to do is try 9,999 times, and privacy is breached." With that understanding of risk and responsibility, people have accepted the auto-wipe feature.
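The arithmetic behind that risk is easy to verify: a four-digit numeric passcode has only 10,000 possible values, so without a retry limit an attacker needs at most 9,999 wrong guesses, while a 10-attempt wipe caps the searchable fraction of the keyspace. A minimal sketch of that arithmetic, using the figures from the answer above (the function names are illustrative, not part of any real MDM product):

```python
# Rough arithmetic behind the 10-attempt auto-wipe policy described above.
# Assumes a numeric passcode; the lengths and limits are the figures
# quoted in the interview, used here purely for illustration.

def keyspace(digits: int) -> int:
    """Number of possible numeric passcodes of the given length."""
    return 10 ** digits

def breach_odds(digits: int, attempts_allowed: int) -> float:
    """Fraction of the keyspace an attacker can cover before auto-wipe."""
    return attempts_allowed / keyspace(digits)

# Without auto-wipe: at most 9,999 wrong guesses before the 10,000th
# four-digit code must be correct.
print(keyspace(4) - 1)     # 9999

# With a 10-attempt wipe, only 0.1% of the keyspace can be searched.
print(breach_odds(4, 10))  # 0.001
```

The same arithmetic shows why a longer alphanumeric password helps: each added character multiplies the keyspace, shrinking the fraction a bounded attacker can cover.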

Q: How can you protect medical devices from hackers? Not necessarily a BYOD question yet, but I am sure you're anticipating that happening as more patients walk in with sensors they've purchased themselves that connect over Bluetooth, Wi-Fi, etc.

Halamka: This is a huge issue. You would think that when you buy a device, it's like a toaster: completely self-contained, it's fine. Nope! A lot of the devices you buy -- EKG machines, IV pumps, imaging devices -- are like Linux workstations running a version of Apache from five years ago.

What happens, and it's a nasty problem for the country, is that the FDA wants to look at the safety of medical devices, and they say: "You've certified with us Apache version 1.0 running on Windows NT. Now if you want to update that with a patch or a new level of operating system, you have to go through the certification program again, at great expense." So you end up with a lot of these appliance-type devices that are security nightmares. We've had to build network isolation around them, keep them off the Internet, and put special intrusion detection and prevention monitoring on them -- because they could be spewing viruses all over your network unless you isolate them.

A word on implantable devices, such as pacemakers: They are not safe. They contain microprocessors. They have the same issues -- memory overflow, buffer issues, SQL injection -- every kind of hack you can imagine on an application in the data center is probably applicable to an implantable device. And that issue has not been well addressed yet.

Q: We use Citrix for remote access, and we don't allow remote workers to access the local drive. But that doesn't prevent someone from taking screenshots or saving email locally. How do you address this in a BYOD policy?

Halamka: Citrix is quite fine for thick-client applications that can't run locally, and many corporations use it to protect security and intellectual property inside the firewall. One challenge we all have is that doctors are extraordinarily impatient people, and the startup time for a Citrix session -- and sometimes its instability over a flaky Internet connection -- can be a disincentive for doctors to buy in.

We have a combination of Citrix for some applications and Web-based applications for others that can be used anywhere on any device. And we tell clinicians, "You have a policy requirement not to store data locally, not to do 'save as' or print screens," because we recognize that creates a breach situation. To mitigate that breach situation, what we've already done is encrypt the device; so should they "save as" or print screen, in some ways we've gotten around the breach reporting requirements. But you wonder, ultimately, if you're going to have to go to a virtual desktop infrastructure so that even if you do print screen or "save as," at the end of the session it disappears. But VDI really does have the same issues as Citrix: slow startup time and challenges over low-bandwidth connections.

Meaningful use stage 2 local encryption requirements, just so you know, actually have specific language that says "Yes, the product you use must keep any local data stored or cached encrypted, but if a user does a print screen or a 'save as,' that is really the responsibility of the user." It's not something the vendor or the covered entity has to take accountability for, it's the user's responsibility not to do that and to keep their devices appropriately secured.

Q: Do you run HIPAA risk assessments on every device on your network, or how does that work? For example, did you run testing on the iPhone 5 before allowing its use on your network?

Halamka: The good news is, the evolution of most of these operating systems is such that you have basic, floor-level encryption, and as the OS evolves, it just gets better. So, for example, our HIPAA risk assessment says that we believe we have adequate protection on Apple iOS 4.1 or higher; Mac OS X 10.7 or higher; BlackBerry OS 4.5 or higher, for the 200 users we have left in our enterprise; Windows 7 Professional or higher; and Android OS 2.3.4 or higher. So in that sense you've created a floor, and as new devices are introduced they are grandfathered into that floor.
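A version floor like the one Halamka describes amounts to a simple table lookup plus a version comparison. Here is a minimal sketch using the minimums he quotes; the table, helper names, and OS keys are illustrative examples, not BIDMC's actual policy tooling:

```python
# Illustrative check of a device OS against the risk-assessment "floor"
# versions quoted above. The table and helpers are hypothetical, not
# BIDMC's actual policy engine.

MIN_OS_VERSIONS = {
    "ios": (4, 1),         # Apple iOS 4.1 or higher
    "macos": (10, 7),      # Mac OS X 10.7 or higher
    "blackberry": (4, 5),  # BlackBerry OS 4.5 or higher
    "windows": (7,),       # Windows 7 Professional or higher
    "android": (2, 3, 4),  # Android OS 2.3.4 or higher
}

def parse_version(version: str) -> tuple:
    """Turn a dotted version string into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split("."))

def meets_floor(os_name: str, version: str) -> bool:
    """True if the device OS meets or exceeds the assessed minimum."""
    return parse_version(version) >= MIN_OS_VERSIONS[os_name]

print(meets_floor("android", "2.3.4"))  # True: exactly at the floor
print(meets_floor("android", "2.2"))    # False: below the floor
print(meets_floor("ios", "5.0"))        # True: newer OS inherits the floor
```

Comparing tuples of integers rather than raw strings is the key detail: a string comparison would wrongly rank "2.10" below "2.9".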

I will make a comment about Android: Although Apple controls iOS -- single-vendor control -- you don't really have a single vendor controlling Android, so there's really no guarantee new Android devices will work well with our server-side controls (i.e., Exchange ActiveSync enforcing encryption). So often we find that Android devices need to be tested just to make sure they'll work. It's not so much that the risk is going to be worse in the newer devices, but whether they will work at all.

Q: Do you maintain a list of recommended or mandated devices? Do employees even ask for advice?

Halamka: The challenge with BYOD is that people view this as their personal device. It's like saying, "We have decided you will drive an economy car, and the Prius is the best choice." It's very challenging to do that. Everyone has different values. Everyone has different needs for different applications and different form factors. So we set policies for minimum functionality and for how you should physically secure the device, and certainly we can provide guidance at the operating-system level. But ultimately it's up to the individual what they choose. If you choose poorly, you could end up with a device that just doesn't work on the network.

Q: How do you control how terribly chatty firewalls can be?

Halamka: Interesting question. What the person is getting at is that, as you put various protections on the server side and the client side, they have overhead. Some protocols, some approaches, send lots of data back and forth, and the nature of that data might be "Hello, I'm out here and I'm safe!" "Are you sure?" "Yes, I am!" -- constantly going back and forth with chatty protocols.

As we do all of our benchmarking of new technologies, we end up using appliances that we put out at client sites to look at the nature of bandwidth flows from those sites and at application performance. We have tried to avoid introducing technologies with such high overhead that they affect network performance. This is an incredibly rapidly evolving technology stack, a huge moving target. Today's gold star might be tomorrow's complete failure. The technology is changing daily; do your best to select a winner. We do a lot of testing before selecting a product.

Let us know what you think about the story; email or contact @DonFluckinger on Twitter.
