Healthcare predictive analytics and the HIPAA omnibus rule

Some commentators feel privacy regulations make predictive analytics in healthcare impossible, but there are ways to work within the law.

BOSTON -- Healthcare predictive analytics holds great promise: The technology could help hospitals understand which patients are most likely to need follow-up care, thereby reducing readmissions. It could also assist primary care doctors in managing large patient populations with diverse illnesses.

However, the industry has been slower than other sectors, such as manufacturing and retail, to embrace the power of data analytics. Why? The more stringent privacy provisions of the HIPAA omnibus rule, for one thing, said speakers at the O'Reilly Strata Rx 2013 Conference.

Some commentators have noted that healthcare providers face tighter privacy regulations than other industries. These laws can slow down data fluidity, making it hard for researchers and providers to get their hands on the data they need. But this problem is far from insurmountable and does not need to impede robust analytics operations.

Ann Waldo, partner at Wittie, Letsche & Waldo LLP, said HIPAA privacy regulations can be prescriptive, which makes life difficult for analytics professionals who want to ply their skills in the healthcare industry.

"From a data science perspective, that adds a lot of cost and complexity," Waldo said.

This is particularly true for third-party software vendors who want to sell analytics technology to providers. Prior to the HIPAA omnibus rule, third-party vendors' responsibilities under HIPAA were somewhat opaque. But when the updates took effect last January, with their revamped business associate agreements, it became clear that business associates such as analytics vendors are subject to the same regulations and liabilities as the healthcare providers themselves.

Another hang-up holding back the development of predictive analytics in healthcare is the prohibition against selling protected health information (PHI). HIPAA-covered entities are not allowed to sell their patients' data. The rule was originally put in place mostly to prevent marketers from accessing sensitive clinical information, but Waldo said the provision also applies to third-party vendors who move data from the provider to their own databases for analytical purposes. Since large data sets can be expensive to move and vendors want to be reimbursed for that work, it can be difficult to get projects started.

"In the real world, data doesn't move for free very often," Waldo said. "Big data sets don't move painlessly. This is a pretty serious burden on data fluidity."

But the news isn't all bad. Waldo said the HITECH Act, the basis of the meaningful use EHR incentive program, included a provision that allows medical research subjects to sign compound authorizations. This means that while researchers are getting authorizations from patients for a present study, they can also ask the patients if they would be willing to let the researchers hold onto their data for future studies. Waldo said compound authorizations were previously prohibited, which made it difficult for data scientists to develop analytical models, a process that is often iterative.

Furthermore, there are ways providers can use the data they already have on hand to develop those models. Khaled El Emam, senior investigator at the Children's Hospital of Eastern Ontario Research Institute, said a number of methods, such as data masking and de-identification, can release data sets from HIPAA regulations while still maintaining their usefulness in analytics initiatives. As long as providers remove the 18 identifiers enumerated in the rule -- such as names, street addresses, phone numbers, email addresses, full-face photographs, Social Security numbers, medical record numbers and dates more specific than the year -- the resulting data is considered de-identified under the Safe Harbor method and falls outside HIPAA's restrictions (the full standard is set out in 45 CFR 164.514(b)).
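As a rough illustration, a Safe Harbor-style de-identification pass might look like the following sketch. The field names and record layout here are hypothetical, and the identifier list is abbreviated; the authoritative list of 18 identifiers is in 45 CFR 164.514(b)(2).

```python
# Illustrative sketch of Safe Harbor-style de-identification.
# Field names are assumptions for this example, not a standard schema.

# Hypothetical direct identifiers to strip outright (abbreviated list)
DIRECT_IDENTIFIERS = {
    "name", "street_address", "phone", "fax", "email", "ssn",
    "medical_record_number", "health_plan_id", "account_number",
    "device_id", "url", "ip_address", "photo",
}

def deidentify(record: dict) -> dict:
    """Return a copy of `record` with direct identifiers removed and
    quasi-identifiers generalized in the Safe Harbor style."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

    # Dates: Safe Harbor removes all date elements more specific than
    # the year, so keep only the year of birth.
    if "date_of_birth" in out:
        out["birth_year"] = out.pop("date_of_birth")[:4]

    # Ages 90 and over are aggregated into a single category.
    if "age" in out and out["age"] >= 90:
        out["age"] = "90+"

    # ZIP codes: keep only the first three digits (the rule adds a
    # population threshold for small areas, omitted here for brevity).
    if "zip" in out:
        out["zip"] = out["zip"][:3] + "**"

    return out

record = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "date_of_birth": "1948-03-17",
    "age": 75,
    "zip": "02139",
    "diagnosis_code": "E11.9",   # clinical data is retained
}
print(deidentify(record))
```

Note that this is purely technical masking; as El Emam points out below, the surrounding administrative controls still matter for compliance.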

But, El Emam cautioned, deploying technical measures to de-identify data does not guarantee compliance with the law. He recommended providers and business associates read all applicable regulations in detail before doing anything that could expose PHI.

"It's important to remember that it's not only a technical thing," he said. "The regulations are important. They talk about the controls to put in place."

Ed Burns is site editor of SearchBusinessAnalytics. Email him at eburns@techtarget.com and follow him on Twitter: @EdBurnsTT.
