ORLANDO -- Compensation for healthcare providers in America is moving away from a fee-for-service model to one based on value, measured by quality, affordability, efficiency and patient satisfaction. The former model is based on the volume of services provided, while the new population health models focus on improving the overall well-being of a defined group of patients.
Providers now need to organize the healthcare system around an understanding of their patient populations, said Charles Saunders, M.D., CEO of Healthagen Population Health Solutions in New York (Aetna's population health company), speaking in June at the Healthcare Financial Management Association (HFMA) annual conference in Orlando. This change is inevitable, he said, because payers are moving to value-based compensation.
Medicare has pledged that next year, 30% of its payments will be made to accountable care organizations (ACOs) and 85% of all payments will be value-based. By 2018, one out of every two payments will go to ACOs and nine out of 10 to value-based programs.
Every session covering population health was well attended at the HFMA conference, and speaker after speaker touched on the challenges. Almost every case came down to the same raw conclusion: To analyze population health, healthcare organizations need data and technology, not just one or the other.
Population health = population risk
Healthcare providers who embark on a population health initiative agree to assume greater risk. In all the various population health models, the provider assumes responsibility for providing care for a group of patients for a negotiated amount. If the cost of care exceeds that amount, the provider loses money.
"Healthcare in the past has been rewarded not for risk, but for avoiding risk," said Roger Jansen, senior vice president and chief human resources officer at Spectrum Health, a hospital and health plan system based in Grand Rapids, Mich. But with population health initiatives, a provider is knowingly and willingly taking on risk.
"We need to understand risk," said Robert Broadway, vice president of corporate strategy at Bethesda Health, a hospital system in Boynton Beach, Fla. But the real truth is that providers under the fee-for-service model already assume "exorbitant risk" from treating self-pays to insuring themselves against malpractice, said Broadway, who is also former chairman of the HFMA board of directors.
Population health models require a "completely new set of skill sets running a risky, narrow-margin business that consumes an enormous amount of capital, and if you are wrong on pricing, you can have massive losses," Saunders said. He encouraged providers to look at payers, which have been managing risk for decades.
The risk is measurable. "The first thing is to aggregate and analyze data that is available to understand the population and understand the risk," Saunders said. To do that, use analytics to focus on "highest-risk individuals," as in his experience, 45% of the cost of any health system is consumed by 5% of its population. "No one is in a better position than a health system" to understand those at highest risk and highest cost, he said.
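Saunders' observation that roughly 5% of a health system's population can consume 45% of its cost can be checked directly once per-patient cost data is aggregated. The sketch below is illustrative only, with made-up numbers, and is not Healthagen's actual method:

```python
# Hypothetical sketch: measure how concentrated costs are in a patient
# population, in the spirit of Saunders' point that a small share of
# patients drives most of the cost. All figures are made up.

def cost_concentration(costs, top_fraction=0.05):
    """Return the share of total cost incurred by the top `top_fraction`
    of patients when ranked by annual cost."""
    ranked = sorted(costs, reverse=True)
    top_n = max(1, int(len(ranked) * top_fraction))
    total = sum(ranked)
    return sum(ranked[:top_n]) / total if total else 0.0

# A skewed, invented cost distribution for 100 patients
annual_costs = [120_000, 95_000, 80_000] + [900] * 97
share = cost_concentration(annual_costs, top_fraction=0.05)
print(f"Top 5% of patients account for {share:.0%} of total cost")
```

Ranking first and then slicing the top fraction is the simplest way to surface the "highest-risk individuals" a health system would then target for intervention.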
All kinds of data are relevant to assessing risk, Saunders said. Examine electronic health records, claims, patient registration information, patient self-reported health risk appraisals, laboratory and prescriptive information, and even external sources such as credit card and other sociodemographic data that will enable you to break down populations and predict behavior and risk.
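Combining the data sources Saunders lists means joining records from different systems into one per-patient view before any risk model can run. A minimal sketch, in which the field names, patient IDs and merge strategy are all assumptions for illustration:

```python
# Hedged sketch: fold several keyed data sources (EHR, claims,
# self-reported appraisals) into one record per patient. Field names
# and values are invented for illustration only.

ehr        = {"p1": {"a1c": 8.2}, "p2": {"a1c": 5.6}}
claims     = {"p1": {"annual_cost": 42_000}, "p2": {"annual_cost": 1_100}}
appraisals = {"p1": {"smoker": True}}

def merge_sources(*sources):
    """Build one combined record per patient ID across all sources."""
    merged = {}
    for source in sources:
        for patient_id, fields in source.items():
            merged.setdefault(patient_id, {}).update(fields)
    return merged

profiles = merge_sources(ehr, claims, appraisals)
print(profiles["p1"])
```

In practice the join key and data cleaning are the hard part; the point here is only that disparate sources collapse into a single profile that a risk model can score.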
While providers might be daunted by the apparent volume of data, especially in a field that lags behind other data-driven industries, one provider argues that what seems massive actually is not.
Dignity Health runs on data
One of the nation's largest healthcare providers has jumped into a data-rich environment with both feet.
Dignity Health, which is based in San Francisco, exports all its data from repositories into a Hadoop data lake, in which the information can be culled and analyzed. The data lake sits in the cloud alongside more traditional databases, rather than replacing them, said Sunil Kakade, Dignity Health's director of information and application architecture. He spoke in June at the Hadoop Summit North America.
"Healthcare data is not going to be high volume," he said. "It's not like Facebook," with billions of transactions. "Big data doesn't mean a lot of data," Kakade explained.
Generally, data lakes store information using the open source Hadoop framework for future use in analytics applications. Proponents believe data lake architecture offers a cheaper alternative to traditional data warehouses. Others, however, say data lakes bring hidden complications.
Dignity has structured its data lake around the patient. Even if Dignity treated every person in the U.S., the database would hold at most 400 million rows, Kakade said. Thanks to Hadoop, the database contains structured and unstructured data, and the horsepower behind the system comes from the analytic engines.
"We use big data to find the small data and do smart analytics on that," he said. Much of what Dignity's analysts do is identify risk, carving large patient databases into smaller pieces for forecasting. As an example, he said if his company wanted to identify the risk around patients with congestive heart failure, it represents a relatively small subset of the entire patient population.
"We want to start focusing on the use case and not focus on the data," Kakade said. By employing the data lake approach, the emphasis shifts to the analysis. "Doctors can analyze the data better than the IT guy, so we should take the IT guy out of the equation," he said.