April 2, 2014 11:18 AM
Posted by: adelvecchio
Accountable Care Organizations, population health management
Guest post by Greg Chittim, senior director, Arcadia Healthcare Solutions
It’s no secret that healthcare is changing. Traditional fee-for-service payment models are going away, and new capitated, quality-based payment models are here to stay. The challenge for providers is to position themselves for success under these new models. Now more than ever, healthcare organizations must align their care delivery systems and payment models to support population health management, ensuring they deliver the highest-value proactive care to individual patients and patient populations.
The purpose of population health management is to create a mutually beneficial environment that incentivizes both plans and providers to manage the health of their patients by stratifying the population based on risk, engaging high-risk patients, and proactively improving the health of the overall population. Proactive management of high-risk patients has been shown to drastically reduce costs by driving down the number of costly emergency department (ED) visits.
With this transition come many challenges. New technologies, laws and regulations, lack of expertise, and the inherent difficulties of change management are causing all players supporting the transformation to be spread thin. To be successful in the healthcare industry of tomorrow, change is required today. More and more organizations are beginning to engage with accountable care organizations (ACOs) and other capitated, quality-centric contracts, implement tools and technology that support population health, and break ground on initiatives that will better position them for success in the new era of healthcare. The industry is realizing that population health is not a fad — it is the future of healthcare.
At Arcadia Healthcare Solutions, we define population health management as a provider’s ability to answer the following three questions with certainty, and have the data to support those answers:
- Who are my patients?
- How sick are my patients?
- Am I effectively caring for my patients?
Below are steps organizations can take to address these questions.
Step 1: Provider-patient attribution
Maintaining a high degree of precision around defining a patient population has never been as important as it is now. Historically, patients scheduled visits, came in for their appointments, and follow-ups were conducted as necessary. With new payment and delivery models — such as ACOs and patient-centered medical homes (PCMHs) — provider reimbursements are increasingly dependent on the quality of care delivered and the overall risk of their patient population. Providers are now forced to better manage the health of their patient population by proactively engaging their highest-risk patients while appropriately managing the health of the rest. They must also support these health management practices with high-quality data from their EHRs. As a result, it is absolutely critical that health plans and other communities (such as health information exchanges) can successfully attribute patients to their primary care provider (PCP), and that all parties can answer the question, “Who are my patients?”
In theory, this seems like a simple task: assign patients to a PCP, document it, and reconcile as needed — what’s the big deal? But of course, the devil is in the details. With the disconnected nature of healthcare and the numerous parties involved in the attribution process, aligning this data and creating a single source of truth is actually extremely challenging. We recently came across an example of this at an organization that thought it had a good handle on attribution — around 70-80% by its estimation. Upon initial analysis between its EHR and health plan membership files, we discovered that attribution rates were actually hovering around 13%, creating a barrier for future population health and quality improvement initiatives.
Attribution is where all data analytics, quality improvement, and PCMH transformation projects should begin. The most effective way to do this is by integrating clinical and demographic data from EHRs with claims and eligibility data. We do this by tapping into the back end of the EHR, pulling claims data, and loading all data into a centralized data warehouse. Data is then scrubbed and merged utilizing a master patient index, creating a single record for all patients. This rich view of a patient’s activity gives providers the level of detail captured in the EHR, as well as the breadth of data from claims. This allows them to see patient activity across the entire care continuum.
The initial attribution process involves reconciling conflicts between a provider’s perception of attribution (from EHR data) and a plan’s perception (from claims data). This inevitably requires some degree of manual intervention by both payers and providers, but only needs to happen once. A recurring, automated process — requiring far less manual intervention — is implemented after the initial attribution process is complete. We have seen this process improve attribution by upwards of 80% in as few as 90 days, and maintain sustainable attribution rates upwards of 90%.
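As a rough illustration of that initial reconciliation, the sketch below compares each patient's PCP as recorded in the EHR against the plan's membership file and buckets the disagreements for review. The record layout and field names are hypothetical assumptions, not Arcadia's actual data model:

```python
# Hypothetical sketch: reconcile provider attribution between EHR and plan records.
# Field names and matching rules are illustrative assumptions, not a real schema.

def reconcile_attribution(ehr_records, claims_records):
    """Compare each patient's PCP as recorded in the EHR vs. the plan's
    membership file, and bucket the results for manual review."""
    claims_by_patient = {r["patient_id"]: r["pcp"] for r in claims_records}
    matched, conflicts, ehr_only = [], [], []
    for rec in ehr_records:
        pid = rec["patient_id"]
        if pid not in claims_by_patient:
            ehr_only.append(pid)          # plan has no attribution on file
        elif claims_by_patient[pid] == rec["pcp"]:
            matched.append(pid)           # both sources agree
        else:
            conflicts.append(pid)         # needs payer/provider reconciliation
    return matched, conflicts, ehr_only

ehr = [
    {"patient_id": "P1", "pcp": "Dr. Adams"},
    {"patient_id": "P2", "pcp": "Dr. Baker"},
    {"patient_id": "P3", "pcp": "Dr. Adams"},
]
claims = [
    {"patient_id": "P1", "pcp": "Dr. Adams"},
    {"patient_id": "P2", "pcp": "Dr. Chen"},
]
matched, conflicts, ehr_only = reconcile_attribution(ehr, claims)
```

In practice the comparison runs against a master patient index rather than raw identifiers, but the three buckets (agreement, conflict, and plan-missing) are the core of the reconciliation step.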
Once an attribution process is in place, the value of any reporting, quality improvement, PCMH transformation, or overall population health initiative will increase dramatically. Providers will have transparency into their patient population, creating the foundation to begin tackling the challenges that will position them for success in the new era of healthcare.
Step 2: Risk stratification
Of the patient population you identified, how many of these patients are diabetic? Hypertensive? How many are at risk for becoming hypertensive? Are you certain that you can properly identify all of them?
Being able to answer these questions and having the data to back up your answers is crucial. Doing so is more difficult than you may think. First, it is important to be able to identify patients with chronic or other high-risk conditions so you can proactively manage those conditions. Second, new fee-for-value reimbursement models are directly dependent on the level of risk associated with your patient population. Failing to identify high-risk patients leaves reimbursement dollars on the table. Finally, you must be able to prove the scope and beneficial impact of the interventions you deliver.
Traditionally, tools used to stratify the population and calculate risk have been based primarily on claims data. High-risk patients were identified based on the diagnosis codes for chronic or high-risk conditions, and outreach and engagement strategies leveraged this data to proactively manage those conditions. However, claims data does not always tell the whole story. What if the condition was not relevant to the visit or was not properly coded? What if the patient has not been in for a visit in that calendar year? What if the patient is at risk of becoming hypertensive, but that risk is only captured in specific lab results or diagnostic values? Each of these situations presents a missed opportunity to both improve quality of care and maximize risk reimbursement.
So, how do you fix this? How do you identify these patients, and how do you ensure that you are maximizing your risk reimbursement? One effective way is to leverage EHR data. Integrating claims with EHR data gives you a richer view of the patient population. EHR data can tell you things that claims will often miss, including:
- Vital signs — Which patients have not yet been diagnosed, but are at high risk of becoming hypertensive?
- Medical history — Which patients may not have had a visit in the last year, but are considered high-risk?
- Transitions of care — Have any of your patients recently been to the ED?
These are only a few of the questions an integrated data set can answer, but the value of this information is sizable. This level of transparency allows providers to proactively manage their high-risk patients, as well as manage those on the brink of the ‘high-risk’ category. Internal studies conducted by Arcadia have revealed that millions of dollars in risk reimbursement opportunities have been missed due to the incomplete picture presented by claims data.
Although alternative strategies may exist that give you the transparency you need to measure the health of your population, an integrated, data-driven approach is crucial to doing this effectively at scale.
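To make the vitals example above concrete, here is a minimal sketch that flags patients whose latest blood pressure reading is elevated but who carry no hypertension diagnosis in claims. The 130/80 mmHg threshold, record layout, and field names are illustrative assumptions:

```python
# Hypothetical sketch: flag patients at risk for hypertension from EHR vitals.
# The 130/80 mmHg threshold and record layout are illustrative assumptions.

def flag_hypertension_risk(patients, sys_limit=130, dia_limit=80):
    """Return IDs of patients whose latest blood pressure reading is elevated
    but who carry no hypertension diagnosis in their claims history."""
    at_risk = []
    for p in patients:
        systolic, diastolic = p["latest_bp"]
        elevated = systolic >= sys_limit or diastolic >= dia_limit
        if elevated and "hypertension" not in p["claims_diagnoses"]:
            at_risk.append(p["patient_id"])
    return at_risk

patients = [
    {"patient_id": "P1", "latest_bp": (142, 88), "claims_diagnoses": []},
    {"patient_id": "P2", "latest_bp": (118, 76), "claims_diagnoses": []},
    {"patient_id": "P3", "latest_bp": (150, 95), "claims_diagnoses": ["hypertension"]},
]
at_risk = flag_hypertension_risk(patients)  # P1 only: elevated BP, no diagnosis
```

The same pattern extends to the other signals above: lab values, visit recency, and ED discharge events each become a rule over the integrated data set rather than over claims alone.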
Step 3: Intervene and track
Are you doing what you need to do to effectively care for your patient population? You may be, but do you have the data to back it up? An integrated, high quality data set gives you the data required to support your answer to this question, and gives you the foundational tools needed to identify opportunities and drive improvement.
Once your patient population has been identified, the health and quality of care has been measured, and the population has been stratified based on risk, providers and care teams can begin driving more effective population health and quality improvement initiatives. Visibility into the specific needs of a patient population allows providers to deliver preventative care to manage the health of their high-risk patients, and engage patients that are in danger of becoming high-risk. Combined, these tools allow providers to maximize the value of their ambulatory networks by aligning delivery models with fee-for-value payment models, and to manage costs by providing proactive care as opposed to costly ED visits and reactive care.
Some quality improvement issues are obvious to tackle in this model; others aren’t as apparent. For this purpose, we built a platform that delivers a library of best practice blueprints that can be deployed to drive quality improvement programs. This solution allows you to track the progress of each intervention and attribute results to it. A good change process is usually iterative, but this tool can take some of the iterative cycles and guesswork out of the process. Ultimately, it is designed to drive rapid, sustainable results from quality improvement initiatives.
As provider organizations begin to transform and adopt population health-based tools and methodologies, the improved level of visibility into the organization and patient population will fuel opportunities to drive sustainable change and improvement. With the shift from fee-for-service to fee-for-value payment models, the topic of population health is getting more and more attention. There is optimism about the effect that this shift will have on the industry, as our clients are already yielding positive outcomes in the early phases of the transformation.
March 5, 2014 12:50 PM
Posted by: adelvecchio
EHR Adoption
Guest post by Karen Clay, vice president of operations, Amphion Medical Solutions Transcription Division
From meaningful use compliance to value-based purchasing, today’s healthcare organizations’ ability to compete is driven by the accuracy of structured narrative reports and the speed with which they are fed into their electronic health record systems.
Financial and clinical coding processes rely upon these reports — which make up 50% of a patient’s record — as their primary source of information. Providers depend upon dictated text to communicate the unique, expressive and complete patient story to other healthcare providers. However, to be leveraged by the EHR, narrative reports have had to evolve into discrete and interoperable patient data available in readable and scannable formats.
The impact of an EHR doesn’t end there. It has also altered the process of transcription and the role of the medical transcriptionist.
When conversation turns to the aspects of healthcare most affected by EHR adoption, transcription is rarely mentioned. The truth is that the move toward a fully digital environment characterized by real-time data sharing and exchange has technically and functionally transformed it.
On the technology front, EHRs are able to interface directly with transcription platforms to parse data. Transcription now creates discrete data fields rather than flat files or static information snapshots. In addition to complying with HL7 data requirements, these capabilities created demand for dictation software with advanced speech understanding to create greater efficiencies in data transfer.
Specifically, admission, discharge and transfer (ADT) feeds and clinical dictation can now be integrated between systems, eliminating manual data entry. Instead, patient demographic information can be systematically merged for editing, which also speeds turnaround times.
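As a simplified illustration of that merging, the sketch below pulls patient ID, name, and birth date out of the PID segment of an HL7 v2 ADT message so they can be attached to a transcription job. The sample message is fabricated, and real integrations should use a proper HL7 parsing library; this handles only the happy path:

```python
# Hypothetical sketch of extracting patient demographics from an HL7 v2 ADT
# message. Segments are separated by carriage returns, fields by "|", and
# name components by "^". Only the happy path is handled here.

def parse_adt_demographics(message):
    """Extract patient ID, name, and birth date from the PID segment."""
    for segment in message.split("\r"):
        fields = segment.split("|")
        if fields[0] == "PID":
            last, first = fields[5].split("^")[:2]   # XPN: family^given
            return {
                "patient_id": fields[3],
                "name": f"{first} {last}",
                "dob": fields[7],
            }
    return None  # no PID segment found

adt = (
    "MSH|^~\\&|ADT|HOSP|TRANS|HOSP|20140305||ADT^A01|123|P|2.5\r"
    "PID|1||MRN12345||DOE^JANE||19810704|F"
)
demo = parse_adt_demographics(adt)
```

Once parsed, these fields can pre-populate the header of the transcription job, which is what removes the manual re-keying of demographics.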
That transition to editor is perhaps the most significant effect EHR adoption has had on the role of medical transcriptionists. As speech understanding software becomes more commonplace, creation of a typed document will become extinct. Instead of creating documents, transcriptionists are now responsible for editing them for medical accuracy. Adapting to this new role has been a challenge for some transcriptionists. For others, it has opened doors to new professional opportunities, including quality assurance and coding.
Making the transition
Just as the role of transcriptionists has evolved under the influence of the EHR, so too must the technology infrastructure, in order to effectively transition reports from one-dimensional text to reusable patient data. Today’s narratives are now subject to natural language processing (NLP) — technology capable of “understanding” spoken dictation and converting it to electronic text that can be parsed and mapped to specific data fields.
When paired with transcription management software, NLP technology enables hospitals to seamlessly integrate dictation into the EHR based on pre-defined templates that determine where the data should wind up within the electronic record. While the presence of an EHR on its own does not alter the front-end look or content of the narrative report, it does introduce greater flexibility into the look and feel of the templates themselves.
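A toy sketch of the template idea: split a transcribed narrative on known section headings and route each section to a target field in the electronic record. The headings, field names, and literal heading-matching approach are illustrative only; production systems rely on NLP rather than exact heading matches:

```python
# Hypothetical sketch: route sections of a transcribed narrative into
# pre-defined EHR template fields. Headings and field names are illustrative.

TEMPLATE_MAP = {
    "CHIEF COMPLAINT": "chief_complaint",
    "HISTORY OF PRESENT ILLNESS": "hpi",
    "ASSESSMENT AND PLAN": "assessment_plan",
}

def route_sections(narrative):
    """Split a dictated note on known headings and map each section
    to its target field in the electronic record."""
    record, current = {}, None
    for line in narrative.splitlines():
        heading = line.rstrip(":").strip().upper()
        if heading in TEMPLATE_MAP:
            current = TEMPLATE_MAP[heading]   # start a new section
            record[current] = ""
        elif current:
            record[current] = (record[current] + " " + line.strip()).strip()
    return record

note = """CHIEF COMPLAINT:
Shortness of breath.
HISTORY OF PRESENT ILLNESS:
Symptoms began two days ago."""
fields = route_sections(note)
```

Because the routing is defined in a mapping rather than in the dictation itself, each facility can swap in its own template without changing how clinicians dictate.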
This is particularly useful for transcriptionists working with multi-facility systems or outsourced transcription vendors. In a manual environment, these transcriptionists would spend a portion of their time formatting clinical narratives to meet the requirements of individual hospitals. Now, formatting is handled by the software behind the scenes. The end result is higher productivity levels, faster turnaround times and greater standardization.
One of the most important aspects of a successful transition to the new era of transcription is establishing the framework to guide software implementation and adoption. In many cases, hospitals find it helpful to seek out the support of a vendor experienced in identifying needs and capable of designing an effective strategy to meet them.
The best firms will follow established best practices, which should include early and ongoing involvement of the facility’s leadership and medical staff to ensure top-down adoption of and compliance with the new processes. Proper training and education are at the heart of any successful transition, which is why it is important that the vendor selected to guide the process provides education at appropriate milestones. At a minimum, education should be given during the kickoff phase and at go-live.
The typical training plan should include, at minimum:
- Identification of training facility needs
- Initial training of in-house trainers (“training the trainers”)
- Identification and selection of an in-house training coordinator
Identification, design and development of facility-specific training materials should also be provided, including roles-based guides for instructors, administrative users and report users. Finally, customized workflow mapping should be conducted to ensure any areas of weakness are identified and addressed.
Whether it is the promise of incentive funds or the threat of reimbursement cuts, the pressure to transition to an EHR is increasing. To maximize the return on their investment, hospitals and other healthcare organizations should pay close attention to the impact the resulting changes have on their transcription processes — and the transcriptionists themselves.
The adoption of NLP and front-end speech understanding software, combined with targeted transcriptionist training and education will ultimately increase accuracy, efficiency and productivity. It will also speed access to patient information by those who need it to make the decisions that impact care quality and financial outcomes.
Karen Clay is the vice president of operations for Amphion Medical Solutions’ Transcription Division. She can be reached at firstname.lastname@example.org.
February 19, 2014 11:56 AM
Posted by: adelvecchio
healthcare analytics
Guest post by Greg Chittim, senior director, Arcadia Healthcare Solutions
Every year the healthcare industry reflects on what worked and what didn’t. At the heart of it all, we are hoping to have achieved the one thing that everyone is driving towards: increasing the quality and efficiency of care for our patients. At the end of 2013, despite the negative press surrounding the Healthcare.gov rollout, I think the industry felt fairly optimistic about the value health IT was beginning to demonstrate and how it could change the face of healthcare. The next 12 months will determine whether this optimism continues.
In 2014 and beyond, I foresee less of a focus on what EHR platform is being used and more of a focus on managing disparate data, improving workflows, and driving adoption more effectively. I truly believe that physicians will figure out how to work with whatever system they’re using in ways that efficiently and effectively improve the health of their patients. Before we get there, there will be a few bumps and bruises along the way.
- The struggle to recruit healthcare analysts and technologists will mount – especially in rural areas (where few of them live) and in urban healthcare hotspots (where competition for their services is intense). Health systems and plans will need to find other ways to acquire the analytical talent required to meet the needs of population health and shared-risk contract initiatives.
- There will be greater separation between the best-of-the-best and everyone else in quality improvement initiatives – Those who have been implementing quality improvement programs are seeing the fruits of their data, process and team initiatives. Those who have not seen these programs grow organically will be hungry to learn best practices in order to emulate the success.
- ICD-10 will go smoothly for most hospitals, but practices will be underprepared – Hospitals and large health systems have been focusing on all aspects of ICD-10 for years, but too many practices are taking a “wait-and-see” approach. They’re relying on their EHR vendors and ignoring the more challenging processes, training, and ongoing measurement work streams that will ensure a completely successful launch. In 2014, it will be important for them to shift their focus so they are better prepared.
- The demands for data in accountable care organizations (ACOs) will increase – Beyond claims provided by CMS or their health plan, ACOs will increasingly need to use and integrate data from additional sources. Pulling data from different sources — ambulatory EHRs in particular — will help them achieve their desired cost effectiveness and care quality improvements. Having a comprehensive view of all aspects of practice and network operations (quality, general ledger, claims, scheduling, etc.) will become the biggest competitive advantage for emerging ACOs.
- Technology services will increasingly be outsourced, but at a price – As outsourcing moves from large health systems down to individual practices, there will be a need for responsive, onshore, vendor-agnostic support. Doing so will leave technology, connectivity, and performance tasks to IT experts and allow the medical experts to focus on patient care.
- Access, enrollment, and management of patient panels in community health will become an issue – In states that are implementing Medicaid expansion, there will be significantly more patients leveraging Community Health Center (CHC) services. CHCs are among the best in the country at efficiently delivering quality care, but will struggle to build cross-system histories and interventions for the chronically and acutely ill among their new populations.
- Achieving patient-centered medical home recognition will take a back seat to acting like a medical home – As ACOs and pay-for-performance contracts become de rigueur for many practices, acting like a true medical home will be table stakes. A stamp of approval from the National Committee for Quality Assurance will be a goal, but not an end in and of itself like it is in many healthcare transformation programs.
What health IT challenges do you foresee?
February 5, 2014 1:05 PM
Posted by: adelvecchio
Protected health information, secure text messaging
Guest post by Cliff McClintick, COO, Doc Halo
Physicians have long made the decisions on when and how to communicate with patients. For test results, follow-up questions or words of advice, patients have had to wait for a callback — or make another appointment.
That model is quickly changing among forward-thinking clinicians who are increasingly communicating in ways that respect patients’ fast-paced, technology-dependent lifestyles. In many cases, that means the real-time dialogue of patient and physician texting.
Of course, communicating with patients brings specific privacy considerations, and that’s a major reason why physicians have been cautious about adopting new methods. State-of-the-art systems such as the secure texting technology offered by Doc Halo, however, allow healthcare providers to connect with patients by text message while complying with HIPAA regulations on protected health information (PHI).
Communicating on patients’ terms is particularly important for doctors who work with younger populations. Consider this: For teens and 20-somethings, phone calls feel archaic. To Millennials, those born in 1981 or later, calling without emailing first can seem inconsiderate, as the Wall Street Journal noted last year.
Millennials and the generation after them have spent their entire lives communicating by text and instant message. Researchers last year found that U.S. smartphone users ages 18 to 24 sent an average of 67 texts per day.
Last year NBC News called patient-physician texting “the new age housecall.” Their story profiled doctors who see taking questions and sending follow-up messages by text as a way to form closer connections with patients, and work more efficiently and effectively.
Of course, texting is just as natural for many physicians as it is for their patients. Among new physicians, smartphone usage is roughly 100%, and texting is ubiquitous.
Texting is particularly popular among physicians who have moved to a concierge or boutique practice model. Such physicians typically collect membership fees, make themselves available for messages and calls nearly all the time and forgo dealing with insurance companies.
The old methods of communicating — convenient for physicians, but far less so for the people they serve — still dominate in most practices. Still, connecting electronically is hardly uncommon, with a third of all practices communicating with patients by text or email, according to a study from the Deloitte Center for Health Solutions.
Unfortunately, many of the fears that physicians have about texting with patients are based in fact. Hitting “send” on a typical texting application leaves the message, and any PHI it may contain, unsecured. That means the healthcare provider is exposed to potential HIPAA-related fines.
There’s an easy solution. An effective secure texting application includes features such as encryption, automatic log-out after a period of inactivity and a remote wipe option if the phone is lost. Doc Halo, in fact, includes numerous other security features, including an exclusive provider verification process and two-step verification for lost passwords.
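As a sketch of the automatic log-out feature, the snippet below tracks the time of last user activity and reports the session expired once an inactivity window elapses. The five-minute timeout and the API are assumptions for illustration, not Doc Halo's implementation:

```python
# Hypothetical sketch of inactivity-based auto log-out for a secure
# messaging app. Timeout value and API are illustrative assumptions.

import time

class InactivitySession:
    def __init__(self, timeout_seconds=300, clock=time.monotonic):
        self.timeout = timeout_seconds
        self.clock = clock               # injectable for testing
        self.last_activity = clock()

    def touch(self):
        """Record user activity, resetting the inactivity timer."""
        self.last_activity = self.clock()

    def is_active(self):
        """False once the inactivity window has elapsed; the app should
        then force re-authentication before showing any PHI."""
        return (self.clock() - self.last_activity) < self.timeout

# A simulated clock lets us verify expiry without waiting five minutes.
now = [0.0]
session = InactivitySession(timeout_seconds=300, clock=lambda: now[0])
now[0] = 299.0
active_before = session.is_active()   # still inside the window
now[0] = 301.0
active_after = session.is_active()    # expired: force log-out
```

Injecting the clock keeps the timeout logic testable; in production the default monotonic clock is used so the timer survives wall-clock adjustments.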
Patients today expect quick, convenient access to their physicians via modern communication methods such as texting. With the right tools, it’s possible to give them just that.
Cliff McClintick is Chief Operating Officer for Doc Halo, where he specializes in healthcare real-time communications and HIPAA-compliant secure texting. He is a former Chief Information Officer of an inpatient hospital and has expertise in HIPAA compliance and security, clinical informatics, and meaningful use. He has more than 20 years of information technology design, management, and implementation experience.
Cliff operated his own consulting company for many years and has management experience in very large systems and implementation projects. He has successfully implemented large systems and applications for companies such as Children’s Hospital, Procter & Gamble, Fidelity, General Motors, Duke Energy, Heinz, IAMS, and Great American Life Insurance Company.
January 29, 2014 12:46 PM
Posted by: adelvecchio
mHealth devices, Mobile devices, mobile health
Guest post by Tracy Crowe, director of product marketing, NetMotion Wireless
The tablets healthcare providers are prescribing for themselves these days are not the medicinal kind; rather, they are of the mobile computing variety, according to a new survey released by Seattle-based mobile software provider NetMotion Wireless.
In analyzing responses from nearly 200 healthcare executives, the Healthcare Goes Mobile survey revealed that tablets and smartphones are becoming imperative tools for hospitals and home health agencies, in what is becoming an increasingly mobile environment. On the flip side, the survey showed that unresolved mobile security and connectivity issues remain significant concerns that must be overcome through software solutions and best practices.
According to the survey, a vast majority of respondents (92%) said mobile deployments were either extremely or very important to the day-to-day operations of their businesses and how they deliver services to patients.
Among the respondents, 88% said their clinicians currently use laptops, while 64% utilize smartphones and 62% employ tablets.
Moreover, 85% said they were contemplating deploying different mobile devices than what they are currently using across their organizations. Tablets were the most popular choice, with 66% of healthcare providers thinking about deploying them, followed by smartphones at 51%.
This growing wave of interest in mHealth can be attributed to a combination of factors, including the ubiquity and improved processing speed of mobile devices and the wireless networks on which they run. In addition, the majority of today’s workforce is comfortable using mobile devices and recognizes their benefits.
For instance, many physicians regard tablets as an ideal device on which they can transmit orders and view patient data while on the move. Home caregivers rely on tablets and smartphones while making house calls in order to deliver more efficient quality care. Meanwhile, the home health agencies that hire these caregivers can manage their employees from a centralized location via these same mobile devices, along with the help of a mobile virtual private network such as NetMotion’s Mobility solution.
When selecting what specific device to give clinicians, 54% of survey-takers said a significant factor in their decision was the number and type of applications that each device supports. In addition, 46% pointed to compatibility with existing corporate infrastructure as a major driver behind their decision. Cost (43%) and end user requests (31%) were third and fourth on the list, respectively.
Among the respondents, 65% had Windows as their primary operating system, 16% had Apple iOS and 8% had Android. Windows remains the predominant operating system among laptops, while Android and Apple control the smartphone and tablet markets. This creates software incompatibility issues between disparate devices. Consequently, experts believe more healthcare providers could turn to cloud-based applications that are less reliant on a specific operating system.
But incompatibility is far from the only concern surrounding mHealth. Slow network connectivity was the top complaint for 45% of respondents; another 24% said the top problem was dropped connections.
This is especially a concern for home healthcare providers who typically visit patient homes with spotty and unreliable 3G, 4G or LTE cellular coverage. Inconsistent network connectivity can increase the possibility of data loss and reduce productivity, and can make connections with the corporate office difficult.
Security issues are another huge concern — one that stems from uncontrolled use of wireless technology. Such issues compromise network security and put patient data at risk, leading to costly HIPAA violations. Half of survey respondents who manage mobile deployments cited patient security as their top concern and an additional 24% ranked it as their second biggest concern.
Security risks are heightened further when employees use their own personal mobile device. According to the survey, 48% of hospitals and health systems use a mix of personal and corporate devices, although a pure bring your own device (BYOD) policy is rare. Only 7% of respondents have one. In addition, 71% of home health agencies have their corporate IT departments issue mobile devices, with 7% instituting a BYOD policy.
Ultimately, developing a company-wide policy that governs BYOD use is critical to ensure data security. Many experts agree that any personal mobile device used for business must at least have password protection, data encryption and the ability to remotely wipe the device clean if it’s lost or stolen.
A good policy also enforces HIPAA privacy rules by controlling how staff can take and store patient images. For instance, any video or photo of an injury or wound must be free of patient-identifying information, and employees should promptly delete images once they are no longer needed.
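The minimum device requirements above can be expressed as a simple compliance check, sketched below. The attribute names are hypothetical; a real mobile device management (MDM) platform exposes richer device state:

```python
# Hypothetical sketch of the minimum BYOD checks named above: password
# protection, data encryption, and remote-wipe capability. Attribute
# names are illustrative assumptions, not a real MDM schema.

REQUIRED = ("passcode_set", "storage_encrypted", "remote_wipe_enabled")

def compliance_gaps(device):
    """Return the policy requirements a personal device fails to meet.
    An empty list means the device may access corporate systems."""
    return [req for req in REQUIRED if not device.get(req, False)]

device = {
    "owner": "jsmith",
    "passcode_set": True,
    "storage_encrypted": False,     # fails the encryption requirement
    "remote_wipe_enabled": True,
}
gaps = compliance_gaps(device)      # ["storage_encrypted"]
```

Treating unknown attributes as failures (the `False` default) errs on the side of blocking access, which is the safer posture when PHI is at stake.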
The home health industry is clearly enjoying the benefits of mobility while doing its best to manage the challenges of extending tablets and smartphones to hospital and home health workers. This survey confirms that solutions focusing on connectivity, visibility and control of mobile devices are essential to more productive mobile deployments in healthcare.
January 15, 2014 2:57 PM
Posted by: adelvecchio
Data loss, health IT, IT as a Service, Protected health information
Guest post by Roberta Katz, director, healthcare solutions, EMC, @Roberta_Katz
Healthcare providers are becoming increasingly reliant on electronic health records and are operating in a complex landscape of governance, security, and availability. As a result, trusted IT has become a necessity for safeguarding individually identifiable healthcare information and protected health information (PHI).
Trusted health IT solutions typically include advanced security, integrated backup and recovery, and continuous availability for the supporting IT architecture. Providers face the unique challenge of keeping PHI highly available, secure, and private as they increase the use of technology to improve patient care delivery.
Security breaches — whether the data is kept on physical IT assets or in a private cloud — can create a lack of confidence in a healthcare system and have significant regulatory implications. Although many healthcare organizations plan to conduct a HIPAA security risk assessment, which is a core requirement of stage 2 of the meaningful use incentive program, there is more work to be done.
To examine these issues, EMC Corp. recently completed the 2013 Global IT Trust Curve Survey of 283 healthcare IT executives. The results highlight that providers continue to struggle with unplanned downtime, security breaches, and data loss. In the last 12 months, 40% of respondents reported an unplanned outage of some kind. On average, these organizations lost 57 hours to unplanned downtime, incurring an estimated loss of $432,000 per outage. Additionally, the study found:
- 61% of the global healthcare organizations surveyed experienced a security-related incident (a security breach, data loss, or unplanned downtime) at least once in the past 12 months.
- Nearly one in five (19%) experienced a security breach in the last 12 months at an average financial loss of $810,189.
- More than one in four (28%) experienced data loss in the past 12 months at an average financial loss of $807,571 per incident.
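The survey figures above translate into a useful back-of-the-envelope calculation. The arithmetic below treats the reported incident probabilities and average losses as additive, which is a rough simplification (incidents overlap in practice), but it makes the scale of the exposure concrete:

```python
# Back-of-the-envelope arithmetic using the survey figures quoted above:
# cost per hour of downtime, and a rough expected annual exposure.
# Treating the three incident types as independent and additive is a
# simplification for illustration only.

DOWNTIME_LOSS = 432_000        # average loss per unplanned outage
DOWNTIME_HOURS = 57            # average hours lost per affected organization

P_BREACH, BREACH_LOSS = 0.19, 810_189
P_DATA_LOSS, DATA_LOSS_COST = 0.28, 807_571
P_OUTAGE = 0.40

loss_per_hour = DOWNTIME_LOSS / DOWNTIME_HOURS
expected_exposure = (P_BREACH * BREACH_LOSS
                     + P_DATA_LOSS * DATA_LOSS_COST
                     + P_OUTAGE * DOWNTIME_LOSS)

print(f"~${loss_per_hour:,.0f} per hour of downtime")
print(f"~${expected_exposure:,.0f} rough expected annual exposure")
```

That works out to roughly $7,600 per hour of downtime, and an expected annual exposure well over half a million dollars for an average organization facing these odds.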
Failing to invest in trusted health IT solutions to protect data and ensure a reliable, highly available network incurs real, quantifiable costs to the healthcare system. In addition to the financial implications, inefficient IT architecture can slow the transition many organizations are making as they deploy IT as a Service (ITaaS) models and seek to deliver IT solutions to other organizations in their networks. ITaaS models help organizations increase agility, accelerate deployment of key healthcare applications, and lower costs.
Healthcare organizations have until June 30, 2014, to comply with HITECH Act PHI privacy and security requirements or sacrifice significant federal funding in the form of meaningful use incentives. Failure to meet meaningful use stage 1 requirements by 2015 will result in a penalty on CMS reimbursements, starting at 1% and increasing annually.
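Taking the stated rule at face value (a penalty starting at 1% in 2015 and rising by one percentage point each year; the actual CMS schedule caps the adjustment), the cost of missing meaningful use can be sketched as follows, with an illustrative reimbursement figure:

```python
# Sketch of the reimbursement penalty described above: 1% in 2015,
# increasing by one percentage point per year. The real CMS schedule
# caps the adjustment; this models only the rule as stated.

def penalty_rate(year: int, start_year: int = 2015) -> float:
    """Penalty as a fraction of CMS reimbursement for a given year."""
    if year < start_year:
        return 0.0
    return 0.01 * (year - start_year + 1)

# A hypothetical practice with $1M in annual CMS reimbursements would forfeit:
forfeited_2016 = 1_000_000 * penalty_rate(2016)   # 2% -> $20,000
```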
Healthcare organizations are encouraged to take a holistic view of security management by adopting an integrated approach to governance, risk, and compliance. To align security activities for maximum protection across the enterprise, I suggest building into your IT infrastructure a security management framework composed of:
- Business governance — Embedding security into all organizational structures and processes while taking regulatory requirements (HIPAA, HITECH) and internal policies into account.
- Security risk management — Identifying and classifying information risks and tracking risk mitigation.
- Operations management — Implementing security processes and controls in line with current security policies to prevent risks from developing into security incidents.
- Incident management — Detecting, analyzing, resolving, and reporting security incidents to minimize their impact.
These strategies can be implemented using a phased approach. Investing in a secure and reliable IT architecture increases trust in IT while improving patient care delivery.
Roberta Katz is the director of healthcare solutions for EMC.
January 15, 2014 1:37 PM
Posted by: adelvecchio
SearchHealthIT has compiled a list of healthcare IT events for the upcoming year.
Know of an event that’s not included below? Suggest it in a comment and we’ll add it to the list!
The Digital Health Summit (at 2014 International CES)
January 7-10 * Las Vegas, NV
iHT2 CMIO Forum in Phoenix
January 21-22 * Phoenix, AZ
IHE North America Connectathon 2014
January 27-31 * Chicago, IL
eHealth Initiative Annual Conference
January 28-29 * Championsgate, FL
Managed Care Compliance Conference
February 9-11 * Scottsdale, AZ
National Quality Forum Annual Conference
February 13-14 * Washington, D.C.
HIMSS14 Conference and Exhibition
February 23-27 * Orlando, FL
Audit & Compliance Committee Conference
February 24-25 * Scottsdale, AZ
Mobile World Congress 2014
February 24-27 * Barcelona, Spain
SAS Global Forum
March 23-26 * Washington, D.C.
iHT2 Health IT Summit in San Francisco
March 25-26 * San Francisco, CA
Annual Compliance Institute
March 30-April 2 * San Diego, CA
2014 State Healthcare IT Connect Summit
April 1-2 * Baltimore, MD
iHT2 Health IT Summit in Atlanta
April 15-16 * Atlanta, GA
2014 ICD-10-CM/PCS and Computer-Assisted Coding Summit
April 22-23 * Washington, D.C.
Medical Informatics World Conference
April 28-29 * Boston, MA
Bio-IT World Conference & Expo
April 29-May 1 * Boston, MA
AthenaHealth User Conference
April 29-May 1 * Boston, MA
Public Health Informatics Conference
April 29-May 1 * Atlanta, GA
Beyond EHRs: Information Exchange and Your Practice
May 2 * Buffalo, NY
HL7 International Working Group Meeting
May 4-9 * Phoenix, AZ
WEDI National Conference
May 12-15 * Hollywood, CA
iHT2 Health IT Summit in Boston
May 13-14 * Boston, MA
The CIO Healthcare Summit
May 13-14 * Dallas, TX
National Healthcare Innovation Summit
May 13-15 * Boston, MA
National Health Insurance Exchange Summit
May 14-16 * Washington, D.C.
SearchHealthIT Leadership Dinner
May 15 * Boston, MA
ATA 18th Annual International Meeting & Expo
May 17-20 * Baltimore, MD
MIT Sloan CIO Symposium on Health IT
May 21 * Cambridge, MA
Health Data Initiative: Health DataPalooza
June 1-3 * Washington, D.C.
e-Health Canada 2014
June 1-4 * Vancouver, BC, Canada
Research Compliance Conference
June 1-4 * Austin, TX
International Summit on the Future of Health Privacy
June 4-5 * Washington, D.C.
iHT2 Health IT Summit in Chicago
June 10-11 * Chicago, IL
DIA 2014: 50th Annual Meeting
June 15-19 * San Diego, CA
Government Health IT Conference & Exhibition
June 17-18 * Washington, D.C.
HTRAC East 2014
June 22-24 * Leesburg, VA
iHT2 Health IT Summit in Denver
July 22-23 * Denver, CO
mHealth + Telehealth World
July 22-24 * Boston, MA
The World Congress mHealth and Telehealth World 2014
July 23-25 * Boston, MA
iHT2 Health IT Summit in Seattle
August 19-20 * Seattle, WA
National Health IT Week
September 15-19 * Nationwide
iHT2 Health IT Summit in New York City
September 16-17 * New York, NY
HIMSS Policy Summit
September 17-18 * Washington, D.C.
Big Data in Pharma
September 17-18 * Boston, MA
Health 2.0 Eighth Annual Fall Conference
September 21-24 * Santa Clara, CA
AHIMA Convention & Exhibit
September 27-October 2 * San Diego, CA
Partners HealthCare's Connected Health Symposium
October 23-24 * Boston, MA
SearchHealthIT Leadership Dinner
November 13 * Santa Ana, CA
January 8, 2014 12:01 PM
Posted by: adelvecchio
predictive modeling
Guest post by Jerred Ruble, president and CEO, TeamQuest
The recent launch of HealthCare.gov was met with more problems than anyone in the White House could have expected. Yet while the fallout was a surprise, the root causes turned out to be fairly straightforward. So how did we get here?
What many in the technology industry understood immediately was that HealthCare.gov isn't simply a website that hosts static information. It is meant to be an application, one that relies on a wide range of interconnected systems and data sources to provide timely, accurate data that is personalized, to a certain degree, for each user.
The myriad issues facing HealthCare.gov have largely been traced to poor planning, specifically a lack of effective capacity planning: the process of ensuring the underlying systems have the resources to handle the influx of Americans wanting to sign up. When the site went live, millions of uninsured Americans rushed to it, only to be met with long wait times, error messages, and more headache than relief. It has been reported that at some points in the days immediately following the launch, 40,000 people sat in virtual "waiting rooms" because capacity had been reached.
In real estate, there’s a well-known mantra when it comes to selecting a home — “location, location, location.” When developing applications, there’s a similar thought process — “resources, resources, resources.” Specifically, the computing resources – the memory, storage, CPU cycles, bandwidth, etc. — required to ensure performance, stability and a positive customer experience. This is capacity planning at its core — the proactive approach of identifying the right amount of resources required to meet service demands now and in the future. It is a proactive discipline that allows IT to plan effectively and avoid surprise outages or performance degradations.
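At its most basic, the "resources, resources, resources" calculation can be sketched with a Little's-law-style sizing formula: offered load (request rate times time per request) divided by a utilization ceiling. The numbers below are illustrative, not HealthCare.gov's actual figures.

```python
# Capacity planning at its simplest: given an expected request rate and the
# average time each request holds a server, estimate how many servers keep
# utilization under a target. Numbers are illustrative only.
import math

def servers_needed(requests_per_sec: float,
                   service_time_sec: float,
                   target_utilization: float = 0.7) -> int:
    """Little's-law-style sizing: offered load / utilization ceiling."""
    offered_load = requests_per_sec * service_time_sec  # servers busy on average
    return math.ceil(offered_load / target_utilization)

# 5,000 requests/sec at 200 ms each, keeping servers below 70% busy:
servers_needed(5000, 0.2)   # -> 1429
```

The utilization ceiling is the key design choice: running servers near 100% busy leaves no headroom for bursts, which is exactly the failure mode the launch exposed.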
Let this be a lesson
Given the dynamic nature of Web-based systems and user requests, it’s hard — but not impossible — to predict the future with regard to necessary resource allocation.
Predictive modeling, a tool employed by innovative application developers and IT teams alike, can also be a marketing differentiator: it allows providers to separate themselves from competitors through a better customer experience, pricing, and claims servicing. By recommending future resource needs in advance, it helps providers avoid costly outages and performance degradations. It is also more efficient than the common fallback of overprovisioning, and as the HealthCare.gov launch has shown, underprovisioning is the outcome you most want to avoid.
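In its simplest form, predicting future resource needs means fitting a trend to observed load and projecting it forward so capacity is added before demand arrives. The sketch below uses an ordinary least-squares line; the data points are invented for illustration.

```python
# Minimal sketch of predictive capacity modeling: fit a linear trend to
# observed peak daily load and project it forward. Data is invented.

def linear_forecast(history: list, periods_ahead: int) -> float:
    """Ordinary least-squares trend line, evaluated beyond the data."""
    n = len(history)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(history) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, history))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    return intercept + slope * (n - 1 + periods_ahead)

# Peak concurrent users over six days; project three days out:
peaks = [900, 1150, 1400, 1600, 1900, 2100]
linear_forecast(peaks, 3)
```

Real capacity models account for seasonality, bursts, and nonlinear growth, but even this simple projection beats reacting after users hit a wall.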
Insurers are expected to improve their data and information management capabilities and adopt more sophisticated best practices to streamline their processes and become more proactive. Many will look to IT to lower costs and raise quality. Going forward, enhancing the customer experience will be critical so providers aren't left to compete on price alone, and that starts with making sure systems are available and performing as expected.
Jerred Ruble has more than 30 years of software and management experience, and his passion and leadership have become hallmarks of his tenure at TeamQuest. Prior to becoming president and CEO, he served as vice president of engineering. He is one of the original partners who founded TeamQuest in 1991.
December 31, 2013 12:45 PM
Posted by: adelvecchio
CommonWell Health Alliance, public HIE
Guest post by Greg Chittim, senior director, Arcadia Healthcare Solutions
State, regional, and local commercial HIEs are reaching a tipping point. Some are succeeding wildly and the regular exchange of data is becoming a core part of their healthcare IT landscape. Others are failing due to lack of participation caused by unclear value propositions and returns on investment. In the latter case, data standards are often the key stumbling block preventing effective and efficient HIEs. More specifically, the problem is a lack of viable and implementable data standards.
How have some HIEs overcome the data standards issue while others have struggled? The most successful HIEs are those that built up a foundation of quality data early on. They used that data to overcome the chicken-and-egg problem of encouraging adoption when there isn't yet enough value (in the form of data about specific physicians' patients) to justify signing up. Those with the smallest hill to climb are HIEs built around a dominant health system on a single EHR platform, or around a health plan whose claims data covers a majority of the market. In these cases, a valuable data asset can be built up quickly; additional stakeholders then have a fixed target for a predictable set of interfaces to fill in the data not covered by the dominant player.
But what about those statewide or regional public HIEs that do not have the luxury of a dominant commercial player supporting the baseline? In these scenarios, there are a number of national efforts underway to unite government and commercial entities to build viable data standards. The Consolidated CDA is one potential solution, but that requires EHR vendors with divergent strategies and fierce competitive tendencies to implement a shared standard. This has proven a challenge, especially given the lack of clarity in meaningful use measures as to how continuity of care documents must be formatted and what vocabularies must be used. The unfortunate tendency has been for EHR vendors to work in their own silos, doing what is best for their particular market segments, and taking a “wait and see” approach to more specific national standards.
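To make the stakes concrete: even reading a single coded entry out of a Consolidated CDA document requires agreement on XML structure, templates, and vocabularies (SNOMED CT, LOINC, and so on). The sketch below uses a deliberately simplified fragment; real C-CDA nesting is far deeper and template-driven, which is exactly why divergent vendor implementations cause so much trouble.

```python
# Simplified illustration of pulling a coded observation out of a CDA-style
# XML document. Real C-CDA structure is much deeper; this shows only the
# namespace and vocabulary mechanics.
import xml.etree.ElementTree as ET

NS = {"hl7": "urn:hl7-org:v3"}   # the HL7 v3 namespace used by CDA documents

SAMPLE = """<ClinicalDocument xmlns="urn:hl7-org:v3">
  <observation>
    <code code="44054006" codeSystem="2.16.840.1.113883.6.96"
          displayName="Diabetes mellitus type 2"/>
  </observation>
</ClinicalDocument>"""

def coded_entries(xml_text: str):
    """Return (code, codeSystem OID, display name) for each coded observation."""
    root = ET.fromstring(xml_text)
    return [(c.get("code"), c.get("codeSystem"), c.get("displayName"))
            for c in root.findall(".//hl7:observation/hl7:code", NS)]

coded_entries(SAMPLE)   # one SNOMED CT-coded problem entry
```

The `codeSystem` OID (here, SNOMED CT's) is what makes the entry unambiguous across vendors; when vendors disagree on which vocabularies to use, this mapping work falls to the HIE.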
Emerging groups such as the CommonWell Health Alliance present a very real opportunity for a vendor-driven set of standards. If vendors (Epic in particular) can find a way to cooperate with the rest of the industry, at least on the subject of data standards, the alliance could produce a strong solution. In the meantime, providers that are truly EHR vendor agnostic and can efficiently map, integrate, and report on data from many different vendors will be critical in the absence of universal data standards. That investment has proven difficult to sustain for public HIEs, whose stakeholders are reluctant to fund standards work but eager to fund real data and analytics.
The long-term solution remains to be seen. However, the private successes point to an answer for struggling public HIEs: build up a baseline of key data by whatever means possible, even for a limited use case, then provide incentives (or, when necessary, penalties) to comply with a well-defined, detailed standard that can be met either in part (for specific use cases) or in whole (for a complete solution), with minimal room for interpretation.