CAMBRIDGE, MD. -- When CIOs think "EHR usability testing," chances are they're imagining armies of people hired by vendors in the alpha testing phase, long before an application ever reaches the hospital floor. Hardin-Simmons Health IT director Kate Warner argues hospitals should run their own tests, too.
Warner, who helps manage software rollouts for 49 hospitals nationwide, made the case for usability testing in a presentation to CIOs attending the HealthTech Council meeting. User testing -- even if it's rudimentary, with the tester observing the facial expressions of test participants and writing down their observations with pencil and paper -- is essential for hospitals to get clinician buy-in for the data-intensive initiatives required to drive new incentive and claims-reimbursement programs such as meaningful use and accountable care organizations (ACOs).
"One of the biggest challenges we face as we roll out new technology is the buy-in from the end user and the resistance to change; they resist change, they don't use the system and the benefits we thought we were going to gain we do not gain," Warner said.
Usability testing -- along with an EHR rollout team that frequently updates the whole facility on progress toward go-live and takes all staff suggestions and complaints seriously -- can get clinicians using the system more quickly. That adoption is what earns the return on investment health care providers need to justify the financial cost and workflow disruptions an EHR can bring.
Warner suggested that CIOs set up basic usability tests in a conference room, with one user and one monitor taking notes and making observations. She offered the following tips:
- First, when choosing test subjects, don't test a superuser or EHR champion from the rollout team. You're going to want a subject who is unfamiliar with the EHR and who represents daily users on the floor.
- Have the tester silently watch the user operate the software, without offering any assistance in the way of finding features or offering hints by pointing on the screen.
- Observe efficiency: How quickly can the user log in and get to work? How easily do they remember to log back in after pausing or stepping away?
- The tester should note where the subject runs into trouble or pauses -- which screens, what they were trying to do or find.
- Users will make errors in navigation, filling out data fields, hitting a wrong key, etc., Warner said. Testers should note where and when subjects make these errors, their severity and how simple or difficult it is to recover from them. If hitting a wrong key causes a blue screen, for example, or makes it otherwise difficult to get back to work, it will slow an application's adoption.
- Observe the body language of the user. Nonverbal behaviors that signal usability trouble and a product that might need modification (or the seeking out of another EHR vendor) include users putting cupped hands to their throats and pinching the bridges of their noses. A too-slow system manifests itself in impatient body language such as resting a palm under the chin in boredom, bouncing heels on the floor, drumming fingers or putting hands on the back of the neck.
- Rate the system overall on how easy it is to learn, and note whether collateral materials such as quick reference cards are needed to use the software. Consider those materials part of the usability test as well.
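The observation protocol above lends itself to lightweight structured note-taking. The sketch below is a hypothetical illustration (the names `Observation`, `Session`, and the severity scale are assumptions, not anything Warner prescribed) of how a tester might log trouble spots -- which screen, what the subject was trying to do, error severity, and whether they recovered unaided -- and tally them by screen for the rollout team:

```python
from dataclasses import dataclass, field
from collections import Counter

@dataclass
class Observation:
    """One pencil-and-paper-style note from a usability session."""
    screen: str             # which screen the subject was on
    task: str               # what they were trying to do or find
    kind: str               # "pause", "error", or "body_language"
    severity: int = 1       # 1 = minor slip, 3 = blocked the workflow
    recovered: bool = True  # could the subject get back to work unaided?

@dataclass
class Session:
    subject_role: str                    # e.g. a floor nurse, not a superuser
    notes: list = field(default_factory=list)

    def log(self, obs: Observation) -> None:
        self.notes.append(obs)

    def summary(self) -> dict:
        """Tally errors by screen so the team can prioritize fixes."""
        errors = [o for o in self.notes if o.kind == "error"]
        return {
            "errors_by_screen": dict(Counter(o.screen for o in errors)),
            "unrecovered": sum(1 for o in errors if not o.recovered),
            "worst_severity": max((o.severity for o in errors), default=0),
        }

session = Session(subject_role="floor nurse")
session.log(Observation("login", "log back in after pausing", "pause"))
session.log(Observation("med orders", "fill dosage field", "error",
                        severity=3, recovered=False))
session.log(Observation("med orders", "hit wrong key", "error", severity=2))
print(session.summary())
```

Even a tally this simple surfaces the pattern the tips describe: screens where errors cluster, and errors the subject could not recover from, are the ones that will slow adoption.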
Warner made the case for simpler applications. "If you have a complex system that's going to require binders of help and [additional] online help, that will certainly influence the end users' willingness to use the system," she said. "They won't want to use it if it's too hard to learn or they went to training and can't remember it."
The EHR usability test results are in. Now what?
Ideally, Warner said, usability testing is an iterative process. A facility would have the luxury of tweaking an application between tests, retesting, fixing it again, and so maximizing buy-in and, with it, the return on investment in an EHR system.
In the real world, time and cost prohibit that -- but usability testing is still worth the effort. When a hospital is the software buyer rather than the developer, negative user test results can create leverage in negotiating fixes and features with the vendor; justify evaluating other vendors, if the testing is done early enough in the process; or, if push comes to shove, support negotiating a lower purchase price for the EHR system.
Publishing the results of the testing online, sharing them with the rollout team, and publishing the minutes of evaluation meetings will show everyone in the organization what you're doing and how you're making efforts to correct deficiencies. Furthermore, it helps get executives such as the CEO involved in allocating resources to rectify problems, especially if your EHR rollout team recruits them early in the process.
Chances are CEOs worth their salt will be involved in an enterprise-critical, workflow-changing, expensive decision such as an EHR rollout, right? Even if they're not heavily involved, Warner said, emails or bulletin board notices ghostwritten by the rollout team and signed by the CEO can help influence buy-in for what will be an imperfect system anyway.
Warner's colleague Robert Steele, Tenet senior director of applied clinical informatics, said that while his group works with clinical applications, interoperability is rapidly becoming more important as those applications interact with other systems on the network. Tenet's clinical informatics department has expanded from 40 to 400 people in the last six years.
"We rapidly found out that, while clinical informatics is our area and our responsibility, there needs to be a clearinghouse for anybody putting anything else in," Steele said. For example, how a radiology point-of-care testing system will work with back-end billing and coding software that isn't a clinical information system, per se. "We kind of need to know about it, because it's going to affect us and our main systems. We've rapidly become the eyeballs for the initial assessment of what pieces are touching what pieces [on the network]."
This was first published in November 2012