Posted by: RedaChouffani
EHR, mHealth, mHealth functionality, mobile health, Siri
A growing number of mobile technology and mHealth device users in the health care industry are recognizing the value of their connected health devices. Tablets and smartphones allow health care professionals to access patient information and care-specific content before, during and after a patient’s visit. Given their ease of use, light weight, long battery life and connectivity, many users are seeing real efficiencies in areas of their day-to-day workload.
Be that as it may, the reality is that these devices may not completely replace desktops or notebooks right away. For health care professionals, there is still a gap in the functionality currently available on mobile devices. For example, when documenting a patient’s chart, there are several methods by which to capture that information:
- Voice recognition tools
- Point-and-touch, easy-to-use note-generating functionality built into the EHR app
- Handwriting recognition features
- Utilization of a keyboard connected to the tablet
But physicians face a greater challenge than simply how to capture patient data in the system. Part of the difficulty with some of today’s applications and systems is how these products allow clinicians to review the information needed during care delivery. In most cases, when caring for patients, clinicians are constantly looking for information and test results such as labs, imaging and specific medical history data, as well as data from other departments and possibly other systems altogether. Most apps available today cannot easily extract and display information from such fragmented sources.
That said, what functional capabilities would make the most sense for physicians and justify the cost of, and investment in, the next generation of mHealth apps and mobile health technologies?
Timeline view of the patient record: Adapted from Facebook’s timeline concept, this would be a useful feature for physicians. It would collect and group related information into a timeline of the patient’s health history, displaying the different conditions and their progression over time. For patients with cancer and other life-threatening conditions, the timeline would allow the physician to view the progression of the disease over time and accurately plan treatment. The information would be collected from HIEs, PHRs, payer data, eRx, major lab companies and any other health data repositories.
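A minimal sketch of how such a timeline view might work under the hood: events pulled from multiple sources are merged, sorted by date, and grouped by condition. The field names, source labels and sample data here are purely illustrative assumptions, not any real schema or feed format.

```ruby
require 'date'

# Merge clinical events from hypothetical sources (HIE, PHR, lab feeds)
# into one chronological view, grouped by condition.
def build_timeline(events)
  events
    .sort_by { |e| e[:date] }        # chronological order first
    .group_by { |e| e[:condition] }  # then one timeline per condition
end

events = [
  { date: Date.new(2011, 3, 1),  source: 'HIE', condition: 'Diabetes',     detail: 'HbA1c 7.2%' },
  { date: Date.new(2010, 6, 15), source: 'Lab', condition: 'Diabetes',     detail: 'HbA1c 8.1%' },
  { date: Date.new(2011, 1, 5),  source: 'PHR', condition: 'Hypertension', detail: 'BP 142/90' }
]

timeline = build_timeline(events)
timeline.each do |condition, entries|
  puts condition
  entries.each { |e| puts "  #{e[:date]} [#{e[:source]}] #{e[:detail]}" }
end
```

The real engineering work, of course, would be in normalizing the fragmented source data into a common event shape before a grouping step this simple becomes possible.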
Contextual functionality: As any caregiver will tell you, certain conditions have common tests and order sets associated with them. So, a smart app capable of automatically filtering the order list, medication list and other content based on the patient’s condition and/or evidence-based data would be valuable. It would reduce the time a caregiver spends moving between different screens and areas of the app to get what they need for that patient.
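The filtering idea can be sketched in a few lines: a mapping from conditions to their commonly associated orders narrows a full catalog down to the relevant subset. The condition-to-order mapping and the order names below are invented for illustration; a real app would draw them from evidence-based order sets.

```ruby
# Hypothetical mapping of conditions to commonly associated orders.
ORDER_SETS = {
  'pneumonia' => ['Chest X-ray', 'Sputum culture', 'CBC'],
  'diabetes'  => ['HbA1c', 'Fasting glucose', 'Lipid panel']
}.freeze

# Surface only the catalog entries relevant to the patient's condition.
def contextual_orders(condition, catalog)
  relevant = ORDER_SETS.fetch(condition.downcase, [])
  catalog.select { |order| relevant.include?(order) }
end

catalog = ['Chest X-ray', 'HbA1c', 'MRI brain', 'CBC', 'Lipid panel']
puts contextual_orders('Pneumonia', catalog).inspect
```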
Leveraging connected content and social media integration: Several examples in the industry today showcase how more and more apps now come equipped with some sort of connectivity to one or more social media sites (LinkedIn, Twitter, Facebook and the like). You can upload your fitness stats to Facebook, send an article from your Twitter account and even post a video or photo directly from your phone. The next generation of health care apps will come with the capability to securely connect with HIEs, drug interaction databases (e.g., a next generation of Lexicomp), clinical support services and more.
Tag-like functionality: Similar to wikis and online keyword tagging, the next generation of apps needs to let caregivers review patient charts and jump to specific areas simply by clicking on tags in the document (#labs, #imaging, etc.).
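At its simplest, this is a matter of scanning a chart note for #tags and building an index the app can use for navigation. The note text below is made up for illustration:

```ruby
# Extract the unique wiki-style #tags from a free-text chart note.
def extract_tags(note)
  note.scan(/#(\w+)/).flatten.uniq
end

note = 'Follow-up visit. #labs pending from 3/2; #imaging shows improvement; repeat #labs in 6 weeks.'
puts extract_tags(note).inspect
```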
Natural language processing (NLP – think Siri): Siri, from the iPhone 4S, is no longer just a conversation piece that you ask to marry you for sheer entertainment. Many have begun to actively adopt Siri, and its functionality has been extended accordingly. In a recent experiment, I spent some time writing prototypes in Ruby, with web services on top of an EHR database. I was able to build a small sample that let me issue simple commands on my iPhone and have Siri return useful information – a proof of concept of how Siri could work in health care.
For example, some of the commands I used were “Siri, look up patient ID 12345,” “Siri, please tell me about the active medications that the patient is on,” “Siri, send a message to my assigned nurse and tell her the patient is ready for X-rays,” and even “Siri, when is my next surgery and where?” All of these commands seem futuristic at first, but the programmers who identified the Siri protocol and developed SiriProxy have opened the door to an incredible new way for clinicians and health care staff to interact with smartphones and patient data. The growing popularity of voice command technologies – Siri, Dragon voice command tools and IBM’s Watson among them – shows that future products will be able to support virtual assistants that perform tasks traditionally requiring several clicks and manual data entry.
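In the spirit of that Ruby prototype, the core of such a system is simply routing a transcribed command to the right EHR lookup. The sketch below is not the actual SiriProxy plugin API; the patient data, regex patterns and responses are all invented for illustration.

```ruby
# Toy in-memory "EHR" for the demo.
PATIENTS = {
  '12345' => { name: 'Jane Doe', medications: ['Metformin', 'Lisinopril'] }
}.freeze

# Each entry pairs a command pattern with a handler for its captures.
COMMANDS = [
  [/look up patient id (\d+)/i,
   ->(m) { PATIENTS.key?(m[1]) ? "Found patient #{PATIENTS[m[1]][:name]}" : 'No such patient' }],
  [/active medications for patient (\d+)/i,
   ->(m) { PATIENTS.fetch(m[1])[:medications].join(', ') }]
].freeze

# Try each pattern against the transcribed text; fall back politely.
def handle_command(text)
  COMMANDS.each do |pattern, action|
    m = pattern.match(text)
    return action.call(m) if m
  end
  "Sorry, I didn't understand that."
end

puts handle_command('Siri, look up patient ID 12345')
puts handle_command('List the active medications for patient 12345')
```

A production version would sit behind authenticated web services rather than an in-memory hash, and would need far more robust language handling, but the routing pattern is the same.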
Mobile health applications may be gaining traction and popularity, but several areas could still use improvement. As the products continue to evolve, health care will begin to see more incredible features and even more highly integrated, intelligent functionality as part of the next phase in mobility.