Since the introduction of EHRs, physicians have relied on free-text narrative to capture patient information during a care episode. Despite the availability of clinical forms within most EHRs, many physicians still record information as unstructured text. But unstructured data presents hospitals with a major challenge: the inability to mine patient information for insights. Fortunately, natural language processing can extract meaningful insights from unstructured data.
In 2018, Atrius Health adopted NLP to detect at-risk patients by analyzing unstructured patient data against clinical criteria, specific keywords and text patterns. NLP was able to pinpoint patients at higher risk of geriatric syndromes, such as lack of social support and walking difficulty. This highlights the value healthcare NLP brings: It eliminates the need for clinical staff to read through medical notes, which is time-consuming and costly. NLP automated that process and reduced what could take days to a matter of minutes.
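The keyword-and-pattern approach described above can be sketched in a few lines. This is a minimal illustration, not Atrius Health's actual system; the risk categories and regular expressions below are hypothetical stand-ins for real clinical criteria.

```python
import re

# Hypothetical clinical criteria: phrases that may signal geriatric
# syndromes such as poor social support or gait problems.
RISK_PATTERNS = {
    "lack of social support": re.compile(
        r"\blives alone\b|\bno family nearby\b", re.IGNORECASE),
    "walking difficulty": re.compile(
        r"\bdifficulty walking\b|\bunsteady gait\b|\buses? a walker\b",
        re.IGNORECASE),
}

def flag_at_risk(note: str) -> list[str]:
    """Return the risk categories whose patterns appear in a clinical note."""
    return [risk for risk, pattern in RISK_PATTERNS.items()
            if pattern.search(note)]

note = "82 y/o patient lives alone; reports difficulty walking to the mailbox."
print(flag_at_risk(note))  # flags both risk categories
```

A production system would go well beyond literal patterns, handling negation ("denies difficulty walking"), abbreviations and misspellings, but the core idea of matching clinical text against codified criteria is the same.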
Healthcare NLP doesn't stop at identifying text with certain keywords. There are several use cases in which NLP can extract valuable insights from unstructured patient data.
Detecting possible mental health issues from social media: There has been a big push by clinicians in recent years to monitor their patients' mental health. Healthcare NLP can detect early signs of mental illness by monitoring patients' social media posts and alerting a mental health professional. NLP accomplishes this by performing sentiment analysis and detecting quantifiable warning signs, such as language associated with suicide risk.
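To make the sentiment-analysis idea concrete, here is a deliberately simplified lexicon-based sketch. Real systems use trained models rather than hand-picked word lists, and the words and threshold below are illustrative assumptions only.

```python
# Toy sentiment lexicons -- a real system would use a trained classifier.
NEGATIVE = {"hopeless", "worthless", "alone", "exhausted", "trapped"}
POSITIVE = {"grateful", "hopeful", "happy", "better"}

def sentiment_score(post: str) -> int:
    """Positive words add 1, negative words subtract 1."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def needs_review(posts: list[str], threshold: int = -2) -> bool:
    """Flag an account for clinician review if cumulative sentiment is very negative."""
    return sum(sentiment_score(p) for p in posts) <= threshold

posts = ["feeling hopeless and alone", "so exhausted lately"]
print(needs_review(posts))  # cumulative score -3, below the -2 threshold
```

The point is the pipeline shape: score each post, aggregate over time, and route a threshold-crossing account to a human professional rather than acting on it automatically.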
Extracting data from clinical documentation: Healthcare NLP can also locate and extract text from clinical documentation, such as physician's notes and dictations. Healthcare organizations can use this feature to automate the extraction of specific data elements from clinical notes, such as prescriptions and medical history. NLP can also indicate the absence of information that should be included in clinical reports. Radiology firm RadNet used this functionality in its radiology information system to analyze radiology reports that did not include CMS-mandated quality measures.
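Both halves of this use case, pulling structured elements out of free text and flagging what is missing, can be sketched with simple pattern rules. This is not RadNet's implementation; the medication pattern and required report sections below are hypothetical examples.

```python
import re

# Hypothetical extraction rule: capture "<Drug> <dose>mg" mentions in a note.
MED_PATTERN = re.compile(r"\b([A-Z][a-z]+)\s+(\d+\s?mg)\b")

def extract_medications(note: str) -> list[tuple[str, str]]:
    """Return (drug, dose) pairs found in free-text clinical documentation."""
    return MED_PATTERN.findall(note)

# Hypothetical quality measure: sections a complete report must contain.
REQUIRED_SECTIONS = ("findings", "impression")

def missing_sections(report: str) -> list[str]:
    """List required sections absent from a report, for quality review."""
    text = report.lower()
    return [s for s in REQUIRED_SECTIONS if s not in text]

note = "Continue Lisinopril 10mg daily. Started Metformin 500 mg with meals."
print(extract_medications(note))
print(missing_sections("FINDINGS: Lungs are clear."))  # no impression section
```

Production-grade clinical NLP relies on medical vocabularies and statistical models rather than one regex, but the input/output contract, free text in, discrete data elements and completeness flags out, matches what the paragraph describes.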
Converting voice to clinical content: Another use of NLP and natural language understanding is what M*Modal is doing with its Catalyst platform. The company uses AI to assist physicians by converting their spoken words in real time into dictated notes, along with discrete data elements -- prescriptions, ICD-10 (International Statistical Classification of Diseases and Related Health Problems) codes, procedures, etc. -- populated directly into the appropriate section of the medical chart. This eliminates the need for medical scribes to extract the clinical content manually.
Double-checking insurance claims: When physicians complete their patient encounters, the next stop is usually the billing or coding team. Their responsibility is to review the clinical notes and procedure codes to ensure that the physician properly documented the necessary information. Failure to include the appropriate clinical content will result in the denial of the medical claim. 3M provides a billing solution called 360 Encompass that uses NLP to detect mismatches of clinical content and diagnostic or procedure codes that cause claim denials. This reduces claim rejections and automates what was mainly a human process.
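The mismatch check behind this kind of billing review can be sketched as a lookup from billed codes to the supporting language a payer expects in the note. This is an assumption-laden illustration, not 3M's 360 Encompass logic; the code-to-evidence mapping below is hypothetical, though I10 and J45.909 are real ICD-10 codes (essential hypertension and unspecified asthma).

```python
# Hypothetical mapping: documentation keywords expected for each billed code.
REQUIRED_EVIDENCE = {
    "J45.909": ["asthma"],                          # unspecified asthma
    "I10": ["hypertension", "blood pressure"],      # essential hypertension
}

def find_mismatches(note: str, codes: list[str]) -> list[str]:
    """Return billed codes with no supporting keyword in the clinical note."""
    text = note.lower()
    return [code for code in codes
            if not any(term in text for term in REQUIRED_EVIDENCE.get(code, []))]

note = "Patient seen for asthma follow-up; inhaler refilled."
print(find_mismatches(note, ["J45.909", "I10"]))  # I10 has no supporting text
```

Flagging the unsupported code before submission lets the coding team fix the documentation up front instead of fighting a denial afterward, which is the automation the paragraph describes.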
AI-based tools offer hospitals new opportunities to automate processes that humans typically perform. Healthcare NLP has given rise to a new set of services and products that detect, analyze and extract information that was once locked in clinical notes. It has also created a new way for computers to understand human conversations. However, computer software is not 100% error-free, and NLP will require ongoing testing as more healthcare providers use it in patient care.