AI for Electronic Health Records (EHRs) and Electronic Medical Records (EMRs)

Niccolo Mejia

Niccolo is a content writer and Junior Analyst at Emerj, developing web content and helping with quantitative research. He holds a bachelor's degree in Writing, Literature, and Publishing from Emerson College.

There are several companies claiming to offer AI solutions to healthcare companies, as we’ve explored extensively in our past reports. In this particular report, we focus on AI as it pertains to working with electronic health records (EHRs) and electronic medical records (EMRs). AI vendors offer solutions to hospitals and clinics with a variety of functions.

Nuance Communications, known for their natural language processing-based transcription software, offers software that they claim updates EHRs through speech, for example. Linguamatics offers search software that they claim allows hospital staff to find information within their EHR and EMR databases. Cognizant claims to offer analytics software that takes data in part from EHRs and EMRs and assesses patient satisfaction with their stay. Similarly, Health Fidelity offers analytics software which they claim can help predict patient risk. Broadly, these functions fall into the following categories:

  • Medical Transcription
  • Document Search
  • Analytics

Medical Transcription

Nuance Communications

Nuance Communications offers software called Dragon Medical One, which it claims can help healthcare companies record patient medical experiences using natural language processing. It's likely that healthcare providers can integrate the software into an existing EHR database.

We can infer the machine learning model behind the software was trained first on hundreds of thousands of relevant speech requests. These requests might be for starting a new line of text, adding punctuation, or typing out what a doctor or patient says into an EHR, in various accents and inflections from various types of people. The machine learning algorithm behind the voice recognition system would then transcribe those speech requests into text. Human editors would then correct the transcription and feed the edited text back into the machine learning algorithm. This would have trained the algorithm to recognize and correctly transcribe these speech requests.
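
If that inference is roughly right, the feedback loop might look something like the minimal Python sketch below. The transcribe and fine_tune helpers are hypothetical stand-ins for Nuance's actual speech-to-text model and training pipeline, which the company has not made public.

    # Minimal sketch of a human-in-the-loop transcription training cycle.
    # transcribe() and fine_tune() are hypothetical stand-ins, not Nuance APIs.
    from dataclasses import dataclass

    @dataclass
    class Dictation:
        audio_path: str           # recorded clip of a doctor dictating a note
        draft_text: str = ""      # machine-generated transcript
        corrected_text: str = ""  # human-edited "gold" transcript

    def transcribe(audio_path: str) -> str:
        """Hypothetical speech-to-text call returning a draft transcript."""
        return "pt reports chest pain x2 days"  # placeholder output

    def fine_tune(pairs: list) -> None:
        """Hypothetical model update from (audio_path, gold_transcript) pairs."""
        print(f"fine-tuning on {len(pairs)} corrected transcripts")

    def training_cycle(samples: list) -> None:
        corrected = []
        for s in samples:
            s.draft_text = transcribe(s.audio_path)  # model drafts the note
            s.corrected_text = s.draft_text          # stand-in for a human editor's fixes
            corrected.append((s.audio_path, s.corrected_text))
        fine_tune(corrected)                         # corrections flow back into training

    training_cycle([Dictation("dictation_001.wav"), Dictation("dictation_002.wav")])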

A doctor could then speak into the microphone on their computer, and the algorithm behind the voice recognition software would enter the information they say into the appropriate section of the EHR.

Below is a short 3-minute video demonstrating how Dragon Medical One works:

Nuance Communications claims to have helped Allina Health reduce the time it took its doctors to fill out electronic health records. Allina Health integrated Nuance Communications' software into its Epic EHR. According to the case study, Allina Health saw a 167% increase in the volume of medical documentation its doctors were able to produce by the time of publishing.

Nuance Communications also lists Nebraska Medicine and Baptist Health South Florida as some of their past clients.

Joe Petro is CTO at Nuance Communications. He holds an MSME in Computer Aided Engineering from Kettering University. Previously, Petro served as SVP of Research and Development at Eclipsys.

Document Search

Linguamatics

Linguamatics offers software called I2E, which it claims can help healthcare companies organize and search their large EHR data repositories for important information using natural language processing. This information could include which of a doctor's patients smoke or used to smoke cigarettes, or the important chemical structures in a scientific article or internal document.

The company claims I2E combs through EHR data and provides the user with a visual depiction of its results, such as a chart or a graph. Linguamatics claims this visualization can be selected prior to search in order to fit the user’s needs. Healthcare providers can integrate the software into an existing EHR database.

We can infer the machine learning model behind the software was trained on tens of thousands of EHR reports involving patient treatments, the outcomes of those treatments, and the equipment used during a patient’s visit, among other kinds of information that one might find in an EHR, such as demographic information. This text data would have been labeled by individual patient, specific illnesses, and treatment methods for those illnesses across all patients. The labeled text data would then be run through the software’s machine learning algorithm. This would have trained the algorithm to discern the chains of text that, to the human brain, might be interpreted as patient diagnoses, their symptoms, and past and current treatments as displayed in EHR notes.
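
As a toy illustration of that kind of training, and not Linguamatics' actual method, the Python sketch below trains a simple scikit-learn classifier to flag EHR sentences that mention smoking status. The sentences and labels are invented for demonstration.

    # Toy text classifier for flagging smoking-status mentions in EHR sentences.
    # Invented training data; a production system would use far more labeled text.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    sentences = [
        "Patient reports smoking one pack per day for 10 years.",
        "Former smoker, quit in 2015.",
        "Denies tobacco use.",
        "Prescribed lisinopril 10 mg daily for hypertension.",
        "Follow-up echocardiogram scheduled in six weeks.",
        "No known drug allergies.",
    ]
    labels = [1, 1, 1, 0, 0, 0]  # 1 = mentions smoking status, 0 = does not

    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(sentences, labels)

    # Flag new EHR sentences so staff can quickly surface smoking-status mentions.
    print(model.predict(["Pt is a current smoker.", "Blood pressure 128/82."]))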

We could not find a demonstration video showing how I2E works.

Linguamatics claims to have helped Mercy use its more than 30 million EHR notes to evaluate heart failure device performance. Mercy used I2E to search through all of its EHR data on its use of heart failure devices, and the software produced a visual depiction of all patient experiences with those devices. Mercy integrated Linguamatics' software into its existing EHR database. According to the case study, Mercy won the Innovative IT Project of the Year Award at the 2018 Gateway to Innovation conference.

Linguamatics also lists Atrius Health and Penn Medicine as some of their past clients.

John Brimacombe is Executive Chairman at Linguamatics. He holds an MS in Computer Science from the University of Cambridge. Previously, Brimacombe served as CEO at Jobstream Group PLC.

Analytics

Cognizant

Cognizant offers AI-powered solutions which it claims can help healthcare companies organize their EHR data to gauge patient satisfaction. Cognizant claims healthcare providers can integrate the software into existing EMR or EHR databases, as well as into the databases where feedback forms from the Consumer Assessment of Healthcare Providers and Systems (CAHPS) are stored.

The machine learning model behind the software was likely trained on tens of thousands of EHR notes and other reports, such as those from CAHPS. Natural language processing would allow the software to “read” EHRs and pull data from them. The data would then be run through the software’s machine learning algorithm. This would have trained the algorithm to discern which data points within patient medical records correlate to patient satisfaction or dissatisfaction with their medical visits.
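
The sketch below is an illustrative guess at that kind of analysis, not Cognizant's implementation: it fits a logistic regression on a few invented visit attributes and inspects which ones correlate with a satisfaction label.

    # Toy correlation analysis between visit attributes and a satisfaction label.
    # Feature names and data are invented for demonstration purposes only.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    feature_names = ["wait_time_minutes", "nurse_visits_per_day", "is_new_mother", "age_over_65"]
    X = np.array([
        [45, 2, 0, 0],
        [20, 5, 0, 1],
        [90, 1, 1, 0],
        [15, 6, 0, 0],
        [60, 2, 1, 0],
        [25, 4, 0, 1],
    ])
    y = np.array([0, 1, 0, 1, 0, 1])  # 1 = satisfied on a CAHPS-style survey

    model = LogisticRegression(max_iter=1000).fit(X, y)

    # Coefficient signs hint at which factors track with satisfaction in this toy data.
    for name, coef in zip(feature_names, model.coef_[0]):
        print(f"{name}: {coef:+.3f}")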

The software would then be able to predict which practices would satisfy patients most. This may or may not require the user to upload information about their current treatment methods or new hospital protocols into the software beforehand.

That said, we could not find a demonstration video showing how Cognizant’s software works.

Cognizant claims to have helped an unnamed large healthcare network estimate customer satisfaction using information from its CAHPS reports. The healthcare network integrated Cognizant's software into the database where CAHPS reports from 60,000 patients are stored. According to the case study, the healthcare network was able to highlight possible causes of lower patient satisfaction, such as new mothers needing extra care on the day they are discharged. Cognizant then used these results to suggest ways to improve communication with patients, including requiring briefings on newly prescribed medicines for elderly patients and providing new mothers with hands-on care as they prepare to leave the hospital.

Cognizant does not list any enterprise clients by name; however, they make numerous case studies available for all of their solutions.

Sarangarajan TS is CTO and VP at Cognizant. He holds an ME in Power Electronics from Anna University. Previously, TS served as a member of the Transformation Solution Group in the ITIS Business Unit at Tata Consultancy Services.

Health Fidelity

Health Fidelity offers software called HF360, which it claims can help risk-bearing healthcare providers accurately determine risk within their patient population using predictive analytics and natural language processing. The system uses natural language processing to find data points in written text, such as a patient's statement that they are prone to a certain type of illness, which can then be analyzed with the rest of the relevant data to determine risk.

Health Fidelity claims healthcare providers can integrate the software into their EHR databases.

Before the predictive analytics portion of Health Fidelity's HF360 software can analyze all data relevant to determining risk, text data from written documents like EHRs must be extracted and formatted in a way that a predictive analytics model can recognize. We can infer the machine learning model behind the natural language processing algorithm was trained on thousands of clinical documents, including EHRs and EMRs. This text data would have been labeled for factors that contribute to risk, such as working in a dangerous environment. The labeled text data would then be run through the software's machine learning algorithm. This would have trained the algorithm to discern the chains of text that a human might interpret as risk factors when adjusting risk for insurance as displayed in a clinical document like an EHR.

A user could then run new EHR or other clinical documents through the software, and the algorithm behind it would then be able to categorize the documents according to the amount of risk the patient poses to a possible insurer.
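
The Python sketch below is a deliberately simplified stand-in for that pipeline: it scans clinical text for risk-factor phrases using a keyword list and buckets the document by a crude score. Health Fidelity's actual system would rely on a trained NLP model rather than keywords, and the phrases and weights here are invented.

    # Simplified stand-in for NLP-based risk-factor extraction and document scoring.
    # Phrases and weights are invented; a real system would use a trained model.
    RISK_PHRASES = {
        "type 2 diabetes": 3,
        "congestive heart failure": 4,
        "copd": 3,
        "tobacco use": 1,
        "works in a hazardous environment": 1,
    }

    def score_document(text: str):
        """Return a crude risk score and the phrases that contributed to it."""
        lowered = text.lower()
        found = [phrase for phrase in RISK_PHRASES if phrase in lowered]
        return sum(RISK_PHRASES[p] for p in found), found

    def risk_tier(score: int) -> str:
        return "high" if score >= 5 else "moderate" if score >= 2 else "low"

    note = ("Patient with type 2 diabetes and COPD, reports ongoing tobacco use. "
            "Medication list reviewed and updated.")
    score, factors = score_document(note)
    print(risk_tier(score), factors)  # high ['type 2 diabetes', 'copd', 'tobacco use']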

We can infer the machine learning model behind the software also needs to be trained on tens of thousands of EHR documents from the client company's database. This data would then be run through the software's machine learning algorithm, training it to discern which data points correlate to risk factors associated with certain patients or patient population segments.

The software would be able to predict which patients pose the most risk to insurers in their current condition. This may or may not require the user to upload information about the patient’s current treatments, allergies, or work environment.
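
A toy sketch of that prediction step, again not Health Fidelity's implementation, might train a classifier on historical records with a known high-risk outcome and then rank current patients by predicted risk. All features, data, and patient IDs below are invented.

    # Toy risk-ranking model; features, data, and patient IDs are invented.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Columns: age, chronic_condition_count, er_visits_last_year, smoker (0/1)
    X_train = np.array([
        [72, 4, 3, 1],
        [34, 0, 0, 0],
        [65, 2, 1, 0],
        [50, 1, 0, 1],
        [80, 5, 4, 0],
        [29, 0, 1, 0],
    ])
    y_train = np.array([1, 0, 1, 0, 1, 0])  # 1 = high-risk outcome in the following year

    model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

    current_patients = {"patient_a": [77, 3, 2, 1], "patient_b": [41, 1, 0, 0]}
    scores = {pid: model.predict_proba(np.array([feats]))[0][1]
              for pid, feats in current_patients.items()}

    # Highest predicted risk first, so those patients can be reviewed first.
    for pid, risk in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{pid}: predicted risk {risk:.2f}")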

Below is a short video demonstrating how Health Fidelity’s NLP technology works. The demonstration starts at 2:34 and ends at 6:36.

Health Fidelity claims to have helped the University of Pittsburgh Medical Center (UPMC) more accurately determine risk from written documents. UPMC integrated Health Fidelity’s software into its EHR database. According to the case study, UPMC used HF360 to make their method of risk adjustment uniform across all of their patient populations. They worked to increase efficiency with the software through targeted patient interventions, which would give them more data to analyze when determining the risk level of those patients.

Health Fidelity also lists Columbia University Medical Center as one of their past clients.

Raj Tiwari is Chief Architect at Health Fidelity. He holds an MS in Electrical Engineering from Oregon Health and Science University. Previously, Tiwari served as Director of Technology at Surgent Networks.

 

Header Image Credit: Federal Times

 
