Vanderbilt combines AI and Smart on FHIR in an EHR voice assistant

"Voice user interfaces are an essential step to humanizing the EHR," says Vanderbilt University Medical Center’s Dr. Yaa Kumah-Crystal, who will be speaking at HIMSS19.
By Bill Siwicki

Vanderbilt University Medical Center is one of the first provider organizations to develop a voice assistant for the electronic health record that uses natural language processing to deliver verbal summaries back to providers.

With VEVA, the Vanderbilt EHR Voice Assistant, the health system uses natural language processing from Nuance to interpret voice requests, and HL7's SMART on FHIR standard to pull pertinent data and return relevant summaries.
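Vanderbilt has not published VEVA's code, but the SMART on FHIR half of that pipeline is an open standard. As a minimal sketch only, assuming a hypothetical FHIR R4 server and an OAuth 2.0 access token already obtained through the SMART launch flow, a data pull for a recent lab value might look like this (the function name and endpoint here are illustrative, not VEVA's actual implementation):

```python
import requests

def latest_a1c(fhir_base: str, patient_id: str, token: str) -> str:
    """Fetch the patient's most recent hemoglobin A1c and phrase it as speech."""
    resp = requests.get(
        f"{fhir_base}/Observation",
        params={
            "patient": patient_id,
            "code": "http://loinc.org|4548-4",  # LOINC code for hemoglobin A1c
            "_sort": "-date",                   # newest result first
            "_count": "1",                      # only the latest one
        },
        headers={
            "Authorization": f"Bearer {token}",  # SMART on FHIR access token
            "Accept": "application/fhir+json",
        },
        timeout=10,
    )
    resp.raise_for_status()
    entries = resp.json().get("entry", [])
    if not entries:
        return "I don't see any A1c results on file."
    obs = entries[0]["resource"]
    qty = obs.get("valueQuantity", {})
    when = obs.get("effectiveDateTime", "an unknown date")
    return f"The most recent A1c is {qty.get('value')} {qty.get('unit', '')}, from {when}."
```

A full assistant would layer the NLP front end (mapping a spoken request like "what's her latest A1c?" to this query) and a text-to-speech step on top; the sketch covers only the data retrieval and the spoken-summary phrasing.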

Consulting with Epic

"We also are consulting with Epic, which is developing a voice assistant application that can retrieve and display some relevant patient information such as labs, notes and vitals," said Dr. Yaa Kumah-Crystal, core design advisor at Vanderbilt University Medical Center. "Epic’s voice assistant will also have some medication ordering functionality in future releases."

Unfortunately, the way EHRs are designed can make it challenging and frustrating to find relevant information, said Kumah-Crystal. Maneuvering the system to find something as simple as a lab value involves several clicks, drop-downs and scrolls.

"It is a time-consuming and tactually complicated effort to understand the patient story," she explained. "Often you know what piece of information you want but are forced to forage through a graphical user interface designed by someone that does not understand your clinical workflow. This can be an exasperating experience and one of the reasons EHRs often are cited as contributors to physician burnout."

"Voice is an intrinsically human way of seeking and communicating information that we need to take more advantage of in computing."

Dr. Yaa Kumah-Crystal, Vanderbilt University Medical Center

The EHR also has become an intrusive third party in the exam room. "To get relevant pieces of information from the EHR, a provider is forced to turn away from their patients and stare at their computer screen to confirm the keystrokes they are entering," she said.

"One particular instance I can recall from my own experience was when I was performing a physical exam on my patients in the pediatric endocrinology clinic," she added, explaining that she had noticed something unusual during the patient’s exam and wanted them to return for blood work in a couple of months. The patient’s mother told the doctor that she believed the patient had a return visit at Vanderbilt coming up, but could not recall the date.

Considerable inefficiency

"In order for me to then find that piece of information and coordinate care for the family, I had to disengage from doing the physical exam, walk over to my computer, relaunch the EHR, which seemed to take forever, then click and scroll through her chart to finally find the one piece of information I was looking for – when her next visit would be," Kumah-Crystal said.

"All the while I was thinking to myself, ‘What if I could just ask the computer when is her next visit,’ and the computer could reply, ‘I see an upcoming dermatology visit scheduled for January 8 at 2 p.m.’"

With that simple voice interaction with the EHR, the doctor could have stayed at her patient’s side, continued performing the exam, and remained engaged with the patient’s mother, rather than having to apologize for breaking away to find information, she added.
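In FHIR terms, that hypothetical exchange maps cleanly onto a search of the Appointment resource. Again as an illustrative sketch under the same assumptions (a FHIR R4 server and an existing SMART access token, not Vanderbilt's actual code), the "when is her next visit" lookup could be as simple as:

```python
from datetime import date

import requests

def next_visit(fhir_base: str, patient_id: str, token: str) -> str:
    """Find the patient's next booked appointment and phrase it as speech."""
    resp = requests.get(
        f"{fhir_base}/Appointment",
        params={
            "patient": patient_id,
            "status": "booked",
            "date": f"ge{date.today().isoformat()}",  # today or later
            "_sort": "date",                          # soonest first
            "_count": "1",
        },
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/fhir+json",
        },
        timeout=10,
    )
    resp.raise_for_status()
    entries = resp.json().get("entry", [])
    if not entries:
        return "I don't see any upcoming visits scheduled."
    appt = entries[0]["resource"]
    what = appt.get("description", "clinic")     # e.g. "dermatology"
    when = appt.get("start", "an unscheduled time")
    return f"I see an upcoming {what} visit scheduled for {when}."
```

The hard part, as Kumah-Crystal's example suggests, is not the query itself but recognizing the spoken intent and rendering the raw FHIR timestamp as a natural reply like "January 8 at 2 p.m."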

"The fascinating aspect of this work is that we have finally arrived at a place where the technology behind our natural language processing has parity with human understanding," said Kumah-Crystal.

"We now have confidence that computers can understand the words we say," she explained. "We now have commonplace models of these interactions in the consumer realm like Siri and the Amazon echo that can understand people’s verbal requests and respond almost humanly."

Voice will catch on

Although she has not seen many organizations besides Vanderbilt, Epic and eClinicalWorks venturing into the EHR voice assistant arena, she expects the concept will soon catch on, and that there will be more exciting developments in the field.

"We need people developing varieties of skills and interactions for these new voice user interfaces to demonstrate the various use-cases and help identify the gaps so we can continue to innovate," she said.

Another thing Kumah-Crystal would like to see is a patient-facing EHR voice assistant to help patients find information about their labs and get guidance about their care.

"This is an area we are starting to evaluate as a means to provide information access to people who may not be as proficient using standard computer interfaces," she said. "Voice is a natural mode of communication we have all been capable of using since we were toddlers. It is an intrinsically human way of seeking and communicating information that we need to take more advantage of in computing."

The tech of our dreams

"Voice is the next interface, and it’s already here," Kumah-Crystal stated. "Voice is an interface many of us have dreamt of using with our computers for years, and now the technology finally caught up with our imaginations."

A forward-thinking healthcare CIO will find ways to invest in research and development of these new platforms, she said. The current generation of voice assistants deals with structured information, and when CIOs can take things to the next level of summarizing free-text content, these interactions are going to be game-changing, she contended.

"But we will only get there with a vision for this innovation," she said. "I would encourage healthcare CIOs to invest in research and platforms that can allow their providers to use their voice to find the information they seek and empower patients to engage with their health. Healthcare is always so far behind with technology, and this is an opportunity for us to be at the forefront. Voice user interfaces are an essential step to humanizing the EHR."

Kumah-Crystal will offer more insights at HIMSS19 in a session titled "Transforming EHR Interactions Using Voice Assistants." It’s scheduled for Wednesday, February 13 from 2:30-3:30 p.m. in room W307A.


Twitter: @SiwickiHealthIT
Email the writer: bill.siwicki@himssmedia.com
