In the musical “Hamilton,” George Washington exclaims: “We are outgunned, outmanned, outnumbered, outplanned. We gotta make an all-out stand. Ayo, I’m gonna need a right-hand man.”
I recently saw this performance and the remark made me think to myself: Like George and his nascent militia trying to defend New York City from a British armada and 32,000 troops, are clinicians about to face an overwhelming onslaught of digital health data and cognitive computing?
Closer than ever
I now wonder if these technologies have matured to the point where a convergence is happening. Can we imagine that overworked clinicians dealing with this growing stream of internal and external medical data can now trust a right-hand man (or woman) in the form of a mobile device to deliver useful guidance and advice in the midst of caring for their diverse mix of patients? And can this digital assistant filter the noise, highlight relevant information and, ultimately, take action on the clinician’s behalf?
I think we are closer than ever to this reality from a data and technology perspective. Consider Apple’s Siri and Amazon’s Alexa, for instance.
In my experience, developing and deploying this complement of related advancements at Penn Medicine has required years of significant investment in technology, collaborative teams and an ecosystem that supports innovation.
And it requires a mix of technical and data assets. The foundation is a robust clinical data warehouse, updated daily and harmonized into a consistent set of semantically interoperable data, plus a fully adopted single EMR in which alerts and clinical decision support rules can fire at the point of care.
On top of that data warehouse and electronic health record resides an API layer that enables applications to tap into the EMR, along with the ability to capture remote patient-generated data.
At the heart of it all sits a predictive analytics cognitive engine that identifies clinical situations requiring clinician action, generates alerts the other pieces cannot – and then sends them to clinicians’ mobile devices.
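To make the shape of that stack concrete, here is a minimal sketch, in Python, of how such an engine might pull a harmonized patient snapshot from the warehouse, compute a risk score and push an alert to a clinician’s device. Every name, value and threshold in it is illustrative only – it is not Penn Medicine’s actual code or APIs.

```python
# Illustrative sketch of the stack described above.
# All names (warehouse query, risk rules, alert delivery) are hypothetical.

from dataclasses import dataclass


@dataclass
class PatientSnapshot:
    patient_id: str
    heart_rate: float        # beats per minute
    temperature_c: float     # degrees Celsius
    wbc_count: float         # white blood cells, 10^3/uL
    lactate_mmol_l: float    # serum lactate


def fetch_snapshot_from_warehouse(patient_id: str) -> PatientSnapshot:
    """Stand-in for a query against the daily-refreshed, harmonized clinical data warehouse."""
    # A real deployment would read the semantically normalized record;
    # fixed values are returned here purely for illustration.
    return PatientSnapshot(patient_id, heart_rate=118, temperature_c=38.9,
                           wbc_count=14.2, lactate_mmol_l=2.6)


def sepsis_risk_score(s: PatientSnapshot) -> float:
    """Toy rule-based score; a production engine would use a trained predictive model."""
    score = 0.0
    if s.heart_rate > 100:
        score += 0.3
    if s.temperature_c > 38.3:
        score += 0.3
    if s.wbc_count > 12.0:
        score += 0.2
    if s.lactate_mmol_l > 2.0:
        score += 0.2
    return score


def push_alert_to_mobile(clinician_id: str, message: str) -> None:
    """Stand-in for the API layer that delivers alerts to clinicians' mobile devices."""
    print(f"[ALERT -> {clinician_id}] {message}")


if __name__ == "__main__":
    snapshot = fetch_snapshot_from_warehouse("pt-001")
    risk = sepsis_risk_score(snapshot)
    if risk >= 0.7:  # alerting threshold chosen for illustration only
        push_alert_to_mobile(
            "dr-smith",
            f"Patient {snapshot.patient_id}: elevated sepsis risk ({risk:.0%}). Please review.")
```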
Initial needs
Early results at Penn Medicine have shown value in use cases like predicting sepsis and determining risk of heart failure. Future efforts will focus on timely ventilator release, maternal morbidity, shock detection and surgical complication avoidance, among others. We are also experimenting with verbal interaction.
But all the technology in the world won’t prompt adoption unless clinicians find value in the advice they receive and can interact with a right-hand man as if it were a real human assistant.
I’ve found that the cognitive engine must be right more than 95 percent of the time to instill confidence among clinicians, and the advice has to be relevant. Recipients need an intuitive way to acknowledge the advice and score its accuracy and applicability so that performance improves over time. And, of course, patient outcomes need to be tracked to ensure that use of a right-hand assistant actually improves care.
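As a rough illustration of what that feedback loop could look like, here is a small sketch that records clinician scores on each piece of advice and checks a rolling accuracy against the 95 percent bar. The class, window size and scoring scheme are assumptions made for illustration, not a description of our production system.

```python
# Hypothetical feedback loop: clinicians mark each alert correct or not,
# and the rolling accuracy is compared against the 95 percent confidence bar.

from collections import deque


class AdviceFeedbackTracker:
    def __init__(self, window: int = 200, target_accuracy: float = 0.95):
        self.scores = deque(maxlen=window)  # 1 = clinician marked the advice correct
        self.target = target_accuracy

    def record(self, was_correct: bool) -> None:
        self.scores.append(1 if was_correct else 0)

    def accuracy(self) -> float:
        return sum(self.scores) / len(self.scores) if self.scores else 0.0

    def meets_bar(self) -> bool:
        return self.accuracy() >= self.target


tracker = AdviceFeedbackTracker()
for verdict in [True, True, False, True, True]:  # sample clinician scores
    tracker.record(verdict)
print(f"Rolling accuracy: {tracker.accuracy():.0%}, meets 95% bar: {tracker.meets_bar()}")
```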
Future work
Like Apple’s Siri or Amazon’s Alexa, the right-hand man also needs to listen and quickly respond to questions and requests, or indicate that a request cannot be met.
These are exciting times for health care information technologists and tech-minded clinicians. We have been dreaming of having the computing power at hand at a reasonable cost, the discrete data available to drive the cognitive rule sets, ubiquitous smart mobile devices attached to every clinician and a growing population of highly engaged patients.
Large health systems are uniquely positioned to experiment with this technology to assess its effectiveness.
The time is now to rise up and leverage this opportunity. Unlike Hamilton, we should not throw away our shot!
Brian Wells is associate vice president of health technology and academic computing at Penn Medicine.