Mass General, Brigham and Women's to apply deep learning to medical records and images
Artificial intelligence is beginning to reshape healthcare and life sciences. And one branch of AI, deep learning, is coming into its own.
Deep learning is a type of machine learning based on learning data representations rather than task-specific algorithms. Learning can be supervised, semi-supervised or unsupervised.
Much of the excitement around AI today comes down to three ingredients: the development of algorithms that make artificial neural networks practical, the ever-increasing supply of digital data, and, critically, the "GPU" chip architecture – it stands for graphics processing unit – pioneered by vendor NVIDIA, said Mark Michalski, MD, executive director of the Massachusetts General Hospital and Brigham and Women's Hospital Center for Clinical Data Science.
"GPU chips are different than the CPU chips that run many of our computers today in that they solve many simple problems simultaneously, as opposed to one big problem at a time, like CPUs," Michalski explained. "It turns our brain's work in a similar way to GPUs, which is perhaps why GPU chips are so effective as tools for machine learning."
Virtually anyone working in deep learning – the machine learning approach behind many recent AI advances – uses GPUs. The technique has been applied to all kinds of data, including images, videos, audio recordings, text and more.
"One of the primary reasons we are building our own data center with dedicated GPUs is that we rely on medical data such as MRIs and CTs, which can be a lot of data, and is sensitive from a privacy perspective," Michalski explained. "We can feed the GPU information derived from medical records, like clinical notes, medical imaging data, pathology slides – just about everything we can learn from our records, we try to leverage."
GPUs and deep learning hold big potential for how healthcare manages and interprets imaging and other clinical data. Michalski is a radiologist by training, and in his field there are many instances where one has to look at medical images, find abnormalities within them, characterize and measure those abnormalities, and suggest a diagnosis based on those findings.
"A single radiologist may have to look through thousands of images a day," he said. "Having systems that can help point out an abnormality in a stack of normal images, and further, automatically measure that abnormality like a tumor or a dilated heart chamber, could be a huge productivity advantage for us."
And these advantages aren't limited to radiology. Massachusetts General sees opportunities to support physicians throughout the hospital, from extracting features in clinical reports that help oncologists identify ideal treatments for cancer patients to helping the hospital's operational leadership spot opportunities to use scarce resources more effectively.
At the institutional level, executives and caregivers have been on the lookout for solutions to many of these problems for years, and there is broad hope that machine learning may help develop those solutions.
Consider this example of how the NVIDIA AI technology works: First there is a clinical challenge, such as identifying breast cancer cells on a pathology slide. The first step in applying machine learning is to identify data that can be used to train a neural network. That requires labeled sample data – that is, someone must have pointed out where the cancer cells are on the slide and highlighted them appropriately.
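In code, that labeled sample data might be represented something like the following hypothetical PyTorch sketch, which pairs pathology-slide patches with expert annotations. The class name and fields are illustrative, not taken from the hospitals' systems.

```python
import torch
from torch.utils.data import Dataset

# Hypothetical labeled pathology dataset: each example is an image patch
# plus a label an expert assigned (1 = contains cancer cells, 0 = does not).
# Patch extraction and expert annotation happen upstream; this class only
# pairs the two so a network can be trained under supervision.
class LabeledSlidePatches(Dataset):
    def __init__(self, patches, labels):
        assert len(patches) == len(labels)
        self.patches = patches   # list of 3x224x224 float tensors
        self.labels = labels     # list of 0/1 ints from annotators

    def __len__(self):
        return len(self.patches)

    def __getitem__(self, idx):
        return self.patches[idx], torch.tensor(self.labels[idx])
```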
"Sometimes to make the models work, we need lots of those examples," Michalski explained. "Getting enough data to make the AI work is often one of the key challenges. This contrasts with some other artificial intelligence applications, such as autonomous vehicles, where there are thousands upon thousands of images to draw conclusions from."
After the training data is secured, one selects the kind of neural network to use to build an AI model. This is the "brain" to be taught to identify the cancer cells on a slide. Once that is done, the GPUs are used to teach the brain, and afterward the brain can be applied to new slides to identify cancer cells, a process commonly called "inference."
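A compressed, hypothetical PyTorch sketch of those three steps – selecting a network, teaching it on a GPU, and then running inference – might look like this (the tiny network and random stand-in data are purely illustrative):

```python
import torch
import torch.nn as nn

# Step 1: select a network. A deliberately small stand-in for the "brain":
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(8, 2),   # two classes: cancer / no cancer
)

device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Step 2: training. The GPU "teaches the brain" by adjusting weights
# so its predictions match the expert-provided labels.
patches = torch.randn(16, 3, 224, 224, device=device)  # stand-in batch
labels = torch.randint(0, 2, (16,), device=device)
optimizer.zero_grad()
loss = loss_fn(model(patches), labels)
loss.backward()
optimizer.step()

# Step 3: inference. The trained model is applied to a new, unlabeled patch.
model.eval()
with torch.no_grad():
    prediction = model(patches[:1]).argmax(dim=1)
print(prediction.item())
```

In practice the training step repeats over many labeled batches and many epochs; the single pass here only shows where the GPU does its work.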
"In healthcare, integrating the model into the workflow in a way that is helpful to the clinician sometimes is one of the more challenging parts," Michalski explained. "Having the brain isn't enough: It has to be inserted into a physician's process in the right way for it to be useful."
Today, Massachusetts General and Brigham and Women's hospitals are using deep learning to automate tasks that humans do well but don't want to do or don't have time to do. The hospitals are also starting to see that the techniques hold other big potential, such as picking up things in images and medical records that humans don't see very easily, a capability that will evolve in the coming years.
"We're lucky in medicine to have a lot of exciting advancements every day," Michalski said. "What makes AI so interesting is that it may be able to improve the lives of patients and physicians alike, all while potentially reducing costs. That's why I think a lot of us in the field believe AI and deep learning are going to be very important to the future of healthcare.
Big Data & Healthcare Analytics Forum
The San Francisco forum to focus on utilizing data to make a real impact on costs and care June 13-14.
Twitter: @SiwickiHealthIT
Email the writer: bill.siwicki@himssmedia.com