AI may help improve equity in pain treatment, MGB study shows

OpenAI's GPT-4 and Google's Gemini did not recommend opioid medication for one group over another – suggesting that artificial intelligence could help address clinician bias and improve fairness in treating pain.
By Andrea Fox


Looking to artificial intelligence to help address the undertreatment of pain in certain patient groups, researchers at Mass General Brigham tested whether large language models could reduce race-based disparities in pain assessment and prescribing.

The LLMs displayed no racial or gender discrimination and could be a helpful pain management tool that ensures equitable treatment across patient groups, MGB researchers said in an announcement Monday. 

"We believe that our study adds key data showing how AI has the ability to reduce bias and improve health equity," said Dr. Marc Succi, strategic innovation leader at Mass General Brigham Innovation and a corresponding author of the study, in a statement. 

WHY IT MATTERS

Researchers at the health system instructed OpenAI's GPT-4 and Google's Gemini LLMs to provide a subjective pain rating and comprehensive pain management recommendation for 480 representative pain cases they had prepared. 

To generate the data set, researchers started with 40 cases reporting different types of pain – such as back pain, abdominal pain and headaches – and removed race and sex identifiers. They then crossed each case with the six U.S. Centers for Disease Control race and ethnicity categories – American Indian or Alaska Native, Asian, Black, Hispanic or Latino, Native Hawaiian or Other Pacific Islander, and White – and with each sex, male or female, yielding the 480 unique case variants (40 x 6 x 2).
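
In code, that expansion might look like the following minimal sketch. It is illustrative only – the case stand-ins, field names and helper function are hypothetical, not the study's actual pipeline:

```python
# Illustrative expansion of the 40 de-identified cases across the six CDC
# race/ethnicity categories and two sexes (hypothetical stand-in, not the
# study's actual pipeline).
from itertools import product

RACES = [
    "American Indian or Alaska Native",
    "Asian",
    "Black",
    "Hispanic or Latino",
    "Native Hawaiian or Other Pacific Islander",
    "White",
]
SEXES = ["male", "female"]

def expand_cases(base_cases):
    """Cross each de-identified pain vignette with every race/sex pair."""
    return [
        {"vignette": case, "race": race, "sex": sex}
        for case, race, sex in product(base_cases, RACES, SEXES)
    ]

base_cases = [f"case_{i}" for i in range(40)]  # stand-ins for the 40 vignettes
cases = expand_cases(base_cases)
assert len(cases) == 480  # 40 cases x 6 races x 2 sexes
```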

For each patient case in the data set, the LLMs evaluated and assigned subjective pain ratings before making pain management recommendations that included pharmacologic and nonpharmacologic interventions.
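
The announcement does not publish the prompts used. As a rough illustration, querying GPT-4 for one expanded case might look like this sketch built on the OpenAI Python client; the prompt wording and the rate_case function are hypothetical, not the study's actual instrument:

```python
# Hypothetical prompt for one expanded case using the OpenAI Python client
# (pip install openai; requires an OPENAI_API_KEY environment variable).
# The prompt wording is illustrative, not the study's actual instrument.
from openai import OpenAI

client = OpenAI()

def rate_case(case: dict) -> str:
    prompt = (
        f"A {case['sex']} patient who identifies as {case['race']} presents "
        f"with the following complaint: {case['vignette']}\n"
        "1) Give a subjective pain rating (mild, moderate or severe).\n"
        "2) Recommend a comprehensive pain management plan, including "
        "pharmacologic and nonpharmacologic interventions."
    )
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
        temperature=0,  # deterministic output makes group comparisons cleaner
    )
    return response.choices[0].message.content
```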

The researchers performed univariate analyses to evaluate the association between racial/ethnic group or sex and the outcome measures suggested by the LLMs – subjective pain rating, opioid name, order and dosage recommendations – MGB said.
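
The announcement does not name the specific statistics used. One plausible univariate check – a chi-square test of independence between race and whether an opioid was recommended – could look like the sketch below; the column names are assumptions for illustration:

```python
# Illustrative univariate test (the announcement does not specify the exact
# statistics used): chi-square test of independence between race and whether
# an opioid was recommended. Column names are assumed for illustration.
import pandas as pd
from scipy.stats import chi2_contingency

def race_vs_opioid_pvalue(df: pd.DataFrame) -> float:
    """P-value for association between race and opioid recommendation."""
    table = pd.crosstab(df["race"], df["opioid_recommended"])
    chi2, p_value, dof, expected = chi2_contingency(table)
    return p_value

# A large p-value (e.g., > 0.05) would be consistent with the reported
# finding that recommendations did not differ across racial/ethnic groups.
```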

GPT-4 most frequently rated pain as "severe" while Gemini’s most common rating was "moderate," according to the research published Sept. 6 in PAIN, The Journal of the International Association for the Study of Pain.

Of note, Gemini was more likely to recommend opioids, suggesting that GPT-4 was more conservative in making opioid prescription recommendations.

The researchers said that while additional analyses of the two AI models could help determine which aligns more closely with clinical expectations, the study indicated that the LLMs were able to transcend race-based perceptions of patient pain.

"These results are reassuring in that patient race, ethnicity and sex do not affect recommendations, indicating that these LLMs have the potential to help address existing bias in healthcare," said Cameron Young and Ellie Einchen, the Harvard Medical School co-authors, in a statement.

"I see AI algorithms in the short term as augmenting tools that can essentially serve as a second set of eyes, running in parallel with medical professionals," added Succi, who is also associate chair of innovation and commercialization for enterprise radiology and executive director of MGB's Medically Engineered Solutions in Healthcare Incubator. 

Future studies should consider how race could influence LLM treatment recommendations in other areas of medicine and evaluate sex and gender variables beyond the binary, the health system said.

THE LARGER TREND

Biased algorithms helped drive the disproportionate impact of COVID-19 on people of color, and studies have shown that medical care providers are more likely to underestimate and undertreat pain in Black and other minority patients.

While AI has been found to exacerbate racial bias in many areas of medicine and healthcare delivery, LLMs may also help to mitigate clinician bias and support equitable pain management. 

Opioid prescribing rose through the 1990s and 2000s on safety claims later alleged to be false; the scale of dependence and addiction became clear when hundreds of local governments filed lawsuits against Purdue Pharma, maker of OxyContin, beginning in 2017.

Health systems began to recognize surgery as a major gateway to opioid initiation and dependence. Intermountain Health and other providers then focused on reducing opioid prescriptions, educating caregivers, standardizing pain management techniques and using AI-enabled analytics to sustain practice changes and boost patient safety.

Technology developers have also leveraged analytics in mobile care management to help doctors make sure the appropriate amount of pain medication is administered and patients adhere to medication treatment plans. 

Although AI is not advising patients directly, Continuous Precision Medicine's Steven Walther told Healthcare IT News in July that data-driven technologies can help both doctors and patients reduce dependence on opioids and other pain management drugs.

In a randomized controlled trial, patients using the company's mobile app "were 92% more likely to adhere to their medication guidance," Walther said.

ON THE RECORD

"There are many elements that we need to consider when integrating AI into treatment plans, such as the risk of over-prescribing or under-prescribing medications in pain management or whether patients are willing to accept treatment plans influenced by AI," said Succi. "These are all questions we are considering."

Andrea Fox is senior editor of Healthcare IT News.
Email: afox@himss.org

Healthcare IT News is a HIMSS Media publication.
