Slow down AI adoption, one tech CEO cautions

Healthcare leaders must demand ethical artificial intelligence and ensure their RFPs list requirements for safeguards, says Sarah M. Worthy of DoorSpace. This will take time, which means pulling back from investing in most AI tools.
By Bill Siwicki
10:26 AM

Sarah M. Worthy, CEO of DoorSpace, an employee relationship management technology and services company

Photo: DoorSpace

Some clinicians and healthcare professionals are ready and willing to welcome artificial intelligence to their daily routines. Others, though, are a bit more hesitant.

"We're all becoming less trusting of AI outcomes as we use it more and more and can see how unreliable it is," says Sarah M. Worthy, CEO of DoorSpace, an employee relationship management technology and services company that has AI in its mix.

"Instead of building AI to replace the most important people in healthcare, we should be looking at how we can use AI to make those people's jobs easier and more productive."

While many healthcare executives are eager to invest in AI, Worthy believes there are better and worse ways to spend such resources in healthcare.

"We aren't going to be able to replace clinicians with AI anytime soon – and so it pains me to see so much money and focus placed on these types of technologies," she said. "There are a lot of areas on the administrative side of healthcare where AI could make a huge impact to reduce costs, reduce delays in care, and improve the overall experience for both patients and clinicians without having to risk patient lives."

We interviewed Worthy to discuss where and how she believes resources should be spent on AI in healthcare, and what the outcomes could be.

Q. You suggest some clinicians and other healthcare workers are hesitant about AI in healthcare. Why?

A. I'm hearing a lot of skepticism from both the clinical side as well as the administrative side in healthcare around the use of AI, and their skepticism is well-founded. What most people think of as "AI" and what we're seeing hyped in the media is the large language model, or as I like to say, "AI that talks to us."

LLMs have a lot of known issues where they can provide false information as well as contribute to societal biases, which can greatly impact patients.

In a life-or-death situation, where a person's wellbeing is literally on the line, we want our healthcare workers to be hesitant to bring this technology to the bedside. Our clinicians already are overworked and exhausted, and expecting them to incorporate unpredictable technology into their care practice is unreasonable and dangerous.

Q. How can this barrier to AI adoption be overcome?

A. Healthcare leaders need to be vocal in demanding ethical AI from the technology sector and ensure their RFPs list requirements for AI with safeguards in place. This will take time, and that means pulling back from investing in the majority of AI tools.

In the short term, while we wait for these AI tools to become safer and more reliable, executives can look at investing in AI for the non-patient side of the healthcare business. There are a number of great and proven ways AI supports the automation and management of operational and workforce data to save time and money while driving faster, more accurate decision-making around the business in ways that don't touch directly on patient care.

Most important, virtually every healthcare organization in the U.S. today has a data management crisis. Their data sits in silos, split across departments, paper files and spreadsheets. There is a saying: "Bad data in, bad reports out."

Healthcare organizations need to get their data organized and have documented data lifecycle management processes in place. Otherwise, any investment in AI is going to produce a negative ROI.

Q. You say work with AI in healthcare today should focus on administrative tasks. What exactly is your vision for AI today?

A. I don't have a vision for AI in healthcare; I have a vision for a better workplace experience in healthcare that incorporates AI and other technologies to get us there. This distinction is important to highlight because I often have executives approaching AI from a position of, "I have this AI, so how should I use it?" The better way to approach it is to ask, "I have this problem, so what's the best way to solve it?"

I think this is one of our unique strengths and what really differentiates our work here at DoorSpace. There's a common trend for healthcare leaders to try to add new things to a process when attempting to solve a problem. But oftentimes, the best way to solve these problems is to remove things from the process.

When I look at the problems on the administrative side of healthcare, the majority of them are rooted in this very issue, where a problem occurred so new regulation was created to add to the process.

Over time, with each new problem, they've added more CME, more compliance rules, more forms and paperwork. All of this has added up to a situation where physicians are spending nine hours a week just on non-patient-related paperwork. That's a full workday.

We are looking at how we can use data to better measure and understand how to remove things from the process while increasing quality and efficiency. We view AI as one of the ways we can automate a lot of the data management that is currently using up physicians' and executives' valuable time on low-value data entry and report creation.

Q. What are your thoughts about where AI fits on the clinical side of healthcare in the years to come?

A. One product I saw recently that I really like takes AI into the exam room as a scribing tool. It listens in the background as the doctor and patient discuss the issues and records everything into EHR notes the doctor can later review and edit in a few minutes.

This allows the doctor and the patient to be face to face during the exam rather than having the doctor staring at the computer to input EHR data the entire exam.

We're also seeing a lot of success in radiology where AI is making it faster for radiologists to evaluate scans for more accurate diagnoses. In all the test cases I've seen to date, the only ones that got any positive outcomes all shared one thing in common: The AI was a tool that augmented the work of the clinician.

So, I think we'll continue to see AI used to help healthcare workers do their jobs more accurately, more efficiently and faster, to give them time back in their days. I don't foresee AI replacing doctors and nurses anytime soon without disastrous consequences.

Follow Bill's HIT coverage on LinkedIn: Bill Siwicki
Email him: bsiwicki@himss.org
Healthcare IT News is a HIMSS Media publication.

The HIMSS AI in Healthcare Forum is scheduled to take place September 5-6 in Boston. Learn more and register.

Want to get more stories like this one? Get daily news updates from Healthcare IT News.