Separating the Hype From the Hope in Healthcare AI Trends for 2024
AI is everywhere, and healthcare technology is no exception. By learning to separate the wheat from the chaff—and choosing the right AI applications for our needs—we can ensure these tools enhance workflows so the healthcare industry can face new challenges with confidence and precision.
On a recent episode of The Seamless Connection, a Slice of Healthcare podcast about realizing healthcare technology’s full potential, experts discussed use cases for machine learning, security considerations, and more.
“Every single vendor in healthcare right now has an AI component, even if when you look under the hood it’s just some simple regression or something like that,” said Jake Lancaster, M.D., CMIO at Baptist Memorial in Tennessee. “We’re not just bringing in AI for AI’s sake—it’s really got to solve the problem that you have.”
Stacey Johnston, M.D., M.H.A., vice president and chief applications officer at Baptist Health in Jacksonville, Florida, said the health system created an AI Institute to identify where the technology can play a significant role.
“We’ve actually done some work with DrFirst and are partnering with them in utilizing AI to help with medication reconciliation and have the medication history be more accurate with more data coming in,” Johnston said. “They’re able to infer medications from prescriptions when there’s incomplete information in the database.”
Baptist Jacksonville has also been an early adopter of Epic’s AI in-basket solution, she said. “But I tell the doctors: It’s not going to solve all of the world’s in-basket problems.”
Most importantly, it cannot make clinical decisions—it queues up a message that might be an appropriate reply to a patient, but it still needs to be reviewed, Johnston said.
DrFirst’s Chief Medical Officer Colin Banas, M.D., M.H.A., agreed that a clinical review is an important part of any AI-powered process.
“I’m the guy who always says, let’s call it augmented intelligence because it is really augmenting the clinical work. It’s certainly not meant to replace people,” he said. “Help me write my note, help me respond, but don’t make decisions.”
“There are some amazing use cases out there, but you have to be deliberate,” Johnston said. “You have to have a significant amount of governance around AI, and you have to understand your staff’s comfort level for moving forward with some of these solutions.”
Think about the bandwidth of nurses and other clinical users too, she added, not just that of your IT department. “We’re throwing so much new technology at them at one time that our adoption rates can be low,” she said.
“But in 10 years,” Banas said, “I think we’re going to look back and say, ‘how did I ever practice without this?’”
Another major consideration is balancing the need for technological innovation with concerns about data privacy and security.
“Every new application, every new piece of software, every new algorithm that touches our system and our data, has to undergo a thorough privacy, security, and legal review,” Lancaster said.
“AI is no exception, and it’s probably scrutinized even more,” he added. “Especially things that are tied to the large language models that had some issues with hallucinations early on. But it’s something we feel we need to do to protect ourselves and protect our patients.”
Banas pointed to the U.S. House of Representatives hearings on AI in healthcare late last year, where questions focused largely on data privacy and security.
“Another way to look at it is that the genie is out of the bottle and this thing is racing like Mach 10 down the hill. The gains might be so impressive that you can’t really slow it down,” he said.
But if anything could slow it down, it would be money.
“Funding is very tight right now,” Lancaster said, so it’s important to be “really looking at those ROIs very carefully and not just bringing on something because it’s a shiny new object.”
Interested in the full discussion? Watch the recording here.