It was a warp-speed tour of what’s happening with artificial intelligence in medicine.
Chris Manrodt, an R&D manager for Philips’ medical imaging business in Plymouth, last Friday gave a presentation to several hundred Twin Cities software developers and health care executives and then declared, “I feel like I’ve said about 50 controversial things, so let me take your questions.”
It was the opening session of a daylong gathering of about 1,200 people from the local data science community, the first time in five years that MinneAnalytics, the local association of software developers, staged a conference devoted to health care and medical technology.
“The promise is there. The challenge we face is that our track record kind of sucks,” Manrodt said. “I don’t want to pick on anybody in particular, but I’ll point to this headline: ‘AI failed to live up to its potential in the pandemic.'”
In short, don’t expect AI to replace doctors.
“The opportunity to turn the data from the administration of health care into the care of patients is actually a much wider gap than I think any of us anticipated back in the middle of the last decade,” he said.
Like many people, I’m frequently confused by what I read and hear about AI. Seeing investors pour so much money into companies associated with it gives me fear of missing out, not just with my own investments but simply in understanding what is going on.
My takeaway from Friday’s conference is that medtech developers are working on some great devices to bring down the cost of diagnosing illness and disease.
Executives from Twin Cities-based startup VoxCi Health described a device that would detect disease by sensing chemicals in a person's exhaled breath. Its initial target market is patients suspected of having lung cancer.
Nashville-based Peerbridge Health promoted a small wearable device that measures cardiac output, potentially replacing the need for people to go to a hospital or clinic for electrocardiogram (ECG) tests.
It seems, though, that it will be a long time before AI can provide diagnoses or recommendations. In his speech and in a conversation afterward, Manrodt made it clear that people like him are at the start of a long climb. With so much hype swirling, I found his perspective helpful to hear.
“Health care has been the place where it has been most difficult to get AI to really make an impact,” he said.
Generative AI, the kind that can create things like conversations, stories and images, needs really good data to build the connections required to make diagnoses when someone is sick.
Unfortunately for anyone building an AI model, people seek health care in differing, unpredictable ways. There's no way to track what makes people decide to go to a doctor, hospital or clinic in the first place.
“There is a complex set of social, psychological and economic factors before you decide to seek care, and data collection in health care starts after you have sought care,” Manrodt said.
Then, after a visit to a doctor or hospital, people also behave in different ways. Some will go back when their physician says to, and others won't. Many doctors never learn how well their patients fare after a visit.
“Most of the time when the patient leaves the care setting we don’t know whether they’ve improved or not, unless they come back and tell us,” Manrodt said. For an AI model to be useful, he added, “The data on outcomes has to be good.”
One area where AI is moving quickly in health care, he said, is radiology. Artificial intelligence models are being trained to analyze images in many fields. He also said that AI may prove useful in helping doctors and nurses reduce errors in caregiving, akin to a collision warning system in a car.
“You still have to hit the brakes,” Manrodt said. “But if you get the collision warning, you know something’s up. If we even save one more life with something like that, it’s worth it.”
An adviser to the University of Minnesota’s Carlson School of Management, Manrodt said he’s been amazed by colleagues in other industries who also work with the school’s faculty and students on AI.
He reminded me of an announcement Cargill made a few years ago about using facial recognition technology with cattle to track their feeding and health. The data scientists pushing that technology forward have at least one big advantage over those trying to improve human health.
“You don’t have to get a consent form from any of those cows,” Manrodt said.