May 12, 2022 — Artificial intelligence has gone from science fiction to everyday reality in just a few years, powering everything from online recommendations to driving assistance — and yes, even medical diagnoses. But that doesn’t mean people are ready to let AI guide all of their medical decisions.
Technology is rapidly evolving to help guide clinical decisions across more medical specialties and diagnostics, especially when it comes to spotting abnormalities during a colonoscopy, in breast imaging, on the skin, or on an X-ray.
New research explores what patients think about the use of AI in healthcare. Sanjay Aneja, MD, and colleagues at Yale University surveyed a nationally representative group of 926 patients about their comfort with technology, concerns, and general opinions about AI.
It turns out that patient comfort with AI depends on how it is used.
For example, 12% of respondents were “very comfortable” and 43% were “somewhat comfortable” with AI reading chest X-rays. But only 6% were very comfortable and 25% were somewhat comfortable with AI making a cancer diagnosis, according to survey results published online May 4 in the journal JAMA Network Open.
“Having an AI algorithm that reads your X-ray…it’s a very different story than relying on AI to diagnose a malignancy or to announce that someone has cancer,” says Sean Khozin, MD, who was not involved with the research.
“What’s really interesting is that… there’s a lot of optimism among patients about the role of AI in making things better. That level of optimism was great to see,” says Khozin, an oncologist and data scientist who serves on the executive committee of the Alliance for Artificial Intelligence in Healthcare (AAIH). AAIH is a global advocacy organization in Baltimore that focuses on responsible, ethical, and reasonable standards for the use of AI and machine learning in healthcare.
All in Favor of AI, Say ‘Aye’
Most people had a positive overall opinion of AI in healthcare. The survey found that 56% believe AI will improve healthcare over the next 5 years, compared to 6% who say it will make healthcare worse.
Most work on medical AI focuses on the clinical areas that could benefit the most, “but we rarely ask where patients really want AI to impact their healthcare,” says Aneja, lead author of the study and an assistant professor at the Yale School of Medicine.
Ignoring patient perspectives leaves an incomplete picture.
“In many ways, I would say our work highlights a potential blind spot among AI researchers that will need to be addressed as these technologies become more mainstream in clinical practice,” says Aneja.
It remains unclear to what extent patients know or realize the role that AI is already playing in medicine. Aneja, who assessed AI attitudes among healthcare professionals in previous work, says: “What became clear when we surveyed patients and doctors is that transparency is needed regarding the specific role that AI plays in the treatment of a patient.”
The current survey shows that around 66% of patients think it is “very important” to know when AI plays a large role in their diagnosis or treatment. Additionally, 46% think this information is very important even when AI plays a small role in their care.
At the same time, less than 10% of people would be “very comfortable” getting a diagnosis from a computer program, even one that makes a correct diagnosis more than 90% of the time but is unable to explain why.
“Patients may not be aware of the automation that has been built into many of our devices today,” Khozin says. Examples include electrocardiograms (tests that record electrical signals from the heart), imaging software, and colonoscopy interpretation systems.
Even if they don’t know it, patients likely benefit from using AI in diagnosis. An example is a 63-year-old man with ulcerative colitis living in Brooklyn, NY. Aasma Shaukat, MD, a gastroenterologist at NYU Langone Medical Center, performed a routine colonoscopy on the patient.
“As I was concentrating on taking biopsies in the [intestines], I didn’t notice a 6-mm [millimeter] flat polyp… until the AI warned me,” she says.
Shaukat removed the polyp, which contained abnormal cells that could have become precancerous.
Responding to AI Anxieties
Yale’s survey found that most people were “very concerned” or “somewhat concerned” about possible unintended effects of AI in healthcare. A total of 92% said they would be concerned about misdiagnosis, 71% about an invasion of privacy, 70% about spending less time with doctors, and 68% about increasing healthcare costs.
A previous study by Aneja and colleagues published in July 2021 focused on AI and medical liability. They found that physicians and patients disagree on responsibility when AI leads to clinical error. Although most physicians and patients believed physicians should be held accountable, physicians were more likely to want to hold health care providers and organizations accountable as well.