A majority of Americans would be uncomfortable with their health care provider relying on artificial intelligence (AI) to diagnose and treat them, according to a new poll.
The survey from the Associated Press-NORC Center for Public Affairs Research found that 56 percent of respondents said they would feel “very” or “somewhat” uncomfortable if AI were used in their medical care. Only 37 percent said they would feel comfortable with it.
The findings come as more hospitals are turning to AI technology for diagnosis and treatment decisions, particularly when it comes to imaging tests like X-rays and CT scans. While some experts say this can help reduce errors and improve accuracy, others worry about the potential risks associated with using such technology without proper oversight or regulation.
“We’re seeing a lot of enthusiasm around AI in healthcare,” said Dr. David Magnus, director of Stanford University’s Center for Biomedical Ethics. “But there is also concern about how these systems will be deployed responsibly.”
Magnus noted that while AI has been shown to outperform humans in certain areas — such as diagnosing skin cancer — there is still much we don’t know about its capabilities when it comes to complex medical conditions like heart disease or mental illness. He added that even if an algorithm is accurate most of the time, there could still be cases where it fails due to unforeseen circumstances or bias within the data set used by the system.
“There needs to be transparency around how these algorithms work so people can understand why they’re making certain decisions,” he said. “And there should also be safeguards in place so that any mistakes made by an algorithm can quickly be identified and corrected.”
The AP-NORC poll found similar levels of discomfort among men and women when respondents were asked whether they would trust an AI system over a human clinician: 57 percent were uncomfortable trusting an AI system over a doctor, compared with 38 percent who felt comfortable doing so; 54 percent were uncomfortable trusting one over a nurse, compared with 40 percent comfortable; and 58 percent were uncomfortable trusting one over another type of health professional, compared with 35 percent comfortable.
In addition, nearly two-thirds (64 percent) expressed concerns about privacy issues related to having their personal information stored on computers operated by companies outside their own health care providers — suggesting many may not yet fully understand what data is being collected or shared through these systems.
Despite the uncertainty surrounding artificial intelligence in health care, experts say its use will only continue to grow as more organizations look for ways to cut costs while improving patient outcomes. But before any widespread adoption takes place, Magnus says, regulators need to step up efforts to ensure safety protocols are put into place — something he believes could take years given the current resources available at federal agencies like the Food and Drug Administration.
As Magnus puts it: “We have no choice but to move forward cautiously.”