Almost three in 10 GPs in the UK are using artificial intelligence tools such as ChatGPT during consultations with patients, even though this could lead to them making mistakes and ending up in court, a study shows.
The rapid adoption of AI to ease workloads comes at the same time as a “Wild West” lack of regulation of the technology, leaving GPs unsure which tools are safe to use. That is the conclusion of the Nuffield Trust, based on a Royal College of General Practitioners survey of 2,108 family doctors about artificial intelligence, as well as focus groups with GPs.
Ministers hope AI will help reduce the delays patients face when seeing a GP.
The study found that growing numbers of GPs are using AI to write summaries of patient consultations, help diagnose patients' conditions and carry out routine administrative tasks.
Overall, 598 (28%) of the 2,108 survey respondents said they were already using AI. Male GPs (33%) use it more often than female GPs (25%), and it is much more likely to be used in affluent areas than in poorer ones.
It is quickly becoming more widespread. However, the vast majority of GPs, whether they use it or not, are concerned that practices using it could face “professional liability and medico-legal issues”, as well as “risks of clinical error” and issues of “patient confidentiality and data security”, the Nuffield Trust report said.
“The government is pinning its hopes on the potential of AI to transform the National Health Service. But there is a huge gap between policy ambitions and the current disorganised reality of how AI is being adopted and used in general practice,” said Dr Bex Fisher, a GP who is the think tank's director of research and policy.
“It is very difficult for GPs to feel confident using AI when they are faced with a wild west of tools that are not nationally regulated in the NHS,” she added.
While some NHS regional integrated care boards support the use of AI by GPs, others prohibit it.
In a blow to ministers' hopes, the research also found that GPs are using the time they save to recover from the stress of their busy days rather than see more patients. “While policymakers hope this saved time will be used to arrange more appointments, GPs report using it primarily for self-care and relaxation, including cutting back on overtime to prevent burnout,” the report adds.
A separate study looking at how GPs in the UK are using AI, published last month in the journal Digital Health, reached similar conclusions. It found that the proportion of GPs using AI had risen from 20% to 25% over the previous year.
“In just 12 months, generative AI has gone from taboo to tool in British medicine,” said Dr Charlotte Blease from Uppsala University in Sweden, lead author of the study.
Like the Nuffield Trust, it highlighted the lack of regulation as a key issue, especially given the speed at which GPs are adopting AI into their clinical practice. “The real risk is not that GPs use AI, but that they do so without training or supervision,” Blease said.
“AI is already being used in everyday medicine. The challenge now is to ensure it is used safely, ethically and openly.”
An increasing number of patients are also using AI to improve their care, including when they have been unable to get an appointment, according to Healthwatch England.
“Our recent research shows that while patients continue to trust the NHS for health information, around one in ten (9%) use artificial intelligence tools to get information on how to stay healthy,” said Chris McCann, deputy chief executive of the patient watchdog.
“There are various reasons why people might turn to AI tools, including when they are unable to access GP services. However, the quality of recommendations from AI tools is inconsistent. For example, one person received advice from an AI tool that confused shingles with Lyme disease.”
A commission set up by the government in September to examine how to ensure the safe, effective and properly regulated use of AI will make recommendations in its report.
The Department of Health and Social Care has been contacted for comment.