ChatGPT can rewrite clinic letters from consultants to make them more easily understandable to patients without losing key information and potentially free up GP time, a study has found.
If adopted, the approach has the potential to free up time in general practice as patients were less likely to book appointments with their GP to have clinic letters translated, the study’s lead researcher told Pulse.
The study, published in BJGP Open, assessed the use of artificial intelligence (AI) on 23 clinic letters from eight specialties. ChatGPT was given a prompt to convert them to more easily understood language for a UK patient with an average reading ability.
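The study's exact prompt is not reproduced here, but for readers who want to automate the same step, the request can be sketched as a chat-completion payload. The prompt wording and model name below are illustrative assumptions, not those used in the study:

```python
import json

# Illustrative only: the prompt wording and model name are assumptions,
# not those reported in the BJGP Open study.
PROMPT = (
    "Rewrite the following clinic letter in plain language that a UK patient "
    "with average reading ability can understand. Do not remove or add any "
    "clinical information.\n\n"
)

def build_request(letter_text: str) -> dict:
    """Assemble a chat-completion style request payload for one letter."""
    return {
        "model": "gpt-4o",  # hypothetical model choice
        "messages": [{"role": "user", "content": PROMPT + letter_text}],
    }

payload = build_request("Dx: T2DM. Commence metformin 500 mg od.")
print(json.dumps(payload, indent=2))
```

A payload like this could be sent via any chat-completion API client; the study itself used ChatGPT directly.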
A doctor then assessed the resulting letter to check there had been no loss of clinical information. Patient representatives assessed both letters for clarity.
The AI-generated letters were longer, but patients rated them significantly better for understanding of diagnosis or medical condition.
Patients also rated the AI letters easier to understand with respect to treatment or management plans.
Patients also reported a significantly lower need for help in understanding the content of the AI-edited letters compared with the consultants’ originals, the researchers said.
Objective measures of readability found no difference between the two sets of letters, which the researchers said showed the importance of including real patients in research assessing communication.
The NHS has recommended patients receive copies of all correspondence between healthcare professionals relating to their care since 2000.
Later guidance emphasised the benefit of writing directly to the patient rather than sending them a copy of the GP letter.
The study, whose authors included Pulse’s editorial advisor Dr Keith Hopcroft, said: ‘The workload involved in creating patient-friendly letters, or patient-friendly versions of the GP letter, is potentially onerous – and perhaps explains why, in many cases, secondary care doctors still tend to simply copy patients into the letters they send to GPs.’
These are typically written in technical language with an inevitable risk of being ‘at best unclear and confusing and at worst impenetrable and worrying’.
It can mean an increase in GP workload via appointments arranged by patients to have the letters interpreted, they noted.
The study concluded: ‘Translation of letters by ChatGPT resulted in no loss of clinical information, but did result in significant increase in understanding, satisfaction and decrease in the need to obtain medical help to translate the letter contents by patient representatives compared with clinician written originals.
‘Overall, we conclude that ChatGPT can be used to translate clinic letters into patient friendly language without loss of clinical content, and that these letters are preferred by patients.’
Study lead Dr Simon Cork from Anglia Ruskin University’s school of medicine told Pulse: ‘Our analysis showed that patients preferred the translated letter and were less likely to book appointments with GPs to have clinic letters translated.
‘This has the potential to reduce the number of GP appointments required by patients receiving clinic letters, as well as empowering patients to better understand their health and treatment plans.’
He added that it is ‘really important’ that clinicians take the time to ensure all of the clinical information included in the original letter is carried over into the patient friendly version, and likewise that the programme has not included ‘any erroneous information’.
Clinicians should also take care that identifying information, such as dates of birth, names and addresses, is not uploaded in documents being translated, he added.
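A minimal pre-upload redaction pass might look like the sketch below. The patterns are assumptions for illustration only; proper de-identification of names and addresses needs far more than regular expressions, and any automated pass should be followed by a manual check:

```python
import re

# Illustrative redaction sketch: catches only obvious patterns such as
# UK-style dates and NHS-number-like digit groups (assumed formats).
# Names and addresses need proper de-identification tooling plus manual review.
PATTERNS = {
    "[DATE]": re.compile(r"\b\d{1,2}[/.-]\d{1,2}[/.-]\d{2,4}\b"),
    "[NHS NUMBER]": re.compile(r"\b\d{3}[ -]?\d{3}[ -]?\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace each matched identifier pattern with a placeholder."""
    for placeholder, pattern in PATTERNS.items():
        text = pattern.sub(placeholder, text)
    return text

print(redact("DOB: 04/07/1962, NHS no. 943 476 5919"))
# → DOB: [DATE], NHS no. [NHS NUMBER]
```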
The study warned that one limitation of generative AI is the risk of so-called ‘hallucinations’, where information is provided based on ‘inaccurate, misinterpreted or fictitious information’.
It added: ‘Such hallucinations have the potential to cause at best alarm and at worst harm to patients if used in a healthcare setting. Hallucinations are more likely to be generated when the inputted information is limited.
‘In our study, manual analysis of the generated letters revealed no loss of clinical information or hallucinations.’
A GMC survey reported earlier this year found that one in four doctors currently use AI and see its benefits for efficiency and patient care.
It follows a previous study suggesting that a fifth of GPs are already using AI, with ChatGPT the most popular tool.
Last month, NHS England said that GP practices ‘may still be liable’ for clinical negligence claims arising from the use of artificial intelligence.
The commissioner has published guidance to assist practices adopting ‘ambient scribing products’ that feature generative AI, as health secretary Wes Streeting encouraged their use.