NHS must resist the urge to replace GPs with AI

Pulse editor Sofia Lind warns against using artificial intelligence as a substitute for GPs

Artificial intelligence has just outscored GPs in MRCGP-style exams. Four large language models answered 100 AKT-style questions with scores of 95% to 99%, compared with the average GP mark of 73%.

It is a striking result, but one to treat with caution. As the RCGP pointed out, there is a reason why the MRCGP has three components and the AKT is only one of them. Arguably more important than a knowledge test (a measure of recall and applied knowledge) are the Simulated Consultation Assessment and the Workplace Based Assessment. I do wonder how AI would cope with those.

The researchers themselves stressed that AI should be seen as a support, not a substitute. And yet, I worry the temptation for policymakers will be to over-promise. Wes Streeting has already said that every patient will have an AI ‘companion’ on the NHS App. I can see how that might be useful for self-care or simple triage (unless programmed to be overcautious).

We have seen this mistake before. Physician associates were intended to work in a supporting role, but the Leng Review found that they have instead been used as doctor substitutes. The result has been confusion, patient safety concerns and anger from the profession. I fear AI risks going the same way if its limits are ignored.

That is not to say AI should be dismissed. As decision support, it can retrieve specific facts and guidance more effectively than the average search engine (and I am told it has Dr House levels of diagnostic genius on the rare occasion you’re racking your brains over a thorny clinical scenario). For trainees, it might be an effective revision tool, showing where their knowledge is strong and where it needs work. And as Pulse has reported, ambient scribes that draft consultation notes could hand back precious minutes. But here too, there are big questions about accuracy, medico-legal responsibility and whether patients are comfortable with an algorithm ‘listening in’.

The Government’s 10-year plan places heavy emphasis on AI to ease workload and tackle backlogs. I understand why: patients want access, and the Government doesn’t want to fund enough human GP capacity to go round. But impressive exam scores are not the same as safe, sustainable care. What matters in general practice is not just knowledge recall. It is weighing evidence against circumstance, managing uncertainty, and understanding the patient in front of you. No algorithm can replicate that.

AI may soon become part of everyday practice – but it cannot take the place of a GP consultation. Policymakers must recognise this, and resist the urge to confuse support with substitution.

Sofia Lind is editor of Pulse. Find her at [email protected] or on LinkedIn 


READERS' COMMENTS [3]


Mark Howson 20 August, 2025 6:14 pm

The AI has access to the whole internet. That’s way better than an open-book exam. I wonder how registrars would do with an open-book exam for the AKT – probably about the same as the AI.
I am actually surprised the AI did so badly when it has access to the whole internet – i.e. all the answers.

So the bird flew away 20 August, 2025 6:49 pm

My mate, Kev, has this party trick where he recites pi to 30 decimal places while lighting his farts. He calls it anal-itical intelligence.
There is no such thing as “artificial intelligence” – it’s a made-up phrase to capture something about fashions in advances in silicon chip manipulation.
Intelligence, as psychologists and philosophers conceive it, has certain dimensions, e.g. self-awareness and consciousness.
If we, Homo Stupidus, the tool-using mammal, want usefulness from chip tools, then the starting point should be “how can this tool help us do our thing” – e.g. building bridges, cleaning sewers, writing music, a GP’s job – and not “how can we replace” those who build bridges, clean sewers, etc.
In other words, “artificial assistance”, or AA.
Apart from the arguments of Gary Marcus, Geoffrey Hinton and others, and hugely important undecided issues around sovereignty, security, data ownership, LLM energy use etc., we should not allow our future to be aggressively sold to us by the snake-oil lies and money-greed-worship of Elon, Jeff, Sam, Peter etc. in pursuit of profits for Big Tech/billionaire/crypto/corporate-backed AI, whether US- or China-owned.
The future of AA should be designed around publicly owned, secure, open-source, locally assistive tools… that’s my argument, anyway… and Kyle and the Blairite Labour party are going about it all the wrong way.

Vicky Cleak 20 August, 2025 6:49 pm

Who at the ‘NHS’ must resist this? When stupid edicts come out and we are told ‘the NHS says’, I have started asking ‘who in the NHS?’, because it is always a person or persons.
I pray that the royal colleges come together, have oversight of any attempts to do this, and push back hard if and when it is appropriate. And if the question is who at the royal colleges – it’s all the named presidents. Don’t let us down.