Yes: Artificial intelligence can diagnose and treat patients
The Government’s attempts to increase the GP workforce will take time to work, if they ever do, but technology offers a way of easing the recruitment crisis in the near future.
Chatbots are programmed to have conversations with human users. The bot has to be able to handle all manner of language variations, including local dialects and slang, and this becomes especially tricky when patients are describing medical symptoms. The bot uses key artificial intelligence (AI) techniques to understand patients and master language skills. But for a chatbot to function like a doctor – and not just talk like one – we need to build the equivalent of a whole brain.
Babylon has already built an AI-powered symptom checker capable of accurately triaging patients. It has been used hundreds of thousands of times. Since its introduction, we have seen a 40% reduction in patients needing a same-day GP appointment because the chatbot has signposted them to more appropriate services, such as self-management or pharmacy. This chatbot is now being piloted as an alternative to calling NHS 111 in north-west London.
We are also looking at using AI to diagnose conditions. It is the accuracy of this type of innovation that makes me believe chatbots will be able to replace many of the functions I carry out as a GP in a typical consultation. A chatbot can: take a history; record observations like temperature and blood pressure; use sound and image recognition software to auscultate a chest; and assess a tympanic membrane via devices that attach to the patient’s smartphone. It can then use this information to diagnose and treat the patient according to evidence-based practice – and write up the notes.
Of course there will be patients for whom a chatbot is not the answer – but that’s the point. The technology will free GPs to care for those with complex needs. Meanwhile, chatbots may even be better for picking up a patient’s hidden agenda – those ‘just one more thing, doctor’ consultations – as patients can share their concerns at any time.
Chatbots could even transform healthcare more broadly. We are not just building the digital equivalent of a GP brain, but also the digital equivalents of many other brains, powering the chatbot to carry out a range of healthcare functions. This has enormous potential in health systems globally.
Dr Mobasher Butt is a GP and medical director at Babylon Health
No: AI can never replace a clinical consultation
Technology has undoubtedly benefited general practice. It is now normal to use search tools to access information swiftly both inside and outside the patient record. Patient safety is improved by clinical computer systems providing prompts about drug interactions and allergies. And who would prescribe warfarin these days without an algorithm?
And patients’ use of health apps will grow exponentially. They could improve care once a diagnosis and treatment plan has been agreed by a doctor, and they will enable patients to self-monitor.
But a future where AI replaces the need for doctors is far-fetched. In reality, the apps will demand more GP time, as we validate the AI’s decisions and search for the pertinent trees in the data forest.
More than that, although computers can sift through signs and symptoms more quickly than a clinician, they will never be able to provide holistic care. AI cannot use non-verbal cues to tell the difference between a patient underplaying or overplaying their symptoms. It cannot empathise with the patient, offering comfort and a better understanding of their problems. It cannot negotiate a treatment plan. It cannot provide compassion.
Babylon’s AI-powered chatbot triage service has recently been chosen for use in NHS 111. It will use patients’ responses to programmed questions to decide whether referral is needed. Although it has been heralded as the future, at best it will replace non-clinical call handlers – it can never mimic a GP consultation. Its unavoidably risk-averse approach will not improve on NHS 111’s already high rate of disposals to hospital or general practice, and may well worsen it.
The growing use of AI is also associated with confidentiality concerns. Development requires an ever-increasing database of patient information. This will be highly attractive to commercial companies. The Information Commissioner condemned the use of Google’s AI program DeepMind by the Royal Free London NHS Foundation Trust for breaching the Data Protection Act.
I look forward to the improved efficiency and safety that continuing development of computer systems will bring. But these will not save clinical time and the replacement of doctors by the likes of a Star Trek emergency medical hologram will remain science fiction.
Dr Grant Ingrams is a GP in Leicester and former GPC IT subcommittee chair