Artificial intelligence and other digital and online technologies could offer huge benefits to patients, the NHS and the economy. How we go about adopting these technologies will determine whether those benefits are realised. Done badly – like earlier IT initiatives – the NHS and its patients could again end up substantially worse off.
It doesn’t need to be that way. Alder Hey Children’s Hospital in Liverpool, instead of trying to use technology to replace doctoring, has developed AI in-house from the ground up, with external and NHS England sponsorship – from interactive apps for young patients, to improving logistics and supply chains, and AI-assisted imaging. This is an excellent example of AI developed from and within a medical framework, rather than one imposed on top of existing structures.
The NHS is in fact a tenacious adopter of good technology – from stem cells, genetics and immunotherapy to robotic surgery – where effectiveness and safety have been established before roll-out. But commercial influences are now shaping how our medical care is to be delivered like never before. New technology is largely being developed and lobbied for by private companies, outside our health research environment.
Health secretary Matt Hancock has expressed support for adopting this type of technology into the NHS in the form of a disruptive marketplace exercise, rather than as a maturation of the system.
But this approach has real risks for patients and the health service. For example, AI triage ‘chatbots’ and other health app software used by online GP consultation providers like Babylon are registered as Class I medical devices, meaning they are not assessed, approved or certified by the UK Medicines and Healthcare products Regulatory Agency (MHRA), or by anyone else – and there is no requirement for external validation of their effectiveness, safety or clinical efficiency. Nor are such companies subject to public transparency requirements for disclosure of information.
And although chatbot disclaimers typically state that they are not providing medical advice or diagnosis, Babylon’s recent ‘research’ paper clearly defined the product as an ‘AI powered Triage and Diagnostic System’. How can a diagnosis not constitute medical advice? Following this unvalidated research and widely publicised claims of accuracy equal to a human doctor, simple real-world tests of Babylon’s current symptom checker prompted questions from the MHRA over missed and mischaracterised life-threatening diagnoses, including heart attack.
Far from deregulating the market, we should expect to see AI like this tested independently and peer-reviewed before it receives the red carpet treatment from the health secretary and NHS England.
With AI patently in its infancy, it seems premature to target the doctor-patient interface instead of aiming first to improve efficiency in operational systems such as supply chain, pharmaceutical procurement or bed management. This is clearly more about selling apps to millions than providing demonstrable benefits.
For chatbots to improve, they need access to our medical records. But while the Department of Health and Social Care’s recent policy paper considers aspects of digital regulation including safety, security and confidentiality of medical data, industry will be reassured that it promises not to build services nationally where possible, and not to close off the market – focusing instead on ‘removing barriers’ to the uptake and spread of innovations.
Health secretary Matt Hancock has signalled he is content for legislation to catch up as technology is rolled out, but in his haste he is putting the market above patient safety and confidentiality. I agree with Helen Stokes-Lampard’s warning that the hype has preceded the product.
Most hothouse innovations – and politicians – will fail to deliver on promises. The NHS must develop and deploy new technologies cautiously; using patients for market-testing of over-hyped or substandard medical products is not acceptable.
Dr Nick Mann is a salaried GP in Hackney