GPs need to be ‘careful’ of liability risks when using AI, expert warns

GPs using artificial intelligence (AI) need to be aware of liability risks and maintain their ability to ‘critically appraise’ the technology, a digital health expert has warned.

Speaking at the Pulse LIVE conference last week, Dr Annabelle Painter, an honorary digital health fellow at Imperial College and GP registrar, said GPs could be ‘the ones that hold the can’ when AI gets decisions wrong.

The session covered how AI might change the general practice landscape over the next few years, and what GPs need to do to harness it safely. 

Dr Painter urged attendees to become familiar with AI technology as that knowledge may become essential in future. 

‘People worry about clinicians being replaced by AI. I’m here to tell you that at least in the short term, clinicians are going to be replaced by clinicians who can use AI, and so you want to become one of them,’ she said. 

However, Dr Painter warned that risks remain even when AI software is ‘explainable’ – meaning clinicians can understand how it reaches its conclusions – because clinicians kept in the loop are still exposed to ‘automation bias’, the tendency to trust a technology because it is usually correct.

Dr Painter said: ‘This is what I’m worried about: liability. If we as clinicians are in the loop of these decisions with explainable AI, that are really quite good, but not always good, and we’re subjected to automation bias, then we are still the ones that hold the can when that decision goes wrong. I think we need to be really careful about this.’

She added that there is ‘a lot of grey area’ around liability as there is no case law. ‘So we don’t actually know how this is going to land in terms of clinical liability and product liability,’ she said. ‘But I want you to all just be aware that you need to keep your ability to critically appraise.’

Patient safety should always be at the ‘forefront’ of clinicians’ minds when they use AI in healthcare settings, Dr Painter argued.

She said: ‘I think we just need to have our wits about us as clinicians and making sure that we’re always putting patient safety at the forefront, but retaining our ability to critically appraise outputs, and not just assume that this technology is either correct, or that it would be blamed if those decisions are wrong – you have to remember that is going to fall on you.’

Earlier this year, in Pulse’s final print issue, Dr Painter explored how AI could help primary care in future, and how GPs may become a ‘translator and adviser’ for patients, based on ‘AI-derived insights’. 

Last year, a study claimed that ChatGPT may produce more empathetic responses to patient questions than doctors do.

And Dr Painter told Pulse LIVE attendees that empathy was the area where Google’s AI software, the Articulate Medical Intelligence Explorer (AMIE), ‘outperformed’ clinicians the ‘most’.

An independent review into pulse oximeters recently concluded that immediate action is needed to address biases built into medical devices that impact ethnic minorities and increase health inequalities. 

The chair of the review warned that while the use of AI in these medical devices could ‘bring great benefits’, it could also ‘bring harm through inherent bias against certain groups in the population’.

The Labour Party has promised to make better use of AI and other technologies to bring the NHS ‘into the digital age’ and help relieve the workforce crisis. 

In October, the Government announced a £30m fund to speed up adoption of new health technology in the NHS, with the former health secretary Steve Barclay arguing that AI ‘has the potential to transform our healthcare’. 

READERS' COMMENTS [3]

Not on your Nelly 5 April, 2024 4:27 pm

AI or PA?

So the bird flew away 5 April, 2024 6:03 pm

Happy to work with AI if it comes box-size. I can use it as a door stop. But would I be liable if a patient tripped over it?

Finola ONeill 8 April, 2024 12:39 pm

‘People worry about clinicians being replaced by AI. I’m here to tell you that at least in the short term, clinicians are going to be replaced by clinicians who can use AI, and so you want to become one of them,’
Seems quite a stretch when none of us have any contact with AI use yet? It’s certainly what the government and the Labour party seem to want.
It’s driven by the private healthcare industry. GP surgeries, clinicians, even procedures, especially operations, aren’t the big bucks.
That’s digital healthcare, including AI, diagnostics and the pharmaceutical industries.
If anything, clinicians, by using clinical skills, reduce the indiscriminate use of diagnostics and use them in a targeted and rational way.
The deliberate deskilling of clinical care, using less skilled and trained clinical points of contact, and the drive and hype around diagnostics (cf the community diagnostic centres), is a push towards a more high-tech, low-clinical-knowledge model of care.

No analysis of which is more effective or cost-effective; these new models, new clinical skillsets managing patients, the emphasis on diagnostics and diagnostic facilities, the emphasis on ‘digital solutions’ (none of which have even been presented or trialled – do they actually exist, or are they a wish list?) are all part of the drive to remove the clinician, ie the doctor, and replace them with some kind of deskilled automated service.
It will be costly and ineffective; but that’s not the point, is it?
The chase for GP records and access to them is a large part of this pursuit. GP records are the only useful, detailed, Read-coded set of patient data. They are going to need that data to develop the AI tools on.
Bit of a dystopian world we are living in. Glad I’m far into my career. It’s all looking a bit grim to me.