

BMA warns GPs of AI risks as regulation currently in ‘state of flux’


The BMA has warned GPs of the potential risks when using artificial intelligence (AI) while regulations are still in a ‘state of flux’. 

In interim guidance, the GP Committee told GP practices they must have ‘absolute clarity around the use of confidential patient data’ when using AI software.

It also said the GPC is pushing for ‘any necessary regulation’ around AI to take place ‘at a national level’, and for GPs to have ‘protections’ if these technologies are to be adopted more widely.

The guidance stressed that Data Protection Impact Assessments (DPIAs) must be completed before any patient data processing considered ‘high risk’ occurs. 

GPs were also advised to ‘ensure they have appropriate indemnity’ and to use the Yellow Card Reporting System for AI technologies that are medical devices if their outputs ‘adversely affect’ patient care. 

The BMA has published this advice ahead of ‘more substantial guidance’ given the ‘renewed focus on the role of AI in general practice’ in recent weeks. 

Last week, NHS England published guidance promoting the use of ‘ambient scribes’, which also said GP practices ‘may still be liable’ for clinical negligence claims arising from the use of AI products. 

The GPC recognised the important role ‘evolving technologies’ can play in a GP’s daily work, especially as it is increasingly possible to integrate AI software with GP clinical systems. 

‘However, we feel it is important to make it clear that there are risks associated with the use of technologies, especially if they are to be considered medical devices, and appropriate regulatory approval must be in place before clinical use occurs,’ the guidance said. 

On patient data, it said: ‘It is important to have absolute clarity around the use of confidential patient data, where it is transferred, when being processed, and where it may later be stored, and if it is made available for secondary purposes.’

GPC IT leads also said: ‘In summary, practices, as data controllers, need to understand the risks they may be taking on if using such AI technologies, particularly at this early stage when the regulatory landscape is in a state of flux.

‘In the coming months we will be working with external bodies to ensure any necessary regulation occurs at a national level and that GPs have the protections they require if these tools are to be adopted more widely, ensuring at all times that patients maintain their high level of trust in their GP.’

Earlier this year, GP practices in one area were warned against using AI without seeking approval from their ICB first.

However, GP leaders have voiced their concerns regarding the developing use of AI in general practice, with LMCs voting in favour of a motion last year which said that ‘only a doctor with full training and appropriate levels of experience will be effective to challenge an AI’.

And a medical defence organisation warned GPs not to use AI for complaint responses due to the risk of ‘inaccuracy’ or ‘insincerity’.

READERS' COMMENTS [1]


John Kilpatrick 6 May, 2025 8:57 pm

For anyone who’s interested, a DPIA is necessary but not sufficient to use a new digital product in the NHS. Under current legislation, it would also need a DCB0160 (clinical safety workup), and if it is contributing to clinical decision-making it would need to be registered as a medical device with the MHRA.