

MHRA asks GPs to report ‘all AI inaccuracies’ to Yellow Card scheme


The UK medicines regulator has urged GP practices to report all ‘adverse incidents’ and ‘inaccuracies’ with artificial intelligence (AI) tools used in clinical practice to its Yellow Card Scheme.

The Yellow Card scheme, run by the Medicines and Healthcare products Regulatory Agency (MHRA) and established for patients to report adverse reactions to medicines, has a specific reporting service for software and AI.

The regulatory body specifically warned GP practices of the ‘risk of hallucination’ when AI tools are used for tasks like transcribing and summarising appointments.

The MHRA stressed that GPs should only use tools which are registered as medical devices, be aware of the error risk for tasks such as transcribing, and report all adverse events or suspected inaccuracies.

It said in a statement: ‘As with all AI-enabled tools, there is a risk of hallucination which users should be aware of, and manufacturers should actively seek to minimise and mitigate the potential harms of their occurrence.

‘We recommend that GPs and healthcare professionals only use tools which are registered medical devices which have been determined to meet the required standards of performance and safety.’

‘We strongly encourage that all suspected adverse incidents, including suspected inaccuracies, are reported to the MHRA via the Yellow Card Scheme.’

However, the MHRA also confirmed that, to date, no adverse incident reports related to AI scribes had been identified in a search of its Yellow Card scheme database.

MHRA guidance defines an ‘adverse incident’ involving software as ‘an event that caused (or almost caused) an injury to someone or affected the treatment or diagnosis one could receive’.

It lists as examples: imprecise results, inadequate quality controls, inadequate calibration, and false positive or false negative results.

MHRA Yellow Card software reporting

An adverse incident in relation to a medical device is an event that caused (or almost caused) an injury to someone or affected the treatment or diagnosis one could receive.

Practical examples of problems with software include:

  • imprecise results

  • inadequate quality controls

  • inadequate calibration

  • false positive or false negative results.

For self-testing devices, a medical decision may be made by the user of the device who is also the patient. This also includes devices where an indicative diagnosis may be given to a user who may go on to seek, or otherwise not seek, additional medical advice because of the device output. This may be of particular relevance to symptom checkers and transdiagnostic assessment tools for mental health conditions.

Medical devices, including software, can be found to be defective or to have an inherently unsafe design. Anything that could cause an adverse incident, or that raises a safety concern, can be reported through the Yellow Card scheme.

Source: MHRA Yellow Card scheme

NHS England guidance on the use of AI-enabled ambient scribing products says more sophisticated transcribing software would likely come under the MHRA’s remit as a medical device.

It says: ‘Ambient scribing products that inform medical decisions and have simple/low functionality (for example, products that solely generate text transcriptions that are easily verified by qualified users) are likely not medical devices. 

‘However, the use of Generative AI for further processing, such as summarisation, would be treated as high functionality and likely would qualify as a medical device.’ 

The MHRA has published its own guidance on determining this. 

RCGP chair Professor Kamila Hawthorne said the College was aware scribes may produce ‘inaccurate or fabricated details’ or ‘misinterpret the nuance of conversations’.

She said: ‘GPs are always open to introducing new technologies that can improve the experience of patients and help cut the administrative burden, and an increasing number of GP practices are now using AI scribing tools to improve the quality and efficiency of their consultations.  

‘While these tools can offer real benefits, particularly at a time of significant GP workforce pressures, there are some important concerns – particularly around data security of sensitive patient records, data controllership and the risk of inaccuracies. 

‘We are aware that AI scribes can produce inaccurate or fabricated details, and that they can also misinterpret the nuance of conversations. It is important that clinicians review AI-generated documentation for accuracy before adding it to the patient record. 

‘The College is continuing to assess this issue with our Health Informatics Group. We would welcome further guidance for general practice from the Government ahead of their proposed technological updates to healthcare, as described in the 10-Year Health Plan.’ 

The Government’s 10-year plan for the NHS said it would support the rollout of AI transcribing tools in GP practices over the next two years. The plan used figures from ‘local trials’ that showed ambient voice technology (AVT) saved one to two minutes per GP appointment.  

It said the Government would undertake ‘a framework procurement process’ and support GP practices so that they ‘can adopt this technology safely’. Meanwhile, the BMA has warned GPs of the potential risks of using AI technology while regulations are still in a ‘state of flux’.

GP practices have been advised to appoint a ‘clinical safety officer’ (CSO) who has two days’ training before using any medical transcription tools – amid confusion around NHS England guidance. 

A letter issued in June by NHS England further developed its guidance on the use of AVT and stated that ‘all NHS organisations must ensure that any AVT solutions being used meets the specified NHS standards’. 

An NHS England spokesperson told Pulse: ‘Ambient Voice Technology has the potential to transform care and improve efficiency, and the NHS has issued guidance to support its use in a safe and secure way.’ 
