NHS England has decided to pause a ‘ground-breaking’ project using GP data to train an artificial intelligence (AI) model, following concerns raised by GP leaders.
The BMA and RCGP said they were not aware that GP data, collected for Covid-19 research, was being used by NHS England to train an AI model, Foresight.
According to NHSE, the Foresight project represents ‘a ground-breaking AI initiative’ to ‘transform predictive healthcare in the UK’.
It is an AI model ‘specifically designed for healthcare’ that has been trained in the NHS England Secure Data Environment (SDE), a secure data and research analysis platform, on de-identified NHS data from approximately 57 million people in England.
Researchers working on the model said that Foresight is being trained on ‘routinely collected’, de-identified NHS data, ‘like hospital admissions and rates of Covid vaccination’, to predict potential health outcomes for patient groups across England.
NHSE added that ‘like ChatGPT’ the model learns to predict what happens next ‘based on previous events’, working like an auto-complete function for medical timelines, and that predictions ‘are validated against real-world data’.
But GP leaders said that it was ‘unclear’ whether the ‘correct processes’ were followed to ensure that data was shared ‘in line with patients’ expectations’ and governance processes.
The BMA and RCGP joint IT committee asked NHS England to refer itself to the Information Commissioner over this issue ‘so the full circumstances can be understood’. The committee also demanded that NHS England pause ongoing processing of data ‘as a precaution’.
NHS England told Pulse that it has agreed to pause the project while its data protection officer ‘conducts a review’ and recommends whether ‘any further action’ is needed.
BMA GP committee England deputy chair Dr David Wrigley said: ‘For GPs, our focus is always on maintaining our patients’ trust in how their confidential data is handled.
‘We were not aware that GP data, collected for Covid-19 research, was being used to train an AI model, Foresight.
‘As such, we are unclear as to whether the correct processes were followed to ensure that data was shared in line with patients’ expectations and established governance processes.
‘We have raised our concerns with NHS England through the joint GP IT committee and appreciate their verbal commitment to improve on these processes going forward.
‘The committee has asked NHS England to refer itself to the Information Commissioner so the full circumstances can be understood, and to pause ongoing processing of data in this model, as a precaution, while the facts can be established.
‘Patients shouldn’t have to worry that what they tell their GP will get fed to AI models without the full range of safeguards in place to dictate how that data is shared, and we look forward to receiving these assurances from NHSE in writing as soon as possible.’
RCGP chair Professor Kamila Hawthorne said: ‘As data controllers, GPs take the management of their patients’ medical data very seriously, and we want to be sure data isn’t being used beyond its scope, in this case to train an AI programme.
‘We have raised our concerns with NHS England, through the Joint GP IT Committee, and the committee has called for a pause on data processing in this way while further investigation takes place, and for NHS England to refer itself to the Information Commissioner.
‘Patients need to be able to trust their personal medical data is not being used beyond what they’ve given permission for, and that GPs and the NHS will protect their right to data privacy. If we can’t foster this patient trust, then any advancements made in AI – which has potential to benefit patient care and alleviate GP workload – will be undermined.’
An NHS England spokesperson told Pulse: ‘Maintaining patient privacy is central to this project and we are grateful to the Joint GP IT Committee for raising its concerns and meeting with us to discuss the strict governance and controls in place to ensure patients’ data remains secure.
‘The committee has asked us to confirm that GDPR principles are being upheld at all times and we have agreed to pause the project while our Data Protection Officer now conducts a review and recommends whether any further action is needed.’
It comes after NHS England said GPs will soon be instructed to make anonymous patient data available to a ‘secure’ data sharing platform as it expands to cover non-Covid research.
NHSE wrote to GP practices last month about the expansion of the OpenSAFELY research platform, which it approved earlier this year.
Last year, a major Government-commissioned review found that establishing a central system allowing access to GP data for research was England’s ‘highest data priority’.
data cabal
“De-identified NHS data”? We used to call it anonymised. The new word probably recognises that true anonymity in data sets (that are of any use) is impossible: date of birth and postcode has you nailed, and even age and postcode may be enough. Removing a name and most of the address, phone numbers and email addresses does little apart from making everyone feel they have done something to de-identify individuals.
“Ground-breaking” AI project? More likely “law-breaking” AI project.
Especially laws of data protection, of consent, of privacy and perhaps copyright… any others?
What’s the private sector involvement in this?