
The docbot will see you now

How artificial intelligence is changing the GP-patient relationship

The NHS, the health secretary and a number of private companies are promoting AI as the gatekeeper of the future. Léa Legraien and Emma Wilkinson ask whether we are putting too much faith in tech

‘Alexa, what are the early signs of a stroke?’

GPs may no longer be the first port of call for patients looking to understand their ailments. ‘Dr Google’ is already well established in patients’ minds, and now they have a host of apps using artificial intelligence (AI), allowing them to input symptoms and receive a suggested diagnosis or advice without the need for human interaction.

And policymakers are on board. Matt Hancock is the most tech-friendly health secretary ever, NHS England chief executive Simon Stevens wants England to lead the world in AI, and the prime minister last month announced £250m for a national AI lab to help cut waiting times and detect diseases earlier. Amazon even agreed a partnership with NHS England in July to allow people to access health information via its voice-activated assistant Alexa.

Little surprise, then, that private companies see now as a good time to develop AI to guide patients through their various ailments. Babylon last month announced a £450m R&D investment, partly for AI technology to manage chronic conditions, while the likes of Ada and Your.MD also offer patients the chance to check symptoms. This is on top of the NHS App’s own symptom checker.

Yet the evidence in support of algorithms – and AI – is still lacking, and a Pulse analysis has shown potential drawbacks, such as overreaction to mild conditions and potentially unsafe advice.

If patients are using symptom-checker apps I would worry the app will give a false positive. Even worse, a false negative

Dr Rebecca Fisher

Dr Rebecca Fisher, a GP and senior policy fellow at the Health Foundation, says: ‘If patients are using symptom-checker apps, I would have two main worries. The first is that the app will give a false positive, with the risk that the patient becomes anxious and also potentially generates unnecessary use of NHS resources.

‘Even worse, there’s a risk of an app giving a false negative, meaning you might not seek help you actually need.’
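Her trade-off is easy to put numbers on. The sketch below uses invented figures – the prevalence, sensitivity and specificity are illustrative assumptions, not measurements from any real app – to show how a deliberately cautious checker that catches 99 in 100 genuine cases can still raise roughly 20 false alarms for every case it catches.

# Illustrative only: invented numbers showing the cautious-app trade-off.

def triage_outcomes(n_users, prevalence, sensitivity, specificity):
    """Return (true_pos, false_pos, true_neg, false_neg) counts."""
    sick = n_users * prevalence
    well = n_users - sick
    true_pos = sick * sensitivity        # genuinely ill, told to seek help
    false_neg = sick - true_pos          # genuinely ill, wrongly reassured
    true_neg = well * specificity        # well, correctly reassured
    false_pos = well - true_neg          # well, wrongly alarmed
    return true_pos, false_pos, true_neg, false_neg

# 10,000 users, 1% with a serious condition, and a deliberately cautious app:
tp, fp, tn, fn = triage_outcomes(10_000, prevalence=0.01,
                                 sensitivity=0.99, specificity=0.80)
print(tp, fp, tn, fn)   # 99.0 1980.0 7920.0 1.0 -> ~20 false alarms per case caught

In a threshold-based checker the two errors trade off directly: make the app less alarmist and the false alarms fall, but the missed cases – Dr Fisher’s false negatives – rise.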

Dr Nick Mann, a London-based GP with an interest in AI, says he is already seeing this sort of impact: ‘People will come in with headache and be convinced they’ve had a brain bleed whereas I know, talking to them, they haven’t.

‘I’ve had a lot of requests in the past couple of years, which I never used to have, from people wanting investigations for symptoms they have diagnosed on Google, which are inappropriate.’

With this in mind, Pulse tested some of the available symptom checkers. We found the apps were successful in offering appropriate advice in the case of a heart attack, but problems also emerged. In one case, a 26-year-old female with acute pyelonephritis was told her condition would clear up on its own.

Dr Roger Henderson, a sessional GP and medical director of Liva Healthcare, a digital healthcare company that supports the management of patients with diabetes, tested the apps for Pulse. He says: ‘In this tiny snapshot there are worrying features where everyday complaints were marked as emergencies and potentially severe ones were underplayed.

‘Symptom checkers use a linear algorithm approach and depend on the information provided to them, rather than being able to follow the more nuanced process that GPs use. It is this black-and-white computer reasoning that causes problems, since diagnosis tends to be shades of grey in the real world.’
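His ‘linear algorithm’ point can be made concrete. The toy checker below walks a fixed yes/no tree; the rules are invented for illustration, not taken from any real product, but they show the black-and-white behaviour he describes – each answer flips a hard branch, with no way to weigh duration, severity or how the patient looks.

def triage(answers: dict) -> str:
    # Each rule is a hard yes/no branch -- no weighting, no context.
    if answers.get("chest_pain") and answers.get("breathless"):
        return "Call 999"  # any chest pain plus breathlessness is an emergency
    if answers.get("fever") and answers.get("flank_pain"):
        return "See a GP within 24 hours"  # possible kidney infection
    if answers.get("headache"):
        return "Self-care; see a GP if it persists"
    return "No action needed"

# Long-standing mild chest twinges are routed straight to 999, while a
# kidney infection presenting without fever falls through to "no action".
print(triage({"chest_pain": True, "breathless": True}))   # Call 999
print(triage({"flank_pain": True, "fever": False}))       # No action needed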

He says the fact that symptom checkers encourage people to include all symptoms to give the fullest possible picture can lead to anxiety: ‘If you give a patient a range of diagnoses ranging from minor to very serious, it is natural to focus on the serious even if this is incorrect, causing worry and anxiety.’

This black-and-white computer reasoning … causes problems, since diagnosis tends to be shades of grey in the real world

Dr Roger Henderson

Lincolnshire GP Dr Phillip Williams, who also tested the apps for Pulse, agrees patients don’t always present as textbook cases. ‘Often real patients don’t present with the symptoms we think they should. As these apps become more sophisticated, they may flag key symptoms which aren’t on our radar. For example, we’re taught motor neurone disease presents with fasciculations, whereas, in real life, a common first symptom is fatigue.’

The shortage of relevant research is a problem for many GPs (see box). Dr Benjamin Brown, a senior academic GP and health informatician in Manchester, says: ‘The NHS should only bring in routine care systems that have an evidence base. In the case of model-driven triage, the models may be too conservative. I have anecdotally heard that one of the well-known providers modified its algorithms over concerns about patient safety, which resulted in it sending many more patients to A&E.’

Perhaps the highest-profile patient-facing algorithm is NHS Pathways, used by NHS 111. A 2013 study found NHS 111 increased emergency and urgent care activity by 5-12% each month, while emergency ambulance incidents rose by 2.9%.1

Is there any evidence to support AI in healthcare?

A 2013 study by the University of Sheffield1 revealed that NHS 111 increases ambulance and urgent and emergency care use. It looked at 400,000 calls, including 277,163 triaged using NHS Pathways, and found emergency ambulance incidents rose by 2.9%. It estimated this could mean an additional 14,500 call-outs for a service attending 500,000 incidents a year. In addition, emergency and urgent care activity rose by 5-12% per month.

The study concluded: ‘The findings reflect the inherent characteristics of the NHS Pathways system such as the levels of caution and risk built into the assessment algorithms, particularly as it is designed to be used by non-clinical call handlers. There may be less flexibility to change decisions compared with assessments made by nurses and it is possible that a different call assessment system could produce different results.’

A 2015 evaluation by Harvard Medical School3 found 23 symptom checkers for self-diagnosis provided the correct diagnosis first in 34% of 45 standardised patient evaluations, listed the correct diagnosis within the top 20 diagnoses given in 58% and provided appropriate triage advice in 57% of cases. It said: ‘Overall they had deficits in both diagnosis and triage accuracy. The risk-averse nature of symptom checkers’ triage advice is a concern. In two-thirds of evaluations where medical attention was not necessary, we found symptom checkers encouraged care.’

A 2017 evaluation by NHS England4 found patients had a very good experience of triage and assessment tools including the digital version of NHS Pathways in West Yorkshire (web interface), the Sense.ly system in the West Midlands (voice-activated avatar), Expert 24 in Suffolk (web interface) and Babylon in London. As a result of their use, fewer people were directed to primary care services, and more turned to self-management, than via NHS 111.

A 2018 study by Babylon5 showed the company’s triage and diagnostic system was able to identify patient conditions modelled by a clinical vignette with accuracy comparable with doctors’, in terms of precision and recall, and was on average safer than doctors. The findings, based on the MRCGP examination, showed above-average pass marks. Yet the paper was not peer reviewed, and the research team included Babylon employees.
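Studies of this kind typically score a checker against clinical vignettes using top-k accuracy – was the correct diagnosis anywhere in the app’s first k suggestions? – alongside precision and recall. A minimal sketch of that scoring, with invented vignette data:

# How vignette studies score symptom checkers: top-k accuracy asks whether
# the correct diagnosis appeared anywhere in the app's first k suggestions.
# The vignette data below is invented for illustration.

def top_k_accuracy(results, k):
    """results: list of (correct_diagnosis, app_suggestions_ranked_best_first)."""
    hits = sum(1 for truth, ranked in results if truth in ranked[:k])
    return hits / len(results)

vignettes = [
    ("acute pyelonephritis", ["UTI", "acute pyelonephritis", "back strain"]),
    ("myocardial infarction", ["myocardial infarction", "angina"]),
    ("migraine", ["tension headache", "sinusitis", "migraine"]),
]

print(top_k_accuracy(vignettes, k=1))   # 0.33... -- first suggestion correct
print(top_k_accuracy(vignettes, k=3))   # 1.0 -- correct diagnosis in top three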

NHS England has introduced more clinicians into the call centres but, according to 616 GPs surveyed by Pulse, an average GP still receives around six inappropriate referrals from NHS 111 a month – totalling more than three million a year. Anecdotally, GPs say they are still seeing patients referred to them for dental problems. And last month, a coroner said the lack of flexibility within the algorithm should be addressed following the death of a 17-year-old boy, whom the coroner said may not have understood what he was being asked.

Harry Longman, founder of Askmygp – an online triage and consultation tool for GPs – says: ‘We don't use any AI or algorithms to triage automatically, we have tried that and found it doesn't work. Many questions were irrelevant or difficult for patients, and the resulting output was not that helpful for clinicians.’

The Medicines and Healthcare products Regulatory Agency says that if an app is intended to influence treatment, or results in a diagnosis or prognosis including future disease risk, then it is a device and should obtain a CE mark before use. New EU rules, taking effect next year, will introduce more stringent requirements for device manufacturers.

We don't use any AI or algorithms to triage automatically, we have tried that and found it doesn't work

Harry Longman

But, as Professor Brendan Delaney, chair in medical informatics and decision-making at Imperial College London, puts it: ‘The letter of the regulation is fine, but it relies on developers to self-certify and register – which is OK, provided entry to the marketplace is actually policed and purchasers insist on CE marking.’

There are positives. AI is being developed to help target patients for screening, and help doctors make decisions – uses few would argue with.

And the Topol Review2 commissioned by Mr Hancock to explore how the healthcare workforce will ‘deliver the digital future’, concluded that ‘early benefits of AI and robotics will include the automation of mundane repetitive tasks that require little human cognitive power, improved robot-assisted surgery and the optimisation of logistics.’ This would allow the workforce to focus on ‘interaction and care’.

However, the first signs are that AI will, at best, increase GP workload. It might be time for Mr Hancock to review his championing of this new technology.

Dr Fisher says: ‘My patients often need a safe space to feel listened to, so I don’t think AI is going to be a replacement for a clinician. It’s more of an add-on to us.’

References

1 Turner J et al. Impact of the urgent care telephone service NHS 111 pilot sites: a controlled before and after study. BMJ Open 2013;3:e003451

2 Topol E. Preparing the healthcare workforce to deliver the digital future. Health Education England, February 2019. topol.hee.nhs.uk/

3 Semigran H et al. Evaluation of symptom checkers for self diagnosis and triage: audit study. BMJ 2015;351:h3480

4 NHS England online evaluation, December 2017.

5 Razzaki S et al. A comparative study of artificial intelligence and human doctors for the purpose of triage and diagnosis. June 2018

 

Readers' comments (14)

  • Azeem Majeed

    Thanks. A very informative article on a topical area.


  • Yes, I think you’ll find Babylon marked their own exam paper, as it were…


  • Eventually, after many years and lots of investment, it might be almost as good at diagnosing problems as an actual GP.
    In the meantime the high rates of false positives will generate excessive anxiety and workload, and the false negatives will generate excess morbidity and deaths. Responsibility for these must be built into any AI company’s contracts, and perhaps even bind the Secretary of State personally, in the form of how they will respond to corporate manslaughter charges and compensation claims. We can’t allow them, in effect, to experiment by investing so much in unproven technologies when people’s lives are at stake.


  • Back to the old triad: Quality, Quick and Affordable.
    You can’t have all three (and sometimes can only have one).

    So we have been given NHS24/111, which is CHEAP but of UNRELIABLE PROMPTNESS and is UNIVERSALLY SHIT.


  • Nhsfatcat

    Imagine the post: Sad, compo-face, tilted for effect. 'I contacted my DigiDoc 10 times, I know my body, I know I needed antibiotics.'


  • Sounds like AI is hopelessly unregulated. Driving GP use at this time of shortage is irresponsible. It sounds like there needs to be a special register of healthcare AIs, like we have the GMC, and their behaviour and systemic effects should be tightly monitored. A proportion of their costs to the NHS should be paid for by a special AI tax, to ensure they are operating on a level playing field and not being subsidised by the wider healthcare economy.


  • I like Sensible Doc's idea, and we should take it further.
    Like a Doctor, the AIs should have to pass written and oral examinations (and ideally also be able to perform necessary physical examinations) in order to be allowed to practise.
    They should then be able to pass MRCGP level exams in order to be able to practise in a GP setting.
    They should then be subject to full GMC regulation, including annual appraisal and revalidation. They should be able to reflect on and critically appraise their own performance. Only if they can do all that should they be allowed to practise medicine.
    If they can't do that, they aren't really AIs at all; they are cleverly dressed-up decision trees, which have been around since the dawn of medicine and shouldn't be used to replace the subtlety and insight of the human element.


  • The patients come in after they've consulted Dr Google "just to be sure". A waste of resources, and I have to spend more time answering queries that would otherwise not arise, and investigating more.


  • |DT | GP Partner/Principal|02 Sep 2019 5:13pm
    I like Sensible Doc's idea, and we should take it further.

    Pft, would you be willing to be taxed more to pay for all that??? Or for more workers to be moved from the private sector/wealth generator to run the bureaucracy of AI regulation??? That's the exact opposite of the solution - which is smaller state, greater individual responsibility AND freedom. Let the individual strive to meet their responsibilities in life, be able to make their own healthcare decisions directly, and pay for their own use.


  • That docbot pictured at the top of the article is either naked or wearing "The Emperor's New Clothes". Is that allowed? And don’t get me started on the vinyled finger.


  • Vinci Ho

    Look, something is missing philosophically here as far as these arguments on AI development and algorithms (hence, protocols) are concerned.
    Fundamentally, it is about interpersonal relationship opposing technical connection. Why do we use the terminology ‘doctor-patient relationship’ instead of ‘doctor-patient connection’? Face-to-face consultation followed by physical examination before sharing a diagnosis and decision with your patient is a form of human touch. We have been called ‘healing hands’ instead of ‘healing machines’. That is the reason why medicine is an art but not science.
    While I would never deny the success of evolving technologies making our lives ‘better’, they are not there, however, to replace human beings. The quintessential question is, ‘what is primary; what is secondary?’
    The outcomes of NHS 111 we have witnessed, as well as the controversy over ‘cherry picking’ by Babylonians, epitomise the shortsightedness of this government run by successive leaders of a party fraught with utilitarianism (but also undergoing implosion due to Brexit). Of course, there was also a backdrop of austerity after the financial crisis in 2008. Quantity was virtually ‘better’ than quality. ‘Cost-effectiveness’ becomes more important than safety. But as time goes by, this would only become an ‘excuse’ rather than a true reflection of the mindset of our politicians.


  • All these algorithms are the beginnings of replacing GPs altogether: Dr Babyloogle says you need to take some paracetamol and review tomorrow, see the nurse practitioner, see an ENT specialist, see a physio/dietician/pharmacist, do online CBT, etc etc.
    I'm not worried about my job – GPs will just morph (if they can pass some proper exam, ie not MRCGP) into the elderly medicine consultants and psychiatrists, which are themselves endangered species.


  • So now we have Amazon giving medical advice. I wonder who is better, Google or Amazon? Will Microsoft be designing a Dr too?
    This is ridiculous, dangerous and bloody annoying. These tech profiteers have no 'business' meddling in Health.


  • I was on a plane back from holiday yesterday. Someone behind me collapsed and the stewards called out ‘Is there a Dr on board?’ I obliged. No one had the faintest idea what to do – the captain asked me, via one of the stewards, if we should divert the plane. After 30 seconds’ assessment it was clear they’d taken too much Valium. I reassured them we could go on with the flight. With 30 mins of oxygen they were right as rain and chatting. Screw AI – it would have been utterly useless in this situation. Out of about 300 people on board I was the only one who knew what to do, and that was simple human experience; if I hadn’t been there the plane would have had to land early. AI may have its uses, but in real life, as far as I’m concerned, it has a long, long way to go.

