
The docbot will see you now

Taking a check on the symptom checkers

Dr Nick Summerton

Symptoms are tricky blighters. Often, it can be difficult to work out exactly which symptom a patient is worried about. Symptoms also have a variety of causes, and are less likely to indicate significant diseases in general practice than in hospital settings.

Various AI companies are now coming to my rescue by developing online symptom checkers. These allow individuals to input information about their symptoms and be provided with advice on next steps. Some of these symptom checkers are using AI machine-learning techniques for data management and chat-bot functionality for user communication.

AI is a focused technology, and this is, on the face of it, at variance with the comprehensive and holistic nature of general practice. As GPs, we need to understand the patient’s context in addition to picking up social and psychological cues with empathy, care and compassion. Explicit knowledge about the predictive value of symptoms can be taught to a machine, but tacit knowledge, such as how to gain an individual’s confidence, might not be.

A symptom checker might help me to think of some alternative diagnoses. It might extract better information from my patients than I do, concerning their alcohol consumption or their sexual health. Also, as confirmed by Pulse’s scenario testing, symptom checkers might be quite good at spotting serious conditions.

But a key problem highlighted by the four experienced GPs engaged by Pulse to test out some symptom checkers was their tendency to be risk-averse. In most situations, a patient with shingles does not need to see me urgently, nor to dial 999!


Striking a sensible balance between missing important conditions and overwhelming the NHS is challenging, but it’s what GPs do. Building on the excellent work by Pulse, there are three questions that now need to be answered by those developing AI symptom checkers.

1. How has the symptom checker been trained?

All AI techniques depend on high-quality data from which to learn how to classify clinical findings against outcomes. In the approach known as supervised learning, the programmer trains the system on examples in which a range of symptoms has been labelled with the correct diagnoses.
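To make the idea concrete, here is a minimal sketch of supervised learning in this setting. The records, symptom names and diagnoses are all invented for illustration; real symptom checkers use far richer data and far more sophisticated models.

```python
from collections import Counter, defaultdict

# Hypothetical labelled records: each symptom set is paired with a
# confirmed diagnosis. These labels are the "supervision".
TRAINING_DATA = [
    ({"cough", "fever"}, "chest infection"),
    ({"cough", "fever"}, "chest infection"),
    ({"cough"}, "viral cough"),
    ({"rash", "pain"}, "shingles"),
]

def train(records):
    """Count how often each diagnosis was confirmed for each symptom."""
    model = defaultdict(Counter)
    for symptoms, diagnosis in records:
        for symptom in symptoms:
            model[symptom][diagnosis] += 1
    return model

def predict(model, symptoms):
    """Score candidate diagnoses by summing counts across the
    reported symptoms, and return the highest-scoring one."""
    scores = Counter()
    for symptom in symptoms:
        scores.update(model.get(symptom, Counter()))
    return scores.most_common(1)[0][0] if scores else "no match"
```

The key point the sketch illustrates is that the system can only ever reflect the labelled examples it was trained on, which is why the population those examples come from matters so much.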

Those designing AI symptom checkers need to ensure that their tools have been trained on data relevant to the population in which they will be used.

Using data from hospital populations or hospital doctors to develop symptom checkers for our patients won't work. As demonstrated by Pulse's analysis, GP experience matters too.

2. Does the symptom checker consider a person's overall risk profile?

In order to avoid missing lung cancer, a symptom checker might simply be programmed to triage every user to their GP if they report having had a cough for four weeks.

But if the person doesn't smoke and is aged under 30, with no personal or family history of lung cancer and no other symptoms, then their risk of lung cancer is very small. In this situation, the advice provided by a symptom checker should be about self-care or consulting a pharmacist, rather than seeing a GP.
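The contrast between the blanket rule and a risk-adjusted one can be sketched in a few lines. The function, thresholds and advice strings are all hypothetical, chosen only to mirror the cough example above.

```python
def triage_cough(weeks, age, smoker, history, other_symptoms):
    """Illustrative triage rule for a persistent cough.

    A blanket rule would send everyone with a four-week cough to
    their GP; here the low-risk group gets different advice.
    """
    if weeks < 4:
        return "monitor at home"
    low_risk = (age < 30 and not smoker
                and not history and not other_symptoms)
    if low_risk:
        return "self-care or consult a pharmacist"
    return "see your GP"
```

The design point is simply that the same symptom maps to different advice depending on the overall risk profile, rather than triggering the same escalation for every user.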

3. Does the symptom checker consider how things change over time?

As some symptoms are self-limiting and others are evolving, those developing symptom checkers could easily mimic the GP approach and monitor some patients over time. This might apply to, for example, individuals with back pain or abdominal discomfort, where there are no red flags and they have a low overall risk profile.
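A symptom checker could mimic this GP-style safety-netting by scheduling a re-check rather than escalating immediately. The following sketch is entirely illustrative; the categories, interval and two-week cut-off are assumptions, not anything a real product implements.

```python
from datetime import date, timedelta

def next_step(red_flags, low_risk, days_so_far):
    """Illustrative safety-netting: low-risk symptoms without red
    flags are re-checked after an interval instead of triggering a
    referral straight away.

    Returns (advice, follow_up_date or None).
    """
    if red_flags:
        return ("urgent review", None)
    if low_risk and days_so_far < 14:
        # Self-limiting symptoms often settle; check back in a week.
        return ("recheck", date.today() + timedelta(days=7))
    return ("routine GP appointment", None)
```

For example, back pain with no red flags and a low risk profile at day three would prompt a follow-up check a week later, while any red flag overrides the waiting approach.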

Given the rapid growth in demand for healthcare, online symptom checkers are here to stay. But we need to be vigilant to ensure that they improve health outcomes without simply producing anxious patients, stressed GPs or inflicting a further burden on NHS finances.

Dr Nick Summerton is a GP in East Yorkshire and medical writer

Rate this article  (2.08 average user rating)


Readers' comments (7)

  • Dear Nick, I gave you one as well (not that one!), but only because there is no option to give a minus for a cowardly RCGP type post.


  • Dear DecorumEst,

    The first 'one' was actually a typo by me! So thank you for your comment - which I thought was a bit harsh!
    I always feel it is much better to try to find a positive way forward - as my piece (and also my recent BJGP editorial) indicates, these tools are far from perfect at assessing symptoms - but nor am I. The intelligent assessment of symptoms and signs is what I have been working on for years - and you might be interested in my book 'Primary Care Diagnostics'. Machines can make mistakes and, after 31 years as a GP, I make lots of mistakes too.
    Courage is also about using your own name to express views and opinions!!
    Nick


  • You cannot use your own name within an establishment and country which are looking for scapegoats round every corner. AI needs to be regulated and have a responsible officer who will carry the can when something goes wrong. After all, when we make mistakes there are complaints, NHSE, the GMC etc. 'Inform, inform, they all have it in for me', as Kenneth Williams once said.


  • Dear Nick,
    I apologise if I have offended your sensibilities, my comments were not personal.
    I was unaware that you had written a 'recent BJGP editorial'.
    I note that your average user rating for this article remains at ONE (maybe time for some reflection?).


  • The piece comes across as a bit random and vague. We are already tormented by far too many protocols, tick boxes and templates that divert our gaze away from the person sat in front of us. Maybe AI of the future will be able to pick up the nuances and develop the intuition a human GP does well. In the meantime the protocol-driven health care by our 'associated' colleagues is already here and is not very good (artificial unintelligence).

    It would also be prudent for every contributor to declare interests. More than just a GP and medical writer...


  • Not sure I was reading the same article as everyone else but I actually thought this was a pretty balanced view


  • Thanks very much Shaba!

