
Why ChatGPT will never get a hug

A recent study claimed chatbots such as ChatGPT have better bedside manners than doctors. But GPs’ actual emotional intelligence is one of our major strengths, says columnist Dr Zoe Rog

Last week, a patient hugged me at the end of our consultation. I don’t usually encourage this, and as one of our pharmacists was sitting in with me, I felt obliged to explain.

The patient and his mum were among my first patients when I started in the practice as a GP trainee. They lived together and were devoted to one another. One day the patient phoned, sobbing and distraught that his mum had been admitted to hospital. Despite initial improvement, she had fallen, banged her head and now seemed confused. He was terrified, and I knew that he was vulnerable. I also knew that she was taking warfarin.

Moved by his distress, I felt I had to do something. Luckily, as a trainee, time was on my side. I visited the hospital with him to see his mum; I then approached the nurse looking after her to explain my worries about the warfarin, the head injury and the confusion. A few days later, my patient rang the surgery to tell me how grateful he was.

After I visited the hospital, my patient said the nurse had called a doctor back to see his mum. They had stopped her warfarin, arranged a scan and found a small subdural haematoma. She subsequently made a good recovery and was discharged home. To this day, I have no idea whether my visit made any real difference to her medical care, but my patient and his mum were delighted that I had taken the trouble to go.

I stayed at my training practice and continued to see them both for quite a number of years before his mum eventually became frailer and died a peaceful, expected death. For my patient, it was an extremely difficult time; he required a lot of support to adjust to being without her. He hugs me when he sees me because he knows I cared for his mum, and that I understand how much she meant to him.

I saw the headline in recent weeks that read ‘ChatGPT gives more empathetic responses to patient queries than doctors’. I felt quite indignant when I read this, because if there is one thing that GPs should be celebrated for, it is their capacity for empathy. We discuss it at length with trainees in tutorials, they are judged on it in their exams, and a lot of the glowing feedback received by my GP colleagues is about the empathetic way in which patients have been treated.

We often make deeper emotional connections with our patients than we realise. A few years ago, a teenage girl made an appointment to see me. She had done so, she told me, because I had been to visit her family quite a few years earlier when her grandad was dying, and the family had never forgotten my kindness. She said she felt she could trust me to listen and understand how she felt. It was the first time I truly realised that even when we feel helpless, faced with a dying patient we can’t save and the terrible family grief we can’t relieve, simply conveying our empathy can be an incredibly powerful way to help that family.

When I read the article about ChatGPT in further detail, I realised that it was responding to online consultations. On reflection, it would be quite welcome to answer my e-consultations, because I hate the sterile format, lack of interaction and dearth of non-verbal information. Perhaps, in time, AI could fill its boots with form filling, results checking, routine prescription issuing and titration of medication doses. As Canadian physician Sir William Osler once said, ‘The good physician treats the disease; the great physician treats the patient who has the disease.’

We should be encouraging artificial intelligence to become really great at monotony, leaving GPs free to demonstrate how skilled they are at managing the interaction of disease with the fascinating complexity of human beings – and with enough time to show patients the empathy they deserve.

Dr Zoe Rog is a GP in Runcorn, Cheshire




Keith M Laycock 25 July, 2023 7:25 pm

Completely agree.
And empathy is not transmittable via a computer monitor.
The demise of ‘House Calls’ further adds to loss of the ‘Art’ component in / of medical practice.
As described, the benefit of a GP simply taking the time to visit a hospitalised patient emphasises the physician’s involvement and concern in a way that would be difficult to replicate by any other means.
This view is probably based on having qualified in 1968.

Some Bloke 26 July, 2023 12:28 pm

Empathy is not measurable or auditable. You cannot write some statistics about it and call it research.
Whenever I refer to PCLS, I am sure the patient will come back dissatisfied but with ten pages of measurable and auditable crap, yet I have to refer for a variety of reasons. The main reason all our hard work remains invisible is that it rarely leaves a sufficient data trace. A patient coming back dissatisfied from MH assessments would have tens of things coded; by comparison, my appointment may have one or two sentences. It doesn’t matter to NHSE that the patient values and prefers my opinion — for all they can see, I’ve done nothing good, compared with the pages of nonsense that PCLS produce.
AI will replace us. The only question is who will be allocated the blame when things go wrong (not if, but when). We are still needed to take the blame and to be sued; it is awkward or impossible to fight the Googles and Microsofts of this world, and much more convenient to blame your friendly local GP. But once that is sorted, AI is moving in — I suspect within the span of our working careers.

Gary Parkes 28 July, 2023 8:56 pm

Q. Defend the idea that AI in general practice in the UK is better at empathy with patient contact than real human general practice doctors
Response from ChatGPT
While AI has made significant strides in various fields, asserting that AI is better at empathy with patient contact than real human doctors in general practice in the UK is challenging to defend. Empathy is a deeply human trait that involves understanding and connecting with patients on an emotional level, which is something AI lacks.

AI lacks the ability to truly comprehend emotions, making it difficult for it to display genuine empathy. Although AI can analyze vast amounts of data and provide useful information, it cannot replicate the human experience or emotional intelligence that is essential in medical care.

In contrast, human doctors have years of medical training and hands-on experience that enable them to understand the complexities of a patient’s condition, as well as the emotional and psychological aspects of their well-being. Human doctors can empathize, show compassion, and establish meaningful relationships with patients, which has a positive impact on patient outcomes and overall satisfaction.

While AI can assist in certain areas of healthcare, such as diagnostics and data analysis, it cannot replace the human touch and emotional connection that human doctors bring to patient care. AI should be viewed as a valuable tool to complement and enhance medical practices rather than a replacement for the unique and essential qualities that human doctors possess, including empathy.

Q. Defend the idea that human doctors in general practice in the UK are better at empathy with patient contact than AI

Human doctors in general practice in the UK are better at empathy with patient contact than AI because they possess a deep understanding of human emotions, social cues, and cultural sensitivities that enable them to connect with patients on a more personal level. They can read subtle non-verbal cues, show genuine compassion, and tailor their communication style to suit individual patient needs, which creates a more trusting and comforting environment for patients. Additionally, human doctors can draw from their own life experiences and emotions, allowing them to relate to patients on a human-to-human level, fostering a stronger therapeutic alliance. While AI can assist in various aspects of healthcare, it lacks the emotional intelligence and intuition that are crucial for building strong empathetic connections with patients.