This site is intended for health professionals only


AI is not the write stuff

Copperfield on the demise of his relationship with AI transcription tools in general practice

I recently had an epiphany. Someone showed me a piece of AI transcription software that records the consultation and, at the touch of a button, converts that chaos into clear and sensible written notes in the patient’s record.

Wowzers! This was like watching my brain download onto a computer: a gamechanger. No more laborious typing up! Less staring at a screen!! More time, more headspace!!! I immediately decided to turn revelation into revolution.

In fact, I was so stunned by this software wizardry that I agreed to undertake a Data Protection Impact Assessment, which is what you’re supposed to do rather than just plug it in. This involved long online meetings, endless spreadsheets, and phrases like ‘data subject’, ‘purpose drift detection’ and ‘default-deny architecture’ – which was all way less fun than it sounds. But it didn’t matter, because a bright new cutting-edge, drudgery-free future beckoned.

So I’ve been using this consultation transcription tool for about three months now. And there certainly was a honeymoon period. It analysed and recorded consultations brilliantly. It gave me a choice of format and length. It would even generate referral letters automatically. OK, occasionally it hallucinated, but what’s that amongst friends?

I was besotted. Until, that is, I started having my first review appointments with patients where I’d used AI transcription in the previous consultation. Then, something weird happened. The notes suddenly meant nothing to me. They recorded facts but I couldn’t recall the consultation at all. The patient might as well have seen a locum: I simply didn’t recognise my input in the record.

So transcription seems to do a terrible thing: it drains the notes of my sense of ownership. All those nuances, insights and coded messages I routinely incorporate are lost in translation. I’m getting the thousand words but I’m not getting the picture.

I’m sure this is fixable. Just as ChatGPT can be directed to a chosen style, no doubt transcription software could do the same, e.g. in my case: ‘Please adopt a highly cynical tone with the starting assumption that there is nothing wrong with the patient, but leave the door open to a brilliant and obscure diagnosis’. Are you listening, Big Tech? (Of course you are – your LLMs are gobbling this up quicker than I can type.)

In the meantime, I may well switch off the software. And for those of you wondering if there’s a metaphor buried in here somewhere, well, apparently there is. Because I asked ChatGPT, and it said that replacing the art and wisdom of traditional GP note keeping with AI transcription is a bit like swapping frontline GPs for noctors. Two differences, I guess. One is that they’re not artificial, and the other you can work out for yourself.

Dr Tony Copperfield is a GP in Essex.



READERS' COMMENTS [5]


David Church 23 April, 2025 6:30 pm

I am intrigued that on landing on the webpage for this, it takes only 2 seconds for Pulse to analyse me and determine that I am indeed human. It can do this even with a post-it note covering the camera on the laptop.
Why, then, do I still need to log in? Can the ‘human-detection’ software not be tweaked to detect that I am a Doctor, and indeed, that I am ME, rather than Damien the cockerel or Victoria the chicken sat on my desk tapping at the mouse-pad?

Tj Motown 23 April, 2025 11:05 pm

I had almost the exact same relationship with it. It doesn’t put any flavour or thought into the notes. When I write “had cornflakes for breakfast” it’s different to “Patient eating well” or whatever the AI tries to smarten it up as. You can’t “read between the lines” at all, which is the beauty of general practice notes (and your blog of marvellous double entendre). I do fear a little for the registrars who will never learn how to write notes that we can “read between the lines of”. Maybe now the patients can see what is written on their app, this will have to go anyway. Woe is me.

Douglas Callow 24 April, 2025 8:32 am

Spot on analysis
Actually quite hard to decipher what’s ‘gone on’
I still prefer digital dictation without processing because, if done well, it’s an effective care record/easy handover of care for someone else if the patient presents again
I am a bit of a lone voice though as most GP colleagues in the practice love Heidi health

J A 26 April, 2025 9:35 am

I still like to write my impression, as that conveys my thoughts and provides a true aide-memoire. However, I use AI transcription for the bulk of the notes, as it forms a robust and accurate medicolegal record of the consult. With patients having access to their notes, I think this is a useful and important addition.

Jonathan Heatley 29 April, 2025 12:46 pm

111 should be replaced by AI if my experience of using it is anything to go by. There is no long, irrelevant guideline path of negative findings, and it responds immediately. It’s like talking to a very clever, tolerant and patient doctor rather than a guideline functionary.