Primary care experts have called for reporting of GP practice annual cancer diagnosis rates to be scrapped, after their research showed the figures were misleading.
The researchers found that variation between GP practices reflected the case mix of cancer presentations each practice happened to see in a given year, rather than GPs’ actual performance – and that the current annual measures could wrongly rate 80% of practices as ‘poorly performing’.
The annual cancer detection rate – the proportion of a practice’s cancers picked up via the urgent two-week-wait pathway – and the conversion rate – the proportion of urgent cancer referrals that prove to be cancer – are currently published for each practice on the National Cancer Intelligence Network, and used by NHS managers to monitor how well practices are performing against national averages.
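To make the two published measures concrete, the arithmetic for a single practice in a single year can be sketched as follows (the figures are illustrative only, not taken from the study):

```python
# Illustrative figures only (not from the study): how the two published
# rates are derived for one practice in one year.
urgent_referrals = 100   # patients sent on the two-week-wait (2WW) pathway
cancers_via_2ww = 8      # of those referrals, how many proved to be cancer
cancers_total = 12       # all cancers diagnosed at the practice that year

# Conversion rate: what share of urgent referrals turn out to be cancer
conversion_rate = cancers_via_2ww / urgent_referrals

# Detection rate: what share of the practice's cancers came via the 2WW route
detection_rate = cancers_via_2ww / cancers_total

print(f"conversion rate: {conversion_rate:.0%}")  # 8%
print(f"detection rate: {detection_rate:.0%}")    # 67%
```

With a denominator of only a dozen or so cancers, a difference of one or two cases shifts the detection rate by several percentage points, which is the sampling problem the researchers highlight.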
Wide variation in these measures for some practices has led to criticisms of GPs for delaying cancer referrals – and threats from Government to ‘name and shame’ individual GPs for not referring patients soon enough.
However, Dr Peter Murchie, senior lecturer in primary care, said the data were ‘meaningless’ because each individual practice sees only a relatively small number of cases each year, and a random mixture of presentations.
Dr Murchie and his team looked at routinely collected data from 8,303 practices in England over the four-year period from 2010 to 2013.
Previous research had shown that practices’ detection and conversion rates, taken over a year-long period, went hand in hand – practices with higher detection rates, indicating greater accuracy, also had higher conversion rates, indicating greater efficiency, in detecting cancer.
However, the Aberdeen study found that when the data were looked at over the longer four-year period, the two figures no longer correlated so well – suggesting that most of the variation observed was down to the case mix, or how difficult the cancers were to detect in the first place.
The team then ran a computer simulation and found that for average medium-sized practices with around 25 cancer cases a year, the current annual measures would wrongly identify 80% of them as ‘poorly performing’.
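The statistical point behind the simulation can be illustrated with a simple Monte Carlo sketch (this is a hypothetical illustration, not the authors’ actual model, and the assumed detection probability and benchmark are invented): give every practice the identical true detection rate, and see how many are still flagged as below average purely by chance.

```python
import random

random.seed(1)

# Hypothetical illustration (NOT the study's model): every practice has the
# SAME true performance, so any "poorly performing" flag is pure sampling noise.
TRUE_DETECTION_RATE = 0.5   # assumed true share of cancers caught via 2WW
CASES_PER_YEAR = 25         # roughly an average medium-sized practice
N_PRACTICES = 1000
N_YEARS = 4
BENCHMARK = TRUE_DETECTION_RATE  # flag any annual rate below the "average"

flagged = 0
for _ in range(N_PRACTICES):
    # A practice is flagged if its annual rate dips below the benchmark
    # in any one of the four years.
    for _ in range(N_YEARS):
        detected = sum(random.random() < TRUE_DETECTION_RATE
                       for _ in range(CASES_PER_YEAR))
        if detected / CASES_PER_YEAR < BENCHMARK:
            flagged += 1
            break

print(f"{100 * flagged / N_PRACTICES:.0f}% of identical practices flagged")
```

Even though the simulated practices are identical by construction, a large majority get flagged in at least one year simply because 25 cases is too small a sample for an annual rate to be stable – the same mechanism the researchers argue makes the published figures misleading.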
Dr Murchie told Pulse this meant the figures were ‘meaningless’ and should be stopped.
Dr Murchie said: ‘The annual performance data based on detection and conversion rates is meaningless – it doesn’t really tell you anything about how well GPs at different practices are picking up cancer.
‘It should be stopped because it undermines the confidence of GPs and the confidence of patients in the profession and there’s a better way to do it, surely. We just need to find that.’
His team suggested that alternative methods of monitoring GP performance, such as confidential case reviews of delayed diagnoses, should be looked at instead – and would be more likely to lead to constructive changes in practice than the current ‘crude approach’.
Dr Richard Roope, a practising GP in Hampshire and cancer lead at Cancer Research UK, which collaborates on the National Cancer Intelligence Network reporting with Public Health England and NHS England, said: ‘Cancer isn’t always easy for GPs to identify, and this research highlights the complexities involved and why sets of data don’t always give the full picture.
‘We welcome further research into which cancers tend to be more difficult for GPs to spot, which will help us learn how we can support GP practices more effectively.’