Mortality Investigations

Mortality Regulation

Our local DGH is one of the 14 hospitals somewhat arbitrarily singled out for investigation over death rates – or, as the press puts it, for needlessly killing hundreds of patients.  It does have a long history of high death rates, and extensive investigations have taken place.

There is a lot of data available, and anyone interested enough can plot the number of deaths in the hospital and the death rate (i.e. deaths divided by the number of relevant procedures or admissions) over time, and compare these with other hospitals in the Region and with the average of all.  Whatever you choose, be it all deaths in hospital, deaths in hospital or within 30 days of discharge, or deaths following emergency admission, the graphs all show a progressive year-on-year improvement (with a few non-significant fluctuations around the trend line).  But the rates are higher than those of other local hospitals and above the Regional average, and have been for as long as reliable data has been available.
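For anyone who wants to try this, the exercise is no more than a division and a line chart. The sketch below is a minimal illustration in Python; every figure in it (the years, death counts, admission numbers and regional rates) is invented purely to show the shape of the calculation, not taken from any published data.

import matplotlib.pyplot as plt

years = [2007, 2008, 2009, 2010, 2011, 2012]

# Hypothetical counts: deaths and relevant admissions per year for "our" DGH
hospital_deaths = [980, 950, 910, 880, 855, 830]
hospital_admissions = [30000, 30500, 31000, 31500, 32000, 32500]

# Hypothetical regional average, already expressed as a crude rate
regional_rate = [0.0290, 0.0282, 0.0274, 0.0266, 0.0258, 0.0250]

# The crude rate is simply deaths divided by relevant admissions
hospital_rate = [d / a for d, a in zip(hospital_deaths, hospital_admissions)]

plt.plot(years, hospital_rate, marker="o", label="Our DGH (crude rate)")
plt.plot(years, regional_rate, marker="s", label="Regional average")
plt.xlabel("Year")
plt.ylabel("Deaths per admission")
plt.title("Illustrative trend in crude hospital mortality rate")
plt.legend()
plt.show()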

The downward trend is obviously encouraging, especially since there is evidence that case-mix complexity is worsening.  The gap between our hospital and the average is also closing – albeit slowly.

The issue of a high Hospital Standardised Mortality Ratio (HSMR)[1] was raised by Monitor during the journey to authorisation, but the explanations and assurances were accepted. Monitor again took an interest in the HSMR when there were issues around emergency care, even though at that time the HSMR was lower than at the time of authorisation.  But the value of the HSMR is contested, not least because it depends on how hospitals code their data.

The only reliable way to establish whether there have been unnecessary deaths is to do proper expert case notes reviews[2], which include talking to those involved where the notes are not clear enough.  It is rarely mentioned that the Francis inquiry rejected such a study and relied instead on the large volume of reported observations of poor care.  This has not stopped the uninformed from making flawed generalisations from the statistics.

Back at our local DGH, assurances were provided to Monitor that there was a comprehensive regime in place to monitor deaths.  A retired senior consultant had been appointed as an associate NED, and he carried out proper case notes reviews of deaths, including speaking to the clinicians involved.  A committee which included patient representatives looked at mortality.  Steps had been taken to improve the administration of coding and the involvement of clinicians in coding.  All of this took place alongside the continuing direct investigations into the quality of care being provided.

To address some of the concerns over the HSMR, a new measure of mortality, the Summary Hospital-level Mortality Indicator (SHMI)[3], was introduced, although it is still dependent on the accuracy of coding.  Our hospital did badly two years in a row and so once again came into the spotlight.
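Footnote [3] sets out what each measure covers; the arithmetic underneath both is the same basic idea: observed deaths divided by the deaths a risk model would expect for that hospital's case mix, usually scaled so that 100 means "exactly as expected". The toy sketch below, with invented patient groups, risks and counts, shows that calculation and hints at why coding matters: the expected figure depends entirely on which diagnostic groups patients are assigned to.

# A deliberately simplified sketch of the idea behind HSMR and SHMI:
# observed deaths divided by the deaths a risk model "expects", scaled to 100.
# Every group, risk and count below is invented for illustration only.

# (group name, number of patients, model-predicted death risk, observed deaths)
casemix = [
    ("elective hip replacement", 2000, 0.002, 5),
    ("fractured neck of femur",   800, 0.080, 70),
    ("pneumonia",                1200, 0.120, 150),
]

observed = sum(deaths for _, _, _, deaths in casemix)
expected = sum(patients * risk for _, patients, risk, _ in casemix)

# 100 means deaths are exactly in line with what the model predicts
ratio = 100 * observed / expected
print(f"Observed: {observed}, expected: {expected:.1f}, standardised ratio: {ratio:.0f}")

With these made-up numbers the ratio comes out at about 106, i.e. roughly 6% more deaths than the model expects; whether such a gap reflects care, coding or chance is exactly the argument rehearsed above.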

So far nothing has emerged that explains why the rate of mortality is higher than average, or why the hospital does badly on some comparative measures.  No proper independent investigation has unearthed any problems with care, or with processes or systems.

Many explanations have been put forward by local commentators, such as the absence of good local hospice provision, the demographics (with lots of old people) and, more plausibly, poor quality primary care.  All are simply speculative, and in theory the various mortality measures should already factor these things into the mix.

We now have another blaze of publicity and an investigation.  I wait with interest to learn what comes out of this much-hyped exercise – maybe a few observations of where improvements could be made – but I have a strong suspicion that nothing found will explain the underlying reasons.

If nothing of significance is found, will there be some acceptance that crude statistical measures, badly interpreted by the press, are not always a good basis for judgements?

Steve Walker reckons there were no “excess” deaths in Mid-Staffordshire


[1] Hospital Standardised Mortality Ratio – produced by Dr Foster Intelligence (the SHMI described in note 3 is produced by the Health and Social Care Information Centre)

[2] As an example of such methods, see a recent study for the DH: Preventable Incidents, Survival and Mortality Study – where does it take mortality review?, by Dr Helen Hogan, Clinical Lecturer in Public Health, London School of Hygiene and Tropical Medicine

[3] The SHMI counts all deaths in hospital and within 30 days of discharge; the HSMR is based on in-hospital deaths only, for a subset of common conditions.  Both SHMI and HSMR (attempt to) adjust for case complexity and case mix as well as some contextual factors.