Editorials

An inspectorate for the health service?

BMJ 1997;315:896 (Published 11 October 1997). doi: https://doi.org/10.1136/bmj.315.7113.896

Ofsthealth may be inevitable, but it needs to observe the science of quality improvement

John Oldham, general practitioner
Manor House Doctors' Surgery, Glossop, Derbyshire SK13 8PS

Imagine yourself being listed for a day case procedure, organising work and home around the date, and arriving anxiously at the ward only to be turned away. The reason? Your body mass index does not meet that hospital's criteria. This happened last week to one of our patients. The clinical decision may have been correct, but the process of care failed. It is the cumulative effect of such instances, together with knowledge of the wide variation that exists in clinical outcomes for similar cohorts of patients,1 2 that has led to political demands for an office of standards in health (Ofsthealth): an inspectorate of health services linked to the development of national standards, similar to that which exists for education.3

Inequalities in access to care, the experience of that care, and the outcomes associated with it are just as unacceptable as inequalities in resource distribution. The cost of poor quality is a haemorrhage of scarce resources, largely untracked and ignored in the cry for more money for the NHS. The ambition to place quality at the core of the NHS is, therefore, impeccable.

Many will argue that this has already been done. Certainly, directors of quality have sprung up in the last few years, and many useful initiatives have taken place. Equally, many schemes have faded rapidly and died. This, together with jargon from management consultants inviting hard-pressed staff to “share the quality vision,” has devalued the currency of quality within the NHS. Meanwhile, the royal colleges have generated clinical quality standards and vigorously defended against any territorial incursions. Management is left largely to focus on non-clinical issues.

Such distinctions are irrelevant to patients. They experience and judge the whole process of care and all its elements. It is the interaction between the parts, not just the individual performance of any one of them, that determines the quality of the process. The system of quality management in the NHS is endemically fragmented: professionally, managerially, and organisationally. The failures and frustrations that have resulted have in turn led to the calls for an Ofsthealth.

The process of care experienced by a patient has three outcomes: clinical outcomes, service or satisfaction outcomes, and cost outcomes. If we measure just one, what is the evidence that this gives us information about the others, or allows us to improve the process as a whole? In America, with its performance-based culture, publication of mortality rates has been abandoned by Medicare because it changed clinical behaviour in a way that was not in the best interests of patients, fostering, for example, an unwillingness to operate on higher-risk patients.4 5 In Britain the measurement of finished consultant episodes (the NHS's measure of hospital activity) and their associated costs gives no indication of how patients have fared or of the appropriateness of the process that yielded the data: it too has generated perverse incentives. In these circumstances individuals' energy is directed to meeting the standard laid down for one part of the process, irrespective of the overall effect.

During the past decade a growing body of evidence worldwide has shown that a combination of improved clinical outcomes, increased patient satisfaction, and decreased costs can be achieved by considering whole processes. Dramatic changes can be brought about in these measures, thus releasing the substantial resources that are consumed in providing poor quality. The organisational requirements are well documented.6 Briefly, what is needed are clear aims, a culture and management system that permits data to be examined without fear, and rigorous statistical measurement of a care process across traditional boundaries. Such changes, and their beneficial effects, have occurred in places in Britain (such as Leicester Royal Infirmary), but those organisations remain isolated pockets of enthusiastic activity.

Therein lies the problem. We know that lasting quality improvement is best managed from the “inside out.” Currently this relies on organic growth stimulated by local innovators. How do we encourage the critical mass that ensures adoption by the mainstream in a way that matches the speed of public demand and political requirement? Do national standards and an inspectorate play a part?

Isolated standards can distort behaviour and reduce the dynamics of quality improvement.7 Inspection systems can generate fear and a punitive climate, creating an unwillingness to examine data openly.8 Yet Ofsted, the inspection body for education, has certainly tapped a need in educational consumers and created a feeling of involvement and choice. In health, patients are increasingly able to access sources of information about their diseases, including clinical guidelines on the internet. The government rightly focuses on the experience of patients and their clinical outcomes. The “pull” effect of this on quality is gathering pace and will increase when people realise the cost benefits of getting the process right first time.

Do the professions need the catalyst of an Ofsthealth to move faster from the heritage of individualism?9 Do clinicians and managers need that encouragement to place the science of process at the centre of their activity on quality? Perhaps so. For certain, we clinicians cannot operate in a vacuum, hiding statistical variation behind the shibboleth of clinical freedom and reinforcing the illusion that we are the sole arbiters of care.

Footnotes

• Dr Oldham is a general practice adviser to the NHS Executive, but the views expressed here are entirely his own.

References

1.
2.
3.
4.
5.
6.
7.
8.
9.