Weighing the facts

The Audit Commission's local authority league tables are crude, dated and misleading - but welcome for all that. As schools know only too well, such bruising comparisons are rarely comfortable - or totally fair. But there are important democratic and pragmatic justifications for publishing them. Public organisations must learn to regard performance indicators as an aid to improvement rather than a statistical equivalent to the mediaeval stocks.

Councils exist first and last to serve the public. These indicators suggest the extent to which they do so at present is much too variable. It is no use telling those waiting for their grant or statement of special needs that the computer has broken, that staff have problems or that there has been an unseasonable peak in demand; it is about as acceptable as British Rail blaming the wrong kind of snow. It would be better to apologise, to promise to try harder next time and to mean it.

Every administrative or management system needs both clear targets to aim at and an incentive to achieve them. Because they lack a commercial profit motive, local councils need other measures of success in serving the public. Without some kind of objective measurement how can they even be sure themselves whether they have succeeded, let alone reassure the public which foots the bill?

Those who argue that "you don't fatten a pig by weighing it" or that quantitative indicators ignore quality are missing the motivational point. As our sagacious TES Agenda columnist, Joan Sallis, frequently points out, the effectiveness of the Weightwatchers organisation lies not so much in the healthy diets it promotes as in the public weighing.

All that may be true in principle, but it is equally fair to ask whether in practice the figures published this week by the Audit Commission will help to fight the bureaucratic flab. The commission's general aim is to promote economy, efficiency and effectiveness in public finance. But what, to apply the language of performance indicators to the work of the commission itself, are the specific success criteria for this week's exercise, and how well have these been achieved?

One of those success indicators must relate to the public element of the weighing. There is little point in publication if these figures are ignored by voters and consumers. Much of this information was already available to those who knew where to look for it. Not many did. In fact there is hardly a single previously unpublished fact in these tables, since local councils were required by law to publish their own results in local newspapers last year. Hands up, those who noticed?

The innovation this week, then, is in bringing them together in a readily comparable form with an element of league-table razzmatazz. How much interest the public now takes will be one fair test of the commission's own effectiveness. It was wise, therefore, to carry out market research on which indicators seemed of most interest to the public.

But that is not the same as telling people what they most wanted to know. It is easier to imagine research or interest groups sifting these tables for ammunition than conversations in public houses or at school gates turning to the council's per capita spending on museums or the average time taken for its answerphone to frustrate a caller.

Even those pressure groups with an axe to hone will be disappointed by the familiarity or superficiality of some of the figures. Campaigners for more spending on education in Kent, for instance, did not need the Audit Commission to tell them how far below average that county is.

The commission has acknowledged its failure to snap a sensible picture of what is happening to discretionary awards. And nursery campaigners were predictably exasperated by the crudeness of information on three and four year olds. The failure to distinguish nursery and infant classes and part and full-time attendance seems calculated to reinforce those prejudices about the Audit Commission understanding the cost of everything and the relative values of nothing.

It was rather sharper on special needs, an area in which it has a firmer grasp thanks to earlier work. The result was six different indicators providing a baseline from which to judge some of the impact of the special needs code of practice as it begins to take effect. It should add another to these: the number of parents complaining to the new independent tribunal. This will help ensure that more rapidly produced statements are worth the paper they are written on and that the code is more than a paper exercise.

The danger of published performance indicators is that they focus too much effort on narrow targets and not enough on broader objectives. The answer to that is to make the indicators more comprehensive and useful. The challenge to the public service, then, is to keep the indicators in perspective, to treat them as tools, not punishments, and to help the commission improve them next time round.
