Domesday’s child gets a mixed reception

31st March 1995, 1:00am

The Audit Commission’s weighty analysis of local authority performance was published this week. Susan Young opens a four-page report which includes the education indicator tables.

Obsessives desperate to discover how long it takes to mend a dripping council-house tap in south Bedfordshire, the length of time taken to answer a letter in west Wiltshire, or the number of student grant cheques sent out on time in Westminster (answer - none) will find hours of happy reading in the first-ever Local Authority Performance Indicators, published yesterday by the Audit Commission.

But for ordinary members of the public, the usefulness of the latest addition to the Citizen’s Charter initiative has yet to be ascertained.

Established under the 1992 Local Government Act, the indicators must now be published each year with councils obliged to provide information for comparison.

Strong suspicions remain that the exercise was originally conceived as a political stick with which to beat local authorities. But if that was the case, then ministers underestimated the determination of the Audit Commission to retain its reputation for accuracy and impartiality, and the determination of the councils to be judged on a wide variety of information, indicating policy as much as efficiency.

The commission’s original plan was to produce a main report for the general public, with a limited number - around 20 - of the indicators which an opinion poll suggested would be the most interesting and the rest contained in an appendix. It also suggested a league-table format for some indicators.

After protests from councils, plans were amended so that the report is being published in three volumes, plus an inch-thick, A4-sized appendix crammed with individual tables. League tables have been abandoned in favour of bar charts. A commission document explains: “The term ‘league tables’ necessarily implies some judgment of ‘better’ or ‘worse’. By contrast, the bar charts will simply make clear statistical variations in authorities’ performance, leaving it to the public to judge what is good or poor performance.”

Unsurprisingly, ministers were apparently infuriated by the new approach, but the findings are unlikely to provide much political ammunition for any party. No authority performs spectacularly well or badly overall and the usual “flagship” boroughs, although turning in respectable performances, are hardly beacons of excellence. No political party appears to shine particularly brightly in local government, either - which should appeal to the many at Westminster who consider such performance indicators A Good Thing.

In any case, many of the indicators chosen for education are indicative of policy and external factors rather than efficiency, and comparisons may depend on the person doing the comparing.

Two straightforward measures of administrative efficiency in education are which authorities processed statements of special educational need within six months, and which got mandatory grant cheques out to students by the cut-off date. However, Saxon Spence, chair of the Council of Local Education Authorities, points out that both operations depend on numbers of office staff and co-operation from parents and other authorities.

Other indicators - including education for under-fives, unfilled classes, and pupil expenditure - depend far more on local and national policy. Alan Parker, education officer for the Association of Metropolitan Authorities, said: “The amounts authorities spend has got more to do with (central Government) standard spending assessment than anything else.”

Paul Vevers, associate director of the Audit Commission, who has been responsible for the exercise, believes it has been useful - with limitations. Although he admits to surprise about many findings - such as the wide disparity in time taken to statement children - he expects some changes as a result of public pressure.

He does concede that the pages of tables and bar charts are probably more informative if read alongside other indicators, such as school exam performance and class size. The commission is also considering measuring more subjective indicators such as parental satisfaction.

Opinion is divided on how useful the annual publication will prove to be. Many local authority officials are sceptical, pointing out the limitations of what can be proved when quality is not taken into account. One exception is Sir Jeremy Beecham, leader of the AMA, who said: “There’s material there to be useful in the context of a local debate about policy and in certain areas efficiency, promoting more effective scrutiny. But I don’t think it’s going to go to the top of the best-sellers.”

Margaret Morrissey, press officer for the National Confederation of Parent-Teacher Associations, was also doubtful. “Parents are not likely to wade through this. It’s the same scenario as running schools like businesses, when they are not businesses. Everyone is different, and they have all got reasons for being different.”

David Leabeater of the National Consumer Council disagreed, arguing that it was valuable for as much information as possible to be in the public domain. “When people have been deprived of information for generations, it may take a bit of time to see the point of it,” he said. “Now the facts are there for those who have axes to grind, and a great grinding of axes is probably a good thing for democracy.”
