CASE STUDY: Kirklees
At the core of the Kirklees authority's benchmarking project is a ring binder, the "Primary School Profile", which contains information on individual schools across more than 60 separate criteria, from cleaning costs per square metre of floor to a range of attainment measures gleaned from national test results and performance indicators on attendance and exclusions. Each school has its own folder in which its performance is highlighted against that of all other Kirklees schools. Each graph also shows the local authority average.
The point of benchmarking is to tighten the focus of comparison so that a head can put his or her own school alongside schools of similar character. So Kirklees puts its primary schools into five bands, each with about 20 schools of similar socio-economic background according to the number of pupils on free school meals and clothing grants. (Despite its obvious weaknesses, the free meals criterion continues to stand up as probably the best indicator of the kind of challenge which a school's pupils present on entry.) All statistics within the folder are then given both for the whole authority and for the band which the named school occupies. Thus, for example, when the head of School X opens page 50 of the folder, he or she finds a set of three graphs under the heading "Percentage of Pupils at KS2 Gaining Level 4-plus English Test". One graph shows the comparative percentages for all Kirklees primaries. The second shows all schools of similar type - junior, infant, or all-through primary, with or without nursery - and the third shows all schools within School X's socio-economic band. In each case, School X is highlighted, and all others are anonymous. Across each graph runs a line representing the local authority average.
This folder, clearly, is a starting point - a tool to be used with understanding and considerable discretion. Sue Mulvany and Jerry Brown, inspectors in Kirklees, were at pains to emphasise this - to defuse, for example, any over-eagerness to read cause and effect from one indicator to another. They were careful, too, not to be drawn into deep questioning about the validity or reliability of the various measures - the arbitrariness of the band boundaries, for example, is open to question.
The authority's point, though, is that what the data throws up is not so much right answers as well-aimed questions and starting points for discussion. As Jerry Brown says: "If you want to rubbish this process, you can. The staffroom cynic will always pick on the bit of data that's unreliable. What's important is that it's a way of helping schools towards a philosophy of improvement."
The anonymity of the data, interestingly, is usually quickly breached within each band by mutual agreement. "We are like a dating agency," is how Jerry Brown describes it. "A school phones up and says that they'd like to visit the (anonymous) school in their band that is performing particularly well in, say, maths. We phone that school and get them together."
This desire for constructive sharing of experience has been noticed in other authorities that have done this kind of work. One reason for this openness may be that schools within a socio-economic band tend to be geographically widely separated and thus not in competition for the same pupils. "There is much less wariness than you find between schools in the same area cluster or pyramid", is how one officer put it.