Raising standards is a laudable aim, and schools would welcome targets to work towards, provided they are realistic and provided the means to achieve them are furnished along with the objective. The second of these provisos is unlikely to be met, at least to the satisfaction of headteachers and their staffs, as long as the Government is pledged to keep spending within the inadequate limits set by its predecessor. But the first also poses problems. Brian Wilson wants schools to be compared with others of similar character. Then a comparison of their performances, for example in public examinations, would make sense and give them realistic benchmarks and targets.
But controversy remains over the availability and publication of the data that would allow meaningful comparisons. The recent tables on attendance and absence are almost worthless. When the Conservatives set out to establish how many pupils were absent from school, and why, they found that local authorities could not agree definitions of authorised and unauthorised absence. Several years later the same disagreements persist, and headteachers also differ in the rigour with which they record absences as unauthorised. It is therefore impossible to use the published statistics as a platform for establishing comparisons and setting targets. Some authorities even claim not to recognise the practice of permanent exclusion from school. Despite a Government initiative on attendance and absence, the annual tables remain blighted.
The percentage of pupils receiving free meals is to be used as an indicator of the social composition of schools when it comes to target setting. But here again there are doubts about validity. The annually published tables of exam results are derided because they do not distinguish between schools in leafy suburbs and those with serious problems of deprivation. The Government is not yet ready (will it ever be?) to publish the tables in a fairer form. That is because "fairness" is hard to define.
The Observer recently reclassified the exam results according to the percentage of pupils receiving free meals. The exercise was a commendable if limited attempt to remove an anomaly from the "raw" tables, but the critics have been quick to pounce (page four). The statistics remain flawed, and it is a matter of opinion whether the exercise is useful. Schools in Glasgow which have gone up the pecking order would say yes; those in Aberdeen whose relative performance has declined have cause for dissent.
It is not enough to take a deprivation index into account. Exam performance has to be judged on the "value added" by teachers if the statistics are to be used to compare the effectiveness of schools. At present no one would be happy with benchmarks of achievement at the age of 12 as the starting point for comparing progress over the next four or five years. That is why the Government has shied away from publishing tables in "value-added" form.
Mr Wilson would like schools to have well-designed targets rather than set their own agendas on the basis of unreliable (and in some cases unfairly depressing) information. The committee on standards has to win support for dependable ways of grouping schools in a climate of scepticism about the validity of the tables and their interpretation.