Like with like
The University of London Institute of Education helped OFSTED devise school indicators that enable like to be compared with like. These are not "value-added" measures, as they do not include attainment on intake, but they go some way towards levelling the playing field.
The approach is not perfect by any means. There is a danger, recognised in OFSTED's own analysis of the institute's work, that pupil background might be used to excuse lower achievement in schools with a high proportion of pupils on free meals: the objection to social adjustments of exam league tables voiced by Education Secretaries since Kenneth Clarke.
But as a tool for helping to judge and improve individual schools, the "like" schools approach is a distinct advance on crude comparisons with national averages. Any danger that they will legitimise failure will be far outweighed by the likelihood that all underperforming schools will be highlighted, that inspectors will be able to say with greater confidence when a school is failing, and that teachers in lower-achieving schools will see any improvement properly recognised.
Well done, OFSTED, then, particularly if this is now to replace the expectation that inspectors judge achievement against their own apparently miraculous powers of recognising underlying pupil ability. Applied on a school-by-school basis, with inspectors and heads checking the validity of the social indices used, this is a real breakthrough.
It is such good news it seems churlish to carp. But why does OFSTED persist in hiding its enlightenment? Why is it acting so mysteriously over this approach, apparently used by Chris Woodhead in selecting outstanding schools for his recent annual report, and broadly supported by the gaggle of statistical experts assembled before Christmas to give it a seal of technical approval?
The new indicators are going to slip quietly into the pre-inspection context and school indicator (PICSI) reports from April 1 when the revised approach to inspections begins.
Of course, using social indicators contradicts previous government pronouncements; that may explain the sensitivity. But this is not an occasion for holding back or doing good by stealth.
These indicators should be opened to public debate. And all education authorities and schools - not just those being inspected - need to know the scores of other like schools for comparison if the indicators are to act as spurs to improvement and a means of targeting extra support.