Those convinced that there are lies, damned lies and league tables may not welcome even more analyses of GCSE results. If those we publish this week demonstrate anything, however, it is that there is still more useful information to be extracted from the performance tables, particularly on the apparent impact of school improvement measures; and that some of the conclusions the tables have given rise to so far are beginning to look rather dubious.
Claims that the performance tables demonstrate that the Government's reforms are working may have been premature. Her Majesty's Chief Inspector of Schools' annual report earlier this year cited the tables as evidence that these reforms had reached a turning point and were improving educational quality. And yet it was clear that only some schools were increasing the numbers obtaining five or more grades A to C, and that in many of these, improvement was accompanied by a worrying increase in the proportion obtaining no GCSE results at all.
In a TES article on the Government's Improving Schools programme (TES, October 6), Sir Tim Lankester, the outgoing permanent secretary at the Department for Education and Employment, maintained that the evidence of the GCSE performance tables was that there was a general rise in attainments and that this was particularly rapid among the bottom 25 per cent of schools. But figures drawn from the performance tables and released by education junior minister Cheryl Gillan to Labour's David Blunkett last week show the gap widening between the highest and lowest performing schools.
Similar signs of polarisation can now be seen in many - though not all - local authorities. As the table and maps on page 12 show, the proportion of pupils failing to achieve any GCSEs is worsening in many authorities at the same time as the numbers obtaining five or more GCSEs are going up.
Why is this? And what should be done about it? So far, the various analyses suggest symptoms rather than causes or cures.
The fact that pupil performance in some schools, or in some local authorities, is apparently improving at a higher rate or more uniformly than in others could arise for a number of different reasons. Has the improvement culture caught hold more firmly in some schools or areas? And if so, is this a result of better school or local authority leadership? Could it even be the result of spending cuts? Similar results may be possible at quite different levels of funding but sudden, morale-sapping changes and fear of compulsory redundancies are powerful distractions.
There are suggestions, too, that the performance tables themselves are having a backwash effect; that schools forced to compete are investing more effort in pupils capable of achieving grades C and above and neglecting the rest. Or that those who remain low achievers feel alienated and give up when left behind by even more of their peers. Improvement in the best schools outstripping that in the worst could also be a sign of increasing selection or social segregation, though once again the absence of any prior attainment measure in the performance tables makes it difficult to answer the questions that really matter.