Small local authorities are often found high up the league tables - but not always justifiably, writes Colin Alston.
THE test results of very small primary schools with fewer than 10 Year 6 pupils are not published in the annual league tables. The Government accepts that their scores can fluctuate wildly from year to year and are therefore not a valid measure of a school's performance. However, when the annual GCSE league tables are published, all local education authorities appear in them - regardless of their size. The Isles of Scilly (with their one successful secondary school) usually shine in the GCSE rankings.
Few people realise how many small LEAs there are. In 1990-91, there were only 108 LEAs in England. But local government reorganisation has increased this to 152.
A handful of authorities have fewer than five maintained secondary schools. More than 30 LEAs have 10 or fewer secondaries. A further 30 or so have no more than 15.
Why does this matter? In a very small LEA, a single secondary school can account for over a quarter of the authority's GCSE results. If just one school's results are very high or low, this can distort a raft of performance indicators and targets at LEA level.
Consider a small experiment. Imagine that the London Oratory School, in the borough of Hammersmith and Fulham, could be wheeled into another LEA, taking its 92 per cent 5+A*-C GCSE rate (last year's figure) with it. As it crosses the border into Islington, the borough's GCSE average leaps from 26.5 per cent to 35 per cent. (Sadly, the "home" LEA's average falls by almost 10 percentage points and it drops 58 places in the tables.) If you pushed the London Oratory into Hackney, you would delight at the 6.8 percentage-point improvement (up 24 league-table places). Push it south, into Lambeth, and you would see a 7.4 percentage-point rise in the borough's 5+A*-C average - up 28 places and well away from the bad-publicity zone.
Now imagine a different scenario. If Kensington and Chelsea "borrowed" the GCSE results of just one struggling school from an East Midlands LEA, the London borough's average would plummet by nearly 15 percentage points. It would drop 62 places in the league tables. The same school's results would also devastate the apparent performance of Blackpool, Darlington, Barking and Dagenham, or Rutland.
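The arithmetic behind these swings is simple pupil-weighted averaging. A minimal sketch of the mechanism, using entirely hypothetical cohort sizes and pass rates (the article does not give the real pupil numbers), shows how one large school can dominate a small authority's figure:

```python
def lea_average(schools):
    """Pupil-weighted 5+A*-C average for an LEA.

    schools: list of (pupils_in_cohort, pass_rate) tuples, where
    pass_rate is the fraction of the cohort achieving 5+ A*-C grades.
    """
    total_pupils = sum(n for n, _ in schools)
    total_passes = sum(n * rate for n, rate in schools)
    return total_passes / total_pupils

# Hypothetical small LEA: four secondaries, one large and high-achieving.
small_lea = [(180, 0.92), (90, 0.30), (85, 0.28), (95, 0.32)]
print(f"With the big school:    {lea_average(small_lea):.1%}")
print(f"Without the big school: {lea_average(small_lea[1:]):.1%}")
```

With these made-up numbers, removing the single large school drags the authority-wide average from roughly 55 per cent down to roughly 30 per cent - the same kind of swing the thought experiment above describes, produced by nothing more than one school's weight in a small denominator.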
This is all a bit ridiculous, of course. One school is not an LEA, and yet the results of one secondary in special measures can tip the balance of judgment on 10 or more LEA-level performance indicators (three GCSE indicators, three key stage 3 indicators, attendance, absence, permanent exclusions, plus a few "best value" indicators). For small LEAs, these are politically big statistics, which have assumed almost biblical significance. A small LEA can maintain league-table respectability if it has a couple of large, high-achieving secondaries, which can mask low results in the remaining schools.
In contrast, one low-performing school can ruin an LEA's averages - even though most of its schools are making progress. There is no simple solution. Omitting the "outliers" (the highest and lowest results) from LEA averages could make things even worse: in an authority with only a handful of secondaries, trimming two schools would leave the average resting on even fewer pupils. But greater awareness of the issue is needed.
In the future, perhaps an Office for Standards in Education report, an Audit Commission study, or the GCSE performance tables will note that the statistical comparison of small LEAs should be undertaken with caution. As things stand, authority-wide averages can be an unreliable basis for assessing the performance of small LEAs.
Colin Alston is head of research, statistics and development for Hackney education authority.