BARBARA Owen, director of sixth-form studies at Temple Moor high school in Leeds, had a new experience this week: coming top of a league table.
"We have never come top of anything in our lives. The football team can't even win the league," she said.
Temple Moor's new-found success comes in a pilot table of the "value added" by a sample of 157 secondary schools and colleges across the country between GCSE and A-level, the first tentative step by the Government towards a general introduction of value-added measures promised for 2002.
"I knew our score would be very good but this has given me a bit of a buzz," Ms Owen said.
"We are a school with a mixed catchment, drawing on a large council estate as well as middle-class areas, and we don't normally do well in league tables."
Schools like Temple Moor, where only 28 per cent of pupils got five A-C GCSE grades this year, have long campaigned for fairer measures of school performance than the raw GCSE and A-level statistics, which currently ensure that schools with mostly middle-class, high-performing students dominate the top of the performance tables.
The demand has been simple enough: the tables should measure how far schools help children to improve, rather than just raw results largely influenced by the pupils' backgrounds.
But attempts to come up with a credible measure of such progress have had a difficult history.
Two years ago, a Government scheme to rank schools according to the improvements they made to pupils' performance between age 14 and GCSE collapsed amid accusations from headteachers that the rankings were based on unreliable key stage 3 tests.
A system of grading schools from A-E was replaced at the last minute by a simple endorsement for particularly high-performing ones. In 1999, the measure was dropped entirely.
This year's post-16 table compares the A-level points score of each pupil in the pilot to the national average A-level point score of pupils who achieved similar results at GCSE. These comparisons are then averaged across each school and expressed relative to an index of 100, representing the national norm.
Temple Moor's score of 108.6 means that its pupils are performing significantly better at A-level than students with the same GCSE scores in other schools. Scores outside the range of 98 to 102 are considered "significant" by the statisticians behind the new table.
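The calculation described above can be sketched roughly as follows. This is a hypothetical illustration only: the article does not publish the official statistical method, so the banding of GCSE results, the national averages, and the aggregation by simple mean are all assumptions.

```python
# Hypothetical sketch of the value-added index described in the article.
# The GCSE bands, national averages, and use of a simple mean are assumed,
# not taken from the official methodology.

# Assumed national average A-level points for pupils in each GCSE band.
NATIONAL_AVG_BY_GCSE_BAND = {
    "low": 8.0,
    "middle": 14.0,
    "high": 20.0,
}

def value_added_index(pupils):
    """Compute a school's value-added score.

    pupils: list of (gcse_band, a_level_points) tuples, one per pupil.
    Each pupil's A-level points are compared with the national average
    for pupils with similar GCSE results; the ratios are averaged across
    the school and scaled so that 100 represents the national norm.
    """
    ratios = [
        points / NATIONAL_AVG_BY_GCSE_BAND[band]
        for band, points in pupils
    ]
    return 100 * sum(ratios) / len(ratios)

# Example: a small school whose pupils mostly outperform the national
# average for their GCSE band would score above 100.
school = [("middle", 16.0), ("low", 9.0), ("high", 20.0)]
score = value_added_index(school)
```

On this sketch, a school whose pupils exactly match the national averages for their GCSE bands scores 100, and scores outside roughly 98 to 102 would be the ones the statisticians treat as significant.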
This kind of methodology is likely to form the basis for the introduction of similar comparisons for children aged 11 to 16 promised for 2002, but it has not resolved the controversy surrounding value-added measures.
Arthur de Caux, National Association of Head Teachers education officer, warns that the reliance on total A-level and GCSE scores may encourage schools to try to bump up A-level scores by entering students for more of them, or to cut GCSE scores by entering them for fewer exams at that level.
The Secondary Heads Association is also warning that differences in the difficulty of A-level courses will introduce inaccuracy into the system.
"Until the Government comes up with some demonstrably fair way of doing this, we can't be happy with any of these measures," Mr de Caux says.
"We accept that value-added monitoring can be very valuable when used internally, but as soon as you start publishing stuff in the tables you have to get it right."