Ministers' plans to introduce a more sophisticated measure of school performance than the raw exam results that currently fill league tables suffered a setback last month, when they were forced to abandon - for this year at least - a controversial "progress" index after being inundated with complaints.
The index was to have been a first step towards a full "value added" system intended to measure the improvement pupils make between entering and leaving a school.
It was dropped from this year's tables after hundreds of schools protested against the measure. They pointed out that it assessed improvement only during the two years between Key Stage 3 and GCSE, depended on KS3 tests that many regard as unreliable, and excluded from the value-added calculations any pupils who were absent for KS3 tests.
The Government retreat raises serious doubts about more wide-ranging value-added measures piloted in just over 200 secondary schools. This pilot scheme, the results of which were also published this week, shares many of the features of the controversial progress measure, condemned as "unfair" by the Education Secretary David Blunkett.
The fundamental difference between the abandoned "progress measure" and the pilot value-added project is that the latter tracks individual pupils' KS3 and GCSE results rather than the school's average.
The pilot, which the Government hopes to extend to all schools next year, involved 27,000 pupils in 205 volunteer schools. It banded individual children according to their starting point - the average of their English, maths and science KS3 test results taken in 1996.
Each child was given a value-added "score" by comparing their actual GCSE attainment with the average for their band, to show whether they had made more or less progress than expected. For example, if the average for a child's band was 42 GCSE points, a child with 45 points would carry a value-added score of +3, while a pupil in the same band with 40 points would score -2.
Children's "value-added" scores were totalled and averaged for each school. Schools were then ranked and, as with the progress measure, the top 5 per cent were awarded an A, the next 20 per cent a B, the middle 50 per cent a C, the following 20 per cent a D and the bottom 5 per cent an E.
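The arithmetic behind the pilot's scoring can be sketched in a few lines of code. The band average and pupil points below are the illustrative figures from the example above, not the DFEE's actual data, and the function names are invented for clarity:

```python
def value_added_score(gcse_points, band_average):
    """A pupil's score: actual GCSE points minus the average for their KS3 band."""
    return gcse_points - band_average

def school_score(pupil_scores):
    """A school's value-added figure: the mean of its pupils' individual scores."""
    return sum(pupil_scores) / len(pupil_scores)

# The worked example: a band whose average is 42 GCSE points.
print(value_added_score(45, 42))  # prints 3
print(value_added_score(40, 42))  # prints -2
print(school_score([3, -2]))      # prints 0.5
```

Schools were then ranked on this averaged figure and assigned their letter grade by percentile.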
Janet Dallas of the Department for Education and Employment, who co-ordinated the pilot project, said: "The most useful information for schools is what lies beneath the value-added data.
"We chose just two ways of presenting it: showing the value added for pupils who started from a low, medium or high base; and by gender. But we could have displayed it in many different ways.
"This is invaluable for schools who want to see how they compare with similar schools."
Joan Olivier, headteacher of Lady Margaret school in south-west London, took part in the pilot because she was tired of having her school's GCSE scores dismissed as the inevitable consequence of a high-flying intake. She was pleased to be ranked in the pilot's top 5 per cent, but delighted that this included an A for boosting the performance of girls considered to be below average at 14.
She said: "It showed that we can do well by everyone: the low achievers and high-flyers. The pilot was a broad-brush approach and I am sure it will become more sophisticated but it is a welcome step in the right direction."
But John Thomas, headteacher of Lowfield school in York which was ranked in the bottom 5 per cent and graded E in all areas, said: "I am in favour of a move away from the raw score tables, but this value-added pilot was a pretty blunt instrument which needs a lot more work.
"Obviously we are disappointed that it shows us in a bad light, but there are a lot of problems with their calculations."
The pilot concedes that pupils move between schools and progress is often due to the efforts of more than one school, yet it takes no account of pupil turnover when calculating a school's value-added grading.
However, it publishes a "stability measure" alongside the tables showing how many pupils stayed at each school for both KS3 and GCSE. The final column in the tables shows the percentage of pupils included in the value-added calculations; those with incomplete test data were excluded.
The DFEE emphasises that all the data needs to be considered together to give a true value-added picture.
But academics who specialise in value-added calculations have serious concerns about the methods used in the pilot.
Professor Carol Fitz-Gibbon of Durham University said: "Unless it shows progress in each curriculum area, it is of no use at all to parents. A parent of a child who wants to study science needs to know what happens in that department not just an overall woolly figure."
Harvey Goldstein, professor of statistical methods at the Institute of Education in London, criticised the pilot's "crude and simple" mathematics. He said: "The DFEE concedes that the results were unreliable for cohorts of fewer than 10 pupils, yet they present the rest of the figures as 100 per cent accurate. This is not true and they have omitted the most important data of all - the margin of error in this intellectually shoddy project."