Controversial measures of progress which led to an outcry from heads last year are still being developed, reports Sarah Cassidy.
MINISTERS' plans to introduce a controversial "progress" measure have been put on hold because of fears of angering high-flying schools.
No "value-added" measure is now expected to be published until at least 2003, prompting accusations that the Government plans to scrap the idea.
It was originally hoped this year's performance tables could include a more sophisticated measure of school achievement than just raw exam scores.
However, the "value-added" index - which would have measured the improvement in pupil performance between 14 and 16 - has been dropped after concerns that it would disadvantage "top-performing schools".
It would have been a step towards a full "value-added" system which could track the improvement made by pupils between starting and leaving school.
A spokeswoman for the Department for Education and Employment said: "Ministers decided not to repeat last year's interim 'school progress measure' given the concerns expressed by many schools about the methodology and in particular whether it fairly reflected progress in the top-performing schools. The progress measure was not genuine, pupil-level value added.
"The progress measure was based on comparisons of school summary-level results for around 3,200 schools. Full value-added measures are based on comparisons of pupil level results for some 600,000 pupils; they are much more accurate and take into account pupil movement from one school to another."
Harvey Goldstein, professor of statistical methods at the Institute of Education in London, said: "They are effectively saying they are not going to do it. They are talking about 2005 or 2006 for a KS2 to GCSE measure, by which time we will be two governments further on. They are totally inconsistent - on one hand they say we need a value-added system, but on the other they fail to point out the very serious limitations of publishing tables of raw scores.
"They are totally misguided in that they only want to develop value-added to use it to publish new rankings, which is a scientific nonsense. Value-added can show up a few schools at either end of the scale, but practically everyone else is somewhere in the middle."
Pupil mobility, which was found to be "much greater than expected", has been the biggest obstacle to developing the measure, the spokeswoman said.
Although student tracking is currently possible, its reliability should be boosted by the use of individual pupils' identification numbers, due to be piloted this academic year.
Officials are reluctant to say when the first such measure will appear in the tables but are considering measuring how schools boost pupil performance in the early years of secondary school by comparing individuals' key stage 2 and 3 test results. This would not be possible until at least 2003.
The Government's loss of faith in the progress measure began last year when ministers were forced to make an embarrassing U-turn and axe it from the performance tables after hundreds of complaints from schools.
Headteachers protested that the measure only assessed improvement over the two years between KS3 and GCSE, depended on national tests regarded by many as unreliable and did not include pupils absent for the tests.
Ministers agreed on a last-minute compromise and only identified those schools which had made better than average progress under the controversial measure, using a tick against their names.
Value-added experts are divided on whether ministers were right to halt publication.
Professor Carol Fitz-Gibbon, director of Durham University's Curriculum, Evaluation and Management Centre, backs their decision, arguing that the publication of a flawed measure would be unfair to many schools. She said: "It is essential to have pupil-by-pupil data. A measure based on school averages does not take pupil turnover into account and schools with high turnovers could be seriously misjudged.
"Getting individual pupil numbers going is a big job...but I think they are quite right to delay until they have a more reliable measure.
"If you create a value-added measure based on just two numbers - the school's average KS3 score and its GCSE results average - then you are dealing with extremely gross aggregate figures."