Value-added and traditional league tables exaggerate differences between schools and can be an unreliable indicator of their performance, the Government admitted this week.
Most variation in schools' test results is caused by factors such as deprivation, prior attainment and special needs rather than differences in effectiveness, a Department for Education and Skills analysis of exam results found.
The report, published on the department's website, is a startling admission from a government which has put tables at the heart of its standards agenda and used them to set numerous targets for schools.
It will be seized upon by opponents of league tables as evidence that the Government's insistence on publishing them is unfair to schools and is bad politics.
A source with close links to both Downing Street and the DfES told The TES that ministers believe the media's focus on using league tables to highlight failure means Labour is not winning the credit it deserves for improvements in the education system.
But ministers are unlikely to follow the example of Jane Davidson, the Welsh education minister, and scrap them.
Instead, they hope new school profiles will shift the spotlight away from the publication of raw test scores, he said.
The DfES report, Statistics of education: variation in pupil progress 2003, said that the use of "threshold" measures, such as the number of pupils reaching expected national curriculum levels or gaining five or more good GCSEs, further distorts the view of schools' performance given by tables.
Targets are routinely based on threshold measures.
The report said: "If value-added scores are used as an indicator of how effective a school is, it is important to be aware of the potential uncertainty around the figures.
"The school could have been equally effective and yet the same set of pupils might have achieved different results on the same day."
Small differences in test scores between schools or over time should not be interpreted as showing significant differences in effectiveness, it added.
Small schools are also particularly vulnerable to year-on-year fluctuations.
Although prior attainment, the largest of the background factors, is included in value-added scores, the others, such as deprivation and special needs, are not.
Once background factors are taken into account, more than a third of schools have results which are not significantly different from the national average.
Critics say league tables paint an unfair picture of schools with disadvantaged intakes, a position backed last year by the National Audit Office.
The DfES report said that, once background factors are taken into account, differences between schools account for less than a twelfth of the remaining variation in pupils' results.
The report also noted that large concentrations of deprived pupils depress school performance. Schools where half the pupils are on free school meals have test results 22 per cent lower than those of schools with affluent intakes. The predicted difference, based on the national performance of deprived pupils, is just 7 per cent.
Critics of the current admissions system will cite the finding as evidence that it causes underperformance. Deprived pupils are concentrated in sink schools, they argue, because popular neighbouring schools can covertly select pupils from affluent backgrounds.
MPs warned last month that admission arrangements which lead to unbalanced intakes cause some schools "acute problems".
John Bangs, head of education at the National Union of Teachers, said: "This is remarkably enlightened. It is everything we have always said. League tables are not only inaccurate, they are about victimising schools at the bottom."