Grammar schools dominate the top positions in the tables for progress between age 11 and age 14. This is hardly surprising. It was always part of the case against grammar schools that they boosted self-esteem - and consequently performance - among those who passed the 11-plus while depressing it among those who failed.
In the early 1960s, one education authority decided to abolish selection when it discovered that, during secondary schooling, IQ scores actually fell among the 11-plus failures.
When it comes to progress between 14 and 16, comprehensives do much better.
But that is mainly because they score badly at age 14, creating ample room for progress. The Guardian had a stab at combining the two value-added tables. It produced a "top 100" by taking only those schools that got in the top 25 per cent at age 14 and then looking at their performance at age 16.
Grammar schools took 14 places in the top 100, though they account for fewer than 5 per cent of secondary schools. Many other places were occupied by voluntary-aided schools where there may have been covert selection.
A few inner-city schools got in the top 100, but not many. Most of the conurbations - Manchester, Liverpool, Leeds, Sheffield, Newcastle - were absent, as were most of the inner-London boroughs. The large majority of comprehensives in the big provincial cities scored below 100 for value-added at both 14 and 16.
The conclusions accord with what ought to be common sense. If a school has an entry with a large proportion of children from disadvantaged homes, it will struggle to "compensate" them. Indeed, it will do well to stop them falling further behind. So powerful are the effects of family background before children start compulsory schooling that teachers can do little more than patch things up. We may all wish this were not so, but it is.
I do not know why people thought value-added tables would be significantly fairer. Nearly everybody - except MPs and right-wing journalists who have only a fleeting interest in truth and fair play - understands that the tables of raw results cannot properly measure performance. Value-added tables carry a spurious authority when, in reality, they are flawed in almost every respect.
I hope it will soon be agreed that league tables should not be published at all. I do not mean that exam results or value-added scores should be kept secret. I mean that they should not be published centrally by the Department for Education and Skills. Information in standardised form should be available at each school to prospective parents.
This could include exam and test results, truancy rates, exclusion rates, annual numbers of drama productions, the average speed of Year 9 over 100 metres and anything else the target-setting wizards in Whitehall can think of.
Enterprising journalists could collect the lot and publish them as league tables all the same (though, given how little most media organisations now invest in news gathering, the risk may be slight). But I do not see why a government department should act as though it were a consumer information service, still less as one that publishes duff information.
By doing so, it sends out two messages. First, that of all the information that might be made available about schools, the league tables offer the most important and reliable guide to their merits. This is the opposite of the truth.
The second message is that parents should approach schools as competitive consumers, seeking the best "deal" for their children. I understand why parents do this, but it is not the Government's business to encourage such an approach or to facilitate it.
Peter Wilby is editor of the New Statesman