This is the season of league tables. Before Christmas we had primary and key stage 3 tables. Next week it is GCSE. And this year they come with even more trimmings. The primary tables have a value-added measure, showing progress between seven and 11, and the KS3 ones are a completely new venture.
It is a peculiarly English scenario. While the Scots announced in September that they were dropping performance tables, and Wales and Northern Ireland ditched them a couple of years ago, Westminster ploughs on, digging itself further and further into a morass of value-added statistics which take hours to sort out and end up providing us with very little more information than we had in the first place.
Two recent reports have attacked tables which give "raw" scores. The National Audit Office (NAO) suggested that differences in standards between secondaries reduced sharply once social background was taken into account, and the Institute for Public Policy Research, the Government's favourite think-tank, called the tables "poisonous" and demanded an end to the publication of raw scores.
But are value-added tables much better? Even government statisticians have their doubts. A revealing health warning on the primary tables says that "the smaller the number of pupils the less confidence can be placed on the value-added measurement". And most schools perform pretty much as you would expect, given the children they have.
Value-added tables may be a bit fairer than the raw variety but the government method of calculating progress is just one among many - if you think abortion is controversial, try value-added. Specialist schools have their own version and are sniffy about the Government's, the NAO devised yet another and some academics are dubious about them all.
Ministers offer two justifications for performance tables. The first is that they raise standards. If this were ever true, it has ceased to be so.
Tables persist and expand but in primary schools test results have stopped rising. In secondaries, the improvements in exam results at 16 continue much as they did before tables were invented. Schools in Wales and Northern Ireland match and even surpass exam results in England.
The second justification is accountability. Parents and taxpayers have the right to know how schools are doing. Only the most Neanderthal creature would deny this. We cannot go back to the days when parents had to ring newspapers and appeal for help in finding out the exam results of their local school.
But they don't have to find out through the annual ritual of performance tables. Instead, each education authority should compile a list of schools to be circulated to parents choosing primary or secondary schools. The list, to be completed in consultation with headteachers, would include the same sort of details as those on the Scottish executive website: exam results, a brief account of inspection reports, the proportion of pupils with free school meals, absence statistics and links to school websites.
These would be provided in the autumn term, when parents are choosing schools. At present, a government which argues that the tables are about providing information for parents publishes them too late to help anyone choose a school.
Newspapers might still try to compile league tables, though the latest figures show that only four out of 10 parents consult them. But a decision not to publish exam results nationally would be symbolic. It would show that ministers understand that measuring school achievement is a complicated business and remind everyone that there's more to education than A grades.