There may have been some question marks over whether parents had ever asked for such information, but certainly few others had. The only relevant publication is the data on exam results: whatever their shortcomings, the media have used them increasingly to promote the good work that schools are doing. The publication of the tables now excites little of the furore it once did, and even "league table" formulations of the results say more about the sophistication of those who publish them than they do about the schools themselves.
It is, of course, time to move on - time, as England will do next year, to get behind the raw scores to the figures behind the headlines. Our report from south of the border last week showed clearly that the value-added pilot in primary schools has exposed how raw scores shield schools with apparently good test results that are simply coasting, while pillorying those whose superficially poor results conceal considerable progress. So what we now require, 10 years after the Forsythian initiatives were launched, is "real information for parents".
But surely little reliance can be placed on the information in the rest of the series. Leaver destinations tell us as much about local economies as they do about local schools; attendance figures reveal little more than that there is likely to remain a hard core of school refuseniks, with barely any progress made over the past six years (not to mention the differing interpretations of absence in different authorities); and school cost figures simply confirm the blindingly obvious: that larger schools cost less to run.
We are not suggesting, in these days of open government and transparency, that such information should be concealed. But any suggestion that the statistics provide meaningful information for parents must be challenged.