Teachers and parents left researchers from the National Foundation for Educational Research with a clear message: they did not understand much of the data presented to them (page one). That can be taken two ways: they might be much more interested if statistics were presented more meaningfully or, alternatively, they are not that interested anyway, no matter how much the statistics are simplified.
The evidence from parents across Britain is worth repeating for adherents of league tables and public humiliation through performance indicators.
Reams of statistics are all very interesting and make good copy, but parents want to know about their own children and how they are getting on.
They are less interested in a statistical comparison between their school and its neighbours, even when it comes to making a choice of school - if there is a choice.
Ministers might like to consider this as they push through their parental involvement bill. What they perceive to be for the good of parents and the education system may not be uppermost among parents' priorities. Equally, we may at last be moving in the right direction by trying not to be overly prescriptive in determining structures for parent involvement.
To be realistic, the vast majority of parents will not look too carefully at published information on school performance. What matters is that the school their children go to is a good school, whatever the catchment area.
One way of establishing that is through performance data but it is not the only one and data can be easily misread, as the Statistical Commission emphasises.
The Scottish Executive has tried, somewhat unsuccessfully, to limit the release of full test results because of the apparent damage to the system and the unfairness of penalising schools in disadvantaged areas. The commission takes a different slant. It wants "helpful advice" published alongside statistics rather than any constraints on access to data. But who is going to read the helpful advice when the raw data are there in front of them?