How can minnows like the Scottish Parent Teacher Council write to an education giant like Frank Crawford, head of the Audit Unit, and accuse him of knowingly setting targets that have a built-in guarantee of failure for some schools? Well, behind this accusation lies some determined and intelligent sleuthing which would do credit to TESS's own private HMI, Phil Harrass.
The starting point is the council's general unhappiness about the way statistics and measurements have come to dominate education. Examination results are used as a measure of a school - despite all protestations that they should be no more than an indicator - and no one bothers to explain the reasons behind the figures. When it looked as though school targets would be based on a statistical model, using examination results, we felt it was the wrong approach.
We suggested that it would be better if schools first identified their own weaknesses and undertook to make improvements in these areas. Schools would notify local authorities of their targeted improvements and the authorities would, in turn, inform the Audit Unit which would monitor progress. By collating all the individual school targets, it would then be possible to identify a national improvement in children's achievement.
A number of local authorities were sympathetic to our proposal but such a process requires people to think. It is so much easier simply to have a set of numbers and then see if the schools can match these numbers. That requires no effort on the part of those collecting the information. You can log everything into a computer and get some fancy statistical analysis of success - or failure.
While we were discussing this whole process, Alison Kirby, our convener, came across a reference to Harvey Goldstein who said that statistics on examination results were not reliable unless you had a big enough sample, which he suggested needed to be 1,000 pupils. As no school in Scotland has a fifth-year population of that size, Professor Goldstein's statement seemed to suggest that examination results could not be used to provide an absolute guide to school performance.
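Goldstein's point about sample size is not in dispute statistically. A back-of-envelope sketch (not from the article; the pass rate and year-group sizes are illustrative round numbers) shows why a single school's fifth year is too small a sample to read much into:

```python
import math

def margin_of_error(pass_rate: float, n_pupils: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for an observed pass rate,
    treating the year group as a simple random sample (a deliberate
    simplification for illustration)."""
    return z * math.sqrt(pass_rate * (1 - pass_rate) / n_pupils)

# A fifth year of 100 pupils versus Goldstein's suggested 1,000-pupil
# threshold, at an observed pass rate of 50%:
small = margin_of_error(0.5, 100)    # roughly +/- 10 percentage points
large = margin_of_error(0.5, 1000)   # roughly +/- 3 percentage points
print(f"n=100:  +/-{small:.1%}")
print(f"n=1000: +/-{large:.1%}")
```

On these illustrative figures, a school's results could swing by ten percentage points from one cohort to the next through chance alone, which is why no single school's figures can serve as an absolute guide.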
Alison tracked down Goldstein to London University's Institute of Education via the statistical journals held in Edinburgh University library. He responded by sending an article entitled "Excellence in schools - a failure of standards" from the British Journal of Curriculum and Assessment.
It said that free school meals account for at most a third of the variation between schools while intake measures account for between half and two-thirds. Moreover, Goldstein also stated that comparing schools required complex multi-level modelling which included all of the many variables. In other words, it was not valid to treat schools as comparable when free school meals was the only factor considered.
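The difference between those two shares can be made concrete with a small simulation (entirely hypothetical data, constructed so that intake attainment drives results strongly and free-meal entitlement only weakly, echoing the proportions Goldstein describes):

```python
import numpy as np

rng = np.random.default_rng(0)
n_schools = 200

# Hypothetical school-level data: results depend mainly on intake
# attainment; free-meal entitlement (FME) is only loosely linked to it.
intake = rng.normal(50, 10, n_schools)            # mean intake score
fme = np.clip(55 - 0.5 * intake + rng.normal(0, 8, n_schools), 0, 100)
results = 0.8 * intake + rng.normal(0, 6, n_schools)

def r_squared(x, y):
    """Share of the variance in y explained by a straight-line fit on x."""
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    return 1 - residuals.var() / y.var()

print(f"FME alone explains    {r_squared(fme, results):.0%} of variation")
print(f"Intake score explains {r_squared(intake, results):.0%} of variation")
```

On this made-up data, FME on its own leaves most of the between-school variation unaccounted for, while intake attainment captures the bulk of it. That is the council's complaint in miniature: set targets from FME alone and the unexplained remainder falls on the schools.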
We did our best to make people aware of this information but it is a sad fact of modern life that an authoritative body like the Audit Unit only has to put information out in statistical form for everyone's eyes to glaze over. Moreover, we were aware that there is some dispute between Goldstein and Michael Barber, the Government's current educational guru, and while we knew Goldstein was our champion, we were not sure how he was viewed in Scotland.
Meanwhile, the Forum on Scottish Education was pursuing another tack - although the targets were apparently based on free meal entitlement (FME), they were really based on actual uptake of that entitlement, and the level of uptake varied from school to school. A challenge was sent to the Audit Unit on this point.
The answer dismissed the quibble and quoted in support such "leaders in the field" of value-added calculations as Harvey Goldstein who apparently had used FME as part of multi-level modelling. Bingo. The unit was acknowledging the expertise of our champion. That also meant that it had to know of his work. Yes, indeed, Goldstein used FME as part of his multi-level modelling, but the important words were "part" and "multi-level". As Goldstein had shown, FME was only a minority part in that multi-level modelling.
The Audit Unit also wrote that FME was the "strongest available correlate of pupil achievement" and hoped everyone would focus on the word "strongest", whereas the really critical word was "available". It was not the strongest correlate but, in fact, a fairly weak correlate which could be nullified by differences in intake achievement measures. Emboldened by this approval of our champion, we challenged the Audit Unit and accused it of knowingly setting up some schools to fail.
One month on and we have finally had an answer. It still talks in terms of "best" and "available" but does go on to suggest that the purpose all along was to generate "provisional" targets using FME and that these are only "for consideration by schools and education authorities". Indeed, "variation in starting point" should be "taken into account when schools agree their final targets with education authorities".
So that's all right then - next time schools rewrite their targets, they should be congratulated for recognising the "provisional" nature of the Audit Unit's offering. Funny, I don't remember any major press release clarifying that, only criticism that so many schools had reduced their targets.
But where does all this leave the Audit Unit's expensive operation with FME? Provisional or fixed, it doesn't really change our original criticism for, according to Goldstein: "If all you've got available is the proportion of children on free meals, that's completely inadequate. It tells you nothing."
Judith Gillespie is development manager of the Scottish Parent Teacher Council.