Until two years ago, "school tables" meant only two things: multiplication and furniture. Today it also conjures up an image of a mass of tabulated statistics that are more difficult to comprehend than multiplication and less useful than the furniture. That, at least, is the view of many critics.
But this week there is some tangible evidence that the Government accepts that tables can give a false impression of a school's performance. By endorsing the School Curriculum and Assessment Authority's recommendation that three pilot investigations should be launched into the "value" that schools add, Gillian Shephard has again done the decent thing. She has also shown how different her approach is from that of her recent predecessors, who worried that such research would only muddy the statistical waters and prevent parents from getting the data they need to make informed school choices.
Praise for Mrs Shephard must, however, be tempered: the quality and reasonableness of the SCAA working party's recommendations left her with no real alternative but to thump her rubber-stamp on the research proposal. The usefulness of the school improvement index that the working party suggested could be bolted on to the tables temporarily is, however, open to question. It would not take changes in pupil intake into account, and it would offer an assessment of a school's performance that would be unrealistically precise (it is surely not possible to measure a school's productivity to the decimal point). But in other respects the SCAA recommendations make a good deal of sense.
It is, for example, right to suggest that the beam of the research torch should be turned on primary schools as well as secondaries. Although SATs are not finely differentiated enough to permit value-added calculations between key stages 1 and 2, short tests could be devised to allow meaningful comparisons to be made. Indeed, Newcastle University has already begun work of this kind.
The working party's recommendation that information should be disseminated to schools on how to conduct value-added analyses is also important. Very significant insights into school effectiveness and pupil performance have been gained by value-added research and they should be available to everyone.
But before we all climb merrily on to this bandwagon it is worth asking where it is heading and whether it stands a reasonable chance of getting there. The concept of value added is an economic one that is useful in estimating how much profit a factory makes by turning a lump of metal into a widget. Such measurements are, however, much more difficult when the "product" in question is a young human being.
It may one day be possible to produce satisfactory value-added tables - those that have been published to date offer only partial truths - but as SCAA itself accepts, such correlation and regression analyses are beset with problems. It is, for example, extremely difficult to develop a simplified mathematical model for analysing such a complex, real-life situation. Is the present scoring method for GCSE grades the best one (A=7, B=6 etc)? Do two Es really equate to one B? And how do vocational qualifications slot into the scoring system?
The decision about which variables to include in the calculations is also a matter for debate. The SCAA working party could not have proposed a pilot study looking at the impact of class and race on school performance even if it had wanted to because the remit from Sir Ron Dearing did not permit it. That was not necessarily a bad decision because prior attainment is the best predictor of GCSE and A-level performance. But it is interesting that France has developed value-added measures based on social class (see page 11).
Some critics of the British tables will doubtless draw attention to this fact, but they should probably now accept that annual rankings are here to stay. A genie was let out of its bottle with the publication of those first tables two years ago, and it will not return whence it came, no matter how much the profession wishes it to.