There are essentially two factors governing any school's (or college's, or individual's) performance: the quality of what is given, and what is done with it. So if we are to judge the performance of an educational institution, we need to know how far it has developed, wasted or maintained the potential of its raw material.
That is why the annual publication of basic results tables continues to be greeted with hostility or scepticism by most of the education world, and why a straightforward way of measuring the value added by a school has become the holy grail for every statistician in the business.
For it has to be accepted that league tables are here to stay for a few years yet. Gillian Shephard, the Education Secretary, is at her crispest in extolling their virtues. David Blunkett, her Labour opposite number, is now declaring himself equally zealous - or preferably more zealous than thou - in defence of standards and parents' right to know. And both are looking forward to the development of techniques which will finally measure, in terms we can all understand, the value that a school has added to its pupils' examination performance.
We can expect soon the first report from a working party set up by the School Curriculum and Assessment Authority to look urgently at measuring value added in key stage 2 of the primary school, something Sir Ron Dearing rightly perceived as a priority if the proposed performance tables for 11-year-olds were not to be scuppered by another test boycott. What we cannot expect, however, is any promise that it will be easy, or that any of the multi-levelled jargon now pouring forth from competing researchers will convincingly untangle prior attainment from social factors from school effect.
A successful value-added approach will surely come, though it may have to wait five years for the first cohort of 11-year-olds' test results to work their way through. Meanwhile, most published attempts remain far too complicated and qualified to follow through to clear conclusions, and the heart sinks at Mr Blunkett's promise to add even more information to the existing tables. Something will have to go if the print is not to get even smaller, or the computers more over-heated. Probably it will have to be accepted that the columns of figures on truancy and the length of the school day, at least, add little to the sum of human knowledge.
It could with justice be said, of course, that the same is true of the whole production. We don't have to look at the league tables to know that exam results are improving. The examination boards tell us so in August when they are first published. Parents who know how to ask around can always tell which are the desirable local schools. College principals claim that the further education inspectors' reports are already providing much more valuable information about performance than the tables. Though some observers have been shocked by the huge differences between the results of the best and the worst schools, and between urban and suburban ones, HM inspectors have been telling us so for years (though they hoped a national curriculum and testing would put it right).
The difference is that now all parents (and journalists) are being let in on what were once the professionals' secrets, and that many more are using the information in their choice of school; that can't be bad. David Blunkett believes that every child is entitled to the highest standards of teaching and that parents must be able to compare the performance of their schools with others in the neighbourhood and the rest of the country; so he is right not to worry about U-turn taunts.
The professionals are also right, however, to remain dissatisfied with the performance of the tables themselves. Though they provide essential information to parents, it is not enough on which to base decisions about their children's futures, a point on which the most favoured independent schools are at one with the neighbourhood comprehensive: you can't pin a points score on ethos, or the school play, or sporting triumphs.
Nor does exposing the shame of those schools which persistently prop up the bottom of the table necessarily help. Some are galvanised into action, but may concentrate on improving the performance of the most promising pupils without raising the expectations of the majority; others do not have the capability within themselves to rise above the factors which dragged them down in the first place; many more sit just above the failure belt but also need outside help if they are to improve upon persistent mediocrity.
Where is that help to come from? The same marketing approach which brought us the league tables also privatised inspection and severely reduced the capacity of local education authorities to advise or support schools in trouble. School improvement is now a more serious preoccupation for many heads than the curriculum, the tables or even the budget. Some can rely on their own staff and governors for the necessary cooperation and development; others demonstrably cannot. This is the area where both Ministers and the Office for Standards in Education need to enlist the active collaboration of the LEAs again, if the league tables are ever to lead to positive action.