The Department for Education's latest briefing paper Value Added in Education underlines the Government's commitment to measures of school performance that take account of pupils' starting points as well as their raw examination scores, provided such measures are reliable, easily understood and based on actual achievements. In a commendably brief and comprehensible account of the technicalities, the paper also underlines the difficulties.
The School Curriculum and Assessment Authority has commissioned research to shed empirical light on these. But questions about the purpose to which value-added measures should be put also have a bearing on their design - and on the insistence that they be based upon pupils' prior attainment.
The DFE paper suggests two main uses: at the national level, to allow consistent comparisons of the performance of different institutions; and at the local level, where more detailed analyses can inform the targeting of resources.
League tables that took account of prior attainment would no doubt be fairer, lend some comfort to schools achieving miracles against the odds and expose better-endowed schools that are unduly complacent. It is doubtful, however, whether suggestions that a school's results are only as bad as they are because of its poor intake would reassure prospective parents.
Add to this the fact that it will be the end of the century before pupils with standardised prior attainment in the form of key stage 2 tests take their GCSEs; doubts about whether the tests or exam results lend themselves to statistical juggling; the difficulty of taking account of vocational qualifications awarded on a different basis; the administrative complexity of collecting information from different schools and colleges; the danger of overloading the published league tables; and the DFE's own acknowledgement that the results will not, in any case, be accurate enough to put schools in a reliable rank order. You are bound to ask whether the game is worth the candle.
The Government might, however, consider whether it should be doing more to encourage the use of value-added results to target resources. Socio-economic adjustments to public performance tables run the risk of legitimising low expectations. But if some schools and children are expected to achieve higher standards they may require greater support. The question is, which? To find that out may require use of social measures available now rather than educational ones in five years' time. The Government already accepts, in its own standard spending assessment, that some children cost more to educate because of adverse social and economic factors. And yet those differentials are barely acknowledged in the differences of funding between schools agreed by the DFE or in the £100 million the Government plans to give to schools to help them raise their standards (see page one).
The real pay-off from further value-added research could be in clarification of the relationship between poverty and performance. It could help, for example, to sort out which of the estimated one in eight schools struggling to provide adequate education are overwhelmed by adversity and which are suffering from poor management. Otherwise, redistribution of resources runs the risk of rewarding failure.