Value-added concepts reinforce a basic pessimism that disadvantage can't be reversed, say Kevin Orr and Colin Mair
THE RECENT experience of Edinburgh in analysing value-added school league tables attracted much attention and apparently pointed to an "unexpectedly" wide variation in test results from primary schools with the same social background.
In February, Edinburgh became the first council to publish primary figures for individual schools, showing free meal entitlements and how schools are performing against national 5-14 targets in reading, writing and maths. This was an attempt to move towards the value-added tables for which the education profession has long argued, and to allow the performance of individual schools to be judged "more fairly" in the light of the socio-economic nature of each school's intake.
The central criticism of league tables has been that they take no account of schools' varying intakes - that they compare deprived schools with affluent suburban ones, independent with state schools. This means that any comparisons are unfair to schools struggling with poor intakes and unhelpful to parents trying to make judgments about the performance of schools in their area. The work done by Edinburgh and other authorities in Scotland is presented as an attempt to address this and to make the judgments more meaningful.
Underlying the controversy about tables is the importance of social class. The link between socio-economic factors - income, housing, level of parental education - and a child's level of educational attainment is widely accepted. Children from working-class backgrounds tend to do less well than middle-class counterparts.
What Edinburgh appears to show is that there are wide or "startling" variations in how well schools with similar intakes are doing. From a managerial point of view it may well have been important to establish this. The data can be used to challenge empirically the lazy belief that there is little value that any one school or education authority can add to the "expected" performance of schools from poorer areas. What the value-added tables in this instance also challenge is the complacency that may surround schools in more affluent areas, with more motivated pupils and a greater level of parental involvement.
Both of these points are useful to highlight. But why would anyone be particularly surprised? Both arguments have always been central to the case against league tables, and while it is no doubt useful to an education authority to have supporting data, the move to value-added tables obscures some of the underlying issues affecting attainment and ignores wider questions about the Government's strategy of social inclusion.
The danger is that a valid point is conflated with an invalid one. It was always stupid to see examination results as a measure of individual school performance, given the systematic nature of the statistical relationship between background and attainment. However, it is equally foolish to suggest that the pattern of crude examination results does not pose critical questions about the effectiveness of the Scottish education system.
The questions posed are, however, for ministers not headteachers. What is it about the concept of education, the design of the curriculum and the approach to assessment that leads to social exclusion for a substantial minority of children? What is it about other aspects of social policy (on housing, health, social security, employment) that results in them reinforcing and exacerbating patterns of educational disadvantage over generations?
At best, value-added tables, as in the Edinburgh case, raise questions about whether "good" schools are quite as good as they ought to be given their intake, and whether schools that emerge as "bad" on crude examination results are in fact doing rather well, given their intake. If used sensibly, this information may encourage mutual learning and the dissemination of best practice among schools. It may also provide a more informed basis on which to target resources.
The trouble is that, despite a more sophisticated approach, the assumption of intake-related failure is built into the very way we measure performance, and accepts that the initial disadvantage many children suffer will be routinely reproduced through the school system. Value-added is, after all, about adjusting the tables in line with the "expected outcomes" due to social class. The questions about the design of the system itself, and its interaction with other aspects of social policy, are fundamental because they challenge this structural pessimism and hold ministers, not teachers, to account for their performance.
The move to value-added tables on the part of education authorities is part of a defensive and excusatory strategy. With the old league tables, schools frantically stressed to parents the context in which these had to be read. There was a conscious process of lowering expectations and an implicit resignation that little could be done to overcome deprived backgrounds. This was aptly summed up by a former director of education who described the tables as nothing more than a Cook's tour of deprivation in Scotland. The shiny new value-added tables merely reinforce the importance given to presentation and the refusal to deal with questions about the failure of the system as a whole.
The current pattern of school examination results provides a key indicator of the failure to combat disadvantage and promote social inclusion. It should provide the starting point for a comprehensive review of policy and resource allocation, and the key benchmark of whether the parliament has delivered "value added" for the children of Scotland.
Kevin Orr and Colin Mair are employed by the Scottish Local Authorities Management Centre, Strathclyde University.