The value-added perspective in performance tables is welcome, argues Chris Woodhead, chief inspector of schools, but the importance of raw test and examination results should not be sidelined
NEXT month sees the publication of the latest secondary school performance tables. They retain, importantly, the raw data on GCSE results but also include, for the first time, value-added data from key stage 3 to key stage 4. This move, despite the reservations expressed by some teacher unions about the quality of the data, will be welcomed by many who have argued that a value-added perspective should be built into the tables.
Few teachers now question the principle of publishing data on school performance. There is a general acceptance that the better the information parents have about their local schools, the more likely it is that they will make the best choice for their children and be able to play their proper part in their education. So, too, with school improvement.
Headteachers who know how their schools are performing in relation to other schools, and how individual teachers or departments are performing within the school, are in a position to direct resources and decide what action they should take.
The information needs of the parent and the headteacher are not, however, the same. Headteachers may use such statistical devices as regression analysis to establish the value-added in their school by sixth-form teaching.
Parents want a clear, concise picture of the school's results. If we are genuinely interested in the accountability of a school to its community, we forget this at our peril. The easier a table is to read, the more likely it is to be misinterpreted; the more comprehensive and subtle it is, the more likely it is to confuse.
There are difficult tensions here which we have not yet resolved. We should neither underestimate the ability of parents to appreciate that schools serving different communities are likely to achieve different results, nor overestimate their ability to reconcile statistics which may appear to give conflicting messages about a school's performance.
What is certain is that it is by no means easy to arrive at statistically secure calculations of the value a school has added to pupil performance. Such calculations need, ideally, to be based on hard evidence of pupils' achievement over time, in tests and examinations. We do not yet have, for most key stages, secure data of this kind, though the KS3 and KS4 cohort data are a move in the right direction. Previously, we have had to rely on a range of proxy measures, such as eligibility for free school meals.
My postbag suggests that not every headteacher believes that such reliance is sensible. In principle, therefore, the sooner we can move to comparisons of progress which are based on actual test results the better. In practice, however, we must be confident that the academic demand of the tests remains constant, that they are administered by schools with absolute professionalism, and that they are marked with meticulous consistency.
Equally, we must be certain that we know enough about pupil mobility in different schools to be confident that our calculations accurately reflect what the school has achieved. These are all difficult matters and, I am told by the statisticians, accurate value-added judgments involve some pretty abstruse calculations.
Let's assume, though, that these problems can be solved and that value-added judgments can be presented to parents in a straightforward, helpful way. There is one further issue that is more important than any I have raised thus far. Might our pre-occupation with value-added and benchmarking actually depress expectation in schools serving disadvantaged communities?
Everyone now accepts that schools, whatever their geographical and social circumstances, can and must make a difference. Nobody really pretends that the inner city and the leafy suburb school face exactly the same problems.
The debate now concerns the reasonableness of our expectations. The more we are inclined to pre-determine outcomes, the greater our scepticism about the value of raw data and, correspondingly, the greater our enthusiasm for benchmarking and value-added measures.
There are, however, headteachers in schools serving extremely disadvantaged communities who do not, for the purposes of public accountability, want to be benchmarked or judged in terms of the value they add.
They want the test and examination results their pupils achieve to be compared in an absolutely straightforward way with schools right across the board, not just with "like" schools, however defined. To do anything else is, in their view, to admit that their children are necessarily less able and, having depressed expectations, to end up with results that are lower than they would otherwise have been.
They have a point that we ignore at our pupils' peril: the subliminal messages may turn out to be the most important. In welcoming next month's publication as an important step forward in the introduction of a value-added perspective, we must never lose sight of the importance of actual test and examination results.