The unveiling of exam tables no longer creates the stir it did when the last Government started annual publication. The arguments for treating raw data sceptically are too well known, and the same schools top and tail the tables year in year out, reflecting the obvious fact that performance is closely associated with the social characteristics of the area from which schools draw their pupils. It is an interesting sidelight, however, that small island schools usually do well, whereas inner-city schools with equally small rolls, and therefore the same opportunity to give greater attention to individual pupils' needs, do not benefit in the same way.
As media interest in the tables declines, attention in schools appears to grow. Our reports on pages four and five show how seriously schools are addressing themselves to the messages buried in the columns of tables. The mass of statistics, which frustrates attempts to turn the tables into "leagues", allows schools and departmental heads within them to draw conclusions about their own work and how it compares with that of other departments in similar schools.
Long before tables were devised, schools took seriously the results of Ordinary and Standard grade and Higher exams. At one level the concern was for the welfare of individual candidates whom a teacher had looked after for a year or more. At another level year-by-year performances were compared and no doubt analyses done across subject boundaries. So the availability of published statistics, and of those now supplied in even greater detail to individual schools, does not signal a new culture of self-examination in schools. It does, however, give more flexible tools of measurement.
The notion of "value added" has been around since the last Government devised the tables. Headteachers, careful not to reject the principle of making public evidence that would lead to comparisons, relied on the statisticians' claim that the misleading element in the tables of raw data would be removed if the composition of the secondary intake were recognised along with the "value" subsequently added in four or five years of teaching.
Conservative ministers promised publication of suitably weighted tables once the statistical reliability could be guaranteed. This has not yet happened despite extra pressure from a Labour minister. We are moving in the direction of a more sophisticated approach and extra information for parents, but there is clearly still a cautious approach in the Scottish Office.
The most useful messages are those conveyed to schools in a plain brown envelope. Limited application of value-added techniques traces pupils' progress, department by department, from Standard grade to Higher. Relative ratings will allow schools to identify strong and weak departments on the basis of three years' trends. It is these tools which heads are using as part of their schools' self-evaluation and development programmes. None would say that their intention is punitive, but some of their staff might be sceptical.
The number-crunching facility now available will be meaningless unless it leads to improvements in the classroom. When a headteacher has used statistical evidence to support his own knowledge that a department, or a teacher within it, is weak, the remedy must lie in extra support, not castigation. But the virtual demise of local-authority subject advisers and the lack of money for in-service training are serious constraints. Pupils in a shaky department need more consistent teaching, not the chopping and changing required to cobble together a bit of extra help for one or more teachers.