In 1980 the Schools Council warned that public examination grades should only be treated as being "accurate to about one grade either side of that awarded". On this advice, an individual with a grade B at GCSE has an achievement that can be regarded as lying somewhere in the grade A-C range. That is not very precise, but it is nevertheless a useful rough-and-ready indicator of achievement. No one seemed to worry too much about it at the time, but in the 1990s examination grades matter more, and so this characteristic is more significant than it used to be.
Now that we have league tables, and people tend to use GCSE examination results as the barometer of educational standards, the nature of the debate has changed and the stakes are suddenly much higher. Higher probably than they should ever have been allowed to get.
There has been, for quite some time, a considerable body of research evidence about the accuracy of public examination grades, and none of it provides tremendously reassuring reading for those who want to believe in their absolute value. It has been demonstrated that grades can vary when the same pupils take different examination papers, when different markers mark the same examination scripts, and even when the same pupils take the same examination paper on a different occasion. Complex public examinations, it has been shown, can never be precise, accurate and totally reliable.
In this context, several of the messages contained in last week's Channel 4 Dispatches programme about examinations are hardly surprising. There is nothing new in showing that grades vary when individual pupils take a different paper, or when individual teaching groups shift from one board to another (see The TES, October 13). However, the producers of Dispatches conjured up eerie music and shadowy figures, supposedly fixing GCSE results in smoke-filled rooms. This might have looked all right in a production of the witch scene from Macbeth, but it gave a somewhat alarming twist to what could have been a worthwhile consideration of the matter.
These were unfortunate aspects of what could have been a good documentary. The viewer was confronted with a shock-horror presentation that leapt from one hard-done-by individual to another. Examiners were resigning and telling their stories to the media, dependable examination boards were going bankrupt, and school after school was switching to another board to guarantee better results for its students. Large sums of money, we were told, were at stake, and the examining groups were portrayed as ruthlessly slashing their standards in order to undercut the opposition and make a quick killing.
Not far removed from all of this are some really fascinating questions, such as:
* Is the use made of GCSE results, now that most 16- to 18-year-olds stay in education and training, worth all of the resources that go into generating them?
* Why are GCSE results being used as a national barometer of educational standards, when year-by-year comparisons are rendered more or less meaningless by all of the uncontrolled changes that influence them?
* Are competitive league tables unhelpfully placing the focus of attention too strongly on GCSE results, when they don't warrant the elevated status given to them?
These, however, were not the questions that Channel 4 chose to address. Like many other sensational media stories, this one dwelt on evidence that supported the case its makers wanted to argue.
The programme barely referred to the fact that the improving trend in GCSE results over the past few years was more or less halted this year. It paid no attention to the substantial changes in the proportion of middle-class children entering for GCSE over its first seven years, which in itself almost certainly caused grades to improve and then plateau this year. Professor David Burghes's Exeter research, showing that a third of students who sat exams from two different boards on the same day did better on one than the other, was presented without any of the details needed to assess its significance.
Overall, this programme fell into the trap of diminishing the pride that many young people will have taken in their GCSE results this year. After all their hard work and justified successes, they do not deserve to have their achievements devalued by alarmist cries of falling standards. Yes, of course public examinations are only fallible indicators; that is something that needs to be remembered by those who wish to use them for other purposes. We are fortunate in having a strong research base to inform our discussions about public examinations - it is just a pity that the producers of this particular programme ignored nearly all of it.
Professor Roger Murphy is Dean of the Faculty of Education at the University of Nottingham, and the president of the British Educational Research Association. He has been involved in research into examinations for more than 20 years.