Latest figures show marks required for certain GCSEs have dropped in recent years - but is it fair to blame the examiners? Fran Abrams reports
The GCSE examiner's lot is not a happy one. In recent years, summers have tended to fall into a familiar pattern: weeks of preparation and marking followed by interminable, sweaty meetings in which grade boundaries are set. After many hours, agreement is reached and - usually - everybody goes home happy. Then the results come out. And suddenly it seems everyone else in the country thinks the examiners have yet again got it wrong.
The latest evidence from the Qualifications and Curriculum Authority is unlikely to make this year's exercise any more comfortable. Figures released in answer to a parliamentary question show a drop over several years in the marks required to gain certain grades.
Nor is this pattern confined to one subject. The figures cover the higher tiers of both maths and English, and show similar pictures for both the A grade and the C grade. In all cases, the marks required in 2002 were lower than for 1997 - though the drop varied from eight percentage points for grade A maths to 19 points for grade C English.
So is this proof of what many have suspected for years - that GCSE standards have been allowed to slip over time? Or is there some other, more complex explanation for what has happened?
Anne Barnes, a senior examiner in English for more than 15 years, believes there is. Standards, she argues, have in fact risen - that is to say, GCSE candidates have been getting stronger, and that has led to greater numbers gaining higher grades.
"The awarding of grades is a very complex business," she says. "At the meeting we read between 12 and 20 scripts for each grade on each paper. We look at scripts from last year and we look at statistics. It isn't something people do off the top of their heads. Before Christmas we had a meeting which started at 9.30 am and went on until 8pm.
"I think the process is both rigorous and effective. I am totally confident that the process is as good as it could possibly be."
She thinks there could be an innocent explanation for the drop in the actual marks required to achieve particular grades. When a syllabus has been in place for a few years, she says, the mark schemes might start to become more rigorous. So students might tend to get lower raw scores, and the mark required for an A grade, for example, would have to drop in order to be fair to them.
There is some evidence to corroborate her view. In 2004, when new syllabuses were examined for the first time, there was a significant rise in the marks required - indicating that students might have found the new papers easier. However, there was no such rise when the previous new syllabus was introduced, in 1998.
There is some independent evidence, too, suggesting the GCSE exams might have become easier during the late 1990s. For more than a decade now, Durham University's Curriculum, Evaluation and Management centre has been setting its own tests in maths and vocabulary for Year 10 and 11 students.
About 1,300 schools now use these tests as a baseline from which to calculate students' progress or value-added. And a study by the centre's director of secondary projects, Dr Robert Coe, suggested it was indeed becoming easier for pupils of the same ability to get higher grades.
Dr Coe compared pupils' results in the centre's "Yellis" test with their later performance at GCSE. He looked at pupils with identical "Yellis" scores and found that over the five years to 1998, the GCSE grades of such pupils rose significantly.
There could be alternative explanations for this phenomenon, of course. And Dr Coe says that his more recent impressions suggest the rises may since have levelled off. But it does seem to add to the body of evidence suggesting some "drift" in the GCSE grades.
Dr Coe concluded that when measured against a student's general ability, GCSEs had indeed become easier: "There may be one important sense in which it is now true to say that the same grade is worth less," he wrote in a paper.
"The grade itself is ... most often interpreted as a proxy measurement of (general) abilities. If that is the case, then it matters less whether the award of each grade represents the same achievement than whether it signifies the same level of general ability in the candidate. The evidence seems to be that it does not."
Other experts tend to support his opinion. Roger Porkess, a former A-level examiner in mathematics, says he is sceptical about the reasons more students are gaining A and C grades.
"I certainly don't think it's due to a huge improvement in students' performances," he says. "I don't see the evidence of that in people who are coming through to A-level.
"By then, you are really interested in a student's ability to handle algebra. But you can get an A at GCSE without doing any of the algebra questions."
He says that in these league-table obsessed days, it is not surprising that teachers are coaching their students in exam technique. So, for example, if there are easy marks to be had on one section of the paper, students may be encouraged to do that first before tackling the more difficult areas such as algebra.
Most experts agree that exams are not an exact science - either for the students who sit them or for the examiners who mark them. Professor Roger Murphy of the University of Nottingham, an expert on exams and assessment, has spent many years studying the system.
In the mid-1990s he undertook a detailed study of GCSE examiners' meetings which found that decisions about grade boundaries varied considerably. His team observed poor working conditions, tired examiners and even rivalry or outright hostility between examiners.
He does not believe much has changed since then. And he believes exam boards in competition with one another may indeed lean towards slight lenience.
"If they are not quite sure which way to jump, you can understand why examiners might say: 'Let's err on the side of letting slightly more students through'," he says.
His view is that examiners and their decisions vary, just as students' performances might vary on different days. And he raises an even more radical notion: maybe it doesn't even matter whether GCSE standards remain constant over time. "We're kidding ourselves if we treat exam grades as the last word on the system," he says. "The more mature way of looking at exam grades is to recognise the variability in them, and to live with it."