Marking 'best ever'
STANDARDS of marking in last summer's exams have confounded popular belief by being higher than ever, according to a detailed analysis by the Scottish Qualifications Authority.
The authority has fiercely defended its 7,000 markers ahead of its latest recruitment campaign, launched two months ahead of schedule in an effort to put the 2001 exam diet back on track. It will begin by offering a public apology to existing markers for the failed administration of their work that led to the chaos. It will also open up its procedures to teachers and lecturers.
Concerns about markers' dented morale have forced the authority to bolster its recruitment campaign, which will aim to make more than 16,000 annual "appointments", including markers, examiners, setters, vetters, moderators and scrutineers. Most have more than one role.
Figures disclosed in the independent report from consultants Deloitte and Touche into the exams fiasco reveal that 85.5 per cent of markers were awarded an A grade, in contrast to 84.1 per cent in 1999 and 82.4 per cent the year before.
Only 2.2 per cent of markers, or 155 out of 7,060, were given a C rating and had their scripts remarked. This is the same percentage as in 1999 and below the 2.6 per cent level of 1998.
Some 88 per cent of markers had marked before. Deloitte and Touche investigators found only three subjects - Higher media studies, Higher management and information systems, and Higher business management - in which principal assessors questioned the quality of their markers.
Tom Hamilton, the SQA's senior appointments official, said: "The marking has been rechecked. We have had formal investigations and this has produced no evidence the marking itself was flawed."
Ian Matheson, principal assessor for Higher history, went further. "It's time someone stood up and defended the markers. I have been extremely frustrated at some of the press reporting and some of the markers are very angry at the implication that marking has been less than the normal standard. Other principal assessors share my anger and frustration," Mr Matheson said.
He and colleagues had spent six days studying appeals and found no cause for concern. "The vast majority of appeals granted in history are not granted because we remarked scripts and found faults with the marking. It's because of alternative evidence from centres," he said.
Mr Matheson admitted that there were problems with recruitment last session, but said this merely reinforced the quality of marking. "A much larger percentage than usual was marked by the core examining team," he said. Other strains on the system led to the calamity. "I was not surprised," he said.
Mr Matheson believes that several factors are responsible for discrepancies between estimates and actual awards. Evidence from this year's appeals reveals that the quality of internal marking is less likely to meet national standards if teachers and lecturers are not involved in external marking. The SQA has highlighted this factor in previous years.
Mr Hamilton added that a number of issues had emerged, not related to marking. A number of centres had misjudged the new grading system for passes. Students expecting A passes might only have received a B or C.
Teachers' misinterpretations of how to present courses might also explain some differences. "That is one of the theories and we are looking into it," Mr Hamilton said.
Where it went wrong, page 8