Markers 'didn't fail'

SQA insiders deny 'wild allegations' of malpractice and point the finger at teacher inexperience

TEACHER unfamiliarity with Higher Still may be the cause of major discrepancies between estimates of pupil grades and actual awards, exam analysts suggest.

They say there may have been nothing radically different in the way papers were marked and processed, and little change from previous years. There has been no fall in marking standards, they insist.

The Scottish Qualifications Authority has been quick to defend the integrity of its marking amid what one spokesman said were "wild allegations" of malpractice.

He said: "It is entirely understandable marking has been caught up in this but it does not serve young people very well for some to bring forward anecdotal stories and hearsay. This is a data management problem and we are fixing it."

Margaret Nicol, Educational Institute of Scotland president and an SQA board member, urged action to restore public confidence, a view shared by the Association of Directors of Education in Scotland. Both want any confusion about standards of marking removed.

The directors have called for a review of the exam timetable; the reduction in time for marking, processing and quality assurance; the complexities of the exam certificate; and the first year of Higher Still and ways to simplify it. Ms Nicol said that the school assessment focus group in June had been told extra markers were all experienced and getting double pay for double work. "If this is not true it must be refuted as quickly as possible, otherwise the public will be left with the impression that the results are not true results."

Dr Jim Page, Higher physics principal assessor, defended his 60 markers and their work on 1,100 papers. "Attacks on them are definitely unfair and a slur on these people," he said. All had attended markers' meetings and met rigorous procedures.

Evidence of exceptional discrepancies in grades remains patchy across the country. Some schools report as many as three times the number of unusual grades compared to teacher estimates but many others noticed little change. Any problems may be limited to some subjects. An exam insider said: "The same procedures were followed and pass rates are the same as in previous years."

Teacher views about pupil performance in the new system of banding are one possible source of discrepancy. "Teachers were never terribly good at estimating bands within the old Higher. Teachers being two bands out was not uncommon. In the past, there was quite a wide variation between estimates submitted and final outcomes," he said.

A 12-point banding system has been reduced to seven, making it feasible that lack of experience with the new system could account for some of the difference. The exam system has always had rogue markers and pupils wide of estimates, but the appeals system took care of it, the insider said.

Marie Allan, former president of the Scottish Secondary Teachers' Association and another SQA board member, believed that there was little difference from previous years. An English examiner and marker, she said: "Procedures were carried out in exactly the same way. There was a lot more marking and I took extra home but the same standardisation and finalisation procedures were used."

Others suggest there may be something inherently flawed in the new Higher Still processes that caused the gap. Among possible explanations is the requirement merely to pass a unit at level C. This could impact on grades in the external exam.

The shortened exam timetable could be another reason why some pupils did not do as well as expected, while it is suggested that below-par performances by whole classes or departments are probably due to data failure.

Richard Goring, SSTA education convener, said: "The vast majority of marking is probably fair but there was the potential to have some poor marking done. Some people did not attend markers' meetings and there was panic at the end of the day. People were being asked to mark at the last minute and some had not marked for many years."

Critics believe that, in some areas of marking or marking administration but not all, shortcuts were taken to meet the results deadlines.


* Has quality control failed?

"We followed the same quality control procedures as usual but obviously there was a compressed timetable."

* Were inexperienced teachers used?

"Normal eligibility criteria were retained. Teachers must have three years' experience of entering candidates for qualifications. There were new subjects introduced and in these cases we recruited people involved in developing these courses."

* Were markers drafted in late?

"There is always a turnover and every year we employ people who have never marked before as long as they meet the eligibility criteria. Sometimes markers withdraw at the last minute which means we have got to approach others at short notice."

* Did some fail to attend markers' meetings?

"It's an absolute condition that markers attend meetings. If they cannot attend, they must withdraw. But in some subjects it has always been the practice that experienced markers do not attend meetings and in exceptional cases markers are briefed by principal assessors."

* Were some scripts marked by pupils' own teachers?

"Markers are asked to return scripts if there is a conflict of interest."

* What about stories of bowling green buildings being used for markers?

"It's a good facility and has been used for many years. It's a suitable venue."
