Assessment: On-screen marking gets A for accuracy

Research shows that electronic assessment is just as consistent as paper-based marking, despite being more stressful for examiners
18th September 2009, 1:00am


On-screen marking of exam essays can be just as reliable as paper-based assessment, new research has found.

But the study by Cambridge Assessment discovered that computerised marking was harder work and more frustrating for examiners, particularly to start with.

The electronic process also made it more difficult for examiners to make quick comparisons between different candidates’ answers and to pinpoint where they were on the paper.

“Examiners appeared to work harder on screen to achieve similar outcomes to paper marking,” the researchers conclude.

They believe the consistency achieved by examiners during their research would not necessarily be possible under all circumstances.

Electronic marking of multiple-choice and short-answer GCSE and A-level questions has become increasingly common. But using computers to mark long essay answers is still relatively untried. OCR, Cambridge Assessment’s exam board, does not use the practice at all.

However, assessment board AQA piloted on-screen marking for long essay answers this summer on A-level classical civilisation and GCSE history B papers and plans to increase its use in future depending on its own research findings.

Edexcel has 3.5 million of its 4 million annual exam scripts marked online, including an unspecified number of essay answers.

Previous research suggested that reading on screen could inhibit comprehension and clarity, and demand more effort than reading from paper.

The Cambridge researchers used 12 examiners for their study. They were asked to mark a sample of English literature GCSE examination essays on computer and on paper.

The examiners achieved consistency in the marks awarded across both processes. But the researchers say it is possible that this was only because the limited number of papers involved left examiners with enough spare brain capacity to accommodate the extra mental effort that marking essays on screen involved.

They recommend more research to “explore whether there exists a point beyond which additional cognitive load might lead to unacceptable levels of marking consistency”.

Their initial study found that examiners’ frustration was significantly higher with on-screen marking and that they were more satisfied with their marking on paper.

“Most examiners mentioned the novelty of on-screen marking or specific elements of the software environment as causes for their initial frustration,” the researchers noted.

“However, once technical problems were resolved, examiners generally grew more comfortable with on-screen marking and frustration levels decreased.”

Some felt “energised” by particular aspects of the process such as the absence of handwriting problems and “seeing the scripts off by a click”.

But examiners missed the overview of a whole script, and the ability to check back quickly against other scripts, that paper marking allowed.

“When marking on paper, it’s easy enough to look back at an earlier script,” one examiner said. “It’s in a pile to one side, and even if one does not remember the mark given, or the candidate’s name or number, looking at the first sentence or paragraph identifies the script wanted. With computer marking, ‘flicking through the pile’ is neither quick nor easy.”
