Robo-marking sees off red pen
The system has already caused a stir in America. Now Edexcel has announced plans to trial the Pearson Knowledge Technologies programme in Britain - initially on dummy papers.
Edexcel managing director Jerry Jarvis has been impressed with its potential.
But examiners in the UK have criticised the existing use of computers in assessment. They have argued that computerised marking of multiple-choice tests, and on-screen marking by humans, is turning assessment into "monkey work" and encouraging boards to set simple, easy-to-grade questions.
However, supporters of e-marking were adamant that a more adventurous use of computers could improve standards and cut costs. Individuals are error-prone, but computerised essay-marking ensures consistency and guarantees pupils a fair grade, they argued.
Geoff Barton, a fellow of the English Association, advised a wait-and-see approach. "It's easy to dismiss these things out of hand, but actually a lot of the things teachers pick up on are basic spelling and construction,"
he said. "It could take some of the routine marking away and allow teachers to concentrate on the subtleties."
He was something of a lone voice among English teachers and academics. But developers claimed the software had marked thousands of papers at levels of accuracy that would shame the best-read examiner. The software, of which Pearson is the leading proponent, works like a sophisticated grammar-checker.
A dummy run on the organisation's website will mark your work for spelling, style, redundancy, content, sentence organisation and mechanics.
The developers said that it strips down documents and searches for key words. The more - and better - vocabulary used, the higher the mark.
Edexcel said that once it has been fed textbook materials and 200 to 300 tests, marked by an examiner, the software recognises the words and patterns that contribute to a first-rate answer and can go on to mark alone.
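The training process described above, feeding in a few hundred examiner-marked scripts and then letting the software associate words and patterns with high marks, can be caricatured in a few lines. The sketch below is purely illustrative: the toy scoring rule (weighting each word by the average mark of the training essays it appears in) is an assumption for demonstration, not Pearson's actual method.

```python
from collections import defaultdict

def train(marked_essays):
    """Learn a weight for each word: the average mark of the
    examiner-marked training essays in which that word appears."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for text, mark in marked_essays:
        for word in set(text.lower().split()):
            totals[word] += mark
            counts[word] += 1
    return {w: totals[w] / counts[w] for w in totals}

def score(weights, text):
    """Score an unseen essay as the mean weight of its recognised
    words; vocabulary that matches first-rate answers lifts the mark."""
    known = [weights[w] for w in set(text.lower().split()) if w in weights]
    return sum(known) / len(known) if known else 0.0

# Toy stand-in for the 200-300 examiner-marked scripts.
training = [
    ("photosynthesis converts light energy into chemical energy", 9),
    ("plants use sunlight to make food", 6),
    ("plants are green", 2),
]
weights = train(training)
```

A scheme this crude rewards the right vocabulary regardless of argument, which is exactly the weakness the critics quoted below point to.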
But not everyone was convinced that a computer can pick up on the subtleties. Dr Bethan Marshall, a senior lecturer in English education at King's College London, said there was often a link between the vocabulary and syntax used in an essay and its overall quality, but that this was far from always true.
"You're not getting a reading of the argument," she said. "Also what if the syntax is original but the computer doesn't understand it? You could be brilliant and get nothing."
Simon Gibbons, a member of the National Association for the Teaching of English, feared the software would encourage teaching to the test. Rigid mark schemes produced mediocre students, he added.