Ditching the pen-and-paper test for on-screen assessment widened the gap between boys’ and girls’ maths scores by the equivalent of two months’ educational progress in two-thirds of the countries and states that took part in the Programme for International Student Assessment (Pisa), research suggests.
Overall, pupils from a third of the participating regions achieved significantly lower results when tested on the computer than they had in the paper assessment. In Taipei, for example, one in 20 achieved the highest grade in the on-screen assessment, compared with one in five in the paper test.
The computer tests also seemed to have a levelling effect – closing the gap in performance of those from the richest and poorest backgrounds.
The analysis was carried out by John Jerrim, from the UCL Institute of Education, who is the first academic to compare pupils’ scores in new, on-screen Pisa tests with those in previous, pen-and-paper versions.
Dr Jerrim analysed the maths scores of more than 126,000 15-year-olds in the 32 countries and city states that took part in a trial run of the new computerised Pisa tests.
Pisa, which is run by the Organisation for Economic Co-operation and Development (OECD), tests pupils’ ability in maths, science and reading. Pisa results have influenced government education policy in many countries, including the UK.
In 2012, the OECD trialled computer-based assessments for maths tests, using similar content to the pen-and-paper exams.
“It is unclear why the test produced these results,” Dr Jerrim said. “One would have expected that children from better-off families would have had more access to computers and be able to adapt more easily to on-screen testing.”
He added that organisers were likely to suggest a statistical adjustment to the 2015 results, so that they could be compared with pupils’ scores in previous years. “The Pisa organisers are aware of the challenges that switching to computer-based testing presents,” he said.
Andreas Schleicher, the OECD’s education director, said that Pisa’s 2012 computer pilot study was not designed to be compared to the paper assessment. He added that digital technology has changed how children use knowledge and skills, and that the new test has been designed to reflect that.
“It may be that performance patterns on those new metrics will be different,” he said.
“If people were very successful in riding a horse in the past, that may not necessarily help them to drive a car in the dense traffic of a modern city.”
Read Dr Jerrim's findings in full at johnjerrim.com/papers