Concerns over 'potential bias' in Scottish Pisa results

Academic calls for more analysis of Scottish Pisa data after finding rise in lower achievers withdrawn from tests

Emma Seith


An academic is calling for the results of the Pisa (Programme for International Student Assessment) education survey to be more robustly reported and interpreted by the Scottish government, after finding that the number of pupils withdrawn or deemed ineligible in Scotland is “much higher than in any other country”.

The proportion of Scottish students withdrawn from the 2018 Pisa test – which measures 15-year-old students' performance in maths, reading and science – more than doubled, from 4.1 per cent in 2015 to 9.3 per cent in 2018.

The average student withdrawal rate across all the countries that took part in Pisa 2018 was 1.7 per cent.

Scotland also had the second-lowest pupil response rate (80.5 per cent) of any participating country, ahead only of Portugal, and the final Pisa sample, as a proportion of the target pupil population, was much smaller in Scotland (63 per cent) and the UK as a whole (61 per cent) than the international median (85 per cent).
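As a rough illustration of how a final-sample figure like 63 per cent can arise, coverage compounds across stages: exclusions, pupil response and school participation each remove a slice of the target population. Only the 9.3 per cent exclusion and 80.5 per cent response figures come from the article; the school-participation rate below is a hypothetical figure for illustration.

```python
# Illustrative sketch of how final-sample coverage compounds across stages.
# The school-participation figure is an assumption, not from the article.
exclusion_rate = 0.093       # pupils withdrawn or deemed ineligible (Scotland, 2018)
pupil_response = 0.805       # pupil response rate reported for Scotland
school_participation = 0.86  # hypothetical school participation rate

coverage = (1 - exclusion_rate) * pupil_response * school_participation
print(f"{coverage:.0%}")  # roughly 63% of the target population
```

On these assumed inputs, three individually modest shortfalls multiply out to leave only about 63 per cent of the target population in the final sample.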

Fears over the reliability of Scotland's Pisa results

All of this taken together, argues Professor John Jerrim of the UCL (University College London) Social Research Institute – in a forthcoming paper for the academic journal Review of Education – could have biased the Pisa data by underrepresenting low-achieving pupils.

Professor Jerrim – who has also found that in England and Wales there is “clear evidence that some lower-achieving pupils have been systematically excluded” – is calling for the Scottish government and the Organisation for Economic Cooperation and Development (OECD), which runs the test, to carry out “a proper bias analysis” on the Scottish data.

If non-participants had actually taken the Pisa test, Scotland’s scores “could change quite dramatically”, says Professor Jerrim.

One “plausible scenario”, he says, is that Scotland’s Pisa 2018 result for reading would have been 13 points lower – going from 504 down to 491.

Reading was the focus of Pisa 2018 and the only one of the three areas assessed – the others being maths and science – in which Scotland achieved a score “statistically significantly above the OECD average”, but such a reduction would have seen the country drop to average.

Professor Jerrim's paper states: “The fact that one in five did not complete the test has clear potential to bias the Pisa results. It thus seems important that the magnitude of such potential bias is estimated and transparently reported. Unfortunately, to my knowledge, neither the Scottish government nor the OECD has investigated this issue, or published any such evidence in the public domain.”
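The potential size of that bias can be gauged with a back-of-envelope weighted average: the population mean is a participation-weighted mix of those who sat the test and those who did not. The 439-point figure for non-participants below is an assumption chosen for illustration, not a number from Professor Jerrim's paper.

```python
# Back-of-envelope sensitivity check (illustrative figures only).
# If a share of the target population is missing from the sample, the
# population mean is a participation-weighted average of the two groups.

def adjusted_mean(observed_mean, missing_mean, missing_share):
    """Population mean if non-participants had also taken the test."""
    return (1 - missing_share) * observed_mean + missing_share * missing_mean

# Scotland's observed Pisa 2018 reading score was 504; suppose the roughly
# one in five pupils who did not sit the test would have averaged 439
# (a hypothetical value).
print(round(adjusted_mean(504, 439, 0.20), 1))  # 491.0
```

On these assumed inputs, a 20 per cent missing group scoring around 439 would pull the headline 504 down to roughly 491 – the size of shift described in the “plausible scenario” above.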

Professor Jerrim attributes the high proportion of students deemed ineligible or withdrawn from Pisa 2018 to Scotland moving the testing window from between March and May to between October and December.

This had a significant impact on the Scottish Pisa sample, he says. In previous rounds of Pisa, the vast majority of students taking the test were in S4, but in 2018 half were in S4 and half were in S5.

The distribution of pupils participating in Pisa across different school year groups in Scotland:

Year group | 2006 | 2009 | 2012 | 2015 | 2018
S3         |   2% |   3% |   3% |   4% |   0%
S4         |  89% |  87% |  87% |  88% |  50%
S5         |   9% |  10% |  10% |   9% |  50%

According to Professor Jerrim, this change could have skewed Scotland’s results and rendered comparisons with previous rounds meaningless because more of the students taking the test had been in school longer. More importantly, however, moving the test to later in the year also meant that students who were earmarked for inclusion in the Scottish sample were deemed “ineligible” because they had left school.

These students, Professor Jerrim argues, were “probably lower-achievers” who left school following their National exams.

The paper states: “Importantly, those pupils who change schools between S4 and S5 are probably lower-achievers; school mobility has previously been linked with lower levels of achievement (Strand and Demie 2007), while young people who pursue vocational courses tend to have – on average – lower levels of academic achievement. In other words, the high levels of pupil 'ineligibility' for Scotland in Pisa 2018 may have led to Scotland removing some lower-achieving pupils from the sample.”

Professor Jerrim says that England, Wales and Northern Ireland have also – since 2006 – had special dispensation to carry out the Pisa test between October and December to avoid conflict with exams.

He says Scotland made the change for the same reason – but the government failed to be transparent about it and to take action to mitigate its impact on the Pisa results.

He told Tes Scotland: “In many ways, moving the test to October, November or December is preferential – it is not a bad decision – [but] the problem comes with the implications of that decision. The impact it could have on the results was not investigated. The decision was not such a big thing, but how it was handled was pretty atrocious.”

This is particularly concerning, says Professor Jerrim, because Pisa results are so influential and have consistently driven changes to schooling systems across the globe and in the UK. Pisa has also become the main resource for comparing education outcomes across the UK’s four devolved nations, as it is the only cross-national assessment taken on a regular basis.

Professor Jerrim concludes: “Pisa is meant to be a representative study of 15-year-olds across the UK. But there are serious flaws with some children being excluded from the study, schools being unwilling to participate, and some pupils not showing up for the test.

“In England and Wales, there is clear evidence that some lower-achieving pupils have been systematically excluded. While what has happened in Scotland is, frankly, a bit of a mess.”

An OECD spokesman said because Pisa participants had the right to not respond to the survey or request their data be deleted, it was “inevitable that Pisa will not have 100 per cent response rates”.

However, he said that the OECD assured the quality of the Pisa samples for the UK as a whole – and for Scotland – “based on adherence to Pisa 2018 technical standards”.

When it came to Professor Jerrim’s claims about the impact that wider participation could have on Scotland’s Pisa performance, the spokesman said: “This claim is based on conjectures, rather than real data; and on the somewhat arbitrary decision by Jerrim to extend Pisa inference beyond the defined target population, including to ineligible students.

“Pisa original results rest on the hypothesis that the performance of non-respondents is similar to that of the 'most similar' participants (through the weighting adjustments for non-participation). In contrast to Jerrim, Pisa indicators do not aim to cover the excluded or the ineligible portions of sampling frames.

“Jerrim’s alternative scenarios not only extend to these populations, too, but do not account for the fact that the 'original' estimate of the country distribution already accounts for non-respondents – potentially leading to adjusting twice for the same bias.

“While it is legitimate to assess the strength of the assumptions behind the official estimates and to hypothesise alternative scenarios, the OECD remains committed to using data, rather than conjectures, in selecting the most likely scenarios.”
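The “weighting adjustments for non-participation” the spokesman refers to are a standard survey technique: respondents' weights are inflated by the inverse of the response rate within cells of similar schools or pupils, so that participants stand in for similar non-participants. A minimal sketch with hypothetical cells and weights, not the actual Pisa weighting procedure:

```python
# Minimal sketch of non-response weight adjustment (standard survey practice;
# hypothetical numbers, not the actual Pisa weighting procedure).
# Within each adjustment cell, respondents' base weights are scaled up by the
# inverse of the cell's response rate, so respondents "stand in" for
# similar non-respondents.

def adjust_weights(cells):
    """cells: dict mapping cell name -> (base_weight, n_sampled, n_responded)."""
    adjusted = {}
    for cell, (weight, sampled, responded) in cells.items():
        response_rate = responded / sampled
        adjusted[cell] = weight / response_rate  # inflate by inverse response rate
    return adjusted

cells = {"urban_S4": (10.0, 100, 90), "rural_S5": (12.0, 50, 40)}
print(adjust_weights(cells))  # rural_S5 weight becomes 12.0 / 0.8 = 15.0
```

The adjustment only removes bias if non-respondents really do resemble the respondents in their cell – precisely the assumption Professor Jerrim questions for the pupils who had left school by the later testing window.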

A Scottish government spokesman said: "The Scottish Government met all of the OECD’s technical standards for the survey and the sample of participants from Scotland was deemed by the OECD to be consistent with a robust and representative sample.

"The change in testing period brought Scotland in line with the rest of the UK and was implemented in line with OECD procedures.

"If the OECD have concerns about quality or bias in data, participating countries can be asked to do further analysis. No such request was made in this case."

Emma Seith is a reporter for Tes Scotland