‘Investigate serious flaws in England’s Pisa data’

Exclusive: Letter from leading expert calls for statistics watchdog to probe claim that UK Pisa data ‘biases’ were not investigated
22nd April 2021, 12:01am


The UK Statistics Authority needs to conduct an independent review of the Pisa data for the UK, a leading academic has urged.

The warning comes after a research paper, published today, found “serious flaws” in the data from the Programme for International Student Assessment 2018, which tested a sample of 15-year-olds in reading, maths and science in a number of countries.

The paper, authored by UCL social statistics professor John Jerrim, found that low-achieving students were underrepresented in the England and Wales samples, while several anomalies, including a high number of ineligible students, were found in the Scottish data.


In a letter to the Office for Statistics Regulation, seen exclusively by Tes, Professor Jerrim warned that more should be done to interrogate potential bias in the data collected as part of Pisa, especially since the UK has “ideal data” to do this in the form of key stage 2 and GCSE grade records.

He said: “The most pressing issue is for the Office for Statistics Regulation to conduct an independent review of the UK’s Pisa data. Clear guidelines need to be put in place to ensure more transparent reporting in the future.

“Crucially, at an international level, the OECD [Organisation for Economic Cooperation and Development] needs to reconsider its technical standards, the strictness with which these are applied, and its data adjudication processes. The processes currently in place are nowhere near robust enough to support the OECD’s claims that Pisa provides truly representative and cross-nationally comparable data.”

He wrote in the letter: “Section Q1.5 of the UKSA Code of Practice for Statistics states how potential sources of bias should be identified, and the impacts reported.

“As I have noted in the…paper, the reporting of the Pisa 2018 data for the UK has in my view not adhered to this.”

Below is the full letter, published exclusively by Tes.


Re: A request for the UK Statistics Authority to undertake an independent review of the PISA data for England, Scotland, Northern Ireland and Wales.

Dear UK Statistics Authority,

I am writing to raise an issue with regard to the PISA 2018 data for England, Northern Ireland, Scotland and Wales, with particular concerns about aspects of how the data has been reported.

I have analysed the PISA data for more than a decade, authored the PISA 2015 reports for England, Wales and Northern Ireland, and have several peer-reviewed publications using this resource. This work covers both the methodological underpinnings of PISA and the use of the data to address substantive research questions.

Attached to this letter is a new paper I have written about the PISA 2018 data for the United Kingdom. In it, I go into detail about the investigations I have conducted into the PISA 2018 data for England, Northern Ireland, Wales and Scotland - and the various concerns that these have raised.

In the conclusions, I call for the UK Statistics Authority to conduct an independent review of the PISA data for the UK, with a particular focus upon the transparency and clarity of reporting. In particular, I recommend that the UK Statistics Authority consider developing some “best practice” guidelines for each of the four UK governments to follow in the reporting of future PISA rounds.

In reference to your Code of Practice for Statistics, I would particularly like to draw your attention to the following:

  • Q1.5 The nature of data sources, and how and why they were selected, should be explained. Potential bias, uncertainty and possible distortive effects in the source data should be identified and the extent of any impact on the statistics should be clearly reported.

Section Q1.5 of the UKSA Code of Practice for Statistics states how potential sources of bias should be identified, and the impacts reported. As I have noted in the attached paper, the reporting of the PISA 2018 data for the UK has, in my view, not adhered to this. In the case of England and Northern Ireland, a non-response bias analysis was conducted - but it was not reported anywhere and is not in the public domain. Instead, the sample was only described as “representative” and “positive”, without further detail provided. This is despite the actual evidence being, in my opinion, inconclusive at best. For instance, I am concerned that the fact that schools with lower levels of historic GCSE performance were less likely to participate in the study in England has not been clearly reported. What has been reported is - at best - only a partial reflection of what the (limited) evidence truly shows.

More generally, there has been no attempt to identify “the extent of any impact on the statistics” from issues surrounding non-response and non-participation. As I note - and demonstrate - in the attached paper, the UK has the ideal data to thoroughly interrogate the representativeness of the PISA data, through links to administrative records (e.g. Key Stage 2 test scores and GCSE grades in England). This analysis has, however, not been conducted, and is not currently done as a matter of course. Much more could - and should - be done to interrogate potential bias in the statistics reported.

  • Q1.6 The causes of limitations in data sources should be identified and addressed where possible. Statistics producers should be open about the extent to which limitations can be overcome and the impact on the statistics.

The attached paper notes what I consider to be important limitations of the PISA data for the UK. Some of these points are clearly reported by each of the UK governments and by the OECD (e.g. school and pupil response rates). However, other important information is not. This includes the extent of within-school exclusion of pupils from the sample and - in the case of Scotland - the unusually large proportion of pupils deemed “ineligible” for the study. Most importantly, there has been a failure to clearly document the cumulative impact that various forms of non-participation have upon the PISA sample. For instance, the fact that almost 40% of the target population are lost due to various forms of non-participation should (I believe) be noted, as well as the fact that this is one of the highest rates of any of the 80 participating countries.

  • Q1.7 The impact of changes in the circumstances and context of a data source on the statistics over time should be evaluated. Reasons for any lack of consistency and related implications for use should be clearly explained to users.

In PISA 2018, Scotland changed the date on which the PISA test was conducted from previous rounds. As I explain in the attached paper, it should not be assumed that this is an innocuous change; it could have materially impacted the results. Yet, as far as I am aware, no attempt has been made to evaluate the impact that this change has had upon the results. Equally, as detailed in the attached paper, I do not believe that this change - and its potential implications - has been “clearly explained to users”. Rather, I believe the Scottish government has selectively changed the text from previous iterations of its report to make the potential implications of the change less clear.

As you will see, in the attached paper I also raise a number of concerns in addition to the above. Most notably, some of the figures reported by the Scottish government with respect to the PISA sample do not match the figures reported by the OECD (see Appendix B). Together, I believe these issues have the potential to undermine public confidence in how official government statistics are produced and reported.

I have attempted to engage with the Department for Education in England, and with the Northern Irish, Welsh and Scottish governments, about these issues, as well as with the OECD. I sent them a copy of my paper on March 4th 2021, and offered to present my findings to them and to discuss the work. They, however, have not responded - and have failed to engage with me on any of the points that I have raised. Instead, the only communication I have had is with the National Foundation for Educational Research - who were the PISA 2018 contractors for the UK - but with no resolution of some of the key points I have raised. I do not believe, however, that it is appropriate for governments to believe they have contracted out responsibility for the issues I have raised.

If it were of interest, I would be happy to present my findings to you, to help clarify any of the issues that I have raised. Likewise, I would be very happy to discuss this with you further one-to-one or in a small group. Please do not hesitate to contact me at J.Jerrim@ucl.ac.uk or 07590 761 755 if you would like to discuss this further.

Best wishes,

Professor John Jerrim (Professor of Social Statistics, UCL)
