Why England’s Pirls results aren’t all that they seem

England’s climb up the international rankings for reading has been seen as a success story for phonics – but the results aren’t as straightforward as they seem, says Christian Bokhove
31st May 2023, 11:35am

When the results of the 2021 Progress in International Reading Literacy Study (Pirls) were reported in May, many were quick to link England’s high ranking with the use of systematic synthetic phonics in schools.

The study, which measures the comprehension skills of 9- to 10-year-olds every five years, included nearly 400,000 students in 43 countries worldwide. England came fourth - up from joint eighth in 2016.

It’s tempting to look at that rise and see a straightforward story of success. Yet, international rankings are rarely as simple as they seem.

For example, the Pirls 2021 data clearly reflects the challenges of the Covid-19 pandemic. Some countries, including England, delayed their data collection, which meant their students were tested in a different year. In addition, in terms of raw scores, only three countries managed to improve their results; the majority performed worse.

England’s scores stayed relatively stable and, given the context, we should be content with this. Of course, we all like growth, but endless growth is not realistic. 

What do Pirls scores really show?

It’s hard to point to a specific cause for England’s success, but what we do know is that these types of assessments are almost always correlational and say little about cause and effect. This is especially true if you want to pass judgement on whole education systems. 

Politicians will often only highlight how their own policies have worked, without giving any credit to the policies of their political opponents. 

In the context of Pirls 2021, we could point out that Jim Rose’s 2006 Independent review of the teaching of early reading, produced under the last Labour government, had a major influence on the use of phonics in schools. And if we are going to argue that phonics policies have improved Pirls scores, then we certainly shouldn’t forget that there was a larger increase in scores between 2006 and 2011 than there was between 2011 and 2016.

Another thing to bear in mind is that comparisons with other countries are really hard because countries’ education systems differ so much. For example, years ago, Finland was labelled the “place to be” for education, based on Programme for International Student Assessment rankings. Yet, the country’s results have recently been in decline, which some might say serves as a warning to countries like England that are currently doing well in the rankings.

That’s not to say countries can’t learn from one another. In many ways, it’s not the scores that are the most informative parts of these assessments, but the context questionnaires that come with them. Pirls 2021 includes plenty of interesting information: for example, the fact that England managed to narrow the performance gap between boys and girls - because boys did slightly better and girls did slightly worse.

It’s also interesting to see that the proportion of students who start primary school with literacy skills varies widely from one country to another, and that affective attributes differ a lot, too: for example, whether students like reading and how confident they feel about it.

Pirls also offers insights into the extent to which digital devices are being used by students to find and read information, and how much “home environment support” students receive with reading (an area where England, sadly, didn’t have any data). 

It’s not easy to unpick all these different layers of information. But that’s just another reason why assessments like these should always be a starting point for curiosity and debate, rather than proof of how effective one particular policy might be. 

Christian Bokhove is associate professor in mathematics education at the University of Southampton and a specialist in research methodologies 
