The Programme for International Student Assessment (Pisa) has become a permanent and critical feature of global education policy. I work with governments all over the world and one of the few things they have in common is a fascination with the countries that come top of the charts. Tallinn can expect a sharp increase in visits from ministerial delegations over the next few years.
But to use Pisa well for policymaking, it’s critical to look beyond the headline figures and analyse the vast mass of data that sits underneath. For instance, the dataset this time shows how incredibly important the nature of a country’s immigration is to its overall performance.
Let’s take two countries – Canada, which is the second-highest-performing country outside of the Far East in reading, and Germany, which is a fair way down the list. For years, policymakers have been making pilgrimages to Ontario and Alberta to find out what they’re doing that’s working so well. Germany is seen as merely average and uninteresting. Yet the difference in reading score comes down almost entirely to immigration. Second-generation immigrants in Germany do 42 points worse than non-immigrant German pupils. In Canada they do 10 points better.
Overall, mainland Western European countries have all seen their scores depressed by the nature of the immigration they’ve experienced over the past few decades. Meanwhile, countries like Canada and Australia, where immigration is largely from high-performing Asian countries, have seen their scores improve. There’s no doubt whatsoever that changing immigration policy is a quicker route to Pisa success than education reform.
How immigration affects Pisa performance
This is a hugely sensitive topic and I find myself uncomfortable even writing about it, but it’s absolutely critical to understand if we’re to avoid policymakers taking the wrong lessons from Pisa and assuming everything is about policy change rather than wider social and cultural shifts. It’s also important not to mistake more inclusive migration policies for education policy failure, which would create perverse incentives for governments.
That said, there are policy lessons hidden within Pisa if you look hard enough. I’ve always found the comparisons between the different nations of the United Kingdom valuable because they are, relatively, similar culturally and socially. Moreover, devolved governments in Scotland and Wales have very deliberately chosen to diverge from England’s education policy over the past few decades, so it makes for an interesting, if not conclusive, natural policy experiment.
The 2018 data shows a mixed picture. Wales has improved from a low base, though it is still the worst-performing nation in the UK. Scotland, having been the strongest UK performer in the earlier Pisa datasets, is now some way behind England in maths and science. We can’t say for sure why that is, but the introduction of the unfortunately named Curriculum for Excellence in Scotland does seem to have caused real issues for maths north of the border. (NB: I’ve seen a lot of commentators blame the SNP for Scotland’s decline, but it’s worth noting that all parties supported the Curriculum for Excellence. Sometimes political consensus on education isn’t such a great thing.)
Another lesson for politicians is that an awful lot of supposed policies don’t seem to make any difference at all. A good example is master’s degrees for teachers; something that was a major focus in England a decade ago. There is absolutely no correlation between a high number of teachers with master’s degrees and system performance. Likewise, the old Right-wing favourite of academic selection. It certainly makes systems more unequal but it doesn’t improve performance. In fact, most of the highest-performing systems are clustered towards the least selective end of the scale.
So what would I take from all this data if I were still a policymaker in England? First, it would make me wary of endorsing any dramatic reform programmes. While we can’t definitively say that improvements in England are due to policy changes, we can say that this data doesn’t suggest we have major problems. Moreover, most of the countries that have received the heaviest attention off the back of Pisa are going in the wrong direction (again, we can’t definitively say whether this is because of policy or other social factors). Finland has dropped in every Pisa since it came top in 2001, so I think we can give the trips to Helsinki a rest.
The only policy pattern that, very tentatively, seems to stand out to me in Pisa 2018 is around curriculum. Outside of small Far Eastern states like Singapore and Hong Kong, the only consistent high-performing improvers over the past few cycles are in Eastern Europe and, in particular, Estonia and Poland. Both countries undertook major curriculum reforms in the first decade of the century, which appear to be paying dividends. Generally speaking, countries that have gone for very competency-focused curricula, like Finland (and Scotland), seem to be going in the wrong direction. But I would want to do a lot more digging and analysis before I was confident that these relationships were causal.
The main lesson from 20 years of Pisa, after all, is to be cautious. It’s very easy to draw misleading lessons from such complex data and to find yourself a long way down a road going in the wrong direction.
Sam Freedman is CEO of the Education Partnerships Group, formerly executive director at Teach First and formerly a senior policy director under Michael Gove at the Department for Education. He tweets as @samfr