The 2015 Trends in International Mathematics and Science Study (Timss) gives us a detailed breakdown of pupil performance in maths and science across 57 countries and seven states or provinces. It measures performance in these subjects at the 4th and 8th grades of school (in England, Year 5 pupils, aged 9-10, and Year 9 pupils, aged 13-14).
Many commentators and professionals argue that international comparisons such as Timss and its big sister Pisa serve little purpose in assessing the performance of a single country because, after all, it is context that matters. Can we reliably compare a pupil in Knowsley to a pupil in Shanghai? Are test scores in a single year, in two subjects, enough to tell us how well government is serving its pupils and future workforce? And do these tests actually mean anything for long-term economic performance?
All of these questions about the purpose of international comparisons are valid. But, in my view, these assessments, and the rich contextual data collected alongside them, help researchers and policy-makers better understand how England's education system is working and whether we can learn from the triumphs or errors of other countries. Three immediate findings struck me as I digested England's lengthy Timss report over the past 24 hours.
Is there just a data problem?
First, Timss has now been running for 20 years. This gives us an assessment of pupil attainment that is largely immune from the plethora of changes in domestic assessment over the last couple of decades and three different governments.
Since 1995, the mean maths score of both Year 5 and Year 9 pupils has improved in absolute and relative terms, from significantly below the international mean to significantly above it. What is interesting, however, is that most of this improvement happened between 1995 and 2007. Since then both year groups have continued to improve, but at a much smaller rate that is not statistically significant. For example, the Year 5 mean maths score increased by 57 points between 1995 and 2007, but by only a further 5 points since then.
So what happened between 2007 and 2015 to cause this apparent stagnation in performance? A possible explanation is the effect of interventions made before 2007, including the focus on literacy and numeracy under David Blunkett's tenure as Education Secretary, followed by the primary strategy and new accountability measures such as floor standards and local authority level targets. Did these interventions create an initial boost to results which subsequently tapered off?
Or is this just a data problem? The DfE's national report for England, published yesterday, stated that in 1995 the Timss cohort comprised a mix of Year 4 and Year 5 pupils (as well as the older Year 9 cohort), and that only in subsequent years did the study focus on Year 5 and Year 9 alone. The inclusion of Year 4 pupils in 1995 could have created a lower baseline in that year than would have been expected had it included Year 5 pupils only.
Further study is needed to unpick what has really happened in England but, in the meantime, one thing this does demonstrate is that international benchmarking can help us to understand how domestic policy has influenced pupil attainment, outside of national testing regimes.
The next important point about international benchmarks is what they tell us about the gap between disadvantaged children and the rest in England compared to other countries. Unlike the overall test scores, the gap tells us about the achievement of disadvantaged pupils relative to their peers. If the government and the profession are serious about social mobility, then benchmarking the gap matters.
Timss uses the number of books pupils say they have at home as its disadvantage measure in Year 9. According to the analysis in the DfE's report, this corresponds reasonably well with the domestic Free School Meals measure used for the disadvantage gap in our national performance tables. Of the 34 countries with data on the level of home resources for Year 9 pupils, England has the 26th largest gap, measured as the difference between the average score of those with many resources and those with few. The gap in England is 122 points, compared with an international average of 109.
Now, we cannot argue here that domestic changes to curriculum or assessment affect our international ranking: all pupils in England have generally been subject to the same reforms. Instead, we should continue to ask why the fifth largest economy in the world ranks 26th out of 34 on a measure that essentially assesses social mobility.
Finally, international benchmarking can also help our understanding of what has or hasn’t worked in other countries. Governments of all colours, academics and think tanks often look to international evidence when considering how to address social policy challenges. Taken together with empirical research, these assessments can provide useful insights about how other countries are responding to interventions that may have taken place many years ago, but are only just becoming visible in pupils’ test scores.
Finland is, arguably, the most interesting of these case studies at present. Often cited as an educational success story, Finland has seen its ranking in both Pisa and now Timss decline in recent years, and in these results the fall in ranking for grade 4 science and maths is accompanied by absolute declines in scores too. The cause of this decline has yet to be definitively identified and has generated an interesting debate in the academic world. In a 2015 paper, Gabriel Heller Sahlgren argues that Finland's initial success may be more attributable to the traditional, instruction-led teaching methods of the 1970s, and that the move to student-centred learning is now contributing to the country's decline in the international tables. Others dispute this, arguing that it ignores many of the other practices Finland has adopted over the last few decades, as well as wider international evidence.
While we may never know the precise cause, Finland nonetheless provides researchers and academics with one of the most interesting international case studies.
Next week sees the launch of the Pisa 2015 results, which will tell us about the performance of 15-year-olds. The Education Policy Institute will host the global launch of the findings, with the OECD's Secretary-General, Angel Gurría, its Director for Education and Skills, Andreas Schleicher, and a panel of experts including Amanda Spielman, Russell Hobby and Brett Wigdortz giving their views and reactions. If you'd like to know more about the event, please email firstname.lastname@example.org.
Natalie Perera is executive director and head of research at the Education Policy Institute