We're good at something, but what is it? Pisa problems

2nd April 2014 at 17:46



On Tuesday, it was announced that England's teenagers were 11th in the world for something. 11th! Out of 44 countries! Normally we lag so far behind in international comparisons that one wonders why we turn up at all. There is good money to be made by anyone who cares to research the link between our Eurovision Song Contest ranking and our educational dominance. 


But you won't find this fact being paraded on a sedan chair anytime soon. In fact, I'd say the response has been positively muted. Everyone's claiming a tiny victory from it, of course, and everyone believes it proves exactly what they thought all along. But there are no knockout ideological points to be scored from it. Why? Because this competition was about problem solving.


From the TES news story:


"Pupils in England are 'significantly better' at problem solving than the average for the industrialised world, the latest results from an influential comparative education study show.


The country finished 11th among the 44 different international territories where 15-year-olds took new computer-based tests, as part of the last round of Pisa (Programme for International Student Assessment).


England’s pupils also did significantly better at the tests – designed to see how prepared they are for everyday problems like buying train tickets – than their counterparts in other countries who had performed at similar levels in Pisa’s maths, science and reading tests."


Suddenly, it's the Days of the Raj again, and we're all waving flags. But soft, what's that just above our heads?


'Singapore and South Korea were top in tests taken by 15-year-olds.'

That's burst our bubble. Singapore, Korea, Japan, Macao-China, Hong Kong, Shanghai, Taipei... all of those ROTTERS who already won the maths and literacy Olympics. Can't they leave us anything?

Looking at the story, there are several issues that people within education may have with the findings:

1. It doesn't suit the expectations of those who believe, like Ken Robinson and many others, that creativity, problem solving and lateral thinking are dying arts in the UK, as children shudder under the lash and tremble on the rack of a Victorian factory model of education.

Shift Happens has a lot to answer for, as does anyone who forecast the death of UK plc because we were, apparently, cramming our kids with drab liturgies of saints' names and Latin verb roots. Not a bit of it, it seems. According to the OECD, our kids are humming with cognitive fluidity; knot-unravelling ninjas, bristling with transferable thinking skills.

In other words, if creativity and problem solving are your educational jams – and they are, much like Hansel, so hot right now – then whatever we're doing, we're getting it right. More, I cry, more. It rather gives the lie to those who claim we're beating the creative juices out of our kids like galley slaves. Michael Rosen should be sending Michael Gove flowers.

2. Worse, if you want to see who has the lease on the podium places, you're looking at places like Hong Kong and Singapore, where rote learning, cramming and knowledge-heavy, E. D. Hirsch-friendly core curriculums are the lingua franca. I expect to see Montessori schools across the world chucking out their rule books and getting busy with rows, columns and chanting about gerunds.

3. And there's a bigger problem. What are we actually testing anyway? Problem solving is the new black in international edu-salons these days, but its importance has long been assumed, rarely evidenced. Put simply, the belief that children are lacking in creative powers and choked by knowing things has never been substantiated. "The 21st century demands such things," people cry with confidence, but why? What has made anyone think that creativity and ingenuity are at a low ebb? You might think so by listening to the pop charts, but beyond that, where's the beef?

And more substantively, there are important problems with the very definition of the test. What are we testing? Is there any evidence that it tests anything other than the ability to pass the test? Is there any evidence that acumen in this area translates into acumen in any other? Or have we simply invented an assessment criterion, benchmarked children against it, and then congratulated ourselves on the results?

I'm not sure that we've actually discovered anything meaningful here. In fact, I'm not even sure why the OECD, through Pisa, decided to assess this. Given the enormous impact that the other arms of Pisa have on domestic policy, it's easy to suspect an agenda designed to promote problem solving as a key area of the educational taxonomy.

The problem is, no one seems quite sure what the results mean. If you have an idea, let me know.