A sure-fire plan for test success, but what about the quality?
International comparisons in education have been ticking away in the background since 1964, and have yielded a lot of interesting information. They showed, for example, that gender differences in subject choices and performance were similar across countries, and not uniquely British.
But under Tony Blair, education league tables became a matter of national prestige. The UK did rather well in the new Organisation for Economic Co-operation and Development (OECD) Pisa tests in 2000, and ministers such as Charles Clarke and David Miliband bobbed up to claim the credit. Michael Barber, as head of the Standards and Effectiveness Unit, boasted to the US that we knew how to improve education systems - and fast.
All this unjustified pride, however, was setting us up for a fall. Sure enough, in 2003 the scores plummeted. The Government cunningly got round this by allowing us to be excluded on the grounds of a poor response rate, even though it was about the same as in 2000. But their manoeuvring only makes the 2006 figures look worse. We come out, with France and Germany, as average. But as Ofsted says, satisfactory is not good enough.
So what is to be done? We could follow football and put a foreigner in charge. Perhaps someone from Finland - the country that always seems to come top - could do a Sven for us. But he only got us to the last eight and we want to be winners.
There is a sure-fire way to the top. First we need to apply internationally the methods that have pushed national test and exam scores up year by year. We should:
- Teach to the test with regular practice sessions in the preceding year;
- Motivate by closing schools and sacking teachers if their pupils do not do well;
- Introduce national strategies to teach reading literacy, science literacy and maths literacy - which are different from reading, science and maths (in Pisa 2000, pupils in Ireland not studying science did better in science literacy than those who had studied it).
Then we could learn from our competitor nations by:
- Playing to the limit of the rules by withdrawing pupils who struggle (as top-scoring Russia is alleged to have done in the recent Pirls); and by ensuring that, in primary tests, only older children are entered;
- Packing the test-setting teams with Brits, since countries that the testers like - the Dutch, Australians and Canadians - do well.
There is no doubt that ruthlessly following this five-point plan could take us to the top - unless other countries resorted to these ploys too. But what would it do for education?
Education tests are not thermometers or rulers. In football, scores do say something about quality since teams go head to head. But in education, when tests are used to judge schools and teachers, the scores take on a life of their own. The relationship becomes rather like that of astrology to astronomy: elaborately contrived numbers detached from the underlying phenomenon.
We should treat Pirls and Pisa seriously and learn what we can from them. There is, however, a logical flaw in Pisa: it deliberately eschews the curriculum, yet seeks to interpret the results as school effects. In fact, factors outside education, such as immigration, affect the figures too. The blip in 2000 may have cast the UK's 2006 results in a particularly bad light, but we do need to keep getting better. This, however, has to be in terms of improving the quality of the education system, not striving to push up scores at all costs.
Alan Smithers, Professor and director of the Centre for Education and Employment Research, University of Buckingham.