Findings lost amid political jockeying
The influence of international comparisons of pupil performance has grown steadily since the First International Mathematics Study was carried out in 1964. By the late 1980s there were continual attempts to use international comparisons to justify a return to "basics", in terms of both curriculum content and teaching methods. And this trend has continued following the publication of the Third International Mathematics and Science Study (TIMSS), which tested children aged 9 and 13 and young people in the final year of secondary school.
While I do not pretend that the numeracy of our pupils is satisfactory, some recent reporting of international comparisons does not even attempt to represent the facts accurately. But then there was never any doubt that TIMSS, which reported its results in the form of an international league table, was a thoroughly political activity.
The results for the 13-year-old sample were known by May 1996, but the testing agencies and government officials in each country were required to maintain confidentiality until November 20. It was rumoured that the delay was to ensure that the poor results of the United States did not affect the outcome of the American presidential elections.
While this may be untrue, the link between international comparisons and British politics is clear. The only country to break the agreed embargo on releasing TIMSS results was the UK, and for political reasons. On July 3, 1996, by order of British education ministers, the comparatively low ranking of England in mathematics was leaked to The Times and reported under a front-page headline, "English pupils plummet in world maths league".
Ministers risked incurring the wrath of the international community because they feared teachers' reaction to the introduction of both mental arithmetic and calculator-free papers into national mathematics tests, and wanted to provide a legitimate reason for this action.
Both the outgoing Tory government and the incoming Labour government have also used the results of the 45-country TIMSS research to justify their emphasis on raising levels of numeracy, especially in primary schools. Little attention has, however, been paid to TIMSS data indicating the low level of resources available to English teachers: larger classes, less preparation time, fewer textbooks.
But in any case "plummeting" headlines were hardly justified. The intention in TIMSS was to compare the mathematical standards of 13-year-olds by testing the two year groups in which most 13-year-olds were expected to be taught. In the event, the results for 13-year-olds were hopelessly bunched together; 24 countries had median scores between 47 per cent and 52 per cent, including almost all of Europe, the United States, Canada, Australia and New Zealand. England was only slightly below the middle of this group. Only four Pacific Rim countries scored higher than 52 per cent, and it was mainly developing countries which scored lower than 47 per cent.
The published league table relates not to 13-year-olds but to performances of pupils in the whole-year groups equivalent to English Years 8 and 9. Here the differences between countries appear larger because of sampling disparities. For example, nine countries excluded 20 per cent or more of the lowest-attaining 13-year-olds, and 10 to 20 per cent of the lowest-attaining 13-year-olds were excluded in a further 10 countries. The comparable figure for England was 1 per cent.
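The arithmetic behind this sampling effect is easy to demonstrate with made-up numbers (these are illustrative figures, not TIMSS data): if a country's sample simply omits its lowest-attaining pupils, its median score rises without any pupil performing better.

```python
# Illustrative sketch (hypothetical scores, not TIMSS data): dropping the
# bottom 20 per cent of a sample mechanically raises the median score.
import random

random.seed(1)
# Hypothetical test scores for 1,000 pupils, roughly centred on 50 per cent.
scores = sorted(random.gauss(50, 15) for _ in range(1000))

def median(xs):
    """Median of a sorted list."""
    n = len(xs)
    mid = n // 2
    return xs[mid] if n % 2 else (xs[mid - 1] + xs[mid]) / 2

full_median = median(scores)
# Exclude the lowest-attaining 20 per cent, as some countries' samples
# effectively did.
truncated_median = median(scores[len(scores) // 5:])

print(round(full_median, 1), round(truncated_median, 1))
```

The truncated sample's median is several percentage points higher than the full sample's, which is why comparing England's near-complete sample with heavily truncated ones flatters the latter.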
Similarly, in the TIMSS report the reader has to work very hard to find the note which explains that only 32 per cent of Thai 13-year-olds attend school. This might go some way to explaining why Thai results appear similar to England's.
Close scrutiny of the findings actually shows that England's mathematics performance is similar to that of most European and Anglophone countries, with neither a particularly wide range of ability nor an especially long tail of under-achievers.
England's poor performance in number, in TIMSS and elsewhere, is nevertheless treated as a matter of great national concern.
However, much depends on the nature of the items in the tests, in particular on how heavily they weight aspects on which English teachers make no claim to concentrate. For example, adding complex fractions was specifically de-emphasised in the national curriculum and postponed until older age groups, as it was considered to have little practical relevance for most pupils, although of some importance for later mathematical and scientific work.
Whereas the TIMSS mathematics tests have been shown to match our curriculum poorly, the science tests matched England's curriculum much more closely than that of most comparable countries, especially at 13. This seems likely to be a major factor in explaining our superior science performance.
England also did well in the TIMSS practical problem-solving tasks and in geometry - but these facts went largely unreported as the media tend to ignore results which show national performance in a positive light.
Both the Tory and Labour governments have said little about these successes either, and the reasons are not hard to find. Both had decided on a "back to basics" line emphasising higher standards in arithmetic and the need for more whole-class teaching.
Any press release which pointed out the excellence of our performance in the application of mathematics to practical problems would risk questioning the need for the pressure on primary teachers to move to more traditional methods.
The TIMSS findings do suggest that we should perhaps shift the curriculum content to enhance our performance in number. But the good results in geometry and in science cast doubt on the need for major reform of teaching methods.
Teachers and the general public need to be highly suspicious of those who use international data selectively to give unequivocal messages about how to improve teaching.
Margaret Brown is professor of maths at King's College, London, and president of the British Educational Research Association.