Take care when you compare

14th June 1996, 1:00am

By all means learn from good practice abroad in maths and science, says Wendy Keys. But we should not leap to conclusions about how poorly our children perform in these subjects before a giant international survey is published.

David Reynolds (TES, June 7, 1996) argues that we should look beyond the white cliffs of Dover to learn from other countries about effective teaching and learning practices. I agree - but I would also sound a note of caution.

It would be extremely unwise for those concerned with education to act on mere speculation. Assessments of the efficacy of this or that approach must be based on solid evidence, drawn from recent research, using fully representative samples of schools and pupils in every country. Before seeking reasons for the apparently inferior performance of English pupils, we must first be certain that such differences really exist and that like has been compared with like.

Much has been made of such studies as the Kassel project (TES, March 15) and the International School Effectiveness Research Programme (ISERP) (TES, May 3), which apparently demonstrate inferior performance in maths by English pupils.

But neither of these studies was designed to compare the maths performance of students in different countries. In neither were the samples of schools randomly selected or large enough to draw valid conclusions. As the authors of a recent report on ISERP state: “It is important to note that our study was not set up to look principally at the issue of which countries appear to achieve high levels of academic and social development of their students.” The study can offer no more than “potentially interesting speculations as to effective countries.”

Although both of these studies have reported interesting findings on effective teaching approaches, it is simply not possible to draw from them firm conclusions about the relative maths performance of students in different countries.

There are three conditions for valid international comparisons of student achievement:
* samples in each country must be fully representative
* tests must be as fair as possible to all countries
* administrative procedures must be similar in all countries.

One survey that fulfils all three is the Third International Maths and Science Study (TIMSS), which will publish its first reports in November 1996. With more than 500,000 students tested, TIMSS is the largest and most ambitious study of comparative educational achievement ever undertaken. In 1994-95, tests of maths and science were administered to carefully selected samples of students in five different year groups.

Each student taking part in TIMSS completed tests of maths and science at the same time. The students, their teachers and headteachers completed questionnaires about their backgrounds, attitudes and classroom experiences.

The first national and international reports on TIMSS will compare the performance in maths and science of 12-, 13- and 14-year-old students (Years 8 and 9 in England) in nearly 6,000 schools in 45 countries. The national report is being prepared by the National Foundation for Educational Research. Reports on other age groups are planned for 1997.

The TIMSS results will provide solid evidence regarding English performance in maths relative to other countries - and put an end to the recent wave of speculation and unsupported assertion. Indeed, it is the only study in recent years to produce findings based on adequate, nationally representative samples.

In order to avoid bias, samples of schools in each country must be representative of the schools in that country in terms of type, size, achievement level (in England, TIMSS used GCSE results) and geographical distribution.

The sampling for TIMSS followed rigorous random procedures. Each country’s sampling plan and all details of samples had to be documented and agreed by an independent sampling referee. Samples that do not meet TIMSS’ rigorous standards will be excluded from the main tables of the international report.

In addition, samples must be of an acceptable size: most of the countries taking part in TIMSS, including England, tested students in more than 100 schools.
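To illustrate the principle behind this kind of sampling, the sketch below draws schools at random within strata defined by type, size band, GCSE band and region. The school records and field names are hypothetical, and the real TIMSS procedures were far more elaborate (documented sampling plans, an independent referee, agreed sample sizes); this is only a simplified picture of stratified random selection.

```python
# Minimal sketch of stratified random selection of schools (hypothetical data;
# not the actual TIMSS sampling design).
import random
from collections import defaultdict

def stratified_school_sample(schools, n_per_stratum, seed=1996):
    """Draw up to n_per_stratum schools at random from each stratum."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for school in schools:
        # Strata reflect the characteristics named above: type, size,
        # achievement level (GCSE band in England) and region.
        key = (school["type"], school["size_band"], school["gcse_band"], school["region"])
        strata[key].append(school)
    sample = []
    for members in strata.values():
        sample.extend(rng.sample(members, min(n_per_stratum, len(members))))
    return sample

# Example with two fictitious schools, one drawn from each stratum:
schools = [
    {"name": "School A", "type": "comprehensive", "size_band": "large",
     "gcse_band": "middle", "region": "North"},
    {"name": "School B", "type": "grammar", "size_band": "small",
     "gcse_band": "high", "region": "South"},
]
print([s["name"] for s in stratified_school_sample(schools, n_per_stratum=1)])
```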

To ensure that tests in international comparative studies are fair, their mathematical and scientific content should be equally relevant to all countries taking part. Although this is impossible to achieve completely, TIMSS set up rigorous procedures to ensure maximum comparability.

The TIMSS tests were developed after careful analysis of the science and maths curriculums in participating countries. Items were tried out during two pilot studies before the final selection was made. (For lower-secondary students, 286 items were selected: 151 in maths and 135 in science.) The items were then grouped into eight different tests, each mixing science and maths, and each student completed one test only.
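To picture that grouping, the sketch below deals a pool of maths and science items into eight mixed booklets in round-robin fashion, so that every booklet contains both subjects and each student sits only one. The item labels are hypothetical, and the actual TIMSS rotation design was more sophisticated than this simple scheme.

```python
# Minimal sketch of distributing items across eight mixed test booklets
# (hypothetical item labels; not the actual TIMSS rotation design).
from itertools import cycle

def build_booklets(maths_items, science_items, n_booklets=8):
    """Assign items round-robin so every booklet mixes maths and science."""
    booklets = [[] for _ in range(n_booklets)]
    target = cycle(range(n_booklets))
    for item in list(maths_items) + list(science_items):
        booklets[next(target)].append(item)
    return booklets

maths = [f"M{i:03d}" for i in range(1, 152)]    # 151 maths items
science = [f"S{i:03d}" for i in range(1, 136)]  # 135 science items
print([len(b) for b in build_booklets(maths, science)])  # 35 or 36 items each
```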

All national changes to test items had to be agreed with the International Study Centre (ISC). For non-English speaking countries, all translations had to be agreed by the ISC to avoid changes of emphasis or level of difficulty. In Britain, the only changes made were to American spellings and units of measurement.

The maths tests included fractions and number sense; geometry; algebra; data representation, analysis and probability; measurement; and proportionality. The science tests covered earth sciences, physics, chemistry and environmental science.

TIMSS used the same test items to assess pupils’ achievement in maths and science in all participating countries. But, because curriculums vary from one country to another, each country set up panels of experts who identified the items tested that were most important to their own curriculums for the year groups tested. The TIMSS reports will include tables comparing each country’s results with other countries on these self-selected items.
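The sketch below shows the idea behind those comparison tables: every country's results are averaged over the subset of items that one country's curriculum panel has flagged as most relevant. The figures and item labels are invented purely for illustration, not taken from TIMSS.

```python
# Minimal sketch of comparing countries on one country's self-selected items
# (hypothetical figures; not TIMSS data).
def mean_score_on_items(item_scores, selected_items):
    """Average percentage correct, restricted to the selected item set."""
    scores = [item_scores[item] for item in selected_items if item in item_scores]
    return sum(scores) / len(scores) if scores else None

# Hypothetical per-item percentage-correct figures for two countries:
results = {
    "England": {"M001": 55.0, "M002": 62.0, "S001": 48.0},
    "France":  {"M001": 60.0, "M002": 58.0, "S001": 51.0},
}
# Items a national curriculum panel judged most relevant to the years tested:
selection = {"M001", "S001"}

for country, item_scores in results.items():
    print(country, mean_score_on_items(item_scores, selection))
```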

In addition, to ensure that the survey was administered under the same conditions in all participating countries, the administration manuals, which included scripts for the teachers administering the tests and details of time limits, followed the same general pattern in all countries. As with translating the tests, each country’s manuals had to be agreed with the ISC.

Finally, the ISC set up a quality control programme to monitor the conduct of the study in each country. This involved visits to a sub-sample of schools by independent scrutineers.

Many western countries are seriously concerned that they are lagging behind other countries, especially the Pacific Rim "tiger" economies, in terms of technological progress and world trade. These concerns, coupled with the belief that economic success may be linked with the effectiveness of education systems (especially in maths and science), have led governments to question the effectiveness of their own systems.

The first results from TIMSS will enable countries to compare their own performance in maths and science with that of their major economic competitors. It is significant that the TIMSS cross-national comparisons will be included in the Organisation for Economic Co-operation and Development publication Education at a Glance: OECD Indicators, to be published in December 1996.

The results of international studies are frequently used to stimulate debate on the relative effectiveness of educational systems with a view to bringing about reform. In order for them to be used for this purpose it must first be established that the results are valid. The data emanating from TIMSS will be the best evidence available on the relative performance of our pupils.

In the meantime, it is dangerous - and premature - to claim that this or that approach that appears effective in one cultural setting will necessarily be effective in another.

Dr Wendy Keys is a principal officer at the NFER and TIMSS national research co-ordinator for England.

Countries taking part in TIMSS

Western Europe: Austria, Belgium (Flemish), Belgium (French), Cyprus, Denmark, England, France, Germany, Greece, Iceland, Ireland, Italy, Netherlands, Norway, Portugal, Scotland, Spain, Sweden, Switzerland

Eastern Europe: Bulgaria, Czech Republic, Hungary, Latvia, Lithuania, Romania, Russia, Slovak Republic, Slovenia

Asia and Pacific Region: Australia, Hong Kong, Indonesia, Japan, Korea, New Zealand, Philippines, Singapore, Thailand

North, South and Central America: Argentina, Canada, Colombia, Mexico, USA

Middle East and Africa: Israel, Iran, Kuwait, South Africa
