Figures that don't add up

12th May 2000
JUST to prove I read The TES Scotland carefully, let me draw together three items in the edition of April 28. The first is the Jotter piece reporting that there are 16 statisticians working for the Scottish Executive Education Department, twice as many as for any other department.

The second is the worrying observation from John Elvidge, the head of education within the Scottish Executive, that "simply because we can't measure everything is no excuse for not measuring what we can" - I'm about 156 centimetres tall, by the way - and the third is Brian Boyd's excellent article on the reliability of research.

Brian Boyd is quite correct to point out that research carried out by bodies like the Scottish Council for Research in Education and the former Centre for Educational Sociology was careful and rigorous. For example, when Andrew McPherson was looking into the effectiveness of comprehensive education, he waited until all the youngsters who had started in the selective system had left school, so that he could be sure he was looking at the actual effects of the comprehensive system, and not at the process of change or at a comprehensive system moderated by any hangovers from selection.

It was on the basis of such careful research that he concluded that a comprehensive system served all pupils well.

However, what has passed for "research" more recently has been a quick look at official statistics, followed by a knee-jerk reaction without any due consideration as to what the figures really mean. A classic example here has been the way the whole scare over modern language teaching was based on an utterly false comparison of the numbers taking a modern language Higher in 1996 and in 1976.

The figures used for the comparison were the percentage of those actually in the fifth form at the two dates, rather than the percentage of the whole year group, and no account was taken of changes in the percentage staying on to fifth year over the 20-year period. The statistics may have been correct; their use was invalid.
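To see how misleading this can be, take some purely illustrative figures (not the actual 1976 or 1996 numbers). Suppose that in 1976 half the year group stayed on into fifth year and 20 per cent of those fifth-years took a language Higher: that is 10 per cent of the whole year group. Suppose that by 1996 staying-on had risen to 80 per cent and only 15 per cent of fifth-years took the Higher: that is 12 per cent of the year group. The fifth-form percentage has fallen from 20 to 15, yet a larger share of all youngsters is actually taking the subject - the comparison points in exactly the wrong direction.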

So the answer back to John Elvidge is that it's all very well measuring everything that you can measure, but it's useless unless you stop to work out what the figures actually mean.

Finally, I'd like to ask Mr Elvidge a question. Does the fact that education has twice as many statisticians as any other department mean it's twice as efficient, or twice as inefficient?

Judith Gillespie

Findhorn Place, Edinburgh
