Figures that don't add up

12th May 2000
JUST to prove I read The TES Scotland carefully, let me draw together three items in the edition of April 28. The first is the Jotter piece reporting that there are 16 statisticians working for the Scottish Executive Education Department, twice as many as for any other department.

The second is the worrying observation from John Elvidge, the head of education within the Scottish Executive, that "simply because we can't measure everything is no excuse for not measuring what we can" - I'm about 156 centimetres tall, by the way - and the third is Brian Boyd's excellent article on the reliability of research.

Brian Boyd is quite correct to point out that research carried out by bodies like the Scottish Council for Research in Education and the former Centre for Educational Sociology was careful and rigorous. For example, when Andrew McPherson was looking into the effectiveness of comprehensive education, he waited until all the youngsters who had started in the selective system had left school, so that he could be sure he was looking at the actual effects of the comprehensive system, and not at the process of change or at a comprehensive system moderated by any hangovers from selection.

It was on the basis of such careful research that he concluded that a comprehensive system served all pupils well.

However, what has passed for "research" more recently has been a quick look at official statistics, followed by a knee-jerk reaction without any due consideration as to what the figures really mean. A classic example here has been the way the whole scare over modern language teaching was based on an utterly false comparison of the numbers taking a modern language Higher in 1996 and in 1976.

The figures used for the comparison were the percentage of those actually in the fifth form at the two dates, rather than the percentage of the whole year group, and no account was taken of changes in the percentage staying on to fifth year over the 20-year period. The statistics may have been correct; their use was invalid.
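To see why that matters, take some purely illustrative figures (not the actual 1976 or 1996 numbers). Suppose that in 1976 only 30 per cent of the year group stayed on to fifth year and 40 per cent of those took a language Higher: that is 0.30 x 0.40 = 12 per cent of the whole year group. Suppose that by 1996, 70 per cent stayed on and 25 per cent of those took the Higher: that is 0.70 x 0.25 = 17.5 per cent of the whole year group. Measured against fifth-year pupils alone, the subject appears to collapse from 40 per cent to 25 per cent; measured against the whole year group, uptake has actually risen.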

So the answer back to John Elvidge is that it's all very well measuring everything that you can measure, but it's useless unless you stop to work out what the figures actually mean.

Finally, I'd like to ask Mr Elvidge a question. Does the fact that education has twice as many statisticians as any other department mean it's twice as efficient, or twice as inefficient?

Judith Gillespie

Findhorn Place, Edinburgh

Letters to the Editor

