# Statistics that tell the story

10th October 1997 at 01:00
Ian Senior has devised a measurement of exam success that enables him to see exactly how his school is performing

At Kings Langley School, and probably elsewhere, governors have been provided each year with a vast array of detailed statistics showing exam performance using various standard types of measurement.

What is enormously difficult for busy governors, given this abundance of information, is to see the wood for the trees. In particular, two crucial questions need to be answered:

- How has our school done this year overall compared with previous years?

- And how has it done compared with other schools in Hertfordshire or nationally?

I was keen to answer these questions in a quantitative way. I have therefore tackled the problem by creating a system that encapsulates the answers to both questions in a single, unambiguous index number. The concept is relatively easy to understand.

A school's exam results are compared with those for Hertfordshire, or for England as a whole if the data are available. I take a given measure such as the percentage of students obtaining five or more A* to C grades at GCSE, and divide the figure for the school by the figure for Herts. If the school gets just the same percentage as Herts, it scores a ratio of one which is multiplied by 100 to become an index number of 100. If the school does better than Herts, the ratio is above one and the index is above 100; and vice versa.
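The calculation described above can be sketched in a few lines of Python. The percentages used here are hypothetical, chosen only to illustrate the arithmetic:

```python
def index_number(school_pct: float, county_pct: float) -> float:
    """Divide the school's figure by the county's and scale so that
    parity with the county gives an index of exactly 100."""
    return school_pct / county_pct * 100

# Hypothetical figures: 48% of a school's pupils gain five or more
# A* to C grades at GCSE, against a county figure of 40%.
print(index_number(48.0, 40.0))  # -> 120.0 (school 20% above county)
print(index_number(40.0, 40.0))  # -> 100.0 (school matches county)
print(index_number(32.0, 40.0))  # -> 80.0  (school 20% below county)
```

The same function serves for any published measure, since each is just a school figure set against the corresponding Hertfordshire (or national) figure.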

From the published standard performance measures, an index number is obtained for each of the three types of exam. These numbers are shown in the blocked columns in the table above. The index numbers themselves are then averaged to produce a composite index number, which appears at the foot of each column.

The results for a fictitious school, St Nowhere, are shown in the table. St Nowhere's performance is seen to be uneven across the three forms of exam but is particularly strong at A and AS-level. The last column shows St Nowhere's average over the most recent three years and smooths out year-to-year fluctuations. St Nowhere's composite index over three years is 105. This means that overall it lies 5 per cent above the Herts average, but the trend is downwards.
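The two stages of averaging can be sketched as follows. The per-exam index numbers and yearly composites below are purely illustrative values, not St Nowhere's actual figures, though they reproduce the kind of downward trend the table reveals:

```python
from statistics import mean

# Hypothetical per-exam index numbers for one year
# (GCSE, AS-level, A-level).
year_indices = [96.0, 108.0, 114.0]

# The composite index is the plain (unweighted) average
# of the per-exam index numbers.
composite = mean(year_indices)
print(composite)  # -> 106.0

# Averaging the yearly composites over the three most recent
# years smooths out fluctuations; here the trend is downwards.
yearly_composites = [110.0, 106.0, 99.0]  # hypothetical
print(round(mean(yearly_composites)))  # -> 105
```

A school exactly matching the county on every measure in every year would score 100 at both stages, so any drift away from 100 is immediately visible.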

If parents are permitted this information, they may hesitate to enter their children for the school. St Nowhere is clearly going downhill or not improving at the same rate as other schools in the county. This would have been difficult to spot without the index.

The present table weights the three forms of exam equally. This gives equal weight to exams with small numbers of candidates such as AAS and to those with large numbers, such as GCSE. It would, of course, be possible to weight the index numbers according to other criteria, for example the number of students entered or the number of subjects sat, or both. This would make the table less easy for a lay person to use. Further, these additional data are not included in the source documents from which the table is derived.
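A weighted version of the composite is straightforward to compute if the extra data were collected. The candidate counts below are hypothetical, serving only to show how a large-entry exam such as GCSE would come to dominate the composite:

```python
def weighted_composite(indices, weights):
    """Composite index weighted by, e.g., the number of
    candidates entered for each form of exam."""
    return sum(i * w for i, w in zip(indices, weights)) / sum(weights)

# Hypothetical per-exam indices (GCSE, AAS, A-level) and
# hypothetical candidate counts: GCSE entries dominate,
# AAS entries are small.
indices = [96.0, 108.0, 114.0]
entries = [180, 25, 60]
print(round(weighted_composite(indices, entries), 1))  # -> 101.2
```

Against an unweighted average of 106, the weighted figure of 101.2 sits much closer to the GCSE index of 96, because GCSE carries most of the entries. This illustrates why the choice of weighting matters, and also why the simpler equal-weight table is easier for a lay reader to interpret.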

I am aware that my composite index measures only exam results and not value added, but the strength of the concept is that when value-added data become available the composite index can easily be modified to incorporate them.

We shall then have a tool that shows where the best teaching is happening as well as the best exam results.

My table may prove to be a useful tool for governors and senior management teams who genuinely want to know how their school is performing. At this stage, the table and composite index are offered for public debate. In due course, if their validity is confirmed, the composite index could become a tool for informing parents also.
