The basis of calculations used by HMI's audit unit would be frowned upon by the most basic of textbooks, says Eric Gillies
I was much encouraged by Bob Sparkes's article (TESS, September 24). I, too, have had serious reservations about the use of Standard grade point average, or GPA. The proposition that GPA is a "measure of general ability" needs critical examination. First, it does not take into account the number (from seven to nine in most schools) or quality of Standard grades a pupil has attained. As to quality, in the eighties Alison Kelly analysed Standard grade results and revealed wide differences in difficulty among subjects - the "Kelly factors".
The same exercise is carried out annually by the Inspectorate's audit unit and the wide variation continues. Thus two pupils could have the same GPA - one based on a high proportion of "easy" subjects, the other on a different number with a high proportion of "hard" subjects. On what basis could we say the prior attainment of both pupils was similar?
This concealed variation has been exacerbated by an increasing presentation in social and vocational studies, which is, let's face it, a very easy Standard grade. Some schools have opted for SVS presentation as much for its being a quick and perfectly legitimate "leg up" to Scottish Executive Education Department targets as for its contribution to pupils' education.
A further blow to the utility of GPA is dealt by the annual application of concordance procedures, which raise around 10 per cent of element results at least a grade above what was attained in the exam scripts. This boost in attainment has nothing to do with the pupils - it is a reward to subject departments whose estimate grades are reasonably accurate.
More telling is the wide variation among S grade awards which are apparently the same. Each subject comprises two or more elements which are put together to calculate a pupil's final award. For example, an English Standard grade 2 can, in theory, be achieved by any one of 31 different combinations of grades (element profiles) in the three elements - reading, writing and talk (although only a dozen or so of these element profiles occur in any quantity).
The Scottish Qualifications Authority analysed pass rates in 1997 Highers for pupils with different element profiles in the same subject at Standard grade and found wide variation. In Higher English, for example, pass rates for pupils with English S grade 2 varied from 39 per cent to 100 per cent. Other subjects fare similarly.
Another dictum is that "GPA is a good predictor of performance at Higher grade" - the specific concern of Mr Sparkes. I have tried in vain to wrest from the audit unit an empirical justification. My last attempt, a formal written request, remains unanswered. Each year the audit unit maps pupil performance in each Higher grade subject against the same pupils' GPA figures. The results are a set of national scatter diagrams. While we wait for the audit unit's national data to cook, here are two simplified examples I made up earlier.
On each diagram the audit unit draws the "line of best fit", or regression line. This is the line statistically nearest to all the dots on the graph. The equations of these lines are then published as evaluators of performance for that year's Highers, and predictors for next year's (the equations do not change much from year to year). For example, the regression equation for Higher maths this year is y = 3.24x + 2.43. Looks surgically precise, doesn't it?
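For the curious, there is nothing mysterious about a "line of best fit". A minimal sketch in Python, using invented (GPA, Higher band) pairs rather than the audit unit's data, shows the whole calculation:

```python
# Hypothetical (GPA, Higher band) pairs - invented for illustration only
points = [(1.0, 4.5), (2.0, 7.0), (3.0, 9.5), (4.0, 12.0), (5.0, 14.5)]

n = len(points)
mean_x = sum(x for x, _ in points) / n
mean_y = sum(y for _, y in points) / n

# Least-squares fit: the line minimising squared vertical distances to the dots
slope = sum((x - mean_x) * (y - mean_y) for x, y in points) / \
        sum((x - mean_x) ** 2 for x, _ in points)
intercept = mean_y - slope * mean_x

print(f"y = {slope:.2f}x + {intercept:.2f}")  # prints: y = 2.50x + 2.00
```

Two means, two sums and a division - that is all the "surgical precision" amounts to.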
Yet any elementary textbook on statistics will counsel against uncritical use of averages when comparing sets of data, and will call for examination of the spread of measures around the average as an essential part of statistical analysis. The audit unit, however, tells me that "this information is not issued to schools".
Look at graph 1. The pattern of dots is such that GPA and Higher band are tightly coupled. The regression equation (or average) for this graph is y = 2.2x + 2.4. The spread, or standard error, is 1. Using this graph as a future predictor of Higher performance from pupils' GPAs, then, you would expect to be out by one band, on average, either way.
Now look at graph 2. Here the dots are far more widely spread out. The standard error for this graph is about 3. Used as a predictor, then, this graph's regression equation will be "out" by three bands either way, on average. But this is the difference between an A pass and a fail. You've no doubt guessed the equation for this graph. Yes, it's y = 2.2x + 2.4, exactly the same but signifying something very different.
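The two graphs can be reproduced in miniature. The sketch below, again in Python with invented data, builds two small data sets that share the line y = 2.2x + 2.4 but differ in how widely the dots scatter, then computes the standard error for each:

```python
import math

def fit_and_spread(points):
    """Least-squares line plus the standard error of the estimate."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    slope = sum((x - mx) * (y - my) for x, y in points) / \
            sum((x - mx) ** 2 for x, _ in points)
    intercept = my - slope * mx
    # Spread of the dots about the line: sqrt(SSE / (n - 2))
    sse = sum((y - (slope * x + intercept)) ** 2 for x, y in points)
    return slope, intercept, math.sqrt(sse / (n - 2))

xs = [1, 2, 3, 4, 5]
line = [2.2 * x + 2.4 for x in xs]       # the underlying line y = 2.2x + 2.4
wiggle = [1, -1, 0, -1, 1]               # chosen so the fitted line is unchanged

# Graph 1: small scatter; graph 2: the same scatter pattern, magnified
graph1 = list(zip(xs, (y + 0.866 * w for y, w in zip(line, wiggle))))
graph2 = list(zip(xs, (y + 2.6 * w for y, w in zip(line, wiggle))))

for name, pts in (("graph 1", graph1), ("graph 2", graph2)):
    m, c, se = fit_and_spread(pts)
    print(f"{name}: y = {m:.1f}x + {c:.1f}, standard error = {se:.1f}")
```

Both data sets yield y = 2.2x + 2.4, yet the standard errors come out at roughly 1 and 3 respectively - identical equations, very different predictive worth.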
The inputs, the process and the outputs of the "GPA value-added" approach to target-setting are all suspect. It ill behoves us as professional educators to accept uncritically a mixture of statistical mumbo-jumbo and polemic - a rod to beat our own backs.
Eric Gillies is depute rector, Whitburn Academy. The views are his own and not necessarily those of his colleagues or employers.