New league tables, same distortions
With considerable fanfare and claims of accolades from educational theorists, the Observer has launched another form of league table. While it may be desirable to seek more sophisticated ways of measuring school performance, it is certainly not a "step forward" to rush into another form of league table, especially one as deeply flawed as I believe this one to be.
Value-added scores are put forward as a measure of improvement, or otherwise, in going from Standard grade to Higher. Sometimes this information, when interpreted with great caution in the context of other school parameters, can be useful. This does not, however, justify slapping these results into a league table. Why not?
One reason is that in schools with a small number of students going on to Higher grade, the numbers involved in many subjects do not allow reliable statistics to be produced. Another problem is that an artificially enhanced score could be produced if a school does less well than usual at Standard grade without much change in Higher grade results the following year.
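To see how the second artefact can arise, consider a toy calculation. This is only a sketch, not the Observer's actual formula (which is not reproduced here): suppose a "value-added" score were simply the gap between a cohort's Higher pass rate and its earlier Standard grade pass rate. All figures below are hypothetical.

```python
# Toy illustration only: a crude "value-added" score defined as the gap
# between Higher results and the same cohort's Standard grade results.
# This is an assumed, simplified formula -- the point is the artefact,
# not the Observer's real method.

def value_added(standard_pct: float, higher_pct: float) -> float:
    """Toy value-added score: Higher pass rate minus Standard grade pass rate."""
    return higher_pct - standard_pct

# A school's typical year: strong Standard grades, similar Higher results.
typical = value_added(standard_pct=60.0, higher_pct=55.0)   # -5.0

# The next cohort happens to do worse at Standard grade, but Higher
# results are unchanged: the score jumps, with no real improvement
# in teaching or management whatsoever.
dip_year = value_added(standard_pct=50.0, higher_pct=55.0)  # +5.0
```

On these made-up numbers, one below-par Standard grade year turns an apparent deficit of five points into an apparent gain of five, even though the Higher results are identical.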
But here we have a "quality" newspaper claiming to have created a level playing-field by using a highly reliable, scientific method. Parents may easily assume that real differences in school management and quality of teaching are being recorded here and make a choice of school accordingly. Yet the compilers have made methodological errors and unjustifiable inferences.
My first doubt relates to the free school meal percentages. In any scientific study it is necessary to control what are termed "confounding variables". In the Observer study free school meal uptake is used as a measure of deprivation and thus of disadvantage in relation to performing well at Standard grade. What should have been explored, however, is the possibility that there are schools with similar uptakes but widely separated in the tables where much of the difference in results is due to variables beyond the control of the individual schools. In other words, the Observer scores often do not denote real differences in the quality of management or teaching.
In East Lothian, for example, the Observer score for North Berwick High (+11.3) is very different from that of Ross High (-18.8). The free school meal figures are 5 per cent and 9 per cent respectively. The researchers are remiss, however, in not even considering the possibility of confounding variables. One variable that suggests itself right away is that of national testing results in language and maths. What is the typical percentage of pupils entering secondary 1 in each school at levels D, E or better? One would expect these percentages to show a high positive correlation with success at bands 1 and 2 at Standard grade.
There are similar pairs of schools throughout the table where the differences cry out for further investigation. What about Notre Dame High School for Girls at the top of the Glasgow table and Knightswood Secondary at the bottom? Or Dalziel High and Calderhead High in North Lanarkshire? Are there significant differences in national testing scores in secondary 1, or in sociological factors (including degree of family support)?
The high point of absurdity is reached in the high praise heaped on Farr High in Highland and its placement near the top of the league. Here is a very small school with presumably fewer than 30 pupils in fourth year where small changes in the pupil population can lead to relatively large percentage changes in results.
Furthermore, it is just the type of remote school where relative poverty in some families can quite easily coincide with high academic ability. Farr may well be a happy place with very dedicated teachers but there is not a shred of reliable evidence in the Observer statistics to suggest that the quality of education is better than in hundreds of other schools.
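The small-school problem is simple arithmetic. A sketch, using hypothetical cohort sizes and pass counts (the real Farr figures are not given here): in a fourth year of about 30 pupils, a change in the results of just two pupils moves the pass percentage by nearly seven points, whereas the same change in a cohort of 200 moves it by one point.

```python
# Illustration with made-up numbers: why small cohorts produce
# volatile percentages. Two extra passes swing a 30-pupil cohort
# far more than a 200-pupil one.

def pass_rate(passes: int, cohort: int) -> float:
    """Percentage of the cohort achieving the measure in question."""
    return 100.0 * passes / cohort

small_before = pass_rate(18, 30)    # 60.0
small_after  = pass_rate(20, 30)    # about 66.7 -- a swing of nearly 7 points

large_before = pass_rate(120, 200)  # 60.0
large_after  = pass_rate(122, 200)  # 61.0 -- a swing of just 1 point
```

Ranking schools on figures this volatile treats ordinary year-to-year noise as if it were a measured difference in quality.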
Another obvious example of statistical nonsense is seen in the Glasgow table. Many schools have a high free school meal uptake (in all cases completely accurate?) and a low percentage of pupils gaining five or more Standard grades at bands 1-2, which in small schools represents only a few individuals. To create strict rankings from such data demonstrates that the compilers are in need of remedial help in the interpretation of statistics.
The record of newspapers in reporting league tables so far has been dismal: some schools have been praised and others held up for blame on the basis of extremely poor evidence. Parents do have a right to know about the quality of education in individual schools. But HM inspectors produce this for them in a well-researched, comprehensive and reliable way.
HMI also makes constructive suggestions for improvement. If more frequent inspections are needed to keep parents and staff up to date, then so be it.
It is also sensible for educational researchers to continue to look for reliable means of evaluation. To treat schools like football teams, given the exceptional difficulty in creating a level playing-field, is indefensible.
Michael Davenport is assistant head teacher, Ross High School, Tranent.