What are league tables?
The DfES maintains that it has never published "league tables". Instead, it produces "achievement and attainment tables", which provide information about how pupils have performed in national exams. Schools are grouped together by the local education authority and then placed in alphabetical order. It is local and national newspapers that use these statistics to establish an order of ranking. But it's disingenuous of the Government to dismiss league tables as a media invention. Its own website encourages comparisons, and has bar charts showing how individual schools compare against LEA and national averages.
League tables were introduced in 1992 by the John Major-led Conservative government to allow parents to see how local schools measured up against each other and schools nationally. If parents were to be encouraged to make choices, went the reasoning, then it was important they could make informed choices. The same argument is used by the current government. Another justification is that they make schools more accountable, and raise standards by encouraging healthy competition and forcing poor performers to improve.
Critics say the tables throw the emphasis on to academic performance, with no recognition of the pastoral care and extra-curricular opportunities that schools provide. This puts pressure on teachers to teach to the test, which stifles independent thought. Schools that perform badly in the tables can find it difficult to attract new teachers, while existing staff are more likely to become de-motivated. A 2005 survey by the Teacher Support Network found that 70 per cent of primary teachers felt league tables had a negative effect on their well-being.
Northern Ireland, Wales and Scotland
England is the only UK country to have consistently published centralised performance tables. Northern Ireland broke ranks in 2001, scrapping tables and requiring schools to publish results in their prospectus instead. Later that year, Wales also abandoned tables, deciding results should appear in prospectuses and governors' reports. In 2003, the Scottish Executive changed the way secondary school data was presented, making it available on a school-by-school basis on its Parentzone website. Primary results, meanwhile, were designated for internal use only.
But last year The Sunday Times's Scottish edition used freedom of information laws to produce its own primary league tables, claiming it was striking a blow for the "rights of parents". In Wales too, some newspapers have published their own tables, and a TES Cymru poll in May 2004 found that two-thirds of parents wanted league tables to be made available. In December 2005 results for secondary schools were once again published on the Welsh Assembly website "as part of a commitment to open government". In the Republic of Ireland, it is illegal for newspapers to publish school tables, and there has been little parental pressure for change.
The most common criticism of league tables is that they are unfair. When first introduced, they consisted only of raw data, such as the percentage of pupils gaining five or more GCSEs, or reaching level four in KS2 tests.
Not surprisingly, the tables were headed by selective schools and those in wealthy middle-class areas. Critics insisted that the figures revealed nothing about how good a school was, only what kind of catchment area it served. Since 2003 the tables have included a "value-added" measure, intended to give an indication of the progress pupils make during their time at school. But raw data still has its enthusiasts: it's reliable, easy to understand and reflects the fact that achievement - not improvement - decides university places and job prospects.
Value-added scores are calculated by comparing children's past and present performance against a national median, to determine whether they have made more or less progress than might statistically be expected. But not everyone is convinced that value-added measures are any fairer than raw data. Some argue that schools whose pupils have been performing well all along find it harder to add value than those whose pupils start from a lower baseline. Many of the schools which fare best in the value-added tables have high numbers of children for whom English is not a first language. Perhaps it is only logical that children coming to terms with a new language may fail to fulfil their potential at first, only to blossom later.
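The basic idea can be shown with a toy calculation. This is an illustrative sketch only, with invented numbers; the DfES's actual methodology is considerably more involved. Each pupil's current score is set against the median score achieved nationally by pupils who started from the same prior-attainment level:

```python
# Illustrative sketch of a value-added measure (invented data, not the
# official DfES calculation): compare each pupil's current score with
# the national median for pupils who had the same starting point.
from statistics import median

# Hypothetical national results, grouped by pupils' prior-attainment level
national = {
    2: [20, 22, 24, 26, 28],   # current scores of pupils who started at level 2
    3: [26, 28, 30, 32, 34],   # current scores of pupils who started at level 3
}
# Expected (median) score for each starting point
expected = {level: median(scores) for level, scores in national.items()}

# A hypothetical school's pupils: (prior level, current score)
school = [(2, 27), (3, 29), (3, 33)]

# Value added per pupil = actual score minus the national median for
# pupils with the same starting point; the school's score is the mean.
value_added = [score - expected[prior] for prior, score in school]
school_va = sum(value_added) / len(value_added)
print(round(school_va, 2))  # a positive score means above-expected progress
```

On these made-up figures the school scores 1.67, i.e. its pupils progressed slightly more than the national medians would predict. The critics' point is that everything hangs on how well those medians really capture "expected" progress for a given intake.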
But there's another problem with value-added data: the same factors which govern achievement, such as family background, also seem to govern progress. According to Professor Stephen Gorard's research at the University of York, few schools do markedly better or worse in the value-added tables than they do in the ordinary ones. "Perhaps which school you go to doesn't, in the end, make that much difference," he suggests.
"All schools follow the same curriculum, and are roughly equally funded. It may well be that it is children who shape schools, not schools who shape children."
From next year, league tables will include "contextual value-added" (CVA) scores. These will be based on pupils' levels of improvement, but adjusted to take account of social factors such as the number of children receiving free school meals and the number with special needs. But CVA will almost certainly make the tables more difficult for parents to understand. And there are doubts about just how reliable the calculations will be.
Professor Peter Tymms, director of the Curriculum, Evaluation and Management centre at Durham University, argues that accurate comparison between schools is virtually impossible, regardless of whether or not statistics are weighted. "No two schools are the same. Just because they have the same number of children on free school meals doesn't mean a thing."
The most-hated education initiative?
Possibly. Since league tables were introduced, and whatever form they have taken, teachers, headteachers and union officials have consistently condemned them. Even heads whose schools do well in the tables often speak out against them. "Unfair and unjust" is the verdict of Barbara Jones at table-topping Combe CE primary in Witney, Oxfordshire. So will the weight of professional opinion make the politicians buckle? Unlikely. Minister of State for Schools Jacqui Smith recently called tables a "non-negotiable part of school reform", and if a Conservative government were to succeed Labour, it too would be fully committed to publishing results.
Lies, damned lies and statistics
The crux of the problem is that schools are such diverse and complex places that it simply isn't possible to reduce the work they do to a set of numbers. And even if the figures suggest one school is doing better than another, they offer no insight as to why.
Most statisticians dislike the idea of figures based on arbitrary thresholds such as the five A*-C grades, pointing out that in one school 49 out of 50 pupils can get an A* but, if the fiftieth gets a D, the school will be placed below one where all 50 pupils scraped a C. Even a scoring system based on average points per pupil can be misleading, with candidates who gain exceptionally good or bad results able to skew the score, particularly in a small school. Each year the way the statistics are calculated changes slightly. For example, an A* at GCSE used to be worth eight times a G grade, but under the latest points system it's worth just 3.6 times more. These constant changes make year-on-year comparisons difficult, undermining one of the key arguments in favour of tables, namely that they help monitor standards.
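The threshold objection is easy to demonstrate with a toy calculation. The two schools below are invented, each pupil is reduced to a single grade for simplicity, and the point values are those of the GCSE points system described above (A* = 58 down to G = 16, so an A* is worth 3.6 times a G):

```python
# Toy illustration of the threshold criticism (invented schools, one
# grade per pupil). A pupil "counts" towards the headline figure at
# grade C or above.
PASSING = {"A*", "A", "B", "C"}
# GCSE points scale in which an A* (58) is ~3.6 times a G (16)
POINTS = {"A*": 58, "A": 52, "B": 46, "C": 40,
          "D": 34, "E": 28, "F": 22, "G": 16}

school_a = ["A*"] * 49 + ["D"]   # 49 top grades, one D
school_b = ["C"] * 50            # every pupil scrapes a C

def threshold_pct(grades):
    """Percentage of pupils at grade C or above (the headline measure)."""
    return 100 * sum(g in PASSING for g in grades) / len(grades)

def avg_points(grades):
    """Average points per pupil (the alternative measure)."""
    return sum(POINTS[g] for g in grades) / len(grades)

print(threshold_pct(school_a), threshold_pct(school_b))  # 98.0 vs 100.0
print(avg_points(school_a), avg_points(school_b))        # 57.52 vs 40.0
```

On the threshold measure, the school full of A* pupils ranks below the school of C grades (98 per cent against 100 per cent); on average points, the ranking reverses emphatically. Which school is "better" depends entirely on which statistic you choose.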
Bordering on the unethical
One problem with encouraging competition is that some schools get more competitive than others. Attempts to improve results through better management and teaching do deserve recognition. But if schools want quick-fix solutions, then there are ways and means of playing the system.
For example, holding extra revision classes for pupils on the C/D borderline is understandable; the five or more A*-Cs figure is still the one that parents and newspapers look for. But are there similar classes for those on the A/B borderline? Unlikely. Some heads even fund private tuition for children who might not get the magic five Cs. More dubiously, schools may be tempted to push students towards subjects which are seen as soft options.
Even the value-added measure is open to manipulation. What's to stop an unscrupulous primary under-preparing its KS1 children to ensure great value-added figures a few years down the line? And then there's the marking of coursework. How easy is it for teachers to remain even-handed when assessing the work of a C/D borderline candidate?
A piece of cake?
Another bone of contention is the issue of "equivalence" between vocational and academic subjects. Currently, an intermediate GNVQ in information technology is equal to four GCSE passes. That means successful candidates - and there were more than 50,000 of them last year - need only one more GCSE pass to meet the magic five A*-Cs mark. In 2005, nine of the 10 most improved secondary schools had entered substantially more pupils for GNVQ than they had the year before. Some people argue that average points calculations are also skewed, with vocational subjects over-valued in comparison with academic ones. Traditionalists question whether an A in physics (52 points) should really count for less than a distinction in cake decorating (55 points). Amid the academic snobbery lie concerns that success in vocational courses is masking a decline in standards of literacy and numeracy. Last year, more than 300 schools improved their league table standing, even though their results in maths and English went down, prompting the Government to pledge that from next year, maths and English will have to be two of the five GCSEs included in the A*-C measurement. It is thought that more than 100 schools will drop at least 1,000 places in the GCSE league tables when next year's rankings are calculated ("Winners become losers in changes", TES, January 27, 2006).
Do league tables help parents?
The DfES reminds parents visiting its website that they will "probably want to look at the achievement and attainment tables to decide which schools to express preferences for". It's this encouragement to use league tables as a tool for judgment that most concerns heads. They worry that tables - with their seemingly clear-cut evaluation of a school's merits - actually discourage parents from looking closely at what a school has to offer.
But a 2005 survey by Mori, carried out for the General Teaching Council, found that fewer than a third of parents had even looked at performance data when choosing their child's school. Perhaps this isn't surprising given that a National Foundation for Educational Research study, also last year, concluded that most parents did not understand the data presented to them. It seems heads and the media are more hung up on league tables than the people they are aimed at.
Schools could produce their own profiles, either separate from their prospectus or as part of it. These would have to include key statistics, laid out in a standardised format, but schools would be able to set the information in context by including statements about their ethos and pastoral system, and their extra-curricular opportunities. Critics say this would reduce statistics to a marketing tool, encouraging schools to put their own spin on the facts and bamboozle parents. And who would take responsibility for checking that the data was correct?
Finally, there are those who claim that the fairest way of assessing schools already exists, in the shape of Ofsted reports, and that scrapping league tables would give reports more prominence.
* www.learning.wales.gov.uk/scripts/fenews_details.asp?NewsID=2217
Main text: Steven Hastings
Main photograph: Alamy
Additional research: Sarah Jenkins