League tables, contextual value added and the Government's approach to measuring school effectiveness have been condemned as "meaningless" by two leading academics, including one of the original architects of the CVA measure.
Stephen Gorard from Birmingham University has produced a devastating demolition of CVA data, showing it is riddled with errors that make it "useless" for comparing schools. He says invalid decisions of real consequence are being made about teacher and school performance on the basis of a "voodoo science", which should be abandoned immediately.
Professor Gorard also warns that the school report cards being developed as an alternative to league tables will do nothing to solve the problems because they are based on the same data.
Meanwhile, Harvey Goldstein, a member of the group that helped devise the measure, has renewed his attack on it. It was originally designed to make exam league tables fairer by adjusting results to take account of pupil background and their prior achievement.
Professor Goldstein and George Leckie, one of his colleagues at Bristol University, say all league tables, including those that use CVA, have "very little to offer as guides to school choice".
The official line is that CVA is important for parents and policy-makers because it "isolates the school effect". In other words, parents can see what difference each school makes to its pupils.
Professor Gorard says the Government took that view because school effectiveness researchers have assumed that any variation in raw exam results unexplained by pupil intake must be caused by something the school is doing.
In fact, he says, the differences suggested by CVA are "made up almost entirely of the error component in the original figures". These are caused by inaccurate, inconsistent and missing data. For example, only 85 per cent of pupil records have complete sets of data for five of the crucial CVA variables - free school meals, pupils in care, special needs, gender and ethnicity.
There are also differences between supposedly comparable qualifications, and issues all the way down to whether a fire alarm interrupted an exam in one school but not in another.
Crucially, Professor Gorard argues that much of this error is not random but biased against particular groups of pupils. Therefore it cannot be compensated for with the statistical techniques traditionally used to cope with sampling errors. The result is that the adjusted exam results used in the CVA measure are only 80 per cent accurate.
He says this inaccuracy is magnified when a school's final CVA figure is calculated by comparing predicted exam results - based on scores from pupils with the same average prior achievement and background - against the results pupils actually achieve.
Professor Gorard's paper shows this can lead to CVA figures that are incorrectly inflated or deflated by as much as 1,000 per cent, making them "useless".
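The arithmetic behind this magnification can be sketched with invented numbers (these figures are illustrative assumptions, not data from the paper): because a school's CVA score is the small gap between two large, imperfect quantities, a prediction error that is tiny relative to the totals can be enormous relative to the residual.

```python
# Illustrative sketch with invented numbers, not real CVA data:
# a CVA-style score is the gap between a cohort's actual and predicted
# average points, so errors in the prediction feed straight into the score.

def cva_score(actual: float, predicted: float) -> float:
    """Value-added residual: actual minus predicted average points."""
    return actual - predicted

true_predicted = 340.0   # hypothetical 'correct' prediction for a cohort
actual = 342.0           # the cohort's actual average points score
true_score = cva_score(actual, true_predicted)    # the 'real' value added

# Suppose the prediction is off by just 5 points - about 1.5 per cent of
# the total, but more than double the residual it is subtracted from.
noisy_predicted = true_predicted - 5.0
noisy_score = cva_score(actual, noisy_predicted)

inflation = (noisy_score - true_score) / abs(true_score) * 100
print(f"true score {true_score:+.1f}, reported {noisy_score:+.1f}, "
      f"inflated by {inflation:.0f}%")
```

Here a 1.5 per cent error in the prediction inflates the reported score by 250 per cent; larger, biased errors of the kind Gorard describes would distort it further still.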
"It is certainly no basis for making national policy, rewarding heads, informing parents, condemning teachers or closing schools," he writes.
He says Ofsted verdicts are skewed and the public misled and excluded from understanding complex measures that are then used for a model of school effectiveness that "does not and cannot work".
The Bristol University paper says parents choosing schools want to know about their current performance. But league tables, including those using CVA, are based on the results of pupils who joined the school several years ago. Performance is unpredictable and varies over time so the tables are misleading, the academics conclude.
Last year, Professor Goldstein said that CVA was "at best misleading, at worst dishonest" because of the relatively small number of pupils used to calculate the figures.
John Dunford, general secretary of the Association of School and College Leaders, said: "It is wrong to say that CVA is not perfect and therefore should be abolished. It should be improved, but the notion of contextualised results remains an important aim."
A Department for Children, Schools and Families spokesman said: "No single measure of performance can tell the whole story about a school's effectiveness, and CVA must not be viewed in isolation. Attainment data continues to play an important role in painting the full picture of a school's performance and, in the 2009 tables, this will be supplemented by new progress measures."
- 'A case against school effectiveness' by Stephen Gorard. The Bristol paper can be found at: www.bris.ac.uk/cmpo/publications/papers/2009/wp208.pdf
CRACKS IN THE SYSTEM
- Contextual value added is supposed to be the most sophisticated school performance measure yet, adjusting exam league tables for broader factors such as pupils' ethnicity, gender and socioeconomic background as well as prior achievement.
- Since January 2006 it has been a major factor in Ofsted verdicts, and it was introduced to league tables the following year, prompting immediate controversy.
- Some heads felt they were receiving poor reports on the basis of a measure that they did not understand and that had produced mystifying results.
- Stephen Gorard's work provides a possible explanation: it is because the measure is unreliable and based on errors.
- The group of academics that advised the Government on the measure also voiced misgivings before its introduction. These concerned the margin for error and the measure's accuracy in reflecting schools' performance over time.
- New data has allowed Harvey Goldstein, one of the advisers, to conclude that the latter fears were well founded and that while the measure might be useful for policymakers, it is unsuitable for parents trying to choose schools.