The new academic year has started with a bang from the ministry of propaganda, otherwise known as the Department for Education. A fishy-looking, self-congratulatory press release, proclaiming the arrival of 52 new free schools, is a case study in statistical jiggery-pokery, of which the department responsible for overseeing the search for truth among our nation’s children ought to be ashamed.
The release, headlined "Back to school for thousands as new free schools open", offered two key statistics.
First, the release proclaimed the "higher standards" of free schools (by implication, higher than all the other types of schools which the department also funds; welcome back to school, the rest of you). This was based on data showing that 29 per cent of them are said to be rated "outstanding". Second, it revealed that the number of free schools opened since 2011 has now passed 500.
But a closer look at the statistics is extraordinarily revealing.
Take the "29 per cent 'outstanding'" figure first: it stands up at first glance. A look at all the schools most narrowly defined as "free schools" – more on this below – from Ofsted's database reveals that, yes, 57 of the 199 frees inspected so far have got the inspectorate's highest grade.
This is a significantly higher rate than that enjoyed by state-funded schools as a whole, at 21 per cent.
What the DfE has not said in this or previous statements, however, is that the distribution of Ofsted grades in free schools is more polarised than in state schools as a whole. That is, while there are indeed proportionately more "outstanding" frees, the proportion rated "good" or "outstanding" is actually slightly lower, at 85 per cent versus 89 per cent in the state sector as a whole.
Meanwhile, the rate of free schools rated "inadequate" is double that among state-funded schools as a whole, at 4 per cent versus 2.
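The arithmetic behind these figures can be sanity-checked quickly. A minimal sketch, using only the counts cited above (57 "outstanding" of 199 inspected); the sector-wide percentages are quoted from the text, not recomputed here:

```python
# Back-of-envelope check of the narrowly defined "free schools" figures.
# Counts are those cited in the text from Ofsted's database.
outstanding, inspected = 57, 199

pct_outstanding = round(100 * outstanding / inspected)
print(f"Outstanding rate among frees: {pct_outstanding}%")  # 29%

# Quoted comparison with state-funded schools as a whole:
#   "outstanding":          29% (frees) vs 21% (state sector)
#   "good"/"outstanding":   85% vs 89%
#   "inadequate":            4% vs 2%
```

So the headline 29 per cent is genuine for the narrow definition; it is the rest of the grade distribution that goes unmentioned.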
This relative polarisation of Ofsted results seems unsurprising, given the advantages and disadvantages of the policy. I don't have space for much detail here, but to summarise: on the upside, free schools are new institutions, sometimes benefiting from successful organisations' expertise and usually benefiting from subsidies that allow them to build up gradually in their first years. On the downside, some have failed because their founding groups were inexperienced, and because free school "planning" often seems to stretch the word's meaning.
If it is too much to expect the DfE to acknowledge that its policy is far from perfect, it does, at least, surely have a duty to present Ofsted statistics in the round.
But it gets really interesting when we look at the figure of 500 free schools opened. Ofsted's database, which includes free schools that have yet to be inspected, lists only 305 as of July this year.
So the first thing the DfE did was to include all categories of free schools. In its official categorisation, university technical colleges and studio schools are also counted under the heading "free schools". So that bumps up the numbers.
The trouble is, though, that these schools are also much less successful, in Ofsted terms, than other free schools. The DfE has included them, remember, in its statistics for the number of free schools open. So it seems only right that they should also be included in the data on free school success rates.
But do this, and the Ofsted ratings for free schools become less impressive, at only 24 per cent "outstanding", and 6 per cent "inadequate". So free schools, on this measure, still have a higher proportion of "outstanding" verdicts, but it’s much more marginal – 24 vs 21 per cent – while "inadequate" ratings are treble that for the state sector as a whole.
So the DfE, in its press release, has used one set of statistics to make its free school numbers look good, and another dataset to make the policy’s Ofsted ratings look good.
But the really staggering aspect comes at the end. Even with studios and UTCs included, there are still not 500 open free schools. Again, this is clear from the data: the DfE’s own statistics on "open free schools" take us to only 473 institutions – including studios and UTCs – as of this month. So how did the DfE get to 500?
The only way I can see is that the DfE included closed free schools. On its Edubase database, where schools opening this month are incorporated, we get to just over 500 free schools that have opened since the policy started in 2011. But this includes 36 schools which have since been shut, many of them after a chaotic few years.
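A minimal sketch of the arithmetic, assuming the DfE's "open free schools" statistics and its Edubase database count the same population:

```python
# Reconciling "more than 500" with the figures cited above:
# 473 open free schools (including studios and UTCs) plus
# 36 that have opened since 2011 but have since closed.
open_now = 473
closed_since_2011 = 36

ever_opened = open_now + closed_since_2011
print(ever_opened)  # 509 -- "just over 500", but only if closures count
```

On those numbers, the 500 milestone is reached only by counting schools that no longer exist.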
Several of these, of course, had failed Ofsted inspections spectacularly: you only have to read news stories such as "Government shuts free school amid claims taxpayers' money was wasted", "Bolton Wanderers Free School which will close just three years after opening owes government nearly half-a-million pounds" and "Oldham's failing Collective Spirit secondary added to list of free school closures" to know that the policy is sometimes crashing alarmingly, to pupils’ – and taxpayers’ – cost.
But those inspections are not in the data by which the DfE would now like the free schools policy judged, because these institutions no longer exist.
So there it is: the DfE is cherry-picking data so that only one portion of free schools’ Ofsted judgement profile is publicised. And it appears to be using less successful – and even closed – free schools to bump up the numbers said to have been opened, but which are then not to be counted in assessments of the policy’s success.
I wonder: is the free schools policy in trouble, given that the government has to resort to this? We all, as taxpayers funding this shameless manipulation of statistics, deserve so much better.