Ofsted's new inspection regime has proved to be much tougher on schools than official figures released earlier this month suggested, TES can reveal.
The watchdog originally announced that 9 per cent of the schools it inspected between January and March were judged inadequate and failed. It has now admitted that the real figure was 13 per cent of the 2,075 state schools it visited during those first three months of its new inspection framework.
The discrepancy is explained by verdicts from 111 schools that were missing from the original statistics, a disproportionate number of which were failed.
The news is likely to reignite fears about the severity of the framework. Graham Lancaster, an Essex-based Ofsted inspector, spoke out earlier this year, describing the new regime as "rather frightening" and suggesting that it was failing too many schools (TES, 1 June).
The missing reports had not been finalised by the 1 May cut-off date for the original statistics. But Ofsted made no mention of this when the figures were first released. The missing verdicts came to light only because heads' leaders noticed a sharp drop in the proportion of schools being failed between January and March.
Russell Hobby, general secretary of the NAHT heads' union, had feared the fall might have been caused by the inspectorate deliberately trying to manage the high number of schools it was failing.
He now accepts that this was not the case. "On the other hand," he added, "this does mean the number of schools being put into categories (being failed) is higher than we thought and Ofsted's regime is tougher than it originally portrayed."
The new figures show that 7 per cent of schools inspected between January and March were judged outstanding, 48 per cent good and 32 per cent satisfactory.
The equivalent percentages during the 2010-11 academic year under the previous framework were 11, 46 and 38, with just 6 per cent judged inadequate.