OFSTED's inspections are wrong too often, and the effects of its misjudgments can be devastating, says Tim Brighouse, Birmingham's chief education officer
Last week a chance remark to the Commons select committee on education released a torrent of comment - sometimes emotional, but often analytical - on the reliability, consistency and validity of the Office for Standards in Education inspections of schools.
Underlying the comment and the work of the committee is the need to examine whether OFSTED is the first fault-free "100 per cent reliability" organisation and, if it is not, the extent and consequences of its fallibility. Does it perhaps have a unique and dangerous capacity among public bodies for hiding its errors and, in the process, burying the consequences for the victims of its mistakes?
The occasion of the furore is easily summarised, but not so easily understood or accepted. A large secondary school on a challenging Birmingham outer-ring estate with new, committed, skilful and inspired leadership, had moved from "serious weaknesses" through very favourable comment from monitoring HM inspectors to being proposed by an OFSTED team for "special measures" - all in the space of less than 18 months.
The sense of injustice and outrage in the school and the wider professional community was matched only by the loss of energy, health and heart among a superb body of staff. At the select committee I chose to illustrate some general points by reference, anonymously, to the school's plight and especially that of its outstanding headteacher who, along with his family, had been so damaged by an act of great misjudgment and injustice.
That OFSTED, even at its best, was capable of error and misjudgment had been brought home forcibly during Birmingham's own inspection. Its final draft reports so often bring to mind the characteristics of a poker-playing sixth-former writing for a tabloid. "Poker-playing" because they are unwilling to reveal their hand of evidence even when called; "sixth-former" because they often confuse assertion with argument and opinion with evidence; and "tabloid" because their use of language can sometimes be extravagant and exaggerated.
These faults have been partly mirrored in school reports. Judgments are occasionally based on inadequate and often unrevealed evidence, while there is often a lack of coherence and consistency between the summary findings and those in the main body of the reports.
These continue to undermine confidence in OFSTED's quality assurance procedures. The nine out of 10 schools that have a mainly good experience shrug off these issues and, after the usual post-inspection energy loss, get on with their vital business. But what of the schools subject to inaccurate assessment?
OFSTED was bound to have these problems when, before Chris Woodhead was appointed, it recruited thousands of inspectors and trained them in a matter of days rather than using the year-long apprenticeship model tried and tested under HMI.
Inconsistency of judgment and variability of inspection outcomes were thus built in from the beginning. Despite the chief inspector's personal, persistent and heroic attempts to improve the quality of inspection standards by frequent edicts and reviews of the inspection framework, doubts about validity remain. After all, you can paper over the cracks in a faulty design only so many times before its structural weaknesses undermine the confidence of those who depend on it. That has now happened with school inspection.
Half-remembered events and urgent questions begin to crowd in. Was there a "double-blind" inspection of the same lessons by HMI and OFSTED inspectors a couple of years ago on a seven-point scale, and what discrepancies did it reveal?
On how many occasions have corroborating HMI teams disagreed with the original OFSTED team recommending special measures?
And when they did, who made the final decision, in whose favour and on what evidence?
What percentage of observed lessons has been registered at each point on the one-to-seven scale during OFSTED inspections, and when the scale was telescoped to one-to-three, on what basis was that done?
And, significantly for the Government's strategy, how many special measures and serious weaknesses schools were reported in the first half of this autumn term and how does that compare with the first cycle?
Or is Birmingham's tally of nine primary and secondary schools in "special measures" in four years followed by six in three months since August exceptional? If it is, is that the result of a local education authority post-OFSTED-audit depression or some other factor, and if so, what?
Perhaps the most important unanswered question of all, however, is the impact of the present unreliable system on teacher morale and supply. With the prospect of frequent, full and inconsistent inspections in urban areas, there is a real danger that teachers will choose to teach elsewhere.
Yet urban areas are where the real battle is to be won and is being won - in Birmingham our tally of special measures is well below the national average just as our rate of improvement is better - thanks to the quality and skill of school staff and their heads.
"Don't let this ever happen again" is how someone close to the Birmingham headteacher and the staff put it when their school suffered their flawed OFSTED judgment.
My chief education officer colleagues think the margin of error in OFSTED judgment is 15 per cent. If it were only one in 10, it would be more than 2,000 schools. As I have seen, even one is one too many. That is why I referred to it briefly and anonymously - though the tabloids disgracefully chased it down and invaded the school's and the family's privacy - in presenting evidence to the House of Commons education select committee. Schools like that one - indeed all schools - deserve a less fallible inspection system.
So does the chief inspector of schools.