OFSTED methods fail to impress

The Office for Standards in Education "works", asserts Peter Matthews, its head of quality (TES, August 28). Let us examine the arguments put forward.

First, he cites OFSTED's own post-inspection questionnaire to schools. Apparently, only about 10 per cent of schools tell OFSTED they think its judgments are unfair and inaccurate. Is this surprising given that schools are replying without any anonymity?

Second, he cites "instructions on sampling" that guide inspectors. My point (TES, August 14) was that no justification for these instructions has been produced. Studies to establish the size of sample needed for reliable judgments should have been conducted before any inspections were permitted. Poor methodology once again.

Third, he cites a single study of the extent to which two inspectors agree when they observe the same lesson. This was a study of inspectors by inspectors, using volunteers who had previously worked together and who knew they were part of an investigation. Only 17 per cent of those invited to participate agreed to do so. In short, hardly a representative sample.

For credibility, an independent body, such as the Royal Statistical Society, should have been asked to design and implement an evaluation that was representative of typical inspectors working under representative conditions. Poor methodology, again.

But does OFSTED work? Is it improving schools? Where is the good evidence for that claim?

Nationally, we have a decline in enrolments to teacher training, a decline in the quality of applicants and a shortage of candidates for headships.

These problems may or may not be related to "naming-and-shaming" activities but OFSTED is definitely responsible for having taken 616 primary heads away from schools to become additional inspectors. Only 38 per cent returned to their schools. This was a massive loss of experienced primary heads.

In the context of the publication of school performance tables, the increasing use of value-added measures, the competition resulting from parental choice and the local management of schools, is not the pressure on schools quite sufficient?

Many researchers believe that in this context OFSTED does not work and its poor methodology is a disgrace.

OFSTED's methods need to be reformed to be more credible and more in line with modern practice.

They could learn something from the auditing approach adopted by the Further Education Funding Council.

Carol Taylor Fitz-Gibbon, School of Education, University of Durham, Durham
