THE TOOLEY and Darby Report on Education Research, co-sponsored by the Office for Standards in Education (OFSTED), has certainly discovered some motes, but I would like to counterbalance it by reference to the beam in the eye of OFSTED. The main flaw, both in the educational research examples cited and in OFSTED, is inadequate methodology.
A figure of £70 million is cited as the amount spent annually on educational research, but the OFSTED budget is considerably larger than this.
So how good is OFSTED's methodology? Tooley and Darby seem to think that citing how the sample was drawn and the size of the sample is sufficient for "good practice". It is not. The adequacy of the sample for the judgments to be made needs to be considered.
Where is there a single paper from OFSTED even considering the issue of the size of the sample to be drawn in observing teachers teaching?
Furthermore, the pre-announced nature of an OFSTED visit raises serious questions as to whether OFSTED sees the school as it really is.
Second, there is now, late in the day, one study reporting the extent to which two inspectors agree when observing a lesson. Instead of commissioning external researchers, OFSTED conducted the study of its own inspectors itself, using a volunteer sample of inspectors who knew that their judgments were being studied and who had often worked together in the past. Thus, although the study was no doubt carefully conducted, its design was inadequate.
Third, where is there a single study of the validity of OFSTED's judgments? I suggested to Sir Stewart Sutherland, the first chief inspector, that studies of OFSTED inspectors' judgments (guesses) of pupils' progress should be checked against value-added data, i.e. against measures of progress.
Since we have value-added measures going back to 1983, OFSTED could have availed itself of this offer of help, but it did not and, moreover, told the Select Committee that value-added measures were not available. This was an inaccurate statement.
The need for a credible sample, reliable measurements and valid conclusions was pointed out in an article in The TES (March 29, 1996). OFSTED responded with silence.
In an article in Managing Schools Today last April, I reviewed the reliability study and presented evidence that OFSTED's judgments of failing schools did not seem to be consistent with value-added data. To date, there has been silence.
May I recommend this method of response to those researchers attacked by Tooley and Darby? They are small fish compared with the might and expense of OFSTED. The Government might worry about the motes, but it should do something about the beam.
Carol Fitz-Gibbon, Professor of Education, University of Durham