How do we assess the evaluators?

15th November 1996, 12:00am

Maurice Kogan questions OFSTED's ability to weigh local authorities.

Evaluation is supposed to be good for us, an aerobic activity as someone once described research. Certainly, the schools know all about it now, with the Office for Standards in Education asserting an inspectorial power unheard of ten years ago. Now local education authorities have begun to commission evaluations of their own, working with independent teams. Staffordshire led and was closely followed by Kirklees.

The Standing Conference of Chief Education Officers has been working on a frame for LEAs to adapt for their own use. From January 1998 all LEAs are to be open to OFSTED inspection. OFSTED is already into two LEAs and has played, or will play, a useful part in the Staffordshire and Kirklees LEA-initiated evaluations.

LEAs have long been evaluated for their management efficiency by district auditors working within the guidelines laid down by the Audit Commission. It is, perhaps, odd that statutory evaluation will now fall to OFSTED, whose expertise is largely grounded in knowledge of the schools and which has little of district audit's and the Audit Commission's expertise in evaluating larger systems.

The Audit Commission has the advantage, too, of being trusted because it starts with no biases against local government, and is governed by independent commissioners unlikely to encourage loose cannons to fire off at will. Staffordshire and Kirklees commissioned evaluations primarily to assist them in a continuing formative and developmental process. By contrast, OFSTED and District Audit evaluations are summative. These can be used, of course, for subsequent development. But only the LEA is able to integrate evaluation into a continuous process of its own development.

The LEA evaluations judged the efficacy of the total educational system by noting the inputs in terms of resources, the contexts in terms of socio-economic and other factors, and the outputs in terms of measured standards achieved by the schools and other institutions. We assessed the LEA’s policies and vision by asking whether they were appropriate to the contexts and whether they impressed themselves on the main instruments of LEA policy and practice, namely, business plans, service level agreements, the work of advisers, and on school development plans. We assessed the place of the LEA within the corporate system, the working of the education department, its relationships with schools and other providers, and its levels of consultation and negotiation.

In Staffordshire, OFSTED carried this inquiry into the working of schools and into the classrooms. It did so expertly, if at considerable cost. It is backed up by its database from which it can make time-series analyses - resources, incidentally, which it hugs close to its chest. In Kirklees, a non-OFSTED study in nine schools took it up to the classroom door. Both reached conclusions about the LEA's efficacy.

Determining whether educational standards are adequate does not, of itself, establish how far the LEA has contributed to the standards achieved. To do that OFSTED must follow a path accessible to other equally knowledgeable evaluators. Its guidelines state that it will pursue quite similar issues of context, policy-making process and implementation to those pursued by its LEA precursors.

The OFSTED contribution to the Staffordshire report did, in fact, test the connection between LEA macro policies and activities in the school. Moreover, if OFSTED lacks expertise in the overall organisation of the LEA, it will surely buy it pretty quickly - as the Audit Commission has done - wasteful duplication of district audit expertise though that might be.

What then are the differences between the management evaluation now undertaken by district audit, LEA-commissioned evaluation for self-development and OFSTED inspections?

First, OFSTED inspections will carry the force of law. They will thus differ from a self-generated consultancy or audit, of the kind generated by the most efficient private concerns, where the evaluated can take or leave the advice given.

Second, while Her Majesty’s Chief Inspector is on record (TES, October 25) as saying that LEAs will be judged according to their own criteria, that has not been the experience of the schools, and the battery of criteria being worked out is formidable. LEA behaviour will become conditioned by a central quango’s concept of what constitutes good local government.

Will OFSTED publish league tables? At present, its concepts derive from knowledge of local administration and schools but are not grounded in any concept of local government as a whole.

Third, OFSTED will evaluate 12 LEAs a year. In the 13 years between these inspections, LEAs will want to engage in reviews of their own. But the main difference will lie in the ownership and use made of the evaluation. LEAs are elected and responsible bodies: their self-evaluations, if moderated by external experts, should ensure that they perform efficiently because they will be linked with their own programmes of development. OFSTED evaluations should contribute to these processes, as now happens with district audit evaluations, but are best undertaken separately from LEA evaluations, so as not to override them.

Underlying these relational issues are concerns about methods. OFSTED and audit have moved the emphasis from the interactive and connoisseurial model prevalent in the 1970s to the production or input-output model. Attending to outcomes was under-played in earlier forms of evaluation. It is doubtful, however, whether the production model is appropriate to the evaluation of large and complex organisations such as LEAs. Judgments on the changing outputs of the schools cannot establish the link between those outputs and the interventions of LEAs.

OFSTED seems likely to combine process studies with those of outputs, but the emphasis to be followed is not yet clear. The LEA evaluations set up implicit criteria. For example, they assumed there should be a connection between the LEA's policies and school development plans, and that the traded work of an LEA would respond to what schools wanted and to the overall policies of the LEA.

Criteria need to be clarified. But it is questionable whether there should be a universal battery of criteria. In the two evaluations undertaken so far, much the same issues have been pursued, although with somewhat different emphases. LEAs, however, are different and evaluation should respond primarily to the agenda to be carried forward beyond the evaluation. Moreover, if good evaluators are to be used, they will expect some leeway in interpreting their briefs creatively.

Until recently, it looked as if there might be reciprocity between the three forms. It now seems as if OFSTED would marginalise LEA evaluations if it could. This bears on the membership of LEAs' evaluative panels. In view of the tetchy politics surrounding LEAs, an independent convener seems necessary. Of the other members, a local government politician can bring in dimensions not familiar to technocrats, but must be known to be able to think objectively and to drop partisan attitudes.

At least one person able to relate to schools as an acceptable professional, such as a recently retired HMI, and a recently retired or serving CEO would be essential. The team needs a senior LEA official to guide it to people and papers and give informal advice. But it would be difficult for both sides for him or her to be a member of the team, which must be able to discuss the LEA without inhibition.

The more difficult point is OFSTED and District Audit membership. An evaluation needs their skills and knowledge resources, and their presence may add credibility to the exercise. But the association with central government can inhibit comment on some of the contextual aspects of the exercise. There are divided opinions on whether it is enough to secure their contribution as an independent offering. These questions, however, assume collaboration between equal partners, which OFSTED has yet to demonstrate it wants. More's the pity.

Finally, it may be asked whether resources for national inspection and audit are in proper balance with those for LEA support and development. Under-resourcing compels LEA professional officers to raise resources by work that either distracts from or conflicts with their supportive roles. Should more go into statutory inspection than into helping schools be more effective through support, monitoring, assisted self-evaluation and targeting?

The resources available for nationally mounted inspection and audit may be well spent. They have never been evaluated, and it is high time they were. But they are out of proportion to the resources left to LEAs for support and the developmental monitoring of the schools.

Maurice Kogan is Professor of Government and Joint Director, Centre for the Evaluation of Public Policy and Practice, Brunel University. He was chairman of the teams that recently produced evaluations of Staffordshire and Kirklees LEAs
