So how good is your education authority?
Platform; Opinion
EAST RENFREWSHIRE is in the vanguard of combining its own measures for ensuring quality in schools with the growing attention to evaluation of the quality of education authorities themselves. How has that come about?
The council's aim has always been to bring quality monitoring and development closer to the professionals and to focus the processes on effective learning and teaching. Following extensive discussion with headteachers, parents and other stakeholders, it was agreed that heads would take the role of quality manager and be responsible for support, monitoring, measurement and enhancement of quality in their schools.
Meanwhile a task group on best value from the Scottish Office and the Convention of Scottish Local Authorities said in its first report in July last year that the main components of a best-value regime are sound strategic, operational and financial management; performance measurement and monitoring; focus on the customer or citizen; and continuous improvement and competition.
It seemed sensible to combine our own quality arrangements with the main characteristics of the best-value regime. The council had already agreed to pilot the Accounts Commission's module on management arrangements. This work of strategic and corporate service planning was also applied to the education service, focusing on learning and teaching.
By this time, the council had decided that self-evaluation was the best route. By August 1996, all schools had agreed to help pilot different aspects of the national document, How Good is Our School?, with three of them presenting a standards and quality report to the education committee. In their new role of quality manager, heads empowered and enabled colleagues to look at their own performance.
There is also a small team of quality development officers in East Renfrewshire whose remit is to support schools in their self-evaluation. Each officer takes part in a variety of monitoring exercises within a five-year planning framework, which includes the normal HMI inspections, annual school plan monitoring, and issue-based, or "taking a closer look", inspections of the work of the schools. Within the five years all schools take part in at least one of these activities.
This schedule seemed to fit well with the best-value proposals. The benefits lie in the focus on learning and teaching and on the use of qualitative comparative data.
The secondary heads agreed to begin a target-setting exercise - a modest initiative with the directorate suggesting that each school should target one area each session. But the heads said they wished to encompass both Higher and Standard grade.
So far so good. Progress had been made on the quality agenda for learning and teaching. The best-value regime was, however, to introduce in a much more formal way the concept of cost and the relationship of cost to performance. Cost would be included so that the value added by each activity or process to educational effectiveness could be assessed. So there has to be a means of preserving positive evaluation in the classroom which also takes on board the need for thorough appraisal of costs and value for money.
When we compiled our best-value service review programme, a careful balance was struck between the cost-technical type of review and that which clearly addresses the quality of learning and teaching.
Around last October it became clear that we should not only validate the schools' use of self-evaluative techniques, but also look at the validation of the authority itself. It was agreed to ask HMI if it would help "close the quality circle" by evaluating the progress made under the new quality development framework.
With several colleagues, I responded to an open invitation to join a working group with HMI, whose ethos has turned out to be one of partnership and mutual learning. I was interested to learn whether our concept of quality development was feasible. The inspectors wanted to learn from colleagues in education authorities. Their work is not in the political environment of a local authority, but they have a vast amount of skill in analysing complex situations, gleaning information from stakeholders and monitoring the effects of a policy on the ground. By June this year the group had produced a set of draft performance indicators that could be used in the first evaluation, which it was agreed would be of East Renfrewshire.
The prime objective was to look at quality development strategies and indicate how they are assisting school improvement. This central core of an education authority's work can well demonstrate best-value principles. The evaluation has not looked at the entire scope of the authority's functions.
Using the draft performance indicators, the evaluation is expected to result in a report describing one system in one council. As more evaluations are undertaken, the draft performance indicators are likely to be refined. Eventually, all authorities should be able to use the indicators to gauge their own performance.
The evaluation team consisted of five inspectors and an observer from the Accounts Commission. Its work was in three phases, and the team had opportunities for discussion with individuals and groups. These included the chief executive and leader of the council, the convener and vice-convener of the education committee, the director of finance, education department staff, schools and chairmen of school boards.
We are now ready for the report, The Management of Quality Development in East Renfrewshire Council Education Department, which will be published next Friday (November 6).
Eleanor Currie is director of education for East Renfrewshire.