As the group formerly known as advisers meets for its annual conference today, Brian Boyd and Fiona Norris consider its future
Since the reorganisation of local government in 1996 and the establishment of the Scottish Parliament in 1999, the role of advisers in local authorities has changed significantly. Improving Our Schools (SEED 1999) placed an expectation on local authorities to support and challenge schools, and performance monitoring became a key function.
Between April and August last year, the Association of Educational Development and Improvement Professionals Scotland (AEDIPS) devised two questionnaires which were sent to local authority improvement personnel.
What emerged was a range of issues relating to their changing role, including the increased burden on local authority staff, their role in continuing professional development (CPD) for teachers and their changing relationship with HMIE as part of the quality improvement process.
What is clear from the data is that the roles and remits of advisory staff have changed. However, while the new emphasis is on the quality improvement process within schools, many of their traditional duties remain.
One role that appears to have disappeared in many authorities is the delivery of what was once called in-service training, now referred to as CPD. Advisory staff now facilitate CPD by responding to schools' needs analyses, constructing programmes of CPD and supporting probationer teacher induction programmes, the Scottish Qualification for Headship, aspiring chartered teachers, and the like.
The extent of the change in role comes out most graphically when the performance monitoring and support for school self-evaluation tasks are considered. Most of the authorities represented in this survey used How Good Is Our School 2? (HMIE 2003) as the foundation of their quality improvement process. Indeed, the amount of time spent by local authority staff as part of the HMIE inspection process has become a major issue within the advisory service nationally.
Respondents were asked to quantify the time spent on activities in relation to HMIE processes prior to, during and after the visit by inspectors, in terms of days and hours. The range was huge, from one to 11 days pre-inspection and from one hour to 18 days post-inspection. Interestingly, the smallest amount of time and the narrowest range was during the inspection itself. It seems that, once HMIE is present in the school, there is no role for quality improvement officers (QIOs). In spite of the amount of time spent on HMIE processes, most respondents felt the potential for true partnership was not realised.
One authority calculated that the average time spent in supporting a school was seven days prior to the HMIE inspection, one day during the inspection itself and up to 11 days as part of the follow-up. If this is in any way typical, it suggests that there is a hidden cost in terms of resources when HMIE calls.
The evidence from the surveys is that quality improvement of schools now drives much of the agenda within local authorities. Not only are QIOs having to become generalists, cross-sectoral as well as cross-curricular, but they are now working at every level in schools - from pupils to parents and from teachers to headteachers, a role which, in the past, only primary advisers carried out.
However, the most significant change is that, while the focus is on quality improvement generally, most of the activities seem to be concentrated around the HMIE inspection. One council has produced a school inspections checklist which runs to two and a half pages of A4, closely typed, beginning with the "formal notification of inspection from HMIE" to "authority-school discussion of action plan". This is all timed around the inspection, from "inspection minus three weeks" to "publication plus one week".
Yet in this checklist, as in the data from experienced advisers, there is minimal involvement of local authority staff during the inspection itself.
Therefore, if the inspection itself is clearly an HMIE function, and if so much time is spent before and after by local authority staff, what is the nature of the relationship between the two sets of activities and the two sets of professionals?
From the schools' perspective, where does CPD fit into the process? It seems self-evident from the data that there are two sets of activities, both very time-consuming; each plays a part in the quality improvement process. There is a sense that, given the relationship between HMIE as "trainers" and QIOs as "the trained" and given that there has not been much in the way of formal acknowledgement by HMIE of the role of QIOs in the final published report, there needs to be some debate nationally about how the work of QIOs and HMIE can be truly complementary.
It may be that there is conceptual confusion between "improvement" and "inspection". Are they synonymous or is one (inspection) a subset of the other (improvement)? Is national inspection of the kind HMIE currently undertakes necessary when so much support and challenge is offered by local authority staff? If the answer is yes, then would the system benefit from a greater separation of roles, locally and nationally, with "development" being the task of the advisers and "inspection" the role of HMIE?
A third option is possible: that joint training opportunities are increased and QIOs take on an enhanced improvement role, with HMIE "quality assuring" the local authority staff.
The real issue may be the extent to which quality assurance, whether through external inspection or not, is conceived as a developmental activity. Fullan and Hargreaves have written extensively on the need for collaborative models of school improvement, both internally within schools and among all of the players in the school improvement game. At present, most of the activity seems to be narrowly focused on the inspection element of the process.
Can the new manifestation of the traditional advisory service perform both a developmental and an improvement function? The evidence from the surveys suggests that improvement officers continue to try to do so, but at a cost.
Perhaps the way forward is for a period of reflection, nationally, on how the improvement agenda itself can move forward and learn from the experiences of those who are committed to it.
Brian Boyd is honorary president of the Association of Educational Development and Improvement Professionals Scotland. Fiona Norris is a quality improvement officer in Inverclyde and a member of the AEDIPS executive.