Double standards from the inspectors

21st July 2006, 1:00am

https://www.tes.com/magazine/archive/double-standards-inspectors
A closer look at the way in which two authorities were scrutinised is not reassuring, says Sandy Wilson

HMIE has recently published pilot inspections of Clackmannanshire and East Ayrshire councils in the second “INEA” round of education authority inspections. Senior staff in authorities across Scotland will have examined these findings to gauge their own readiness for a new round of inspections.

HMIE’s charter states that its core values are “integrity, honesty, objectivity and impartiality”. These two reports fail to live up to that admirable statement of intent.

Until 2004, I was head of education and lifelong learning in Clackmannanshire and am therefore not entirely disinterested. However, I am not in the business of disparaging the quality of East Ayrshire’s education services or trying to prove that Clackmannanshire is in some way superior to it. I shall focus on two related performance indicators: “Impact (of the authority) on learners” and “Continuous improvement and performance”. East Ayrshire is rated very good and good respectively, while Clackmannanshire is rated good and adequate.

To what extent does the evidence cited in the report confirm the distinctions made? In terms of absolute performance levels, in the last year for which primary 5-14 levels are available (2004), attainment levels in both authorities were below the average for comparator authorities and for Scotland as a whole. At secondary 5-14, East Ayrshire was above, or near, national and comparator authority levels of attainment, while Clackmannanshire remained below.

Both authorities are performing below national and comparator authority levels in almost all measures relating to S4 and S5, with East Ayrshire outperforming Clackmannanshire in all but one measure. In S6, Clackmannanshire’s performance levels are well below national and comparator authorities’ averages, while East Ayrshire is above comparator authority averages and at or around national averages on all measures. At first glance, therefore, HMIE’s judgments on the two critical indicators would seem to be fair.

However, what these absolute performance levels do not reveal is the impact the authority has had over a given period (the rate of change in pupil attainment) and how far any improvement can be attributed to the efforts of the authority. At primary 5-14, East Ayrshire’s performance is summarised thus: “Over the period 2002 to 2004, attainment in reading and writing in primaries had remained broadly constant, while mathematics had improved slightly. Performance in all three areas was below the averages for comparator authorities and national averages, in 2002 to 2004.”

Clackmannanshire’s performance is reported in two ways. “Over the period 2001 to 2005, there had been a steady improvement in pupils’ attainment in writing and mathematics. Attainment remained static in reading”. And “between 2001 and 2004, the rate of improvement in reading and mathematics was in the middle 50 per cent of comparator authorities and in writing it was in the top 25 per cent. However, in 2004 performance in all three areas was below the average for comparator authorities and national averages.”

The data sets used for the authorities relate to different periods (Clackmannanshire’s is even reported on over two different periods).

However, data based on 2001-04 or 2002-04 demonstrate unequivocally that Clackmannanshire’s rate of improvement is consistently higher than East Ayrshire’s during both periods in reading, writing and mathematics.

The summary statements of the rates of change in level E performance by the end of S2 describe “an improving trend in reading and writing” while remaining “fairly constant” in mathematics in East Ayrshire from 2002-04.

Clackmannanshire’s performance in reading and writing “stayed steady” from 2003-05 and in mathematics, attainment had been “steady” in 2003 and 2004, but had “improved significantly” in 2005.

So, as with the primary data, different periods of change have been examined for the S2 results in the two authorities. If they are examined over the same periods as the primary data, Clackmannanshire’s average rate of improvement across the three measures is significantly better than East Ayrshire’s.

At Scottish Qualifications Authority level, the two authorities would appear from the summaries of trends over time to have performed similarly, with neither showing any “strong or consistent trends” in improving standards.

The evaluations of very good and good are less convincing seen in this light. The only fair and consistent way to judge authorities’ rates of improvement is to provide an analysis over the same periods. It is surely indefensible that HMIE has failed to do so.

The authorities’ impact on learners and their capacity for continuous improvement are not, of course, dependent on attainment data alone. The two reports consider the effectiveness of the quality improvement officer (QIO) teams although, confusingly, this is included in the section on “key outcomes and impact” in East Ayrshire, whereas in Clackmannanshire it is in two parts of the section on “leadership”. Arguably, Clackmannanshire’s QIO team is rated more highly, but this impact appears to have less force because of where it is placed in the report.

Actions the two authorities have taken to secure improvement in learning are difficult to compare. However, inconsistency in the reporting of these is again an issue. For example, East Ayrshire is commended for being aware of relatively weak performance in primary 5-14 reading and writing and for giving a high priority to these areas. In Clackmannanshire, the success of the synthetic phonics programme is recognised, with a reasonable caveat that attention is required to link the gains made in the “technical aspects of reading, in order to improve further pupils’ levels of attainment in reading 5-14”.

The cavalier manner in which HMIE has used different bases for its statistical analyses is, at the very least, careless or incompetent. The inordinate delay of seven months between the dates of inspections and the publication of the reports makes one wonder what difficulties they have had in this new pilot model of inspection. The outcome is certainly less than impressive, and is one which undermines any residual confidence I had in HMIE’s capacity to understand, far less report objectively on, the performance of education authorities.

Sandy Wilson was head of education and lifelong learning at Clackmannanshire Council.
