Better reporting on progression

27th February 2009 at 00:00

The Sunday Times article "Schools are failing Scots pupils" (February 7) was, depending upon perspective, either a poor or an excellent piece of reporting. Some of its readers, whose offspring attend expensive private schools, may have smiled over their Sunday morning croissants as they read the article, silently thanking it for its de facto justification of their cash going to this sector.

On the other hand, those who are aware of the inherent weakness of 5-14 assessment as a tool to measure progression from P1-S2 would have groaned at the misuse of FOI for poorly informed, sensationalist journalism.

My authority, like all others in Scotland, seeks to have detailed evidence of strengths and development needs of individuals and year groups to allow effective targeting of improvement programmes. 5-14, however, is not appropriate for this purpose.

In evaluating the usefulness of assessment information, two sets of criteria are normally applied: "validity" (the assessment measures what it is intended to measure) and "reliability" (the assessment would yield the same result, irrespective of the context in which it was applied).

5-14 assessments are not, and never were intended to be, reliable and valid tools for measuring year-group progression. First, the knowledge and competences the tests measure, although clearly defined, are broadly stated. Second, there is no external moderation of assessments, and the context in which children are assessed rightly varies.

So, while valid, 5-14 grades are not reliable: children may sometimes be reassessed when they move from school to school. They were part of an individualised assessment regime, under which pupils were assessed at the point when, in the professional opinion of the teacher, they were ready to be tested.

Unfortunately, in the absence of other measures, this regime was "adapted" in the 1990s to complement a "standards-setting" exercise led by HMIE, and the focus moved towards using gathered 5-14 assessment data to identify trends in the performance of year groups and so facilitate an analysis of progress in literacy and numeracy from P1-S2. This change in emphasis created a tension in the 5-14 assessment programme: schools were implicitly encouraged to test larger groups so that data could be gathered for national and cross-authority comparison, despite the conflict with the initial and sound advice to test pupils only when they were ready.

This tension continued until the Scottish Government, one imagines in recognition of these issues, stopped collecting 5-14 data for national and cross-authority comparison. From that point, it was up to authorities to collect and report on this data within their own areas, if they deemed it to be of value for their own use - risking the negative reporting I mentioned at the outset.

The challenge for Scottish education now is to come quickly to a national consensus on how to improve reporting on pupil and year-group progression, ensuring it is reliable and valid. Otherwise, we may simply replay annually the sad story of 5-14.

Andrew Sutherland is head of schools at East Ayrshire Council.
