The results of the first Scottish Survey of Achievement, intended to provide "a new, more robust" system for measuring pupil performance in key areas of 5-14, were published in June this year. The research is designed to offer a more accurate profile of attainment in English language and other core skills, including numeracy, at the P3, P5, P7 and S2 stages.
It's worth noting at the outset how the attainment profiles changed when the national tests of June 2003 were superseded by the online national assessments collated in 2004.
Reading showed fairly significant changes nationally, with decreases from P2 to P4 and increases from P6 to S2. Writing was least affected, though with trends similar to those for reading - a bit harder at levels A and B and then easier beyond, with writing continuing to be the least attained outcome in both primary and secondary.
Maths showed significant changes nationally and was the most affected in percentage terms, showing unprecedented increases in attainment at every stage from P2 to S2. The online assessments for maths clearly became easier than their predecessors at all levels, A to F.
In order to compare how the survey findings for 2005 (published in June) articulate with the attainment profile being offered by the 2004 online assessments, I have assumed that:
* Attainment of a national assessment level is comparable to being "well established" at that level in the survey.
* Numeracy is comparable to maths.
* Using data in the survey involving 28,000 pupils at the P3, P5, P7 and S2 stages is equivalent to national assessment returns involving more than 50,000 pupils for each stage.
So given these assumptions, how do the attainment profiles of the SSA survey compare with those offered by the online national assessments? Reading showed a high variance, with all four measures of attainment in the survey being significantly lower than the profile offered by the online national assessments. Writing showed relatively little variance.
Mathematics showed a fairly significant variance, with all four measures of attainment in the survey being lower than the profile offered by the online national assessments, one very significantly so.
The strongest variances from the profile offered by the online assessments, for reading and maths, were at P7 and S2.
In reading, the most critical outcome for accessing and making progress in the curriculum (and perhaps in life itself), fewer than half of P7 pupils in the SSA survey were said to be either very good or well established at level D. Yet the national assessments indicate that about 70 per cent of P7s are at level D or better: the most extreme inconsistency of all.
In numeracy, fewer than half of S2 pupils could be said to be either very good or well established; yet the national assessments indicate that about 60 per cent of S2s are at level E or better for maths.
So how can we go forward, learning from the assessments? The national assessments could be made more robust than they are at present in order to increase their credibility and reliability or, failing that, perhaps abandoned.
I am less able to comment on reading and writing but, for maths, it's clear that a snatched result in a national assessment can disguise a lack of fluency, or even inadequate coverage of that level, which can leave both the pupil and their next teacher with a problem. This becomes more acute further through the levels.
Colleagues with specific responsibility for "raising attainment", often measured by national assessments, should bear in mind the very clear tension between quantity and quality. They could give more credence to those schools and teachers that place an emphasis on quality prior to moving on to the next level, and who may be more conservative with their testing. There are many such teachers and schools, yet they can often be too easily put on the defensive when being compared to schools that may be more cavalier with testing.
It's fine for a pupil to get 50 per cent in a Higher maths exam, and be seen to pass it, since this may well be an exit exam with no other maths exam at any other time in their lives. However, it's different at the 5-14 stages: getting 40 out of 60 for a level B, where the 20 dropped marks can be in critical areas of numeracy, and then moving on to level C having "attained" level B, is not so fine.
The consequences of such progressions become more acute the further through the levels a pupil progresses. It's not uncommon to find pupils working towards level E lacking fluency in level C tables or level D fractions, which helps to explain the tendency for teachers to consolidate previous learning, as well as the inhibited progress pupils can experience at levels E and F.
It would be helpful if parents were made aware that passing a national assessment does not necessarily mean their child was competent at that level and that it could be beneficial to consolidate the quality of the level prior to their child moving on. The Parental Involvement Act could have a positive role in this.
The survey findings suggest we need to support reading more effectively in order for more pupils to attain higher levels, and the current moves to declutter the curriculum will be as good a step as any in this direction.
The findings have particular implications for secondaries, where the curriculum is highly dependent on adequate reading skills. Hopefully the trend towards more interactive teaching will alleviate the problem, while shortcomings in reading skills at S1-S2 and earlier are also addressed.
And as for numeracy, the critical skills are fluency in addition and subtraction at level B, tables at level C and fractions at level D. If we can get those sorted, then progress in numeracy becomes so much more attainable - with greater potential then for better attainment in maths.
Let's get the bairns daein it in thir heids, to complement the changes ahead with A Curriculum for Excellence.
Tom Renwick (www.mathsontrack.com) was a principal teacher and adviser in maths. He now supports local authorities, schools and parents in 5-14 maths.