Hit by a hangover over assessment
Now we have son of national testing: the same as before, with cosmetic changes - the downloading of test materials, the inclusion of a writing task from class work, and the arrangement of maths questions in two units rather than the previous four plus mental arithmetic.
Twelve years ago, there was widespread opposition to the introduction of national testing. Teachers, parents, education authorities and Labour politicians combined to stoke fears of pressurised pupils, a narrow curriculum and teaching to the test. Those fears are still cited by opponents of the latest developments. This time, however, they lack the support of parents, who have discovered that national tests are not a threat to children but a normal part of school life, like doing your homework or remembering your gym kit. Parents are right. National assessments are not dangerous. They are beneficial, and if we abandon them we have to find a better replacement.
Before national tests, most primary schools were ignorant of how well they were doing and didn't consider it important. They were snug under a blanket of complacency, tucked in by headteachers who accepted unquestioningly the panaceas of "the integrated day" and children working "at their own pace".
The only information provided to parents was a tick in the "satisfactory" box. The middle box was safe. Ticking upper or lower boxes could mean a request for evidence and, usually, there wasn't any. After seven years of this, no wonder secondary schools stuck to their "fresh start" approach.
But national testing has been undermined by its many problems. The 5-14 levels are too broad. The tests have been of variable quality, with teachers able to find "easy" and "harder" tests for the same level, and they lack any statistical reliability since they have not been standardised across the range of the population.
Worst of all is the way in which results are used by politicians and the media. At the national level, poverty is the main influence on results - hence the appearance of East Renfrewshire and Glasgow, respectively, at the top and bottom of most lists. In many schools, a rise or fall of a few percentage points is not worth bothering about and is likely to be reversed in the following year, or the next, depending on the abilities of pupils in individual year groups.
But the media and politicians ignore the real story of results and prefer to indulge in point scoring. Results of less than 100 per cent bring out the "could do better" headlines while the slightest rise encourages governments to claim the success of all their policies. Fortunately, parents are more sensible and understand why a school's results fluctuate from year to year.
The option of using the occasional Assessment of Achievement Programme (AAP) national samples for self-evaluation has to be a non-starter. We know what the findings will be: primary 4 will be good, primary 7 bad and secondary 2 worse, and we can happily dismiss the findings as not applying to our school - it's everyone else who's out of step.
It looks as if the new national assessment system has inherited the problems of its predecessor. However, abandoning it is not an option since schools' efforts at self-evaluation and improvement are useless without a reference point - even a flawed one - for attainment.
Hang on to some of the Christmas wine. It will comfort us as we operate our new assessments.
Brian Toner is headteacher of St John's primary in Perth.