So national tests are to be replaced with national assessments. With last session's first-year secondary class on the cusp of the transition, I decided to conduct a little experiment.
They were a typical mixed-ability intake, ranging from 5-14 level A to level F. I chose reading tests in order to obtain a numerical score and to allow for greater objectivity. I administered the two statutory tests of each type, with little or no time lapse between them.
Overall, compared to their results in national tests, 37.5 per cent of pupils scored one level higher in national assessments and 12.5 per cent scored two levels higher. Exactly half the class therefore appeared to have improved attainment. Only one pupil dropped a level.
Looked at level by level, the results raise other concerns. At the upper end of attainment, levels E and F, the correlation between tests and assessments appears sound, with 87.5 per cent achieving the same result. At level D, however, this drops dramatically to only 17 per cent.
At level C, it falls again, to 14 per cent. Here, then, are the pupils who appear to be improving by leaps and bounds.
The trouble is, I have a niggling concern that the only thing that "improved" was the IT skills of our probationer, who spent days navigating the Scottish Qualifications Authority website to produce the material for us.
Maybe it just goes to prove the old adage - there are lies, damned lies and national assessments.
Kensaleyre, Isle of Skye