We too submitted all our papers for re-marking last year, for the second year running, and had 111 out of 205 awarded different levels. All bar one dropped, including a number falling from level 7 to level 5.
In our case a "rogue marker" was also blamed: the re-marker asserted that the first 100 scripts (which had been "moderated") were accurate and that the marking had been careless after that. Apparently this "will not happen again".
I marked test papers myself last year, dissatisfied with what had happened in our school over the previous years. At the markers' training meeting there was considerable confusion as to how some of the sample scripts had been awarded their marks.
Having successfully marked the scripts of two other schools, I have looked again at our own and I still cannot believe some of the levels awarded, even after re-marking.
There is something more fundamentally wrong with the tests than the training of markers and the moderation procedures, which were "improved" last year.
Having been involved in A-level marking for some time, and consequently fully aware of the subjective nature of English marking, I am amazed at how subjective the marks awarded to these tests really are.
Testing pupils' reading response and writing skills cannot be done in such a fundamentally crude manner. Our re-mark last year was much closer to our teacher assessments than the first marking, but I still know which one I would put first on any report, and which is of value when target-setting.
In response to the QCA's assertion that the small number of scripts submitted for re-marking must show how accurate the marking is: I know a number of schools that also felt their tests were very badly marked, but did not appeal because the process is so tedious and because they felt these results do not matter to anyone.
Head of English
Buxton community school