Key stage 2 test results have now been released, and without further ado, here are the national figures:
- Pupils meeting the expected standard in reading, writing and maths: 53%
- Pupils meeting the expected standard in reading: 66%
- Pupils meeting the expected standard in writing: 74%
- Pupils meeting the expected standard in maths: 70%
- Pupils meeting the expected standard in spelling, punctuation and grammar (Spag): 72%
- Average scaled score in reading: 103
- Average scaled score in maths: 103
- Average scaled score in Spag: 104
Perhaps most surprising is the national combined figure of 53 per cent. It appears that England has fallen below its own floor standard. I suspect most people were anticipating this number would be higher, but that's the point: we had no idea what to expect. We had nothing to hang a prediction on. It is therefore vital that we remember that these results bear no relation to what's gone before.
So, compare your school's results to these figures by all means (this will form the basis of autumn performance reports, after all), but please resist comparing 2016 results with those of previous years. Ofsted has already stated that this year's RAISEonline reports and dashboards will focus on 2016 data and will not include previous years' results, which is certainly reassuring.
So what's the problem?
The issue is that back in March 2014, the Department for Education stated that the new expected standard would be broadly equivalent to a level 4b. This claim has since been repeated so often that it has come to be treated as fact, and it has inevitably led to all sorts of spurious practices.
For example, some schools have attempted to convert new assessments into "old money" and to set 2016 targets on the basis of whether or not current pupils would have been likely to achieve a secure level 4 in previous years. Local authorities have requested predictions that have been compared to previous years' results, and governors, SIPs and HMIs have been informed that results will be up or down on previous years based on such predictions.
All of this has stemmed from that misguided statement in the original primary assessment consultation that the new expected standard would be equivalent to a 4b, a level reached by 69 per cent of pupils in 2015.
So, the results are out and they are what they are. You may be above or below the national figures; you may be relieved or distraught. But please let's not compare these results with those of previous years and let’s reject attempts by others to do so. These datasets are incomparable and there needs to be an iron curtain built between them. Even if the comparison is favourable, it is for the greater good that we nip this fallacy in the bud now. If we don't, a lot of schools are going to find themselves at the sharp end of some very difficult conversations.
This is year dot. There is no comparison; there is no trend. And thankfully, even Nicky Morgan agrees about that.
James Pembroke founded Sig+, an independent school data consultancy, after 10 years working with the Learning and Skills Council and local authorities