When primary school headteachers logged into the NCA Tools system to download their pupils’ Sats results back in June, they were only getting half the picture.
The data they were able to download included test scores and a summary, but it didn’t show the all-important combined results or the progress scores for each subject. Headteachers had to wait until September to access that data, by which time they’d attempted to calculate it themselves, obtained it from third parties, or paid consultants to do it for them.
When 3 September rolled around, heads had access to the rest of their data – this time logging into a different system: the Department for Education Tables Checking website. Well, I say ‘the rest of their data’, but actually, although this next data set does include combined results and progress scores in each subject – broken down by pupil characteristic groups – it is still incomplete. It does not take any successful reviews of test outcomes into account (schools aren’t notified of these until 12 September, back over on the NCA Tools website), nor does it take discounted pupils into account (because that is the purpose of the checking exercise).
And what does all of this mean? It means that the Tables Checking data is often not an accurate depiction of school performance; it also means that, yet again, schools are forced to recalculate the data themselves.
Schools need a 'useful' summary report
But hey! Analyse School Performance (ASP) is updated in October. That’ll be accurate, right? Schools log into ASP – yet another system – only to discover that: a) it only includes key stage 2 data (the rest will appear in dribs and drabs); and b) it still doesn’t take account of review outcomes or discounted pupils. And this means that Ofsted’s Inspection Data Summary Report (IDSR) – downloaded via ASP – is also inaccurate (even though you might be expecting a visit any day). So there is more pressure on schools to ensure their data is as up-to-date as possible.
In fact, the first opportunity schools get to see an accurate set of results from a DfE system is when they’re published publicly in performance tables in December, five months after pupils left the school and a full seven months after they took their Sats.
Every year I get asked what data is published when and where, and every year I have to remind myself of the schedule, the systems, and what’s included and what isn’t. Quite frankly, it’s baffling. One of the biggest favours the DfE could do for schools is to merge all these systems, get data out earlier, and provide a useful summary report.
This would reduce schools’ reliance on third parties to analyse and make sense of what is, after all, their data – and it would mean fewer passwords to remember, too, which is always welcome. I mean, it’s not like headteachers have a little black password book hidden in their office (of course not). But you could understand it if they did.
James Pembroke founded Sig+, a school data consultancy, after 10 years working with the Learning and Skills Council and local authorities. www.sigplus.co.uk