Local authority moderation leads up and down the country have been taking the test set by the Standards and Testing Agency (STA) to check they’re up to scratch for moderation this year. Although there’s been no admission as such, it seems quite clear that the Department for Education has recognised the chaos of the 2016 teacher assessment results and tried to do something about it.
I say there’s been no admission. But rather like a toddler with chocolate smeared around his mouth, the department hardly needed to admit anything: the wild variety in teacher-assessed writing outcomes across the country last year spoke for itself.
We all knew that the new expectations for key stage 2 writing were much more demanding, yet somehow authorities such as Norfolk and North East Lincolnshire saw drops of just 4 per cent and 7 per cent in expected attainment compared with 2015, while other places such as West Sussex and Swindon saw declines of 25 per cent and 28 per cent.
You might argue that some authorities simply did a better job of preparing children for the new requirements. But if that were so, why did all four authorities see similar drops of 5-10 per cent in the grammar test, when writing judgements are now largely based on grammar and punctuation? No, it’s clear that different authorities took very different approaches.
Fearing the worst
So will a test for moderators help to resolve that problem? I fear not.
The issues we faced in 2016 were plentiful: a new framework that was delivered late; no guidance or training for moderators from the STA; a lack of clarity about what constitutes “independent writing”; different expectations about the quantity and quality of evidence required…
Few of those issues will be resolved by making moderators take a test; worse, some of them have made this year’s process even more complex. If moderators in Swindon gave overly harsh messages to their schools last year (some might argue “realistic”), then how will they ensure that a different message gets out this year? Will schools that were judged harshly in moderation in 2016 know how to adjust their expectations without the support of a moderator in 2017?
Instead of smoothing the waters, we could face a new volatility
What if the converse is true? Perhaps the local authority in North East Lincolnshire was wide of the mark in 2016 and encouraged too much generosity in assessing against the framework. Are schools that found their results positive in 2016 really going to be minded to apply the criteria much more stringently this year? Perhaps those schools that receive a moderation visit in 2017 will be drawn more closely to the expected application of the framework, but as ever, that’s only a quarter of schools.
In 2016, schools were desperate for guidance, and in many cases they clearly listened to the advice of their local authorities. It’s for that reason that I suspect we won’t see a huge discrepancy between results in moderated and unmoderated schools within each authority. I rather fear that instead of smoothing the waters, this year’s outcomes could show a new volatility, with schools visited by the newly trained moderators affected in ways that unmoderated schools are not.
You can guarantee that, whatever other advice is around, there will be heads up and down the country aiming for last year’s national average of 74 per cent. Who knows what new chaos will ensue?
Michael Tidd is deputy head at Edgewood Primary School in Nottinghamshire @MichaelT1979