Ofqual investigation concludes troubled KS2 Sats were 'inconsistent'

Lack of communication meant difficulties in sorting official guidance from 'personal interpretation'

Helen Ward


An official investigation into last year's troubled KS2 Sats has revealed a long list of concerns and inconsistencies about the way the controversial assessments were run.

The report by exam and testing regulator, Ofqual, examines problems with the 2017 moderation arrangements for teacher assessments of writing and highlights difficulties in training, differences in how long schools had to prepare for moderation and variations in the way that moderators' visits were conducted.

It concludes that in 2017 the moderation - carried out by local authorities - was “more inconsistent than it could have been”.

The process is supposed to ensure that teachers’ judgements of pupils’ writing are consistent across the country.

But after changes to the way writing was assessed in 2016, unions raised concerns that the local authorities’ expectations of the assessments varied so much that it became unfair to compare schools.

As evidence of concerns about the process, Ofqual’s report cites Tes articles showing that two-thirds of moderators had failed at least one of the three training papers set by the Department for Education's Standards and Testing Agency (STA).

The watchdog combined observations of moderation visits to one school in each of 12 different local authorities with interviews with teachers, moderators and moderation managers from 17 local authorities.

It looked at the training given to moderators, the logistics of moderation and how moderation was carried out.

Ofqual said moderators were critical of the way a script was used for the centralised training delivered by the Standards and Testing Agency to two people from each local authority, and of the fact that attendees were not allowed to ask questions.

This meant that those moderation managers who had been trained and then had to train local moderators could not answer their questions.

“While some LAs largely repeated the same training that had been delivered by the STA, others expanded on this, for example by offering more explanation about how to interpret certain [assessment] framework statements,” the report states.

Sometimes guidance sent from the STA to individual moderation managers was not shared at national level, and other information was shared on social media. “It became difficult to understand which pieces of guidance were official and which were people’s personal interpretation,” the report stated.

The report also pointed out that all schools sampled for moderation by their local authority were generally notified on the same date – meaning some got two weeks’ notice and others four weeks’ notice. “On the other hand, while those who were moderated later tended to have more time to prepare for their visit, they also tended to have less time to submit any additional evidence [after the visit], if required.”

Some participants in the study told Ofqual that not all LAs required the additional evidence they asked for from schools to be submitted for moderation. Instead, some authorities allowed schools to “internally moderate ‘minor’ additional evidence”, which participants suggested risked gaming.

During the visits, all moderators read through the pieces of writing before assessing them against a checklist of requirements for pupils’ work.

But there was further inconsistency, the report found: “Some moderators went a step further, and asked teachers to describe each pupil to them before reading the work, such as whether the pupil enjoyed writing or reading, or what kind of personality the pupil had. These moderators felt that this helped them to understand the pupil as a writer, which helped them to evaluate their work. However this would suggest a departure from the [assessment] standards”.

Ofqual says its report does not provide a definitive judgement on the quality of moderation or a broad representation of national practice – but it does identify potential risks to the consistency of moderation.

“It was likely that moderators’ judgements were more inconsistent during 2017 than they could have been, and that some variations could have operated between local authorities, but it should be possible to reduce inconsistency in future years.”

Ofqual has recommended that the Standards and Testing Agency redesigns its tests for moderators and keeps the approach to assessment of writing under review.

The Standards and Testing Agency has already changed the frameworks for teacher assessments of writing for 2018 and will run small-scale pilots of comparative judgment of writing.

Ofqual said that following its research the STA has also agreed other measures, including improving its helpline and email response times and encouraging local authorities to moderate more than the 25 per cent minimum sample of schools.

A Department for Education spokesperson said: "Primary schools should not be judged on their writing data only from 2016 or 2017 - this data is only a starting point for conversations about performance. We have taken action to ensure that teacher assessment is accurate and consistent, including further training for local authority moderators and new guidance for schools."


Helen Ward is a reporter at Tes
