Examinations - Schools pick a new fight over grading

‘Clear evidence’ turns up the heat on exam boards and watchdog
27th September 2013, 1:00am

The English GCSE grading controversy was a fight that went all the way to the High Court. On one side were teachers who claimed that their students had been robbed of the passes they deserved; on the other, exam boards and the regulator who claimed they were clamping down on grade inflation.

In the end, the judge found against the students and turned down demands for a regrade. But now - months after the case concluded - schools claim to have the “clear evidence” that would have won them the case.

Official statistics show that grading decisions overseen by regulator Ofqual led to results being significantly out of line with the previous year, with almost 18,000 students missing out on crucial C grades as a result, it has been claimed.

The findings were presented by David Blow, a data expert for the Association of School and College Leaders (ASCL), at a conference organised by the union last week.

“This shows that Ofqual and the exam boards made a mistake,” Mr Blow said. “Overall the grades were harsher than they should have been.”

The research also called into question grades awarded in this year’s GCSEs, said Mr Blow, headteacher of the Ashcombe School in Dorking, Surrey.

Thousands of students received lower than expected English grades in August 2012 and it emerged that grade boundaries had been dramatically raised between January and June.

Ofqual ruled that January’s grades had been too “generous” but that June’s had been correct and should be upheld because of the need to “maintain standards over time”.

The regulator said that when changes in the students taking English and English language GCSEs were accounted for, the percentage achieving an A*-C grade in June had fallen by just 0.3 percentage points.

Exam board figures for “matched candidates” - students for whom primary school test results were available - actually revealed results that were a little better than expected, they said.

These points were central to Ofqual’s case in December last year when the regulator and the Edexcel and AQA exam boards were taken to court by an alliance of students, schools, teaching unions and local authorities.

Now, however, more comprehensive figures have proved that the grading was significantly out of line with previous years, Mr Blow said.

Department for Education and Ofsted statistics looking at English and English language GCSEs for all matched candidates showed that the proportion achieving A*-C was 69.5 per cent, Mr Blow said. That was 3.5 percentage points down from where students’ primary test data suggested it should be if standards were being kept in line with 2011, he added.

According to these figures, 17,805 fewer students achieved an English A*-C grade than should have been the case. “That is the actual outcome and you can see starkly that English did go down,” Mr Blow said. “We were proved right in the end.”

He added that the sudden fall in English grades also called into question this year’s results. They had been graded using 2012 as a reference point and were therefore likely to be out of line with 2011 grades.

Ofqual, AQA and Edexcel said they remained confident that the 2012 grades were correct and that any discrepancies existed because the DfE figures were produced using different data.

But doubt has already been cast on the methods used by exam boards to calculate grade boundaries. Evidence presented in the court case showed that Rod Bristow, president of Pearson UK, owner of the Edexcel board, had raised concerns that the figures used by other boards were “not very reliable”.

Malcolm Trobe, deputy general secretary of ASCL, said it was important that boards were transparent about the data used.

An Ofqual spokesperson said: “We are confident that the results in 2012 and 2013 are robust and that the standards set were right. The DfE data includes solely the best result for each student by the end of key stage 4. The data used by Ofqual includes all the results for every exam entry in the summer series. Inevitably, these sets of results look different.”
