‘Don’t blame an algorithm - blame the people behind it’

The use of the exam-grading algorithm for A levels and GCSEs was unethical and unjust, writes AI expert Rose Luckin
17th August 2020, 12:56pm


The results of the exam-grading algorithm are worrying, and deeply damaging for those whose marks have been downgraded, most of whom come from disadvantaged backgrounds.

No reasonable, rational person could fail to sympathise completely with the palpable despair and injustice felt by young people, their parents and teachers.

So, what went wrong? The word “algorithm” has become the mysterious, faceless villain that litters the debate and discussion around this year’s GCSE and A-level results. But what is it, and how can we make sense of its role in this fiasco?

GCSE and A-level results: the real danger

Let us set one thing straight from the start. Algorithms are not dangerous per se. The problem lies with the people who decided which algorithm to apply, what data to use and exactly how to combine the two, not with the algorithm itself.

The particular use of this exam-grading algorithm is both unethical and unjust. Both the data and the algorithm should have been much more thoroughly tested to ensure that they were fair and unbiased. 

In the interests of transparency, the use of an algorithm, and the whole process of awarding this year’s examination results, should also have been explainable to the people it affects.

A glance at the technical report released by Ofqual shows that this was not the case: the process was not explainable to the vast majority of the population, let alone to the group whose lives it affects most, the students. Furthermore, the outcome of the process is patently unfair and clearly biased.

Unethical algorithmic approach

The government has prided itself on being at the forefront of data, artificial intelligence and ethics. This is evidenced, for example, by the 2018 Data Ethics Framework and the formation of the Centre for Data Ethics and Innovation. It is therefore shameful that its own ministers are supporting the use of an unethical data-driven algorithmic approach.

Almost 40 per cent of A-level grades were downgraded from the teacher-assessed grades, with the largest differences seen among students from the lowest socioeconomic backgrounds. 

By contrast, the increase in students achieving A or A* grades compared with 2019 was much higher at independent schools (4.7 percentage points) than state comprehensives (2 percentage points). 

In the end, teacher-assessed grades played little role; the rank ordering of students was far more influential. Yet teachers have far more experience of predicting grades than of producing rank orderings, which will not have helped.

But the biggest problem with the data is the influence of historical data, which skews the results towards repeating what a school or college has achieved in prior years, rather than what a particular student has achieved or is likely to achieve in the current year. 
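To make the mechanics concrete, here is a minimal sketch in Python of the kind of standardisation described above, in which a school’s historical grade distribution is imposed on this year’s cohort via the teacher-supplied rank order. The school, the student names and the proportions are entirely hypothetical, and the code illustrates the general approach rather than Ofqual’s actual model.

```python
# Illustrative sketch only: a simplified standardisation of the kind described
# above, in which a school's historical grade distribution is imposed on this
# year's cohort via the teacher-supplied rank order. All names and numbers are
# hypothetical; this is not Ofqual's actual model.

def standardise(rank_ordered_students, historical_distribution):
    """Assign grades to students (listed best first) so that the share of each
    grade matches the school's historical distribution."""
    n = len(rank_ordered_students)
    # Convert historical proportions (best grade first) into cumulative cut-offs.
    cutoffs = []
    cumulative = 0.0
    for grade, proportion in historical_distribution:
        cumulative += proportion
        cutoffs.append((grade, round(cumulative * n)))
    # Hand out grades strictly by rank: the student's own attainment never enters.
    results = []
    for i, student in enumerate(rank_ordered_students):
        for grade, cutoff in cutoffs:
            if i < cutoff:
                results.append((student, grade))
                break
    return results

# Hypothetical school of ten students where only 20% historically achieved A* or A:
# the student ranked third is capped at a B, whatever their teacher-assessed grade.
students = ["Pupil 1", "Pupil 2", "Pupil 3", "Pupil 4", "Pupil 5",
            "Pupil 6", "Pupil 7", "Pupil 8", "Pupil 9", "Pupil 10"]
history = [("A*", 0.10), ("A", 0.10), ("B", 0.30), ("C", 0.30), ("D", 0.20)]
print(standardise(students, history))
```

Under a scheme of this shape, a strong cohort in a historically weaker school is pulled down towards that school’s past results, which is exactly the skew described above.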

 

Adjustments for the prior performance of this year’s students have been compiled from data of varying quality - for example, mock exam marks, even though the timing and the manner in which those exams were taken varied widely across schools and colleges.

But perhaps the biggest problem of all is the belief that a standardisation algorithm was the right approach in the first place.

Should protecting the number of passes and the distribution of particular grades from one year to the next really be the priority, in a year when students have already suffered so much? Surely their wellbeing and their ability to progress should have been uppermost in the minds of ministers and the examination industry?

So, we should not blame the algorithm, or indeed be too concerned about the use of an algorithmic approach.

However, this particular algorithmic method was clearly flawed, and the responsibility for that lies firmly with those in charge at Ofqual, who shunned expert assistance when it was offered.

We also need to ask to what extent government ministers and their advisers actually understand the data science that is at play here. The lack of data and artificial-intelligence literacy within the population is a worry. But to observe this deficit in Parliament and among policymakers is a catastrophe waiting to happen. In that respect, this is just the tip of an enormous iceberg.

Rose Luckin is professor of learner-centred design at UCL Knowledge Lab, and director of EDUCATE Ventures. She tweets as @Knowldgillusion
