I’ve always had a soft spot for the Luddites and I think history has been very unfair in painting them as self-serving, selfish men determined to stop the march of progress and prosperity.
The reality is that they were highly skilled people faced with ruin who understood that it would be the chief executive class who would gain most from a move to lower-quality products produced using low-skill labour and new machinery. I particularly like the fact that, just like Robin Hood, the leader, Ned Ludd, may never have existed and was said to have lived in Sherwood Forest.
Of course, viewed from several centuries later, we can see the long-term benefits of most new technologies, which reinforces the negative connotation of the term Luddite. In turn, this makes many enthusiastic for more advances, especially in areas like artificial intelligence (AI). The general theory is that we have nothing to fear. AI may destroy many jobs but it will replace them with more interesting, safer and higher-paid roles.
But while that may have been true of the past, will it hold true in the future? Will it hold true even in the short term?
Nobel Prize-winning economists Abhijit Banerjee and Esther Duflo have shown that the Luddites were not that wrong. Their jobs disappeared. Blue-collar wages in Britain halved between 1755 and 1802. It took 65 years for wages to return to 1755 levels. The economic historian Robert Fogel showed that British boys of that time were more undernourished than slaves in America.
So when we see this and listen to Professor Stephen Hawking’s chilling assertion that the development of full artificial intelligence could spell the end of the human race, should we not think harder about its implications for the education we deliver?
Kazuo Ishiguro’s latest novel, Klara and the Sun, envisages a system where elite students are “lifted” through genetic engineering. They study by themselves online because schools are not good enough for them, but then have to meet others at special events to learn to socialise. The unlifted have few prospects. Even intelligent humans, like the main character’s engineer father, find themselves “substituted” and workless thereafter. Klara, the artificial friend, provides care, even love, and is in some ways the most human and empathetic character.
Our traditional certainties have been that raising educational standards and levels is the best protection against “substitution”, leads to a good and pleasant life, and delivers social justice. We assume that technology is largely a force for good and will help us to achieve more individually. It will enrich our lives and increase our productivity. We are urged to embrace investment in technology, and in artificial intelligence as a result. At a national level, governments clearly prefer machines to people. If a college takes on a new employee, we are taxed (employers’ national insurance). If a company takes on a robot, it can claim tax relief (capital allowances).
How AI and automation could affect employment and education
When technology investments are planned, it is rare for us to see them as a way of increasing employment. The fact that overall employment has been rising is unrelated, or at best indirectly related, to those investments, and that relationship may be changing. Banerjee and Duflo found that automation in the 1990s had large negative impacts on employment (6.2 jobs lost per robot, with wages depressed). It created some high-skill jobs but wiped out many others, while increasing jobs requiring very little skill.
It is worth examining the objectives behind our own investments. In my own college, the experience of lockdown online learning means that we are looking seriously at whether we can very significantly increase group size for some parts of our programmes using technology (reducing teaching cost per student hour). We have been exploring how to use technology and algorithms to assess and distribute student support funds. We have been exploring how we use access and CCTV technologies to reduce the need for security and supervision staff and to provide us with useful information about the way students move around our premises.
Other projects have looked at whether we need to staff library areas. We have tried to move to a fully cashless college to reduce the need for cashiers. Artificial customer service popping up to help you access our websites is becoming standard. Following up low attendance through automated emails is an easy thing to program. It is easy to envisage a near future where exams are invigilated by all-seeing robots, perhaps in your own home. We are not far away from machines being able to mark essays. Increasingly we want students to self-enrol. Wherever we look we seem to be talking about labour-saving (or, more accurately, employment-reducing) expenditure. We are assuming that new employers and other organisations will take up the slack. Increasing staff numbers is not an explicit objective.
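The attendance follow-up mentioned above really is straightforward to automate. As a rough illustration only (the threshold, addresses and wording are invented for this sketch, not my college's actual system), the core logic amounts to filtering a register and drafting a message; actual sending would be handed to a mail library such as Python's smtplib:

```python
# Illustrative sketch of automated low-attendance follow-up.
# All names, the 85% threshold and the message template are
# hypothetical assumptions, not a real college system.

ATTENDANCE_THRESHOLD = 0.85  # assumed cut-off for triggering an email


def draft_follow_ups(register, threshold=ATTENDANCE_THRESHOLD):
    """Return (address, message) pairs for students below the threshold.

    `register` maps a student's email address to an attendance rate
    between 0.0 and 1.0. This function only drafts messages; dispatch
    would be handled separately (e.g. via smtplib).
    """
    drafts = []
    for address, rate in sorted(register.items()):
        if rate < threshold:
            message = (
                f"Your attendance is currently {rate:.0%}, below the "
                f"expected {threshold:.0%}. Please contact your tutor."
            )
            drafts.append((address, message))
    return drafts


# Example register: two of three students fall below the threshold.
register = {
    "a.student@example.ac.uk": 0.92,
    "b.student@example.ac.uk": 0.70,
    "c.student@example.ac.uk": 0.80,
}
drafts = draft_follow_ups(register)
```

The point is not the code itself but how little of it there is: a task that once occupied administrative staff reduces to a few lines run on a timer.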
Of course, as a charity, our aim is not profit. Productivity improvements can feed into much higher pay for those still employed, or be recycled into our core educational function in the form of expert teachers and others. We are also in the fortunate position where the need for education will increase, so overall employment in our sector might even grow.
But if the rise of the robots elsewhere means fewer jobs or reduced GDP, then the education we provide may need to change significantly. If the future sees people substituted more often, maybe the recent Skills for Jobs White Paper needs to be republished as Skills for Leisure? And maybe Ned Ludd will swap from pantomime villain to Hollywood hero at last.
Ian Pryce is principal and CEO of the Bedford College Group