Author: Neil Selwyn
Publisher: Polity Press
Details: 160pp; £9.99
Just because something seems inevitable doesn’t mean that it will happen. Those of us who work in schools are told, almost weekly, by government, writers and (God help us) edupreneurs that artificial intelligence (AI) is about to transform our dull, utilitarian, suburban ways of teaching into something creative, gleaming, virtual and global.
We are regularly informed by people who never have to worry about student behaviour or school budgets that education has to be disrupted and “Uberised”.
Meanwhile, teachers will come and go, not talking of the learning platform Michelangelo, getting on with planning lessons and teaching children to read, write and master other pre-analogue skills that, with frustrating predictability, somehow manage to produce the people who seek to question their value in this machine-driven age.
We wait. And while we wait, we mark some more essays. We wait some more. An internet minute begins to resemble the director’s cut of a Samuel Beckett play. The fact is that the AI revolution has not been actualised for most teachers. Indeed, it is not yet re-booted and suited, let alone ready to work in our schools.
The problem with much of the debate around technology and schools is that those involved, rather like two disputatious neighbours, are arguing from different premises.
But, as Neil Selwyn argues in this succinct and useful primer on this evolving and complex area, the discourse between the two sides is “so completely unhinged, it’s impossible to tell what’s important and what’s not”.
Selwyn’s sympathies lie with the teachers, not the geeks. That said, he contends that “education continues to be one of the least future-focused sectors there is”, and that has to change.
But, if that change is to come, it has to be on our terms, not on the terms of those who stand to profit financially. He is surely right in arguing that, “if a developer working in the area of education and technology knows everything about technology but nothing about education, then they cannot be considered qualified to do the job”.
But, too often, they are, because policymakers are so in thrall to the misguided idea that we can “fix” something as innately imperfect as a school system that expensive errors are still repeatedly made. Like the dumbest machines, human beings keep repeating their mistakes in the belief that the same actions will produce different outcomes. They never do.
Could it be that teachers fundamentally misunderstand the designs behind the machines they are so keen to get into their classrooms? Of course, a good teacher will make excellent use of an iPad, or a laptop or even a robotic humanoid. But that is the school adapting to the technology rather than the technology being shaped by learning and the school’s priorities. Selwyn states that tech’s “implicit assumptions” about education too often do not chime with what teachers want.
More worryingly, AI could force teachers to change how they function in order to make the technology work: he describes credible scenarios of teachers having to walk in certain areas to activate sensors, changing how they speak to engage with language processors or (the horror, the horror) getting students to alter their writing style so that automated grading systems will reward them according to the criteria they have been programmed to identify. Selwyn quotes approvingly Judy Wajcman’s claim that AI could lead to “not less work but more worse jobs”.
AI, being so new to schools, brings with it a set of undefined outcomes. In a typically understated claim, Selwyn writes that “questions over what constitutes ‘harm’ in an educational context are less clear-cut than on a battlefield”. Or, to quote Paul Virilio, “when you invent the ship, you also invent the shipwreck”.
Undisciplined imaginative leaps
Use whatever imagery you want but, if you move fast and break things in schools, the casualties are often too young and human to be easily “fixed” through a software update.
This book challenges us to ask if our thinking is too one-dimensional – a replication, perhaps, of the narrow programming that is the fuel of AI rather than a reflection of the complex and often undisciplined imaginative leaps that characterise the human brain.
I found myself agreeing with Selwyn when he concludes by asking what is the point of building robots that essentially ape human form and action, doing what we already do, but faster and cheaper, and without the body odour? Surely future AI technology should aspire to something finer than being able to mark 30 essays in five seconds.
We will all benefit if teachers and technologists seek to move “outside the human perspective altogether”, to invent new ways of seeing the world. By doing so, we might add to that rich and complex process that aids, rather than hinders, learning.
David James is deputy head (academic) of Bryanston School, an independent school in Dorset