Should Minority Report technology be used in schools?

Predictive technology promises to pre-empt misbehaviour or violence in schools – but is it ethical, asks Simon Creasey
1st November 2019, 12:04am

David sits in the office, tapping the arm of the chair and staring through the window at the sports fields and the trees beyond. The headteacher begins. She explains that the advanced behaviour tracking (ABT) system has signalled that he is an immediate threat, so the school has had to call in the police.

David glances at the two young officers sitting in the corner of the room, staring right at him. He looks back to the headteacher. “ABT says there’s an 81 per cent risk that you will commit a violent act in the next three days,” she continues, her hands shaking. “There is a 67 per cent chance that act will be fatal to another student. There is an 87 per cent risk you will also do harm to yourself. This crosses all our pre-approved thresholds for action within the ABT system.”

David thinks back to the questions. He tries to work out which one gave him away. He touches the patch on his arm, and tries to work out what bits of information he unwittingly gave up. He pictures the smart cameras dotted around the school, and wonders what they saw and heard.

He looks at his mum and pushes his bag further under the headteacher’s desk with his foot. He tries to work out whether they know already that inside the bag is a knife and that today he intended to use it.

This scenario probably seems unlikely to you. Pre-crime detection is the stuff of science fiction, and even if the technology did exist to track behaviour and make predictions, the ethical and privacy concerns surrounding it would surely mean it could never find its way into schools.

And yet the tech does exist and it is already in classrooms - not to the degree in the story above, but we are further down the road to giving algorithms the power of decision making than many teachers realise. This technology is advancing fast, with the research labs of tech firms and universities combining to create tools that promise to stop the school shooter, stop the knife attack, stop the sexual assault, even stop self-harm and bullying - all before it happens.

But can those promises be relied upon? And should we permit the use of this technology in schools so we can find out?

AI predict a riot

Most people of a certain age associate pre-crime prediction technology with Steven Spielberg’s 2002 dystopian sci-fi thriller Minority Report, which is based on a short story by Philip K Dick. The film envisages a future in which criminals can be identified, arrested and sentenced by a specialist “pre-crime” police unit before they manage to commit murder. It all seemed very far-fetched at the time, but less than 20 years on, that technology is no longer complete fiction.

The most high-profile example of pre-crime detection now in operation is PredPol, a piece of software used by a number of US police departments. The predictive policing system grew out of a research project between the Los Angeles Police Department (LAPD) and the University of California, Los Angeles (UCLA), which set out to see if crime statistics could be used for more than just historical purposes.

Working with mathematicians and behavioural scientists from UCLA and Santa Clara University, the team developed a machine learning algorithm, which was further refined with input from crime analysts and officers from LAPD and the Santa Cruz (California) Police Department. Its job is to “predict where and when specific crimes are most likely to occur”. By using this information, police departments “know” where better to deploy their resources, which should, in theory, help to reduce crime rates.

It’s not too much of a stretch to imagine a similar system working in schools. Analogue versions of it are in place already, with senior leadership team members positioned at crucial locations based on a bit of cobbled-together data, teacher experience and context. A tech solution to smooth and refine the process would likely be welcomed with open arms by teachers rather than questioned because it is about locations, not individuals - and because the action is already familiar.

If you doubt the above, then you are probably unaware that more controversial technology is already in use in schools, with tools seeking out individual rather than general behaviour patterns.

In China, facial recognition software has been used to monitor the concentration level of students and “smart uniforms” have been used to track a student’s location. Predictive technology is so prevalent in Chinese schools that Lei Chaozi, director of science and technology at China’s Ministry of Education, told the BBC that the government plans to “curb and regulate” its use.

In the US, meanwhile, a report undertaken by ProPublica earlier this year found that an “aggression detector” has already been rolled out in “hundreds of schools, healthcare facilities, banks, stores and prisons worldwide”. It is essentially a microphone system in which software identifies sounds of aggression so that, in the words of the manufacturer, adults can “engage antagonistic individuals immediately, resolving the conflict before it turns into physical violence”.

Soon, schools may be able to go even further: a report in the Washington Post in August suggested that the US government was considering funding a project that looks for warning signs that a person might become a school shooter by tracking their use of technology such as Fitbits and smartwatches, and home devices such as the Amazon Echo.

In the UK, it would be easy to assume that such tools are isolated examples confined to other countries. In fact, using technology to track and make predictions about individuals is becoming commonplace here, too.

Earlier this year, the UK Home Office pledged a further £5 million to support the development of innovative predictive technology to help police forces prevent crime (it provided £4.5 million for the project’s first year, 2018-19). The National Data Analytics Solution initiative is being led by West Midlands Police, which is testing “a data analysis system that analyses large volumes of police-held data to assess the risk of someone committing a crime or becoming a victim”.

Could something as targeted as this end up in schools here? The Home Office says that the “Department for Education has no current plans to introduce this technology in schools”. The use of the word “current” suggests they are not ruling it out, though.

You would also expect a degree of data sharing between police and schools around individuals that the system flags up as being of interest. Pushing on to a school-specific tool would be a natural enough evolution.

That shouldn’t shock you: we are already using not-so-dissimilar tools, according to Elana Zeide, PULSE fellow in artificial intelligence, law and policy at UCLA School of Law.

“The most popular are tools that inform recruiting and admissions decisions and ‘early warning’ systems that monitor student activity, digital interactions and social media to detect when they may be at risk of dropping out, suffering mental health issues or exhibiting violent tendencies,” she says.

A common area of use for tracking technology in schools is safeguarding, the rise of which has coincided with the increased pressure on schools to act as the first-, second- and even third-line responder to mental health challenges among young people. It is also an area in which schools are under huge pressure not to miss anything: when incidents do occur, a school is immediately accused of failing in its duty unless it can demonstrate that robust systems were in place.

Countless safeguarding apps and solutions are being sold into schools. One of the newer systems that has emerged is AS Tracking. It is the brainchild of Jo Walker, a former behavioural, emotional and social difficulties adviser for a local education authority, and Simon Walker, who previously worked as an applied cognitive biologist.

The idea behind the tool is that a person’s actions are “steered” by “hidden” biases and that certain hidden biases - or behaviours - correlate with mental health concerns. Steer, the company that makes the tool, claims AS Tracking can reveal those hidden biases. Rather than using questionnaires, it presents pupils with certain scenarios in a computer assessment; an algorithm then tracks and analyses their responses.

“It makes visible the hidden biases that parents and teachers cannot see,” says Walker. “Children these days are very aware of what they ‘show’ and what they ‘hide’. Unless a child makes a concern visible, then adults cannot act on it.”

The company says the tool is the result of data from 15,000 pupil trials and extensive doctoral research by the founders. Walker says that in 2019-20, 150 schools will be using the company’s technology. The majority of these schools are in the UK, but the technology is also being used in five other countries.

Walker claims accuracy rates of 82 per cent in detecting early risks of self-harm, bullying and students who are not coping with pressure or anxiety. He says that in one school, AS Tracking action plans have resulted in a 20 per cent reduction in self-harm.

However, thus far the only independent evaluation of the technology has been conducted as part of a master’s dissertation. That dearth of external assessment is typical of the predictive technology going into schools, says Zeide.

“There is little information published about the efficacy of these systems,” she says. “Vendors tend to rely on anecdotal evidence and rarely subject their programmes to third-party scrutiny or auditing. This lack of transparency is problematic and many schools fail to do adequate due diligence when implementing new technologies.”

This lack of transparency does not appear to curtail adoption. Ben Williamson, chancellor’s fellow at the Centre for Research in Digital Education and the Edinburgh Futures Institute at the University of Edinburgh - and author of Big Data in Education: the digital future of learning, policy, and practice - says there has been a “surge of psychological, neuroscientific and genetic work on identifying ‘problematic’ children”.

“The whole policy trend around ‘social-emotional learning’ is based on the psychological premise that you can identify a child’s psychological personality profile and then seek to improve it,” he says. “Lots of education technology companies have moved into this space with technologies to identify those psychological qualities.”

All of these companies - including Steer - reiterate that teacher judgement is still a crucial part of the puzzle. But handing over pupil data - and at least part of the decision-making process about what that pupil might be capable of - is seemingly something with which most schools are comfortable.

That may be because teachers are already used to learning analytics software and management information systems; both are powered by pupil data and both have eased teachers into the idea of trusting a machine to make decisions about pupils. The shift to diagnosing potential mental health challenges has been a smaller one as a result.

However, some of those involved in the personalised learning space are not convinced that the door they have nudged open should be burst through by behavioural analytics tools. Indeed, Priya Lakhani, founder of one of the leading learning analytics platforms in the UK, Century Tech, questions whether the rise of these tools is sensible, plausible or welcome.

“There is little appetite in the UK to use technology to predict behaviour, like in the US, or to use facial recognition to monitor students’ attention levels, like in China,” she says. “Teachers are best placed to know their students and using technology to replace vital human functions like these risks losing the trust between schools, students and parents.

“It also fundamentally will not work, as technology has limits, risking undermining the excellent work that schools are doing using technology to improve the way that teachers teach and learners learn.”

Certainly, some rollouts of predictive technologies away from education have failed or been shown to be problematic. Earlier this year, the LA Times reported that a three-year predictive policing programme had been shut down in Palo Alto, California, because the police department “didn’t get any value out of it ... It didn’t help us solve crime”, according to a spokesperson.

Reservations have also been expressed about the accuracy of so-called “emotion recognition” technology. Supporters claim it is capable of identifying different expressions - including fear - and can predict emotions based on images of people’s faces.

However, a number of studies and tests have shown that the accuracy of the software is patchy.

This may be down to the fact that predictive technology is only as good as the data being put into it, so human error and bias can negatively impact results. A good example of this was a recent viral post about a “racist” hand dryer - the sensor would only activate for those with lighter skin (the theory being that it had only been tested on the lighter skin tones of those working in the tech industry). It prompted a rush of other examples, such as voice recognition software favouring male voices (owing to most of the people making and testing the software being male).

Data dilemma

Williamson says these are problems that schools need to be more familiar with. “We already know certain disadvantaged groups underperform compared [with] others in schools, for very complex social, economic, cultural and political reasons,” he says. “In the same way as predictive policing has led to enhanced discrimination of certain groups, we need to be alert to how predictive schooling could discriminate against those who are already most disadvantaged.”

Zeide agrees that systems that try to predict future behaviour in order to allow for early intervention can have a negative effect on students who might inaccurately be flagged as dangerous.

Williamson adds that there are huge privacy issues here, too - the catch with much of this technology is that to get more accurate results, you need to give up more data. There are a lot of questions not being asked in schools and too few answers being offered by providers: how transparent is the data collection and usage? Who owns that data and who is responsible for it? What role does the school, pupil or pupil guardian have in the collection, storage and use of the data? What right to reply, challenge or control does an individual have if they are singled out by this tech and have that information added to their data profile?

Looking at the NHS, you can see how far behind schools now are. Since the Caldicott Review in 1997, which investigated the confidentiality of patient records, every NHS organisation has had a Caldicott Guardian. This person is responsible for the ethical use of patient records, ensuring they can be shared for a patient’s direct care while protecting the individual’s reasonable expectation of privacy in those records.

In short, you would expect your data to be seen by your surgeon but not necessarily by a research analyst at Google. The Caldicott Guardian should make sure the former can use it and the latter cannot, until they can demonstrate that the duty of confidence has been met, for example by obtaining consent for all of the processing proposed (though recent reports, such as a 2016 investigation by New Scientist, suggest that even this level of scrutiny does not always prevent data from being shared). Where is the equivalent in education?

Schools are struggling enough with the legal obligations of GDPR, which are simpler than these more nuanced ethical questions.

None of the issues cited above is to say that predictive technology has no place in schools; it may well prove pivotal in protecting young people. But the potential problems have clearly not been addressed carefully enough, and these tools have managed to get into schools relatively unchallenged, or even unnoticed. You may think there is still time, that this technology has not fully hooked itself into education just yet, but Williamson argues that this would be to underestimate the impact it is having already.

“This will not entirely replace teachers’ or leaders’ intuitions,” says Williamson. “But new forms of predictive intelligence may profoundly reshape how educators engage with their students.”

Simon Creasey is a freelance journalist

This article originally appeared in the 1 November 2019 issue under the headline “Minority (school) report”
