Have you heard of the Momo challenge? This is a WhatsApp-based cyber-bullying “game” in which players receive messages from an unknown number, including graphic images and instructions to harm themselves, culminating in an incitement to take their own lives. There’s debate about how widespread it actually is, but authorities across the world are nonetheless warning parents to be aware of it and to ensure that their children are not sucked in.
Then there’s Doki Doki Literature Club!, an anime visual novel that, on the surface, looks like a harmless role-playing game about poetry and romance. But it’s actually an incredibly dark, twisted story featuring shocking depictions of violence and suicide. Police and schools have released warnings about it after it was linked with several cases of people taking their own lives.
However up-to-date you are on the current trends, the fact is that new ones will keep popping up and enticing impressionable young people to take part. We know that the adolescent brain lacks the forward-thinking skills to fully assess the consequences of actions. We also know that students are spending more and more time engrossed in screen-based activities, whether it’s watching videos, talking in chat apps or playing games, many of which involve live discussion with other players around the world. It is our responsibility as teachers and parents to keep them safe, so how can we ensure that they’re kept away from troubling content?
The challenge of policing students' online world
Schools usually take a two-pronged approach, using county-level blocking software (which prohibits users from visiting certain websites) and delegating a member of staff (often the network manager) to check over searches that are flagged as concerns. The trouble is that, in most cases, neither is up to the job of policing the ever-shifting online world our students are engaging with.
Detection software is only as good as the intelligence that goes into it, and county-level systems are rarely as effective as they need to be. They are often easily bypassed by students, and can generate vast amounts of false positives. Terry Laing, senior leader for ICT and data management at Alsop High School in Liverpool, highlights the problem of wasting time on “checking through flagged searches where it quite obviously was school work”, including core elements of the RE and English literature curricula.
And this is a huge amount of responsibility to put on a member of staff in addition to the other demands of their role, whether that’s as a teacher, a leader or a network manager. This may have been an effective model when schools first went online in the 1990s, but the landscape has changed beyond recognition, and our response to it needs to follow suit.
For example, how would your school’s network manager react to a search for the term “daijobu”? If they don’t happen to be fluent in Japanese, they would most likely Google it and find that it translates as “I’m OK”. But, in fact, it’s an expression that’s used extensively by the Doki Doki Literature Club! community, and it’s a cry for help. In this context, it means the exact opposite of its literal meaning: I am not OK. But that level of in-depth knowledge is – completely understandably – rare among the people responsible for checking flagged searches in schools.
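To see why literal keyword matching falls short, consider a minimal sketch of the kind of naive term-list filter the article describes. This is purely illustrative: the term list and function are hypothetical, not taken from any real school system or from eSafe's product.

```python
# Hypothetical sketch of a naive keyword filter, like the basic
# blocking systems described above. Terms are illustrative only.
FLAGGED_TERMS = {"momo challenge", "self harm"}

def is_flagged(search: str) -> bool:
    """Flag a search if it contains any term from the static list."""
    s = search.lower()
    return any(term in s for term in FLAGGED_TERMS)

print(is_flagged("Momo challenge tips"))  # True - literal match found
print(is_flagged("daijobu"))              # False - the filter sees only
                                          # a harmless Japanese word
```

A static list catches the literal phrases it was given, but it cannot read context: the coded cry for help passes straight through, which is exactly the gap that trained human analysts are meant to close.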
So what can you do to stay on top of these constantly shifting online safeguarding risks? Schools are increasingly turning to independent, qualified services to assist their safeguarding efforts, in the knowledge that, aside from the huge demands monitoring makes on staff time, checking search histories, monitoring emails and blocking specific websites isn’t enough. How effective is the time spent trying to see what our students are looking at, if we don’t know what we’re looking for?
From YouTube revision videos, to subject study guides, to exemplar essays, the internet is a treasure trove for students keen to engage in independent study or just to explore areas that interest them.
Identifying safeguarding issues online
Teachers or network managers tasked with wading through all of this data to check for safeguarding concerns will always have the students’ best interests at the forefront of their minds, but it’s crucial that they have the skills needed to identify the markers that indicate intervention is needed.
Through using eSafe, an outsourced monitoring service, Laing says his school now has the support of trained specialists who review and assess all incidents identified by the detection software to determine whether they are genuine or not, eliminating “false positives” and allowing him to focus resources on those pupils who are potentially in need of intervention.
“Using a team whose full-time job is to understand these issues and stay up-to-the-minute on the latest developments, I can be confident we are spotting the markers that previously might have been missed,” says Laing.
eSafe employs a large team of dedicated specialist behaviour analysts, who review safeguarding incidents all day, every day. They possess the language and the cultural expertise necessary to interpret behaviour markers generated by young people from culturally diverse backgrounds; they are educated to at least degree level in a relevant subject, such as psychology, and have previously worked supporting young people and adults.
Crucially, they have the experience and training to recognise content that could be harmful, allowing you to understand the severity of a threat and intervene early. They’re assisted by a state-of-the-art detection system that, unlike the standard-issue blockers, includes a dynamic library of markers of current and emerging behaviour trends, which is constantly updated to ensure its accuracy.
And the impact of this kind of monitoring can be huge, says Laing, whose school has outsourced the monitoring of its digital environment to eSafe for several years. “Not only has it helped with flagging students engaging in risky behaviour,” he says. “It’s also helped with identifying bullying problems, and students struggling with their mental health. Parents are relieved that we are monitoring their children in this way.”
Keeping young people safe in an ever-evolving online world can be daunting, but it doesn’t need to be. If you let the experts manage the monitoring, you can get on with helping the students who need it.