KCSIE 2023: 5 tips for online filtering and monitoring

With new guidance coming into force on how schools monitor pupils’ internet use, this safeguarding lead explains the key things that leaders should focus on
12th June 2023, 11:22am


The updated Keeping Children Safe in Education (KCSIE) guidance for September 2023 contains relatively few amendments for the new school year, but the Department for Education's update does ask schools to be clearer about how they manage their online filtering and monitoring systems.

This is an area of renewed focus for the DfE, due in part to campaigning by Judy and Andy Thomas to raise awareness of the importance of these technologies after the tragic death of their daughter, Frankie, who was able to view content online at her school unmonitored because its systems were broken.

Most schools should already have the right systems in place, and the new requirements will hopefully mirror existing best practice.

However, the requirements of KCSIE (which are linked to the DfE document Meeting digital and technology standards in schools and colleges, updated in March 2023, and which designated safeguarding leads (DSLs) would also be wise to read) mean that school leaders and DSLs should look at this area soon. What does that mean in practice?

KCSIE guidance: online monitoring and filtering systems

1. Clear responsibilities for all staff

Paragraphs 14 and 124 of KCSIE 2023 talk about the need for appropriate safeguarding training for staff, including “applicable roles and responsibilities in relation to filtering and monitoring”. 

It is likely the day-to-day management of this online system will fall to your ICT team and pastoral leaders, who will receive any concerns raised by the filtering and monitoring system. 

It is important, though, that all teachers are fully aware of how these systems work, and schools should make sure that this is part of their start-of-year safeguarding training in September.

In part this is so that they can periodically remind their students that the school's systems are in place and should be borne in mind.

In addition, they may need to consider these systems themselves. So, for instance, if a history class is doing research on Nazi Germany or a politics class is looking at drugs legislation then it would be wise for the teacher to flag this in advance before lots of seemingly worrying searches are spotted.

2. Updating DSLs’ job specifications

Paragraph 103 notes that the DSL should take lead responsibility for safeguarding, “including online safety and understanding the filtering and monitoring systems and processes in place”, and this should be included explicitly in their job description. 

This is a small but significant change, adding yet another responsibility to the role of the DSL. In schools with a team of DSLs, day-to-day oversight of this area could, in practice, be delegated to a deputy DSL.

Quite clearly, though, it is important that schools do not simply leave the filtering and monitoring systems to be overseen by the ICT team; DSLs should be able to understand, in non-technical terms, how the monitoring system works and have an input into it.

This is especially the case around the requirement that the level of filtering and monitoring that happens in school should “block harmful and inappropriate content without unreasonably impacting teaching and learning” (paragraph 142 of KCSIE).

So, ideally, there will be a different level of access for teachers compared with students, so that students cannot access inappropriate material but a teacher can, for example, show a PSHE video on extremism or sex education.

Good practice here is a regular review of the parameters to ensure that any new areas of content that may need to be taught or blocked are added or removed as required.

3. Links to other policies

Paragraph 138 adds that the school's safeguarding policy should include "appropriate filtering and monitoring on school devices and school networks", and that this should be reflected in other relevant policies, such as the ICT policy.

It would be sensible for the head of IT to have an input into these changes, and this will have an impact on a number of policies, including safeguarding, behaviour and bullying. 

It might be a good idea for schools to have a broader meeting of their pastoral leads and the head of IT to ensure that everyone fully understands what is being put in place and that it fully supports the pastoral life of the school.

4. Conduct annual reviews

Paragraph 142 sets out two more key areas for schools to consider.

The first relates to the importance of reviewing filtering and monitoring provision at least annually. 

Companies that run these systems already seem to be responding to this requirement by offering schools an annual review. Given the rapid development of new technology, particularly AI, it is especially important that school leaders remain well informed in this area.

5. Get governors involved

As with so many other areas of safeguarding now, it is specified that “governing bodies and proprietors should review the standards and discuss with IT staff and service providers what more needs to be done to support schools and colleges in meeting this standard”.

This follows the broader movement to ensure that governors are far more actively involved in the safeguarding supervision of the school.

The AI future…

Finally, while artificial intelligence software is not mentioned in KCSIE, it is clear that the new focus on filtering and monitoring software comes at a key moment in that technology’s rapid evolution. 

Over the coming year this evolution will not only transform the academic life of schools but also bring into focus new concerns about the wellbeing of students, given the capacity of AI to generate any image, or to create new content, from only a brief prompt.

A question immediately arises about who is at fault if AI generates an inappropriate image or extremist rhetoric, even when this was not intended. These are questions that will be discussed with ever more urgency over the coming months, and that, hopefully, future KCSIE updates will address.

Luke Ramsden is deputy head of an independent school and chair of trustees for the Schools Consent Project
