Heads welcome plans to ensure social media firms stop 'online harm'

School leaders union calls for companies to do more to protect children from damaging material

John Roberts

New powers are being created to protect children from online harm

Headteachers have welcomed government plans to ensure social media companies protect children from accessing harmful or offensive material online.

Proposed online safety laws would mean that social media companies and tech firms were legally required to protect their users and could face tough penalties through a new regulator if they did not comply.

They would include a mandatory “duty of care” that would require companies to take reasonable steps to keep their users safe and tackle illegal and harmful activity on their services.


The measures aim to tackle violent content and material that incites violence, as well as the encouragement of suicide, disinformation and cyberbullying.

The government says it also wants to ensure children are prevented from accessing inappropriate material.

There will also be “stringent requirements for companies” to take tougher action against terrorist content and child sexual exploitation and abuse content.

Paul Whiteman, general secretary of the NAHT headteachers' union, said: “The NAHT is in favour of a statutory duty of care to guarantee that social media companies will prioritise the safety and wellbeing of children and young people, and remove content quickly that is inappropriate or harmful.

“Social media companies need to be more proactive. They need to be on the ball looking for material and have a clearer line on what is and isn’t acceptable, particularly where children and young people are concerned.

“Social media providers should take down not only illegal content but also legal material that could be harmful. These companies need to ask themselves: ‘Could this content cause harm to children or young people?’ If the answer is yes, then the content needs to come down.”

As part of the Online Harms White Paper, a joint proposal from the Department for Digital, Culture, Media and Sport and the Home Office, a new independent regulator will be introduced to ensure companies meet their responsibilities.

The government will consult on the regulator’s powers, including issuing fines, blocking access to sites and potentially imposing liability on individual members of a company’s senior management.

The Department for Education has raised concerns about the impact of social media on young people over the past year.

Earlier this year, education secretary Damian Hinds voiced concerns about the influence social media could have on his own children.

He has also said children should be taught about the dangers of social media from a young age.

'Time to do things differently'

Announcing the new plans to tackle online harm, prime minister Theresa May said: “The internet can be brilliant at connecting people across the world – but for too long these companies have not done enough to protect users, especially children and young people, from harmful content.

“That is not good enough, and it is time to do things differently. We have listened to campaigners and parents, and are putting a legal duty of care on internet companies to keep people safe.

“Online companies must start taking responsibility for their platforms, and help restore public trust in this technology.”

The plans have been welcomed by the children’s commissioner Anne Longfield.

She said: “The social media companies have spent too long ducking responsibility for the content they host online and for ensuring those using their apps are the appropriate age.

"The introduction of a statutory duty of care is very welcome and something I have long been calling for. It should now be implemented as quickly as possible so that all platforms, no matter their size, are held accountable.

“Any new regulator must have bite. Companies who fail in their responsibilities must face both significant financial penalties and a duty to publicly apologise for their actions and set out how they will prevent mistakes happening in the future.

“The internet wasn’t designed with children in mind, but they are among its biggest users. Social media platforms dominate aspects of their lives in a way that could never have been imagined 30 years ago. With this power must come responsibility – and it can’t come soon enough.”



John Roberts is North of England reporter for Tes

Find me on Twitter @JohnGRoberts
