
Educators told to listen to primary pupils’ views on AI

Children are ‘excited’ about AI but want policymakers to acknowledge their concerns about issues such as biased representation, a major event on the ethics of the technology in education has heard
3rd October 2025, 2:04pm


Primary school pupils hold “unique perspectives” about artificial intelligence that must help to shape approaches to the technology, a major conference on AI and ethics in education has heard.

Delegates were also told that many older primary pupils have no experience of AI and that, of those who do, some have discovered “distressing” bias and many still prefer using traditional materials in areas such as art.

This picture emerged in a presentation from Dr Mhairi Aitken - senior ethics fellow at The Alan Turing Institute for data science and AI, which is based in London - as she addressed this week’s Lovelace-Hodgkin Symposium on AI Ethics: Responsible AI and Education, at the University of Glasgow.

Child-centred AI

Dr Aitken discussed her institute’s work on “child-centred AI” - including collaborations with the Council of Europe, the LEGO Group, the Children’s Parliament in Scotland and the Scottish AI Alliance - and emphasised the need to take more seriously children’s views and ideas in shaping ethical AI policy.

She shared findings from a UK survey of 750 children aged 8-12 that also encompassed a survey of school teachers. Only 22 per cent of these children reported using generative AI, with girls and private school pupils more likely to use it.

Dr Aitken, speaking yesterday, also noted that children with additional learning needs report “significantly higher” rates of AI use “for communication and connection”. She described this as “an area that really needs a lot more attention” and said she sees “a lot of interest in the ways that generative AI could be used to support children”.

She also referenced workshops held with primary pupils in Scotland to explore their understanding and experiences of generative AI. Children showed a strong preference for traditional art materials over generative AI, for example, citing a lack of pride in AI-generated work.

“We found an overwhelmingly strong preference for hands-on art materials compared to generative AI,” she said.

Dr Aitken said that other worries children shared included potential racial bias in AI-generated images, the environmental impact of AI and the generation of inappropriate content.

She cited one example where a girl repeatedly asked for an image of a girl with brown skin to be generated, and found it “distressing” when an image of a girl with white skin was returned each time. One delegate, speaking in the question-and-answer session that followed Dr Aitken’s presentation, insisted that the technology had moved on, making this type of scenario less likely.

Dr Aitken also identified “a real risk that inequitable access to those technologies...might exacerbate existing inequities within education”.

In workshops with children in schools, pupils “were excited about the opportunity to learn about these technologies, and excited to have the opportunity to play and experiment with generative AI”.

However, children also felt that their interests and rights were not considered in the development of AI tools.

Pupils ‘excluded’ from AI policy

Dr Aitken said that children and young people would be “most impacted by advances in AI technology, but they’re simultaneously the group who are least represented in decision making about the ways that AI is designed, developed and deployed, and are almost entirely excluded from decision making around policymaking and regulation relating to AI”. “Now I think that’s wrong and I think that needs to change,” she added.

The academic said that children have “unique perspectives and insights” on how AI is affecting their lives.

She also noted that “teachers generally reported being very optimistic, very positive, about the ways that they themselves use generative AI in their work, but simultaneously reported significant concern about students’ potential use of those tools, including concerns around plagiarism [and] cheating, as well as impacts on critical thinking”.

Dr Aitken added that this “inconsistency between how they view their own usage generally and students’ use” needs to be explored further.

After an event in February, Dr Aitken said, a children’s manifesto had been produced, with the first big message being “listen to children”, which is “really the core message of all of this work”.
