AI in schools: Experts warn of concerns as pupils create indecent images


As reported by The Guardian, children in British schools are using artificial intelligence (AI) to make indecent images of other children, a group of experts on child abuse and technology has warned.

They said that a number of schools were reporting for the first time that pupils were using AI image-generating technology to create images of children that legally constituted child sexual abuse material.

Emma Hardy, UK Safer Internet Centre (UKSIC) director, said the pictures were “terrifyingly” realistic.

“The quality of the images that we’re seeing is comparable to professional photos taken annually of children in schools up and down the country,” said Hardy, who is also the Internet Watch Foundation communications director.

“The photo-realistic nature of AI-generated imagery of children means sometimes the children we see are recognisable as victims of previous sexual abuse.

“Children must be warned that it can spread across the internet and end up being seen by strangers and sexual predators. The potential for abuse of this technology is terrifying,” she said.

UKSIC, a child-protection organisation, says schools need to act urgently to put in place better blocking systems against child abuse material.


“The reports we are seeing of children making these images should not come as a surprise. These types of harmful behaviours should be anticipated when new technologies, like AI generators, become more accessible to the public,” said UKSIC director David Wright.

“Children may be exploring the potential of AI image-generators without fully appreciating the harm they may be causing. Although the case numbers are small, we are in the foothills and need to see steps being taken now – before schools become overwhelmed and the problem grows,” he said.

Imagery of child sexual abuse is illegal in the UK – whether AI-generated or photographic – with even cartoon or other less realistic depictions still being illegal to make, possess and distribute.

Last month, the Internet Watch Foundation warned that AI-generated images of child sexual abuse were “threatening to overwhelm the internet”, with many now so realistic they were indistinguishable from real imagery – even to trained analysts.
