mtoto.news

AI 'Self-Generated' Sexual Imagery of 7-10-Year-Olds Increased by 360% in 2 Years, Report Finds

October 18, 2023

Although children may not be active consumers of artificial intelligence (AI), the technology poses a significant risk to children's online safety.

A new report, the Global Threat Assessment 2023, indicates that between the first half of 2020 and the first half of 2022, the Internet Watch Foundation (IWF) recorded a 360 percent increase in instances of 'self-generated' sexual imagery of 7-10-year-olds.

Drawing on new insight from risk intelligence organisation Crisp, the report reveals that individuals seeking to abuse children in these environments can lock them into high-risk grooming conversations in as little as 19 seconds after the first message, with an average time of just 45 minutes.

A key conclusion to draw from emerging evidence is that boys and girls appear to be vulnerable in different ways. For example, boys are more likely to experience financial sexual coercion than girls.

The report also showed that reports of child sexual abuse material received from across the globe increased by 87 percent between 2019 and 2022.

In 2022, the US National Center for Missing and Exploited Children (NCMEC) analysed just over 32 million reports of child sexual abuse material received from across the globe.

However, the true figure is estimated to be much higher, as much of this harm goes unreported.

The Disrupting Harm 2022 report by UNICEF and other partners revealed that as many as 20 percent of children in some countries were subjected to sexual exploitation and abuse online in the past year.

The survey was conducted across 13 countries in Eastern and Southern Africa and Southeast Asia.

Most children and young people perceive adults and peers they do not know as most likely to cause them harm or abuse them online.

However, 60 percent of online abuse cases involved a perpetrator likely known to the child, according to the Disrupting Harm report.
