South Korea imprisons a man for using artificial intelligence to create sexual images of children, a first for the country

Melissa Velasquez Loaiza

Seoul, South Korea (CNN) — A South Korean man was sentenced to prison for using artificial intelligence to generate images of child sexual exploitation, the first case of its kind in the country, as courts around the world confront the use of new technologies to create abusive sexual content.

The unidentified man, who is in his 40s, was sentenced in September to two and a half years in prison, according to the Busan District Court and the district prosecutor's office.

The man had created about 360 AI-generated images in April, prosecutors told CNN. The images were not distributed and were confiscated by police.

Prosecutors argued during the case that the definition of sexually exploitative material should include depictions of sexual behavior by "virtual humans," and not only the likenesses of real children.


The ruling showed that sexually abusive content can include images made with “high-level” technology that are realistic enough to resemble real children and minors, according to the Prosecutor’s Office.

The case comes as governments around the world confront the explosion of the AI industry, with far-reaching repercussions ranging from copyright and intellectual property to national security, personal privacy and explicit content.

Many are now rushing to regulate the technology, especially as cases like the South Korean ruling highlight how AI can be used to violate people's bodily autonomy and safety, particularly that of women and minors.

Earlier this month, police in Spain opened an investigation after images of underage girls were altered with AI to remove their clothing and circulated around town. In one of the cases, a boy had tried to extort one of the girls using a manipulated nude image of her, according to what the girl's mother told the Canal Extremadura television channel.


For years, deepfakes — highly convincing fake videos made with AI — have been used to put women's faces into pornographic videos, often violent ones, without their consent. The videos can look so real that victims struggle to prove they are not actually in them.

The problem gained prominence in February of this year, when it emerged that a well-known video game streamer had accessed deepfake videos of some of his streaming colleagues.

“From the beginning, the person who created the deepfakes used them to make pornography of women without their consent,” Samantha Cole, a journalist for Vice’s Motherboard, who has followed the trail of deepfakes since their creation, told CNN.

Streaming platform Twitch responded to the controversy by toughening its policies, calling deepfake sex videos "personally violating and beyond upsetting." Other major platforms have updated their rules in a similar way, with TikTok adding more restrictions on sharing AI deepfakes in March.

In June, the European Union became one of the first jurisdictions in the world to set rules on the use of AI by companies, followed by China in July. And in early September, some of America's biggest tech leaders — including Bill Gates, Elon Musk and Mark Zuckerberg — gathered in Washington as the Senate prepared to draft AI legislation.

™ & © 2023 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.
