Meta's Oversight Board Investigates Facebook And Instagram's Response To Deepfake Pornography

Meta’s Oversight Board is examining how Facebook and Instagram responded to two instances in which AI-generated pornographic images of public figures, including one of a well-known American woman, circulated on the platforms.

Deepfake images, especially those involving famous personalities, have become alarmingly common. Recent victims of AI-generated deepfake pornography include prominent figures like Taylor Swift and US Rep. Alexandria Ocasio-Cortez. The proliferation of such images has raised serious concerns about privacy and mental health implications for the victims.

The Oversight Board, funded by Meta but reportedly operating independently, is using these two instances of deepfakes as case studies to assess the overall effectiveness of Meta’s policies and enforcement practices. The board’s descriptions of the cases include an AI-generated image of a naked woman resembling an Indian public figure, posted on Instagram, and an AI-generated image of a naked woman resembling an American public figure, posted on Facebook.

One of the images under investigation was initially left up by Instagram, only to be removed after the board selected it for review. This image, according to the board, was posted by an Instagram account that exclusively shares AI-generated images of Indian women. The other image, which depicted an American woman, was removed by Facebook for violating its policy against “derogatory sexualized photoshops or drawings.”

Just last week, Ocasio-Cortez, 34, opened up about her own horrifying experience of discovering a fake image of herself performing a sex act. The congresswoman came across the image in February while scrolling through X in a car, as she discussed legislation with her aides.

“There’s a shock to seeing images of yourself that someone could think are real,” the Queens Democrat told Rolling Stone. “As a survivor of physical sexual assault, it adds a level of dysregulation. It resurfaces trauma, while I’m trying to … in the middle of a f—king meeting.”

The board’s investigation has sparked a wave of reactions, with many calling for more stringent controls on the sharing of fake nude images. After Taylor Swift became a victim of AI-generated pornography, her loyal fans were left questioning how the images were not considered sexual assault.

The nonconsensual sharing of digitally altered pornographic images is already illegal in several states, including Texas, Minnesota, New York, Hawaii, and Georgia. However, these laws have not stopped the circulation of such images. Even high schools in New Jersey and Florida have reported incidents of explicit deepfake images of female students being circulated by male classmates.

In response to this rising concern, Rep. Joseph Morelle (D-NY) and Rep. Tom Kean (R-NJ) reintroduced a bill that would make the nonconsensual sharing of digitally altered pornographic images a federal crime. The proposed legislation, known as the “Preventing Deepfakes of Intimate Images Act,” would allow for penalties including jail time and fines.

The bill has been referred to the House Committee on the Judiciary, which has yet to act on it. If it clears the House, it would still need approval from the Senate and the President’s signature before becoming law.
