Meta’s Oversight Board says deepfake policies need update and response to explicit image fell short

By KELVIN CHAN, AP Business Writer

Meta’s Oversight Board says the company failed to take down an AI-generated intimate image of an Indian female public figure that violated its policies until the board got involved.

LONDON (AP) — Meta’s policies on non-consensual deepfake images need updating, including wording that’s “not sufficiently clear,” the company’s oversight panel said Thursday in a decision on cases involving AI-generated explicit depictions of two famous women.

The quasi-independent Oversight Board said that in one of the cases, the social media giant failed to take down the deepfake intimate image of a famous Indian woman, whom it didn’t identify, until the company’s review board got involved.

Deepfake nude images of women and celebrities including Taylor Swift have proliferated on social media because the technology used to make them has become more accessible and easier to use. Online platforms have been facing pressure to do more to tackle the problem.

The board, which Meta set up in 2020 to serve as a referee for content on its platforms including Facebook and Instagram, has spent months reviewing the two cases involving AI-generated images depicting famous women, one Indian and one American. The board did not identify either woman, describing each only as a “female public figure.”

Meta said it welcomed the board’s recommendations and is reviewing them.

One case involved an “AI-manipulated image” posted on Instagram depicting a nude Indian woman shown from the back with her face visible, resembling a “female public figure.” The board said a user reported the image as pornography, but the report wasn’t reviewed within a 48-hour deadline, so it was automatically closed. The user filed an appeal to Meta, but that was also automatically closed.

It wasn’t until the user appealed to the Oversight Board that Meta decided that its original decision not to take the post down was made in error.

Meta also disabled the account that posted the images and added them to a database used to automatically detect and remove images that violate its rules.

In the second case, an AI-generated image depicting the American woman nude and being groped was posted to a Facebook group. It was automatically removed because it was already in the database. A user appealed the takedown to the board, but it upheld Meta’s decision.

The board said both images violated Meta’s ban on “derogatory sexualized photoshop” under its bullying and harassment policy.

However, it added that its policy wording wasn’t clear to users and recommended replacing the word “derogatory” with a different term like “non-consensual” and specifying that the rule covers a broad range of editing and media manipulation techniques that go beyond “photoshop.”

Deepfake nude images should also fall under community standards on “adult sexual exploitation” instead of “bullying and harassment,” it said.

When the board questioned Meta about why the Indian woman was not already in its image database, it was alarmed by the company’s response that it relied on media reports.

“This is worrying because many victims of deepfake intimate images are not in the public eye and are forced to either accept the spread of their non-consensual depictions or search for and report every instance,” the board said.

The board also said it was concerned about Meta’s “auto-closing” of appeals involving image-based sexual abuse after 48 hours, saying it “could have a significant human rights impact.”

Meta, then called Facebook, launched the Oversight Board in 2020 in response to criticism that it wasn’t moving fast enough to remove misinformation, hate speech and influence campaigns from its platforms. The board has 21 members, a multinational group that includes legal scholars, human rights experts and journalists.
