
Meta oversight board: More needs to be done to fight deepfake pornography

Meta’s Oversight Board said Thursday that the company needs to do more to combat deepfake pornography on its platforms after reviewing two posts of AI-generated nude images of public figures.

Meta’s Oversight Board announced in April that it would review two cases about how Facebook and Instagram handled content containing AI-generated nude images of two famous women. The board ruled that both images violated Meta’s rule that bars “derogatory sexualized photoshop” under its bullying and harassment policy.

One of the cases involved an AI-generated nude image of an American public figure that was posted to a Facebook group for AI creations. Facebook automatically took down the photo because an earlier post of the same image had already been identified as a violation of Meta’s bullying and harassment policies. The board ruled Meta made the right decision in removing that post.

The other case concerned an AI-generated nude image made to resemble a public figure from India, which Instagram failed to review within 48 hours of the image being reported, so the report was automatically closed. The user appealed the decision to Meta, but that report was also automatically closed.

The image was removed only after the board announced it would take up the case, with Meta saying it had been left up “in error.” The board criticized Meta for not adding the photo to a database that allows for automatic removal of such images.

When the board asked why the image was not added to the database, Meta said it had relied on media reports to add the image resembling the American public figure to the database, but that there were “no such media signals” in the case involving the Indian public figure.

"This is worrying because many victims of deepfake intimate images are not in the public eye and are forced to either accept the spread of their non-consensual depictions or search for and report every instance,” the report said.

In its report Thursday, the board overturned Meta’s original decision to leave the image up.

The board, which is funded by the company and operates independently of it, said Thursday that Meta’s rules prohibiting “derogatory sexualized photoshop” are not “sufficiently clear” to its users.

The board suggested that Meta update its “derogatory sexualized photoshop” rule, including replacing the term “photoshop” with a general term for manipulated media. The board also suggested replacing the term “derogatory” with “nonconsensual.”

"The Board’s recommendations seek to make Meta’s rules on this type of content more intuitive and to make it easier for users to report non-consensual sexualized images," the board said.

Meta said it would review the board’s recommendations and provide an update if it implements them. The board can issue binding decisions about content, but its policy recommendations to Meta are non-binding, and the company has the final say.
