Meta’s Oversight Board Advises It to Refine Rules on AI-Generated Adult Content

July 26, 2024
  • After a long investigation, Meta’s Oversight Board has made some suggestions regarding rules on AI-generated adult content.
  • For instance, the board wants Meta to replace the term “derogatory” with “non-consensual” when it comes to reviewing inappropriate images.
  • Meta has promised to take these suggestions into account and change the rules wherever needed.

Meta’s Oversight Board has advised it to revisit and refine its rules on how it handles AI-generated adult content.

Following an investigation in which the board reviewed two pornographic AI-generated fakes of famous women posted on Instagram and Facebook (both owned by Meta), it made several recommendations to the company:

  • Update the rule so it covers a wider range of cases. Referring only to “Photoshop,” for example, narrows the scope of the prohibition; the rule should also cover AI tools and other editing techniques.
  • Replace the term “derogatory” with “non-consensual”. “Derogatory” is highly subjective, whereas “non-consensual” covers any image shared without the person’s consent.
  • Move its policies on such images from the “Bullying & Harassment” category to the “Sexual Exploitation Community Standards” section.
  • Lastly, Meta currently prohibits non-consensual images only if they are non-commercial or produced in a private setting. The board believes these criteria should not be required for banning such content.

Meta responded that it would review the suggestions and publish updates if it makes any changes based on them.

For those who don’t know, the Oversight Board was established by Meta to review controversial moderation decisions. Although the board is funded by Meta, it operates independently. However, its recommendations are not binding; Meta can choose to accept or reject them.

The Oversight Board had previously advised Meta to work on its policies after it didn’t remove Joe Biden’s manipulated video from Facebook in early February.

About the Investigation

The investigation centered on two female public figures, one from India and one from the USA. Citing privacy concerns, the board has kept their identities anonymous.

The edited images of these women that circulated online were found to violate Meta’s rule barring “derogatory sexualized photoshop”. This falls under bullying and harassment, in which case Meta is supposed to remove the posts immediately.

That’s not what happened here. In the case of the Indian woman, several users reported the post. However, Meta failed to review it within 48 hours, so the ticket was automatically closed.

The users appealed, but the company again declined to act. It only took the case seriously after the Oversight Board picked it up for review.

In the case of the US woman, however, the post was removed immediately because it was already present in the Media Matching Service (MMS) repository – a database of images that have violated Meta’s terms of service, which makes it easier to detect similar images.

This is not a new issue with Meta. Devika Malik, a platform policy expert who previously worked in Meta’s South Asia policy team, had said earlier this year that Meta largely depends on user reports to identify and remove inappropriate content.

This is not an effective approach, especially for AI-generated images, because the burden falls on the user to prove their identity and that the image was non-consensual.

The post Meta’s Oversight Board Advises It to Refine Rules on AI-Generated Adult Content appeared first on The Tech Report.
