• By Ashish Singh
  • Thu, 17 Oct 2024 04:42 PM (IST)
  • Source: Reuters

Meta's Oversight Board disclosed two cases in which its moderators chose to leave posts on the platform, and solicited public feedback on the publication of immigration-related content that could be harmful to immigrants.

The board intends to evaluate whether Meta's decision, made under its hate speech policy, to shield refugees, migrants, immigrants, and asylum seekers only from the most serious attacks on its social media platforms is appropriate. While it operates autonomously, the board is funded by the social media giant.

It can provide Meta with non-binding policy recommendations after obtaining public feedback. The first case the board discussed concerned the Facebook page of a far-right coalition party in Poland, which published a meme in May containing a term for Black people that is widely regarded as insulting and disparaging in Poland.


The post was kept on Facebook after a human review by Meta, despite being viewed over 150,000 times, shared over 400 times, receiving more than 250 comments, and being reported 15 times by users for hate speech.

In the second instance, a photo of a blonde, blue-eyed woman raising her hand in a stop gesture was posted to a German Facebook group in June. The text said that people should leave Germany because there was no longer a need for "gang rape specialists."

After human assessment, Meta decided to keep the image up. Following the Oversight Board's concerns, Meta's policy subject matter experts re-examined both posts and determined that its initial rulings were correct.

Board co-chair and former Danish prime minister Helle Thorning-Schmidt stated, "These symbolic cases from Germany and Poland will help us determine whether Meta should be doing more and whether it is doing enough to prioritise this critical issue that matters to so many around the world."