• By Ashish Singh
  • Tue, 22 Oct 2024 12:54 PM (IST)
  • Source: Reuters

Meta, the owner of Facebook, announced on Tuesday that it is testing facial recognition technology as part of a crackdown on "celeb bait" scams, three years after the company shut down its facial recognition program amid a wave of privacy and regulatory opposition.

Meta will enroll about 50,000 public figures in the trial, automatically comparing their Facebook profile photos with images used in suspected scam advertisements. If the images match and Meta judges the ads to be scams, it will block them.
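Meta has not published technical details of the matching step. The Python sketch below is purely illustrative, showing one common way such a comparison could work in principle: scoring the similarity of face embeddings from the profile photo and the ad image against a cutoff. The embedding vectors, the cosine_similarity helper, and the 0.85 threshold are hypothetical stand-ins, not anything Meta has described.

```python
# Purely illustrative: Meta has not disclosed its matching pipeline.
# The embeddings below are random stand-ins; a real system would derive
# them from a face-recognition model applied to the two images.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

MATCH_THRESHOLD = 0.85  # assumed cutoff, not a figure Meta has disclosed

def is_likely_celeb_match(profile_embedding: np.ndarray,
                          ad_embedding: np.ndarray) -> bool:
    """Flag the ad image as depicting the enrolled public figure if its
    embedding is sufficiently close to the profile photo's embedding."""
    return cosine_similarity(profile_embedding, ad_embedding) >= MATCH_THRESHOLD

# Toy usage with random stand-in embeddings
rng = np.random.default_rng(0)
profile_vec = rng.normal(size=512)
ad_vec = profile_vec + rng.normal(scale=0.1, size=512)  # near-duplicate face
print(is_likely_celeb_match(profile_vec, ad_vec))  # True for this toy pair
```

In practice, wherever the cutoff is set trades false matches against missed scam images; nothing in Meta's announcement specifies how that balance is struck.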

According to the company, the celebrities will be notified of their enrolment and can opt out if they do not want to take part. Meta said it plans to roll the trial out globally from December, excluding a few large jurisdictions where it lacks regulatory clearance, including South Korea, Britain, the European Union, and the US states of Texas and Illinois.

Speaking at a press briefing, Meta's vice president of content policy, Monika Bickert, said the company was focusing on public figures whose images it had found being used in scam ads.

"The goal is to provide them with as much safety as possible. Although customers have the option to opt-out, we want to be able to make this protection accessible and simple for them," Bickert stated.

The trial shows Meta trying to strike a balance: using potentially intrusive technology to address regulators' concerns about the growing number of scams, while limiting complaints over its handling of user data, which have dogged social media companies for years.

When Meta shut down its facial recognition system in 2021, citing "growing societal concerns," it deleted the face scan data of one billion users. In August of this year, the company paid Texas $1.4 billion to settle a state lawsuit alleging that it had unlawfully collected biometric data.

At the same time, Meta faces lawsuits alleging it has not done enough to stop celeb bait scams, in which images of celebrities, frequently generated by artificial intelligence, are used to trick users into sending money to fake investment schemes.

The company said that in the new trial it will immediately delete any face data generated by comparisons with suspected scam ads, regardless of whether it finds a scam.

Before testing began, the tool was discussed externally with regulators, policymakers, and privacy experts, and was put through Meta's "robust privacy and risk review process" internally, according to Bickert.

Meta said it also intends to test the use of facial recognition data to let non-celebrity users of Facebook and Instagram, another of its platforms, regain access to accounts they have been locked out of because of a forgotten password or a hacker's takeover.