Meta Platforms uses facial recognition technology to combat advertising fraud.

Meta Platforms, the parent company of Facebook, has announced plans to use facial recognition technology to combat scams that exploit images of celebrities to make ads appear more credible, a tactic known as "celeb-bait" ads.
Scammers rely on celebrity images to attract users' attention and prompt them to click on ads that lead to suspicious sites designed to steal personal information or solicit money. Meta aims to identify these fake ads with facial recognition technology by comparing the images in the ads with the photos on the celebrities' official Facebook and Instagram accounts.
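At a high level, this kind of check usually means converting each face into a numerical embedding and measuring how similar the vectors are. The sketch below is only a minimal illustration of that idea under those assumptions, not Meta's actual pipeline: the 128-dimensional embeddings, the cosine-similarity threshold, and the function names are all hypothetical.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def looks_like_celebrity(ad_embedding: np.ndarray,
                         profile_embeddings: list[np.ndarray],
                         threshold: float = 0.75) -> bool:
    """Flag an ad image if its face embedding is close to any embedding
    derived from the public figure's official profile photos."""
    best = max(cosine_similarity(ad_embedding, e) for e in profile_embeddings)
    return best >= threshold

# Toy usage: random vectors stand in for embeddings from a real face model.
rng = np.random.default_rng(0)
celeb_profile = [rng.normal(size=128) for _ in range(3)]
ad_face = celeb_profile[0] + rng.normal(scale=0.05, size=128)  # near-duplicate
print(looks_like_celebrity(ad_face, celeb_profile))  # True for a close match
```

Even in this simplified form, a face match alone would not be enough: as the company's statement below makes clear, an ad is acted on only when the match is confirmed and the ad is determined to be a scam.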
Meta wrote in a blog post: "If we confirm a match and that the ad is a scam, we will ban it." However, the company did not disclose the extent of this type of fraud across its services.
With nearly 3.3 billion daily active users across all its apps, Meta relies on artificial intelligence to enforce many of its content rules and guidelines. This has allowed the tech giant to better handle the huge volume of daily reports about unwanted content, although these automated measures have sometimes resulted in legitimate accounts being mistakenly suspended or banned.
Meta is also looking to use facial recognition technology to help users who have been locked out of their accounts. In a new test, some users will be able to submit a video of themselves when they lose access to their account, and Meta will compare the video with the photos on the account to verify their identity.
Previously, Meta required locked-out users to submit other forms of verification, such as ID cards or official documents, but the company says the video option will be quicker, taking only about a minute to complete. Meta confirmed that any facial recognition data generated will be deleted immediately after the verification process, whether or not a match is confirmed.
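Meta has not published any technical details of this recovery flow, so the following is only a rough sketch of the guarantee described above: compare the selfie-derived data against the account's existing photos, then delete it regardless of the outcome. Every class, function, and parameter name here is hypothetical.

```python
import numpy as np

class FacialDataStore:
    """Stand-in for wherever the selfie-derived face data would be held."""
    def __init__(self):
        self._data: dict[str, np.ndarray] = {}

    def save(self, user_id: str, embedding: np.ndarray) -> None:
        self._data[user_id] = embedding

    def delete(self, user_id: str) -> None:
        self._data.pop(user_id, None)

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_identity(user_id: str,
                    selfie_embedding: np.ndarray,
                    profile_embeddings: list[np.ndarray],
                    store: FacialDataStore,
                    threshold: float = 0.75) -> bool:
    """Compare a video-selfie embedding with the account's photos, then
    delete the stored facial data whether or not a match was found."""
    store.save(user_id, selfie_embedding)
    try:
        return any(similarity(selfie_embedding, e) >= threshold
                   for e in profile_embeddings)
    finally:
        # Mirrors Meta's stated guarantee: facial data is removed immediately
        # after the check, regardless of the outcome.
        store.delete(user_id)
```

The try/finally structure is simply one way to express the stated promise that deletion happens on every path, match or no match.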
Meta's history with facial recognition technology is complicated: it previously used the technology to identify users in uploaded photos, and it has faced multiple lawsuits over using it without users' consent. In 2024, the company agreed to pay $1.4 billion to the state of Texas in one such case, and it had previously agreed to pay $650 million to settle a separate lawsuit in Illinois.
According to Monika Bickert, Vice President of Content Policy at Meta, the company will not run the video-selfie test in the states of Illinois or Texas.