Meta Platforms uses facial recognition technology to combat advertising fraud.

Meta Platforms, the parent company of Facebook, has announced plans to use facial recognition technology to combat fraud operations that exploit images of celebrities to lend advertisements false credibility, a tactic known as ‘celebrity bait ads’.
Scammers use celebrity images to attract users’ attention and entice them to click on ads that lead to suspicious sites designed to steal personal information or solicit money. Meta aims to identify these fake ads by using facial recognition to compare the faces appearing in the ads with the profile photos on the celebrities’ official Facebook and Instagram accounts.
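Meta has not published the details of its matching pipeline, but face comparison systems of this kind typically reduce each face to a numeric embedding and measure similarity between embeddings. The sketch below is purely illustrative: the toy vectors, the `is_likely_match` helper, and the 0.9 threshold are all assumptions, and the embedding model itself (in practice a deep face-recognition network) is left out entirely.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_likely_match(ad_embedding, profile_embeddings, threshold=0.9):
    """Flag the ad face if it closely matches any official profile photo.

    A hypothetical helper: real systems would also weigh ad context
    before deciding the ad is a scam, as Meta's post describes.
    """
    return any(cosine_similarity(ad_embedding, ref) >= threshold
               for ref in profile_embeddings)

# Toy three-dimensional vectors standing in for real face embeddings
profile_photos = [[0.12, 0.98, 0.05], [0.10, 0.97, 0.08]]
ad_face = [0.11, 0.99, 0.06]
print(is_likely_match(ad_face, profile_photos))  # → True
```

Note that a high similarity score alone would not prove fraud; per Meta’s description, a confirmed face match is only one signal, combined with evidence that the ad itself is a scam.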
Meta wrote in a blog post: “If we can confirm a match and that the ad is a scam, we will ban it.” However, the company did not reveal the extent of this type of fraud across its services.
With nearly 3.3 billion daily active users across its apps, Meta relies on artificial intelligence to enforce many of its content rules and guidelines. This has helped the tech giant handle the enormous volume of daily reports about unwanted content, although these automated measures have sometimes mistakenly suspended or banned legitimate accounts.
Meta is also looking to use facial recognition technology to help users who have been locked out of their accounts. In a new feature, some users who lose access will be able to submit a video of themselves, which Meta will compare with the photos on the account to verify their identity.
Previously, Meta required locked-out users to submit other forms of verification, such as ID cards or official certificates, but the company says the video option will be quicker, taking only about a minute to complete. Meta affirms that any facial recognition data generated will be deleted immediately after the comparison, whether or not a match is found.
Meta’s history with facial recognition is fraught: the company previously used the technology to identify users in uploaded photos, but faced multiple lawsuits for collecting and exploiting biometric data without users’ consent. In 2024, it agreed to pay $1.4 billion to the state of Texas to resolve one such case, having earlier agreed to pay $650 million to settle a separate lawsuit in Illinois.
According to Monika Bickert, Vice President of Content Policy at Meta, the company will not offer the video selfie test in Illinois or Texas.