Meta has just announced plans to bring facial recognition technology back to Facebook and Instagram, this time pitched as a security measure to help combat “celebrity baiting” scams and restore access to compromised accounts.
“We know security is important, and that includes being able to monitor your social media accounts and protect yourself from scams,” the tech giant wrote in a blog post published on Monday, October 21.
Meta wants to use facial recognition technology to detect scammers who use images of public figures to carry out attacks. The company proposes to compare images from suspicious ads or accounts with legitimate photographs of celebrities. Facial recognition will also allow regular Facebook and Instagram users to regain access to their own accounts if they are blocked or hijacked: they will be able to verify their identity through video selfies that can then be compared to their profile photos. Useful, sure, but can I trust Meta with my biometric data?
The Big Tech giant promises to take a “responsible approach” that includes encrypting selfie videos for secure storage, deleting facial data as soon as it is no longer needed, and not using these details for any other purpose. However, given Meta's track record when it comes to protecting (and misusing) its users' information, I am concerned.
Facebook's parent company has repeatedly violated the privacy and trust of its users in the past.
The 2018 Cambridge Analytica scandal was probably the turning point. It shed light on how the personal information of up to 87 million Facebook users was misused to target political advertising, predominantly during Donald Trump's 2016 presidential campaign.
After that, the company implemented significant changes around protecting user data, but Meta's privacy violations have continued.
Just this year, Meta admitted to scraping all public posts from Australian Facebook users going back to 2007 to train its AI models, without offering an opt-out. The company was also hit with a significant fine (€91 million) in Europe for storing social media account passwords unencrypted, in plain text. The previous year, in January 2023, Meta received an even larger fine (€390 million) for serving personalized ads without an opt-out option and for other illicit data-handling practices.
It's certainly enough to make me skeptical of Meta's good intentions and big promises.
“I'm curious what privacy advocates think about Meta's new plan to use facial recognition to help blocked users. Getting banned from Facebook or Instagram is a big deal, but this is also a reminder that we don't have federal laws protecting our faces.” (X post, October 23, 2024: https://t.co/jvX9NIWYPu)
It's also worth noting that Meta itself decided to shut down its previous facial recognition system in 2021 over privacy concerns, promising to remove all collected “faceprints.” Now, three years later, it is back on the agenda.
“We want to help protect people and their accounts,” Meta wrote in its official announcement, “and while the adversarial nature of this space means we won't always get it right, we believe facial recognition technology can help us be faster, more accurate and more effective. We will continue to discuss our ongoing investments in this area with regulators, policymakers and other experts.”
“We won't always get it right” is not exactly reassuring. So we should simply accept that something will go wrong at some point? If that's the deal, no thanks, Meta, I don't trust you with my biometric data. I'd rather lose my Facebook or Instagram account. What's the point of solving one problem only to create an even bigger one?
What is certain is that Mark Zuckerberg doesn't need to lose sleep over EU fines just yet. Meta's facial recognition tests are not running globally: the company has excluded the UK and EU markets, where GDPR imposes strict privacy rules on the handling of personal data.
Elsewhere, Meta's tests will eventually show whether the new security feature is the right answer to the growing problem of social media scams, or whether it becomes another privacy nightmare. For the sake of my own privacy, I'm not sure it's worth finding out.