Meta Faces U.S. Lawsuit Over AI Smart Glasses Privacy Violations After Workers Reviewed Intimate Footage
Meta is confronting a new lawsuit in the United States over privacy violations related to its AI smart glasses, following revelations that overseas workers reviewed intimate customer footage including nudity and sexual activity.
The complaint, filed by plaintiffs Gina Bartone of New Jersey and Mateo Canu of California, alleges that Meta violated privacy laws and engaged in false advertising regarding its AI smart glasses. The public interest-focused Clarkson Law Firm represents the plaintiffs in the case.
The lawsuit stems from an investigation by Swedish newspapers that found workers at a Kenya-based subcontractor were reviewing footage from customers’ glasses, according to the complaint. The reviewed content included sensitive material such as people using the toilet, engaging in sexual activity, and other intimate moments.
According to the complaint, Meta advertised its AI smart glasses with promises like “designed for privacy, controlled by you” and “built for your privacy.” The plaintiffs believed these marketing claims and said they saw no disclaimer or information that contradicted the advertised privacy protections.
Meta claimed it was blurring faces in images, but sources cited in reports disputed that the blurring worked consistently. The revelations prompted the U.K. regulator, the Information Commissioner’s Office, to open an investigation into the matter.
The suit accuses both Meta and its glasses manufacturing partner, Luxottica of America, of conduct that violates consumer protection laws. Meta has not yet responded to requests for comment on the lawsuit.
Clarkson Law Firm, which has previously filed major lawsuits against tech giants including Apple, Google, and OpenAI, highlighted the scale of the privacy concerns. In 2025, over seven million people bought Meta’s smart glasses, according to the firm, meaning their footage enters a review pipeline with no option to opt out.
The complaint focuses heavily on Meta’s marketing approach, citing examples of advertisements that promoted the privacy benefits of the glasses and described their privacy settings as providing an “added layer of security.” One advertisement stated “You’re in control of your data and content,” explaining that smart glasses owners could choose what content to share with others.
Meta told the BBC that when people share content with Meta AI, the company uses contractors to review the information to improve users’ experience with the glasses. The company said this practice is explained in its privacy policy and pointed to Supplemental Meta Platforms Terms of Service, though it did not specify where this information was noted.
A version of Meta’s U.S. policy states “In some cases, Meta will review your interactions with AIs, including the content of your conversations with or messages to AIs, and this review may be automated or manual (human),” according to the complaint. The BBC found that a mention of human review could be located in Meta’s U.K. AI terms of service.
The case reflects broader concerns about emerging “luxury surveillance” technology, including smart glasses and always-listening AI pendants, which have prompted significant backlash from privacy advocates. The growing unease has led one developer to publish an app capable of detecting when smart glasses are nearby.
The lawsuit represents the latest challenge for Meta as the company expands its artificial intelligence and wearable technology offerings. With millions of users already adopting the smart glasses technology, the case could have significant implications for how tech companies handle user privacy in AI-powered devices.
The complaint seeks to hold Meta accountable for what the plaintiffs describe as misleading advertising about privacy protections while the company’s contractors were actively reviewing intimate customer footage without clear user consent.