Meta secretly watches the videos from your Meta Ray Ban glasses

When Meta sold Ray Ban smart glasses with the promise that they were built with privacy in mind, the privacy design centered on a small LED indicator on the frame. Behind the scenes, the data pipeline routes images of homes and everyday environments to a contractor in Nairobi, Kenya, where workers review and label content to train Meta’s artificial intelligence systems.

That is the conclusion of a joint investigation by the Swedish newspapers Svenska Dagbladet and Göteborgs-Posten, based on testimony from employees of Sama, a Kenyan data-annotation company engaged by Meta to label images captured by the AI glasses.

“We see everything from living rooms to naked bodies. Meta has that kind of content in its databases,” an employee told Swedish journalists. “In some videos you see someone going to the toilet or undressing. I don’t think they know because if they did they wouldn’t record,” said another contractor.

One employee described watching footage in which a wearer placed the glasses on a bedside table. The wearer’s wife later entered the room and undressed, apparently unaware that the device had captured the moment. Other recordings reportedly included accidentally filmed bank cards, people looking at explicit content and footage of sexual activity.

Employees are expected to review the material without questioning it.

“You understand that it is someone’s private life that you are looking at, but at the same time you are expected to carry out the work. If you start asking questions, you are gone,” said a contractor.

More than seven million pairs of Ray Ban Meta smart glasses have reportedly been sold as of 2025. The devices capture first-person footage when the AI assistant is activated. Human annotators train Meta’s AI systems by labeling and categorizing objects, scenes, and interactions within images and videos. The material sent for review often includes everything the camera captured, whether the wearer intended to record it or not.

“You think if they knew about the extent of the data collection, no one would dare use the glasses,” said one annotator.

Meta’s terms of service reserve the right to perform manual human review of AI interactions. This clause provides the legal basis for sending user recordings to contractors for training and quality-control purposes. Privacy advocates say many users don’t realize the camera is recording when they activate the AI assistant, meaning sensitive footage could be captured unintentionally.

Data protection lawyer Kleanthi Sardeli warned that once the footage ends up in training systems, user control becomes limited.

“Once the material is entered into the models, the user essentially loses control over how it is used,” she said.

Meta says automated face blurring helps protect identities within training data. Employees involved in the process say the system doesn’t always work as intended. According to them, faces and bodies sometimes remain visible, especially in poor lighting conditions.

This means that people recorded without their knowledge could be identifiable to people viewing the footage in other parts of the world.

After public criticism, the company ended some moderation work and switched to computer vision annotation. That kind of annotation now includes viewing images captured by Meta’s smart glasses. The facility’s employees work under strict confidentiality agreements. Offices use surveillance cameras, and personal recording devices are prohibited. Employees say these restrictions leave them with few options if they want to report their concerns.

Internal Meta planning documents reportedly show interest in adding facial recognition capabilities to future versions of the glasses. Critics say such features could pose new privacy risks, especially if current safeguards fail to reliably obscure identities in training data.

For the millions of users who bought the glasses believing the small recording indicator protected their privacy, the investigation raises deeper questions about how AI wearables collect and process data behind the scenes.
