Workers reviewing Meta Ray-Ban footage encounter users’ intimate moments

Bank details and intimate moments captured without people realizing they are being recorded are the new privacy nightmare behind the latest tech fashion hit, Meta Ray-Ban smart glasses.

Meta Ray-Ban privacy risks

A joint investigation by Svenska Dagbladet and Göteborgs-Posten found that footage and audio recorded by Meta’s Ray-Ban smart glasses are reviewed by human contractors in Kenya, including recordings containing sensitive personal material.

A contractor workforce in Nairobi

A troubling reality for tech giants is that a large part of the AI revolution is built on the labor of workers in poorer countries.

The investigation focused on Sama, a Meta subcontractor in Nairobi, Kenya, whose employees, known as data annotators, train AI systems by labeling images, video, and speech.

Thousands of workers are involved in this type of AI training work. Tasks include drawing bounding boxes, assigning object labels, checking transcriptions, and performing quality assurance to help systems interpret visual scenes and user queries.

Journalists interviewed more than thirty Sama employees at different levels. All spoke on condition of anonymity for fear of reprisals.

Interviewees described repeated exposure to highly sensitive clips. Examples include bathroom visits, people undressing, sex, pornography viewed while wearing the glasses, and bank cards visible in recordings. Some workers described seeing material that could trigger enormous scandals if leaked.

According to workers, the facility operates under strict security controls, including office cameras and restrictions on bringing recording-capable devices into the building to prevent leaks.

“You understand that it is someone’s private life you are looking at, but at the same time you are just expected to carry out the work. You are not supposed to question it. If you start asking questions, you are gone,” one worker said.

Former Meta employees said faces appearing in annotation data are automatically blurred. Data annotators in Kenya, however, said the anonymization does not always work as intended, with faces sometimes remaining visible in the material they review.

Asked how this can happen, one former Meta employee said the blurring algorithms sometimes fail, particularly in difficult lighting conditions, leaving faces and bodies visible.

Unanswered questions about where the footage goes

Journalists repeatedly asked Meta where the images reviewed by contractors originate and whether private recordings made in countries such as Sweden could end up being viewed by workers abroad.

They also asked how users are informed about the glasses, what safeguards exist to prevent sensitive material from reaching annotators, how subcontractors are audited, and how long voice and video recordings are stored.

After two months, Meta responded with a written statement from a spokesperson in London. The company did not directly address the questions, instead describing how data moves from the glasses to the user’s mobile app and referring to its AI terms of use and privacy policy.

Those policies note that content may be subject to human review. They do not specify where such reviews take place.

Lack of knowledge in stores

To assess how much eyewear store employees know about the data practices of Meta Ray-Ban smart glasses, journalists visited ten eyewear stores in Sweden.

In several cases, employees did not know what data the glasses transmit, where the information is sent, whether anything is automatically shared with Meta, or how users’ voice and video recordings are processed.

Staff in different stores also provided contradictory answers, and many said they believed all data remains locally in the app. Tests carried out during the investigation found that this was not the case.

Because sales staff lack this knowledge, customers may receive incomplete or incorrect information and remain unaware of the risks to their personal data.
