Fraud Alert: Your Privacy Ends Where My Glasses Begin!
Picture this. You are sitting in a café, having a conversation with a friend or quietly checking your bank balance on your phone. Across the table or maybe just a few seats behind you, someone is wearing what look like perfectly ordinary sunglasses. You pay them no attention.
 
You probably should.
 
Those glasses might be recording everything in front of them.
 
That unsettling possibility moved from science fiction to documented reality after two respected Swedish newspapers, Svenska Dagbladet (SvD) and Göteborgs-Posten (GP), published detailed investigations into how footage captured by AI-powered smart glasses is being handled behind the scenes. What they found is difficult to dismiss.
 
According to the reports, workers employed by a contractor in Kenya have been reviewing video clips recorded by Meta's Ray-Ban AI smart glasses, not out of idle curiosity, but to help train the company's artificial intelligence systems. This process, known as data annotation, involves human reviewers tagging objects, actions, and scenes so that AI can better understand the real world. It sounds clinical enough until you consider what some of those clips actually contained.
 
The Swedish investigations describe workers encountering footage of people undressing, using bathrooms, and in some cases engaging in sexual activity — all apparently unaware they were being recorded. Bank cards, personal documents, and other sensitive material reportedly appeared in some clips too. 
 
The SvD report quotes workers directly about what they saw during these reviews. These were not isolated incidents flagged as anomalies. They were part of a routine workflow.
 
And the people in those videos? In all likelihood, they had no idea.
 
A Camera You Would Never Notice
The glasses at the heart of this story are the Ray-Ban Meta smart glasses, developed by Meta in partnership with eyewear giant EssilorLuxottica. They are designed to look indistinguishable from regular sunglasses, and they largely succeed. Inside that unremarkable frame, however, sit two small cameras, microphones, open-ear speakers, a touch control panel, and a built-in AI assistant.
 
Wearers can take photos, record video, livestream to social media, listen to music, and ask the AI to describe what they are looking at in real time. Say "Hey Meta, what am I looking at?" and the system attempts to identify and explain the scene. To start recording, you press a button on the frame, which triggers a small LED that indicates the camera is active.
 
Critics have pointed out that the indicator is tiny and easy to miss, particularly from a distance or if you are not looking for it.
 
7 Million Pairs and Counting
This would matter less if the devices were rare. They are not. According to media reports citing EssilorLuxottica, more than 7mn (million) Meta smart glasses were sold in 2025 alone, compared with around 2mn units sold across 2023 and 2024 combined. That is a sharp acceleration. 
 
As the installed base grows, so does the statistical likelihood that you—or someone you know—has already appeared, unknowingly, in footage captured by one of these devices.
 
When users activate the AI assistant with a phrase like "Hey Meta, look and…", images may be uploaded to Meta's cloud systems for analysis. Meta's own support documentation confirms that images processed this way may be stored and used to improve Meta's products, with assistance from trained human reviewers. 
 
So, the pipeline from a wearer's field of vision to a reviewer's screen is not hypothetical. It exists, it is documented, and it is already operating at a large scale.
 
The Bystander Problem
Privacy experts have long worried about what they call the bystander problem, and wearable AI cameras make it significantly worse.
 
When someone holds up a smartphone to take a video, people around them generally notice. They can object, move away, or ask the person to stop. Smart glasses offer no such warning. Because they look like normal eyewear, bystanders have no obvious signal that a camera is pointed at them. The recording happens invisibly, in the middle of everyday life — in restaurants, at home, in clinics, at ATMs, in offices or at school gates.
 
Once that footage enters a cloud system for processing, it may pass through multiple layers of storage, analysis, and human review before anyone considers whether the people in it ever consented to being filmed.
 
Meta has published guidelines asking users to respect others' privacy — to inform people before recording and to avoid filming in private environments such as bathrooms, hospitals, and changing rooms. These are sensible recommendations. But they depend entirely on the goodwill and discipline of millions of individual users, which is not a particularly reliable foundation for privacy protection at scale.
 
A Ready-made Tool for Fraud
Beyond the privacy implications, there is a more immediately menacing risk: these glasses are a gift to cybercriminals.
 
A device that captures high-resolution video of everything in the wearer's field of view — discreetly, continuously, and without drawing attention — is exactly the kind of tool a fraudster would design if given the chance. In a crowded café, airport lounge, or shared office, someone wearing these glasses could quietly record laptop screens showing emails, banking dashboards, or confidential documents. Brief glimpses of a payment card, a PIN being entered, or a mobile one-time passcode (OTP) notification would be more than enough to exploit later.
 
The information gathered need not even be used immediately. Criminals could use it to build a detailed picture of a target — their habits, their contacts, their finances — before launching a precisely crafted phishing attack or impersonation scam. They already know who you bank with, what your card looks like, and perhaps even your account number. The scam practically writes itself.
 
As wearable cameras become cheaper, more capable, and more socially invisible, expect them to appear increasingly often in cybercriminal toolkits.
 
What You Can Do Right Now
None of this means you need to treat every person wearing sunglasses as a potential spy. But it does mean developing a few sensible habits.
 
Stay alert to wearable cameras. Devices that look like ordinary glasses may have cameras built into the frame. If someone nearby is tapping the frame of their glasses or quietly issuing voice commands, take note.
 
Keep sensitive information out of view in public. Bank cards, documents, passwords, and financial details should not be visible to anyone around you — whether or not they appear to be paying attention.
 
Shield your screen. In cafés, airports, trains, or any shared space, position your laptop or phone so that others cannot easily see what is on it. AI systems can read text from images with ease. Consider using a privacy screen guard or filter, which limits the viewing angle and makes it difficult for anyone sitting beside or behind you to read the display.
 
Cover your keypad. When entering a PIN or making a digital payment in public, shield the keypad and card from view as a matter of habit.
 
Ask for privacy when it matters. In your home, your clinic, or your gym, it is entirely reasonable to ask visitors to switch off recording devices. You are not being paranoid — you are being sensible.
 
Be honest if you use these devices yourself. If you own smart glasses, make it a practice to tell people before you start recording, and avoid filming anywhere that a reasonable person would consider private.
 
The Question That Will Not Go Away
Wearable cameras are not new. What is new is pairing them with artificial intelligence (AI), connecting them to cloud infrastructure, and putting them on the faces of millions of people in a form that is virtually undetectable.
 
The investigations by Svenska Dagbladet and Göteborgs-Posten have pulled back the curtain on just one part of that system — the human reviewers watching footage on the other end. There is almost certainly more we do not yet know.
 
The uncomfortable truth is this: in a world of AI-enabled wearables, your privacy no longer depends solely on what you do with your own devices. It depends on the choices made by every person around you — including the one sitting quietly across the café in a pair of perfectly ordinary-looking glasses.
 
Stay Alert, Stay Safe!
 