My review of the Meta Ray-Ban smart glasses

By Datawolf, 25 March, 2026

Forum
Assistive Technology

Good morning everyone. A while ago I published a review of these glasses, and since they get talked about on here, I thought I would publish the review here as well so people can get a sense of the current state of the glasses.
I'm open to suggestions on how I can improve my experience.
Just a heads-up before we start: I wrote the test myself, originally in German, and used an AI service to translate it into English. I checked that everything still carries the same sentiment as the original.
The test model here is the second-generation pair, with a better camera and improved battery life. Alright then, let's get to it.
Unboxing the glasses is pretty unspectacular, so just a few words on that. In the package you’ll find the glasses themselves and a glasses case that also doubles as the charging cradle. Looks good, feels solid, and not like the junk Apple tried to sell us as the “Smart Case” for the AirPods Max. Whoever designed that thing really deserved a solid public shaming.
Other than that there's nothing interesting in the box, just a pile of paperwork no one ever reads and which you can find online anyway.
I think the design of the glasses is well done. Since Ray-Ban glasses already have rather chunky temples, the Meta version doesn’t stand out that much. The only giveaways are the LED indicator, the camera next to your eyes, and the small button on the temple. There’s another button for reset/power, cleverly hidden in the hinge — completely unobtrusive.
On the right side of the glasses you will find a touchpad, which is used to start and stop music, skip forward and backward, activate the AI, raise or lower the volume, and so on.
To charge them, you just stick the glasses into the charging case. Through the Meta AI app you can check the battery status of both the glasses and the case.
Setup is quick and doable for anyone who can basically operate a phone and press buttons. Setup happens through the Meta AI app which, at this point, is fully usable with VoiceOver. During setup you’re also shown how the glasses work and what each feature does, and you have to perform these actions directly on the glasses. So yes, basically doable for anyone.
After setup I strongly recommend going through the settings. A key feature can be found under Accessibility: "Detailed responses" — this enables more precise answers for image-based queries to the glasses' built-in AI. Another cool feature is the Be My Eyes integration, which can also be activated in the accessibility settings. Be aware, though, that this is only the feature for calling a sighted volunteer, not the Be My Eyes AI. Still, it's good to have.
But otherwise you’ll find plenty of interesting stuff in the settings: camera resolution, wear detection, linking to other services, and whatever else. Just poke around a bit and configure everything the way you need it.
Now for the practical part:
I had the glasses in constant use over the past week — on the way to work, during walks, in the city, doing barn chores, and on horseback. It’s obvious that these glasses are aimed at creators. Photo and video functions are quick and easy to access. A big limitation, however, is that the camera can only record in portrait mode. I genuinely don’t understand why portrait is the only option. I assume they’re catering to the TikTok and Instagram crowd, but an optional landscape mode for proper videos would be nice — especially now that you can record 3K video.
Sighted friends describe the video quality as very good, though I strongly recommend adjusting the stabilization beforehand. You wouldn’t believe how awful a video looks that you filmed on a galloping horse without checking if stabilization is on. I recommend the automatic setting; it adjusts quickly and follows your movements well.
The glasses also function as a Bluetooth headset and connect to your phone the moment you put them on. The latency between phone and glasses is, in my experience, far too high for comfortable interaction. Maybe I’m just spoiled by AirPods, but using VoiceOver can be really tedious. For media playback, the temples contain built-in speakers. They do their job well enough — just don’t expect miracles. Speech and something soft like synthwave work fine, but heavier stuff like metal is pushing it. Fair enough for an open-ear system.
The cool thing is that you can also accept calls with the glasses. The mic quality isn't really the best, but it works fine if you are in an environment without much background noise or wind.
Something else you can do is connect the glasses to a number of services you can then interact with hands-free. These are mostly Meta services like WhatsApp or Facebook Messenger, which then allow hands-free composing of messages or calling people. If you are in a video call, you can double-tap the button on the glasses to switch to the glasses' camera — really cool if you need help or want to show something.
Now to the AI — unfortunately the big weak point of these glasses. They use Meta’s AI model, which can answer simple questions and describe images.
Image recognition works fairly well, though even with the detailed setting the descriptions remain limited and lack depth. The camera doesn't seem to have a flash either, which is a real problem now that it gets dark so early. The glasses can read signs and posters correctly, though you usually only get the info you actually need on the second question — unless you provide the context up front, like "What does the sign say?" Not a dealbreaker, but something to keep in mind.
I don’t consider Meta’s AI model sufficient. Yes, it can give simple answers, but as soon as you dig deeper or ask about current events, it falters. For example, I asked for information about a station I was traveling to. It answered well, even listed the correct lines serving it. But when I immediately asked where one specific line goes, it suddenly gave me a completely different line in another federal state — just with the same line number. Apparently Meta’s AI only keeps context up to a point.
Another situation: I asked last weekend whether any major motorsport events were happening. The AI said no, although the MotoGP race taking place that weekend apparently didn’t count. When asked again, it first claimed no race was happening, and only on the second follow-up did it give me the correct information. The whole thing feels like the dark ages of ChatGPT 3.5.
One total fail practically disqualified the AI for me: I was walking through the stable where my lesson horse lives. I asked the AI what it saw here and there, and took a photo of one of the horses to get a more detailed description.
The AI’s response — and I quote: “The horse appears to be a… oh, I can’t continue answering because my response might be inappropriate.” Excuse me? It’s a horse. A simple brown horse that needed describing. Maybe the AI doesn’t like Arabians — a sentiment I could absolutely agree with — but come on. That’s unacceptable, especially since other AI apps had zero issues describing the same image.
Battery life has improved compared to the first model. With moderate use I got about 6–7 hours. A few minutes in the case charge the glasses quickly enough to give you nearly two more hours, depending on what you’re doing.
So, what's the verdict? I'm still not sure if I'll keep the glasses. I definitely see the potential in this system, but right now it's still too limited. Meta's AI feels stuck in 2023: web searches aren't consistently reliable, and image descriptions are often not accurate enough.
But I can see the advantages that could come with further development.
I hope my notes help someone out there.

Comments

By Karok on Wednesday, March 25, 2026 - 04:32

This review has made me say no. I was on the fence about these, but if they can't even describe things that well, how people are relying on them to read printed material, food labels, and cooking instructions is beyond me. I know they are not designed for that, though — they are designed more for video creation.

When will the agiga glasses be available (I know the preorders stopped)? They should be better.