Good morning everyone. A while ago I published a review of these glasses, and since they get talked about on here, I thought I would publish the review here as well so people can get a good idea of their current state.
I'm open to suggestions on how I can improve my experience.
Just a heads-up before we start: I originally wrote this test in German and used an AI service to translate it into English. I checked that everything I said still carries the same sentiment as the original. So, let's begin.
The test model here is the second-generation pair, with a better camera and improved battery life. Alright then, let's get to it.
Unboxing the glasses is pretty unspectacular, so just a few words on that. In the package you'll find the glasses themselves and a glasses case that also doubles as the charging cradle. Looks good, feels solid, and not like the junk Apple tried to sell us as the "Smart Case" for the AirPods Max. Whoever designed that thing really deserved a solid public shaming.
Other than that there's nothing interesting in the box, just a pile of toilet paper no one ever reads and which you can find online anyway.
I think the design of the glasses is well done. Since Ray-Ban glasses already have rather chunky temples, the Meta version doesn't stand out that much. The only giveaways are the LED indicator, the camera next to your eyes, and the small button on the temple. There's another button for reset/power, cleverly hidden in the hinge and completely unobtrusive.
On the right side of the glasses you will find a touchpad, which is used to start and stop music, skip forward and backward, activate the AI, adjust the volume, and so on.
To charge them, you just stick the glasses into the charging case. Through the Meta AI app you can check the battery status of both the glasses and the case.
Setup is quick and doable for anyone who can basically operate a phone and press buttons. Setup happens through the Meta AI app which, at this point, is fully usable with VoiceOver. During setup you're also shown how the glasses work and what each feature does, and you have to perform these actions directly on the glasses. So yes, basically doable for anyone.
After setup I strongly recommend going through the settings. A key feature can be found under Accessibility: "Detailed responses" enables more precise answers for image-based queries to the glasses' built-in AI. Another cool feature is the integration of Be My Eyes, which can also be activated in the accessibility settings. Be aware, though, that this is just the feature for calling a sighted person, not the AI. Still, it's good to have.
But otherwise you'll find plenty of interesting stuff in the settings: camera resolution, wear detection, linking to other services, and whatever else. Just poke around a bit and configure everything the way you need it.
Now for the practical part:
I had the glasses in constant use over the past week: on the way to work, during walks, in the city, doing barn chores, and on horseback. It's obvious that these glasses are aimed at creators. Photo and video functions are quick and easy to access. A big limitation, however, is that the camera can only record in portrait mode. I genuinely don't understand why portrait is the only option. I assume they're catering to the TikTok and Instagram crowd, but an optional landscape mode for proper videos would be nice, especially now that you can record 3K video.
Sighted friends describe the video quality as very good, though I strongly recommend adjusting the stabilization beforehand. You wouldn't believe how awful a video looks that you filmed on a galloping horse without checking if stabilization is on. I recommend the automatic setting; it adjusts quickly and follows your movements well.
The glasses also function as a Bluetooth headset and connect to your phone the moment you put them on. The latency between phone and glasses is, in my experience, far too high for comfortable interaction. Maybe I'm just spoiled by AirPods, but using VoiceOver can be really tedious. For media playback, the temples contain built-in speakers. They do their job well enough, just don't expect miracles. Speech and something soft like synthwave work fine, but heavier stuff like metal is pushing it. Fair enough for an open-ear system.
The cool thing is that you can also accept calls with the glasses. The mic quality isn't really the best, but it works fine in an environment without much background noise or wind.
Something else you can do is connect the glasses to a number of services you can then interact with hands-free. These are mostly Meta services like WhatsApp or Facebook Messenger, which then allow hands-free creation of messages or calling people. If you are in a video call, you can double-tap the button on the glasses to use the glasses' camera in the call, which is really cool if you need help or want to show something.
Now to the AI, unfortunately the big weak point of these glasses. They use Meta's AI model, which can answer simple questions and describe images.
Image recognition works fairly well, though even with the extended setting, detail is still limited and lacks depth. The camera doesn't seem to have a flash either, which is a real problem now that it gets dark so early. The glasses can read signs and posters correctly, though you usually only get the info you actually need on the second question, unless you provide the context up front, like "What does the sign say?" Not a dealbreaker, but something to keep in mind.
I don't consider Meta's AI model sufficient. Yes, it can give simple answers, but as soon as you dig deeper or ask about current events, it falters. For example, I asked for information about a station I was traveling to. It answered well, even listed the correct lines serving it. But when I immediately asked where one specific line goes, it suddenly gave me a completely different line in another federal state, just with the same line number. Apparently Meta's AI only keeps context up to a point.
Another situation: I asked last weekend whether any major motorsport events were happening. The AI said no, although the MotoGP race taking place that weekend apparently didnāt count. When asked again, it first claimed no race was happening, and only on the second follow-up did it give me the correct information. The whole thing feels like the dark ages of ChatGPT 3.5.
One total fail practically disqualified the AI for me: I was walking through the stable where my lesson horse lives. I asked the AI what it saw here and there, and took a photo of one of the horses to get a more detailed description.
The AI's response, and I quote: "The horse appears to be a... oh, I can't continue answering because my response might be inappropriate." Excuse me? It's a horse. A simple brown horse that needed describing. Maybe the AI doesn't like Arabians, a sentiment I could absolutely agree with, but come on. That's unacceptable, especially since other AI apps had zero issues describing the same image.
Battery life has improved compared to the first model. With moderate use I got about 6 to 7 hours. A few minutes in the case charge the glasses quickly enough to give you nearly two more hours, depending on what you're doing.
So, what's the verdict? I'm still not sure if I'll keep the glasses. I definitely see the potential in this system, but right now it's still too limited. Meta's AI feels stuck in 2023. Web searches aren't consistently reliable, and image descriptions are often not accurate enough.
But I can see the advantages that could come with further development.
I hope my notes help someone out there.
By Datawolf, 25 March, 2026
Comments
Thank you
This review has made me say no. I was on the fence about these, but if they can't even describe things that well, how people rely on them to read printed material, food items, and cooking instructions is beyond me. I know they are not designed for that, though; they are more designed for video creation.
When will the agiga glasses be available (I know the preorders stopped), as they will be better?
This is an excellent review
When I had the Gen 1 Meta glasses, I hated them; I never had any luck getting them to describe stuff, but I think user error was a factor. I've got the Gen 2 Metas now, and am having a much more positive experience. Battery life doesn't seem any better; I listened to Audible for about 3 hours and the glasses' battery went from 98 percent to 15 percent. I guess the 8-hour battery rating doesn't figure in continuous listening like that. I've had a couple of situations where the AI gave me attitude or wasn't particularly helpful, but again, I believe that user error could very easily be a contributing factor. Now, Envision's Ally is going to be available, at least for scanning documents, and it works very well; I certainly had better luck using Ally on my Metas than I ever did when trying to use the Ally Solos glasses. I can understand where someone with a lot of needs may be disappointed with the Meta glasses, but my needs are super basic, so I'm more or less satisfied. I think it took getting the Ally Solos glasses and finding out how tedious they were to use before I fully appreciated how much better I had it with the Metas.
I wouldn't
I wouldn't get the Meta Ray-Bans, instead opting for the Meta Vanguard Oakleys: better battery life, thicker frames. The Ray-Ban frames seem aimed at fashion; they are too thin for my use case. Plus, the Oakleys can be customized to my specific tint of lenses, as I am photosensitive due to two concussions fifteen years ago. I would need to sign a contract before getting them, as I live in a facility with other people. Excellent review, however. Ray-Ban frames are not worth it for me, as I tend to break glasses at the video arcade.
Interacting with iPhone
Can one interact with Siri and iPhone functions with these glasses?
--Pete
Re: Interacting with iPhone
Not in the way you're thinking. You can still do "Hey Siri", but that's dependent on your iPhone; otherwise, when it comes to direct iPhone and Siri interactions, just think of these like a typical Bluetooth headset.
For example, you can say something like, "Hey Siri, what is the weather?" while wearing your Meta glasses, and Siri will answer, with the audio coming through the Meta glasses. But you can't say, "Hey Meta, ask Siri what the weather is," or anything like that.
You can, however, set it up through the Meta AI app, so that you can ask Meta to call someone, or text someone.
Hope this makes sense.
Re: Interacting with iPhone
@Brian
Thanks for your response. So, just to confirm, it sounds like I can interact with my iPhone via "Hey Siri" to ask about the weather, call someone who is in my contacts, etc. Alternately, it sounds like I could say "Hey Meta" to get visual descriptions or take a photo with Meta.
That sounds pretty flexible, unless I'm missing something.
I should be getting a pair of the new Oakleys in a bit and will start experimenting.
Thanks.
--Pete
You can also...
Say, "hey Meta, call (insert name)".
Or
"Hey Meta, text (insert name)".
You just have to enable that feature from the Meta AI settings.
Yes, you can activate Siri with your voice, so long as your Meta glasses and iPhone are in range. Remember, the Meta glasses connect via Bluetooth, so you can't leave your iPhone at home, skip across town, and try to use your phone with the glasses. Otherwise though, you can still use Siri voice commands, by saying "Siri" or "Hey Siri", through the microphone of your Meta glasses, all with just your voice.
I hope that makes sense.
I have also used them for a week
Hi, I know the glasses need to be improved in a lot of ways, but I haven't had the latency you mentioned; my Huawei FreeBuds have far more latency than the Meta glasses.
Something I have encountered is that my iPhone battery drains more quickly since I have been using the glasses. I don't know if it is because I find it very helpful to read my WhatsApp messages without touching the phone and respond to them, or what is happening, but my iPhone 17 Pro is almost new, and I think using the glasses is causing this problem.
@Karina Velazquez
The battery drain could possibly be from the Meta AI application. Over the past year or so, Meta has added a lot of social media crap, called "Vibes". It's a whole hell of a lot of user-submitted content, and as it is constantly refreshing in the app, I am fairly certain that it is causing the battery drain.
Interacting with Siri
@B Brian:
I finally got my Oakley glasses today and managed to set them up.
A few questions:
1. You indicated that I could say "Hey Siri" and get answers from my iPhone. I don't see how to set that up. Saying "hey Siri" does nothing.
2. I got a message saying that I had to have the Meta app running on my phone to use the glasses. Is that true? What functionality is lost if the app isn't actually running (but is installed)?
3. I saw that I could connect my Meta glasses to use Amazon Music as well as Apple Music. Are there any other connections I should make? If I could say "hey Siri" I wouldn't need to make those connections.
Maybe I set up something wrong. Any tips on best configuration?
Thanks.
--Pete
Some answers
So the companion app is called "Meta AI". Trying to use the glasses without this running, pretty much means that Meta AI will not work and/or run very poorly. For best results/performance, always run the Meta AI app if you're going to use the Meta glasses.
As for what services to connect with Meta, totally up to you. Personally, I hate Spotify being connected, because there's a weird bug with Meta in Spotify, where Spotify will randomly play on you, when you're trying to do other things. Like answer a phone call for example. It is a pain in the backside when you answer a phone call, only to have your favorite Spotify station blasting in your ears. I personally enjoy, and often use, Audible and Shazam through my Meta Smart glasses.
As for Siri, there's nothing to really setup on the glasses. If you have your iPhone setup to answer with "Siri", or "Hey Siri", then you would just do that as you always do. If you're wearing your Meta glasses at the time, Siri sounds will just come through your Meta speakers. You don't have to do any gestures on the Meta glasses. You don't have to touch the Meta touchpad, or anything like that. For example, say you are out and about in the really real world, you have your iPhone in your pocket, and your Meta glasses on your face. You want to access Siri, maybe because you want to do something that you know the Meta glasses won't do, or won't do quite as well. Like texting, I prefer texting with Siri over texting with Meta AI. Just a personal preference. Anyways, without grabbing your phone, without pressing any buttons on anything, you say something like, "Hey Siri, send a text." Then, through the speakers on your glasses, you will hear Siri activate and prompt you with something like, "To who? "
You just give a name and follow the prompts to send your text message.
I hope this makes sense to you.
Re: iPhone and glasses
@Brian
Thanks. This all seems to be working now. I found that I had to turn on the item in the accessibility / Siri settings to listen for "hey siri" even when the phone was face down. I did not have that turned on before.
One more question: In a noisy environment, when the glasses are responding, loud noises will interrupt the response as if meta is listening for another command. Is there an easy way of not having the glasses response be interrupted in such conditions, i.e., maybe somehow muting the microphone?
thanks for all of your help and suggestions.
--pete
Pete
Ah, apologies for not mentioning that iPhone setting. I have literally always had that enabled, but did not think to mention it to you. It's one of those things I'd tend to do on auto pilot these days.
As for your question about disrupting the Meta AI while it's processing a prompt, I believe you can go into the Meta AI settings for your smart glasses and disable the Respond without "Hey Meta" setting. It basically listens for prompts after you give your first prompt. Disabling it would maybe, just maybe, prevent it from being interrupted in a noisy environment. In the Meta AI application, go to your glasses settings, then navigate to the Experiences heading, then double-tap on the Meta AI button. The setting will be in there under Meta AI preferences.
Also, if you are getting a lot of text messages or notifications on your iPhone, for example, you can go into the Meta AI settings and temporarily mute notifications. I have to do this when I'm using my glasses consistently, and people are text messaging me left and right.
HTH.
Don't want to create a separate thread if you don't mind. Potent subject.
I already know I'll buy it I just wanna go to store to choose between the styles and which will make me even more broke lol.
What are your typical most useful or casual workflows? And I heard that the Aira integration is a bit of a mess but still kinda works? I'd especially use it when looking at my laptop screen and just wanting a quick, very rough idea of what's there, hands-free. Plus, I know the mics are very good. I was told it has latency with VoiceOver and audio, though? I would mainly use this at home to hear around, instead of AirPods Pro 2 with transparency mode. And so on.
And, man, I know Double Tap can talk about this for hours, but I, a blind guy, wanna have actually useful/functional and cool! glasses so, so bad :)
TheBlindGuy07
Obviously, I don't know what your daily life and daily activities are like. I do know that you are young, and I am fairly certain you are active in your daily life. With that in mind, I think you would do best with the Oakley Vanguard. They are "sport-centric", have a waterproof rating, and the camera is placed in the center, I believe just above the bridge of your nose, rather than on the left edge of the frames, as on the G1 Ray-Bans. They are also wraparound glasses, which I take to mean they are far more comfortable to wear. They also have a better battery life.
Just my two cents.
Edit: I forgot to mention they start at $499 USD.
Just got a pair of Oakley glasses
Just picked up a pair of Oakley glasses last week, the non-sports version.
So far this seems like it will be fairly useful. One big problem I am having, however, is that the glasses are so tight on the sides of my head that I can't wear them for more than a day! I took them out of town on vacation the past week and wore them 3 days in a row thinking that the soreness would go away. No luck. I can barely put them on now! I can't believe the springs are so tight on the ear pieces.
I went back to the glasses store where I got them and asked if they could be adjusted. I was told that they couldn't, because of the sensitive electronics in the ear pieces, so they couldn't do the usual trick of heating up the ear pieces and forming or bending them slightly. Also, the plastic is very hard and can't be bent.
Is this a common issue? Any way around this?
--Pete
@Brian
Don't give me too much credit.
I am 23, and like all good Gen Z, my dad would be happy if any of your statements were true, which they aren't.
I thought about the Oakleys, but first, thank you @peter for your feedback; second, I am Sikh, and because of my turban I mostly plan on using these at home anyway, so I guess it's between the Wayfarer or the Skyler?
I really don't feel like trying to find a store around to test both although I really should for the fit, so I think I'll just get the cheapest from amazon and see? Gen 2 of course.
As for the cameras on the classic Gen 2, I thought there was one above each eye, or something like that, instead of two on the left?
Love my Wayfarers!
I've had quite the adventure with Meta glasses. I've tried both Skyler and Wayfarer, and I definitely prefer the Wayfarers. At first, I was nervous that they'd be too big, or look too masculine, but honestly, I can't tell enough difference between them and the Skylers for it to matter. A part of me is a bit curious what the Oakleys would be like, but since I have blue Wayfarers now with sapphire lenses, my search for smart glasses has come to an end. The centered camera would probably be nice, but honestly, for the most part, I'm getting fairly decent results with the glasses I have, so don't feel any desire to try anything else. I think I'm even going to wait for a while when Apple Glasses come out. I'm sure they'll be great, but like with the apple watch and some other apple goodies I have, waiting a bit until some of the bugs get ironed out seems smart, especially knowing all the things my Metas have become capable of.
Polar opposite
Granted I am still using the Gen 1 Meta smart glasses, but I found the Skylar just feel better on my face over the wayfarers. I don't know if the Gen 2 are any different in design, shape, or size, but they do seem to have better color options.
Having said that, I cannot wait to try out a pair of the Oakley Vanguards.
Pete
With my Gen 1 Metas, I was able to slightly stretch out the arm pieces by very, very, very carefully opening the arms and then ever so slightly bending them outward. The arms are stiff, but there is a little bit of give to them. You have to be extremely careful though, because yes, the electronics inside the arm pieces are extremely sensitive. I have a friend who destroyed her first pair of Meta glasses by accidentally banging her head against a door frame (not intentionally), and the impact was so much that it disconnected the camera, making the glasses pretty much useless.
Regardless, I was able to get my Skylar to fit more comfortably on my face, as a result of my tampering with them.
Note that I don't actually recommend you do this, just telling you what I did with my glasses. If you do this, and you break yours, then, well, you're on your own.
Glasses comfort
I only have the wayfarers, and don't think I'll be purchasing any others any time soon. Since I am thoroughly on team Android, the only thing that might convince me would be a pair of android glasses, which Google and Samsung are said to be working on to compete with the metas. I know Samsung already came out with the more augmented reality focussed ones, but they're still supposed to be working on a pair of regular glasses similar to the metas. If that does happen, and we could have Gemini Live through the glasses, with camera access, that would be sweet. Anyway, I took my glasses to Walmart here in town, and they did ever so slightly manage to adjust them, but honestly, it didn't help a ton. So, I got an accessory kit on Amazon, with ear hooks, nose pads, stuff like that. The nose pads help, and while I had an ear hook on each side, it got to be too uncomfortable that way, as on the left side it was really digging in, so went with only one hook on the right side, and that's still enough to keep them from sliding off when I bow my head for prayer at church, or in similar situations.
As for use, I do find some use with Live AI, having it help me identify mail, read what's on a shirt, stuff like that. But honestly, when out and about, I wear them more for the look than the functionality. Sound is only meh compared to any pair of actual earbuds, and latency with just about any bluetooth headphones on talkback makes getting anything done rather difficult. So, honestly I keep them on my face, but turned off half the time. But when they come in handy, they really come in handy. I'd love to see improvements to live AI where it would have an actual video stream and be able to update you on status changes, say on a small screen, as opposed to having to tell it to look again, and again, and again, and... well, y'all get the picture.
Loving my Wayfarers
I always find it interesting reading other people's experiences with these; as I guess with everything, they can be totally different. Luckily for me, I absolutely love these since I bought them last year, and I take them out everywhere with me. Stuff like not having to rely on friends when I'm out to read a cocktail menu for me last weekend, or checking which shop I'm outside of when I'd been walking on autopilot through town the other day, makes a huge difference to me. I definitely wish there were things they did better, such as reading smaller text, or not trying to judge whether or not something is rude, especially when more often than not it isn't.
@Dan Cook
So for me, it takes long enough to get them to read everything, that honestly it's just faster and a more pleasant experience to have someone read it to me. Especially because when I'm out and about, I usually have an idea of what I want from the places I go most. Plus in noisy environments, I either have a hard time hearing the glasses, or the glasses pick up on noise around me, think I've said something when I haven't, and stop what they were reading to start with. So, while I see the value, if you are willing to fight with them, but for me it's not worth the battle when my girlfriend is right there, and happy to read the menu.
The difference is, we now have the option
I totally understand your position, and to be honest I do a bit of both. I do wish it wouldn't stop speaking when you speak while it is reading, because sometimes I want to check that what it's telling me is correct, and I can't do that this way. However, for me, the fact that I can now choose to just grab a menu, ask it for the category or item I want, and have it read it eight times out of ten, is a huge boon. It makes me really excited for the future of this technology, seeing as we are currently in its infancy.
Agreed
Yes, absolutely. And should I ever go somewhere and not have the help, it's nice knowing the glasses are there and can usually do a decent enough job. Also, at home, it's nice being able to figure out what is junk mail and what is something that actually needs a closer look. I've had it read letters to me, though usually it likes to summarize as opposed to reading word for word.