Ray-Ban Meta Wayfarer Smart Glasses Unreview

By Unregistered User (not verified), 27 February, 2024

Forum
Apple Hardware and Compatible Accessories

I'm not good at writing reviews, so I'm not going to even try. I just got these delivered an hour ago and am about to dive in. I wanted to start a new thread where I could comment, and hopefully others will too. Questions are welcome.

They cost £329 in the UK. They arrived two days early.

I'm living alone right now, so I will be attempting the unboxing and setup with just my iPhone and Be My Eyes/AI.

Comments

By Brooke on Friday, May 3, 2024 - 16:18

With WhatsApp video calls... I'm guessing the call has to be started on the phone, then transferred to the glasses? I've done this successfully. Just wanted to make sure I can't actually initiate the video call from the glasses.

By Andy Lane on Friday, May 3, 2024 - 16:18

This can be done by starting a voice call on your glasses then double pressing the button on the top of the right arm to switch to a video call.

By Brooke on Friday, May 3, 2024 - 16:18

Will give that a try!

By MarkSarch on Friday, May 3, 2024 - 16:18

You don't actually need the phone to make a video call; you can do it directly from the glasses without touching the phone.
First, you have to know where the capture button is located on the right temple.
You can try the following to see what works for you, keeping your phone in your pocket:
Start by asking Meta to make a WhatsApp voice call to any of your contacts.
Once your call has been answered, press the capture button twice and you will hear the synthesizer mention something about the camera. The other person only needs to accept the video call request, and that's it.

By Andy Lane on Friday, May 3, 2024 - 16:18

Just start a call then double press that button and it will start the video on the glasses. I’ve used it to make a 15 minute call which used 24% of the battery which should hopefully mean around an hour of video calling from the glasses without recharging.

By Stephen on Friday, May 3, 2024 - 16:18

Honestly, I loved our back-and-forth banter! I just removed it because there were some ridiculously disrespectful people in the comment section whom I don't want to respond to. I'm really not interested in entertaining drama. It's because of some people that we just can't have nice things lol.

By Matt D on Friday, May 3, 2024 - 16:18

So, a weird thing happened for me this past Saturday. We were out at a local pub and there was a paper copy of the specials. I said, "Hey Meta, look and read these specials to me," and to my astonishment it read the first 3 paragraphs verbatim, and then stopped. It did this every time, which indicates to me a potential for real-time OCR. I did ask it as a follow-up to just tell me Saturday's specials, and of course it summarized them for me, but what was most interesting was that it read those first few paragraphs exactly as they were written. I wish I could figure out what made it decide to go in that direction. FYI, I did have my sighted wife confirm that what it read matched the print as written.

By Ollie on Friday, May 3, 2024 - 16:18

I'm an engineer; I build things, and I know how things are built. I'll always demand better. They are fine, and they will get better, but for the moment they just about work some of the time for the tasks I give them, though I am likely a fringe case of a fringe case, asking for filament and 3D printing parts to be identified.

I think the issue here, and why Seeing AI blows Meta AI out of the water when it comes to OCR, is that Seeing AI is built to do OCR, whereas we're trying to get Meta to do something that isn't really its central purpose. It's like trying to cut a sandwich with a chainsaw. I'm really glad that so many of you are happy with your purchase, and don't let my expectations sour that. I'm mildly impressed by them, but that, as I've said before, does not have to be your view, nor would I expect it to be. I'm writing this so others considering a purchase don't buy them and end up let down. Your phone will do a better job of reading items. Your phone will do a better job of describing scenes. What Meta will do is describe something rapidly and briefly, and, when it comes to written information, give you either a summary or something inaccurate. I don't yet have a need for them. I've got them, and I'm glad I have, for when I need WhatsApp help from friends and family, but the audio is better via AirPods.

They will improve, and I'm glad I've bought them, but they are not the killer device we need, not yet, and not for a long while. Meta has historically not been great about accessibility; just look at web compliance on Facebook and Instagram.

One last thing, and this is back to OCR. I often have to find my post in the post room downstairs, as the delivery people helpfully dump it there... The joy! I can't use my Meta Ray-Bans for this for one simple reason: no Wi-Fi or mobile signal in the post room. It's something to bear in mind. Meta is useless without signal; Seeing AI, at least as far as I understand it, does on-phone processing. This won't ever change with Meta: they want your data, that's what they sell. Now, if Apple comes on the scene with on-device processing, that will be different.

I hope this clarifies my points. Maybe we'll get up to 200 comments soon. Funny, it's not even about an Apple product. Surprised the usual suspects haven't popped up to bemoan the non-Appleness of this thread!

By Stephen on Friday, May 3, 2024 - 16:18

Thanks. I thought it was pretty good myself... Not to toot my own horn or anything lol. Honestly, I don't mind constructive criticism or constructive feedback. What I do mind is when people are bored and just start attacking others online for no apparent reason. You can still get your point across without being something that I probably shouldn't say on this forum. I don't tolerate it whether it's directed at me or at other people replying to comments. I don't put that negative energy anywhere near me 😊.

By Ollie on Friday, May 3, 2024 - 16:18

I'm just going to rock out with a pair of Ray-bans, Seleste atop that, and a head mounted phone atop that. Covers all bases.

By Brooke on Friday, May 10, 2024 - 16:18

I'm curious, because I enjoy FB groups. Is there one relating to the glasses? I guess I could go and look it up instead of asking here, but... I'll post this anyway.

By mr grieves on Friday, May 24, 2024 - 16:18

So a few times I was out and about and opened Voice Vista whilst wearing the glasses. It did its thing as I would expect, but then every time I tried to say "Hey Meta" it would start playing my audiobook (from Easy Reader).

When I got back to base I was playing with other apps instead of Voice Vista and they seemed fine. But then I tried Voice Vista again and it worked. So I think maybe this bug only materialises when I want to actually use Voice Vista.

I ended up just not bothering, but it was a shame. There might be an obvious answer; I'm not sure if Easy Reader was open or not. But I didn't want to spend ages fiddling with my phone when I wanted to explore. Has anyone else come across this?

I will fiddle some more anyway.

By mr grieves on Monday, June 10, 2024 - 16:18

I know GPT-4o has suddenly made these seem a bit old hat, but I still love my glasses. I use them all the time and they are so useful. But they don't always know what the important information is, as I've mentioned before.

One fantastically useless answer it gave me: I was holding a box of tea and asked it to look and tell me what it was. It said, "This is a box with information on it." Yup, bingo. That's exactly what I was after, thanks.

The other one that made me laugh was when I was on holiday, walking through a field, and my wife said, "Ooh, look to the right and ask your glasses what that is!" So I asked, and it said something like, "There is a green field with flowers and some trees with branches sticking out." What it failed to mention was the 20-foot-high naked man made out of bronze standing right in front of me.

In the latter case I did eventually manage to coax it into a proper answer: "So, er, can you see a statue maybe?" Sometimes you do have to give it clues, which would mean being able to see the thing in the first place.

But I had to give up with the tea after a number of tries.

I know I'm not mentioning all the times when it was genuinely useful but they weren't very funny.

By mr grieves on Monday, June 10, 2024 - 16:18

Whilst I agree with your statement I still think it is not necessarily answering the questions that a sighted person would ask. For example, my wife also didn't know about the big naked man in the field until she saw it. So "Hey Meta, look and tell me what that is" would be an obvious thing for her to ask. The fact that there was grass and trees around she could figure out for herself.

Similarly "this is a box with information on it" is not going to be useful to anyone. She may have been asking because she wanted to know a bit more about the product.

I think regardless of whether it is designed for us or not, the AI is just not as good as the others right now.

But given it is on my face and just ridiculously convenient, I can easily forgive it because when it does work it just blows my mind.

And the video calling thing is brilliant. The other day the post lady threw a parcel into a bush. I was able to video call for help. I would never have found it there in a million years. Sure I could have used my phone, but having two hands to prod around was very useful.

By Holger Fiallo on Monday, June 10, 2024 - 16:18

If she had asked, it probably couldn't have responded and would have made a statement about privacy. Nuts.

By Ollie on Monday, June 10, 2024 - 16:18

Yes, it probably saw nuts.

Yes, the video calling thing is brilliant, and you are quite right, the AI is pretty lame. I was trying to use it today after I read your messages on here; you inspired me... I just don't really use the AI, but I use the glasses for tons of photographs, videos and video calling.

Audio, I'll put my airpods in, reading things I'll use seeing AI, but it's just nice that we have this range of tools for different things.

The real game changer is going to be when OpenAI releases their hardware and can provide real-time visual feedback. That will be end-game stuff. It'll be like having a guide with you at all times.

There are rumours about earbuds with cameras in them, though, as I've said elsewhere, this seems a stupid form factor, especially for those who have long hair, beards, fat faces etc... :)

Jony Ive is involved, so it'll probably be beautiful to the detriment of function.

By Orlando on Monday, June 10, 2024 - 16:18

Hello everyone,
I have been researching these smart glasses for the last few weeks, and I just ordered myself a pair yesterday. It appears that there is room for growth with a product such as this. I found the article below from Envision AI, from back in January of this year. They would like to be integrated with this device! I feel that would be awesome! And I wonder whether other apps and services, such as Be My AI or Seeing AI, could be integrated into a product such as this?
When I get mine, I will be submitting some accessibility feedback on this very subject.
https://www.letsenvision.com/blog/ray-ban-meta-smart-glasses-accessibility-envision

By Gokul on Monday, June 10, 2024 - 16:18

Hey, that could be brilliant actually! All the features that the Envision glasses offer, with the Meta hardware. Consider the cost factor! Plus, it'd solve the OCR problem if it actually happens. Though, like Ollie was saying, the real thing is going to be the OpenAI multimodal thing in a pair of glasses...

By mr grieves on Monday, June 10, 2024 - 16:18

You beat me to it - at least I don't need to apologise for my crap nuts joke now.

I do find the Meta AI a mixture of totally useless, quite fun and quite useful, depending. If I have a number of things whose labels I want it to read so I can choose one, and I roughly know what the options are, then it's quite good. It'll sometimes blurt out something random, but I know that if I've never heard of the answer then it's probably wrong. If I'm out with mrs grieves and I find myself standing around waiting for her, I will use it to try to get a feeling for where I am, which stops me getting bored. She enjoys getting me to ask it about odd things we come across.

I also use the AI to just ask random questions, like I might with an Echo. I think I posted this elsewhere, but one time we were driving home and I asked where we were. I then said, "Tell me some facts about that place," and it told me that a dinosaur fossil, or some bones or something, had been found there a few years ago. So I asked what kind of dinosaur, and it told me. I then could ask, "Oh, what does that name mean?" and it translated it. I then asked whether it had tiny little arms like a T-rex, who found it and when. And it was really good at giving me the details.

Or if we are visiting a heritage site or something I can ask questions and get random little details. It's really quite good at that.

And yesterday my wife saw lots of Croatia flags about and said, "Oh, are they in the Euros?" I was fairly sure they were, but I asked Meta and it said yes, and then I could ask who they were playing first in the tournament, and it was getting me all the live information.

Not really accessibility related, but I just find it handy being able to ask thin air whatever random crap life throws up.

But you are right, I really, really want to be able to have it describe what the ducks are doing and help me hail a taxi. But that won't be the end game; it will just provoke the next "but what if it could do this too?" The more powerful these things are, the more they fire up our imagination...

I also agree that AirPods don't seem like the right place to have this. I personally don't like shoving things in my ears much anyway, but I can't even imagine how the camera would work on them. I would never walk about with my AirPods 3 in my ears, as they could fall out at any time and that would be the end of that. Maybe the AirPods Pro are a bit more secure. But I love the feel of the glasses.

By Ollie on Monday, June 10, 2024 - 16:18

Regarding third-party applications on the Meta Ray-Bans: as great as it might be, I don't see it coming. Meta wants content; they really don't care about shipping units when it comes to the Ray-Bans. Letting people add applications, even accessibility-based ones, would not be in their best interest. It doesn't keep us on their platform and doesn't provide them with usable data. I think other companies will allow for this; OpenAI is a good example, as is Apple, but Meta has a long history of only doing what is right for Meta.

I'd love to be proved wrong, of course, but I'd certainly not buy these glasses in the hope of a suite of accessibility features coming to them. Sorry, I don't like sceptics either.

By Holger Fiallo on Monday, June 10, 2024 - 16:18

Would the EU not insist? They did with Apple regarding iOS. Or is that only for phones?

By mr grieves on Monday, June 10, 2024 - 16:18

I think it is quite different here.

With iOS it was an anti-competition thing. Apple had an app store that only they control and therefore they have a monopoly and can do what they want, which is bad for consumers.

With the glasses, they aren't a platform, they are an accessory. So telling Meta to open them up to other apps is almost like forcing them to make the glasses into something they are not, which doesn't make much sense to me, much as I might want it.

I'm not sure technically how easy it would be to open up access to the glasses camera, microphone and voice assistant to other apps. I'd love it if they did but it also feels like there is no great reason for it unless they can somehow license it.

I think there are a number of things that we want that could maybe be framed in a way that isn't specific to accessibility that other people might find useful.

For example, being able to start turn-by-turn directions from your glasses, or being able to find out what shops are nearby but out of sight. Maybe "What places are there to eat nearby?" and then "OK, direct me to xxx." That's not specific to us, but we would benefit. My wife does similar things with the Google assistant in the car, so a more portable version would be good for her too.

Similarly for OCR, there may be some benefits for sighted users to be able to grab text they see in the world then copy it as text into a message or note or something. And there are a lot of people with low vision who are nowhere near blind who might find it helpful if it could read small text to them.

By Holger Fiallo on Monday, June 10, 2024 - 16:18

Does Meta allow third-party apps? That was the issue with Apple.

By Ollie on Monday, June 17, 2024 - 16:18

The Apple EU thing is because Apple's mode of charging developers on its marketplace, the App Store, was deemed anti-competitive. There is nothing to make Meta or any other device maker include apps they don't want to include. It would be completely down to Meta to choose to offer such a thing. Maybe if they had an app store for the glasses and it was big enough, that would result in some action, but it wouldn't make any difference in this particular case.

By Holger Fiallo on Monday, June 17, 2024 - 16:18

Thanks.

By mr grieves on Monday, June 24, 2024 - 16:18

I saw the other day that there has been an update for the glasses. You can now change the maximum length of a video clip from 1 minute to 3 minutes. You have to do so from settings. There was also something about being able to use Amazon Music, but only with CarPlay, which seems a bit odd.

Another thing: I was listening to the Verge podcast yesterday and they were talking about these glasses. Apparently Meta has shipped a million of them, which is far more than they were expecting. They were talking about the possibility of a new version maybe coming out next year with AR. Unlike the Vision Pro, which has everything on a screen in your face, this would be you looking into the world with things projected over the top of it. I guess like a HUD in a video game. I'm not sure how much appetite there is from sighted people for extra visuals. It doesn't sound helpful to those of us who can't see, and I hope they either keep the existing glasses as the entry-level option or at least have a way to turn off the visuals to save battery. I'm guessing such a thing would cost more.

Still it is great that these are proving so successful.

By Gokul on Monday, June 24, 2024 - 16:18

Meta put out a couple of papers on multimodal (read: GPT-4o-type) capabilities and suggested that these could be put out for testing sometime in the future. Maybe this kind of stuff might come to the Meta glasses sometime down the line...?

By mr grieves on Monday, June 24, 2024 - 16:18

That does sound incredible. That chatgpt demo but on my face is the dream.

I'm still amazed by what these glasses can do already. However, I'd slightly prefer that they just improved what is there rather than getting too ambitious too soon. For example, if I take a photo and ask about it, I'm never quite sure if the glasses have any idea what I'm looking at or are just making assumptions. So maybe it knows I'm looking at a lake, and will just tell me something about some random lake it found online. Or maybe I'm looking at a building or monument and ask a specific question about a feature of it, but it answers based on something else in the vicinity, not what I'm actually looking at.

Hopefully one day AI will be good enough to rely on.

By Assistive Inte… on Monday, June 24, 2024 - 16:18

What you're talking about is Augmented Reality, and it is one of the two main uses for this technology. It is where they are going; we are just on step one.

You are right, it won't help us, but Assistive Reality will - which is what we are using the Ray-Ban glasses for as blind people.

By Gokul on Monday, June 24, 2024 - 16:18

Llama 3 does appear quite deficient compared to, say, GPT-3.5, and even Claude in its first iteration, but hopefully Meta is investing in improving the smarts, especially since the glasses have already been a bigger hit than even they expected. Also, I'd like full OCR capabilities to come to the glasses before anything else... The multimodal capabilities are for somewhere in the slightly more far-off future; far enough that it's far, but not so far as not to be imagined.

By Graham on Monday, June 24, 2024 - 16:18

Hi. Do all the "what are you looking at" features now work in the UK? Also, can you make unlimited-length video calls, so a family member could guide you through a shopping centre using WhatsApp or FB Messenger?

Kind regards

Graham

By Assistive Inte… on Monday, June 24, 2024 - 16:18

The video calls are WhatsApp calls, so they are as long as you want, from what I can see.

By dan kysor on Monday, June 24, 2024 - 16:18

It would have been nice to be able to record more than one minute. How about the Sunshine Oho video glasses with access to Siri? A much better deal at $75.

By Cliff on Monday, June 24, 2024 - 16:18

The new 6.0 update that came out a couple of days ago adds the ability to extend video recordings to 3 minutes. You'll have to change that setting yourself in the Meta View app, but it's rather nice to go from a limit of 1-minute videos to being able to record 3 minutes. So far I'm loving the glasses, and again, a VPN is the answer for getting all AI functions enabled outside of the US. I'm in Norway, and it works like a charm for me here :)

By Assistive Inte… on Monday, June 24, 2024 - 16:18

These are a thing, but I didn't see anything about Siri on Amazon, just Mac/Windows. If they could work on the iPhone, they would be the obvious companion to the Envision apps, for £2650 less! So probably something we are all missing.

By Assistive Inte… on Monday, June 24, 2024 - 16:18

I'm wearing these more and more. They are still a bit weird at times; there is a way they work that you have to get tuned in to, but they are fabulous really. For the price, they are the best assistive technology I have ever bought, even if I am paying £10.99 a month for a VPN.

On the downside, I found myself looking on Amazon earlier, to see if I could buy Twizzlers and Snapple in the UK!

By mr grieves on Monday, June 24, 2024 - 16:18

Well I did not know these were a thing. I have the Oho Sunshine audio glasses. I thought they were extremely comfortable, the battery life was great but the sound was pretty bad. I ended up pairing them with my Apple Watch and they were just a tiny bit better than just holding my watch up to my ear.

With the video glasses, is the idea that you wear them when you are out, take a load of videos or photos or whatever, and then download them to your computer when you get home? So they are completely independent from the phone? It seems a bit weird to give them the same brand name if they work in an entirely different way.

Doesn't sound like they can really compare to the Meta Ray-bans but interesting to know about them.

By Assistive Inte… on Monday, June 24, 2024 - 16:18

In the latest update you can now connect Amazon Music and Calm to the Meta glasses. I wonder who/what is driving this? Could AIRA do an AIRA Anywhere integration? Could Envision also? That would be cool!

Or even, Seeing AI? I would love to be able to use that on these glasses.

By mr grieves on Monday, June 24, 2024 - 16:18

I think I totally misheard the release notes before. For some stupid reason I thought it had Amazon Music support but only with CarPlay, which didn't make sense because, it seems, that's not what it was at all. You'd think I'd be better at this sort of thing by now...

So Amazon Music works like Spotify Tap, I think, where you can have it play what it thinks you will like but can't actually choose anything. Is that right? Whereas I think Apple Music will allow you to play anything? I noticed Spotify is not showing up in settings with the others. Not sure if that's because it's the only one I've set up or if it's been given the boot. I don't really use it anyway.

And Calm is a meditation thing and nothing to do with the wheely thing you sit in.

God I'm such an idiot. I can only apologise. I appreciate anyone who politely ignored my stupid message from before.

By Gokul on Monday, June 24, 2024 - 16:18

I don't know if these would work with the Envision app the way we imagine. I never thought about it; I guess I have a pair somewhere around here, so let me see. But even if they do, they will never give us the seamlessness that the Meta glasses do. And I do feel that this is the best piece of assistive tech I've ever owned, maybe just below a computer with a screen reader (though that could just be sentimental value); and yes, the addition of something like the Envision app, or even just an OCR facility, would make these incredible.
Speaking of which, why don't we think about pushing the Envision devs to approach Meta for some kind of collab? But I guess that'd totally destroy the Envision glasses, so...

By mr grieves on Monday, June 24, 2024 - 16:18

I was listening to Double Tap last night, and Steven was saying that the AI on his glasses had just booted him out, saying it was not available in his region. So it looks like it is not available in the UK yet unless you use a VPN.

I hope they get a move on!

Someone said that there is a page that shows its availability, but I've not yet found it after my admittedly slightly half-arsed look.

By Assistive Inte… on Wednesday, June 26, 2024 - 16:18

Are introduced below the fold - a hint for those of you who avoid walking on the wild side!

By mr grieves on Wednesday, July 3, 2024 - 16:18

I was listening to the RNIB podcast and was surprised that they recommended using a VPN to get Meta AI. They suggested Windscribe as it is free.

I downloaded the app and registered. If you provide an email address, you get 10GB of data per month and can unlock your account; without one, you get 2GB. (I initially didn't put an email in, then went back and did after it warned me, and it says I only have 2GB left, so I'm not sure if that's a bug.)

Anyway, once in I was able to choose a US server easily and connect.

If you are just using the VPN for Meta AI then this is plenty. You don't need to reboot or uninstall Meta View or anything like that (as was made out in the podcast). Instead, what I did was have Meta View closed, then open it once the VPN was on. Then quit and reopen it again, and I think that was it. It's obvious if it works, because you are prompted when you open the app. (If that doesn't work, try using Look and Tell, then quit and reopen the app again.)

Once AI is enabled, you can disable the VPN and it should last a month or so. Then just repeat the process when it stops working.

I haven't tried Windscribe for this specific purpose, but I reckon it will do a better job than Proton VPN for those not wanting to pay for a VPN. With Proton you can't really choose a country on a free subscription, although I have a feeling it may work every month or so. (In my test it seemed to allow it a few days after my Meta AI stopped working.)

Anyway, I'd not heard of Windscribe before, but it seemed accessible enough, and it took me no time to get it connected to a US server.

By mr grieves on Wednesday, July 3, 2024 - 16:18

As usual, I wasn't paying enough attention. You have to confirm your email before you get the 10GB, which I've just done. So that explains why I only had 2GB.

By Assistive Inte… on Wednesday, July 3, 2024 - 16:18

The AI is really good, but stupid. It took me three questions earlier to deal with an envelope I picked up off the mat:

1. What's this? "An envelope."
2. Who is it addressed to? "My name."
3. Who is it from? "Humanware."

It was only four pages, and I managed to get it to read most of each page with one or two questions. I think it did invent a new product, though: I can't find the "Harc Reader with AI" anywhere online!

By charles on Wednesday, July 3, 2024 - 16:18

I've been reading this thread and have a thought. Even when using GPS apps on my phone, results are not immediate. Could there be a slight delay when using these glasses to ask which direction you are facing? I plan to get a pair next month. Having been blind for 70 years, this will be a thrill: actually wearing glasses for a reason other than looks.

By Assistive Inte… on Wednesday, July 3, 2024 - 16:18

There are lots, obviously, but more people than you would think ask, "What if they tell you someone is standing right next to you when you think you are alone?"

I was terrified just now; I thought my cat IgglePiggle had become invisible! Turns out I was just looking over her. Phew!

By mr grieves on Wednesday, July 3, 2024 - 16:18

I think it just makes the compass directions up. Every time you ask, you get a different response. I think it is just doing the AI thing of plucking something random out of the air and assuming it is correct. I don't think it is tied into the compass at all.

I noticed yesterday, using Look and Tell, that it seemed to have a lot of difficulty with follow-up questions and kept telling me to start my questions with "look and". I'd ask, "Look and tell me what this is." "It's a bottle." "What kind of bottle?" "If you want to know what you are looking at, start the question with look and." I've not had that before. I thought maybe the feature was a bit broken, but on my last attempt it did work. Quite annoying when you usually have to do a couple of follow-ups to get the answer.

And I still get a lot of phantom magic taps. I'll ask a question, it gives me the response, and then Easy Reader starts playing. Has no one else had this sort of thing? I started closing all my apps when going out with the glasses, but then Easy Reader forgets where I am in the book, which is quite annoying.