Ray-Ban Meta Wayfarer Smart Glasses Unreview

By Unregistered User (not verified), 27 February, 2024

Forum
Apple Hardware and Compatible Accessories

I'm not good at writing reviews, so I'm not going to even try. I just got these delivered an hour ago and am about to dive in. I wanted to start a new thread where I could comment, and hopefully others will too. Questions are welcome.

They cost £329 in the UK. They arrived two days early.

I'm living alone right now, so I will be attempting the unboxing and setup with just my iPhone and Be My Eyes/AI.


Comments

By Harryubu on Saturday, February 24, 2024 - 15:33

I am also in the UK and have ordered the glasses through the Argos store online and will pick them up supposedly tomorrow! I expect they will be fairly limited in what they can do for someone who has very little sight, but I think there is potential for expansion in the future. Harry

By Andy Lane on Saturday, February 24, 2024 - 15:33

Which is great news if Siri is going to get all ChatGPT'd up this summer. The way it works right now is that the phone picks up the "Hey Siri" command, then opens a Bluetooth channel to the glasses in super fast, record-breaking time. The input to Siri then comes from the glasses. The other option is just to hold the side button, and the same thing happens. The only limitation seems to be that your phone has to be able to hear the wake word. Oh, and you can touch and hold on the right arm, which also brings Siri up ready to listen through the glasses.

By Andy Lane on Saturday, February 24, 2024 - 15:33

The sound is basically the best I've heard from any device that isn't a very expensive microphone. I think the reason is that the microphones that record video are on the top of each arm. They record in stereo, and your head gets in the way of the sound just like it does for your ears. Listening back on headphones really is a treat. I think this has a name in recording circles: HRTF, head-related transfer function. Anyway, it's worth a go. It really captures being there in audio. The mics that pick up phone calls and audio messages are great too. I think there's one on either side of the bottom of the nose bridge, but I'm not certain. They sound really close to your mouth. Much better than headphones without a mic boom.

By Andy Lane on Saturday, February 24, 2024 - 15:33

I believe the 5 mics are used as a stereo pair above the ears when recording, and in a 3-mic array for noise reduction when on the phone etc. The right arm of the glasses has a button and a touchpad. The button is for recording functions, and the touchpad changes volume, plays and pauses, answers and hangs up. Apologies, I just checked before telling you incorrectly. Long press is actually the Meta assistant. The other ways I described of using Siri do work though. Long press the side button of the phone, or say "Hey Siri"/"Siri" loud enough that the phone picks it up; it then opens that super speedy Bluetooth channel to the glasses, so it's already there when you naturally speak your command to Siri. Sorry about the mistake.

By Dave Nason on Saturday, February 24, 2024 - 15:33

Member of the AppleVis Editorial Team

They definitely sound interesting, even more so when the AI features come along.
What is the quality of the speakers like, and is there much sound leakage? i.e. Can everyone within five metres of you hear it?

By Andy Lane on Saturday, February 24, 2024 - 15:33

It's pretty impressive and a lot smarter than Siri. Maybe not quite at OpenAI level, but still very impressive. It's very quick too.

By Scott Davert on Sunday, February 25, 2024 - 15:33

Member of the AppleVis Editorial Team

Can anyone verify whether these glasses connect to Bluetooth audio? The Seleste glasses are more along the lines of what I was looking into, but I'm curious. Also, how's the battery life?

By Andy Lane on Sunday, February 25, 2024 - 15:33

Yes, the Meta Ray-Bans connect via Bluetooth to your phone, and audio is routed to the speakers next to your ears. The sound is definitely reasonable, but not as good as proper headphones. It's good enough though. Latency is about average. Not as low as AirPods Pro 2, but better than some headphones I've used. I don't like typing using a keyboard with them, but everything else with VoiceOver is quick enough to be usable.

By Andy Lane on Sunday, February 25, 2024 - 15:33

Sorry, I forgot that part of your question. If they are on your face, you get about 4-5 hours. If they are on the table folded up, they seem to turn off and preserve power, so they last basically forever until you put them back on. They recharge when you put them back in the case, and I think you get about 6 full recharges from one full case battery. The case plugs in with USB-C to charge.

By Andy Lane on Sunday, February 25, 2024 - 15:33

Hahaha, I genuinely think you've just named the product that Apple will launch when they see how successful this product category is going to be. Fingers crossed it is, anyway. Once they've finished messing around with $3.5k goggles, I think this is how wearables are going to develop.

By OldBear on Sunday, February 25, 2024 - 15:33

An article was posted by Ash Rein in the following thread suggesting/speculating that Apple is considering making smart glasses, along with AirPod cameras and a smart ring.
https://www.applevis.com/forum/hardware-accessories/airpods-built-cameras
I would feel slightly better about having an Apple product than a Meta product, though I am sure it would be priced for the high end of the market, and underwhelming in features at launch.

By mr grieves on Sunday, March 3, 2024 - 15:33

So you can tap the glasses to basically do a magic tap, which will play/pause/answer a call/disconnect. Which I forgot about and panicked when my audiobook started playing as I was trying to change the volume. Anyway... with Spotify, I think you can perform a gesture (can't remember if it's tap and hold or double tap) on the arm, and I think it just starts trying to play something you might like. This didn't feel useful to me, so I swapped it out to replace Hey Meta, but I've also not used it for that either. If it could conjure Siri, that would be nice.

I'm going to repeat myself from the look and ask thread now, but this one seems more appropriate to post it in.

I bought these to replace my Bose Frames Alto. I actually still really like them, but the battery life isn't that great and they are a bit cheap. The battery life here isn't much better, but being able to recharge in the case is great and really helps alleviate the problem. (Sure, I could get a USB battery pack for the Bose, I guess, but anyway.) The other thing I wanted was the promise of the AI stuff, which I am really impatient to try out.

However, what has surprised me is how much I am enjoying making videos. I've never really done that sort of thing before, certainly not since I've been blind. But it's just so damn easy to ask Meta to start a video when I'm wearing them, and it's also much more likely that I might be looking in the right direction. I'm recording little videos every day now. I've been particularly keen to capture some of the funny noises my dog makes, and it just sounds so good when you play it back. The stereo is amazing. My wife is also really enjoying it because the videos are actually watchable - they aren't just a minute of the sky or my chin. I managed to video myself getting soaked by a puddle courtesy of my other dog today, and she found that extremely funny. Honestly, this is just so good, and I had no idea I was going to use them for this.

The way it uploads the photos to the app is a bit strange. I think it should do it on my wifi network, but it often asks me to join the glasses' wifi network. I think maybe it depends on how quickly I go into the app. But if it is using the glasses' wifi network and you try to share the video, it won't work because there's no internet access. Not a big deal, just need to be a bit patient.

The sound quality is great - on par with the Bose Frames. Still not sure how good it would be in a very noisy environment. In a loud car on the motorway with music on the radio it was usable but a bit hard to hear. But otherwise it's worked really well.

Messaging with Hey M-guy works really well too.

It's been a while since a gadget really clicked with me. I'm such a cynical old curmudgeon but these do a small number of things very well and are just a joy to use.

If I can get some Be My AI type functionality, even if it's not as good, then I will be even happier.

By mr grieves on Sunday, March 10, 2024 - 15:33

Well, as well as getting a watch - that is still half the price of an Apple Watch Ultra - you are also getting open-ear headphones so I'm not sure I quite go along with that statement even if you aren't using the camera.

I had my first chance to go out with them and Voice Vista today. On the whole it was a great experience, but I do think they struggle a little in a noisy environment. We were walking through a busy cafe and it was quite hard to hear - not that I really needed Voice Vista in that moment. And we had a little ride on a miniature train, sitting near the front, and it was quite loud, so it was also hard to hear. But, again, if you cup a hand over them it does get much better.

But otherwise I was walking around a park and around town and had no problems there.

So I'm still really happy with them. I also love how they charge in the case which is really convenient. If I'm sitting down I can just pop them in for a bit and I'm back in business if the battery is getting a bit low.

I know they aren't cheap but I think you do get decent value for money. The top of the range Bose Frames were a little more expensive and have no voice assistant and no camera - they are just the headphones and nothing else.

And I am still really enjoying the videos. The other day my terrier was racing around a field with a ball in her mouth, so I just told it to start and could tilt my head in the direction of the squeaks and pants, and apparently I managed to frame it really well, which made me very pleased with myself. I think it is giving me a nice connection with my wife that I had lost when I stopped being able to see photos.

By Emre TEO on Sunday, March 10, 2024 - 15:33

So, at what level is the Meta AI for users living in the USA? Can it perform tasks such as Zuck's shirt-and-pants matching demo in a video?

By Rusty Perez on Sunday, March 17, 2024 - 15:33

I have not attempted to dress myself with the aid of the Meta AI. My physical wife is in charge of that, but Meta does do a host of other things pretty well.
But I'll talk about clothing for a second.
Meta is really struggling to balance the abilities of the multimodal ai and usability for blindness applications on the razor edge of privacy. I don't know if they know this, but I know it.
I cannot look into a mirror and ask Meta to describe my clothing. She thinks I'm trying to invade someone's privacy and gives me a lecture.
I cannot ask her what a person looks like in a picture. It reminds me of when Be My Eyes would falsely claim that the face in a picture was blurred or that an image containing a person in it was blank.
Sometimes Meta gives false alarms on privacy.

I cannot successfully ask her to read my mail yet.
But interestingly, just today she did give me a workaround.
I was testing her reading on a check. She refused to read it, but she told me I could ask her questions about it. So that works. I think I may try a course of action where I ask her to summarize a piece of mail, or just start asking questions about it instead of asking her to read it.

Previously I was concerned about her ability to read because I was getting some really bad results when I would ask her to do so. But they have definitely tuned the AI.
I described this in another thread, but I tried reading an informational sign in a park and she did a respectable job.

I had never touched a disc-golf hole before and Meta knew what it was.

I took a picture while looking down over the edge of a culvert where water was flowing under a road. She gave me a really interesting description which started out, "I am looking over the edge..."
She does hallucinate sometimes. Often when I know the truth, I challenge her on what she said, or I correct her. It's funny. Usually her response is to apologize and say I am right!

It can only get better from here.

By Rusty Perez on Sunday, March 17, 2024 - 15:33

Recently, in one of the permissions one has to agree to, they mention, among other things, that audio and images and text will be transmitted to the AI, or something like that. The mention of "audio" has intrigued me. So I've tried asking her what she hears, what I'm listening to, etc. I've gotten varying responses. Sometimes her responses are convincing. I was walking in the park last Sunday, and after taking a few pictures I asked her what she could hear. She talked about chirping birds. There was also some water in some of the shots, so she mentioned flowing water. But then, at one point I asked her and she said she heard nothing. Another time she responded that she could hear city sounds, honking horns, etc. There were no such sounds. So, ultimately, I'm nearly convinced that the AI is not analyzing any actual audio.

But, I would love it if she could identify bird calls. :)

By Emre TEO on Sunday, March 24, 2024 - 15:33

At first I was surprised by this. But I know Face ID can be trained to recognize you with sunglasses or even masks. Likewise, this works even without introducing the glasses to the device. Still, to be more reliable, I recommend setting up Face ID again both with the glasses on and with them off. Despite everything, it recognizes you.

By Brad on Sunday, March 24, 2024 - 15:33

You could try Proton VPN; it's quite usable on Windows, so it should hopefully be OK on iOS. I have heard there's CAPTCHA stuff, so that's something to look out for, but it's worth a try.

By Ollie on Sunday, March 24, 2024 - 15:33

Heads up: Meta AI goes on general release next month, according to this article.

It doesn't say where this will be launched. The Verge is a pretty US-centric site. I'll continue to dig and come back with any info.

By mr grieves on Sunday, March 24, 2024 - 15:33

Ooh now I'm getting excited. I couldn't find anything in the article that said if it was still going to be restricted to the US. I hope this means we might get it in the UK.

By Ollie on Sunday, March 24, 2024 - 15:33

Sadly, restricted to US according to this article:

"The smart glasses’s AI features will only be available in the U.S., and voice features are only available in English, Italian, and French, Meta said."

By Brad on Sunday, March 24, 2024 - 15:33

That's so odd.

They're doing the Apple thing of walling people in, but instead of just phones, it's countries. That's not good at all.

By Ollie on Sunday, March 24, 2024 - 15:33

Not good for us, but there will be reasoning behind it: server load, dataset training for the AI based on specific demographics. It's frustrating, but it does mean it will be a more fully formed product when it reaches these shores. It also means I won't be rushing out to buy a pair, and hopefully something more suitable appears in the meantime.

Should they open up the API for the Meta glasses (not sure if that is possible), that would be a game changer. How they could use that to data mine, though, I don't know. Currently they seem to be a device for creating content to add value to platforms like Insta, Facebook etc, and increase engagement. There will be greater value in them than just selling them unit by unit, so likely they will keep it locked down for their own purposes.

By Andy Lane on Sunday, March 24, 2024 - 15:33

It's just a staggered rollout of a new feature that's imperfect at present and requires enormous resources. It's a standard thing people do. Be My AI expanded in a similar way: it started with a very small beta group, then added more users over time. For a company like Meta, allowing the entire world to use an unfinished product is asking for problems, especially considering the horsepower it takes to run this stuff. The UK will get it as soon as Meta thinks it's stable, useful, and supported enough by their infrastructure. They have to stagger the rollout somehow, and beta group to geography is a perfectly reasonable way of doing things.

By mr grieves on Wednesday, March 27, 2024 - 15:33

Damn, I knew I shouldn't have got so excited. It's not a massive surprise but I am really looking forward to this. I wish they could give us an idea if/when we might see it so I can curb my expectations.

Anyway, thanks @Ollie for the update even if it wasn't what I wanted to read.

Still, AI smarts or not I have no regrets about buying them. I won't deny it was a big reason why I was tempted to get them in the first place but they have more than justified their cost even with no more features.

Even so, Meta - what about the special relationship? I thought we were best friends here in the UK?

By Brad on Wednesday, March 27, 2024 - 15:33

The article says that it will only be available in the US.

It didn't say, at this time, or anything like that.

Things could change but if they do, something better will probably come out for us.

By Ollie on Wednesday, March 27, 2024 - 15:33

I think one has to decide if the product as it is, without any updates, is worth £300. Buying based on future unconfirmed updates is always going to be frustrating, especially for us edge-case consumers. I'm flip-flopping between them being a nice way of documenting my life in pictures, video and audio for others and myself, along with open-ear audio, and thinking they might just be another gadget that ends up in my brimming drawers of other such tech that I'd hoped would make my life easier/better.

I think I'm going to hold off, keep the £300 in my bank for now, and see what comes out over the next few months. It feels like an OpenAI pair of glasses would be far more useful. From the YouTube examples I've seen, Meta AI seems okay, but I don't know if it will be quite the AI we want to read books etc. I don't think we can prompt it like ChatGPT either. That's a really useful tool in the ChatGPT app's settings, where you can specify how the AI responds, from tone, to telling it we're blind and are looking for more detail in descriptions/facial expressions and less of the flowery talk it seems to default to. I want cold hard data, not a poem, ChatGPT!

By MarkSarch on Wednesday, March 27, 2024 - 15:33

I live in the United States, and I have submitted a podcast to AppleVis; it's now under review, and I'm not sure if it will be accepted.
Hopefully, if it is, you will get some idea of how Meta AI works from the demonstration.

By Ollie on Wednesday, March 27, 2024 - 15:33

Thanks. This will be very helpful.

If it is viable, I think a VPN to the States could make it work, though I am aware it also uses geolocation in the app to assist with landmarks etc.

By Louise on Wednesday, April 17, 2024 - 15:33

Here goes another experiment. I decided to give the Meta glasses a go. The first thing that is giving me anxiety is that I ordered about an hour ago and I haven't received a confirmation email. Maybe the folks at Meta are off enjoying their Sunday, but this does seem strange.

I'll let people know how I think they compare with the Seleste glasses. I'm finding them a bit clunky, but am willing to hang in there a little longer.

By Holger Fiallo on Wednesday, April 17, 2024 - 15:33

I'm curious how they'll be in 2 years. Looks good. Not sure if I would get them. I am sure Apple will release their own, and with their smart machine or AI it will be good. Will check the podcast if AppleVis lets it in.

By mr grieves on Friday, April 26, 2024 - 15:33

So thanks to the power of a VPN I have managed to get the AI working too. I love it, of course, but am finding a few issues.

Firstly, I can't get it to read any text. My wife pointed it at a magazine article and asked it what it said, but it just described the images around it. If we asked it to read, it would say it couldn't read messages. Similarly, we were out on a walk and I asked it to read signposts, but it would not tell me what was written on them. I was somewhere with different routes, so the signs were something like Elephant Walk goes left, Church Walk goes right, but it couldn't tell me that. I asked it which direction I should go, and it said something like "depends which walk you want".

Is there a particular way I need to phrase this?

It is a bit short on detail too, but I think it is in the phrasing. So I said "look and tell me what you see", and it said something like "you are on a winding path and there is lush greenery around". I was walking through the middle of a forest, and it didn't mention there were any trees. If I asked it to give me a lot of detail, it then did give me a better answer. Nothing like Be My AI's level of detail, but given my existing position is nothing at all, anything is appreciated.

I do like how you can do follow-ups with it. So when it did finally mention the trees, I could ask what type they are, and it told me it thought they were oak or ash (which was correct).

My wife was gobsmacked when it managed to correctly identify two of her plants which was impressive.

I also like how you can ask it general questions. My imagination only managed a couple of capital cities, but I'm guessing you can just treat it like a portable Echo in that regard, which would be helpful.

It doesn't quite cope with sounds. My wife held out her phone and played Dancing Queen by Abba, and it thought it was Justin Timberlake. She then played Staying Alive by the Bee Gees, and it thought it was Mark Ronson, and then later Happy by Pharrell Williams. When it got to the chorus, I tried again, and it gave me a weird answer, spelling out the date and time and telling me, I think, that I would have to check the local radio station listings to know what it was. I should say that my wife was trying to pick very well known songs with very obvious tunes that it should be able to recognise, not the obscure random crap I listen to. I also asked it to describe the noises around me in the forest, and it said it couldn't.

I've also found a number of times that it has seemingly lost connection with my phone. Maybe it's because I'm still using the VPN. Often repeating the question works, but not always. I did have a senior moment though: it wasn't working for ages, then I realised I had put my hood up over my head because it was raining, and the arms of the glasses were covered. Taking the hood off did fix it. Duh.

I did want it to try to find our car in the car park, but it had decided it didn't want to play any more at that point.

When we got back home I asked it what colour one of my wheelie bins was and it got it right.

One thing I was curious about is whether it saved all the photos you were asking questions about. I was pleased to find that it did not. I'm guessing you can maybe ask it questions about photos you have just taken but I have not tried that yet.

All in all, I am incredibly excited to be able to try this. I probably need to learn how to get the best out of it and no doubt it is going to improve anyway. I am a bit disappointed about the text, so I hope there is a way around that. But it has certainly made a product I absolutely love anyway even better, so I am not disappointed.

By Holger Fiallo on Friday, April 26, 2024 - 15:33

Maybe it does not like Disco.

By mr grieves on Friday, April 26, 2024 - 15:33

Yes, it didn't occur to me that the Meta assistant might have taste in music. Maybe it's as snobbish as I am!

By Ollie on Friday, April 26, 2024 - 15:33

It's not great at reading text. It summarises. I think speed is rated as more important than detail. If you're in the Meta app and shake your phone, you can send in a report to ask for a blind-friendly mode for such things. Most people are using this to caption scenes, so, unless we ask, it won't produce verbose descriptions.

It'll get better too. I think short answers are good enough for when we're out and about; we want information quickly, not a long description of a bus thundering toward us.

By Holger Fiallo on Friday, April 26, 2024 - 15:33

It should have both. Suppose you have a letter you want to check and you want to know ASAP.

By Louise on Friday, April 26, 2024 - 15:33

When I want all the text on a page read, I say "Look and read all the content on this page." or something like that, and it usually does. I really appreciate the ability to also summarize.

Getting it to read the instructions on a food package was a little tricky. I first had the package sideways, and it wouldn't read it. When I got the text upright and asked it how to cook the product, it told me to follow the directions. I had to be really specific and ask it to read me the directions on how to cook the product. When I did that, it worked fine.

It's going to be a matter of it learning what I like and me learning how to ask what I want to know in a way that works for the AI.

But the speed. The speed is amazing.

By Ollie on Friday, April 26, 2024 - 15:33

Nope, no full reading here. The visual description is usually not 100% accurate either. It'll get better; it's just not the tool we need, yet.

There is no prompt, and the fact that I have to do some linguistic gymnastics to pin down even a partial read is a problem. Seeing AI gets it far quicker and more accurately, but, to be fair, that is using simple OCR, which is all on-device.

I'm just saying this to temper people's expectations. They are a V2 product, after all. The speakers get drowned out in noisy environments; Meta AI, as it stands, is far behind ChatGPT; the app has bugs, the main one being the inability to change the speed of the voice... But it's not bad. I can take videos when out and about and get good sound. I can WhatsApp family and friends, as long as I have a good enough signal, and video call them, which is epic. It does a lot of things okay; the WhatsApp video calling is the win for me though.

By Brooke on Friday, April 26, 2024 - 15:33

I'll be buying these soon. I noticed last night that even using Be My AI, I scanned a sheet of paper with text on it and was given a summary. I had to ask it to read the whole page, and then I got everything. So this seems common.

By CrazyEyez on Friday, April 26, 2024 - 15:33

I'm also going to try them.
I'm going to compare them side by side with the Seleste glasses.

By Holger Fiallo on Friday, April 26, 2024 - 15:33

Maybe they will allow third-party apps, so people can add Seeing AI to get documents or other things read. How does it do with money and coins?

By mr grieves on Friday, April 26, 2024 - 15:33

I initially thought the AI was just about asking questions from the camera, but it does seem fairly general purpose, and up to date.

I asked it what the scores were in the English Premier League. It told me Arsenal's score and said they had won with a last-minute own goal. I then asked what the other scores were, and it reeled them all off. I asked what games were in the Premier League tonight, and it trotted out a load of teams I'd never heard of, presumably some other country's Premier League. So I asked for the English ones, and it said there weren't any, which is probably correct.

I think it doesn't always know what information is the most relevant, but I suspect you can probably get there with a bit of probing in the right way. And, as has been said, it is pretty fast.

By Brooke on Friday, April 26, 2024 - 15:33

OMG, it says they won't be here until May 10th! It's going to be a long few weeks!!

By Holger Fiallo on Friday, April 26, 2024 - 15:33

How much are they?

By Brooke on Friday, April 26, 2024 - 15:33

$329. I got mine on Amazon so I could then get the rewards points.

By Holger Fiallo on Friday, April 26, 2024 - 15:33

Thanks, but sadly I could not get them even if I could afford them. My left ear is not well developed, so it would not be able to hold the left side of the glasses. I would need what are called wraparound glasses.

By Ollie on Friday, April 26, 2024 - 15:33

Okay, I'm glad to know, well, sad to know, that it's not just me getting the summary. I think there is a need to get familiar with how it works. It's almost like a decision tree: you get the brief summary of the image, then you have to drill down into what it's said. I'll keep playing, as this might be what is needed to get a full read, but by then, though Meta AI is fast at responding, it will have taken a long time to get there.

What I'd really like is to be able to hold up my phone and flick through photographs like a normie and be told what is on the screen.

I think it's a really cool product and am excited to hear how people are hacking it for our specific use case.

And, Holger Fiallo, there is always a way. I'm wondering if a kind of bungee cord around the back of your head, and maybe over the top, very much like they have on the Apple Vision Pro, might provide enough stability to use them. Designing solutions is one of my hobbies, 3D printing etc. I'd not want you to pass on something because it's not quite right for you. We've all had enough of that over the years.

By Datawolf on Friday, April 26, 2024 - 15:33

The glasses sound pretty awesome, and I could imagine getting them. The only problem I'm having is that Meta is involved, and I trust that company about as far as I can throw my riding-lesson horse, which is a 1,900 lb Clydesdale mix.
Still curious though: are there demos of the AI capabilities in action?

By CrazyEyez on Friday, April 26, 2024 - 15:33

Everything tracks you these days. There's no avoiding it if you want to own smart devices.