Meta's Smart Glasses Gain Live AI and Live Translation

By Oliver, 17 December, 2024


The Ray-Bans are getting live AI too, according to the following MacRumors article:

https://www.macrumors.com/2024/12/16/meta-smart-glasses-live-ai/

Trouble is, it's only for those on the early access program, and only in North America. Considering we're still waiting on look and tell here in the UK, I don't imagine we'll be getting this any time soon.

So close, yet so far. I do hope the rumours of Apple glasses are true, though they'd likely be a long time coming and launch with only Apple AI which, even by then, will be several generations behind.

If any of you are in North America and get this update, let us know how it works. Is it conversational like ChatGPT and Gemini? What are its limitations? Can it, god forbid, read anything yet, rather than giving frustrating summaries?


Comments

By Gokul on Monday, December 23, 2024 - 23:12

@Ollie so true!
"Eventually, the AI will be able to give useful suggestions even without asking". That's a step beyond actual, full-scail realtime monitoring of the camera feed; not that I really like the idea of AI, for that matter other human beings, giving me suggestions without asking, but then it almost puts us in position to have live vision in the way open AI demonstrated back in May.
That May seems so long ago...

By 3AM on Monday, December 23, 2024 - 23:12

I've been attempting to play with the Live AI feature all day here in California.

I must admit, it's been a frustrating experience so far. I struggled with whether to share my experience here on AppleVis. But then I got a calming impression that reassured me of an important fact, something I'm becoming more and more sympathetic to: "This is actually the worst it's gonna get!" And this is true!

So how has it been? Well, last night I saw Live AI and Translation available in the MetaView app's settings inside the "Early Access" menu option. Needless to say, I was ecstatic! But I had to wait till the morning to really take it for a spin.

Come morning, I was pumped. I fired it up by saying, "Hey Meta, start Live AI!" And within a few seconds I heard it: "Live AI session with video starting now." I then spent about two minutes asking questions about my immediate surroundings, my office space.

My immediate impression was, WOW! It was actually pretty good. I asked it about what was on my iPhone screen, and it did so well. The speed was great, almost instant. It was accurate but not too wordy, and it answered my questions quite specifically. For instance, I asked what it saw on the screen. It said it looked like I was in the YouTube app, which I was. Then I asked which video I was pointing to, and BAM! It got that right away.

This continued for about another minute and a half. Then I got more excited. I thought, I gotta try this outside, on the streets of my neighborhood.

Filled with anticipation, I grabbed a jacket, a hat, and a few small items and headed for the door. Once on the sidewalk, I asked Meta to start Live AI.

It wouldn't connect. It refused to connect. It kept telling me to check my Bluetooth connection or restart the MetaView app and try again. I did that about a thousand times. I rebooted the iPhone, the Meta glasses, the app, everything. And then, nothing! It would simply fail to make a solid connection.

Obviously I became more and more frustrated. After returning to the office, I kept attempting to get it working, thinking it might be due to a poor cellular signal. Perhaps my Wi-Fi would give far better results.

I can honestly report that was not the case. It is just not reliable at all at the moment. It did show great potential when it did work. However, I think that with all the live video AI releases from Google Gemini and OpenAI, Meta felt compelled to launch it on the Metas prematurely...

I am confident it will improve rapidly, in time. I saw the real benefit and potential, even if it was only for a couple of minutes. That was enough for me to glimpse what I'm sure Meta hopes it will be for everyone, even now.

But unfortunately it will take time. It is an Early Access feature.

I started to get upset but quickly recognized it's not ready for prime time yet. And hey, we have got to be OK with that right now. This was unthinkable a year ago for me! And today, here we are!

Google's recent launch of Gemini 2 and live video is fantastic. OpenAI's debut of their own take on live video is also upon us. And for us Meta owners and users, things are really progressing far quicker than expected.

None of them are perfect. Far from it. But they've launched. The cat is out of the bag. And now we have front-row seats to the breakneck advancement of live video AI models.
I can safely say this is the very worst it's going to be. It's uphill from here. And that excites me like few things in recent years!

So if and when we get frustrated with the inevitable flaws we'll undoubtedly encounter while using Live AI, just remember: it's as bad as it's gonna get... Hold on. It will get better, and better, and better!

By Oliver on Monday, December 23, 2024 - 23:12

That's a really good way of looking at it and thank you for sharing your experiences.

I have a feeling that, when all of these are out of beta, we'll be using each in different ways. Meta, for example, does sound like a very useful one for navigation, though with the huge warning: do not rely on the information, and always confirm in a dangerous situation. It could help to, say, find a shop front or a turning on a straight path, whereas something like OpenAI will be better for detailed tasks, such as following instructions when building something.

It's amazing how quickly we normalise this. It's incredible tech, and to have had this five years ago would have blown our tiny little minds. Now it's here, we're really looking at how it fits into and complements our lives.

Yes, this is all in beta but, for obvious reasons, we're quick to jump on board. I do hope it makes it to the UK soon though. I feel we, along with the rest of the world, are being given a bit of a short straw. I know they have to stabilise the network, and so on, but you'd have thought that for the cradle of the English language (my English pride showing a little there), it wouldn't be a big step going from US English to English. I understand with other languages, to a point, though with that much compute power, I can't see it would be much of an issue.

Bring on the Apple glasses, where we can choose our flavour of AI for the task at hand.

By Gokul on Monday, December 23, 2024 - 23:12

Well, I guess the connection drop issue is everywhere then, with or without Live AI, or even AI at all. It kept dropping the connection and trying to reconnect throughout the day for me.

By Oliver on Monday, December 23, 2024 - 23:12

It all feels a little precarious with all the AIs, but especially Meta. I don't think rock-solid reliability is their current goal; rather, it's just a race to dazzle.

By 3AM on Monday, December 23, 2024 - 23:12

I should also point out one unforeseen bug.

I own and cycle through two pairs of Meta glasses. I switch when I hear the glasses prompt saying, "Glasses are at 15%. Charge now!"

The bug is, the first pair I used to enable Live AI Early Access will connect to Live AI, but no matter what I do, the second pair simply refuses to make the connection. In other words, for some odd reason, the feature won't activate on the second pair at all.

I've rebooted both the iPhone and glasses numerous times with no success.

I would hate to have to delete and reinstall the MetaView app, but I will if that would fix it. I doubt it though.

I have submitted a support ticket with Meta.

I'm curious if others with dual pairs can confirm this bug as well.

By Oliver on Monday, December 23, 2024 - 23:12

Am I correct in thinking you have to submit the serial number of your glasses for the early access? I'm guessing you may only have one pair registered.

By Icosa on Monday, December 23, 2024 - 23:12

There's a lot more to it than just converting to UK English; the real problem is privacy laws and data protection regulation. They will have to have lawyers review the laws individually for every country they expand the service into. This can be a sensitive issue for sighted people too, because you can be certain some people will abuse the glasses.

By 3AM on Monday, December 23, 2024 - 23:12

Since my last update, I suddenly got a message saying "Your glasses have been updated."

It appears that not both pairs had received the latest update. I'm not sure what happened. I did get an update a couple of days ago on one pair, but when I attempted to update the second pair, it would say, "Your glasses are up to date." I found no way to trigger the update on that pair, and I didn't know how to check if they were indeed updated to the latest version. Nonetheless, it finally updated the Metas last evening, so here's my experience now.

Last night, my wife and I went Christmas shopping at a large discount store. Since this last update, I've noticed Live AI has connected and functioned quite well, so I was anxious to try it in the wild, for real-world tasks like store shopping.

Let me tell you: it worked fantastically! Here's my most recent experience.

Upon entering the store, I found myself in front of a women's clothing rack with sweaters on it. I am no longer able to see colors very well, nor read price tags or sizes, essential details for independent shopping. So I quickly fired up Live AI. In about 10 seconds I was up and running.

I asked, "What is the price for this item?" Almost instantly the reply came. Then, "What size is it?" Again, immediately I heard it. There was no lag, no stuttering. It was as quick and natural as it could be. I found the responses quick and accurate; no more details than what I asked. It didn't color the replying with extra verbiage describing additional details I didn't ask. I found this very helpful.

It was just like having a second set of useful eyes right there beside me.

Needless to say, the whole shopping experience was excellent. It worked as intended and expected. I used it extensively during the night, for about two hours, without any really serious bugs. The only mishap I can remember was that, when asking for the size of a pair of pants, it spoke the actual numbers in what sounded like French! Yeah! French... They were size 32 by 32. My wife and I got a real laugh out of that.

I asked follow-up questions to try and get clarity as to the size: "Repeat the size, but in English." It kept saying it in French! This was quite odd. I couldn't seem to find a way to make it aware of what it was doing. However, this was as bad as it got...

Throughout my experience, I found myself activating Live AI constantly. Yes, I know I could leave it on, but I didn't want to deplete the battery too quickly since I only had one pair on hand.

I'd start Live AI, ask several questions about a product, get all the info I needed, then stop Live AI. It worked every time. I got the desired answers, again, flawlessly.

On one occasion, I left it active for about three minutes while exploring a section with a number of items in packaging. This is normally a useless moment for me, since I can't tell what the items are. But last night was totally different, for the better!

I didn't need to say the trigger phrase, "Hey Meta." Simply asking, "What is this?" was enough to instantly trigger a short, precise response letting me know exactly what I was holding or even pointing to. I gotta tell you, this felt liberating!

Normally I'd need someone to shop alongside me the whole time. So you can imagine how utterly excited I was to experience this level of useful assistance from these glasses. For the first time in forever, my wife was able to hit the racks without me needing her to stop her shopping every few seconds to tell me what I wanted to know.

Last night I experienced a level of independence I hadn't felt in many years.

Needless to say, I was pleasantly surprised at the rapid improvements to Live AI. It's literally been hours, not days, since we were given this ability, and already it's transforming shopping for me personally.

This blows me away...

Think of it! Today, from a discreet pair of glasses, we are now able to get real-time feedback about the world around us! Yes, it's true, I could easily invoke a Be My Eyes volunteer or an Aira agent to meet many of these needs. But Aira is not free, and while Be My Eyes is also quite useful, in both cases it's a person on the other end, listening to everything and viewing everywhere you look.

In my opinion, most times I'd prefer an AI assistant that hangs out with me and is there when I need something, but is not a person latched to my every move. These services and platforms do have their unique and important place.

If I'm at an airport needing assistance to a gate, or finding a public restroom, for navigation and the like, Aira and Be My Eyes will always be essential tools.

However, for more trivial but needful guidance, like product descriptions, reading signs, tags, labels, etc., Meta's new Live AI is proving to be a viable option for people with little to no vision, like me.

And to think that this is as bad as it's really going to get blows me away!

By Justin Harris on Monday, December 23, 2024 - 23:12

That is incredibly awesome!!! I have been someone who has not been on the AI train. I find all the Apple Intelligence stuff to be, to put it nicely, not for me. But what you have described here sounds incredibly cool!
How did you go about choosing a style and color? I'm getting rather interested in these, but don't know what styles and colors are available, and wouldn't want to pick something by accident that would look girly. Or is there pretty much just one model and color that works for both?

By Brian on Monday, December 23, 2024 - 23:12

If you want a pair of universally designed Meta Ray-Bans, then I would recommend going with an all-black pair of Wayfarers or Skylers.
@3AM,
Leave it to a pair of universally designed smart glasses, not intentionally intended for the visually impaired, to guide the visually impaired to the next evolution of independence. 🤭

By 3AM on Monday, December 23, 2024 - 23:12

@Brian
Indeed! The Ray-Ban Metas are perhaps one of the best accidental accessibility masterpieces so far!

They are proving to be more and more useful as AI continues to make huge advancements.

It's hard to estimate the value these glasses add to the blind & low vision community. And it just keeps getting better, quickly...

I will be going to lunch soon at a phenomenal Thai place nearby. I'm anxious to put Live AI through the meal-decision process while evaluating its menu-scanning capabilities.

Stay tuned for my next update later today, I hope.

By Orlando on Monday, December 23, 2024 - 23:12

I am anxiously awaiting this update… While I am waiting, I have a question for those who have received the update: how is the battery life of the glasses? And is there a time limit on using the video function?

By Orlando on Monday, December 23, 2024 - 23:12

One more question… has the OCR functionality improved with the video?

By MR.TheBlind on Monday, December 23, 2024 - 23:12

Hello guys! So, as of today, I am a new user of the Meta Ray-Ban glasses. I am so excited; I got them today as a birthday/early Christmas present, and I am really loving everything they're able to do. My question is, how can I enable this new update to get the live video AI? I have them all set up and everything, but I just want to figure out how to sign up for the early access program and how to enable the video AI. Thank you so much. So excited to play with them.

By 3AM on Monday, December 23, 2024 - 23:12

Today I failed miserably. I did go to lunch, but I was in a group setting that didn't offer me an ideal moment to try Live AI, mostly due to intensely engaging conversations. I still owe you guys my restaurant menu perusing experience. ;)

@Orlando
It's difficult for me to assess the battery life. In my testing so far, it appears to keep the connection for about two to three minutes at best; then I have to give it the "Start Live AI" command again. I can only deduce it will have a measurable impact on battery life, hence why, if at all possible, I keep a second pair at all times. I found my second pair on Facebook Marketplace for $200 in May. Your mileage may vary.

On a side note, if anyone needs prescription lenses for Metas, I highly recommend Lensology.co. They're in the UK but ship quickly worldwide, have excellent quality, and offer unbeatable prices. No, I don't work with them in any way; I've just found them to be excellent for me. And they may just be for you too!

Regarding OCR, I find it can indeed detect and speak text. It's tricky to get it to read exactly what it sees; it always seems to try to give a summary. But with a little prompting, it does eventually read. I wouldn't count on using it to read a book, at least not yet, but I'm confident that's not too far off.

@Mr.TheBlind
Congratulations on your new tool! I'm sure you're finding it to be even more useful than expected.
You first have to request "Early Access" in the MetaView settings tab. Once you enable it, you'll be able to turn on Live AI in the same Early Access menu in the app. It usually happens quickly, but be patient.

There are excellent threads on the Meta glasses on AppleVis. I'm confident you'll soon find how invaluable this piece of tech will become for you.

Honestly, Live AI will prove to be one of those features that set the Metas apart from the competition.

Do share your findings and experiences.

By Gokul on Monday, December 23, 2024 - 23:12

@3AM Does it do that? Like, if you tell it something like "tell me when a person next comes into the field of view" or "tell me when the door opens"?

By Brian on Monday, December 23, 2024 - 23:12

While both would be awesome, I can see the people portion of that being limited due to privacy laws. Still, it would definitely be a nice feature to have. 🙂

By MR.TheBlind on Monday, December 23, 2024 - 23:12

Hello! Thank you so much for replying to my previous comment about me being a new user of the Meta Ray-Bans. I am very excited indeed and have been playing with them since I got them yesterday. There's so much to try. I am actually a blind content creator on social media, so it's huge for me to now own a pair. This is the first time I've been able to experience these smart glasses, and I'm just blown away by all the possibilities they will bring, not only for my daily life but also for my platform. Anyway, without getting too far off topic, I just wanted to ask more about the early access program. I was searching throughout the whole Meta View app to see if I could find the sign-up button, but I couldn't find anything. However, I remembered that at the top of this article there's a link that someone provided. I went through that link, read the article, and it redirected me to another link where I could sign up for the early access program. It asked me for the glasses' serial number, which I was able to find; I put it in, and it said I am on the waitlist. I don't know if I did it right or wrong; that's why I wanted to clarify, because I couldn't find any sign-up in the app itself, so I don't know if I did it wrong. Thanks a lot.

By SSWFTW on Monday, December 23, 2024 - 23:12

Can't wait to try the Live AI feature! I have been using Gemini 2.0 and I think it is so cool. It would be even more amazing if I could have this on my face! Everyone, do update us with what you're trying out; I'd love to hear more about what's possible.

By MelodicFate on Monday, December 23, 2024 - 23:12

So far, from my limited testing (I just got this feature yesterday), I don't believe it's possible to say things like "tell me when x happens", and I know you can't ask it to keep describing what it sees.
You have to keep asking it things, which is fine. It's more conversational, and you don't have to keep saying "Hey, Meta".
I just wanted to let you all know not to expect descriptions simply by turning your head. Keep in mind that these are not made for us, and people with working eyes would very likely not want that particular behavior, so this only makes sense.

By Justin Harris on Monday, December 23, 2024 - 23:12

For continuous description, something like Be My Eyes on the glasses would probably be much better. But still, this Live AI sounds pretty cool.

By Martin on Monday, December 23, 2024 - 23:12

Hello all! Well, I signed up on the site about two days ago. I got no notification that I was accepted to try this feature out; however, I made a post on Facebook in a blind tech group about that, and someone said I should go into the app and look under settings to see if I had an Early Access button available. So I tapped on that and thought I was signing up again, but the person told me I should close the app and then open it back up, and there should be access to the Live AI feature, and there was. It took about five minutes or more to go through the whole process of accepting everything and looking it all over with VoiceOver on, because there was a lot of text to go through. So far, it's cool to have live AI video and to ask whatever I need and get answers promptly. I asked what I was holding in my hand while I was looking for something in my kitchen, and it told me exactly what it was. I asked what my significant other was wearing, and it told me exactly what that was. I asked it about my Christmas tree, and it described it in detail. I went to a local Subway restaurant and asked about the signage and text around me, and it read it. So it's very helpful and useful in daily life, especially for a blind/visually impaired person. I like the independence of using it. Yes, sometimes it loses access and I have to start again, but that's OK. It's technology; it's not going to be perfect all the time, and it will get better. I believe that with the right prompts, it will be very helpful to me and hopefully to anyone who gets access to this. That's the hardest part: trying to figure out which prompts to give the AI to get the correct information. We need a handbook for this! 😎

By Gokul on Monday, December 23, 2024 - 23:12

@Martin Where in settings did you find the Live AI button?

By mr grieves on Monday, December 23, 2024 - 23:12

The above is making me feel both very excited and also quite frustrated and impatient, being from the UK.

So is this actually using a video stream where you ask questions? Or is it just like a look and ask session but without the need to keep prompting it, where really it's just taking photos when it needs to?

I also had a lot of connectivity issues with look and ask, although it's quite possible it is my mobile provider.

At the risk of going off topic, there was a demo on Double Tap the other day where someone went into a Walmart in the US and had Aira describe everything in real time using the glasses. This wasn't AI; this was a human. It seems you can use Aira for free in Walmart stores. I just mention it because the description above of going shopping with the glasses reminded me of that demo.

By Travis Roth on Monday, December 23, 2024 - 23:12

Despite registering on the Meta Early Access page early last week, I still don't have access. Maybe the beta program got full. I loved the shopping experience commentary above; that sounds like something I'd have thought was just science fiction only a year or two ago!

By Portia on Monday, December 23, 2024 - 23:12

@Travis,
I remember that when I first signed up, it took me months before I saw the early access button in the settings.
So just give it some time; I hope you'll see the button soon.