This morning I received an email from the Be My Eyes team announcing that Be My Eyes will start rolling out on Meta smart glasses. While perusing the email, I found a link, which is at the bottom of this post, explaining more about the service and how it will integrate with Meta. I, for one, am super stoked about this!
Read more about it from the link below. Enjoy! 😊
By Brian, 13 November, 2024
Comments
Me too
I got the email too. I did notice that it said it would be rolling out in my region, which suggests it might not be everywhere yet. I am in the UK.
Also it might take a few days to appear.
I think this is absolutely incredible news.
Report
I will give a brief report on my experience once I have updated my Metas and put the Be My Eyes service to use. I can already think of one good use for this: going out and getting into an Uber/Lyft, because as I'm sure some of you know, drivers sometimes don't say anything when they arrive to pick you up. Even if you have informed them beforehand that you are blind or visually impaired and will not be able to visually identify their vehicle, sometimes the drivers will just wave at you when they arrive.
Ah, rideshare. Never a dull moment. 😎
Speculations?
"This first release provides hands-free access to volunteer calls, but our respective engineers will continue to evaluate and develop new capabilities over the coming months."
Like the subject line says, any speculations?
Well...
Well, I would love to see Be My AI added to the service on Meta, but I do not think that will ever happen, since Meta wants us all to use Meta AI.
Maybe
Maybe if Be My AI were using the Llama model, but that would defeat the whole point of its usefulness...
Meta AI
If they would update Meta AI to be more robust, that would be nice too. I mean, Meta AI is not terrible, but I do not know if I would call it great, necessarily, at this stage.
Meta AI
I actually think the AI has become pretty good recently. It's certainly a lot better than when it started. If it keeps going in this direction then I personally don't have a problem with it vs the other models.
I would also love some sort of Be My AI in there. But it might be that all we need is to go to the next version of Llama and have it phrase the questions in a blind-person-friendly way.
I would be amazed if Meta were happy supporting ChatGPT but I'm not sure we necessarily need it.
Re: Meta AI
I will agree that Meta AI has definitely gotten better since it was first launched, especially since now we do not have to say "look and" when giving it commands. Also, while this does not necessarily have to do with Meta AI, the Meta smart glasses overall are getting a lot more functionality now with the addition of the Be My Eyes service. 😆👍
Keep us posted Brian
I'm inclined to order a pair, but would love to hear some first-hand observations.
I love the smart glasses …
I love the smart glasses (Meta, of course) for identifying clothing and objects. For example, we had a strange device in the office tucked in with some video magnifiers. The glasses immediately identified it as a UbiDuo, a device for deaf communication.
Yesterday I sorted through a pile of t-shirts and Meta AI easily described the pictures on each shirt and the writing. I also used it to identify food packaging and give me general scene descriptions. For example, at our local library in the children's room where I was doing pet therapy, it told me the carpet had a colorful world map stitched into it.
Where the glasses don't work as well as Be My AI is in reading signs. On the campus where I work we have many buildings, separated by wide asphalt plazas. I walked around using Be My AI, pointing at buildings and having it read me signs, so if I was lost, I would know where I was, just as sighted folks do by looking at signage. (All the buildings are gray and look similar, so without signs sighted folks get lost too!)
Anyway, Be My AI had excellent scene descriptions and automatically read signs. The Meta glasses could not do that. I'd have to query them repeatedly and only got lucky sometimes.
I also have trouble finding a bus stop in the middle of a parking lot; it's not on a sidewalk. Meta AI could tell me where it was, but could not give me enough detail to locate it. Be My AI could.
I hate bugging volunteers when Be My AI does a good enough job, but I also hate trying to hold my phone, hit the Take Picture button, and ensure it is positioned properly to identify what I want. I too wish Be My AI was incorporated into the glasses.
One workaround the Be My Eyes folks could implement is a hands-free way of using Be My AI. So I could say "Hey Siri, use Be My AI" and it would take a picture. At least then I'd have one hand available to hold an object while I held the phone with the other hand, or had the phone on a lanyard.
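For anyone curious what that could look like under the hood, Apple's App Intents framework is the piece that lets an app hand a phrase like that to Siri. Below is a minimal sketch, purely hypothetical: the DescriptionService type and its methods are invented for illustration, and nothing like this exists in the shipping Be My Eyes app; only the App Intents plumbing is real Apple API.

```swift
import AppIntents

// Hypothetical sketch of an intent Be My Eyes *could* expose so that
// "Hey Siri, use Be My AI" snaps a photo hands-free and speaks the result.
// DescriptionService is made up for illustration; it is not a real API.
struct DescribeSurroundingsIntent: AppIntent {
    static var title: LocalizedStringResource = "Use Be My AI"
    // The app would likely need to come to the foreground to use the camera.
    static var openAppWhenRun: Bool = true

    func perform() async throws -> some IntentResult & ProvidesDialog {
        let photo = try await DescriptionService.shared.captureFromRearCamera()
        let description = try await DescriptionService.shared.describe(photo)
        // Siri (or VoiceOver) reads the description back, no hands required.
        return .result(dialog: "\(description)")
    }
}
```

If something like that were registered, the same phrase could also be wired into a Shortcuts automation, which would get around the hold-the-phone-and-tap problem entirely.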
Another tip I've used with both the glasses and Be My AI is to have the object I want identified positioned on a piece of black construction paper or a black scarf on an empty surface. But it still involves holding my phone with one hand and tapping "take picture" with the other hand if I'm not using the glasses.
Thanks Deborah
The issue with not reading signs is as frustrating as it is baffling. We know AI can read signs. That this doesn't work flawlessly in the Meta Ray-Bans is a total head-scratcher.
Furthermore, connecting to a human BME volunteer can take some time. It's usually not an issue, but if you're waiting 15 seconds for a volunteer to read a sign for you, then walking 50 meters, then waiting for another volunteer to read the next sign, it can be very tedious.
As you said, you don't need a human volunteer for this. Be My AI can handle it. But the requirement to point a camera, snap a photo, and wait for processing is a deal killer.
One use I have for the Meta Ray-Ban is to help me with indoor navigation at conferences, to help me find workshop and session rooms. For that, I need a decent sign reading solution. It sounds like I will still be waiting.
Other guesses
So if BME are listening, a couple of other things - proper PCR and integration with Voice Vista or similar please.
Bit disappointing if signs are still not working properly. First thing I did when I got the AI was to try it out on a walking route and it totally failed to read any of the signs at all. I was hoping it might have got better by now but I keep forgetting to try. Of course the other problem is knowing where the signs are.
@paul - maybe a long shot but have you looked at Good Maps? Shaun from Double Tap has been talking it up recently. I think it does require that a building be marked up, but some conferences possibly might be.
AI versus AI
It is strange how Be My AI can do signs so well, while Meta AI struggles with this. You can sometimes get it to work, but it often requires multiple prompts. Who knows, maybe in a future update Meta AI will catch up.
Indoor nav
Hello Mr. Grieves. I heard the Double Tap coverage of Good Maps. I'm glad people are working on indoor nav. But we've had indoor nav solutions for years that have required instrumenting the physical location, and these systems never attain widespread acceptance. And imagine if three or four do catch on, and we blindies need to familiarize ourselves with the idiosyncrasies of three or four indoor nav UIs depending on which building we've entered. Not good enough, in my opinion.
If a human can navigate a complex environment by looking at overhead and/or wall-mounted signage, then certainly an AI should be able to do the same. If the AI has access to building floor plans, it should be able to do an even better job. I could even imagine someone walking through a facility, taking photos, and training an AI on locations simply based on the camera image, much like we train AIs what a cat looks like, or an elephant.
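That last idea is not far-fetched; it is essentially ordinary image classification. As a rough illustration only (not anything Good Maps or Be My Eyes actually does), Apple's Create ML on a Mac can train a classifier from folders of walk-through photos, where each folder name is a location label. All paths below are placeholders.

```swift
import CreateML
import Foundation

// Hypothetical sketch: train a "where am I?" image classifier from walk-through
// photos. Each subfolder of trainingDir is a location label, e.g.
// "Library entrance", "Building 3 lobby".
let trainingDir = URL(fileURLWithPath: "/Users/me/CampusPhotos/train")
let classifier = try MLImageClassifier(trainingData: .labeledDirectories(at: trainingDir))

// Check it against photos taken on a different day before trusting it.
let testDir = URL(fileURLWithPath: "/Users/me/CampusPhotos/test")
let metrics = classifier.evaluation(on: .labeledDirectories(at: testDir))
print("Error rate:", metrics.classificationError)

// Export for use in an iOS app via Core ML.
try classifier.write(to: URL(fileURLWithPath: "/Users/me/WhereAmI.mlmodel"))
```

Recognition from a single camera frame is obviously less precise than a fully mapped-out beacon system, but it needs nothing installed in the building, which is the appeal.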
Sorry. Didn't mean to derail this conversation.
Re: indoor navigation
Whilst I agree that manually mapping out a building feels a bit backwards in this day and age, using signs does require that the place you are after is well signposted. Whereas something like Good Maps presumably doesn't require you to find a sign, and will lead you right to the door rather than the general area. So in theory this sort of thing should be better than what a sighted person has. Just because you can see the signs doesn't mean you can find them and that they make sense, as I've found out over the years.
What I don't want to do really is to have to hold my phone out as I'm walking about a busy place. I suppose there's always a lanyard but glasses like these feel like the perfect solution. (I believe Good Maps still requires some sort of camera use?) So, if Be My Eyes and Meta could somehow sort all this out that would be great, thanks! :)
I'm not sure how Glide handles indoor spaces; that would be interesting to know.
Sign posts
I agree with the above post. Proper signage and signposting are a must, especially for us blind folks. I am not sure what it's like in the UK and surrounding areas, but here in the US, a telephone pole can be an actual bus stop. No joke, there can actually be a bus stop sign mounted to a telephone pole.
Imagine it. You're trying to find that bus stop because you really need to get somewhere. Maybe you need to get home. Maybe you need to get to the pharmacy. And you ask somebody and they say something like, "Sure, it's about 5 feet behind you, and will be on your left." And so you turn around, with your cane, or maybe your guide dog, and you go back-and-forth and back-and-forth, only to find out that the telephone pole you've been walking past for the last 45 minutes is, in fact, your bus stop.
Welcome to America.
Grrr.
Navigation integration
I just woke up and haven't finished my first coffee yet. So, while I'm in dreamland, I'll muse that what we really want is some kind of navigation solution that seamlessly integrates information from a variety of sources to determine our location: the view from a head-mounted camera, audio cues, signals from GPS, RFID, or Bluetooth beacons, map data (not just roads but building interiors), text on any available signs, publicly available geopositioned images, and even anonymized location data from the navigation systems of other nearby people, cars, or other entities. Such a system would track location as you walk from outdoors to indoors, step aboard an airplane, or take a boat across a lake.
And it's not that different from how humans track their location. They look around, yes, but they use lots of other cues besides eyesight. So give me a nav system that tracks location the same way people track location.
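In spirit, that kind of fusion can be surprisingly simple: every cue contributes a position guess along with an uncertainty, and the guesses are blended so the most precise cue dominates. A toy sketch, with all the sources and numbers invented for illustration:

```swift
import Foundation

// Toy inverse-variance fusion: each source (GPS, a Bluetooth beacon, a sign the
// camera just read) reports a position guess plus how uncertain it is, and the
// fused estimate leans on whichever cues are most precise.
struct PositionEstimate {
    let x: Double           // metres east of a reference point
    let y: Double           // metres north of a reference point
    let uncertainty: Double // one-sigma error, in metres
}

func fuse(_ estimates: [PositionEstimate]) -> PositionEstimate? {
    guard !estimates.isEmpty else { return nil }
    var sumX = 0.0, sumY = 0.0, totalWeight = 0.0
    for e in estimates {
        let w = 1.0 / (e.uncertainty * e.uncertainty) // precise cues weigh more
        sumX += e.x * w
        sumY += e.y * w
        totalWeight += w
    }
    return PositionEstimate(x: sumX / totalWeight,
                            y: sumY / totalWeight,
                            uncertainty: (1.0 / totalWeight).squareRoot())
}

// Example: GPS is vague indoors, but a beacon and a freshly read sign pin it down.
let fused = fuse([
    PositionEstimate(x: 12.0, y: 40.0, uncertainty: 8.0), // GPS, degraded indoors
    PositionEstimate(x: 18.0, y: 37.0, uncertainty: 2.0), // Bluetooth beacon
    PositionEstimate(x: 17.0, y: 35.5, uncertainty: 1.0), // sign just read by the camera
])
if let fused { print(fused) }
```

Real systems (Kalman filters and the like) also track how the estimate evolves as you move, but the basic idea, weighting each cue by how much you trust it, is the same.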
Okay time to wake up. The coffee smells good.
All of that…
And a fresh cup of coffee, too. All day, every day. ☕️
Maybe if BME did use LLAMA…
Maybe if BME did use Llama, it could be better prompted to know a blind person is asking the question, with an instruction to read all signage, documents, etc. At the moment it is treating each request the same way it would for a sighted person. Maybe, at some stage, Llama will get memory so we have greater control over the format of responses.
On a side note, has anyone got Audible on their glasses yet? It was announced at the October event, but I'm not seeing it in my optional app linkage.
No Audible, but…
I have been listening to both Audible and Kindle books while using the Meta smart glasses as a Bluetooth headset. Not what you're asking, I know, I know, but there it is. Also, I still do not have the Be My Eyes service on my Metas yet.
The last update I received for Meta View had quite a lot of new features. I can't remember all of them off the top of my head, but some of the new stuff included sending video using your voice, sending Meta AI results to contacts, having Meta AI tell you whatever song you're listening to, and such. Oh, and I guess you can also create reminders with Meta AI and even search through reminders using Meta AI, and reportedly you can also have Meta AI call a number that was read aloud to you off of signage.
Don't know if that is old news to the rest of you, but just trying to remember what the latest update was on my end. 🤷🏽‍♂️
Just got access
I just checked and now have access to BeMyEyes in the MetaView app. It wasn't there last night at like 11, and my app hasn't updated since maybe Wednesday. So it's clearly something they're rolling out under the hood. Going to give it a test call sometime later today.
I got it straight away but,…
I got it straight away but, as I was on the beta, I had to delete both be my eyes and meta view betas before I installed the public versions. I'm not certain this unlocked it, but it may have.
It's probably something I'm a little too shy to use myself, but it's great to know I have it as a backup should I need help in a pinch.
BMAI and LLAMA
If the BME team reaches an agreement to integrate BMAI into the Meta glasses today, it will be three steps up from what we have with the Meta glasses. For one, for whatever reason, whatever prompt they're using with GPT is way better than any other similar app. And second, the way I understand it, BMAI does seem to have access to facilities in GPT which a regular image description with the GPT app doesn't give. For example, if GPT thinks an image uploaded through the regular app contains something explicit, it is likely going to refuse to describe it. But the same doesn't happen with BMAI, and that's how it should be.
Currently, Meta AI refuses to describe people, which should not be the case with any image description service for the blind...
Using Meta Ray-Bans with Apple AirPods Pro 2
As of the past couple of weeks, I am a happy AirPods Pro 2 user. They act as hearing aids. If there is a compatibility issue between the AirPods and any other device, I'm sticking with the AirPods.
I know the Ray-Bans are bone-conducting, so I assume they are physically compatible. The AirPods go in my ears, the Ray-Ban temple pieces rest over my ears.
It's the Bluetooth connection that has me worried. I know for a fact that if my phone is using any other type of Bluetooth audio, as soon as I put in my AirPods, audio switches to the AirPods.
Will I still be able to use the Meta Ray-Bans if they aren't connected to my phone by Bluetooth? Do I lose all functionality (it's one or the other, Ray-Bans or AirPods), or only lose some functionality? Or would the Ray-Bans work just fine in this case because they would simply route their audio through my AirPods?
Re: air pods pro vs meta
Standalone, you can use the Meta Ray-Bans to take photos and videos, I believe, but nothing else. All the AI goodness comes from the Meta View app on your phone. So I think without Bluetooth the glasses aren't going to do much.
Also I don't think the Metas are bone conduction. I believe they just fire the audio at your ears from the arms of the glasses.
Someone more knowledgeable than me may know better of course.
New features
Just to continue Brian's tangent on the new features... you can also scan a QR code with the glasses and it sends the link to the app, and you can point at a phone number and ask the glasses to call it. Also, it will announce the name of an incoming caller.
The reminders thing sounds awesome, e.g. take a photo of your hotel room door and ask Meta to remember it for you.
Not all features in all regions.
I do wonder if being in the UK but on a US VPN means that the BME feature may not be unlocked in the same way. I don't think it's available for me yet unless I'm looking in the wrong place.
You can use both the meta…
You can use both the Meta Ray-Bans and AirPods Pro 2; however, you will have to have the AirPods Pro in transparency mode, as all the Meta interaction remains on the glasses. It would be great if it could pipe directly into the AirPods, but it doesn't.
A quick tip: if you go into Settings > Bluetooth and long press your Metas, you can change the type of output they are. So, instead of audio from your phone coming through your Ray-Bans with the associated lag, you can switch the Ray-Bans to speaker, and VoiceOver will come from the phone instead. Then, as above, you can use AirPods when you are out and about, if you like.
It's an option at least.
Just to clarify
No, the Metas aren't exactly bone conduction. They just release the audio close enough to your ear that you can hear it without having to use an in-ear earpiece.
Ear compatibility
Okay, thanks for the clarification regarding bone conduction. I had that wrong. But it sounds like the takeaway is that the two devices are ear-compatible. Using AirPods doesn't physically prevent the use of the Ray-Bans.
The more important question is Bluetooth. How useful will the Ray-Bans be if the iPhone only connects to one Bluetooth headset, and it's the AirPods? It sounds like the Ray-Bans might have some use, but not full functionality.
If I put the AirPods in, which breaks the Ray-Ban Bluetooth connection, then take the AirPods out and stow them, how easy is it to restore the Ray-Ban Bluetooth connection?
Connectivity
The iPhone does connect to multiple Bluetooth devices at the same time, but I haven't personally tried connecting the Meta glasses along with any other device, so I cannot say how much of the functionality will be available...
Has any Android user who wasn't a beta tester gotten access yet?
I know several iOS users who got access and are using it, but none of the Android users I know have gotten access yet. FYI, they are all running the latest versions of BME and Meta View, according to their Play Store, as well as the latest firmware available for their glasses.
@paul
I just read this comment on Mastodon:
David Woodbridge (@Woody@dragonscave.space): "By accident found that I can still use my Airpods on my iphone for VoiceOver, Siri etc, plus keep using the Ray Ban Meta Glasses at the same time. Assuming MetaView is controlling audio to the glasses separate to OS Bt connection to the Airpods."
I had a try with my AirPods 3 yesterday and thought I was experiencing this too, but I got a little confused about where the sound was coming from. I did try a second time, and once the glasses took over I couldn't get it to revert back to the AirPods. I'll try to find some time to play about with it some more when my brain isn't feeling quite so frazzled.
Does Meta Smart Glasses require a subscription?
Nice to have AI features like Be My Eyes and others on the Meta glasses.
Question: I know Be My Eyes is free to use, but if you are using other AI features of the Meta glasses, does that require a subscription?
--Pete
Re: Subscriptions
Meta smart glasses do not require a subscription by themselves. You just pay for the glasses and have access to the applications that are part of the Meta View software that you download to your smart device (Android or iPhone). Having said that, some of the third-party software that has been integrated into Meta View does require a subscription; for example, you will need a subscription to Apple Music if you want to have Apple Music playing through your Meta smart glasses.
HTH
Clarification
Just wanted to clarify that, while you can use the smart glasses as a Bluetooth headset and listen to anything through them via your smart phone, apps that pair with the Meta View software typically require a subscription to work properly.
Trying them
I'm planning to try these Thursday at LensCrafters, a large retail chain that sells them here in the US. I will at least pick out style and size.
I'd love to give them a real trial run before I buy. Do you think they would let me set up their floor model with my phone? What kind of things do you wish you'd tried before you bought? Thanks.
Meta View software
...and what features are available using the Meta View Software? From this thread it sounds like you can query AI for descriptions of what is around you, have it OCR text, etc. Are those services free?
Also, it sounds like the device pairs with your phone, if I understand this right. Thus, I assume you use the glasses as the camera for your phone and get speech feedback from your phone sent to the glasses? Is that right?
--Pete
RE: Trying them
Hey Paul,
If you are going to LensCrafters, be sure to check out this thread.
RE: Meta View software
Edited for typos ... 😳
Hey Peter,
The Meta View application has a bunch of stuff built into it. Firstly, yes, the glasses pair with your smart phone via Bluetooth. This is required if you want them to work at all. You can use the camera that is built into the smart glasses for taking pictures as well as short videos. There is storage built into the smart glasses, but you are able to transfer your photos and videos to the Photos app on your smart phone. Also, if you are connected to either Facebook or Instagram, or both, you can upload pictures directly to those social media platforms. You can also live stream to your platforms with the smart glasses.
There are some "skills," for lack of a better descriptor, similar to the Alexa app on your smart phone, that allow you to pair services such as Amazon Music, Apple Music, Spotify, and Calm. Here is a very brief description of Calm: "Listen to guided meditations, music, inspiring stories and more."
I am unsure about Calm, but if you want Amazon, Apple, or Spotify music to be paired with your device so that you can use it with a simple tap on the touch sensor (located on the right arm, more on that later), it helps to have a subscription to one of those services. Of course, as I mentioned above, you can play anything off of your smart phone and just use your smart glasses as a glorified Bluetooth headset. In this way, you can listen to Netflix, YouTube, AppleVis podcasts, etc.
Regarding Meta AI, this is included with your purchase, no subscription required at this time. You do have to enable it though through the Meta View application. In fact, there are a lot of things you have to, "turn on", before you can really start taking full advantage of the smart glasses.
Below is a list of some of the services and features you can do with Meta Smart glasses:
• Take pictures or record up to three minutes of video and optionally upload them to Facebook or Instagram.
• Make phone calls through Meta AI, either via connectivity to your phone app, or using WhatsApp. Optionally with WhatsApp you can also do video calls, where people can see through your camera on your smart glasses.
• Text friends using Meta AI.
• Use Meta AI to read signage, documents, etc. With one of the more recent updates, Meta AI can even call a phone number from any document.
• Meta AI can tell you who is calling you, or texting you. It can also create, and search through, reminders.
• Meta AI can also tell you things like the weather, your location, random facts about things, like how tall is the Eiffel Tower, what is the largest tree in the world, etc., etc.
• You can also ask it to describe what you're looking at, meaning it will take a picture of wherever you happen to be facing at that moment and describe what's in the photo. In my experience, it's not as detailed as Be My Eyes, but it's not terrible either.
• Meta AI can now give you info about a particular song you are listening to at any given time. Kind of similar to Shazam.
• Use Meta AI to send pictures and videos to people with your voice.
• Meta AI can also be used to send its results to people with your voice. For example, if someone asked you, "Hey, how much does the Moon weigh?", you can send Meta AI's answer to that person, as a text, via your voice.
I mentioned above the touch sensor. On the right arm is a multi-touch sensor for doing things like adjusting volume, controlling playback of media, answering/rejecting calls, using Meta AI without saying, "Hey Meta ...", and as a quick play shortcut for services such as Apple Music.
I'm sure I have neglected to mention a few features, but this should at least give you an idea of the device's capabilities. Also, considering that it only requires Bluetooth for connectivity, and after the initial upfront cost there is no other cost for use of the device or services, it is (in my opinion) the best smart glasses option we have at this point in time.
I hope you find the info useful. 🙂
Song info
I cannot for the life of me get this Shazam kind of thing to work. Any prompt for that?
Re: Song info
Hey Gokul,
Try something like, "Hey Meta, what am I listening to?" I struggle with this also, but I have a friend who also has Metas, and she has no problem getting it to work.
Go figure ...
Re: Shazam
I thought the feature to tell you what was playing was limited to the music playing through the glasses, as opposed to being a general Shazam thing that can tell you about whatever music is playing in the background. And in that case it probably knows what is playing without having to figure it out by listening.
I might well be wrong, though - I think this is just what I presumed from reading the release notes for the last upgrade.
And, apologies for going on-topic for a moment :) but the accessibility option turned up for me yesterday. As mentioned before I don't think this coincides with an app update, but I did need to close and reopen the app for it to appear.
Music recognition and Accessibility
You may be right about the music recognition. Perhaps it needs to be playing directly from the smart glasses, in order to work properly. I am really not sure. My friend claims she can get it to work, and I can never get it to work. Go figure.
What accessibility thing are you referring to by the way? Are you referring to Be My Eyes?
Re: accessibility
When Be My Eyes is available, you configure it through the Meta View settings. A new category, "Accessibility," appears alongside the others (under Communication, I think).
I may be wrong about the music recognition thing. The release notes appear, I update, and then I can't figure out how to get back to them to check. Also, a lot of the new features were region locked, if that makes a difference? Anyway, it didn't work like Shazam when I was just listening to music in the house the other day.
Be My Accessibility
Oh that is interesting about Be My Eyes, and the accessibility category. Maybe in time other apps or Services will populate that as well? That would be cool.
As for the music recognition, yeah, I cannot get it to work either. I listen to Spotify in my house all the time, and when I'm not listening to Spotify, I am listening to iHeartRadio. My Meta glasses always say something like, "Sorry, I cannot recognize this," or, "To recognize music, you can use services like Shazam or Google Assistant."
I have decided that my Meta glasses are just stubborn. Lol
Re: Meta vs Air Pods
OK, I had another go with this and it does seem to work. I connected my AirPods 3 to my phone and VoiceOver came out of them. I then put on my Meta Ray-Bans and could speak to the AI there. At the same time, VoiceOver continued to come out of the AirPods. I could also talk to Siri and she would come through the AirPods too. It worked surprisingly well.
I can't test with things like transparency mode or using them as hearing aids but I'm more confident that they might work now than I was before. How well they work over time I also don't know.
I'm hoping to get some AirPods Pro 2 at some point, but it won't be for a while. I'm sure someone else on here must have both and can be more helpful.
Thanks
"Meta versus AirPods"--I like it. Could be the name of the latest Kaiju movie. I'm looking forward to visiting lenscrafters and trying out the Meta glasses in a couple of days. I'm installing the Metaview app, hoping I'll be able to pair their demo model.
When will Be My Eyes with GPT-4o be released?
I watched a video on YouTube talking about how we can have a live conversation with GPT-4o using Be My Eyes...
Does anyone know when this feature will be released?
https://www.youtube.com/watch?v=KwNUJ69RbwY
I question that video
This video was from the demonstration that Andy Lane did for OpenAI to introduce features for GPT-4o that we never got ...