Today Be My Eyes launches on Meta smart glasses

By Brian, 13 November, 2024


This morning I received an email from the Be My Eyes team announcing that Be My Eyes will start rolling out on Meta smart glasses. While perusing the email, I found a link, included at the bottom of this post, explaining more about the service and how it will integrate with Meta. I, for one, am super stoked about this!
Read more about it at the link below. Enjoy! 😊

https://www.bemyeyes.com/ray-ban-meta?ct=YTo1OntzOjY6InNvdXJjZSI7YToyOntpOjA7czoxNDoiY2FtcGFpZ24uZXZlbnQiO2k6MTtpOjEyO31zOjU6ImVtYWlsIjtpOjI3O3M6NDoic3RhdCI7czoyMjoiNjczNGM5M2M5OTZkODI5MTg0MzI2MyI7czo0OiJsZWFkIjtzOjc6Ijc1MjQ0MDYiO3M6NzoiY2hhbm5lbCI7YToxOntzOjU6ImVtYWlsIjtpOjI3O319


Comments

By mr grieves on Monday, November 18, 2024 - 17:11

I got the email too. I did notice that it said it would be rolling out in my region which suggests it might not be everywhere yet. I am in the UK.

Also it might take a few days to appear.

I think this is absolutely incredible news.

By Brian on Monday, November 18, 2024 - 17:11

I will give a brief report on my experience once I have updated my Metas and put the Be My Eyes service to use. I can already think of one good use for this: going out and getting into an Uber/Lyft. As some of you know, I'm sure, drivers sometimes don't say anything when they arrive to pick you up, even if you have informed them beforehand that you are blind or visually impaired and will not be able to visually identify their vehicle. Even then, sometimes the drivers will just wave at you when they arrive.
Ah, rideshare. Never a dull moment. 😎

By Gokul on Monday, November 18, 2024 - 17:11

"This first release provides hands-free access to volunteer calls, but our respective engineers will continue to evaluate and develop new capabilities over the coming months."
Like the subject line says, any speculations?

By Brian on Monday, November 18, 2024 - 17:11

Well I would love to see Be My AI added to the service on Meta, but I do not think that will ever happen, since Meta wants us all to use Meta AI.

By Gokul on Monday, November 18, 2024 - 17:11

Maybe if Be My AI were using the Llama model, but that takes away the whole point of its usefulness...

By Brian on Monday, November 18, 2024 - 17:11

If they would update Meta AI to be more robust, that would be nice too. I mean, Meta AI is not terrible, but I do not know if I would call it great, necessarily, at this stage.

By mr grieves on Monday, November 18, 2024 - 17:11

I actually think the AI has become pretty good recently. It's certainly a lot better than when it started. If it keeps going in this direction then I personally don't have a problem with it vs the other models.

I would also love some sort of Be My AI in there. But it might be that all we need is to go to the next version of Llama and have it phrase the questions in a blind-friendly way.

I would be amazed if Meta were happy supporting ChatGPT but I'm not sure we necessarily need it.

By Brian on Monday, November 18, 2024 - 17:11

I will agree that Meta AI has definitely gotten better since it was first launched, especially since we no longer have to say "look and" when giving it commands. Also, while this does not necessarily have to do with Meta AI, the Meta smart glasses overall are getting a lot more functionality now with the addition of the Be My Eyes service. 😆👍

By PaulMartz on Monday, November 18, 2024 - 17:11

I'm inclined to order a pair, but would love to hear some first-hand observations.

By Deborah Armstrong on Monday, November 18, 2024 - 17:11

I love the smart glasses (Meta, of course) for identifying clothing and objects. For example, we had a strange device in the office tucked in with some video magnifiers. The glasses immediately identified it as a UbiDuo, a device for deaf communication.
Yesterday I sorted through a pile of t-shirts, and Meta AI easily described the pictures on each shirt and the writing. I also used it to identify food packaging and give me general scene descriptions. For example, at our local library in the children's room, where I was doing pet therapy, it told me the carpet had a colorful world map stitched into it.
Where the glasses don't work as well as Be My AI is in reading signs. On the campus where I work we have many buildings, separated by wide asphalt plazas. I walked around using Be My AI, pointing at buildings and having it read me signs. So if I was lost, I would know where I was, just as sighted folks do by looking at signage. (All the buildings are gray and look similar, so without signs sighted folks get lost too!)
Anyway, Be My AI had excellent scene descriptions and automatically read signs. The Meta glasses could not do that. I'd have to query them repeatedly and only got lucky sometimes.
I also have trouble finding a bus stop in the middle of a parking lot; it's not on a sidewalk. Meta AI could tell me where it was, but could not give me enough detail to locate it. Be My AI could.
I hate bugging volunteers when Be My AI does a good enough job, but I also hate trying to hold my phone, hit the Take Picture button, and ensure it is positioned properly to identify what I want. I too wish Be My AI were incorporated into the glasses.
One workaround the Be My Eyes folks could implement is a hands-free way of using Be My AI, so I could say "Hey Siri, use Be My AI" and it would take a picture. At least then I'd have one hand available to hold an object while I held the phone with the other, or had the phone on a lanyard.
Another tip I've used with both the glasses and Be My AI is to position the object I want identified on a piece of black construction paper or a black scarf on an empty surface. But it still involves holding my phone with one hand and tapping "take picture" with the other hand if I'm not using the glasses.

By PaulMartz on Monday, November 18, 2024 - 17:11

The issue with not reading signs is as frustrating as it is baffling. We know AI can read signs. That this doesn't work flawlessly in the Meta Ray-Bans is a total head-scratcher.

Furthermore, connecting to a human BME volunteer can take some time. It's usually not an issue, but if you're waiting 15 seconds for a volunteer to read a sign for you, then walking 50 meters, then waiting for another volunteer to read the next sign, it can be very tedious.

As you said, you don't need a human volunteer for this. Be My AI can handle it. But the requirement to point a camera, snap a photo, and wait for processing is a deal killer.

One use I have for the Meta Ray-Bans is indoor navigation at conferences, to help me find workshop and session rooms. For that, I need a decent sign-reading solution. It sounds like I will still be waiting.

By mr grieves on Monday, November 18, 2024 - 17:11

So if BME are listening, a couple of other things: proper OCR and integration with Voice Vista or similar, please.

Bit disappointing if signs are still not working properly. First thing I did when I got the AI was to try it out on a walking route and it totally failed to read any of the signs at all. I was hoping it might have got better by now but I keep forgetting to try. Of course the other problem is knowing where the signs are.

@paul - maybe a long shot but have you looked at Good Maps? Shaun from Double Tap has been talking it up recently. I think it does require that a building be marked up, but some conferences possibly might be.

By Brian on Monday, November 18, 2024 - 17:11

It is strange how Be My AI can do signs so well while Meta AI struggles with this. You can sometimes get it to work, but it often requires multiple prompts. Who knows, maybe in a future update Meta AI will catch up.

By PaulMartz on Monday, November 18, 2024 - 17:11

Hello Mr. Grieves. I heard the Double Tap coverage of Good Maps. I'm glad people are working on indoor nav. But we've had indoor nav solutions for years that require instrumenting the physical location, and these systems never attain widespread acceptance. And imagine if three or four do catch on, and we blindies need to familiarize ourselves with the idiosyncrasies of three or four indoor nav UIs depending on which building we've entered. Not good enough, in my opinion.

If a human can navigate a complex environment by looking at overhead and/or wall-mounted signage, then certainly an AI should be able to do the same. If the AI has access to building floor plans, it should be able to do an even better job. I could even imagine someone walking through a facility, taking photos, and training an AI on locations simply based on the camera image, much like we train AIs on what a cat looks like, or an elephant.

Sorry. Didn't mean to derail this conversation.

By mr grieves on Monday, November 18, 2024 - 17:11

Whilst I agree that manually mapping out a building feels a bit backwards in this day and age, using signs does require that the place you are after is well signposted, whereas something like Good Maps presumably doesn't require you to find a sign and will lead you right to the door rather than the general area. So in theory this sort of thing should be better than what a sighted person has. Just because you can see the signs doesn't mean you can find them or that they make sense, as I've found out over the years.

What I really don't want to do is have to hold my phone out as I'm walking about a busy place. I suppose there's always a lanyard, but glasses like these feel like the perfect solution. (I believe Good Maps still requires some sort of camera use?) So, if Be My Eyes and Meta could somehow sort all this out, that would be great, thanks! :)

I'm not sure how Glide handles indoor spaces; that would be interesting to know.

By Brian on Monday, November 18, 2024 - 17:11

Agree with the above post. Proper signage and signposting are a must, especially for us blind folks. I am not sure what it's like in the UK and surrounding areas, but here in the US, a telephone pole can be an actual bus stop. No joke, there can actually be a bus stop sign mounted to a telephone pole.
Imagine it. You're trying to find that bus stop because you really need to get somewhere. Maybe you need to get home. Maybe you need to get to the pharmacy. And you ask somebody and they say something like, "Sure, it's about 5 feet behind you, and will be on your left." And so you turn around, with your cane, or maybe your guide dog, and you go back and forth and back and forth, only to find out that that telephone pole you've been walking past for the last 45 minutes is, in fact, your bus stop.
Welcome to America.

Grrr.

By PaulMartz on Monday, November 18, 2024 - 17:11

I just woke up and haven't finished my first coffee yet. So, while I'm in dreamland, I'll muse that what we really want is some kind of navigation solution that seamlessly integrates information from a variety of sources to determine our location: the view from a head-mounted camera, audio cues, signals from GPS, RFID, or Bluetooth beacons, map data (not just roads but building interiors), text on any available signs, publicly available geopositioned images, and even anonymized location data from the navigation systems of other nearby people, cars, or other entities. Such a system would track location as you walk from outdoors to indoors, step aboard an airplane, or take a boat across a lake.

And it's not that different from how humans track their location. They look around, yes, but they use lots of other cues besides eyesight. So give me a nav system that tracks location the same way people track location.

Okay time to wake up. The coffee smells good.

By Brian on Monday, November 18, 2024 - 17:11

And a fresh cup of coffee, too. All day, every day. ☕️

By Ollie on Monday, November 18, 2024 - 17:11

Maybe if BME did use Llama, it could be better prompted to know a blind person is asking the question, with an instruction to read all signage, documents, etc. At the moment it is treating each request the same way it would for a sighted person. Maybe, at some stage, Llama will get memory so we have greater control over the format of responses.

On a side note, has anyone got Audible on their glasses yet? It was announced at the October event, but I'm not seeing it in my optional app linkage.

By Brian on Monday, November 18, 2024 - 17:11

I have been listening to both Audible and Kindle books while using the Meta smart glasses as a Bluetooth headset. Not what you're asking, I know, I know, but there it is. Also, I still do not have the Be My Eyes service on my Metas yet.
The last update I received for Meta View had quite a lot of new features. I can't remember all of them off the top of my head, but some of the new stuff included sending video using your voice, sending Meta AI results to contacts, having Meta AI tell you whatever song you're listening to, and such. Oh, and I guess you can also set reminders with Meta AI and even search through reminders using Meta AI, and reportedly you can also have Meta AI call a number that was read aloud to you off of signage.

Don't know if that is old news to the rest of you, but just trying to remember what the latest update was on my end. 🤷🏽‍♂️

By Brooke on Monday, November 18, 2024 - 17:11

I just checked and now have access to Be My Eyes in the Meta View app. It wasn't there last night at like 11, and my app hasn't updated since maybe Wednesday, so it's clearly something they're rolling out under the hood. Going to give it a test call sometime later today.

By Ollie on Monday, November 18, 2024 - 17:11

I got it straight away, but as I was on the beta, I had to delete both the Be My Eyes and Meta View betas before I installed the public versions. I'm not certain this unlocked it, but it may have.

It's probably something I'm a little too shy to use myself, but it's great to know I have it as a backup should I need help in a pinch.

By Gokul on Monday, November 18, 2024 - 17:11

If the BME team reached an agreement to integrate Be My AI into the Meta glasses today, it would be three steps up from what we have with the glasses now. For one, for whatever reason, whatever prompt they're using with GPT is way better than any other similar app's. And second, the way I understand it, Be My AI does seem to have access to capabilities in GPT that a regular image description with the GPT app doesn't give. For example, if GPT thinks an image uploaded through the regular app contains something explicit, it is likely going to refuse to describe it. But the same doesn't happen with Be My AI, and that's how it should be.
Currently, Meta AI refuses to describe people, which should not be the case with any image description service for the blind...

By PaulMartz on Monday, November 18, 2024 - 17:11

As of the past couple of weeks, I am a happy AirPods Pro 2 user. They act as hearing aids. If there is a compatibility issue between the AirPods and any other device, I'm sticking with the AirPods.

I know the Ray-Bans are bone-conducting, so I assume they are physically compatible. The AirPods go in my ears, the Ray-Ban temple pieces rest over my ears.

It's the Bluetooth connection that has me worried. I know for a fact that if my phone is using any other type of Bluetooth audio, as soon as I put in my AirPods, audio switches to the AirPods.

Will I still be able to use the Meta Ray-Bans if they aren't connected to my phone by Bluetooth? Do I lose all functionality (it's one or the other, Ray-Bans or AirPods), or only lose some functionality? Or would the Ray-Bans work just fine in this case because they would simply route their audio through my AirPods?

By mr grieves on Monday, November 18, 2024 - 17:11

Standalone, you can use the Meta Ray-Bans to take photos and videos, I believe, but nothing else. All the AI goodness comes from the Meta View app on your phone, so I think without Bluetooth the glasses aren't going to do much.

Also I don't think the Metas are bone conduction. I believe they just fire the audio at your ears from the arms of the glasses.

Someone more knowledgeable than me may know better of course.

By mr grieves on Monday, November 18, 2024 - 17:11

Just to continue Brian's tangent about the new features... you can also scan a QR code with the glasses and it sends the link to the app, and you can point at a phone number and ask the glasses to call it. Also, it will announce the name of an incoming caller.

The reminders thing sounds awesome. E.g. take a photo of your hotel room door and ask Meta to remember it for you.

Not all features in all regions.

I do wonder if being in the UK but on a US VPN means that the BME feature may not be unlocked in the same way. I don't think it's available for me yet unless I'm looking in the wrong place.

By Ollie on Monday, November 18, 2024 - 17:11

You can use both the Meta Ray-Bans and AirPods Pro 2; however, you will have to have the AirPods Pro in transparency mode, as all the Meta interaction remains on the glasses. It would be great if it could pipe directly into the AirPods, but it doesn't.

A quick tip: if you go into Settings > Bluetooth and long-press your Metas, you can change the type of output they are. So, instead of audio from your phone coming through your Ray-Bans with the associated lag, you can switch the Ray-Bans to "speaker," and VoiceOver will come from the phone instead. Then, as above, you can use AirPods when you are out and about, if you like.

It's an option at least.

By Gokul on Monday, November 18, 2024 - 17:11

No, the Metas aren't exactly bone conduction. They just release the audio close enough to your ear that you can hear it without having to use an in-ear earpiece.

By PaulMartz on Monday, November 18, 2024 - 17:11

Okay, thanks for the clarification regarding bone conduction. I had that wrong. But it sounds like the takeaway is that the two devices are ear-compatible: using AirPods doesn't physically prevent the use of the Ray-Bans.

The more important question is Bluetooth. How useful will the Ray-Bans be if the iPhone only connects to one Bluetooth headset, and it's the AirPods? It sounds like the Ray-Bans might have some use, but not full functionality.

If I put the AirPods in, which breaks the Ray-Ban Bluetooth connection, then take the AirPods out and stow them, how easy is it to restore the Ray-Ban Bluetooth connection?

By Gokul on Monday, November 18, 2024 - 17:11

The iPhone does connect to multiple Bluetooth devices at the same time, but I haven't personally tried connecting the Meta glasses along with any other device, so I cannot say how much of the functionality will be available...

By Julian on Monday, November 18, 2024 - 17:11

I know several iOS users who got access and are using it, but none of the Android users I know have gotten access yet. FYI, they are all running the latest versions of BME and Meta View, according to their Play Store, as well as the latest firmware available for their glasses.

By mr grieves on Monday, November 25, 2024 - 17:11

I just read this comment on Mastodon:

Post by DAVID WOODBRIDGE, ‪@Woody@dragonscave.space‬: "By accident found that I can still use my Airpods on my iphone for VoiceOver, Siri etc, plus keep using the Ray Ban Meta Glasses at the same time. Assuming MetaView is controlling audio to the glasses separate to OS Bt connection to the Airpods."

I had a try with my AirPods 3 yesterday and thought I was experiencing this too, but I got a little confused about where the sound was coming from. I did try a second time, and once the glasses took over I couldn't get it to revert back to the AirPods. I'll try to find some time to play about with it some more when my brain isn't feeling quite so frazzled.

By peter on Monday, November 25, 2024 - 17:11

Nice to have such AI features like Be My Eyes and others on the Meta Glasses.

Question: I know Be My Eyes is free to use, but does using the other AI features of the Meta glasses require a subscription?

--Pete

By Brian on Monday, November 25, 2024 - 17:11

Meta smart glasses do not require a subscription by themselves. You just pay for the glasses and have access to the applications that are part of the Meta View software that you download to your smart device (Android or iPhone). Having said that, some of the third-party software that has been integrated into Meta View does require a subscription; for example, you will need a subscription to Apple Music if you want to have Apple Music playing through your Meta smart glasses.

HTH

By Brian on Monday, November 25, 2024 - 17:11

Just wanted to clarify that, while you can use the smart glasses as a Bluetooth headset and listen to anything through them via your smartphone, apps that pair with the Meta View software typically require a subscription to work properly.

By PaulMartz on Monday, November 25, 2024 - 17:11

I'm planning to try these Thursday at LensCrafters, a large retail chain that sells them here in the US. I will at least pick out style and size.

I'd love to give them a real trial run before I buy. Do you think they would let me set up their floor model with my phone? What kind of things do you wish you'd tried before you bought? Thanks.

By peter on Monday, November 25, 2024 - 17:11

...and what features are available using the Meta View Software? From this thread it sounds like you can query AI for descriptions of what is around you, have it OCR text, etc. Are those services free?

Also, it sounds like the device pairs with your phone, if I understand this right. Thus, I assume you use the glasses as the camera for your phone and get speech feedback from your phone sent to the glasses? Is that right?

--Pete

By Brian on Monday, November 25, 2024 - 17:11

Hey Paul,

If you are going to LensCrafters, be sure to check out this thread.

By Brian on Monday, November 25, 2024 - 17:11

Edited for typos ... 😳

Hey Peter,

The Meta View application has a bunch of stuff built into it. Firstly, yes, the glasses pair with your smartphone via Bluetooth. This is required if you want them to work at all. You can use the camera built into the smart glasses to take pictures as well as short videos. There is storage built into the smart glasses, but you are able to transfer your pictures and videos to your smartphone. Also, if you are connected to either Facebook or Instagram, or both, you can upload pictures directly to those social media platforms. You can also live stream to your platforms with the smart glasses.
There are some "skills," for lack of a better descriptor, similar to the Alexa app on your smartphone, that allow you to pair services such as Amazon Music, Apple Music, Spotify, and Calm. Here is a very brief description of Calm: "Listen to guided meditations, music, inspiring stories and more."

I am unsure about Calm, but if you want Amazon Music, Apple Music, or Spotify to be paired with your device so that you can use it with a simple tap on the touch sensor (located on the right arm; more on that later), it helps to have a subscription to one of those services. Of course, as I mentioned above, you can play anything off of your smartphone and just use your smart glasses as a glorified Bluetooth headset. In this way, you can listen to Netflix, YouTube, AppleVis podcasts, etc.

Regarding Meta AI, this is included with your purchase; no subscription is required at this time. You do have to enable it, though, through the Meta View application. In fact, there are a lot of things you have to "turn on" before you can really start taking full advantage of the smart glasses.

Below is a list of some of the things you can do with the Meta smart glasses:
• Take pictures or record up to three minutes of video, and optionally upload them to Facebook or Instagram.
• Make phone calls through Meta AI, either via connectivity to your phone app, or using WhatsApp. Optionally with WhatsApp you can also do video calls, where people can see through your camera on your smart glasses.
• Text friends using Meta AI.
• Use Meta AI to read signage, documents, etc. With one of the more recent updates, Meta AI can even call a phone number from any document.
• Meta AI can tell you who is calling you, or texting you. It can also create, and search through, reminders.
• Meta AI can also tell you things like the weather, your location, random facts about things, like how tall is the Eiffel Tower, what is the largest tree in the world, etc., etc.
• You can also ask it to describe what you're looking at, meaning it will take a picture of wherever you happen to be facing at that moment and describe what's in the photo. In my experience, it's not as detailed as Be My Eyes, but it's not terrible either.
• Meta AI can now give you info about a particular song you are listening to at any given time. Kind of similar to Shazam.
• Use Meta AI to send pictures and videos to people with your voice.
• Meta AI can also be used to send its results to people with your voice. For example, if someone asks you, "Hey, how much does the Moon weigh?", you can send Meta AI's answer to that person as a text, using your voice.

I mentioned the touch sensor above. On the right arm is a multi-touch sensor for doing things like adjusting volume, controlling media playback, answering/rejecting calls, using Meta AI without saying "Hey Meta ...", and as a quick-play shortcut for services such as Apple Music.
I'm sure I have neglected to mention a few features, but this should at least give you an idea of the device's capabilities. Also, considering that it only requires Bluetooth for connectivity, and that after the initial upfront cost there is no other cost for use of the device or services, it is (in my opinion) the best smart glasses option we have at this point in time.

I hope you find the info useful. 🙂

By Gokul on Monday, November 25, 2024 - 17:11

I for the life of me cannot get this Shazam kind of thing to work. Any prompt for that?

By Brian on Monday, November 25, 2024 - 17:11

Hey Gokul,

Try something like, "Hey Meta, what am I listening to?" I struggle with this also, but I have a friend who also has Metas, and she has no problem getting it to work.
Go figure ...

By mr grieves on Monday, November 25, 2024 - 17:11

I thought the feature to tell you what was playing was limited to the music playing through the glasses, as opposed to being a general Shazam thing that can tell you about whatever music is playing in the background. And in that case it probably knows what is playing without having to figure it out by listening.

I might well be wrong, though - I think this is just what I presumed from reading the release notes for the last upgrade.

And, apologies for going on-topic for a moment :) but the accessibility option turned up for me yesterday. As mentioned before I don't think this coincides with an app update, but I did need to close and reopen the app for it to appear.

By Brian on Monday, November 25, 2024 - 17:11

You may be right about the music recognition. Perhaps it needs to be playing directly from the smart glasses, in order to work properly. I am really not sure. My friend claims she can get it to work, and I can never get it to work. Go figure.
What accessibility thing are you referring to by the way? Are you referring to Be My Eyes?

By mr grieves on Monday, November 25, 2024 - 17:11

When Be My Eyes is available, you configure it through the Meta View settings. A new category, "Accessibility," appears alongside the others (under Communication, I think).

I may be wrong about the music recognition thing. The release notes appear, I update, and then I can't figure out how to get back to them to check. Also, a lot of the new features were region-locked, if that makes a difference? Anyway, it didn't work like Shazam when I was just listening to music in the house the other day.

By Brian on Monday, November 25, 2024 - 17:11

Oh, that is interesting about Be My Eyes and the accessibility category. Maybe in time other apps or services will populate that as well? That would be cool.
As for the music recognition, yeah, I cannot get it to work either. I listen to Spotify in my house all the time, and when I'm not listening to Spotify, I am listening to iHeartRadio. My Meta glasses always say something like, "Sorry, I cannot recognize this," or, "To recognize music, you can use services like Shazam or Google Assistant."
I have decided that my Meta glasses are just stubborn. Lol

By mr grieves on Monday, November 25, 2024 - 17:11

OK, I had another go with this and it does seem to work. I connected my AirPods 3 to my phone and VoiceOver came out of them. I then put on my Meta Ray-Bans and could speak to the AI there. At the same time, VoiceOver continued to come out of the AirPods. I could also talk to Siri and she would come through the AirPods too. It worked surprisingly well.

I can't test with things like transparency mode or using them as hearing aids, but I'm more confident that they might work now than I was before. How well they work over time, I also don't know.

I'm hoping to get some AirPods Pro 2s at some point, but it won't be for a while. I'm sure someone else on here must have both and can be more helpful.

By PaulMartz on Monday, November 25, 2024 - 17:11

"Meta versus AirPods"--I like it. Could be the name of the latest Kaiju movie. I'm looking forward to visiting lenscrafters and trying out the Meta glasses in a couple of days. I'm installing the Metaview app, hoping I'll be able to pair their demo model.

By Brian on Monday, November 25, 2024 - 17:11

This video was from the demonstration that Andy Lane did for OpenAI to introduce features for ChatGPT 4o that we never got ...

By Tara on Monday, November 25, 2024 - 17:11

Hi Brian,
Jonathan Mosen did an interview on his Access On podcast with Mike Buckley, and he asked about the video feature and Be My Eyes. Mike Buckley said that if he had known OpenAI would take this long to release the video feature, he wouldn't have released that demo. But I actually thought OpenAI released it, which means Be My Eyes wouldn't have had any control over it, surely?