Just watch this YouTube Short: https://www.youtube.com/shorts/lNinxT_iotA
Seeing this, I opened Gemini Live on my Moto Edge 50 Neo and found the video stream button. It opened up a camera, and I was able to quiz it with my voice about the things it could see through the camera.
Bunch of points:
A. If you just pan through your room quickly, it probably won't pick up day-to-day objects. For example, I swept across the room while sitting in my chair and asked if it saw an earbuds case. At first, it just asked me to show it areas of the room. After multiple back-and-forths, it picked up the case only when I was pointing the phone directly at it, up close. In another example, without moving from my chair, I asked it to find a light switch that was a bit far away and not directly facing me; it was not able to locate it even when I pointed right at it. It found a tablet only when I was pointing directly at it. Not that useful for finding objects unless you are willing to go inch by inch over the room.
B. I showed it a photo, up close, holding it steady for probably a second or two, and it remembered the details. I removed the photo and was able to continue quizzing Gemini about it afterward.
C. It does describe people's outfits and their expressions: smiling, in my case.
D. It is not able to monitor something. For example, when finding objects, I had to repeatedly prompt it to tell me if it saw such-and-such object. It will not initiate speech once an object comes into view. So, not exactly like the Be My AI demo from a year ago.
I've heard that this is also coming to the Gemini app on iPhone. I've yet to download Gemini on my iPhone, but this gives me a solid reason to do so and to see if object finding improves. My Moto Edge 50 is admittedly a mid-range device, costing 1/7th, possibly 1/5th given the exchange rates and price reductions, compared to my 15 Pro Max.
By SeasonKing, 22 May, 2025
Comments
Bunch of secondary thoughts
This is already an improvement over Be My AI, as I am able to go back and forth very quickly, without that processing music.
I can adjust framing quickly, and also interrupt Gemini's ongoing speech to have it give me a new description.
I haven't found any way to quickly launch this live video streaming to Gemini, so if anyone finds one, please let me know.
@Forum Mods
I see that someone else has already started some conversation in another forum category. Apologies if this feels like duplication. I didn't check before posting.
I am okay if this gets deleted; the other guy probably posted before me.
No Reason to Remove It :)
No reason at all to remove this; the post is right on topic. :)
Just checked mine
I just checked mine and I have the option to share my screen or stream video. This is cool. I just had it describe my room and it did OK. It didn't give me as much info right away as Be My Eyes did, but I kept asking questions and was able to get what I wanted.
Sounds a lot like Google AI studio
So, what are the differences, if any, between this and Google AI Studio? 😎
No differences as such.
They were testing Gemini Live (sort of a beta) in Google AI Studio and are now rolling it out full-scale through the app. That's all. The only difference, as far as I can tell, is that you can access it through the app rather than going to a website, and that it will likely not stop after a few minutes, unlike in the studio. Also, they say it now has memory; I haven't tested that personally, so I'm not sure.
Slightly off-topic, but do any ChatGPT subscribers here have screen sharing available in their iOS app?
I just tried the iOS app
The Gemini iOS app doesn't have the video stream feature in Live mode yet. Given the iPhone's superior camera, I have high hopes for it when it arrives.
Fair enough
Thanks for the info, Gokul.
I have the Gemini app on my iPhone, but I find myself using ChatGPT, Claude, and tinkering around with Grok more than anything else these days.
Still not seeing it on my end
I’m still not seeing it on my side. Maybe it takes a while to roll out.
A Pixel
Now I believe my next phone will be a Pixel.
Pixel
I've been playing around with this for the last two months and it is quite responsive on my Pixel. I can get it to update me without prompting it when I'm out on a walk. With Be My Eyes, I was always having to stop because I needed to tap the shutter button, but with Gemini, I can just talk to it as I'm walking and it feels natural. No one is going to know that I'm talking to a bot unless they are within a couple of feet of me. I ditched my 15 Pro Max for this and for the new TalkBack feature, which gives you a full AI description of whatever you are focused on, lets you ask follow-up questions, and can also describe your whole screen. Apple is falling way behind, and there is no mention of these features in the upcoming accessibility updates.
Re. Saqib
Do you mean what I think you mean when you say, "I can get it to update me without prompting it when I'm out on a walk"? Can it seriously do that on a Pixel device, like the stuff that OpenAI demoed last year? Seriously?
@Saqib, you opened a can of worms
Same thing that Gokul asked above.
Please clarify: when you say that you can get it to update without prompting, does it speak up on its own when an object comes into view?
If you tell it to monitor the path for obstacles, puddles, or potholes, and then walk without any further instructions, does it continuously monitor and provide feedback?
In that case, I would kindly request a demo video, if possible.
Also calling all fellow Pixel owners to verify if this functionality is widely available.
Proud Google Pixel 9 Pro owner
Here is a video that Google released a few days ago:
Project Astra
How Visual Interpreter Helps People who are Blind and Low-Vision
https://youtu.be/PibfzdEaw_c?si=XthydGrIm80UOyfl
Project Astra
It seems like every day I read about something new that Google is doing for the visually impaired people of the world.
Gemini Alerts
Hi, Gemini will tell you what's in front of you without you having to prompt it, if you ask it to keep informing you about what's in front of you. I can't make a demo video on my Pixel because I can't get it to record the audio part for some reason; I get the visuals but not the audio from the video. I don't think it can scan for specific obstacles while you are walking, but it can tell you what the camera sees overall.
App
Are we talking about the Google AI Studio webpage here, or the Gemini app itself? I have to say the Google AI Studio webpage has possibly the fastest response time I've seen from any AI app; it is instantaneous. The Gemini app, which I've not subscribed to, has the usual delays before answering. My only complaint regarding the webpage is that every time you start a new chat you have to enable the camera beforehand, even if the settings have been set to always allow. I don't get why that would be.
re. Gemini Alerts
If that is indeed the case, I think we have reached a major milestone. Eagerly waiting for the update to roll out.
At Lee
You can do it from the Gemini app itself.
Gemini Alerts clarification
I own a Pixel 9 Pro with the Gemini Live streaming feature. For now, it only works by prompting about the scene that is on camera.
I could be completely wrong, but regarding the comment that Saqib made above, "Gemini will tell you what's in front of you without you having to prompt it if you ask it to keep informing you": on my device it doesn't work this way. To get an answer or a description of the scene, I always have to use a prompt.
The only time you don't need to say anything for it to describe the scene is at the beginning, when you open the Gemini app and turn on the live streaming option; at that point it automatically describes the scene. After that you have to keep prompting.
Re: Gemini Alerts Clarification
Although I don't think the Live feature with the ability to see through the camera has been rolled out to iOS devices yet, the behavior you describe is how Google's AI Studio web site works. You can ask questions about what Gemini sees, but it doesn't look continuously through the camera. So, for example, you can't ask it to tell you when something passes into or out of view. You have to keep asking.
Maybe some day they will include a mode that automatically looks through the camera every second or so to catch things that come in and out of view, but that could take a lot of processing power. Meanwhile, the ability to ask about what the AI currently sees and get an immediate response is pretty darn good and useful. Although I can use Google AI Studio on my iOS device (it runs in a browser), I am looking forward to the new Live features coming to the standard Gemini app for iOS devices. Maybe after Google is done rolling it out to Android devices.
--Pete
@Saqib
Are you subscribed to any of Google's premium AI plans?
It seems the rest of us don't have the feature you described, and I am on the free version.
Regarding Google AI Studio, I am aware it is available via a webpage and have tried it. It's just that before getting the help I need, I don't want to jump through a thousand hoops to get it working.
As much as possible, I would love to have assistance just one click or long-press away. Be My AI does that currently, but has long processing times. Also, the interaction is very much back and forth, requiring an entirely new photo if the current one didn't capture what you were looking for. With something like Gemini Live, I can simply move the phone, even while the voice is telling me what it's seeing, interrupt, and ask it to check whether it sees what I need from the new angle.
Google AI Pro subscriber
I am a Google AI Pro subscriber and I do not have the features that Saqib mentioned above.
My understanding is that all those features are coming in the later Android 16 OS, together with Gemini and Project Astra, where Gemini will be able to map spaces.
Ah, the updates
Good thing that my Motorola is slated to get five years of software updates, at least until Android 19. Hopefully they stick to their promise.
it's rolling out
I am based in India, and in spite of that, I have both live video and screen sharing in my iOS app as of today.