Hi everyone, I truly didn't know where to post this topic, but...
Well, the question is, has anyone of you tried the wonders of ChatGPT-4o in your day-to-day?
I want to know if this AI has made a difference for you now that it has video capabilities, which could be of so much help to us these days.
I would like to know this in order to decide whether it's worth paying 400 Mexican pesos each month for these enhancements.
Thanks in advance.
Comments
It's not out yet, no one has access and current reports suggest it has been delayed further.
If you want to keep up to date with what's going on with it, check out the ChatGPT subreddit.
Answering your question regarding paying for ChatGPT: it doesn't give you anything more aside from more requests per hour and the ability to create and publish your own GPTs.
I'm sure someone will cover it on here as soon as it makes its way to either ChatGPT or Be My Eyes.
Which one was Andy Lane's ducks?
So much LLM/AI has hit the fan that I can't remember what he was using in that video, though I'm pretty sure it was some kind of beta test that the rest of us can't have. The ducks were the only one that made me feel I truly wanted that, in the sense of peeling off wads of money each month. Otherwise, I'm OK with using that chatbot comparison site for my extremely limited interactions.
That was GPT-4o
As has already been said, it isn't out yet. There is some sort of delay - maybe they are still getting their ducks in a row?
If it ever happens...
That would be worth the pesos. At least for watching animals. LOL
Ducks in a row, need a pato on the back for that one...
clarifying
GPT-4o is 100% out and available. What's not out yet are some of the multimodal capabilities, specifically the voice chatting / video call functionality which was demoed last month. It also can't yet generate images.
You can experience the model for free, with heavy usage caps. I think they allow you to send something like 10-15 messages every three hours. You just have to create a free account to use it.
Multilingual punishment aside
I'm not sure how many requests per time unit you will get when you can get it. That's the one thing that makes me hesitate in longing for this type of thing. You might end up using all your AI time up trying to get it to give you the descriptions you want. In the back of my mind, I wonder about the internet traffic and providers, but I'm not even sure what to ask about that.
I believe there is going to be a deal with Be My Eyes. As a specific access use case, I believe ChatGPT will be giving Be My Eyes a lot of tokens for free, which, I'm hoping, will avoid limitations, especially in cases where we're midway across a new town with it looking out for our drug dealer... I mean, bus stop.
Reading the news on this, I don't think it will be out for quite a while, though. It was just a publicity stunt to trump Google, or so people are saying. They said it would be out in the coming weeks and, as others have said, that's intentionally misleading. Andy's demo was tantalising indeed, but I wonder if scaling for such intensive processing is going to be the bottleneck.
I doubt it will be out for a couple months.
All these companies wanting to one up another, it would be amusing if it wasn't so sad.