I'm not good at writing reviews, so I'm not going to even try. I just got these delivered an hour ago and am about to dive in. I wanted to start a new thread where I could comment, and hopefully others will too. Questions are welcome.
They cost £329 in the UK. They arrived two days early.
I'm living alone right now, so I will be attempting the unboxing and setup with just my iPhone and Be My Eyes/AI.
Comments
Meta
That's what kept me from ordering these literally the first day I read about them. Anything connected to Meta makes me uncomfortable. But my curiosity won out in the end.
At CrazyEyez
Sorry but that doesn't fly with me.
There is a difference between getting tracked by cookies, trackers, iframes and so on, and directly purchasing a product from a company known for selling your data off to data brokers and building ad profiles on you. One is something you encounter and have to battle to get rid of; the other is a conscious decision. I'd rather wait until Apple comes around with their iteration of a smart glasses setup.
I don't know about the Seleste glasses. I have seen too many startups with great ideas fail over the last couple of years. What tells us that the company behind Seleste won't just collapse in the next couple of weeks and these glasses turn into electronic paperweights?
Datawolf
Apple tends to release tech later than everyone else, but when they do, it's much better. They take their time to get it right, most of the time. Same here. I am sure that their glasses will be much better. The AI info will probably be processed on the phone, since they have bought several companies that process info on-device and avoid sending it out to the cloud for review. Picture it the way Sofia tended to say it: you take a nice picture of someone and your phone gives you a good description, and also uses an external AI for other things.
Apple variation...
Would also probably be way over-priced. Like the Envision glasses. Cool concept, but they will never be in my price range.
Coming soon!!
So excited to finally try these out! I get them tomorrow!
Demo of AI features
The Double Tap podcast yesterday had a really nice demo of the AI features plus the video calling. It's in the second half of the show if you are interested. https://doubletaponair.com/apple-id-issues-ipads-for-blind-people-a-meta-ray-ban-review/
@Stephen
Enjoy them!! Mine will be here sometime between May 6th and 8th. It was originally 10th through 12th, and I'm hoping it moves up even more. I don't wait well for tech, Lol!
AI is now in the UK!
Apparently, the AI feature is now rolling out to the UK. It might be a slow roll-out, so I'm not sure if it is available to everyone yet. I cheated with a VPN so can't check myself, but this is such great news.
AI
I had a bit more of a play with this and firstly, I agree with everyone being so excited about it.
It was able to describe my wife's clothes, including the colours and pattern of her dress. It was able to read the ingredients of a recipe and then also the directions to follow. It helped me identify a bottle and what was in it. It told me the best before date on the milk carton.
I think it doesn't like to say too much. I asked it about the shelves of condiments, which hold herbs and spices, and it only wanted to tell me about some of them.
I asked it to tell me where the Thai green curry powder was (my wife ensured its label was visible so a bit of a cheat). It told me top shelf, second on the left. But it was actually the shelf down and in the middle. I asked it for the ingredients and it told me the kind of thing that tends to go in Thai green curry powder, not what was in this one.
It managed to find my wife's secret stash of cookies that I had no idea existed.
I think for text if you have something fairly simple it should be fine. Maybe the magazine had too many distracting visual elements on it. I'll have another try at sign posts when I get a chance.
But the fact I have this sort of thing sat on my face giving me all this information is incredible. If you had told me I would have this a couple of years ago I would have thought you had got carried away reading science fiction.
But as always, just be really careful with the information it gives you.
mr grieves
Cookies? Now that you know, do not eat them. If you do and tell her the glasses told you, she might hide them.
Mind blown.
OK, so as some of you folks may know, I have a couple of pairs of smart glasses. I had the Seleste, and I also now have the Ray-Ban Metas. The Metas are everything I've ever been looking for in a pair of smart glasses. Response time is super fast. It gives you short descriptions, but if you want more detail, just ask. I prefer short descriptions because sometimes I don't need every single detail. Text reads great; I even tried holding my finger over a button on a microwave and asked it to tell me what button my finger was on. It worked phenomenally. I ended up buying a second pair as a spare. It also works fine when it comes to reading my thermostat from across the room. I also love that you can do multiple things with it, not just descriptions: making and receiving phone calls, listening to music, taking pictures and live streaming, all of which are phenomenal. Not gonna lie though, it makes me worried for Envision and Seleste. Meta and Ray-Ban are behemoths in their respective fields.
Re: Cookies
Cookies? What cookies? I don't even know how those crumbs got there.
Second pair?
Why are you getting another pair already? I'm over the moon excited about where these are going, but one pair will do for me. Are you going to try the Skyla frames? I'd be really interested to know what they're like.
Second pair
The reason why I'm getting a second pair is so that I can swap them out when the glasses' battery gets low.
Hey Stephen. Would you mind…
Hey Stephen.
Would you mind explaining how you got yours to read text?
Mine will not go past the summary.
The product recognition isn't the most accurate either.
I had some deodorant, and it got the brand wrong twice. Both times it gave 2 different responses.
The colour recognition seems okay.
The speed is great.
I'm still excited to try the Seleste glasses to compare the two devices.
Good times are ahead.
Rayban Meta pro tip.
I have been using these glasses for about 5 weeks now. All I can say is: awesome! No, they're not perfect. But which product is? It's been said before, and I would agree: this is the worst they will ever be.
I took a trip to Eastern Europe a couple of weeks ago. I was hoping for easy sign translation, etc. Nope! And the Look and Tell features were not available. Well, actually, two days before returning to the States, the glasses appeared to get an update that enabled Look and Tell along with other AI functions. Nice!
But OK, in case you may not know, there are a couple of very useful apps for the phone that really take AI interaction to the next level. And in tandem with the Meta glasses, they shine!
I primarily use Pi: Personal Assistant, a completely free and very capable AI platform that allows two-way voice conversation, and in general it is excellent. The other is the well-known ChatGPT app. It too has a very useful free mode that provides two-way voice conversation.
So the pro tip is: in the Shortcuts app, create shortcuts to either or both apps' conversation modes. Then simply invoke the respective AI chatbot with a "Hey Siri" command.
You may need to unlock your device or use Face ID, but from there on you can have a full conversation and get the information you may be looking for. And all this from the convenience of your glasses. How cool is that?
PI works well at finding real-time location information.
Let me know what you think! I'm thinking of doing a YouTube tutorial. Give it a try and report back!
Do those apps use the…
Do those apps use the cameras in the glasses, or are these apps just something to talk to and ask questions?
Reading text
I think this needs clearing up as there has been some slight deviation from the truth here.
The Meta glasses won't read out documents or labels verbatim; they will always summarise, no matter how you phrase the request. What they will do is let you interrogate the image, e.g. what is this brand, what are the instructions, etc.
I say this as I don't want others, like me, going a bit nuts trying to work out the magic words, a spell if you will, to make them read a page of text. They won't. They're not designed to do so.
As others have pointed out, these are not designed for blind users. If you are in a wood and ask what meta sees, it won't tell you you're in a wood as the designers are assuming that we already know where we are as we can see it.
Meta AI through the Ray-Bans is good on specifics, small nuggets of information, but, as yet, it does not read out documents in the way we need.
I also encountered another issue: trying to read my iPhone screen, it said there were notifications but would not read them out due to privacy concerns, which makes sense on a wider scale, but makes the glasses pretty frustrating for our usage.
I can only hope that Meta, in time, allows for a 'blind mode' with a slightly different system prompt to the one used by those wearing them for extreme sports, or who simply want to caption photographs or identify a plant.
HTH
other languages?
Hello,
Does Meta AI also speak other languages? If you take a picture of a text in another language, does it read it correctly?
Regards
@Ollie
Well, yes I sort of agree. But what is the use case for a sighted person to ask the glasses to describe what's in front of them? I get this feature being useful for "what kind of plant is this?" or other very specific questions, but a general "where the hell am I?" is probably not a question a sighted person is even going to ask.
I have noticed that if I say "look and tell me what you see in a lot of detail" that you get a bit more information. Maybe we still disagree on the definition of "a lot of" but when I tried that it did describe the trees and give a bit of a feeling of the ambience.
You might be right that getting it to read an entire document isn't going to happen right now, and we aren't going to be able to sit with a book and have it read it all to us. But for bits and pieces I can see them being really useful.
With any AI gadget there are always big caveats - there is quite a large BS factor to consider for starters. I think right now both developers and us are just testing the waters to see what can be achieved. I hope Meta picks up on how much we all love these glasses and what potential they have in our world.
Getting mine on Tuesday.
Hey all,
I did end up getting the shipping email from Amazon US, saying that my Meta Wayfarer smart glasses will arrive Tuesday.
I'm excited to put them to the test around my room, and maybe around my house as well.
I figured they would still be worth getting.
A few thoughts
Hey everyone. You probably don't know me, but I am a totally blind journalist and accessibility advocate. I have been using these glasses since January; I had access to the preview program for AI and have been testing features in early access and now with the public version.
I have to say that I am impressed with the updates I've received during these months. The fact that I can activate AI functionality in Norway using DNS when setting up the glasses, the fact that I can stream calls directly through WhatsApp and Facebook Messenger complete with video, and the fact that they included Apple Music functionality, not only Spotify, make me really happy for now.
Of course, I’m hoping that the rumored glasses from Apple are in development and we will get something from them soon, but until then, these should do the trick for me when it comes to Identifying objects, describing areas and reading text.
Always here if you have any kind of questions.
Chatgpt on meta glasses via whatsapp
I found this on the web. No idea if it works:
https://jovanovski.medium.com/part-2-getting-chatgpt-working-on-meta-smart-glasses-82e74c9a6e1e
Re ChatGPT
Oh wow, I had no idea this sort of thing was even possible. I'm guessing this could open up the glasses to do all sorts of things if you had the time and inclination. I've not gone into detail, but I presume the webhook needs to be reachable over the internet, not just your Wi-Fi.
I suspect I am too lazy to go all the way through with this, but I am very tempted to have a go. With a Raspberry Pi exposed to the internet you could probably unlock your whole smart home if you so desired (not looked into the security of this yet, though). I do have a Pi but have never been bothered to try to make it available on the internet.
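For anyone wondering what the Pi side of a webhook like that might look like, here's a rough sketch in Python using only the standard library. To be clear, this is just my own illustration, not anything from the linked article: the "unlock" command, the shared secret and the unlock_door() helper are all made-up, and you'd want HTTPS and proper authentication before pointing anything like this at a real lock.

```python
# Bare-bones webhook receiver sketch, standard library only.
# Everything here (the "unlock" command, unlock_door(), the secret)
# is a hypothetical example, not a real smart-home integration.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

SHARED_SECRET = "change-me"  # placeholder; never hard-code a real secret


def unlock_door():
    # Stand-in for a real smart-home action.
    print("Door unlocked")


class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read and parse the JSON body, tolerating bad input.
        length = int(self.headers.get("Content-Length", 0))
        try:
            payload = json.loads(self.rfile.read(length) or b"{}")
        except json.JSONDecodeError:
            payload = {}

        if payload.get("secret") != SHARED_SECRET:
            self._reply(401, {"error": "unauthorized"})
        elif payload.get("command") == "unlock":
            unlock_door()
            self._reply(200, {"status": "done"})
        else:
            self._reply(200, {"status": "ignored"})

    def _reply(self, code, body):
        data = json.dumps(body).encode()
        self.send_response(code)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

    def log_message(self, *args):
        pass  # keep the console quiet


# To actually serve it on the Pi you would run something like:
#   HTTPServer(("0.0.0.0", 8080), WebhookHandler).serve_forever()
```

You'd still need port forwarding or a tunnel to reach it from the internet, which is exactly where the security questions start.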
Might have a go this weekend…
Might have a go this weekend. It's something I speculated about but didn't have the direct knowhow, so thanks for the link.
The biggest downside to this, I think, will be speed. It's a bit awkward.
re: Might have a go this weekend…
Hello,
Let me know if it works
Location Services and other observations
I had a good chance to put the AI through its paces today.
At one point I was sitting outside a cafe. I asked Meta to tell me the menu of the cafe I was at. It told me that I could enable location services in the Meta View app settings to get information like this. So I asked where I was, and it said that according to location services in the Meta View app, I was in Towcester. Now, I wasn't really in the town, but it was the nearest one. I tried this again later and again it just told me the nearest town.
I then asked what direction I was facing and it said North. So I kept my body still and moved my head to the left and asked again and it said West. So it looks like the glasses have head tracking in them which I wasn't aware of before. The second attempt took two goes before it worked - in the middle it said I had to enable location services in the app even though it understood my question exactly. I asked the same thing again and it told me.
I can't find any specific location options in the app but obviously it knows where I am. A bit later my wife disappeared into an antiques shop and left me outside with the dog. I asked what shop I was outside and it said Tesco Express. As far as I am aware there wasn't one of those for miles.
The where am I/what direction questions seem new. I tried them last weekend and they didn't work, although I was on VPN and in the middle of a forest. But it said I couldn't do that on these glasses. If they can tweak it to give a bit more accuracy that could be useful.
When my wife brought over the printed menu, I asked the glasses to read it. It said that it was a menu and had breakfast, lunch and dinner options. I asked it to tell me what was on the lunch menu. It gave me some funny answer about "the best lunch options in Towcester include the Ship Inn" etc.
A few other things I tried - there was a bench and it told me it was ornate and had some text on it. I asked it to read the text. It said it was Latin then told me it roughly translated to something or other. I had heard it could translate some things, but was surprised it managed Latin.
I also tried asking about a few buildings I was looking at. It told me things like "this is a gothic temple from the 18th century" and I could then find out what year it was built and by whom. (Assuming it was telling the truth.) Or it said "this is stowe house" and could then go on and describe how many stories it was, the Corinthian columns outside and so on.
So I maybe take back my assertion that a sighted person would be asking what they are looking at, because I guess this sort of makes sense.
I think the detail it gives in general is a bit substandard compared to Be My AI, or even a sighted person. But it is quite smart when it gives me some extra information that you wouldn't know just from sight. And as always with AI, follow-up questions are amazing.
Another thing I tried was asking what the score was in the Arsenal v Bournemouth match. It told me 1-0 to Arsenal, and who had scored from the penalty spot. I asked how long the game had been going on and it told me "it hasn't started - it kicks off at 3pm". About an hour later I asked for the score and it told me it was 0-0.
The other thing I hadn't realised was that it keeps track of the conversations in the AI tab, so I can refer back with VoiceOver. And in here are the images I was asking about too. I presume these are stored in the cloud and not on the glasses as they are not in gallery.
Anyway, AI definitely enhanced the day for me even if it is still a bit hit and miss at times.
The other thing that was quite good was that I could take the occasional photo to keep the wife happy. But a few times it told me that my hand was over the camera. I'm pretty sure this wasn't the case. One time I asked the M-guy to take the photo and he told me that too, but my arms were by my sides. Unless it was some shadow from the hat I was wearing that confused it.
Re: Location Services and other observations
Wow!
I cannot wait to get mine now!
OMG, this is going to be so exciting to test around my house, etc.
According to Amazon US, mine left Georgia on Friday, so we will see when they get here, Amazon says Tuesday.
In uk and still not got Meta AI to show
I got my glasses in the UK 24 hours ago, but unfortunately there's still no sign of Meta AI.
I've gone through and pretty much allowed everything that it wants access to. As the Meta AI functionality is used in the marketing material, I would have hoped that new users would get it from the start so as to avoid any confusion. Of course, if anybody has any ideas on what I might be doing wrong, I'm happy to hear them.
Re: UK
I think if it's not working for you, I would make sure you are on the latest firmware. Then quit and reopen the app. Then ask the glasses something like "look and tell me what you see". If it says something like you need to enable AI, quit the app again and reopen. Hopefully then it should prompt you to enable AI.
This is what happened to me after I used proton VPN to pretend I was in the US last weekend. Once enabled, the app will have a Meta AI tab and something similar appears in Settings.
If it’s not available still, then it’s possible that it is only rolling out slowly. It only came on in the UK mid last week, possibly Wednesday. At the time I heard it might be rolling out slowly as some of these things tend to do.
If you can't wait, then you can get a free account with Proton VPN, pretend you are in the US, and once you are set up, just turn off the VPN and you should be good. But I don't think you'll need to do that for much longer.
VPN got me AI in less than a minute
As mentioned, I'm in the UK and Meta AI wasn't showing up. As recommended, I connected to a VPN to show I was in the United States, and as soon as I went back into the Meta View app, Meta AI appeared.
I then disconnected straight away from the VPN and it is still working. It's a shame the information given by Meta AI isn't up to date, but the image descriptions are super fast. Initially very short, but it's so quick to answer all questions that it doesn't really matter. In fact, you probably get the exact information you want much more quickly than if it described everything.
Products and previous images
I was at a market today and asked what I was looking at. I was told a dog treats stall. I tried asking what products it was selling. I was told something like "I can't help you with product availability, but I will be able to soon". This wasn't recorded in my AI log and I can't seem to reproduce it. So I'm not sure what that was exactly.
One other thing I didn't realise it could do. After this I asked it about another market stall. A little later I said "tell me more about the dog treats stall" and it told me about a couple of products it was selling (which my wife verified). I had a walk and asked a few more things. This was a few hours ago. And I asked it just now to tell me more about the dog treats stall and it still can.
So I thought I would ask it again about the bench I saw yesterday. It told me it cannot access previous conversations. I tried again just now to get it to describe the bench to me and I think it basically just described any old bench. It had a different inscription and I asked where it was and it said it was in the middle of a dog park. I asked where in the world it was. Apparently San Francisco. I don't have the VPN turned on any more I should say, and the man in the glasses seems to know where in the world I really am.
I think you could quite easily lose your grasp on reality with this sort of thing if you aren't careful! The fact it was sunny did make me think I couldn't possibly still be in the UK.
Some random thoughts: I…
Some random thoughts:
I tried the "what direction am I facing" question someone else tried and got various answers, so no head tracking on mine. If there is such a thing, it's unreliable.
It is very hit and miss with this AI.
I'm getting slightly annoyed with all the summarization.
Yesterday I was in a coffee shop. I had to ask it 5 questions just so I could find 1 menu item and its price.
It got the sausage egg and cheese breakfast sandwich right but got the price of a cup of coffee wrong.
Product recognition is 60-40 at best.
It can't recognize any of my cologne bottles, and it took 5 tries to guess the brand of deodorant I prefer.
It recognized I was holding a bottle of water but not the brand.
Finding directions on packaging is frustrating.
So far colour recognition has been spot on for me.
It's fun to play with in the store.
I wore the glasses to Costco and had a ball.
It correctly identified 3 cars that belong to family members and their colours.
The best features are the video calling and the ability to snap a quick picture or shoot a video with the press of a button.
The speakers are quite good for such a tiny product.
It's not all bad, but it definitely needs work.
If anyone has any tips and tricks to aid in proper product recognition, I'd love to hear them.
I'm glad I have a 45-day return policy, so I can play with them a little more before I decide if I'm keeping them or not.
Still waiting for my Seleste glasses. Hopefully they come soon so I can do a proper comparison.
Re: unreliability
I may have got lucky with the directions. It's hard to know. I was assuming head tracking because it seemed to know my head had turned, but who knows really.
I think there is a debate to be had about the importance of being able to trust tech. Right now I think we are all giving AI a free pass because of the potential it has and how mind-blowing it is when it gets something right. But I think once the novelty wears off we will stop accepting all the false positives and start having the same expectations we would have for any other kind of tech.
I think right now it is too early to buy a product only based on the AI in it. My belief is that these glasses are well worth the money without it, and I just see the AI as a fun thing to play about with. I'm lucky that my wife can see so I have an easy way to check the results, and we can have a laugh if it is wrong. But no way would I depend on it.
As has been said elsewhere, this is the worst AI will ever be. It is going to improve at a rapid rate, I'm sure. Whether they can get past the hallucinations I don't know.
I definitely wouldn't use the AI in these glasses in an app on the phone - the other options seem better to me. But you can't deny that the form factor is so spot on and the convenience is incredible.
It's incredible to be able to…
It's incredible to be able to hold something and ask what it is.
It's just somewhat frustrating when it gives you wrong answers.
It frustrates me further because it recognized the breeds of my little dogs, the fact that one was drinking water or sitting on the floor, and their colours.
It recognized cabbage leaves and green onion.
I love the convenience of having the glasses.
I'm excited to see what the future holds.
More amazement, lol. It…
More amazement, lol.
It correctly answered cabbage rolls and mashed potatoes when I asked it what was in front of me.
We don't need it since Canadian currency is brailled, but it successfully recognized bills ranging from $10 to $100.
Glasses
I think we all need to remember that these aren't designed specifically for blind people. They aren't meant to read menus to us, as it's unlikely sighted people would be requesting this. The AI is cool for what it is. But if I need something specific read word for word, I'm going to try one of the apps on my phone or the Seleste glasses. That's more what they're meant for. My Meta glasses did correctly identify every seasoning, jar, and can I held in front of them. Lol, it even used a Mexican accent when identifying my Midz Alfredo sauce. It's also been completely accurate with colors and scene descriptions.
@ Brooke
Ah, the quest for the ultimate tech gadget—it’s like expecting a Michelin-star meal but ending up with fast food! Your Meta glasses seem to have quite the flair, especially with that Mexican accent when identifying your Alfredo sauce. It’s like they’re ready for their own segment on a food network!
But let’s talk turkey about something crucial: inclusivity.
It’s wonderful that your glasses can identify every spice in your cabinet and narrate your surroundings like they’re in an Oscar-nominated film. However, the real magic happens when technology serves everyone at the table. Right now, it feels like tech companies aim for the moon but only clear the tree line. They dazzle with features for some, but what about making sure everyone can benefit?
We need devices that go beyond just impressing at tech expos; they need to be as versatile and accessible as a Swiss Army knife—useful for everyone, whether they have perfect vision or need a little extra help seeing the menu.
Imagine a world where every gadget is like the ultimate potluck: dishes for every taste, preference, and need. Everyone, from those who can’t read tiny text to those who just want to spice their chicken correctly, finds something useful.
Let’s champion technology that doesn’t just boast smart features but also embraces a warm, inclusive spirit. These gadgets should be as commonplace and essential as salt on a dinner table, ensuring everyone can enjoy their meal.
Here’s to a future where technology is as inclusive as a family reunion buffet—everyone finds something they love. Let’s make it the standard, not the exception!
Brooke
Meta would do better to allow third-party apps that can take advantage of the glasses. Until then, they will be just a nice toy. Only if they let in third-party apps will the blind be able to take full advantage of the glasses' possibilities. Google Maps, Be My Eyes and apps that are set up for the blind would make the glasses worth the price. Until then, it's a nice toy for those who have the money to get them, unlike those of us who do not have money for toys. I hope Apple will release similar glasses that would be worth the price.
Inclusivity
@Stephen, you make an important point there. It's sad that no one bothered to think about inclusivity or the potential of such glasses for the disabled community when they were conceived; things would have been so much more holistic if the glasses had been built with accessibility and inclusivity in mind from scratch. Having said that, the platform becoming inclusive is still a very real possibility so long as we as a community can make our voices heard. I still remember the state of Android accessibility as recently as Android 8/9, and how far it has evolved since then. So it's all about letting our voices and concerns reach places where they could help make a difference.
Also, why is the comparison always between Meta and Seleste glasses? Why is Envision Glasses never in the picture? Is it the cost? Or is it something else? I mean, the range of services they offer is almost the same. I'm genuinely curious.
@ Gokul
I think the reason, for me anyway, comes down to the unflattering appearance of the device, which alone is not worth the time or effort, and yes, the almost CAD $4,000 price tag is another factor for basically the same features as the Seleste and Meta glasses. In all reality, the Envision AI is about the same as the other two. Keep in mind the Seleste glasses and Envision glasses use the exact same engine under the hood. Meta, however, is building their own, which is why their AI model needs so much training before it can compare with OpenAI's ChatGPT and others like it.
I was using the Ray-Bans…
I was using the Ray-Bans last night to try and identify a blister pack of tablets, looking for some sleeping tablets, and, boy, did it get it wrong. I had the right pills, but it first said they were antibiotics, then antidepressants... Finally, I took my phone out and used Seeing AI to actually read the label.
I think this is the difference, AI isn't great for everything. OCR apps that run locally, for now, give far better and more reliable results. For me, meta AI is still a toy. It's fun but pretty useless. I want to read stuff, not have scenes described to me, but that is my use case. I don't know if I'll ever be out and about and want to admire a view described by a synthetic voice when I can experience the environment on my own terms.
Meta AI, for us, needs more functionality and less fluff.
Ollie
I would not trust any AI regarding medications. That would be like playing Russian roulette. The same goes for any scanner app.
I don't know...
I have been using OCR apps to identify medication for some time now; I have also used Be My AI (having said that, I don't have to do it constantly; it's once-in-a-while stuff). I guess it's about trusting one's own knowledge of where certain medicines are, the shape/size of the tablets, etc., and then confirming that with one or multiple apps. And yes, it's a very subjective thing; it's also about how much one usually relies on one's own awareness of the things around them.
As for the toy argument, as I have already mentioned elsewhere, I work at a job where awareness of visual info is a huge plus if not an imperative, and I find that the use of AI-based apps has already upped my game, and the use of wearable tech is taking it to the next level.
Lottie
You just play one? Take care.
Let's just keep things balanced
So I think maybe I was wrong about the head tracking and I've just been a little gullible.
So today I tried again. I stood up and looked forward. What direction am I facing? North. OK, no idea if that's true. Moved my head to the left. What direction am I facing? Firstly, I need to enable location services, apparently (no, I don't). So I repeated the question. Oh, I'm looking east, it seems. Now, I'm not the outdoorsy sort and I'm not going to go out hiking in the mountains using the moon as a guide, but I'm reasonably sure that east is not to the left of north. So I turned my body to face to the right of north. Now I am looking south. Again, I'm pretty sure that's not where south should be in relation to north.
So I positioned myself back where it told me that north was and asked what direction. Now it's west. I ask again without moving. Now it's back to north.
So I think the other day it just guessed and happened to get it right. And now I just feel stupid for having taken its word for it.
Again so much of this is smoke and mirrors, but it's so convincing that it is easy to forget.
@Lottie - I agree with you to some extent. We should absolutely be hyped up about this sort of thing. But I also don't think we should just ignore its limitations, because with tech like this it is absolutely essential that you understand what you are getting into.
It would be a very bad idea to trust everything the glasses have to tell you.
And this goes back to my previous comment - how important is it that we can trust it?
I think for something that some of us might be wanting to depend on then it is essential. For something that is just giving us a little extra flavour to our lives, then not so much.
But it does beg the question - what am I going to do with all this information?
Am I going to swallow these pills it has confidently told me are the right medication? Am I going to open this can and hope that there really are tomatoes inside? Am I going to try to look clever and tell everyone that I can translate the Latin on this bench?
I think it is much more important for us to get reliable information than for us to get information tailored to being blind.
I think the other side to this is how much confidence AI always has in itself. So, like an idiot, I totally fell for the idea that it had a compass or head tracking built in. Like, why would it lie to me about something like that? In this case it wasn't important. But what if I had been out hiking under the assumption that it could do this?
Contrast this with OCR: it's usually fairly obvious when it can't cope, but you know it is at least trying to answer the specific question you are giving it, and not potentially inventing a whole new question you weren't aware of.
So I don't think we should just be saying how wonderful this all is without acknowledging the other side. I feel right now we are excited by the potential not its current state.
But we should also not be so hung up on the problems that we are unable to reap the benefits and look forward to what comes next.
When I had a good try with these glasses before taking them away, I went off to find something that normally I would ask the wife for. So I was looking through all these things, getting the glasses to read the labels, and for a while I thought: this is absolutely incredible, it is giving me back something I have lost. I wouldn't go as far as saying I suddenly felt partially sighted instead of blind, as that's a bit of a leap, but it was something of a rush to be able to do it.
So I then made my choice and took it over to the wife... and it wasn't at all what the glasses had told me it was.
And that's the two sides to it. We shouldn't discuss one without the other. But you should also not feel that, just because some of us post negative comments, it in any way invalidates the successes we do have.
I would also agree that £300 is actually not much considering what you get. That doesn't mean it is affordable for everyone, and I am lucky I could afford them. Consider that the top-end Bose Frames were a little more expensive and didn't have the nice charging case, the camera, or the ability to share videos or make WhatsApp video calls. Let alone open up this other world to us.
Anyway, Lottie, please keep doing what you do, which is sharing your enthusiasm and making at least one old curmudgeon a little bit more excited about the future.
mr grieves
If the AI from the device tells you there's a bridge in San Francisco you can get for $4, do not believe it.
@Lottie
Your comment... "I think we should all stop being negative and should all start being grateful for what we have been given. £300 to turn a blind person into a partially sighted person is pretty good value, isn't it?" ... is one of the best things I've read in a long time. It's exactly how I feel. I love these glasses, flaws and all. Do I hope they improve? Absolutely! But I appreciate what they do right now.
Envision glasses
For me, the Envision glasses have never been an option because of the price.
Re: wasn't there a Seleste V Meta thread?
If you are referring to the post titled "Ray ban meta Glasses VS Seleste," it was removed by the poster.