Stepping into the future: Seleste smart glasses Unreview

By Unregistered User (not verified), 15 March, 2024

Forum
Assistive Technology

Last month I took a leap of faith and preordered a pair of Seleste smart glasses. For $99, plus $27 shipping, I was on the list for the next batch of 2000 pairs, due in mid-March.

Today I got the email I had been waiting for. My glasses have been shipped and should be with me on the 28th of March. Seleste is in Canada and I am in the United Kingdom, so that sounds reasonable. I also got a link to download the Seleste app (TestFlight required) and another to update my payment details for the subscription.
The subscription is 65 Canadian dollars, which is automatically converted to about 50 U.S. dollars. The first payment is due one month after the glasses arrive.

The 29th of March is a holiday here, Good Friday, the start of the Easter weekend. It will also be the start of my journey as I take my first step into the future wearing my smart glasses. As they say, my future is about to get a whole lot brighter!

And what will my answer be when they ask me “what can you see through those glasses?” I will reply “the future!”

The real answers will be chronicled in the comments, questions welcome.

Comments

By Brad on Friday, March 22, 2024 - 02:05

His plan is that everyone (OK, a lot of people) use the Seleste glasses.

I don't know how well that will work out for him but we'll see.

The delay is going to be worked on as the product gets updated.

Did you try the Ela thing I mentioned in the email?

By Portia on Friday, March 22, 2024 - 02:05

@Lottie,

I know what you mean about the delay.
I think I read/heard somewhere that he does want to make Ela work faster, so it is definitely a work in progress.
I'm excited to see where this technology goes, as even for me, in my opinion, it was worth getting these.
I'm not one who is able to plunk down the high price for the Envision glasses or similar, since I don't have that kind of money, nor do I have organizations that would get them for me anyway.
Even though he meant these to be for everyone, I am still happy and thankful that he made them... even told him so via email.
So, here's to hoping for a bright future for Seleste.
I had thought of getting the Meta glasses to compare to, but since I'd read that they need sighted help to set up, I knew that wasn't possible LOL.
Anyway, have a great, blessed day/evening all!
Warmest regards to all,

Portia.

By mr grieves on Friday, March 22, 2024 - 02:05

The Meta Ray-bans do not need sighted assistance to set up. I managed to do it on my own and I can't see anything - and I usually bail at the slightest difficulty and ask my wife to help.

The only problem I had was on the very first screen where it asked me to select the glasses I had and the options were just read out as image or button or something. I swiped up on each and it told me what they were. It's possible that it was something in my VO settings allowing me to do that but I'm not aware of setting anything like that up.

It did take me a little while to locate the camera button which it asks you to press during setup, presumably to confirm you've found it. It's on the far right, just on the arm. It didn't take long to find and I was probably just being an idiot like usual.

But otherwise the rest of the process was very simple and the app seems very accessible. I've not had to ask for help with anything. And as I say I almost always have to ask for assistance when setting things up - this includes my Mac and Apple Watch, so I'm hardly one of the blind elite.

By OldBear on Friday, March 22, 2024 - 02:05

I'm having a little difficulty getting my mind around a pair of regular-sized glasses being independently connected to Wi-Fi and having any sort of battery life. I had assumed the Seleste glasses were connected to the phone by Bluetooth, and the phone was doing all the heavy lifting, even if partly over the network.

By OldBear on Friday, March 22, 2024 - 02:05

Oh, so you do have to have the phone. This reminds me of trying to use a portable printer with AirPrint on an iPhone.

By Stephen on Friday, March 22, 2024 - 02:05

I’m glad to hear some of y’all got the glasses.
1. Yes, everything is powered by the phone.
2. No, there is no mic on the glasses, which is pretty clear in the documentation. It is best to use AirPods or your preferred headset with a mic to have the best experience.
If you're routing the audio from the glasses to your phone, you're going to get that high-quality voice through your headset and Ela will be highly responsive.
3. Love how the email template posted above just happens to mention something along the lines of how your business will go under. Is that really the proper way to give constructive feedback? Do you really think startups are going to consider our community if we bully them when things aren't perfect? It has been said time and time again that this is a startup, and the developer has been very responsive about taking constructive feedback.
Is it perfect?
No.
Could it be better?
Absolutely.
Also keep in mind that it is still in beta testing. Instead of complaining, let's provide constructive feedback, because the only way this product is going to get better is by providing feedback. Complaining and using language like "your company will go under" is not the adult way of giving a developer feedback on a product.
I've said it before and I'll say it again: right now, this is the worst it's ever going to be. There are instructions on how to set things up, and if you are still having trouble, just do something as simple as reaching out and asking for assistance. They are always willing to help and make improvements to make the experience better. I would encourage you all to refocus the conversation in a more positive direction; even if there is something you don't like, there's always a way to say it that doesn't come off as abrasive.
But what do I know. I’m just me.

By Louise on Friday, March 22, 2024 - 02:05

Stephen, I like how you phrased it, which is that the technology is the worst it ever will be. I've had these glasses for almost 2 days now, and have figured out mostly how to use them.

I use my Bluetooth earphones with them, and so don't need to have the phone in my pocket when moving around the house.
As for why the glasses need a Wi-Fi connection, I imagine that more data can be transmitted through Wi-Fi than Bluetooth, but that's just me speculating. The smart bulbs in my house that connect to my phone and Alexa also need Wi-Fi, and I have no idea why, so I'm not going to try to understand it for this either. I'm just connecting to my hotspot and going out.

I've had issues with Ela disconnecting. I emailed Shubh, and he answered quite quickly that it's something they're working on, but that turning voice recognition off and on again usually fixes it.

I'm probably going to have a call with him tomorrow, and I do have a couple of suggestions on how to improve the experience, but in general, I'm impressed with what this little startup is taking on.

Am I tempted by the Meta glasses? Yes, a bit, but there may be room for both in my life.

One thing I really like with the Seleste glasses is the ability to capture text without talking. I'm anxious to try this in a meeting where PowerPoint slides are being used.

One thing I'll suggest is that when text is recognized, it could be saved on the phone, so I can refer to part or all of it with VO.

Anyway, I do agree that constructive feedback is the way to go. Shubh is well aware that the success of his company rests on the success of his product, so he doesn't need to be told that. In Brad's defense though, he did decide not to send the email.

Here's hoping that we all look back in a few years and remember how we were in on the beginning of something remarkable.

By Shubh on Friday, March 22, 2024 - 02:05

Hey All

Sorry for not messaging here as much. Thanks, Brad, for reminding me to check in on the AppleVis community. You guys are so passionate about this technology, it's great to see; I've never seen anything like it!

Also really cool seeing some of you in-person at CSUN last week!

Some of my thoughts
Yes, we use Wi-Fi to send photos to the phone and then analyze them in the cloud. We're currently looking into how we can move this over to Bluetooth instead, but that might be something we can only do with the new hardware. This is a really high priority for us.

The audio on the glasses is pretty bad; you can understand the words it's saying, but it just doesn't sound nice. Many of our users route the audio through the phone and then Bluetooth headphones for a better audio experience.

Also, I don't mind the good, the bad and the ugly when it comes to feedback on what we're doing. People are paying for a product and trusting us to deliver, so it's on us to fix the problems people are facing and provide value.

We're working on the issues people are facing right now but I'm excited about the new things we have coming up like Ela being able to help with multi-step tasks, better leveraging memory and more.

By the way, that's a funny story about your experience with Dolphin Computer Access, Lottie. Hopefully, with everyone's help, we're able to create a great product and then we can all look back at the early days lol

By Jahmal on Friday, March 22, 2024 - 02:05

Hey all,
I received the glasses yesterday, so I've had a full day with them.
I did experience a few connection issues when trying to connect to my hotspot.
However, I managed to resolve them without contacting support, though it did require a moment or two of tinkering.
I've encountered some issues with Ela, the AI assistant, but I went into this knowing it's technically a beta and still very new. The problems I've faced so far aren't deal-breakers at this stage.
Why did I get the glasses at this early phase?
I love cutting-edge technology.
I enjoy being involved in projects from the ground up.
When purchasing these glasses, we need to accept and approach it with the mindset that we're essentially getting in on the ground floor.
Instead of just complaining about our issues, we should discuss them, yes, but also share some positive experiences we've had.
We shouldn't speculate on whether they'll succeed or not, express that the company will fail if these issues aren't fixed, or expect production-quality software.
It's been made clear that this is still a very early version.
We need to support the developers.
Think of it as an investment in the future.
These glasses have tremendous potential.
Today, I read a restaurant menu using the scan text feature. That was such an awesome experience!
I also had the chance to look out of the window in my Uber as we were driving down the road. That too was an awesome experience.
I believe that over time, these glasses will advance and improve, and we'll look back on this and be amazed at the progress.
So, let's hang in there, be supportive, and back the glasses.
Let's recognize where they currently stand, meet them there, and grow with them.
Let's help make these glasses the best they can be, without becoming a source of stress for the developers.
I know we can do that.
Even if they do fail, that doesn't mean nothing came out of it. Who knows? Maybe the next project will succeed, and the developer will take what they've learned to wow us all.
I prefer to focus on the bigger picture of these glasses and the company succeeding, where we had a significant impact in making that happen!
They may not be for everyone, but that's okay.

By Stephen on Friday, March 22, 2024 - 02:05

So just so y’all know, you can continuously ask the assistant to do things like:
Continuously describe my environment,
Continuously describe the people around me,
Continuously describe what’s at the car window,
But my fave one:
Continuously describe what’s going on on the television screen.

Some tips:
If you make a phone call, use dictation, or use Siri, the glasses assistant will glitch out.
Sometimes I have to just go into the app, sometimes I have to turn the smart assistant off then back on, and sometimes I have to restart the app. With the original setup, yes, you do have to tinker with it just a bit.
The devs are working on all of those issues though, so have faith.
Yes, there is some latency, but it's actually better than it was before. I'm sure it will get even better with time.
I haven't used them in a few days because I'm waiting for the new Shokz OpenFit to be delivered, as I hate wearing the AirPods for long periods of time.

By Jahmal on Friday, March 22, 2024 - 02:05

I have a set of Shokz OpenFit, and unfortunately, they and the glasses do not fit well together. The glasses don't really fit over the loops of the OpenFit that wrap around the ear. At least, that has been my experience.

By Stephen on Friday, March 22, 2024 - 02:05

Well they better fit on mine! I didn’t spend $250 for nothing lol.

By Harryubu on Friday, March 22, 2024 - 02:05

I just went out for an hour's walk with my Guide Dog, and throughout that time I used continuous scene description. The battery on the glasses dropped by 20% and the battery on the phone dropped by about 10%. That was a pleasant surprise!
I was using my AirPods Pro as well as the glasses, and had the audio in the Seleste app routed to the iPhone. This made a huge difference to my experience! The voice you get describing things is much more human-like.
As I walked along, the voice did describe some features in front of me, but it was very sparse: I would get very bored of hearing that there was a sidewalk going straight ahead in front of me with a hedge on my right and a grassy area on the left with some parked cars! Occasionally it did make more interesting observations about items. What I did notice was that it only recognised a pelican crossing when I was about 2 yards from the traffic lights, whereas it spotted features like roundabouts much further away. It also uses American terminology, so for example it kept announcing that there was a pedestrian crossing ahead or on the right whereas there was simply some tactile paving. In the UK most people would understand by pedestrian crossing that there are white stripes on the road and beacons or traffic lights with a red or green figure illuminated - so this can be a bit confusing.
In order to get the glasses to report the name on a street sign, I had to stand and wait for nearly 30 seconds. I think it would be useful if there were some audible beep to indicate that the glasses had just taken a photo which they were preparing to describe, because otherwise I was constantly left guessing at what point the photo had been taken and therefore how far back the description corresponded to.
It would also be useful if the assistant could indicate that the pavement or path in front is narrowing; it did not pick up such features, nor that the pavement narrowed because there was a car parking space there.
Great potential!
Harry

By Brad on Friday, March 22, 2024 - 02:05

You are beta testers because you're going to be testing new features as soon as they come out, well, if you decide to keep the glasses.

What other word would you prefer they use? You're testing features as they develop; isn't that what a beta tester does? The only difference is you're paying to support the company and it's not free.
The app isn't on the app store yet as far as I know either so that's still being ironed out too.

By Brad on Friday, March 22, 2024 - 02:05

well then we're going to have to agree to disagree on this one.

By Lee on Friday, March 22, 2024 - 02:05

When is a beta tester not a beta tester? Say in a few weeks Seleste introduce a new feature and want feedback; at that point you are, by definition, a beta tester. Most people would update to that new feature and try it out. However, I do agree with Lottie that if a product comes with a set of features that have already been tested and had the bugs removed, then that should be the minimum the person purchasing the product should expect to work. So in this instance, Lottie should receive what it said on the tin, but going forward she may become a beta tester for future releases. Envision, I believe, introduced a feature last year called Describe Scene, and at the time they actually said in their updates that it was a beta version. A couple of months later they released the full version. So I don't see any issues with that process. After all, say 2 people testing a feature can't be certain they have found all the bugs, whereas 100 people may well do.

By Karok on Friday, March 22, 2024 - 02:05

Hi all, I agree with Lottie in some respects, but I think what Brad and others are stating is that, okay, you are buying the product with a down payment. Then, as you pay your $50 Canadian per month, you are able to test, I believe, weekly features as they emerge, and I understand that when the new hardware comes out you get it as you are a paying customer. Yes, you are beta testing the product, but you will be helping, from what I understand, to shape its future. What I would like to know, on a different note, is: when, say, I want to read a book, will it describe any pictures in the book as well as reading the text, and allow me to batch scan? I find with Be My Eyes I can't, say, batch scan a few pages and recognise them in one go.

By Ollie on Friday, March 22, 2024 - 02:05

Full release, to me, means a feature set that works 100% of the time. Beta means as-yet-unreleased public features that may still have issues.

This sounds like a well intentioned product, but I'm not going to bother. There are going to be far better and more stylish products coming out with a wider user base in the near future. I don't want to be testing a product I've paid for, I want to use it.

By Harryubu on Friday, March 22, 2024 - 02:05

Just been out to a cafe and ELA read the menu out to me beautifully! The only snag was that the other 3 sighted people I was with kept interrupting my listening by talking to me! I will have to go into town alone to a cafe to practise asking ELA questions about their menu. Good excuse? Harry

By Brad on Friday, March 22, 2024 - 02:05

Exactly.

It's going to be interesting to see what comes out in a couple years.

By Ollie on Friday, March 22, 2024 - 02:05

I think it will be sooner than that. I've read several articles naming 2024 as the year of AI glasses. Some have suggested that OpenAI's first device will be glasses, though I'm not sure. Meta have certainly opened the door though; you're right, in a couple of years we'll be looking back on these devices and sniggering. Fools we are, fools we were.

By Louise on Friday, March 22, 2024 - 02:05

We shouldn't be all that surprised that the Seleste glasses aren't as smooth as they're going to be with more work and development. From the beginning on this site, Brad let us know that this is a startup, and Shubh has been clear that this is early days and the product will significantly improve. In addition, the app is clearly a beta app.

That being said, although connection hasn't been as smooth and easy as I think it will be in future, I'm hard pressed to know what the glasses fail to do that was promised.
Recognize text within about 5 seconds, check.
Describe a scene at the push of a button, check.
I haven't tried the walking around features, but read here that they work.

I have ideas on how they can be improved, but they do live up to their promise IMO.

By mr grieves on Friday, March 22, 2024 - 02:05

They sound less like a beta than most major OS releases from Apple or any other bit of mainstream hardware or software I try to use. I know maybe there are different expectations with something actually built for us, but since I've been blind I feel like everything is beta.

I think being an early adopter of something like this, I would probably expect a few wrinkles. And I don't think we can consider anything that mentions AI as anything other than a beta. (That's not AI-bashing, I love it, but also we have to accept its imperfections) I think I would have been pretty annoyed if I had been unable to set it up when the meter is effectively running at that point, but it sounds like the company is pretty responsive.

The question is more around whether in its current form it does enough for you to warrant the price which is something only you can answer.

I don't think it does for me personally just yet but I'm continuing to read these posts with great interest and a tiny bit of jealousy. And I can't wait for someone to do a demo of the continuous mode! (OK I may have mentioned that already.)

I think really for anything that's a bit pricey I'm going to wait a generation or two and see how it settles down. The sound quality is going to have to improve as I don't want to have to buy another device just to use this one. Maybe I can somehow balance them on top of my Meta Ray-bans for that true 4-eyes look.

We are so close to the perfect blind person accessory - whether it ends up being the Seleste or the Meta Ray-bans or something else, it's a really exciting time and it's hard not to get a bit impatient.

By Shubh on Friday, March 22, 2024 - 02:05

Lol, this beta-testing label discussion is interesting; honestly, it's something that I struggle with as well. I think the best definition is what Lee said. We're constantly adding new features, and so those newer features, like using Ela to remember things, are "in beta" since they're less reliable, whereas things like describing the scene are more reliable.
At the end of the day you are paying money for a product, and we need to provide enough value that it's worth that $50 a month. At our stage I guess part of the value also comes from being first to get access to new features. But what people find valuable varies a lot. That's why I like getting on calls with people to understand what they want from the glasses, and I'm upfront about whether we'll be able to do it and our timeline for doing it.

Also, Harry, you can reread what was previously said in the app, under previous AI result, in case you get interrupted by someone talking or just want to read it again. You can also copy the text from there if you want to save it somewhere else.

By OldBear on Friday, March 22, 2024 - 02:05

I read "The Murderbot Diaries" by Martha Wells, and the humans have some sort of internet-like connection in their brains. When they're looking at something in their brain connection, they get a blank stare in their eyes.
So is it going to be noticeable to the sighted people when, if wearing these AI-type glasses, we are receiving audio descriptions because we just kind of get this... look?
Patricia Cornwell also has a couple of characters in her "Kay Scarpetta" novels who use AI-fed sunglasses in their Secret Service jobs. They're always getting information in the glasses that they don't talk about. So are sighted people going to be asking us how our glasses are describing them, or other sighted-people-type questions?

By Harryubu on Friday, March 22, 2024 - 02:05

Compare Seleste (£600 per year) with Envision glasses (£3500 plus £200 per year). How many years of paying the Seleste sub before they equalize? I reckon getting on for 9 years! Envision is quite old hardware. Do your own maths. Seleste seems pretty good to me.

By OldBear on Friday, March 22, 2024 - 02:05

Sometimes I scan the junk mail of certain sized cards because neighborhood alerts are interesting. The rest gets used as scrap paper.
I don't have a choice; my mind gives everything a color or appearance, sometimes just from its name. Your air fryer is a kind of steel blue/gray, and I will have to fight with my mind if that turns out to be wrong. I did use the Clothes Color app to confirm that my pizza cooker is red, which I knew before I bought it, but it is white in my mind part of the time.
I guess it might be nice to have instant color information about something before it is set in my mind. It also comes in very handy to know the actual color of things for when you have to talk about something with a sighted person.

By Bruce Harrell on Friday, March 22, 2024 - 02:05

Maybe I missed it. Has anyone with these glasses shared how they describe people around them? Appearance? Facial expressions? Body language? Clothing? Age? Attractive or unattractive? Make-up? Neat or dirty? Hair style? Complexion and eye color? Race or ethnicity? Jewelry etc. being worn? Mood? Abnormal features, such as a big mouth or close-set eyes? Height? Physical endowments, such as bust size, muscle mass, or fat? Sitting, standing, running, walking, crawling about on the floor (as if there were anywhere else to crawl)?

By OldBear on Friday, March 22, 2024 - 02:05

I guess we both know now. Oddly, the air fryer is black in my mind now, so maybe these AI descriptions could help.

By CrazyEyez on Friday, March 22, 2024 - 02:05

I ordered a pair.
Figure I have nothing to lose with the 30 day money back guarantee.

I'm excited to try out all its features.
I will make a recording as soon as I get them if nobody has done one by then.
Thanks to all of you who have shared your experiences.

By Stephen on Friday, March 22, 2024 - 02:05

It can describe people's features, hand gestures, what type of clothing they're wearing, etc., but as for attractiveness, attractiveness is up to interpretation. What one may find attractive, another will not. It can describe certain things to you; it's up to you whether or not that's attractive to you.

By Bruce Harrell on Friday, March 22, 2024 - 02:05

Thanks Stephen, but how does it do with facial expressions? Body language? In other words, what information do the glasses offer about another person's mood or emotions?

By Gokul on Friday, March 22, 2024 - 02:05

As much as much of the visual information out there might not really be important for living the daily life of a blind person, it is both interesting and useful to have this info, especially if you deal with a lot of sighted people on a regular basis. And there are details we miss, say, when we go to a new place such as a new room, which are obvious to everyone else. This was the greatest thing about Be My Eyes for me. And on a side note, does anyone know if Seleste plans to offer a feature where you can save a picture of a person with a name, and whenever that person is detected again, it tells you their name? Envision has that, I guess.

By Harryubu on Friday, March 29, 2024 - 02:05

So, after 8 years, Envision glasses cost £3500 plus £1600 for updates (@ £200 per year) = £5,100.
Seleste costs £4,800 (@ £50 per month).
So, yes, you do have to pay for nearly 9 years to equalize!
The subtle killer is the Envision annual £200 for software updates!
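In case anyone wants to sanity-check that break-even figure, here is a rough sketch in Python using only the numbers quoted above (Envision at £3500 up front plus £200 per year for updates, Seleste at £50 per month); treat it as back-of-the-envelope arithmetic, not anything official from either company:

```python
# Break-even sketch using the assumed figures above:
# Envision: £3500 hardware up front plus £200 per year for updates.
# Seleste: £50 per month (£600 per year), no up-front hardware cost.
ENVISION_UPFRONT = 3500
ENVISION_PER_YEAR = 200
SELESTE_PER_YEAR = 50 * 12

# Find the first whole year in which Seleste's running total overtakes Envision's.
years = 1
while SELESTE_PER_YEAR * years < ENVISION_UPFRONT + ENVISION_PER_YEAR * years:
    years += 1

print(f"Seleste's total cost first exceeds Envision's in year {years}")
# Solving 600n = 3500 + 200n by hand gives n = 8.75, so the totals cross just before year 9.
```

Running it prints year 9, which matches the "nearly 9 years to equalize" figure above.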

By Harryubu on Friday, March 29, 2024 - 02:05

This version of the Seleste glasses is not it - and Shubh quite openly says so. I just reckon they are keen, dedicated and very open to learning from us. As I have posted before, £50 a month works out SO much cheaper than the Envision glasses. And if someone else comes out with a better option, we can all cancel our Seleste subscriptions and continue the ride elsewhere.

By Emre TEO on Friday, March 29, 2024 - 02:05

Ray-Ban Meta and a few different glasses I tried worked seamlessly with Face ID. What is different with Seleste?

By Harryubu on Friday, March 29, 2024 - 02:05

Yes, I just unlocked my phone with Face ID while wearing Seleste. You could redo your face scan and include a version wearing Seleste?

By mr grieves on Friday, March 29, 2024 - 02:05

I think I know what you mean. This form factor is incredibly convenient.

It's maybe similar to how I am with the Meta Ray-bans. Could I have used my iPhone for all the little videos I've been taking? Sure. Would I have enjoyed it as much? No way. Would I have bothered? Well, possibly not. But it does sort of defy logic.

I think some tech just clicks and feels nice to use. Sometimes it's obvious why, but sometimes it's just a feeling. I had a similar thing with the Hable One. Did I need it? Certainly not. Could I bring myself to send it back at the end of the trial? For some reason, no.

I think a lot of it is how much effort it is to use. It's not a massive amount to open my iPhone, find the right app, go to the right setting, take a photo, wait a bit etc. But it's still enough that I might not bother unless I really need to. Maybe we are just spoilt with all the tech that is out there now.

Anyway please keep all the updates coming - I'm enjoying finding out all these little details. I'll keep holding off for now but maybe one day.

By SSWFTW on Friday, March 29, 2024 - 02:05

Is there anyone here willing to record a few demonstrations? Not a walk-through, but something that would give us a real-world view of how fast or slow they are and the quality of the descriptions you get.

By mr grieves on Friday, March 29, 2024 - 02:05

I like to think that we are moving slowly away from being able to do things we need to do, to being able to do things just for the hell of it.

I know you can with the iPhone, but what I mean is if you have something that's simple and enjoyable to use, you might use it just because you can, as opposed to solving a specific problem with it.

Going back to the Metas - in the past my wife would sometimes ask me to take a photo, and I'd groan and grumble and fumble around with my phone to take it. It's good that the phone helps you get things into the centre, but I will inevitably forget which of the physical buttons is a shortcut for taking the photo, and I'll be swiping around all over the screen trying to find the take photo button. And it's fine, and it's amazing that I have this tech that lets me do this so well without being able to see what I am doing.

But on the other hand, I can just ask the M guy to send the photo directly to my wife and it's done in a second. And now I'm more efficient at taking the photo than she is. OK it might not be perfect but it's usually good enough and I think I am finding that quite empowering. Whereas on the iPhone, yes I can do it but it usually takes me a bit longer than I would if I could see.

Anyway, no doubt I'm going off on a tangent and this is nothing like your experience, but I'm really glad you are enjoying the Selestes so much. All power to us in the long term.

By Tom on Friday, March 29, 2024 - 02:05

I've been reading the discussion, together with similar topics, and it prompted me to order the glasses; unfortunately I only decided on March 1, just a bit late for the first round.
But a few thoughts about the recent discussions, and what made me sign up.
I read recently that it is not as detailed as Seeing AI or Be My AI. I honestly wonder whether that is actually a bad thing. When sitting at home going through old pictures, I want as much detail as possible; for that matter, I can theoretically zoom in on different parts of a picture and have it described in small parts, and thus in great detail.
But walking on the street, I need quick and useful info.

Regarding the point that you can use the phone, though it may take a bit longer to get the info: it is true, and I was trying to work out why I am not doing that, then. Ultimately I feel that sometimes it is a deal breaker when I need two hands. When I am out and about, I have one; the other holds the dog or the cane. And it makes it so slow or complicated to use the phone that I would rather plan ahead and eliminate it as much as possible. This is why I decided on the glasses: if they could give me an audio or one-handed solution, it would make a huge difference. Yes, there is Siri, but currently it doesn't work with the workflow I would need.

But I will leave it there; the rest would be speculation, which wouldn't make sense without holding the glasses. If it isn't too late, I will definitely write a review.

By MissThea on Friday, March 29, 2024 - 02:05

I am a girl with an eye for neat gadgets. Always have been. So, are these virtual reality glasses or what? Is this, I guess, a partially sighted kind of app or glasses that you can use? I'm total, so I guess I won't be able to use them, but I've always heard stories about how they could make a pair of glasses so that even a totally blind person can take in visual information. And I watch too much science fiction. Thing is, I'm very interested in these glasses. Keep us posted.

By Brad on Friday, March 29, 2024 - 02:05

They're glasses that allow you to read text, get descriptions of what's around you, and stuff like that.

Honestly the website is quite bare-bones at the moment; in other words, it doesn't have much on it when it comes to what these glasses can do.
There's a YouTube channel, but it could be a lot more professional; the guy, Shubh, shows off the glasses a little bit, but nowhere near my standards of what I'd call a professional level.

By Brad on Friday, March 29, 2024 - 02:05

Shubh has told me that that stuff will come in the next hardware update.

I'm more interested in the Glide, as I've said, so I should really leave this topic alone, but you know me, I can't resist answering questions.

By OldBear on Friday, March 29, 2024 - 02:05

Just remember, slogans serve the purpose of replacing thinking with chanting, even if it's mental chanting.

By Brad on Friday, March 29, 2024 - 02:05

That you guys who can must make changes when you can. Who knows, you could ask the person, if they're worth asking, that is.

By Dave Nason on Friday, March 29, 2024 - 02:05

Member of the AppleVis Editorial Team

Enjoying the thread, but as someone who has ordered but is yet to receive the glasses, I must admit I’m getting cold feet.

By SSWFTW on Friday, March 29, 2024 - 02:05

Hearing that they are slow and that the quality of the descriptions is not as good as Be My AI makes me much less excited.

By Brad on Friday, March 29, 2024 - 02:05

He'll give you a refund, he did for me.