Meta Connect - new glasses, a screen reader and an open platform - Seeing AI anyone??

By mr grieves, 28 September, 2025

I know this happened a week and a half ago, but I am amazed no one on here is talking about the Meta Connect event, which feels a lot more exciting to me than the recent Apple events. I'm sure most on here have been following it anyway, but I'll give a very quick rundown.

There are two parts to this. Firstly, the new hardware.

There's a second generation of the Meta Ray-Bans, now with 8 hours of battery life and a 3K camera.

Then there are the Oakley Meta Vanguard, which do the same sort of thing but are wrap-arounds with 9 hours of battery, plus the camera is in the middle rather than on the left. They also integrate with Garmin.

And finally the Meta Ray-Ban Display. These are similar to the normal glasses but have a tiny screen in the right lens. I don't know how useful this will be to the folks on here. If anything, the more interesting part of this announcement is the separate controller called the Neural Band. I'm not sure I fully understand this, but it is something that fits on the wrist and can be controlled with subtle gestures. Apparently Meta are considering allowing it to be used as a standalone controller for other things in the future.

The other intriguing thing about the Meta Ray-Ban Display is that it will have a screen reader from day one. This is pretty surprising, and great. Whether it makes the Display worth having for us over the other glasses, I don't know.

But without a doubt the most exciting thing is the Meta Wearables Device Access Toolkit. We've been talking on here for a long time about the restrictions of the Meta glasses - you can only use the functionality that Meta provides. Well, it seems, no longer.

And amazingly, accessibility is a big focus of this. It's already been announced that Seeing AI is going to make use of it, and companies like Be My Eyes (of course) and, surprisingly, HumanWare are coming on board.

According to Double Tap we probably won’t see the fruits of this until sometime next year.

But have a look at this: https://developers.meta.com/blog/introducing-meta-wearables-device-access-toolkit/

I think initially they are going to be providing other apps access to the camera, microphone and touchpad, but they have also said that they are looking into integrations with Hey Meta at some point.
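
Nothing detailed about the toolkit's API is public yet, so purely as a hypothetical sketch of the general shape this could take - a companion app on the phone pulling camera frames off the glasses and handing them to its own recognition code - it might look something like the following. Every name in it is invented, not the real SDK:

```python
# Hypothetical sketch only: the real Wearables Device Access Toolkit API has
# not been published yet, so GlassesSession and everything around it are
# invented stand-ins, not Meta's actual SDK.

import itertools
import time

class GlassesSession:
    """Stand-in for a toolkit session streaming camera frames to a phone app."""

    def frames(self):
        # The real SDK would presumably deliver camera frames from the glasses;
        # here we just fake one placeholder frame per second.
        while True:
            yield b"<jpeg bytes from the glasses camera>"
            time.sleep(1)

def describe(frame: bytes) -> str:
    # An app like Seeing AI would run its own recognition here, on-device
    # or via its own service; this stub returns a canned answer.
    return "a kitchen counter with a kettle on the left"

if __name__ == "__main__":
    session = GlassesSession()
    for frame in itertools.islice(session.frames(), 3):
        # In practice the result would be spoken through the glasses' speakers.
        print(describe(frame))
```

The point is less the specifics than the division of labour: the glasses provide the sensors, and the third-party app decides what to do with the data.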

So hopefully this means the end of WhatsApp hacks for things like Aira and PiccyBot, and maybe we can start having the full hands-free integration we’ve all been dreaming of.

It also seems that they are open to the use of other AI models, so I don't believe everyone will be forced to use Llama.

So my main question is - why on earth is no one talking about this on here?

Options

Comments

By mr grieves on Sunday, September 28, 2025 - 13:55

Can't find an edit option on my main post, but I almost forgot: the new toolkit is supposedly going to be available for all the Meta glasses, which I believe includes the gen 1 Meta Ray-Bans that many of us already have.

By Brian on Sunday, September 28, 2025 - 14:06

I have been behind the times when it comes to news like this. I was aware of the Meta Oakleys, but did not know about the other technology. I'm thinking my next pair of Metas will be Oakleys, as I would love to have Meta smart glasses in a wrap-around form factor.

By Gar on Sunday, September 28, 2025 - 14:11

While I'd heard about the new glasses, I hadn't heard about the screen reader or any of the other stuff. But honestly, it's a Meta product, and I really don't have any trust in them or desire to use anything they put out. But I recognize that's just my opinion, and others may vary.
With that said, I think the main reason it's not being discussed here is that it's not an Apple product, and the core Apple categories of the site are visited more frequently than this forum category. So it very well could be popping up here or there, but not in plain sight.
I'm not throwing any shade here; I don't think the AV Mastodon account posts topics from some categories at all, either. It's highly possible it might be getting some traction on Mastodon or in other places, though.

By mr grieves on Sunday, September 28, 2025 - 14:46

They have been talking about this on Double Tap a bit, and they have also been unable to quite understand why this isn't bigger news all round.

I can't quite see the point in the Meta Display glasses for us, but it really took me aback when I heard they were going to have a screen reader up front. It was almost in the small print. Quite why they weren't making a big deal about this, I don't know.

I've been dreaming of Apple glasses that would integrate all my favourite iPhone apps. Now it looks like I already have all I need and just have to wait until next year. What we need is something like VoiceVista that can then use the camera to provide door detection and things like that.

I was almost excited by the Envision Ally glasses, but the Ally app felt pretty hopeless to me, whereas I really can't wait for this.

I hope the toolkit won't be restricted to the US to begin with.

It does feel like all the pieces of the puzzle are fitting into place for us.

I understand why people don't like Meta. They aren't a company I have been particularly fond of. But I really can't wait to see where this all goes, and it is pretty thrilling that we are right at the front of the queue for once.

By Oliver on Sunday, September 28, 2025 - 15:22

I nearly wrote about this but, as it's so far off, I didn't really think there was much to say yet. Yes, it's all coming, but we've heard that before.

I've actually applied for the dev kit, so we'll see where that goes.

Call me jaded, but a lot has been promised. I'm still cautiously optimistic that this will be the turning point for us, where Meta glasses go from something that's kinda cool but not quite filling our needs to something that is really powerful. My dream, as I've mentioned elsewhere, is to be able to kick back and read a book, any book. This would be the app I'd look to develop: purely book reading. I know Seeing AI will have document reading, but book reading, across many pages at least, might be a little beyond its remit.

I'd go for the Vanguards. I find the Meta Ray-Bans uncomfortable. I've got a low nose bridge, so as soon as I get a bit sweaty (gross), they slip down my snozz. I know the second gen have a better fit and should stay put, but I'd rather go a little more expensive and get wrap-arounds. Not sure about others, but if I've got a bit of a flare-up, or a stye, I'd like to have that full hidey coverage. Centre camera is also cool. Plus, I can pretend to be like totally sporty and stuff... WEEEE

By jim pickens on Sunday, September 28, 2025 - 15:56

I have no idea how I never knew about this. The screen reader and Neural Band I knew about, but the integrations I decidedly did not. My only excuse is… I don't watch events... or something?

By Tara on Sunday, September 28, 2025 - 16:20

Oh dear, well it's a bit of a sore point I'm afraid. I've gone and ordered the Ally Solos glasses now, but if I'd known Seeing AI was going to be on a pair of glasses coming soon, I probably wouldn't have ordered them. Particularly as Seeing AI is better at identifying products than the Ally app is. I'm just hoping that when I wear the Ally glasses, they'll be better at identifying stuff, since they'll be sat on my face, so to speak, and I won't have to position the camera. But the Ally app really did hallucinate when I asked it what I had. It got totally the wrong type of beef burgers. It said they contained jalapeno, and they definitely didn't! It just made the whole thing up. They were only Birds Eye, after all! I could always send them back if I really don't like them - the glasses, not the burgers; they've already been eaten, and I liked those, no question. I always wanted Seeing AI on a pair of glasses, though. Maybe I could have both? The Meta glasses and the Ally Solos? We'll see.

By SeasonKing on Sunday, September 28, 2025 - 18:48

I know Meta has delivered some amazing things in recent years, and my finger is itching to hit that buy now button, but I am holding out for Google's version to launch so I can make an informed decision.
Also, I need to know whether Seeing AI and other assistive apps make good use of Meta's SDK, because the ones from Google are bound to have better integration with all of its existing Android apps. Yeah, Google TalkBack combined with Google's TTS is kind of frustrating, but if they tackle that in the glasses version somehow, and if we get some nice capabilities in terms of accessible navigation and recognition, it might be a much better deal.

By Missy Hoppe on Sunday, September 28, 2025 - 19:12

I'm just speaking for myself personally, but I honestly don't want anything to do with Meta as a company. Yes, I have Facebook, but I consider it more of a necessary evil than something I genuinely want. For everyone who is excited about all of these new announcements, that's great, and I'm sure I'll read as much as I can just for my own knowledge, but I just have a lot more confidence in Envision. Can't wait to get my Ally Solos glasses; they'll probably meet my needs better than the Metas did and will almost certainly have fewer privacy concerns.

By Michael Hansen on Sunday, September 28, 2025 - 23:43

Member of the AppleVis Editorial Team

Perception is everything. I don't trust Meta because of Facebook's track record with user privacy. If I'm going to be having glasses with a camera on them, I want that data in the hands of a company with a strong and proven track record for putting user privacy first. That said, though Meta glasses are not for me, I am very glad that third-party apps like Seeing AI are going to be able to come to the glasses; and Meta's inclusion of a screen reader on the display model is a welcome development.

By Gokul on Monday, September 29, 2025 - 02:28

I was about to post it here, and was really surprised no one did till now. I knew all the stuff, including the HumanWare and Seeing AI integrations, and am happy I waited till the Meta Connect event before deciding whether or not to order the Ally Solos. I honestly believe the opening up of the developer toolkit is revolutionary. Also, I'm not sure what promise Meta gave that they've failed to fulfil? I can understand if someone said Meta never gave us any promises, but I don't know what they promised and didn't deliver. If you ask me, that's Apple, not Meta.
Edit: I guess most of you have heard of it, but in case you haven't, the HumanWare integration will also include indoor navigation. I'm excited for it, much as I have my doubts about how practical it'll be.

By Brian on Monday, September 29, 2025 - 03:18

I think a lot of people in regions such as the UK were unhappy with the delay in getting the actual AI portion of Meta AI. That, and of course being able to describe your scene/photo/whatever with said AI.

By mr grieves on Monday, September 29, 2025 - 16:57

Whilst, as I said earlier, I don't really trust Meta, I'm not sure I particularly trust any of the big companies offering Large Language Models. I definitely don't trust Google any more than Meta. I'm not sure I particularly trust OpenAI either.

And whilst I don't think Envision as a company seem to have any hidden agenda, are they not just making use of LLMs created by the big companies? I'm not exactly sure who they use but suspect it's probably ChatGPT. So I'm not convinced that it makes the Ally glasses a more trustworthy product than the Meta glasses.

Regarding buying things based on potential and promises, I definitely think that is not a good idea. You should judge a product on what it delivers now not on what may or may not come later. In my case, I had a pair of Bose Frames and although I liked them, the battery wasn't very good and they were hardly that sturdy. The Meta glasses had a little better battery, a more convenient way of charging and I think if that's all it was then I would still have got a lot of good use out of them. But naturally I was also persuaded by the promise of what was to come.

One trend I don't like is that we are given deals that are almost too good to pass up, but made at a time when you can't possibly make a good judgment call. All the accessibility companies are doing it - whether it's Envision, Glidance, WeWalk or the company behind Echo Vision. The ones I like least are the ones that offer lifetime discounts on subscriptions. It's all a bit too close to gambling for my liking. I guess they need the capital up front but I don't like it.

When I tried the Ally app, I really did not enjoy talking to the voices, but more importantly, the information I was getting was very bad. I think for glasses to really achieve their potential for us, we need more reliability. They're all very well for low-stakes stuff, but then so are my Meta glasses; for anything remotely important, like checking medication or navigation, none of the LLMs really deliver. We let them off because they are so convincing and the potential is so huge, but if something is feeding you information that is likely to be wrong, how are you going to know when it is right?

Now Envision has promised some local functionality, and I suspect that will mean Seeing AI-like functions. But, again, that's just a promise, which is the same for the Metas.

I doubt on day one of Seeing AI appearing on the Meta glasses that it will be exactly what I want. I suspect I will need to open up the app outside the glasses, but it is still moving in the right direction.

None of the options that are available today or even this year are likely to be exactly what we want, but they all do offer a glimpse of where this can go. It will be fascinating to find out where we are this time next year. Hopefully this will have appeared, and maybe Glide too. Let's hope these things can deliver.

It's an exciting time for sure.

By Ash Rein on Tuesday, September 30, 2025 - 00:58

The AI is pretty unreliable. It gets a lot wrong. And it's a lot to invest in, with so much that it can't or won't do. And the Meta presentation with Zuckerberg failed pretty miserably. Between this being unreliable, Glide being delayed, EchoVision being delayed and Envision being unreliable, people have gotten weary. Be My Eyes and Be My AI work OK on the phone. Google XR glasses are on the horizon. Apple glasses are coming, and they are a year or three away. There was a lot of excitement earlier in the year, just to be repeatedly let down. People are burnt out.

By Gokul on Tuesday, September 30, 2025 - 06:10

I don't know what there is to be frustrated about when I think of the fact that some 15 months ago, getting a picture described the way it is today by an LLM was not even in the frame of possibilities. These are all early iterations, and there are bound to be glitches, failures and even regressions. I mean, iOS only got its screen reader in version 3, if I'm not wrong. Just because there have been some setbacks doesn't mean either that the potential cannot be realised or that one's got to mistrust everything. Much as it is not ethical on the part of any company to make unfulfillable promises, it's unreasonable on our part, as consumers, to set unrealistic expectations.

By Ash Rein on Tuesday, September 30, 2025 - 11:03

It's just that too many promises have been made. People got their hopes up. They thought that by this point they would be using Glide and running around doing all the things they've been dreaming about. They thought they would be using smart glasses to read everything around them, have everything described without fail, meet the partner of their dreams, etc. It came too fast and too hard. And then, when it started looking like all those things were going to take a little while longer, they pulled back. So now they're hyper-cautious.

By João Santos on Tuesday, September 30, 2025 - 20:06

I wasn't aware that these new products would be open for development instead of totally tied to Meta's ecosystem, and to me that changes absolutely everything. Whereas I couldn't care less about the Meta Ray-Bans as they were, a wearable like that for which I can develop my own software is a dream for me, so short of any major showstopper details I'll be all over these. The only Meta service that I use is WhatsApp, because I'm kinda forced to use it, but if these platforms end up being open enough to allow my creativity to flow, then Meta might just have won me as a customer. Privacy is not an issue here, since to my knowledge none of these devices has its own Internet connection, and I can just run large machine learning models on my Mac Studio back home, so none of what I intend to use these devices for will be sent to untrusted destinations.
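
For what it's worth, the self-hosted setup described above is straightforward to sketch: the companion device ships each frame over the network to a machine at home running the model. A minimal sketch, assuming a self-hosted HTTP endpoint you'd write yourself (the address and JSON shape here are invented, not any particular product's API):

```python
# Minimal sketch of shipping a camera frame to a self-hosted model server.
# The endpoint and response format are assumptions - you would define both
# yourself on the machine running the model (e.g. a Mac Studio at home).

import base64
import json
import urllib.request

HOME_SERVER = "http://192.168.1.50:8080/describe"  # hypothetical address

def describe_remotely(jpeg_bytes: bytes) -> str:
    payload = json.dumps({
        "image": base64.b64encode(jpeg_bytes).decode("ascii"),
        "prompt": "Describe this scene briefly for a blind user.",
    }).encode("utf-8")
    req = urllib.request.Request(
        HOME_SERVER,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    # Blocks until the home server replies with its description.
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)["description"]

if __name__ == "__main__":
    with open("frame.jpg", "rb") as f:
        print(describe_remotely(f.read()))
```

Nothing sensitive ever leaves your own network in this arrangement, which is presumably the appeal.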

By feofil on Wednesday, October 1, 2025 - 11:32

Anyone remember the Osborne? It is a fossilised dinosaur in today's rapid growth of technology, but at the time it was a massive forerunner of the laptop computer. It was roughly suitcase-sized, weighed about 23 pounds, and the keyboard folded over to cover the five-inch screen. It came out in 1981, so it is over forty years old.

In early 1983, Osborne announced its successor well before it was ready to ship. So what happened?

Suddenly, people stopped buying the original Osborne, preferring to wait for the new model. Revenue fell off so sharply that Osborne eventually folded its keyboard one last time and filed for Chapter 11 later that year. It never emerged from bankruptcy and disappeared from the market. It made promises so extravagant that it encouraged people to wait rather than purchase the original unit. Had it kept its promises to itself, it might still be around today. Apparently, Dufferberg never heard of the Osborne and its tragic marketing fiasco.

Feofil

By TheBlindGuy07 on Wednesday, October 1, 2025 - 14:30

The day you code something for this platform, I'll purchase one, even if I don't need it or your software at all :) just sayin'...

By Brian on Wednesday, October 1, 2025 - 19:33

VOSH on Meta?
🎵 Dun—dunn—dunnn! 🎶

By João Santos on Wednesday, October 1, 2025 - 22:49

I have other plans for wearables, which I was already going to implement using Apple products and which may require an iPhone to run Metal code on its GPU, but until I actually find out what's available I can't really tell exactly how I'm going to use Meta's hardware.

By João Santos on Thursday, October 2, 2025 - 00:53

Just took the time to read about the Wearables Device Access Toolkit, and what I gather is that it's not really going to run anything on the wearables themselves, which to me is really good news, because that likely means direct access to the audio, video and motion-sensor data streams from a companion device like an iPhone. So unless Apple pulls yet another anti-competitive stunt to limit what Meta can actually deliver, I think it will be good. I am not signing up as a developer right now because I still have work to finish.

By Oliver on Thursday, October 2, 2025 - 04:35

Very much looking forward to seeing what you dream up. It is all very exciting.

There is an app, or a concept, called The vOICe. It scans an image and presents an audio version, using pitch to denote height and volume to denote brightness, sweeping across the image in stereo from left to right. It's a really interesting idea; the developer has just never had a mainstream platform for it, instead having to suggest clunky old Android glasses. With this, and with very little setup for the user, I imagine it will find far more traction. It's an interesting solution; with enough training, it activates the visual cortex. It basically rewires the brain through exposure.

More info here if anyone is interested.
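
For anyone curious how that mapping works, here is a toy sketch of the core idea, not the actual vOICe code: each image column becomes a moment in time sweeping left to right, each row a sine tone whose pitch encodes height, with pixel brightness controlling loudness.

```python
# Toy sketch of a vOICe-style image sonification (not the real implementation):
# columns map to time (left-to-right stereo sweep), rows to pitch (higher row
# = higher pitch), and pixel brightness to loudness. Requires numpy.

import wave
import numpy as np

def sonify(image: np.ndarray, rate: int = 22050, col_dur: float = 0.05) -> np.ndarray:
    rows, cols = image.shape
    freqs = np.geomspace(4000.0, 200.0, rows)  # top rows get the highest pitch
    n = int(rate * col_dur)
    t = np.arange(n) / rate
    out = []
    for c in range(cols):
        # Mix one sine per row, weighted by that pixel's brightness (0..1).
        col = sum(image[r, c] * np.sin(2 * np.pi * freqs[r] * t) for r in range(rows))
        pan = c / max(cols - 1, 1)  # 0 = fully left, 1 = fully right
        out.append(np.column_stack([(1 - pan) * col, pan * col]))
    audio = np.concatenate(out)
    return audio / (np.abs(audio).max() or 1.0)  # normalise to -1..1

if __name__ == "__main__":
    # A bright diagonal line on a dark background, as a tiny test image:
    # you should hear a falling pitch sweeping from left ear to right.
    img = np.eye(16)
    samples = (sonify(img) * 32767).astype(np.int16)
    with wave.open("sweep.wav", "wb") as w:
        w.setnchannels(2)
        w.setsampwidth(2)
        w.setframerate(22050)
        w.writeframes(samples.tobytes())
```

The real app is far more sophisticated, but even this crude version conveys why a steady camera position (i.e. glasses) matters so much for it.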

Exciting times.

What do people think of the Vanguards by Oakley? I'm thinking they will be my device of choice, but I realise they are a bit more expensive, and I do worry I will look like a twerp in them... More of a twerp.

By Brian on Thursday, October 2, 2025 - 05:25

I am quite invested in my Meta smart glasses. I use them every day for all sorts of tasks, which in turn gives me greater independence. Having said that, I really hope Apple does not put any silly limitations on Meta's development toolkit. I'm really looking forward to seeing what developers come up with.

By Ash Rein on Thursday, October 2, 2025 - 08:59

I can see that at some point this will have its own cellular connection. Maybe not this iteration, but eventually. Apple is also fast-tracking their smart glasses to compete directly with them. Apparently they're making one that will be dependent on your mobile phone, while the other will be its own device. I think smart glasses will eventually be our cell phones and our main device, really.

By Brian on Thursday, October 2, 2025 - 09:32

I can see that happening somewhere down the line: smart glasses replacing smartphones as a daily driver, with their own cellular connections, Bluetooth chips, etc. Meta will absolutely need to update the touch panel on the right arm, though; currently it is extremely limited. Other smart-glasses designers will likely have to incorporate something similar. I mean, voice is nice, and gestures made with our hands in front of the camera could work, but probably not very efficiently with today's technology.
Still, the potential is definitely there...

By mr grieves on Thursday, October 2, 2025 - 09:37

I wonder if they see the Neural Band as a potential replacement for a touch screen. I am pretty sure Meta are at least plotting how they might escape Apple and Google and go it alone. A stand-alone device is definitely in the future somewhere.

By Brian on Thursday, October 2, 2025 - 09:45

Good morning Mr. Grieves!
You know, I forgot all about the Neural Band. I suppose that, paired with the smart glasses, could definitely be a game changer for touch interactivity. 🙂

By SeasonKing on Friday, October 3, 2025 - 05:45

Having a wristband as an input device for my iPhone could be game-changing. Imagine operating your iPhone while it's still resting in your pocket, making gestures with the wristband while VoiceOver speaks in your ears through AirPods. It's probably never going to happen from Meta's side, but I am sure other brands, or perhaps even Apple, might realise the potential of such an innovative input method. Perhaps they might even integrate the tech into the Apple Watch.

By mr grieves on Friday, October 3, 2025 - 07:44

They do seem to want to open the Neural Band up to a lot of other devices. I was listening to an interview with someone from Meta who was saying that it seemed silly to reach for the remote control when you're watching TV and already have a full controller on your wrist. It sounded like they were thinking about it as a possible universal controller for different appliances. But controlling the phone feels like a natural extension to their existing product line - i.e. if you have the non-display Meta glasses on, it might allow you to control more than just taking a photo or playing and pausing music, particularly with the new toolkit. I don't really want to have to reach into my pocket and open up Seeing AI so I can use it on the glasses - really I want to be able to do that hands-free.