What enhancements or features would you like to see integrated into iOS 19, macOS, iPadOS, and watchOS?
By Dennis Long, 1 February, 2025
Forum
Other Apple Chat
I figured we would do this to have a little fun. So: what enhancements or features would you like to see integrated into iOS 19, macOS, iPadOS, and watchOS?
This includes bug fixes too, so go for it.
I would like VoiceOver to make it easy to change voices, and for Apple to address the bugs that keep happening. Fix Siri, and make the next iOS smooth and fast. We do not need new features; just clean up the bugs that most people have. For me, I want VO to stop mentioning the time whenever I get a notification. Is that too much to ask? Some people do not have this bug, just me.
I would love for there to be more customization brought to iOS 19. If this thread is meant to be a fun "what if" conversation, then consider the following:
1. A simple way of importing ringtones to iOS/iPad OS.
2. More custom gestures. There are so many more gestures that could be implemented into the OS, and thus made customizable. We have a 2-finger double-tap and hold, and a single-finger double-tap and hold. Why not a 3-finger version? A 4-finger version? How about:
2A. A 3-finger rotor-like gesture?
2B. How about Dock gestures? The Dock is seriously underutilized here.
2C. More status bar gestures***.
3. The ability to dictate a text or email while in the middle of a phone call. Android can do this, I want my iPhone to do this too. Grrr.
4. A smart Voice Command rotor option. Borrowing from Android again, sorry not sorry, but the ability to activate Voice Command for a single action, muting VoiceOver temporarily just long enough to complete the Voice Command function. I would kill for this ...
5. Mail folder empty functionality. This actually used to be in iOS way back when, but at some point it was removed. Personally, I use Outlook for iOS these days for this, but for those of you sticking with iOS Mail, I would love to see you all get the ability to simply swipe up/down on a Mail folder and then tap on "Empty". :)
6. Automagic capitalization of the App Library. 'Nuff said ...
That is all for now. Feel free to shoot down my ideas. :D
*** A note on the status bar. There are already 2 VO gestures I am aware of: a single-finger double-tap and a single-finger double-tap and hold. The former will automagically scroll VO focus to the top of any page you are on, and the latter will open Notification Center upon release. :)
Bring VoiceOver on the Mac into parity with VO on iOS. Simplify some features and the overly complex keyboard shortcuts, and get rid of the innumerable bugs. Right now VO on iOS gets all the love, or as much love as Apple gives VoiceOver nowadays. VoiceOver on the Mac is the abandoned child.
You can do this on the iPhone, but I would like to split the audio on macOS so that VoiceOver comes through one AirPod while the computer's other sounds (a phone call, a meeting, etc.) come through the other side.
I'd like us to have more VoiceOver gestures to play around with. It would make things more customizable, and could also let apps be better controlled with VoiceOver.
This is just the start.
1. The ability to double-tap and hold on buttons using a Braille display. Where is this useful? In apps like WhatsApp and Telegram, where it says to double-tap and hold to record a voice message.
If you use a Bluetooth keyboard you have this ability. If you use the touch screen you can lock the button down.
Note: this should also apply to BSI.
2. An option to speak position in list.
3. An option for a repeating caller ID.
4. Something similar to the Guided Frame feature found on Pixel devices.
I have other ideas, but I don't want to just put mine up here.
I know this conversation got a bit out of hand last time, but I've been thinking about the suggestion (from someone whose name I can't remember, sorry) about an option to set VoiceOver to activate on finger lift, like in touch typing, but on the Home Screen and so on. I know, and it was well voiced, that this would have to be carefully done. But as an option for interacting with one's phone, certainly not the default, it might be interesting. It'd almost be like direct touch functionality in the way you can set it for certain apps, but using the touch typing release gesture.
Mac: It would be very nice if VoiceOver didn't choke on novel-length manuscripts in text editors. I can't imagine this will ever be fixed, though. Being blind, being a novelist dealing with such expansive texts, and using a Mac seems a very niche user base.
iPad: Finally, on iPadOS, I'd really like to see some interesting uses of the Apple Pencil and Apple Pencil Pro, tying gestures into VoiceOver: using it as a magic wand to control our devices, read, stop reading, stop and start music, invoke Siri, etc. I'd also love to be able to use it to write, but maybe I'm a bit optimistic about the quality of my handwriting and the power of Apple silicon to make it out. I'm a fast typist and could do with a way of forcing my mind to slow down during composition, where words are carefully chosen instead of rushing out as if from a firehose.
Bonus: The ability to spell check in Safari in these posts, which is still broken. This is really the only place I post, so I'm not sure if the lack of spelling correction is specific to AppleVis's edit boxes or is true of other edit boxes across the web.
Agreed about ringtones. The current way of going through iTunes (especially on Windows, since iTunes is only half-usable there) is annoying.
What I would also like to see is a way of backing up an iPhone without having to use iTunes that doesn't force you to buy iCloud storage. I know that'll never happen for monetary reasons, though.
And what do you mean about the dock? There's not much you can do with that beyond what is possible now.
What I would also like to see is an easier way to organize your Home Screens. I would love to create folders and stuff, but that's just annoying to do with VO.
It would also be great if audio ducking worked again (it doesn't for me, at least).
Well, if we’re just talking new features we would like to see…
I want Apple Intelligence to be used for integrated image and maybe video description. Maybe somehow put that in with the Explore Image item on the rotor, so you could get a good, detailed image description without having to open another app like BMAI.
A native way to scan documents with VO so you don’t need something like Seeing AI. I know you can scan documents into the notes app, but I’ve never really got that to work well. Maybe just have better VO guidance for that.
Being able to use your phone's camera to identify traffic lights and the lines in crosswalks. Yeah, I know, wishful thinking lol. I never used the Oko app, but it sounded cool. Hmmm… maybe Apple should just buy them.
@Dennis Long I think your first one already works.
With a Braille display you can do a 3-5-6 chord or a 3-6-7-8 chord, both of which do a long press, the same thing as a double-tap and hold or a one-finger triple tap. If you're using BSI, you can switch to BSC mode and do dots 3-5-6.
I haven't used WhatsApp in ages, but I tried all of these Braille commands in iMessage, and they let me hold down the record button for audio messages.
As a long-time iPhone user, I am eagerly waiting for automatic language detection, mainly for South Indian languages, which is still missing. I'd be excited about this fix.
I would like a built-in clipboard manager: something that allows the user to copy at least 10 to 20 things and reference them as needed. The Copy Speech to Clipboard VoiceOver command is very useful. I wouldn't mind an added rotor feature that allows us to go back and paste previously copied things as needed.
I know this is the boring answer and not entirely in keeping with the intention behind this post, but the best thing Apple could possibly do with the Mac is to put all new features on hold and just fix the bugs.
For example, I'd like to be able to navigate by heading and actually hear the text of the heading rather than a count of the children. (As per Ventura)
I'd like to be able to press up and down cursor keys and be confident that they are never going to suddenly start working as left and right keys.
I'd like the focus in Safari to be rock solid so that I don't have to go round and round a page to get to what I want.
I'd like Apple to take backwards compatibility a little more seriously, so that when they release a new macOS it doesn't suddenly unleash a torrent of bugs in third-party apps like Chrome and PyCharm.
I'd like Terminal to speak the lines of output from a command predictably. I'd like it to work properly when I press the up and down arrows. And it would be great if it could reliably speak the command I am currently inputting.
It would be nice to see some improvements to activities. For example, manually switching an activity should work as per Ventura. Custom punctuation sets should work when I switch activity. Apple should actually tell us what the weird new context switching thing is actually for and how devs can make use of it.
I guess I just want a consistent and predictable experience. This also comes down to the seemingly random way we must interact with things. Sometimes it's arrows and tabs, sometimes it's vo left+right, sometimes it's vo+up/down, sometimes we need vo+space, sometimes that weird action menu thing, sometimes context menus. It's not so much difficult as it just takes more brain power than is needed, and usually if you use the wrong key combination the focus is going to shoot off somewhere crazy and you need to start again. Maybe it's just me.
This also extends to stupid little things like just giving the web page focus. Normally jump to heading gets me in there, but sometimes VO just can't find the heading and I need to faff about trying to get it to recognise the web page. Or it gets stuck in the page of a tab that doesn't even have focus.
I'd like to be able to dedicate as much of my attention to what I am trying to achieve and much less to how I need to achieve it.
I'm a little unconvinced that the iPhone is so much better than the Mac. I think this perception exists for a few reasons.
Firstly, at least in my case, the iPhone is my leisure device. So I'm not doing anything too serious with it. I'm not coding, I'm not using terminal, I'm not using many complex web apps.
Then I think navigation by touch provides a good fallback for getting to something that might otherwise not be accessible.
But I also think that the form factor of the phone means that apps need to be simpler. Heading or tab navigation is much more straightforward than the Mac's horrible hierarchies of collections and lists. I hate using apps on the Mac like the app store, although to be fair that's not great on the phone either.
But things like text editing on the phone are also a pretty bad experience. Probably because messages are shorter, it's less of a problem there. But the phone hallucinates text just as badly as the Mac, if not worse.
Don't get me wrong, I'm mostly happy with the phone but that's more the nature of what I do with it rather than how great VO is.
Having said all that, there probably are a few features that would be good.
Something like VOCR built in, for example. The only danger of this is that it might give devs an "out" from creating accessible software. But there are apps where I am just shut out entirely, and it would be great if the Mac had some way to help me there.
I'd also like a better way to get image descriptions. I can press VO+Shift+L and this is sometimes really helpful at giving me the OCR of an image, but pretty useless for anything else. And it only works in some apps. I'd like to be able to feed the images to AI more easily and ask questions, ideally with the AI running locally. I also wish that I could be able to rely on the answers that AI gives. For example, if I feed it a graph and I have maybe 70% confidence that it is right, then I'm not going to use it, I'm going to ask another human. Maybe that's asking a bit much right now. Maybe a way to automatically produce audio graphs would be good, but I've never quite got my head around them.
This is a small thing and maybe I can do this already, but I'd like the option when I choose which output VO should come from if I could say "headphones, or Mac speakers, but never air play speakers". I definitely don't want VoiceOver coming out of my Sonos speakers, ever.
I'm also a little jealous that the US Siri voices have some kind of speed reading option available in VO but the UK voices don't. I would like to be able to use the Siri voices but I can't bear the way they speak single characters, so maybe that would fix it. I'm still waiting for the perfect VoiceOver voice as they all have their quirks.
As for iOS, I think better image descriptions would be good so that I didn't always have to pass them onto a third party service every time. What's there is better than nothing but not always that helpful.
That's about it. As I've said on another thread, I think Apple needs to be careful about adding too much more customisation unless they can figure out a way to properly test VoiceOver.
I would also like the ability to optionally turn off confirmation prompts. For example, if I'm deleting a text thread, I don't wanna have to confirm it every single time. This option already exists in Mail settings; it would be great if it could be added to Messages settings. I would also like the ability to turn off the prompts for allowing notifications or location services. For me, the answer is always going to be Allow. I would like some kind of toggle for Always Allow so that I don't get hit with these prompts every single time.
I agree it would have to be carefully implemented. Why do I say don't make it the default? How many blogs, articles, etc. have others written on how to do things? If this new way were the default and people came across articles describing the current way of doing things, it would confuse the crap out of them. :) That being said, so long as it is an option the user can choose, there is nothing wrong with it. Now, on to your other comment: no, Safari is not the only place spell check is broken. Using a Braille display, turn off predictive text and auto-correct under keyboard settings, then go to Mail and type using your Braille display, then press space. VoiceOver should say "AppleVis, misspelled"; it won't do that in Mail.
The only feature I personally need at the moment would be plenty of bug fixes on all platforms. macOS and iPadOS have been extremely buggy since the iPadOS 18 / macOS 15 release.
Once the bugs that are long-standing and heavily impact the user experience are fixed, we can discuss new features, in my opinion.
But this will be a dream forever, I think.
You're right, Dennis. Mail has been such a messy interaction for so long that I completely forgot its issues. It's weird how we get used to these things. It's like a car, for those who drive of course, that requires a little coaxing to get going. I think there are probably lots of these examples for us long-time Mac users, where we know how to finesse things into working and it has become second nature.
I would really like to see the Gujarati Indian language added to VoiceOver.
I am receiving a lot of things in Gujarati at the moment.
You can read it with the eSpeak TTS, but it would be good if it could be added as a VoiceOver language.
With all the awesome features that Apple products already have, it'd be cool indeed for macOS to get something like VoiceOver Recognition, with all its features. For instance, there is a website I'm currently fighting with, lol. Well, actually, that's only partially true. It is a poetry-related website, and although I've been able to read some poems and other things on there, there appear to be several poorly labeled or unlabeled elements. I can certainly try the site on my phone, but I'm honestly faster on an actual tactile keyboard, and my eReader is not yet paired with my current iPhone. Just for the record, the website to which I'm referring is http://www.allpoetry.com; not only for my neighbor's poetry, but there are others on the site with whom I'd like to interact. I actually published a poem on there several years ago, and I think the site must have been overhauled, because it was better back then. But that's probably not the only place where inaccessibility factors in.
I tried that site on my phone and clicked on a poem. Even with Screen Recognition turned on, it appeared to make no difference. There appear to be three links at the end of the poem; if they were labeled correctly, this would not be an issue. If you would like, I can contact them and ask them to fix any issues you find.
I would like to see public betas or releases that are focused on accessibility and VoiceOver issues.
Also, I would like to see more frequent mini updates that address accessibility and VoiceOver issues.
Currently, releases have new features and/or fix problems that are not specifically related to accessibility and VoiceOver.
The problem with addressing accessibility issues as part of new version releases is that there isn't enough focus on the accessibility issues. I think it would be a good idea for Apple to separate the cycle for general releases and fixes for accessibility so that the accessibility issues get more attention. Sometimes accessibility issues seem to get lost in the shuffle. We shouldn't have to wait for these accessibility issues to get addressed and fixed.
I think that is one of the strengths of Windows. The screen readers are separate from the rest of the OS, and the developers are selling a separate product, so screen reader developers have a real incentive to get it right or they won't sell anything! If Apple doesn't address accessibility issues, they will still sell lots of devices!
My second thought on accessibility: it is an afterthought. "OK, the next iOS is ready for release. What next? Did we get everything? Oh, we forgot the VO bugs. We'll deal with them next time."
Some nice ideas posted already. What I'd like is:
1. Better native instant image descriptions.
2. An easier and quicker way to get AI image descriptions, just like Google have done with TalkBack on Android, and FS have done with Picture Smart on Jaws. I don't mind if these are Apple Intelligence, or utilise Be My AI or what, but please make it less of a rigmarole.
3. More one-finger gestures on iOS, such as the L-shape and back/forth gestures in TalkBack on Android. I know I've been banging this drum for a long time, but I really want it to be easier to use my phone one-handed, and to have more gestures available to assign actions to. They don't have to be compulsory if some people don't like them, just available to those who do.
4. Improve keyboard access and text editing on iOS, and text editing on Mac.
5. When I press Play on audio or a video, pause VoiceOver speech automatically, so I don't have it speaking over the start of the audio / video.
6. I like the clipboard manager idea above.
7. On Mac, literally scrap VoiceOver and start again from scratch, i.e. build a brand new screen reader for Mac that is fit for purpose. Or really and truly fix the bugs, inconsistencies and usability issues, whatever approach will give best results.
8. Provide proper APIs allowing third parties to build screen readers, particularly for the Mac.
I am not defending Apple. I am not a fan of Capitalism.
However, I will say that accessibility is not an afterthought. It is a major component of all devices. We saw this with the Vision Pro. This is a device where we wouldn't have even cared if VoiceOver wasn't implemented; how many blind people really bought that thing? And yet, it is a major part of visionOS.
There are issues. Yes. And those issues are pretty frustrating. And there needs to be a willingness to recognize that fixing bugs isn't as easy as finding them and fixing them. These platforms have literally hundreds of thousands of lines of code. Not only is it difficult to find the problem, fixing one thing can often break 6 other things.
Sometimes these bugs stay put because they are the lesser problem. For example, fixing the double-spacing no-period announcement bug (13 years old) could cause a focus issue, make the status bar inaccessible, make the Mail app crash, and cause VoiceOver to start speaking letters as numbers. And this is not an exaggeration.
These things take time, they take patience, and they take persistence. I would say be persistent. Keep pushing for improvements.
Please don't just come on this forum and leave post after post about how accessibility stinks, accessibility is an afterthought, blah blah.
At this point, I am honestly very curious about HarmonyOS and its accessibility. Chinese phones and devices seem pretty interesting; NearLink, for example, is amazing. However, the Apple engineers work hard, and they are definitely trying. They are making improvements and changes and keeping up with evolving technology. They deserve some consideration.
I will agree with you that accessibility is not an afterthought at Apple. It is part of the product design of all of their products. Want proof? Look at the Vision Pro: it was in there from day one.
Perhaps it isn't an afterthought when a product is released, but when it concerns bugs related to VO, it is. How long did it take to address bugs regarding Braille? From what I keep reading here, there are bugs that persist longer than any bug affecting sighted people would be tolerated. That is an afterthought. Recall there was a bug affecting VO and phone calls that took a long time to be addressed; even J. Mosen said that if it were a bug affecting sighted people, it would have been fixed. I believe it had to do with VO stopping after ending or starting a call.
With regards to my last post, I would also love to see an update to the handwriting feature for VO. Instead of having a handwriting rotor option, I think it would be better if Apple made it an additional keyboard, like how we have our regular keyboard plus our Emoji keyboard. Wouldn't it be great if there was a third option: a handwriting keyboard? It would be really useful for those times when you need to type in sensitive information, considering the amount of privacy and security one would have drawing out their sensitive information with Screen Curtain enabled. I realize you can already do this with the handwriting rotor, but that particular rotor is all kinds of broken.
I also agree with both Dave and Ash with regards to additional one-handed gestures, and the persistence of development on accessibility, especially where VoiceOver is concerned.
I would like the process for opening unsigned apps on the Mac to be a lot easier than it currently is. When "allow apps to be downloaded from anywhere" is selected under Privacy & Security, there is no "Open Anyway" button when you scroll to the bottom of the screen, as there was in previous versions of macOS, and I, for one, find this frustrating.
What would you like to see Apple add in the watchOS 12 betas this year? I would like to see the ability to customize your VoiceOver rotor in watchOS 12. I would also like to see the ability to copy and paste text, similar to other technologies.
Accessibility features may not be an afterthought, but maintenance does seem to be.
This isn't restricted to accessibility; "new" in the tech world seems to be king. Being able to say a device has accessibility from the word go is very good for marketing, though. In the case of the Vision Pro, I question the point, and whether resources could have been better used to fix the issues which have been listed here. Of course, if we hadn't had accessibility on a product made by Apple, irrespective of the price and the fact that it has the word "vision" in the title, we'd have been disappointed, myself included. It just seems a lot of work for a tiny number of blind and partially sighted users when far more users could benefit from VO 2.0.
The issue we face is Apple continuing to create solutions we don't want. Magnifier with door detection is nice, and the OCR is okay, but it seemed to have more value for promotional materials than for end users. We have Seeing AI, which does a far better job of OCR.
But, again, this isn't accessibility-specific. Apple makes all sorts of nonsense; remember the Clips app? Certain tools or apps seem to be made because they can be made, not because they should be.
Accessibility isn't an afterthought; it's a marketing strategy. Once it has served its purpose, to be "new", it drops down the priority list, more so, I think, than features the sighted use. Which makes sense: there are more of them.
Apple needs to steal from JAWS their new idea of letting blind people suggest features to add to the screen reader; I believe Picture Smart came from that. Instead of giving us what Apple thinks we need, they should ask, perhaps with a suggestion contest and prizes. The best ideas can be used.
I think many of us, myself included, have developed a level of trust that if Apple release a new product, then we will be able to use it without sight. Although the Vision Pro seems not really to be useful to anyone, I think it would set a very awkward precedent if they had decided on our behalf that this wasn't something for us.
I do agree that otherwise they seem more interested in headline new features that show how much they care about accessibility than making the existing ones work well. I think it is easier to build a new feature than it is to try to figure out some of these problems that have been around for a while and are likely both very complicated and often very specific.
What Apple need to do is invest in a really comprehensive and well thought out automated testing solution so that would enable them to make changes to VoiceOver's existing behaviour without breaking things. And if VoiceOver in its current state cannot be made testable, then it should be consigned to history and something new should be built with that in mind.
The problem with Apple is that they are a very arrogant company. Because of their market share they can just do whatever they want and people will just fall in line behind them. This has some benefits, I guess, because it is more likely that they would come up with something out of left-field that no one had ever considered before. But it does mean they are more interested in self promotion than fulfilling the needs of their actual customers.
I can imagine that VoiceOver on the Mac is an incredibly intimidating and complicated beast. It seems that whenever they change anything they break something else because there is no control over it. And from Apple's point of view, it's up to us to adapt to the new unwanted behaviour because they know best. But it's probably a lot simpler for them to just mess with the UI and introduce a new keyboard customisation thing that no one asked for rather than fixing the things that get sent to their inbox on a daily basis. But these are the things that actually matter. Everything else is just sugar coating.
Adding new features and customisation options will not make the overall experience better. It will make it more complicated and harder to test, as well as taking resources away from the things that matter.
I agree about visionOS: nobody asked them to do it, but they still did. This is something we have to recognize. The fact that any blind person can go to an Apple store, buy whatever device, and be independent, or close to it, for first setup and general usage is incredibly valuable. If you think accessibility is an afterthought on macOS (and god, can we agree that VO definitely has some long-standing problems there), you haven't tried something where accessibility really is an afterthought. Spend five minutes with Wayland and GNOME/KDE on any Linux distro with a GUI and you will see what a real afterthought looks like (the easy installers of Asahi Linux, for example; I mean, you can technically create your own image with some other distro, but that's not in my skill set currently). When in some TTY installs of a Linux distro you don't even have speech, and the only way is to force a screen reader into the image, or something like adding speech to the bootloader, that is indeed an afterthought, as you have to do things yourself. Not that I am attacking that software; it's just that accessibility is way easier when you have actual money to put in, which the open source community doesn't have unless a dev is aware of it or has a personal interest. For the record, the Windows bootloader, or whatever the first thing that loads is, is still not accessible, because accessibility is just not there. There have been some requests to add to NVDA the same trackpad-ish thing that VoiceOver has on macOS, but it's not possible because there are too many different touchpad drivers and so on. So no, accessibility is not an afterthought.
For Apple, even sighted non apple fanboys are starting to rant on reddit and other forums about how bugs are everywhere. I don't mean that it minimizes our situation, not at all, but... The tech industry is changing and capitalism has its flaws.
Sorry to be off topic. Guys, as a rule of thumb, before posting anything here as a bug report or suggestion, please get in the habit of sending it in a cleaner form to Apple themselves first. I am not saying that bugs in macOS 15.3 should be normalized, but a lot of things have been corrected behind the scenes since the first developer beta of Sequoia, to the point that I kept my Mac. Generally speaking, for a Windows user, the iWork suite has become so much more usable now. Okay, Terminal sucks, but if I have to use GUI text editors and can still use Vim on my Arch Linux Stormux Raspberry Pi 400 install, then I will, and I am doing it! :) I can still code, and most things in Terminal are still perfectly usable, so as long as I am still productive…
My request: make iPhone mirroring just AirPlay VoiceOver's TTS and other sounds to the Mac and send keyboard events back to the phone. That would be so much easier to implement than the incredibly overengineered approach they have now. Nobody wants to report a bug that only happens on several screens of Instagram version X on iOS version X and macOS version X, where VO is unable to read anything on the screen while VO+Shift+L clearly shows the content is there. There is just too much room for things to go wrong, and things are indeed going very wrong: this feature is still practically useless for us, as we have a buggy Mac VO interacting with so many layers of iOS. We just want iPhone VoiceOver to be controlled with Mac hardware, and that's it.
Please add native screen recognition to VoiceOver on the Mac.
Make text selectable and easy to navigate in Preview for PDFs.
Correct the nested list problem in PDFs.
Make selection actually work on the web when macOS standard commands don't work alongside VO ones (it's completely broken for me on all browsers, with every VO setting combination I've tried according to the selection page in the Mac VoiceOver user guide).
Make it easy to actually skim through text with a proper scrolling mechanism, like sighted people do.
Create a GarageBand VoiceOver user guide for the Mac, since one already exists for iOS.
Keep up the work on Freeform accessibility; there are some bugs, but it's overall impressive, as the app is 101% visual.
We really want to use the trackpad properly, like sighted users would. Create a way so that apps in the background don't come to the foreground when the pointer is over them in non-fullscreen. Create a possible third trackpad mode where we have greater control and can hit the sweet spot between VoiceOver and the sighted way of interacting with the trackpad; as macOS is so point-and-click, this would be a game changer.
Create a real scripting language for VO so third-party app devs can implement per-app accessibility; this is seriously the worst side of Mac accessibility. There is so much you can do with AppleScript, which isn't even updated anymore, and Catalyst will be the death of it anyway.
Keep up with the latest PWA ways of doing things and web standards, and add HTML tab navigation to the rotor across the ecosystem.
Create a proper channel for accessibility bug reporting and tracking, as so much is lost in the crowd. I reported roughly 30-40 bugs; about 10 were corrected and 2-3 officially acknowledged. This is wrong.
Release accessibility-specific updates so we don't have to wait months or years for a critical accessibility bug to be solved.
To the users. I feel that applevis is a great community but the community aspect has taken over the bug report aspect. Just an opinion. Maybe like for guides, the editorial team can come up with bug reporters role where the team validate each post before posting so it's a bit more official than forum posts? And create a video / audio section in podcast for bug demos... Food for your thoughts.
Apple did not create VO out of the goodness of their heart. Apple PCs were part of school and government purchasing, which falls under the accessibility law, Section 501; I believe it was 501, someone please correct me. Same for Microsoft. Before someone jumps on me, I have had an iPhone since the 4. I now have an iPad 9 and a Watch 9.
I agree with TheBllindGuy07: we can get any Apple product and know it is accessible. Do Apple products have bugs? Every platform has bugs; no software is bug-free. As for building a new screen reader, that is easier said than done.
I would love to see better AI integration in iOS 19. Having the ability to rewrite an email is great, but what about the ability to tell me about my emails through a voice conversation? Have you ever watched the movie Her? Parts of that movie were just weird and kind of creepy, but the way the AI handled his email was way cool. He could respond to emails by voice while walking down the sidewalk, and the AI even let him know when an important email (in this case, from his divorce lawyer) came in and needed his response. I so want an iOS AI to work like that. It's great to have conversations with ChatGPT, but that AI can't do anything on my phone, like send an email or summarize a document, unless I upload the document to it. I want everything to just be integrated.
Just some thoughts on a Friday afternoon.
Jim
I don't believe that accessibility is an afterthought at Apple. I think they are truly committed to making their products as usable as possible by as many people as possible.
What I do think is missing is a more direct interaction with the people who do rely on accessibility. As someone mentioned, developers such as those who made JAWS, NVDA, Zoom Text, etc. are often interacting in more direct ways with their users such as webinars, both private and public beta testing, and soliciting feedback and suggestions that get directly fed back to the team that does the development for these specialty products. Thus the interactions, feedback, and fixes don't get swamped by being part of a much larger organization whose focus is on serving the wider community of users.
For example, if the accessibility team develops a fix for some accessibility issue, the users have to wait until the organization as a whole wants to push out an update to the wider community of users. Thus the accessibility fixes can lag behind.
Note how many smaller developers you see interacting directly with users on AppleVis to solicit feedback, explain problems, etc. and how often this feedback and interaction leads to those products fixing accessibility issues and implementing new capabilities suggested by users in a timely fashion. That just doesn't seem to happen as readily with big companies.
Of course, I don't envy Apple's job, since visually impaired users are only one of several groups of Apple users with accessibility concerns. Perhaps interacting more directly with each specific group would require too many resources that could instead be used to actually develop solutions and fix problems. I guess it is always a trade-off. There are always pushes and pulls in large organizations, and it is sometimes hard for the tail to wag the dog.
Perfectly unrelated, but you guys just showed me that my username was misspelled. Better late than never!
A very important one across the ecosystem: don't create VO fragmentation. There are literally more features on the iOS side. That script to reconnect a braille display in iOS 18; we got the Nemeth braille code for equations on macOS ages after iOS; iOS has more sound options than macOS VO; (some) Siri voices have extra settings on iOS that are not replicated on macOS; etc., etc.
Comments
VO
I would like VO to make it easy to change voices. Address the bugs that continue to happen. Fix Siri and make the next iOS smooth and fast. We do not need new features; just clean up the bugs that most people have. For me, I want VO to no longer mention the time whenever I get a notification. Is that too much to ask? Some do not have this bug; maybe it's just me.
More customization
I would love for there to be more customization brought to iOS 19. If this thread is meant to be a fun "what if" conversation, then consider the following:
1. A simple way of importing ringtones to iOS/iPad OS.
2. More custom gestures. There are sooo many more gestures that could be implemented in the OS and thus made customizable. We have a two-finger double-tap and hold, and a single-finger double-tap and hold. Why not a three-finger version? A four-finger version? How about:
2A. A 3 finger rotor-like gesture?
2B. How about Dock gestures? The Dock is seriously underutilized here.
2C. More StatusBar gestures***.
3. The ability to dictate a text or email while in the middle of a phone call. Android can do this, I want my iPhone to do this too. Grrr.
4. A smart Voice Command rotor option. Borrowing from Android again, sorry not sorry, but the ability to activate Voice Command for a single action, muting VoiceOver temporarily just long enough to complete the Voice Command function. I would kill for this ...
5. Mail folder empty functionality. This actually used to be in iOS way back when, but at some point was removed. Personally I use Outlook for iOS these days for this, but those of you sticking with iOS Mail; I would love to see you all get the ability to simply swipe up/down on a Mail folder and then be able to simply tap on, "Empty". :)
6. Automagic capitalization of the App Library. 'Nuff said ...
That is all for now. Feel free to shoot down my ideas. :D
*** A note on the status bar. There are already two gestures I am aware of with VO: a single-finger double-tap, and a single-finger double-tap and hold. The former automagically scrolls VO focus to the top of whatever page you are on, and the latter opens Notification Center upon release. :)
Nothing
I want no more bugs, I do not need features, I want bugs squashed.
VO overhaul on MacOs
Bring VoiceOver on the Mac into parity with the VO experience on iOS. Simplify some features and the overly complex keyboard shortcuts, and get rid of the innumerable bugs. Right now VO on iOS gets all the love, or as much love as Apple gives VoiceOver nowadays. VoiceOver on the Mac is the abandoned child.
Split audio on macOS
You can do this on the iPhone, but I would like to split the audio on macOS so that VoiceOver comes through one AirPod while the computer sounds (a phone call, a meeting, etc.) come through the other side.
More gestures
I'd like us to have more gestures to play around with in VoiceOver. It would make things more customizable and could also let apps be better controlled with VoiceOver.
Improve the terminal…
Improve the Terminal experience with VO on the Mac so it works as well as it should across hardware, including the MacBook Pro and Air lines.
Things I would like to see.
This is just the start.
1. The ability, using a braille display, to double-tap and hold on buttons. Where is this useful? In apps like WhatsApp and Telegram, where it says to double-tap and hold to record a voice message.
If you use a Bluetooth keyboard you have this ability, and if you use the touch screen you can lock the button down.
Note: this should also apply to BSI.
2. An option to speak your position in a list.
3. An option for a repeating caller ID.
4. Something similar to the Guided Frame feature found on Pixel devices.
I have other ideas, but I don't want to only put mine up here.
iOS: Lift to tap... I know…
iOS: Lift to tap...
I know this conversation got a bit out of hand last time, but I've been thinking about the suggestion (I can't remember from whom, sorry) about an option to set VoiceOver to activate on finger lift, like touch typing, but on the home screen and elsewhere. I know, and it was well voiced, that this would have to be carefully done; but as an optional way to interact with one's phone, certainly not the default, it might be interesting. It'd almost be like direct touch functionality, in the way you can set that for certain apps, but using the touch-typing release gesture.
Mac: regarding the Mac, it would be very nice if VoiceOver didn't choke on novel-length manuscripts in text editors. I can't imagine this will ever be fixed, though. Being blind, being a novelist or someone dealing with such expansive texts, and using a Mac seems a very niche user base.
iPad: finally, on iPadOS, I'd really like to see some interesting uses of the Apple Pencil and Apple Pencil Pro: tying gestures into VoiceOver, using it as a magic wand to control our devices, to read, stop reading, stop and start music, invoke Siri, etc. I'd also love to be able to use it to write, but maybe I'm a bit optimistic about the quality of my handwriting and the power of Apple silicon to make it out. I'm a fast typist and could do with a way of forcing my mind to slow down during composition, where words are carefully chosen instead of rushing out as if from a firehose.
Bonus: the ability to spell-check in Safari in these posts, which is still broken. This is really the only place I post, so I'm not sure if the lack of spelling correction is specific to AppleVis's edit boxes or is true of other edit boxes across the web.
@Brian
Agreed about ringtones. The current way of going through iTunes (especially on Windows, where iTunes is only half-usable) is annoying.
What I would also like to see is a way of backing up an iPhone without having to use iTunes and without being forced to buy iCloud storage. I know that'll never happen, for monetary reasons.
And what do you mean about the Dock? There's not much you can do with it beyond what is possible now.
What I would also like to see is an easier way to organize your home screens. I would love to create folders and such, but that's just annoying to do with VO.
It would also be great if audio ducking worked again (at least, it doesn't for me).
Well, if we’re just talking…
Well, if we’re just talking new features we would like to see…
I want Apple intelligence to be used for integrated image and maybe video description. Maybe somehow put that in with the explore image item on the rotor, so you could get a good, detailed image description without having to open another app like BMAI.
A native way to scan documents with VO so you don’t need something like Seeing AI. I know you can scan documents into the notes app, but I’ve never really got that to work well. Maybe just have better VO guidance for that.
Being able to use your phone’s camera to identify traffic lights and the lines in crosswalks. Yeah, I know, wishful thinking lol. I never used the Oko app, but it sounded cool. Hmmm… maybe Apple should just buy them.
@Dennis Long I think your…
@Dennis Long I think your first one already works.
With a braille display you can do a 3-5-6 chord or a 3-6-7-8 chord, both of which perform a long press, the same as a double-tap and hold or a one-finger triple-tap. If you’re using BSI, you can switch to BSC mode and do dots 3-5-6.
I haven’t used Whatsapp in ages, but I tried all of these Braille commands in iMessage, and they let me hold down the record button for audio messages.
South Indian languages auto detection
As a long-time iPhone user, I am eagerly waiting for automatic language detection, especially for South Indian languages, which is still missing. I'm excited for this to be addressed.
Make audio destination work again.
Maybe if it worked, you wouldn't get the problem of very loud VoiceOver on the phone when connected to a Bluetooth speaker.
Built In Clipboard manager
I would like a built-in clipboard manager: something that lets the user copy at least 10 to 20 items and reference them as needed. The Copy Speech to Clipboard VoiceOver command is very useful. I wouldn’t mind an added rotor feature that lets us go back and paste previously copied items as needed.
Just fix it, please
I know this is the boring answer and not entirely in keeping with the intention behind this post, but the best thing Apple could possibly do with the Mac is to put all new features on hold and just fix the bugs.
For example, I'd like to be able to navigate by heading and actually hear the text of the heading rather than a count of the children. (As per Ventura)
I'd like to be able to press up and down cursor keys and be confident that they are never going to suddenly start working as left and right keys.
I'd like the focus in Safari to be rock solid so that I don't have to go round and round a page to get to what I want.
I'd like Apple to take backwards compatibility a little more seriously, so that when they release a new macOS it doesn't suddenly unleash a torrent of bugs in third-party apps like Chrome and PyCharm.
I'd like terminal to be able to speak the lines of output from a command predictably. I'd like it to be able to work properly when I press up and down arrows. And it would be great if it could speak the command I am currently inputting in a reliable way.
It would be nice to see some improvements to activities. For example, manually switching an activity should work as per Ventura. Custom punctuation sets should work when I switch activity. Apple should actually tell us what the weird new context switching thing is actually for and how devs can make use of it.
I guess I just want a consistent and predictable experience. This also comes down to the seemingly random way we must interact with things. Sometimes it's arrows and Tab, sometimes it's VO+Left/Right, sometimes VO+Up/Down, sometimes we need VO+Space, sometimes that weird Actions menu thing, sometimes context menus. It's not so much difficult as it just takes more brain power than is needed, and usually if you use the wrong key combination the focus shoots off somewhere crazy and you have to start again. Maybe it's just me.
This also extends to stupid little things like just giving the web page focus. Normally jump to heading gets me in there, but sometimes VO just can't find the heading and I need to faff about trying to get it to recognise the web page. Or it gets stuck in the page of a tab that doesn't even have focus.
I'd like to be able to dedicate as much of my attention to what I am trying to achieve and much less to how I need to achieve it.
I'm a little unconvinced that the iPhone is so much better than the Mac. I think this perception exists for a few reasons.
Firstly, at least in my case, the iPhone is my leisure device. So I'm not doing anything too serious with it. I'm not coding, I'm not using terminal, I'm not using many complex web apps.
Then I think navigate by touch provides a good fallback way to getting to something that might otherwise not be accessible.
But I also think that the form factor of the phone means that apps need to be simpler. Heading or tab navigation is much more straightforward than the Mac's horrible hierarchies of collections and lists. I hate using apps on the Mac like the app store, although to be fair that's not great on the phone either.
But things like text editing on the phone are also a pretty bad experience. Probably because messages are shorter, it's less of a problem there. But it hallucinates text just as badly as the Mac, if not worse.
Don't get me wrong, I'm mostly happy with the phone but that's more the nature of what I do with it rather than how great VO is.
Apologies for the rant.
The less serious answer
Having said all that, there probably are a few features that would be good.
Something like VOCR built in, for example. The only danger of this is that it might give devs an "out" from creating accessible software. But there are apps where I am just shut out entirely, and it would be great if the Mac had some way to help me there.
I'd also like a better way to get image descriptions. I can press VO+Shift+L and this is sometimes really helpful at giving me the OCR of an image, but pretty useless for anything else. And it only works in some apps. I'd like to be able to feed the images to AI more easily and ask questions, ideally with the AI running locally. I also wish that I could be able to rely on the answers that AI gives. For example, if I feed it a graph and I have maybe 70% confidence that it is right, then I'm not going to use it, I'm going to ask another human. Maybe that's asking a bit much right now. Maybe a way to automatically produce audio graphs would be good, but I've never quite got my head around them.
This is a small thing, and maybe I can do this already, but when I choose which output VO should come from, I'd like to be able to say "headphones, or Mac speakers, but never AirPlay speakers". I definitely don't want VoiceOver coming out of my Sonos speakers, ever.
I'm also a little jealous that the US Siri voices have some kind of speed reading option available in VO but the UK voices don't. I would like to be able to use the Siri voices but I can't bear the way they speak single characters, so maybe that would fix it. I'm still waiting for the perfect VoiceOver voice as they all have their quirks.
As for iOS, I think better image descriptions would be good so that I didn't always have to pass them onto a third party service every time. What's there is better than nothing but not always that helpful.
That's about it. As I've said on another thread, I think Apple needs to be careful about adding too much more customisation unless they can figure out a way to properly test VoiceOver.
Would also like option to turn off confirmation prompts
I would also like the ability to optionally turn off confirmation prompts. For example, if I’m deleting a text thread, I don’t want to have to confirm it every single time. This option already exists in Mail settings; it would be great if it could be added to Messages settings. I would also like the ability to turn off the prompts for allowing notifications or location services. For me, the answer is always going to be allow. I would like some kind of always-allow toggle so that I don’t get hit with these prompts every single time.
@Brian Giles
Try it in apps such as WhatsApp and others; you will see it doesn't work, sadly.
@ Ash Rein
Excellent ideas. I hope everyone fills out the AppleVis report card every year. I've seen a lot of good ideas here.
@Oliver lift to tap and other suggestions
I agree it would have to be carefully implemented. Why do I say don't just make it the default? Think of how many blogs, articles, etc. others have written on how to do things. If this new way were the default and people came across articles describing the current way of doing things, it would confuse the crap out of them :). That being said, so long as it is an option the user can switch to should they choose, there is nothing wrong with it. Now, on to your other comment: no, Safari is not the only place spell check is broken. If you use a braille display and turn off predictive text and auto-correction under Keyboard settings, then go to Mail, type using your braille display, and press space, the voice should say "AppleVis misspelled"; it won't do that in Mail.
Truly multilingual Siri
A truly multilingual Siri, not only for reading messages, but for everyday use.
Plenty of bug fixes
The only feature I personally need at the moment would be plenty of bug fixes on all platforms. macOS and iPadOS are extremely buggy as of the iPadOS 18 / macOS 15 release.
Once the long-standing bugs that heavily impact the user experience are fixed, we can discuss new features, in my opinion.
But this will be a dream forever, I think.
You're right, Denis. Mail…
You're right, Dennis. Mail has been such a messy interaction for so long that I completely forgot its issues. It's weird how we get used to these things. It's like a car, for those who drive of course, that requires a little coaxing to get going. There are probably lots of these examples for us long-time Mac users, where we know how to finesse it into working so well that it has become second nature.
@Cornettoking
Regarding the Dock, I would just love it if Apple recoded it to have more gesture options, like what they did for the status bar. :)
Gujarati language added.
I would really like to see the Gujarati Indian language added to VoiceOver.
I receive a lot of things in Gujarati at the moment.
You can read it with the eSpeak TTS, but it would be good if it could be added as a VoiceOver language.
Punjabi too then!
Punjabi too then!
Agreed on More Image-Description Features
With all the awesome features that Apple products already have, it would indeed be cool for macOS to get something like VoiceOver Recognition, with all its features. For instance, there is a website which I'm currently fighting with, lol. Well actually, that's only partially true. It is a poetry-related website, and although I've been able to read some poems and other things on there, there appear to be several poorly-labeled or unlabeled elements. I can certainly try out this site on my phone, but I'm honestly faster on an actual tactile keyboard, and my eReader is not yet paired with my current iPhone. Just for the record, the website to which I'm referring is http://www.allpoetry.com. I'm there not only for my neighbor's poetry; there are others on the site with whom I'd like to interact. I actually published a poem on there several years ago, and I think the site must have been overhauled, because it was better back then. But that's probably not the only place where inaccessibility factors in.
Picture smart
It would be nice if Apple copied what FS did with JAWS. With that nice feature there would be no need for Be My Eyes or the others.
@Ekaj
I tried that site on my phone and clicked on a poem. Even turning on Screen Recognition appeared to make no difference. There appear to be three links at the end of the poem; if they were labeled correctly, this would not be an issue. If you would like, I can contact them and ask them to fix any issues you find.
More frequent releases geared to accessibility and Voiceover
I would like to see public betas or releases that are focused on accessibility and Voiceover issues.
Also, I would like to see more frequent mini updates that address accessibility and Voiceover issues.
Currently, releases have new features and/or fix problems that are not specifically related to accessibility and Voiceover.
The problem with addressing accessibility issues as part of new version releases is that there isn't enough focus on the accessibility issues. I think it would be a good idea for Apple to separate the cycle for general releases and fixes for accessibility so that the accessibility issues get more attention. Sometimes accessibility issues seem to get lost in the shuffle. We shouldn't have to wait for these accessibility issues to get addressed and fixed.
I think that is one of the strengths of Windows: the screen readers are separate from the rest of the OS, and the developers are selling a separate product. So the developers of screen readers have a real incentive to get it right or they won't sell anything! If Apple doesn't address accessibility issues, they will still sell lots of devices!
--Pete
peter
A second thought on accessibility: it is an afterthought. "OK, the next iOS is ready for release. What next? Did we get everything? Oh, we forgot the VO bugs. We'll deal with them next time."
Thoughts
Some nice ideas posted already. What I'd like is:
1. Better native instant image descriptions.
2. An easier and quicker way to get AI image descriptions, just like Google have done with TalkBack on Android, and FS have done with Picture Smart on Jaws. I don't mind if these are Apple Intelligence, or utilise Be My AI or what, but please make it less of a rigmarole.
3. More one-finger gestures on iOS, such as the L-shaped and back-and-forth gestures in TalkBack on Android. I know I've been banging this drum for a long time, but I really want it to be easier to use my phone one-handed, and to have more gestures available to assign actions to. They don't have to be compulsory if some people don't like them, just available to those who do.
4. Improve keyboard access and text editing on iOS, and text editing on Mac.
5. When I press Play on audio or a video, pause VoiceOver speech automatically, so I don't have it speaking over the start of the audio / video.
6. I like the clipboard manager idea above.
7. On Mac, literally scrap VoiceOver and start again from scratch, i.e. build a brand new screen reader for Mac that is fit for purpose. Or really and truly fix the bugs, inconsistencies and usability issues, whatever approach will give best results.
8. Provide proper APIs allowing third parties to build screen readers, particularly for the Mac.
Accessibility is not an after thought
I am not defending Apple. I am not a fan of Capitalism.
However, I will say that accessibility is not an afterthought. It is a major component of all their devices. We saw this with the Vision Pro, a device most of us wouldn't even have cared about if VoiceOver weren't implemented. How many blind people really bought that thing? And yet, accessibility is a major part of visionOS.
There are issues, yes, and those issues are pretty frustrating. And there needs to be a willingness to recognize that fixing bugs isn't as easy as finding them and fixing them. These platforms have literally hundreds of thousands of lines of code. Not only is it difficult to find the problem; fixing one thing can often break six other things.
Sometimes these bugs stay put because they are the lesser problem. For example, fixing the double-spacing no-period-announcement bug (13 years old) could cause a focus issue, make the status bar inaccessible, crash the Mail app, and make VoiceOver start speaking letters as numbers. And this is not an exaggeration.
These things take time, they take patience, and they take persistence. I would say be persistent. Keep pushing for improvements.
Please don't just come to this forum and leave post after post about how accessibility stinks, accessibility is an afterthought, blah blah.
At this point, I am honestly very curious about HarmonyOS and its accessibility. Chinese phones and devices seem pretty interesting; NearLink, for example, is amazing. However, the Apple engineers work hard and they are definitely trying. They are making improvements and changes, and keeping up with evolving technology. They deserve some consideration.
@Ash Rein
I will agree with you that accessibility is not an afterthought at Apple; it is part of the product design of all of their products. Want proof? Look at the Vision Pro: it was in there from day one.
Ash Rein
Perhaps when they release a product it is included, but when it concerns bugs related to VO, it is an afterthought. How long did it take to address the braille bugs? From what I keep reading here, there are bugs that last longer than a bug affecting sighted people would ever be tolerated. That is an afterthought. Recall the bug affecting VO and phone calls that took a long time to be addressed; even J. Mosen said that if it had affected sighted people it would have been fixed. I believe it had to do with VO stopping after ending or starting a call.
With regards to my last post…
With regards to my last post, I would also love to see an update to the handwriting feature for VO. Instead of having a handwriting rotor option, I think it would be better if Apple made it an additional keyboard, like how we have our regular keyboard plus the emoji keyboard. Wouldn't it be great if there was a third option, a handwriting keyboard? It would be really useful for those times when you need to type in sensitive information, considering the amount of privacy and security one would have drawing out that information with Screen Curtain enabled. I realize you can already do this with the handwriting rotor, but that particular rotor is all kinds of broken.
I also agree with both Dave and Ash with regards to additional one-handed gestures, and the persistence of development on accessibility, especially where VoiceOver is concerned.
PS Ash Rein For President! 🇺🇸😃✌️
What I would like to see
I would like the process for opening unsigned apps on the Mac to be a lot easier than it currently is. When "allow apps to be downloaded from anywhere" is selected under Privacy & Security, there is no "Open Anyway" button when you scroll to the bottom of the screen, as there was in previous versions of macOS, and I, for one, find this frustrating.
watchOS 12
Hi Guys,
What would you like to see Apple add in the watchOS 12 betas this year? I would like to see the ability to customize your VoiceOver rotor in watchOS 12. I would also like the ability to copy and paste text, similar to other platforms.
Accessibility features may…
Accessibility features may not be an afterthought; maintenance does seem to be, though.
This isn't restricted to accessibility: "new" in the tech world seems to be king. Being able to say a device has accessibility from the word go is very good for marketing, though. In the case of the Vision Pro, I question the point, and whether resources could have been better used to fix the issues listed here. Of course, if we hadn't had accessibility on a product made by Apple, irrespective of the price and the fact that it has the word "vision" in the title, we'd have been disappointed, myself included. It just seems a lot of work for a tiny number of blind and partially sighted users when far more users could benefit from VO 2.0.
The issue we face is Apple continuing to create solutions we don't want. Magnifier with door detection is nice, and the OCR is okay, but it seemed to have more value for promotional materials than for end users. We have Seeing AI, which does a far better job of OCR.
But again, this isn't accessibility-specific. Apple makes all sorts of nonsense; remember the Clips app? Certain tools or apps seem to be made because they can be made, not because they should be.
Accessibility isn't an afterthought; it's a marketing strategy. Once it has served its purpose, to be "new", it drops down the priority list, more so, I think, than features the sighted use. Which makes sense: there are more of them.
Oliver
Apple needs to steal JAWS's new idea of letting blind people suggest new features and adding them to VO. I believe Picture Smart came from that. Instead of giving us what Apple thinks we need, they should ask, and run a suggestion contest with prizes; the best ideas can be used.
Re: Vision Pro
I think many of us, myself included, have developed a level of trust that if Apple release a new product, then we will be able to use it without sight. Although the Vision Pro seems not really to be useful to anyone, I think it would set a very awkward precedent if they had decided on our behalf that this wasn't something for us.
I do agree that otherwise they seem more interested in headline new features that show how much they care about accessibility than making the existing ones work well. I think it is easier to build a new feature than it is to try to figure out some of these problems that have been around for a while and are likely both very complicated and often very specific.
What Apple needs to do is invest in a really comprehensive and well-thought-out automated testing solution that would enable them to change VoiceOver's existing behaviour without breaking things. And if VoiceOver in its current state cannot be made testable, then it should be consigned to history and something new should be built with testability in mind.
The problem with Apple is that they are a very arrogant company. Because of their market share they can just do whatever they want and people will just fall in line behind them. This has some benefits, I guess, because it is more likely that they would come up with something out of left-field that no one had ever considered before. But it does mean they are more interested in self promotion than fulfilling the needs of their actual customers.
I can imagine that VoiceOver on the Mac is an incredibly intimidating and complicated beast. It seems that whenever they change anything they break something else because there is no control over it. And from Apple's point of view, it's up to us to adapt to the new unwanted behaviour because they know best. But it's probably a lot simpler for them to just mess with the UI and introduce a new keyboard customisation thing that no one asked for rather than fixing the things that get sent to their inbox on a daily basis. But these are the things that actually matter. Everything else is just sugar coating.
Adding new features and customisation options will not make the overall experience better. It will make it more complicated and harder to test, as well as taking resources away from the things that matter.
Sorry I'm ranting again.
I agree with vision os…
I agree about visionOS. Nobody asked them to do it, but they still did. This is something we have to recognize: the fact that any blind person can go to an Apple Store, buy whatever device, and be independent, or close to independent, for first setup and general usage is incredibly valuable.
If you think accessibility is an afterthought on macOS, and yes, we can agree VO has some long-standing problems there, then you haven't tried a product where accessibility really is an afterthought. Spend five minutes with Wayland and GNOME/KDE on any Linux distro with a GUI and you will see what a real afterthought looks like (the easy installers of Asahi Linux, for example; you can technically create your own image with some other distro, but that's not in my skill set currently). When some TTY installs of a Linux distro don't even have speech, and the only way forward is to force a screen reader into the image, or something like adding speech to the bootloader, that is indeed an afterthought, because you have to do everything yourself. Not that I am attacking that software; it's just that accessibility is far easier when there is actual money to put into it, which the open source community doesn't have unless a developer is aware of the need or has a personal interest in it. For the record, the Windows bootloader, or whatever loads first, is still not accessible, because accessibility is just not there. There have been requests to add to NVDA the same trackpad-style feature that VoiceOver has on macOS, but it's not possible because there are too many different touchpad drivers. So no, on Apple platforms, accessibility is not an afterthought.
As for Apple, even sighted non-fanboys are starting to rant on Reddit and other forums about how bugs are everywhere. I don't mean that this minimizes our situation, not at all, but the tech industry is changing and capitalism has its flaws.
Sorry to be off topic. Guys, as a rule of thumb, before posting anything here as a bug report or suggestion, please get into the habit of sending it in a cleaner form to Apple themselves first. I am not saying that bugs in macOS 15.3 should be normalized, but a lot of things have been corrected behind the scenes since the first developer beta of Sequoia, to the point that I kept my Mac. Generally speaking, for a Windows user, the iWork suite has become so much more usable now. Okay, Terminal sucks, but if I have to use GUI text editors and can still use Vim on my Arch Linux Stormux Raspberry Pi 400 install, then I will, and I am doing it! :) I can still code, and most things in the terminal are still perfectly usable, so as long as I am still productive...
My request is to make iPhone Mirroring simply AirPlay VoiceOver's TTS and other sounds to the Mac and send keyboard events back to the phone. That would be so much easier to implement than the incredibly over-engineered approach they have now. Nobody wants to report a bug that only happens on several screens of Instagram version X, on iOS version X and macOS version X, where VO is unable to read anything on the screen even though VO-Shift-L clearly shows the content is there. There is just too much room for things to go wrong, and things are indeed going very wrong: the feature is still practically useless for us, because we have a buggy Mac VO interacting with so many layers of iOS. All we really want is for iPhone VoiceOver to be controllable with Mac hardware, and that's it.
Please add native Screen Recognition to VoiceOver on the Mac.
Make text selectable and easy to navigate in Preview for PDFs.
Fix the nested-list problem in PDFs.
Make selection actually work on the web when the macOS standard commands don't play well with the VO ones (completely broken for me on all browsers, with every VO setting combination I've tried according to that selection page in the Mac VoiceOver user guide).
Make it easy to actually skim through text via a proper scrolling mechanism, the way sighted people do.
Create a GarageBand VoiceOver user guide for the Mac; you already have one for iOS.
Keep up the Freeform accessibility work; there are some bugs, but overall it's impressive, as the app is 101% visual.
We really want to use the trackpad properly, like sighted users do. Make it so background apps don't behave as if they were in the foreground when the pointer is over them in non-fullscreen. Create a possible third trackpad mode where we have greater control, hitting the sweet spot between VoiceOver's way and the sighted way of interacting with the trackpad; as macOS is so point-and-click, this would be a game changer.
Create a real scripting language for VO so third-party app developers can implement per-app accessibility; this is seriously the weakest side of Mac accessibility. There is so much you can do with AppleScript, which isn't even updated anymore, and Catalyst will be the death of it anyway.
Keep up with the latest PWA practices and web standards, and add HTML tab navigation to the rotor across the ecosystem.
Create a proper channel for accessibility bug reporting and tracking, as so much is lost in the crowd. I have reported roughly 30-40 bugs: about 10 corrected, 2-3 officially acknowledged. This is wrong.
Release accessibility-specific updates so we don't have to wait months or years for a critical accessibility bug to eventually be solved.
To the users: I feel AppleVis is a great community, but the community aspect has taken over the bug-report aspect. Just an opinion. Maybe, as with guides, the editorial team could come up with a bug-reporter role, where the team validates each post before it goes up, so reports are a bit more official than forum posts? And create a video/audio section in the podcast for bug demos... Food for thought.
TheBllindGuy07
Apple did not create VO out of the goodness of their heart. Apple PCs were part of schools and government, and subject to law 501, which requires accessibility. I believe it was 501; someone please correct me. Same for Microsoft. Before anyone jumps on me: I have had an iPhone since the 4, and now have an iPad 9 and a Watch Series 9.
I agree with TheBllindGuy07
I agree with TheBllindGuy07: we can get any Apple product and know it is accessible. Do Apple products have bugs? Every platform has bugs; no software is bug-free. As for building a new screen reader, that is easier said than done.
Better integration with AI
I would love to see better integration with AI in iOS 19. Having the ability to rewrite an email is great, but what about the ability to tell me about my emails through a voice conversation? Have you ever watched the movie Her? Now, parts of that movie were just weird and kind of creepy, but the way the AI handled his email was way cool. He could respond to emails by voice while walking down the sidewalk, and the AI even let him know when an important email (in this case, from his divorce lawyer) came in and needed his response. I so want an iOS AI to work like that. It's great to have conversations with ChatGPT, but that AI can't do anything on my phone, like send an email, or summarize a document unless I upload the document to it. I want everything to just be integrated.
Just some thoughts on a Friday afternoon.
Jim
Jim D
Wait 20 years from now. Assuming humans are still here and have not blown themselves to h., AI could be a game changer.
Accessibility and Apple
I don't believe that accessibility is an afterthought at Apple. I think they are truly committed to making their products as usable as possible by as many people as possible.
What I do think is missing is a more direct interaction with the people who do rely on accessibility. As someone mentioned, developers such as those who made JAWS, NVDA, Zoom Text, etc. are often interacting in more direct ways with their users such as webinars, both private and public beta testing, and soliciting feedback and suggestions that get directly fed back to the team that does the development for these specialty products. Thus the interactions, feedback, and fixes don't get swamped by being part of a much larger organization whose focus is on serving the wider community of users.
For example, if the accessibility team develops a fix for some accessibility issue, the users have to wait until the organization as a whole wants to push out an update to the wider community of users. Thus the accessibility fixes can lag behind.
Note how many smaller developers you see interacting directly with users on AppleVis to solicit feedback, explain problems, etc. and how often this feedback and interaction leads to those products fixing accessibility issues and implementing new capabilities suggested by users in a timely fashion. That just doesn't seem to happen as readily with big companies.
Of course, I don't envy Apple's job, since visually impaired users are only one of the many groups using Apple products that have accessibility concerns. Perhaps interacting more directly with each specific group would require too many resources that could otherwise be used to actually develop solutions and fix problems. I guess it is always a trade-off. There are always pushes and pulls in large organizations, and it is sometimes hard for the tail to wag the dog.
--Pete
Fix the bugs
Fix the bugs. I also think Apple should do what Google does with TalkBack and make VoiceOver updatable separately from the OS.
Perfectly unrelated but you…
Perfectly unrelated, but you guys just showed me that my username was misspelled. Better late than never!
A very important one across the ecosystem: don't create VO fragmentation. There are literally more features on the iOS side: the script to reconnect a braille display in iOS 18, the Nemeth braille code for equations arriving on macOS ages after iOS, iOS having more sound options than macOS VO, (some) Siri voices having extra settings on iOS that are not replicated on macOS, etc.