After getting fed up with Apple's general neglect of macOS accessibility, and having wanted to work on something meaningful for quite some time, I decided to attempt something that, for some reason, nobody seems to have tried before: writing a completely new screen-reader for that platform. This isn't an easy task, not only because of the amount of work required to even come close to matching a mature screen-reader in functionality, but also because Apple's documentation for the more obscure system services is nigh on non-existent. Despite that, and having already overcome a lot of hurdles that I thought would be show-stoppers, after a single week of work I already have something to show, albeit at a very embryonic stage of development. The idea is to gauge the community's interest in a project similar to NVDA for the Mac, to be worked on in the coming years and to which other people can contribute.
The project is called Vosh, a contraction of Vision and Macintosh, though the name isn't set in stone yet, so if you wish to suggest something different, feel free to do so. The code isn't available yet; I haven't even made my first local commit, as I'm still learning the ins and outs of the consumer side of Apple's accessibility framework and will thus likely end up refactoring the whole thing. Once I feel comfortable with the structure of the code, I will post it to my personal GitHub profile for everyone to see and modify as they please, and will also start accepting code contributions.

At the moment, the only thing this project does is let you navigate the accessibility tree of every app, reading everything and moving the keyboard focus as it attempts to read the contents of each accessibility element. There's a lot to do to even come close to matching the level of sophistication of VoiceOver, which employs a number of hacks to make things feel as smooth as possible to the end-user. One such hack is flattening the accessibility tree so that it appears very shallow, particularly in Safari, where the underlying tree can actually be quite deep. It is my intention to make navigating with Vosh as close as possible to navigating with NVDA. For that reason, I will try my best to copy NVDA's behavior and laptop keyboard commands as much as possible, so that Windows NVDA users can feel more at home while using a Mac.
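For readers curious how a screen-reader reads "every accessibility element", here is a minimal sketch of walking an app's accessibility tree with Apple's public `AXUIElement` API. The structure is an assumption about how a project like this might start, not Vosh's actual code, and it requires the process to be granted Accessibility permission in System Settings.

```swift
import AppKit
import ApplicationServices

// Sketch: dump the accessibility tree of the frontmost application,
// printing each element's role and title, indented by depth.
// Requires Accessibility permission (System Settings > Privacy & Security).

/// Copies a single attribute value from an element, or returns nil on failure.
func copyAttribute(_ element: AXUIElement, _ attribute: String) -> CFTypeRef? {
    var value: CFTypeRef?
    let result = AXUIElementCopyAttributeValue(element, attribute as CFString, &value)
    return result == .success ? value : nil
}

/// Recursively visits an element and its children.
func walk(_ element: AXUIElement, depth: Int = 0) {
    let role = copyAttribute(element, kAXRoleAttribute) as? String ?? "(no role)"
    let title = copyAttribute(element, kAXTitleAttribute) as? String ?? ""
    print(String(repeating: "  ", count: depth) + "\(role) \(title)")
    if let children = copyAttribute(element, kAXChildrenAttribute) as? [AXUIElement] {
        for child in children {
            walk(child, depth: depth + 1)
        }
    }
}

if let app = NSWorkspace.shared.frontmostApplication {
    // An application-level AXUIElement is the root of that app's tree.
    let appElement = AXUIElementCreateApplication(app.processIdentifier)
    walk(appElement)
}
```

A real screen-reader would not dump the whole tree at once; it keeps a cursor into the tree and fetches attributes lazily, which is also where VoiceOver's flattening hack comes in.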
Before posting this, I also posted a demo video to YouTube showing that it is already possible to do some very basic navigation of websites like Reddit using Vosh, though very inconveniently, both because I haven't implemented semantic navigation and because I haven't implemented the aforementioned accessibility-tree-flattening hack employed by VoiceOver. I apologize in advance for my diction: while I write English every day, I rarely speak the language, and it is also not my native language, so if you don't understand something I say, feel free to ask here. I also apologize for the quality of the video; the original file was almost a gigabyte in size, so I rather over-reduced the resolution, resulting in a huge degradation in image quality that I wasn't aware of until someone sighted actually watched it.
Comments
Re: Highlighting
If you mean when you expand a selection using the text caret, yes, it can be read.
If you mean when you move the screen-reader cursor to the element, there are several pieces of optional information that can be present and can be read too.
If you mean reading the element under the mouse cursor, yes, there's a way to do that too.
The project is currently stalled, though, because I've been trying to reverse engineer VoiceOver to extract undocumented and undeclared notification identifiers, and to figure out how to make Vosh notice changes to the frontmost window without polling the system-wide element. Also, the video linked in the original post no longer reflects the state of the project.
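For context on the polling problem: the documented alternative to polling is registering an `AXObserver` on the target application so the system delivers notifications such as `kAXFocusedWindowChangedNotification`. The sketch below shows that documented mechanism only; it does not solve the harder problem mentioned above, which involves undocumented identifiers, and it assumes Accessibility permission has been granted.

```swift
import AppKit
import ApplicationServices

// Sketch: observe focused-window changes in the frontmost application
// via AXObserver instead of polling the system-wide element.
// Requires Accessibility permission.

guard let app = NSWorkspace.shared.frontmostApplication else { exit(1) }

// The callback is invoked on the run loop whenever a registered
// notification fires for the observed element.
let callback: AXObserverCallback = { _, element, notification, _ in
    print("Received \(notification) for \(element)")
}

var observer: AXObserver?
if AXObserverCreate(app.processIdentifier, callback, &observer) == .success,
   let observer {
    let appElement = AXUIElementCreateApplication(app.processIdentifier)
    AXObserverAddNotification(observer, appElement,
                              kAXFocusedWindowChangedNotification as CFString, nil)
    // The observer delivers notifications through a run-loop source.
    CFRunLoopAddSource(CFRunLoopGetCurrent(),
                       AXObserverGetRunLoopSource(observer),
                       .defaultMode)
    CFRunLoopRun()
}
```

Note the limitation that motivates the polling workaround: an `AXObserver` is bound to one process, so tracking the frontmost window across app switches additionally requires watching `NSWorkspace.didActivateApplicationNotification` and re-registering the observer.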
Use by person with only one hand
My friend, who can only use his right hand, recently purchased a Mac, but was unaware that Sticky Keys on the Mac will not work with VoiceOver, because it only allows one key to be held down at a time and, of course, VoiceOver requires several keys most of the time. Will your screen reader be easier to use for my friend? Will it work with Sticky Keys? Will it make use of any shortcuts for voice input? Will it also enable or encourage the use of trackpad gestures? Just wondering if this is a viable alternative. Any notions about timing? Thank you very much, and I'm very glad this work is happening.
Abandoned
I've abandoned this project.
Originally I wanted to build something that could showcase my ability to develop for Apple technologies, with the intent of landing a job as a native iOS developer. However, since nobody replied to my applications, I have switched to a much more interesting, yet niche, area and language: embedded / bare-metal development in Rust. Unfortunately, while I have a lot of free time on my hands, I really do need a job, so currently that's the highest priority in my life.
The project's repository will remain public just in case someone wishes to fork or learn something from it, but I'm unlikely to pick it up again in the near future.
I believe it's possible to use VoiceOver with just one hand, thanks to QuickNav, Single-Key QuickNav, and VO locking. To enable QuickNav, press the Left and Right Arrows simultaneously (by default); to enable Single-Key QuickNav, press VO+Q (by default); to enable VO locking, press VO+Semicolon. Unfortunately, even though VO locking appears designed for single-handed users, the position of the keys, at least on US laptop keyboards, makes it very hard to reach, since there's no Control key anywhere near the Semicolon. It can also be activated from the VoiceOver Help Menu (VO+H) -> General -> Toggle the VO Modifier Lock On or Off, and might be callable from Keyboard Commander, possibly using AppleScript, so you might be able to map it to a more accessible key combination (I didn't test this).
I wish you
best of luck in your future dev projects. This was quite an ambitious undertaking, and I'm even happier that you had the humility to abandon it at the right time, in a way that's smoother for everyone.
Work in Progress: Vosh - A New Screen-Reader for The Macintosh
Hi, I am Ayub, and I am visually impaired, or legally blind. I would like to participate to make this project better. Are there any conferences I can attend so that I can advocate for what needs to be done in this project? Thank you, and keep up the hard work! I know this is going to be a long project, but when the first beta version comes out, can you leave a comment on this post?
That's fine. Everyone's next…
That's fine. Everyone's next computer should be an ARM-based Windows machine anyway.
Concur
Preach it, Brother Ollie! 😇
Re: Use by person with only one hand
There are a couple of things your friend could do to make VoiceOver easier with one hand. Firstly, the NumPad Commander is great: I rarely have to use VO modifiers, as most things can be done there with one hand. You can use NumPad 0 as a modifier, which gives you a lot of keys to use.
Obviously it requires a numpad to use. I believe you can buy Bluetooth numpads; I would be doing that if I were using the laptop keyboard.
The other thing, which I don't use, is single-key navigation. (Pre-Sonoma this was part of Quick Nav, but I believe it isn't now.)
Interesting
Hopefully someone will pick up production of this screen reader, regardless of its current status. I like the concept; if people were motivated enough to finish it, it could be comparable to VoiceOver.
It will be a shame if it was…
It would be a shame if it were comparable to VoiceOver.