Ahead of Global Accessibility Awareness Day (GAAD), Apple is offering a preview of accessibility features coming to its platforms later this year, including Accessibility Nutrition Labels for App Store apps, Magnifier for Mac, Braille Access, and more.
Accessibility Nutrition Labels
- Similar to the Privacy Nutrition Labels for App Store apps that Apple introduced in 2020, developers will soon be able to submit Accessibility Nutrition Labels to let users know which accessibility features are supported by, or have been tested in, their apps. According to Apple, Accessibility Nutrition Labels will be voluntary at the outset in order to give developers ample opportunity to prepare and evaluate their apps, with a requirement for developers to share this information coming in the future.
Vision accessibility features
- Magnifier for Mac will allow you to use your iPhone's camera, or another camera attached to your Mac, to zoom in on your surroundings, such as a screen or whiteboard. Multiple simultaneous sessions will allow you to, for example, follow a presentation using your Mac's built-in camera while reading a document using Desk View with your iPhone's camera.
- Braille Access seeks to emulate the experience of using a Braille note taker on iOS, iPadOS, macOS, and visionOS by providing a custom interface for common functions of dedicated braille note takers. With Braille Access, you will be able to take notes, perform calculations using Nemeth code, and open Braille Ready Format (BRF) files directly. Additionally, an integrated form of Live Captions will allow you to transcribe conversations in real time on braille displays.
- visionOS is getting several enhancements that could improve its accessibility potential, including Live Recognition, Zoom, and a new "Trusted apps" API that will allow apps like Be My Eyes to use the Apple Vision Pro's main camera to facilitate live person-to-person visual assistance and interpretation.
- Accessibility Reader is a new reading mode designed to improve the legibility of text, with the ability to customize fonts, colors, and spacing, useful for those who have low vision, in addition to those with reading disabilities like dyslexia. This can be used system-wide on iOS, iPadOS, macOS, and visionOS, and integrates with the Magnifier app, allowing you to take advantage of its benefits to read text in your physical environment.
- CarPlay will support Large Text.
Hearing accessibility features
- Live Captions is coming to watchOS, allowing you to view captions on your Apple Watch of speech picked up by your iPhone's microphone when Live Listen is on. In addition, you will be able to turn Live Listen on and off from your Apple Watch, eliminating the need to interact with your iPhone to perform this task.
- Background Sounds will gain customizable EQ settings, which may help improve focus or relaxation, particularly for those with tinnitus. In addition, Background Sounds is gaining new Shortcuts actions, as well as the ability to specify how long sounds should play.
- Music Haptics on iPhone is gaining new customization options for conveying different parts of songs, as well as the intensity of taps, textures, and vibrations.
- Sound Recognition is gaining the ability to recognize when your name is being called.
- With CarPlay, those who are deaf or hard of hearing will be able to use Sound Recognition to detect the sound of a crying baby, in addition to other sounds outside of the car.
- Live Captions adds support for English (India, Australia, UK, Singapore), Mandarin Chinese (Mainland China), Cantonese (Mainland China, Hong Kong), Spanish (Latin America, Spain), French (France, Canada), Japanese, German (Germany), and Korean.
Other updates
- Personal Voice will be faster and more natural sounding, with voice creation taking less than one minute and requiring only 10 recorded phrases, as opposed to the 150 currently required. In addition, Personal Voice will add support for Spanish (Mexico).
- macOS will support Vehicle Motion Cues, which can help reduce motion sickness when using your Mac in a moving vehicle. In addition, you will be able to customize the onscreen dots that appear when using your iPhone, iPad, or Mac with this feature turned on.
- Eye Tracking on iOS and iPadOS will integrate with Switch and Dwell Control for faster selections and typing, in addition to improved keyboard support on visionOS.
- Head Tracking will allow you to control your iPhone or iPad with head movements.
- iOS, iPadOS, and visionOS will add a new protocol to support Switch Control for brain computer interfaces (BCIs).
- The TV app will support Assistive Access with a simplified media player. In addition, developers will be able to use a new API to create custom app interfaces for Assistive Access tailored for those with intellectual or developmental disabilities.
- Voice Control will have a new programming mode in Xcode for those with limited mobility.
- Similar to Portable Preferences on macOS, you will be able to temporarily share your iPhone or iPad's accessibility settings with another device, useful if, for example, you are borrowing a device or using one in a kiosk.
Feature Availability
According to Apple, these new accessibility features will be available later this year. At the time of publication, we do not have any further information about how these features will be implemented or any other possible upcoming changes for blind, DeafBlind, or low vision users.
What do you think of the accessibility features Apple announced for Global Accessibility Awareness Day 2025? Let us know in the comments!
Comments
Regarding Braille Access Mode
Braille Access Mode requires a Braille Display.
It was mentioned on the DoubleTap Podcast.
So no Braille Display, you're out of luck.
It's on the episode called "Inside Global Accessibility Awareness Day 2025: What Tech Companies Are Doing."
You can find it around the 42 minute mark.
But what I don't get is…
But what I don't get is that, according to Steven from Double Tap's interview with Sarah, it will only work if you connect a physical braille display to the Apple device... Not sure about this QWERTY braille keyboard everyone (including me) is talking about / wants. That would be nice, yet another commander on the Mac :) .
@
Yep didn't read your message before mine.
Again, I am excited for the Magnifier though, even for blind users, if it can have some scanning capabilities like text detection... And it seems like it will.
now I'm confused
Spoiler alert.
I listened to that double tap episode and Sarah does say that the Braille Access does require a connected Braille display. In a way that does make sense if the feature is ecosystem-wide.
But upon reading the press release again, it says you can open apps, take notes, and perform calculations by typing with Braille Screen Input or with a connected braille display. This makes me wonder: will you be able to read back BRF files, and the notes you take using BSI or a display, with speech?
I don't see this totally replacing dedicated note taker hardware though. Case in point, the magnifier app that's been in iOS since 2016, and that they're bringing to the Mac, hasn't replaced handheld magnifiers. There are a zillion of them and new ones are always coming out. It would be interesting to hear from a low vision user how the iOS apps compare to the dedicated ones, like the Magna ones from Orbit or the Explore ones from Humanware.
Spoiler alert, wwdc is like…
Spoiler alert, wwdc is like in 3-4 weeks! We'll see then.
Braille note taker
OK I warned a dumb question was forthcoming. As someone who has only dipped their toes into braille, I am struggling to understand exactly what this is all about.
I have an Orbit Reader 20 and can obviously use it to control my Mac or iPhone. So I can take notes and read things in braille. I can type in braille if I want to. Honestly, my braille isn't good enough yet to brave any of the system menus, so I've never tried using it standalone.
So - is this just a simplified thing to make it easier to access these functions? In which case, what has this got to do with braille as such? Or is it just useful because braille is possibly a slower way of accessing the normal screen?
Or is it because it is providing an easier way to input braille using different systems like for maths etc? So it's an input interface?
I guess we don't know the details. But given that possibly all braille displays have a built-in note taker, why would we want to plug one into an Apple device and use it instead? Given that I would then need two things and not just one?
Maybe someone who uses a braille note taker could shed some light on what kind of thing would be useful? Other than those who prefer to type in braille over a qwerty.
Feels to me like it might be better to make the overall experience better and quicker rather than whatever this is supposed to be. Is this just a way for Apple to avoid having to fix some more substantial bugs?
@mr grieves I think this is…
@mr grieves I think this is primarily for people who prefer Braille input and output. I’m thinking the note taking in this mode will be Braille formatted (brf files), which are made to be read on a Braille display or embossed into hard copy.
Braille Access is also supposed to have some kind of math mode, where you can input Braille math content. It's hard to know from the screenshot if this is a full-on calculator. Math is not my thing though. Lol
Not all Braille displays have stand alone note taker features (mine, 4th gen Focus 40, for instance). So having these features integrated into iOS could be good for people who have an older display like that, or for people in the US who have an NLS Ereader which doesn’t include a built-in editor. Some people might also like these kind of features integrated into Apple’s platforms.
Focus blue 40 4th gen as…
Focus blue 40 4th gen as well.
Still a little confused
OK, so VoiceOver currently outputs braille and I can input braille with my Orbit. Is the difference that this is natively braille and isn't being translated? Because obviously I can use my Mac entirely with my Orbit Reader 20. Well, I can't, but someone competent could!
Is there an advantage if you are working purely electronically? Is it more that if you want to print it, or something, that it is braille formatted properly?
Interesting to know about the focus - I just thought they all had this built in. But I would imagine Apple building this isn't just going to be to support old tech.
Public Apple engagement
I kinda heard somewhere that Apple employees aren't even allowed to browse public forums or social networks during work hours, so interacting with the public on company-related subjects even outside the company is likely strictly forbidden as well. If this is the case it's actually a policy that I can understand, because there's a chance that, in the heat of a debate, one might accidentally spill out business secrets by incorrectly judging what is and is not publicly available information. Honestly I can't even think how much NDA-breaching public pressure Apple employees face on a daily basis, which is a huge responsibility with lots of reputational and career-damaging potential.
Braille access
The way I understand it, Braille Access allows you to edit and create BRF files as well as Nemeth code. It's going to be interesting to try, especially since Nemeth code can be used for calculations.
For anyone curious, I previously posted a comment where I had ChatGPT describe the images in the press release.
these features sound…
these features sound interesting!
though I hoped Apple would work on fixing issues in its OS too, and especially give a little attention to the Mac.
I can see some of them being helpful to my family members who're disabled but not too disabled hehe.
And Braille Access sounds exciting... am curious what they do with it without the hardware... maybe transferring files to and from displays, or just being able to read BRF files? Dunno... would be curious if it'd be useful, because braille devices are hella costly here.
But I hope they remember they can never replace the devices built for this purpose. So if they'd work in conjunction, that'd be good.
USB Braille Support
If this new Braille Access feature requires a physical display, I hope they've finally realized they need to include the same universal USB support in iOS and iPadOS that's in macOS. HID support over USB is a half-baked solution. Not all displays support HID, and not everyone wants and/or can afford a new display, including me. I have a perfectly functional Focus 40 Blue I prefer to use over USB, because I'm uncertain of the status of the battery. If Android finally did it, Apple can as well.
Magnifier for Mac + This Accessibility Reader Big Win for Me
Like I said previously in this thread, all these upcoming accessibility enhancements/features sound very promising. If I'm reading things correctly, the new accessibility reader in combination with the magnifier app will hopefully be a big win at least for me. As stated previously, I've had mixed results scanning groceries and other stuff. Not to stray too much off topic, but health and nutrition has been a big thing for me these days.
Feeling mostly indifferent about new features
The nutrition labels have potential, but I think giving the feature that name might end up being a bit random and confusing for a lot of users. Instead of adding more new features, I wish Apple would take a year or two to fix all the long-standing accessibility bugs. For the most part, things are pretty good on my iPhone 16, but tvOS 18.5 is a complete train wreck; I still have a wish list in the movies app, and it's all but impossible to navigate it now. Also, VoiceOver just keeps repeating messages non-stop sometimes, and it never used to do that on my Apple TV, or at least not to the extent that it does now.