What's New in iOS 15 Accessibility for Blind and DeafBlind Users

By Scott Davert, 19 September, 2021

Member of the AppleVis Editorial Team

Introduction

Another Autumn means another release of iOS. Just like releases of iOS dating back to 2009, this release brings a lot of new features and functions for blind and DeafBlind users. Major changes in iOS 15 include the new Focus mode, several enhancements to FaceTime, the ability to use Siri for certain tasks while offline, and much more. Many blogs will be highlighting these mainstream enhancements, so I will not discuss them in great detail here. This article covers the accessibility features impacting individuals who are blind or DeafBlind.

One of the joys and curses of getting a new release from Apple is that they do not actively document the changes in accessibility with their products. This is good for me because it gives me the chance to share new features with my readers, but it is also a challenge. While I have worked extensively with iOS 15 since the first beta release in June, there will inevitably be things that I have missed. This is also part of the fun; whenever I discover a new feature, it's almost like solving a puzzle or getting an early birthday present. Prior to upgrading, I strongly encourage people to check out AppleVis' list of bugs present and resolved in iOS 15. If you are reading this article after subsequent versions of iOS 15 have been released, it is possible that some of the bugs present in the initial release of iOS 15 will have been resolved; you can always check The AppleVis Bug Tracker for the most up-to-date list of active iOS and iPadOS bugs. With that said, here are the changes Detective Davert found while investigating accessibility with iOS 15:

An Important Note About Older iPhone Models

There is good and bad news concerning older iPhone models. The good news is that if you were able to install iOS 14 on your device, iOS 15 will also work on it. I installed iOS 15 on an iPod touch 7 and found that it worked quite smoothly: there was no noticeable difference in the amount of time apps took to load, no degradation in the speed at which I could type, and there were indeed some new functions. However, certain features are not supported on devices older than the iPhone XS, because those devices have slower processors which cannot handle some of the new enhancements. For a list of features announced at WWDC which will not be available on older devices, check out this article from iPhone Hacks. Note that the iPhone SE 2020 and iPhone XR both support all of the upgraded functionality in iOS 15.

Some General Changes in iOS 15

This section highlights a few mainstream enhancements which I felt were the most significant from an accessibility perspective. It is not a comprehensive list; there are already many such lists available online.

Siri

As briefly noted above, it is now possible to have Siri perform certain tasks without using the internet. This allows you, for example, to have Siri enable accessibility features without an internet connection. It also means that certain tasks can be completed much more quickly. Opening apps, setting timers, toggling features, and several other options are available. Using Siri offline does not require you to change any settings; it is part of the upgrade.

Follow-up requests are now more usable with Siri. For example, if you are browsing a contact and say "message them," Siri will open a new message to the contact shown onscreen, which you can then dictate. Siri has also gained new supported languages, including Danish, Finnish, Swedish, and Norwegian.

Picture This Text

One of the new functions in iOS 15 is called Live Text. Available on the iPhone XS and later, Live Text allows you to read text in photos and directly through the Camera app. Similar to the Short Text option in Seeing AI, Live Text uses AI to identify text. Unlike Seeing AI, though, Live Text in iOS 15 does not require an internet connection to work. However, Seeing AI will run on devices older than the XS and of course has other capabilities besides text recognition. Both services are free.

After launching the Camera app, if text is detected in the viewfinder, VoiceOver will announce "text detected." If you find the "Detect Text" button and double-tap it, you will be presented with the text that was found. As with taking any sort of photo with a portable camera, it is important to do this with a steady hand. As with Seeing AI and any other app which performs OCR, this can be quite challenging for a braille-only user, who needs both a steady hand and the ability to read braille one-handed. The fact that text has been detected is conveyed with the alert messages function, so this could also present challenges for slower braille readers. For example, if the text goes out of focus before the "Detect Text" button is activated, the button will disappear and you will need to focus on the text and try again.

If text has been detected, activating the option to read the text will present it in a way which is easily readable by VoiceOver and braille. Selecting the "Detect Text" button also puts a stop to the camera attempting to recognize other things in your environment.

Next to the detected text, there will be a series of buttons which enable you to take action on the text. Select All, Copy, and Look Up are all options for basic text. If the detected text includes a phone number or email address, that text becomes a link, and you can activate it to carry out the default iOS action for that type of content. I recently scanned a few business cards from a convention I attended in 2019, and was able to get the printed contact information from each card. The same set of options is available in the Photos app, so if you have a photo with text, you can also take advantage of Live Text.
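For readers who also build apps, the Vision framework has offered on-device text recognition since iOS 13, and it is similar in spirit to what Live Text does for end users. Below is a minimal sketch of that developer-facing API; it is not Apple's Live Text implementation, and the function name and queue choice are my own.

```swift
import UIKit
import Vision

// On-device text recognition with the Vision framework. This is similar
// in spirit to Live Text, but it is not the Live Text feature itself.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the best candidate string for each detected text region.
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try handler.perform([request])
        } catch {
            completion([]) // recognition failed; report no text
        }
    }
}
```

In practice, an app would call this with a photo or camera frame and present the recognized lines however it sees fit, for example in a text view that VoiceOver and braille can read normally.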

Don't Leave Meeeeeee!!!

Within the Find My app, there are now options to notify you when you have been separated from an AirTag. If, for example, I have attached an AirTag to my wallet and the wallet falls out of my pocket, I will be notified that I have left it behind. Once I am notified, I have all of the options associated with Find My, which will hopefully allow me to get the lost item back.

VoiceOver Changes in iOS 15

Quickly Accessing Even More Settings

Over the years, Apple has come up with new ways of providing quicker access to VoiceOver's functions. With iOS 15, there is a new way to access even more settings. Whether you are suffering from a cluttered VoiceOver Rotor or running out of custom gestures, you now have another option with VoiceOver Quick Settings. You can access the Quick Settings menu by performing a 2-finger quadruple-tap, or pressing VO+v on a hardware keyboard. Note that as of iOS 15.0, there is no braille display equivalent for the VoiceOver Quick Settings command, and no option to add this to a keyboard assignment.

Not only is it possible to use the settings in this menu, but you can also add, remove, and reorder its items. To do this, head over to Settings > Accessibility > VoiceOver > Quick Settings and view all of the possibilities. You will notice that these items are adjustable like the Rotor, and that you can double-tap to go into a submenu and change the setting that way. Many of the options in Quick Settings mirror those available from the Rotor, so you can save the Rotor for things like navigation, if you wish, and use Quick Settings for other needs. Reordering the items in this menu is also easy and works the same way as reordering items in the Rotor.

Grouped Navigation

Grouped Navigation is another new VoiceOver feature in iOS 15. The purpose of this navigation style is to group similar screen items together so that interacting with the screen becomes more manageable; for example, in the Mail app, the message list within an email account is its own group. However, it's important to note that Grouped Navigation is a work in progress, as my struggles with the feature indicate. In Mail, for example, having Grouped Navigation enabled allows you to swipe past the "Dictate" button, but VoiceOver will not continue on into the group of messages. Instead, you must touch the screen, and when you do, VoiceOver will automatically start interacting with the group. You can also move by container to get into the group, but that seems like more trouble than it's worth. The intent is for this feature to work as it does on the Mac, as evidenced by the 2-finger swipe right to begin interacting with a group and the 2-finger swipe left to stop interacting. On a keyboard, VO+Shift+Down Arrow will enter a group, while VO+Shift+Up Arrow will stop interaction. There appears to be no braille display equivalent for these commands. At the moment, I am not finding Grouped Navigation to be a feature I will be using. If you find it useful, you can add it to the VoiceOver Rotor to toggle it off and on, toggle it in Quick Settings, or go to Settings > Accessibility > VoiceOver > Navigation Style.
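For app developers wondering how their screens end up grouped, VoiceOver builds groups from the accessibility information a view exposes. One existing UIKit hint, available since iOS 13, is marking a container as a semantic group; whether and how Grouped Navigation honors a given hint is up to VoiceOver, so treat this as a minimal sketch rather than a recipe. The class name and label are placeholders.

```swift
import UIKit

// A container view whose children are hinted to VoiceOver as one group.
// .semanticGroup is an accessibility hint; whether Grouped Navigation
// uses it for any particular screen is up to VoiceOver itself.
final class MessageListContainer: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        configureGrouping()
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        configureGrouping()
    }

    private func configureGrouping() {
        accessibilityContainerType = .semanticGroup // hint: these children belong together
        accessibilityLabel = "Messages"             // spoken name for the group
    }
}
```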

More Verbosity

Verbosity has again been expanded to include new capabilities. To check out what's new, you can read on, and/or follow along by navigating your way to Settings > Accessibility > VoiceOver > Verbosity.

One of the new items you can adjust is the way in which you are informed of the status of Quick Nav. When it changes from one setting to another, you can now have this information conveyed with speech, a sound, a change in the pitch of speech, braille, or nothing at all.

From time to time, my flashlight gets turned on by accident. This is frustrating; unless someone is around to let me know that it has been activated, the only ways I typically find out are that my phone is really warm or my battery takes a quick nosedive. With iOS 15, under the Verbosity menu, there is now an option to be notified when the flashlight is enabled. When you lock the screen, there will be a speech announcement that says, "Flashlight on." If there is a braille flash message that comes along with the "Screen Locked" message, I did not notice it when testing this functionality. If the flashlight is left on for more than a minute after the screen is locked, VoiceOver will also post a notification on the Lock Screen.

In iOS 14, Verbosity allowed users to control whether they are informed when the Actions rotor is available. A nice addition to this menu is the option to have that information sent to the braille display. While it can be a hindrance in the Mail app, since you would need to read "actions available" with each new line of text, it can be of value in other circumstances, especially in apps which take advantage of this rotor option where it may not be obvious until you check for it. One other added control is the ability to specify whether numbers are spoken as words or as digits.
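For developers curious about what puts an app in that category: the "actions available" hint appears when an element exposes custom accessibility actions, which is what populates VoiceOver's Actions rotor. A minimal UIKit sketch, with hypothetical action names and empty handlers:

```swift
import UIKit

// Exposing custom accessibility actions puts an element into VoiceOver's
// Actions rotor, which is what triggers the "actions available" hint.
// The action names and empty handlers here are purely illustrative.
final class EmailCell: UITableViewCell {
    func configureAccessibilityActions() {
        let archive = UIAccessibilityCustomAction(name: "Archive") { _ in
            // Archive the message here.
            return true
        }
        let flag = UIAccessibilityCustomAction(name: "Flag") { _ in
            // Flag the message here.
            return true
        }
        accessibilityCustomActions = [archive, flag]
    }
}
```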

Explore the Expanded Descriptions

One thing missing from VoiceOver's image and text recognition in iOS 14 and earlier was the ability to explore what is positioned where within an image. That option is now available for any photo. Not only can you expand the description by choosing the new rotor option, but you can then flick left and right through the detected content, which gives a good overview of what is in the photo. If you encounter an image in another app, this feature is available there as well. I am finding image descriptions more useful now that the technology has matured. The amount of detail available within these images is amazing, and I also like the new ability to explore by swiping or touching the screen. This puts the information into a format which does not rely on the VoiceOver alert messages feature, which is more difficult to access, especially for a braille user who may be a slower reader.

Another somewhat related enhancement is the ability for content creators to add image descriptions in Markup that can be read by VoiceOver. This allows creators to write their own descriptions, which can be more accurate than what the AI-driven options produce.
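Markup covers images you annotate yourself; inside an app, the long-standing developer-side way to supply a description that VoiceOver reads is the accessibilityLabel property. A minimal sketch, with a hypothetical image name and wording:

```swift
import UIKit

// Supplying a human-written description that VoiceOver will read for an
// image, instead of relying solely on automatic image descriptions.
let photoView = UIImageView(image: UIImage(named: "conference-badge"))
photoView.isAccessibilityElement = true
photoView.accessibilityLabel = "Conference badge listing the speaker's name and session number"
```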

Rotor Settings Become Less Mysterious

Prior to iOS 15, a VoiceOver user trying to adjust certain settings would have to change them in order to figure out the status of that rotor option, for example whether a certain feature was on or off. In iOS 15, VoiceOver gives you that information each time you move to specific rotor options, such as the speech rate.

Right-to-Left Support

For those who read languages written from right to left, VoiceOver can now be set to reverse its left-to-right navigation. This setting can be controlled under Quick Settings when the app developer supports it. By default, it is set to "Automatic."

Visual Accommodations

No One Size Fits All

In iOS 13, VoiceOver users were introduced to a feature called Activities, which let the user control how specific apps behaved based on an Activity. Think of an Activity as a group of custom settings; for example, if you preferred one voice for reading your email messages and another for reading your books, you could set these options accordingly. However, this was a VoiceOver-only function. iOS 15 changes that in many ways.

If you head over to Settings > Accessibility > Per-App Settings, you will have the ability to customize many of the visual settings for each app on your device. If you want Invert Colors on for one app, for example, and not for another, this menu will allow you to specify your preference. It is also possible to apply a setting to all apps or to the Home Screen. Some of the settings you can change include contrast settings, Smart Invert, Cross-Fade Transitions, and many more.

The other thing the per-app functionality brings is the ability to set text size on a per-app basis via Control Center. To do this, you must first add the option to Control Center, which can be done under Settings > Control Center. Then, while in any app, open Control Center and select the Text Size control. You will be presented with a screen that has a slider to change the text size and a toggle that lets you choose between all apps or only the app you're currently using.
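Whether the per-app text size actually changes anything depends on the app supporting Dynamic Type, so that its text tracks the system content size category. A minimal sketch of how a developer opts a label in; the custom font is just an example:

```swift
import UIKit

// Opting a label into Dynamic Type so system-wide or per-app text size
// changes are reflected automatically.
let label = UILabel()
label.font = UIFont.preferredFont(forTextStyle: .body)
label.adjustsFontForContentSizeCategory = true

// A custom font can track the same setting through UIFontMetrics.
// "Avenir-Book" is just an example face.
if let customFont = UIFont(name: "Avenir-Book", size: 17) {
    label.font = UIFontMetrics(forTextStyle: .body).scaledFont(for: customFont)
}
```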

An Independent Magnifier

The Magnifier has been located under the Accessibility settings menu for many releases of iOS, and it has expanded dramatically over the past couple of years while still being able to do things like spot reading. In iOS 15, Magnifier becomes its own standalone app.

Hearing

Accommodate These Headphones

Headphone Accommodations for supported Beats and AirPods headphones were introduced in iOS 14. When the feature is turned on and you have a set of supported headphones, you are walked through a form of hearing test which asks you questions about what you hear. Based on that information, iOS optimizes the sound for your hearing loss. iOS 15 makes this easier by allowing you to import a paper or PDF audiogram. You can still take the test described above, but importing seems much faster, and the resulting sound has been reported to be nearly as effective as the manual method. To access these new accommodation options, which are in a different menu than in iOS 14, head to Settings > Accessibility > Audio/Visual > Headphone Accommodations.

Hanging out in the Background

Also under Settings > Accessibility > Audio/Visual, you will find an option called Background Sounds. These sounds can come in handy to help someone focus, but they can also potentially help people with tinnitus. Not only can you play a background sound, you can also customize how it behaves: you can independently control the volume of the sound, whether it continues when media is playing, its volume while media is playing, and whether it stops automatically when your screen is locked. Background Sounds is a nice function, though it would be even nicer if there were a way to quickly start and stop playback through a control in Control Center.

The sounds you can choose from include: Balanced Noise, Bright Noise, Dark Noise, Ocean, Rain, and Stream. The recordings are in stereo and are of decent quality.

Other Changes

I Wasn't Done Reading that Braille Flash Message, Thanks for Holding!

As noted above, I appreciate that iOS 15 has reduced its reliance on what it refers to as VoiceOver announcements. Not that the information previously provided through them wasn't useful, because it was; but that particular format caused issues for some braille users, who would try to read a VoiceOver announcement only to have it disappear too quickly. This was a particular challenge when accessing rotor options. iOS 15 has done away with delivering rotor status and some other information this way, which makes it more accessible to slower braille readers.

Switch Control Becomes Auditory

Sound Actions for Switch Control let you control your iPhone with simple mouth sounds. This allows much easier access for some users who can't, or don't wish to, use options like physical switches, a mouse, or other gadgets. You can also combine this feature with other forms of Switch Control, so I imagine it will be helpful for many.

New Sounds to Recognize

iOS 15 also adds two new sounds to its Sound Recognition options, under the household category. Breaking glass can certainly be handy, not only to alert you to something being dropped, but also to help you determine whether someone is breaking in. I found that recognition of breaking-glass sound clips played through YouTube was mostly reliable, but just as with the other sounds, the sound must be well above the level of the other sounds around you. I did not have any false positives with this sound.

The other new sound is the kettle. I couldn't get this one to work, but it would be handy to have, since this sound is outside my hearing range.
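For the technically curious, Apple also exposes a built-in sound classifier to developers through the SoundAnalysis framework. The sketch below shows that developer-facing API under my own assumptions about setup; it is not how the Sound Recognition feature itself is implemented, and a real app would also need microphone permission and proper error handling.

```swift
import AVFoundation
import SoundAnalysis

// On-device sound classification with SoundAnalysis' built-in classifier.
// This is the developer-facing counterpart to features like Sound
// Recognition, not Apple's own implementation of that feature.
final class SoundListener: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        let streamAnalyzer = SNAudioStreamAnalyzer(format: format)
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try streamAnalyzer.add(request, withObserver: self)
        analyzer = streamAnalyzer

        // Feed microphone audio into the analyzer. A real app needs the
        // NSMicrophoneUsageDescription key and user permission.
        input.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
            streamAnalyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
        }
        try engine.start()
    }

    // Called each time the classifier produces a result.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        print("Heard \(top.identifier) (confidence \(top.confidence))")
    }
}
```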

Conclusion

Though iOS 15 does not bring quite the volume of enhancements we have seen in some past releases, Apple continues to expand and enhance its accessibility features with each release. Overall, I have noticed some general stability improvements in this release. Whether you should upgrade to iOS 15 is an individual decision no one should make on your behalf. Remember, the ability to revert to iOS 14 is available for a short time after release, but this process is not for the faint of heart. Once Apple stops signing iOS 14, though, there will be no going back. I would again encourage users to take a look at the bugs listed on the AppleVis website, and to check out iOS 15 on another device if possible prior to installing it on your primary device. iOS 15 is a free upgrade for all iOS devices which run iOS 14, and can be downloaded and installed by visiting Settings > General > Software Update.

Comments

By Ginsenshi on Tuesday, September 21, 2021 - 18:01

I like everything under More Verbosity, especially Quick Nav, since the audio notifications are hard to hear or tell apart, even with my hearing aids in.

By Unregistered User (not verified) on Tuesday, September 21, 2021 - 18:01

I frequently get lost in the rotor, and certain items insist upon rearranging themselves despite me explicitly setting an order, so the new menu sounds like a great addition.

The improvements seem to reflect how iOS 15 is, overall, a fairly conservative update compared to other releases. If it means more bug fixes, that's perfectly fine by me. I'll be waiting for the .1 or .2 for some of the more egregious bugs that'll inevitably rear their ugly heads so they can be ironed out.

By Missy Hoppe on Tuesday, September 21, 2021 - 18:01

I enjoyed reading this. There's plenty to get excited about, but my phone is too old to support most of it. I just don't know what new phone to get. Kind-a wanted to try to hold off for one more year, but some of these features are way too tempting.

By Jo Billard on Tuesday, September 21, 2021 - 18:01

This is exciting! I have an older model, and although I'm disappointed I won't be able to make use of everything here, I am excited to learn what new features I will have. If I'm missing out on too much, I guess I'll be upgrading.

By Pildain on Tuesday, September 21, 2021 - 18:01

Now text replacement works in braille screen input. I was waiting for this feature; I use it a lot on my Mac, and having my replacements in the braille keyboard has boosted my typing speed on the iPhone.

I created an abbreviation to insert my work Zoom link, and nothing happens with BSI, but it works with the standard on-screen keyboard. No idea what the problem could be.

By Rafal on Tuesday, September 21, 2021 - 18:01

Hi, how do you use text replacement in BSI? I find the new iOS nice, but I cannot adjust Screen Time limitations, and when I place the camera over printed text, nothing happens. Maybe I should turn on an option to recognise text in VO settings? Cheers!

By Holger Fiallo on Tuesday, September 21, 2021 - 18:01

I do not know what I did, but now I cannot get VO to read my messages. I know I have one due to the haptic on my watch. When I open the phone, VO reads it. Help. Using a 12 Pro and a Series 5. I think it was Focus that did it, but I do not know. All settings are on for Messages.

By Peter Holdstock on Tuesday, September 21, 2021 - 18:01

In reply to by Holger Fiallo

Hopefully I’ve understood your problem correctly. Do you mean that when you receive a notification on your watch about a message, it’s not being read? I’m having a problem where any notification which pops up on my watch isn’t read by VoiceOver. However, if I go into Notification Centre on the watch, I am then able to read the notifications. To do this, while on the watch face, flick down until you get to notifications, then double tap.