It's hard to believe that iOS will be turning into a legal adult next fall, yet the iOS 17 upgrade is far from minor, with many new features and functions available to everyone. To check out some of the mainstream changes, Apple's official iOS 17 preview page may help. Alternatively, our main post announcing iOS 17 offers a long list of mainstream enhancements as well as a list of accessibility changes. Not all devices that supported iOS 16 will support iOS 17; for a list of devices that will, check out this CNET article. The goal of this article is not to repeat the information widely available from other sources, but to look at the new accessibility features in more detail. I would like to thank my colleague Juan Ramos for his contributions to the article from a visual perspective.
Mainstream Changes
Siri Again
The Siri voices found in iOS 17 can now be sped up or slowed down when used with VoiceOver. Also, the word "hey" is no longer necessary when summoning Apple's virtual assistant. Finally, it is possible to adjust Siri's speech rate even when it is not being used with VoiceOver. This setting is found under Settings > Accessibility > Siri, and takes the form of an adjustable slider.
Transcription of Audio Messages
iOS 17 also brings the ability to have your audio messages transcribed, along with any type of voicemail. While voicemail transcripts are usable with braille and VoiceOver, transcripts of audio messages currently are not. Once you receive an audio message, you can select it and then find the "More" button, which brings up a transcript of the audio message on screen. However, VoiceOver still reports "Transcript not available".
VoiceOver
Get Notified Only When You Want
For many years, VoiceOver users have had the option of whether to have VoiceOver always speak notifications or not. Even though the setting was there, if I had speech on while it was set to off, notifications were still read aloud. In iOS 17, this has been improved and expanded to include several options. To check them out, head to Settings > Accessibility > VoiceOver > Verbosity > System Notifications. Here you will find options for how you would like VoiceOver to handle notifications in different situations. When the screen is locked and a notification arrives, VoiceOver can now either speak notifications as they arrive or speak the number of notifications received since the device was last unlocked. It is also possible to show these notifications in braille only, or to have the device do nothing. One can also configure what happens with banner notifications in this submenu; the options are to speak, play a haptic, show in braille, or do nothing. Finally, it is possible to silence VoiceOver's output altogether when notifications come in by turning the Ring Switch on or off.
More Speech Customization
iOS 17 also brings new speech options to all supported VoiceOver voices. These new settings can be found by navigating to Settings > Accessibility > VoiceOver > Speech and then selecting the language and desired TTS voice. After selecting the voice, flick up or down with one finger, or press space with dot 3 or dot 6 on a braille display, to find a new feature called Per-Voice Settings. The available options in Per-Voice Settings vary based on the synthesizer. For example, Vocalizer voices offer options for customizing sentence pause and timbre, while the Alex voice offers customizations for pitch range and minimum and maximum WPM. Thankfully, in all instances, Per-Voice Settings includes a "Reset to Voice Defaults" button in case the changes you have made are not to your liking. Every per-voice settings screen also has a "Preview" button, so you can listen to what you have created and decide whether you've made a monster, or a monstrously good-sounding voice for your needs.
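For developers curious how adjustable voice parameters look in code, here is a minimal sketch using Apple's public AVSpeechSynthesizer API. This is only an illustration of the general idea; VoiceOver's Per-Voice Settings (including timbre and sentence pause) are system-level options that this API does not expose.

```swift
import AVFoundation

// A minimal sketch of adjustable speech parameters using Apple's public
// speech API. VoiceOver's own Per-Voice Settings are not controlled here.
let synthesizer = AVSpeechSynthesizer()

let utterance = AVSpeechUtterance(string: "Testing a customized voice.")
utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
utterance.rate = 0.55            // 0.0 to 1.0; the default rate is 0.5
utterance.pitchMultiplier = 1.2  // 0.5 to 2.0; 1.0 is the voice's default
utterance.volume = 0.9           // 0.0 to 1.0

synthesizer.speak(utterance)
```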
In iOS 16, Apple added the Eloquence text-to-speech engine to its list of available speech options. With the 17th iteration of iOS, many options have been added to the Eloquence voices. After navigating to Settings > Accessibility > VoiceOver > Speech > Voice > Eloquence > the preferred voice and then flicking up twice, the user will find an option called "Open Per-Voice Settings". Among the options, the user can now configure rate multiplier, head size, pitch, pitch range, breathiness, and roughness. It is quite difficult to explain each of these parameters in writing, so I would advise those with interest to check out the AppleVis podcast by Thomas Domville which demonstrates these features; it should be available shortly after the release of this article. Also on this screen is a toggle for a higher sampling rate. Other new options for Eloquence include phrase prediction, whether you would like the abbreviation dictionary to be in use, and the community-driven pronunciation dictionary.
Predictably Better!
This is true, at least where it concerns how VoiceOver communicates the presence of predictive text. One can locate these settings by going to Settings > Accessibility > VoiceOver > Verbosity > Predictive Text Feedback. There are two new sets of options: one for communicating when predictive text appears, and the other for communicating when predictive text has been successfully entered. With both, you can have the information communicated by playing a sound, speaking, through braille, by changing the pitch of the speech, any combination of those options, or not at all. It may be worth noting that some VoiceOver users report that recommendations are silent in certain apps even though they have selected speech, while others have found that with predictive text configured not to speak recommendations, VoiceOver still reads them out in some apps, such as Mail. So these options are a great idea, albeit a work in progress.
New Haptic Feedback
In iOS 17, VoiceOver uses haptic feedback to communicate more things. For example, when the Lock Screen is open and the display goes to sleep, the user will feel a gentle haptic in addition to hearing the high-pitched sound that indicates the display has dimmed. Haptics are also now played when an element is activated, complementing the familiar sound that has accompanied element activation since the very early days of iOS. There is now also an option to customize haptic intensity. To customize all things related to VoiceOver haptic feedback, go to Settings > Accessibility > VoiceOver > Audio > VoiceOver Sounds & Haptics.
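For developers wondering how apps produce this kind of feedback, here is a minimal sketch using UIKit's public haptics API. VoiceOver's own haptics and their intensity setting are system-level; this simply shows the same building block available to app code.

```swift
import UIKit

// A minimal sketch of app-generated haptic feedback, similar in spirit to
// VoiceOver's new haptics. The intensity parameter is available on iOS 13+.
let generator = UIImpactFeedbackGenerator(style: .light)
generator.prepare()                      // warms up the Taptic Engine
generator.impactOccurred(intensity: 0.6) // 0.0 (off) to 1.0 (full strength)
```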
Braille
Connection Concerns
Several users have reported that the braille display connection process in the public release of iOS 17 has become a bit unpredictable. In this author's experience, there are significant connection issues with the Focus displays as well as the Braille Sense 6, though both the Brailliant BI 20X and Mantis Q40 connect very reliably. Other users have reported no issues with the Focus, so your experience may vary.
Sound Off With Sound Curtain!
There are quite a few new options for braille users in iOS 17, some of which are only applicable to the braille experience. For example, one new feature is called Sound Curtain. Much like Screen Curtain blanks the screen, Sound Curtain mutes all sounds other than emergency alerts. To enable it, go to Settings > Accessibility > VoiceOver > Braille and turn on the switch; you will be asked to confirm before the change takes effect. Music, VoiceOver speech, and other sounds from your iPhone will then be silent. This can be helpful to many, but especially to those braille users who do not have enough hearing to tell when speech is on; Sound Curtain brings more assurance that the user's speech and sounds are all turned off. Like Screen Curtain, Sound Curtain doesn't so much turn the sounds off as cover them up. For the moment at least, turning speech on will still interrupt the audio of hearing aids that VoiceOver is routed to, even though the speech itself is silent. If you have hearing aids connected and do not wish to have your hearing interrupted when you accidentally turn on speech, it has been my experience that routing audio to the internal speaker of the iPhone resolves this issue.
Synchronized Startup
Continuing in the Braille submenu, there is also a new feature which allows the user to have Bluetooth turn on automatically whenever VoiceOver starts. This means that if a user turns Bluetooth off by accident, for example, restarting VoiceOver will turn it back on. The good news is that this is a toggle, so the user can turn the behavior on or off as needed.
No Naps in iOS 17
Traveling to Settings > Accessibility > VoiceOver > Braille > [your braille display model] > More Info, you will encounter a new setting which allows you to control whether a braille display stays connected while your phone is locked. It is on by default, which matches the behavior we have seen in the past; keeping the braille device in range and connected at all times ensures there are fewer Bluetooth connection issues. The drawback, of course, is that having a braille display always connected can drain the battery quite quickly.
Faster Launching for Braille
iOS 17 also brings a new way to launch applications from the Home Screen in braille. From the Home Screen, press dot 8 (or space with dot 8 if in 8-dot braille), type the name of the app you are looking for, and then press dot 8 (or space with dot 8) again. Whether you use contracted or uncontracted braille, a list of matching apps will appear with a full cell at each end of the display. Move through the matching items by pressing space with dot 1 or space with dot 4. When you find the app you want, press dot 8 again, or a cursor routing button if you prefer, and the app will open. If you wish to cancel your search without launching an app, press space with b to return to your Home Screen.
New Commands!
There are a couple of new commands available for braille display users as well. Navigate to Settings > Accessibility > VoiceOver > Braille > [your model of braille display] > More Info > Commands, and under the "Keyboard" category, one of the new options is to toggle text selection. Though this function mostly works as it should, it is a bit challenging to use, since VoiceOver often does not announce in any way when text selection is turned on or off.
There have long been commands to go to the first item, the last item, and many other places. iOS 17 adds a command to move to the center of the screen. This seems to work as advertised, and if you know the layout of an app, it can be a real productivity gain.
Low Vision
Freeze Frame!
For those who are sensitive to rapid animations, you can now control the minimum and maximum frame rate of content. Check it out under Settings > Accessibility > Motion > Frame Rate. It's also possible to stop animated images from playing in both Safari and Messages: head over to Settings > Accessibility > Motion > Auto-Play Animated Images and turn it off if that is preferred.
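On the developer side, apps can already respect related Motion preferences through UIKit. The sketch below checks the long-standing Reduce Motion flag; I have not confirmed whether the new frame rate and animated image settings expose their own public flags, so none are assumed here.

```swift
import UIKit

// A minimal sketch of honoring the user's Motion preferences in app code.
// Reduce Motion has been queryable for years; iOS 17's new frame rate and
// auto-play settings are not assumed to have public flags of their own.
func reveal(_ view: UIView) {
    if UIAccessibility.isReduceMotionEnabled {
        view.alpha = 1.0  // apply the final state with no movement
    } else {
        UIView.animate(withDuration: 0.3) {
            view.alpha = 1.0  // run the full animation
        }
    }
}
```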
Grayscale Gets More Variable
If you have some type of color filtering enabled, there is a new setting which allows you to control the intensity of the grayscale applied by any of the existing color filters. Previously, when Grayscale was applied, it was always at maximum intensity; now you are able to control the amount of grayscale applied to your screen. Head over to Color Filters under Display & Text Size in Accessibility, enable Color Filters, tap on Grayscale to apply it, and scroll toward the end of the list of filters. You will notice a heading labeled "Intensity." Use this to apply less or more intensity; by default, it is set to 100%.
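For developers, UIKit has long offered a flag indicating that grayscale is on; the new intensity percentage does not appear to be exposed to app code, as far as I can tell. A minimal sketch:

```swift
import UIKit

// A minimal sketch: detect that the user sees the screen in grayscale and
// avoid color-only cues. The new intensity value is (to my knowledge)
// not available to app code.
if UIAccessibility.isGrayscaleEnabled {
    // Prefer text labels, shapes, or patterns over color-only indicators.
}
```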
Point and Speak
Available on the Pro and Pro Max models of the iPhone 12 and later, Point and Speak allows the user to identify text by physically pointing at it with their finger. It can be found in Apple's Magnifier app as part of Detection Mode. Point and Speak lets you move your finger around the viewfinder and, if lined up properly, it will identify the text closest to your finger. It is useful when working with things such as touchscreen keypads or appliances. You can also point to a specific section of text and have that spoken. Like the other functions of Detection Mode discussed in the iOS 16 article, this feature requires more hands than a deaf-blind person typically has. As a speech user, I find that Point and Speak is usable, but I had trouble, for example, figuring out how to use it with a microwave or air fryer. While one hand holds the phone with its rear-facing camera about a foot from the text, the other hand must point at it. This takes some coordination and guesswork without any visual reference point, but it can be done. For braille users, even if you have a third hand, my testing found that the output sent by VoiceOver is not helpful: instead of getting a flash message on the display of what is verbalized, the braille user reads "[speak]{prosody}." Though this technology may be extremely challenging for braille users on a phone, it would likely be of much higher value in a head-worn device such as the Apple Vision Pro, as one could point with one hand and use the other to read their braille display.
All of the Text Detected
The other new functionality the Magnifier gains in iOS 17 is Text Detection Mode. Unlike Point and Speak, this works similarly to the Live Text option in the Photos app: it reads any text that is detected within the viewfinder. This can help in circumstances where Point and Speak may not, such as when trying to read a document or sign.
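For the technically curious, the same general capability, on-device text recognition, is available to developers through Apple's Vision framework. The sketch below is not Magnifier's actual implementation, just an illustration of recognizing text in a captured image.

```swift
import UIKit
import Vision

// A minimal sketch of on-device text recognition with Apple's Vision
// framework: the general capability behind features like Text Detection.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else { return completion([]) }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the single most confident transcription of each detected line.
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate

    try? VNImageRequestHandler(cgImage: cgImage).perform([request])
}
```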
Hearing
A New Form of Control
For cochlear implant or hearing aid wearers who use an MFi-compatible device, there is a new, quicker way to control what appears in the hearing section of Control Center. To check out the new options and set them up, go to Settings > Accessibility > Hearing Control Center. The options themselves aren't exactly new; what is new is the ability to choose which of the available options are present when selecting "Hearing Devices" in your Control Center. This can speed up productivity, since you can remove the options you don't want. For example, I would prefer not to have Background Sounds in my Control Center options since I do not often use that feature. Now, those who want it can have it, while I can remove it.
Speech Features
Speech has now earned its own category, which contains two new features: Live Speech, available on all devices supporting iOS 17, and Personal Voice, which is only supported on the iPhone SE 3 as well as the iPhone 13 and later models. Each will be discussed in turn.
Live Speech
Live Speech allows the user to have a chosen voice speak for them over video calls or in person. After enabling it under Settings > Accessibility > Live Speech, there are options to set up favorite phrases and to choose the voice you would prefer to use; all of the voices available in VoiceOver are options. After setting it up, you can use the Accessibility Shortcut to launch Live Speech. The icon will be located below the status bar; braille users will need to find it by pressing space with dots 4-5-6 twice. After activating Live Speech, part of the screen allows you to toggle between your favorite phrases and a keyboard. Moreover, if you use multiple keyboard languages, you can change the voice output to another language by switching between the keyboards you have enabled. For example, if English (US) is your default and you have added Spanish (US) as a secondary keyboard, switching to the Spanish keyboard will make the output switch to the voice you have set for that language. The voices do not seem to have the same inflection found when using VoiceOver or the voices created for Siri, but it's a tool that could come in handy.
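To illustrate the per-language voice switching for developers, here is a minimal sketch using Apple's public speech API. Live Speech itself is a system feature; this only shows how a synthesis voice can be selected by language code.

```swift
import AVFoundation

// A minimal sketch of picking a synthesis voice by language, the same
// general mechanism that lets Live Speech follow the active keyboard.
let synthesizer = AVSpeechSynthesizer()

func speak(_ text: String, languageCode: String) {
    let utterance = AVSpeechUtterance(string: text)
    // Returns a default voice for that language, or nil if none is installed.
    utterance.voice = AVSpeechSynthesisVoice(language: languageCode)
    synthesizer.speak(utterance)
}

speak("Hello, how are you?", languageCode: "en-US")
speak("Hola, ¿cómo estás?", languageCode: "es-MX")
```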
Personal Voice
iOS 17 also brings a new feature called Personal Voice. It can be used in conjunction with Live Speech to allow an individual who has preserved their voice to continue using a cloned version of it in phone calls, though for conversational purposes the output is a bit flat. To set up a Personal Voice on one of the supported phones specified above, go to Settings > Accessibility > Personal Voice and follow the prompts. You will be given several hundred phrases to speak into your iPhone's microphone, and iOS will inform you if it detects that your audio level is too low or too high. Though it has been written that this process takes roughly 15 minutes to complete, it took me around 30. After the phrases are recorded, you will need to leave your screen locked and, preferably, your phone connected to power. One can still use the phone while the voice is created, but it does take several hours to complete. After setting your newly created Personal Voice as the voice under Live Speech, you can begin typing and press return to have the text spoken. Though the clone of my voice does seem to sound like me, it is my hope that it can become more animated with future updates. I typed the following into the text field, exactly as written: "Hello, this is Scott Davert. Or is it really a fake version of me? I HAVE NO IDEA!!! Do you???" Though the result is not entirely flat, I would have expected a bit more inflection from a sentence written in all caps with three exclamation points. That said, this version of voice cloning is free and does not appear to limit usage, while more advanced voice-cloning services require a monthly subscription and cap the number of characters you can convert each month.
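Apple also lets third-party apps speak with a Personal Voice, with the user's permission, through a public API added in iOS 17. Here is a minimal sketch; the voice itself can only be created in Settings as described above.

```swift
import AVFoundation

// A minimal sketch of using a Personal Voice from app code (iOS 17+).
// The user must grant permission, and the voice is created only in Settings.
let synthesizer = AVSpeechSynthesizer()

AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
    guard status == .authorized else { return }

    // Personal voices show up alongside system voices, tagged with a trait.
    let personalVoice = AVSpeechSynthesisVoice.speechVoices()
        .first { $0.voiceTraits.contains(.isPersonalVoice) }

    let utterance = AVSpeechUtterance(string: "Hello, this is my own voice.")
    utterance.voice = personalVoice  // falls back to the default voice if nil
    synthesizer.speak(utterance)
}
```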
Other New Features
Assistive Access
Assistive Access allows for the customization of certain apps; what can be configured depends largely on the app itself. This function lets the user choose which apps will be included with Assistive Access and add only certain apps to a Home Screen. Assistive Access is set up through a start-up wizard which guides you through the process. After setting up how icons should appear, the user is presented with a list of apps they can choose to include on the Home Screen. The configurability of each app is highly contextual. For example, with Messages, one can limit the user of the iPhone being set up to contacting and seeing messages from only certain contacts. There are also options for having tapped messages spoken aloud, for whether the user sees details such as a message's status and when it was sent, and for limiting input to the keyboard, a video selfie, or emoji. The options for calls are somewhat similar: there is the ability to restrict calls to and from specific contacts, to show or hide the keypad, and to control whether calls are taken on the speaker. It is also possible to set up a passcode so that one can be assured the user will not exit Assistive Access mode unintentionally.
Once the user has configured Assistive Access to their preference, they can save the changes and then use the Accessibility Shortcut to enable and disable it. It is my hope that this type of customization will continue with other features. For example, I work with many individuals who are deaf-blind and who may only wish to learn how to text and use Mail. It would be helpful in some situations to be able to set up the Messages app to show only the Back button, the keyboard, the conversation history, and the Send button. This would cut down on the unnecessary clutter a slow braille reader must navigate through to get to what they wish to do. It's also worth noting that some testers report that Assistive Access makes the device run more slowly: swiping or tapping takes a few seconds to respond, even more so when VoiceOver is enabled.
Voice Control Gets Guides!
Voice Control has sometimes been a source of frustration, since many users weren't aware of everything the feature can do. With iOS 17, there are now all kinds of guides to help the user learn how to make the most effective use of Voice Control. These are written from the perspective of an individual who does not use VoiceOver, so VoiceOver users will need to take this into account.
A new feature has also been added which allows the user to differentiate between words that sound the same: for example, "there," "they're," and "their."
Conclusion
As always, Apple continues innovating for all users to promote a more inclusive society. For inclusion to happen, communication needs to be accessible to all, and Apple has once again taken steps toward that goal through the new features and enhancements in iOS 17. Whether you should upgrade depends on your specific use case; I would recommend checking out AppleVis' list of new and fixed bugs before doing so.
iOS 17 is a free download available for all supported devices. If needed, more information on how to update the software on your device is available on this Apple Support page.
Comments
Well explained!
Thanks for the updates
Excellent!
Scott,
I always appreciate your work and I am really happy with this excellent summation of accessibility notes regarding iOS 17. It was a very pleasant and very informative read.
Best wishes, sir,
Morgan
Community-driven Pronunciation Dictionary
As always, Scott, you do an exemplary job of describing what's new for each version of iOS from a VoiceOver and Braille perspective. A huge thanks to you for this thorough compilation!
I have enabled the community-driven pronunciation dictionary in the VoiceOver Eloquence settings, and it makes a vast difference! The listening experience is much smoother and a lot more enjoyable. How does one go about adding to this dictionary if desired, or where/how does one learn more about this resource?
Thanks,
Rachel.
So appreciated!
I so appreciate you taking the time to explain these features too!
Very Well Done
Thanks Scott for a job well done on this, as always. I am of course still using my trusty and awesome iPhone 7, but am on a family plan, so will definitely talk to my parents about an upgrade. But what I can do is install macOS Sonoma on here when it comes out next week. Or at least I'm pretty sure I can; I will have to check.
Regarding Community-driven Pronunciation Dictionary
Learn more here:
https://github.com/thunderdrop/IBMTTSDictionaries
Thanks to @Amir for all his great work there.
Community Dictionary
We have no idea what version Apple is using, nor if or when it will be updated. I assume updates would be delivered through system updates, just like anything else to do with VoiceOver.
Re: Community Dictionary
Chris, the Eloquence Community Dictionary that iOS 17 uses is at least 6 months behind the project on GitHub, and it was never updated during the iOS 17 beta cycle. The dictionary could be updated in any iOS release (17.1, 17.2, etc.) if Apple decides to do so, the same way some VoiceOver features are introduced and VoiceOver bugs are fixed, or rather, sometimes introduced.