Apple is once again celebrating Global Accessibility Awareness Day (GAAD) by offering a preview of upcoming software features designed to enhance cognitive, vision, hearing, and mobility accessibility. These features, scheduled for release later this year, aim to empower individuals with disabilities and make it easier for them to interact with technology and the physical world.
Among the upcoming features, the Point and Speak feature will be a welcome addition for individuals with vision disabilities. This feature, joining People Detection, Door Detection, and Image Description in the Magnifier app, utilizes the LiDAR technology available on selected iPhone and iPad models. By leveraging input from the Camera app, LiDAR Scanner, and machine learning, Point and Speak announces the text on various buttons as users move their finger towards them. For example, while using a household appliance—such as a microwave—Point and Speak will announce the text on each button as users move their finger across the keypad.
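Apple has not published implementation details, but the basic idea of mapping a detected fingertip to the nearest piece of recognized text can be sketched in a few lines. The toy Python function below is purely illustrative — the function name, the coordinate format, and the 40-pixel threshold are all assumptions, not anything Apple has documented. Given a fingertip position and a list of OCR results with their on-screen centers, it picks the label to announce:

```python
import math

def label_under_finger(fingertip, labels, max_dist=40.0):
    """Return the text of the nearest recognized label within
    max_dist pixels of the fingertip, or None if the finger is
    not close enough to any button.

    fingertip: (x, y) pixel coordinates of the detected fingertip.
    labels: list of (text, (cx, cy)) OCR results with center points.
    """
    best_text, best_d = None, max_dist
    for text, (cx, cy) in labels:
        d = math.hypot(fingertip[0] - cx, fingertip[1] - cy)
        if d < best_d:
            best_text, best_d = text, d
    return best_text
```

In the real feature, the fingertip position would presumably come from the camera and the depth data supplied by the LiDAR Scanner, and the labels from on-device text recognition; this sketch only shows the final nearest-label matching step.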
Point and Speak may sound familiar to those who have used or heard of the recently released VizLens app, which also offers a similar functionality called 'Live Camera Interaction.' However, Point and Speak distinguishes itself by utilizing the LiDAR technology to more accurately determine the position of the user's finger relative to the object being pointed at. This improved accuracy could potentially provide a more precise and reliable experience for users, but we shall have to wait and see whether this translates into any meaningful difference in performance between Point and Speak and VizLens.
It is worth noting that Point and Speak currently focuses solely on text recognition and does not include the identification of graphical symbols typically found on appliance controls. Furthermore, it does not provide object identification by directly pointing at the objects themselves. It is also likely that Point and Speak may not be the optimal choice for use cases involving real-time reading of text on items like food packaging or for handling longer text passages.
Point and Speak will be available on iPhone and iPad devices with the LiDAR Scanner in English, French, Italian, German, Spanish, Portuguese, Chinese, Cantonese, Korean, Japanese, and Ukrainian.
As Apple continues to expand the capabilities of its Magnifier app, it raises the question of whether these advancements hint at potential inclusion in the highly anticipated mixed-reality headset that is expected to be announced next month at Apple's Worldwide Developer Conference (WWDC).
In addition to Point and Speak, Apple has revealed other improvements for individuals with low vision and VoiceOver users. Users with low vision will benefit from enhanced Text Size adjustment across various Mac apps, including Finder, Messages, Mail, Calendar, and Notes, allowing them to personalize their visual experience according to their preferences and requirements.
VoiceOver users will enjoy more natural and expressive Siri voices, even at higher rates of speech feedback. Furthermore, users will have the ability to customize the rate at which Siri communicates, with options ranging from 0.8x to 2x. This level of customization enables VoiceOver users to tailor the auditory experience to suit their individual needs and preferences.
Notably, there is no mention of any new features or enhancements specifically tailored for Braille users. Furthermore, the number of new features and enhancements for blind and low vision users appears to be fewer compared to previous previews. This raises the question of whether Apple has prioritized addressing longstanding issues rather than introducing new capabilities.
Other features previewed by Apple include Assistive Access, which provides a customized experience for various apps, including Phone, FaceTime, Messages, Camera, Photos, and Music, in order to lighten cognitive load. These apps feature high contrast buttons and large text labels for improved accessibility. The feature also offers tools for trusted supporters to personalize the experience according to the individual's needs. For instance, Messages includes an emoji-only keyboard and the ability to record video messages for users who prefer visual communication. Users and their supporters can opt for a visually-oriented, grid-based Home Screen layout or a text-focused, row-based layout depending on their preferences.
The new features also include innovative tools for individuals who are nonspeaking or at risk of losing their ability to speak. For example, Live Speech on iPhone, iPad, and Mac allows users to type what they want to say to have it be spoken out loud during phone and FaceTime calls as well as in-person conversations. Users can also save commonly used phrases to chime in quickly during lively conversation with family, friends, and colleagues. Live Speech has been designed to support millions of people globally who are unable to speak or who have lost their speech over time.
Another speech accessibility feature is Personal Voice, a simple and secure way to create a voice that sounds like the user. Users can create a Personal Voice by reading along with a randomized set of text prompts to record 15 minutes of audio on iPhone or iPad. This speech accessibility feature uses on-device machine learning to keep users’ information private and secure and integrates seamlessly with Live Speech so users can speak with their Personal Voice when connecting with loved ones.
Apple says that "for users at risk of losing their ability to speak—such as those with a recent diagnosis of ALS (amyotrophic lateral sclerosis) or other conditions that can progressively impact speaking ability—Personal Voice is a simple and secure way to create a voice that sounds like them."
"At the end of the day, the most important thing is being able to communicate with friends and family," said Philip Green, board member and ALS advocate at the Team Gleason nonprofit who has experienced significant changes to his voice since receiving his ALS diagnosis in 2018. "If you can tell them you love them, in a voice that sounds like you, it makes all the difference in the world—and being able to create your synthetic voice on your iPhone in just 15 minutes is extraordinary."
Personal Voice can be created using iPhone, iPad, and Mac with Apple silicon and will be available in English.
Currently, it is not anticipated that VoiceOver users will be able to utilize the Personal Voice feature to create a custom text-to-speech (TTS) voice that can be used by VoiceOver. This presents an intriguing area for potential development, and we encourage Apple to explore the possibilities in this regard.
Additional Features:
- Deaf or hard of hearing users can pair Made for iPhone hearing devices directly to Mac and customize them for their hearing comfort.
- Voice Control adds phonetic suggestions for text editing so users who type with their voice can choose the right word out of several that might sound alike, like do, due, and dew. Additionally, with Voice Control Guide, users can learn tips and tricks about using voice commands as an alternative to touch and typing across iPhone, iPad, and Mac.
- Users with physical and motor disabilities who use Switch Control can turn any switch into a virtual game controller to play their favorite games on iPhone and iPad.
- Users who are sensitive to rapid animations can automatically pause images with moving elements, such as GIFs, in Messages and Safari.
More Apple Celebrations of Global Accessibility Awareness Day:
- SignTime will launch in Germany, Italy, Spain, and South Korea on May 18 to connect Apple Store and Apple Support customers with on-demand sign language interpreters. The service is already available for customers in the U.S., Canada, U.K., France, Australia, and Japan.
- Select Apple Store locations around the world are offering informative sessions throughout the week to help customers discover accessibility features, and Apple Carnegie Library will feature a Today at Apple session with sign language performer and interpreter Justina Miles. And with group reservations — available year-round — Apple Store locations are a place where community groups can learn about accessibility features together.
- Shortcuts adds Remember This, which helps users with cognitive disabilities create a visual diary in Notes for easy reference and reflection.
- This week, Apple Podcasts will offer a collection of shows about the impact of accessible technology; the Apple TV app is featuring movies and series curated by notable storytellers from the disability community; Apple Books will spotlight Being Heumann: An Unrepentant Memoir of a Disability Rights Activist, the memoir by disability rights pioneer Judith Heumann; and Apple Music will feature cross-genre American Sign Language (ASL) music videos.
- This week in Apple Fitness+, trainer Jamie-Ray Hartshorne incorporates ASL while highlighting features available to users that are part of an ongoing effort to make fitness more accessible to all. Features include Audio Hints, which provide additional short descriptive verbal cues to support users who are blind or low vision, and Time to Walk and Time to Run episodes become “Time to Walk or Push” and “Time to Run or Push” for wheelchair users. Additionally, Fitness+ trainers incorporate ASL into every workout and meditation, all videos include closed captioning in six languages, and trainers demonstrate modifications in workouts so users at different levels can join in.
- The App Store will spotlight three disability community leaders — Aloysius Gan, Jordyn Zimmerman, and Bradley Heaven — each of whom will share their experiences as nonspeaking individuals and the transformative effects of augmentative and alternative communication (AAC) apps in their lives.
"At Apple, we've always believed that the best technology is technology built for everyone," said Tim Cook, Apple's CEO. "Today, we're excited to share incredible new features that build on our long history of making technology accessible so that everyone has the opportunity to create, communicate, and do what they love."
Apple has not provided any specific release dates for the upcoming features; however, they are anticipated to be included in the next versions of their operating systems, such as iOS 17, iPadOS 17, macOS 14, watchOS 10, and tvOS 17, which are expected to be launched this fall.
We'd love to hear your thoughts on the upcoming accessibility features from Apple! Feel free to share your opinions with us by leaving a comment below.
Comments
Point & Speak and Live Speech
These features both sound incredibly interesting to me, Point and speak for obvious reasons but Live Speech suggests they are moving toward some kind of on device neural engine speech production. This could hopefully spill over into new voices for Voiceover and Siri. I’m less interested in creating a TTS of my voice but very excited Apple are back to innovating in this area. TBH they didn’t announce the single feature that would have got me most excited which would be fixing the bugs. Please Apple if anyone from the accessibility team is reading these comments, I’m literally begging. Please can you put significant resource into fixing the bugs. So many of us are so absolutely depressingly beaten by them. They are ruining our devices and therefore our ability to do things.
Personal Voice, Live Speech, and Point and Speak all are interesting
I can’t use Point and Speak until I get an iOS device with a LiDAR Scanner.
Live Speech would be interesting to play around with, and even though I don’t have a hearing disability, it might be good in the future. My hearing doctor says that because of my eye condition, Norrie disease, there is a very real chance I could become deaf in the future.
As for Personal Voice, that sounds amazing. Maybe I could grab AI clips of Trump, Obama, Biden, and other AI voices, and put them into VoiceOver.
As for the new Siri improvements, that is excellent.
Now I can make Siri faster.
Next feature, I hope, is the option to use other VoiceOver voices like Eloquence, Alex, or even the singing voices, LOL. Imagine those singing voices reading out the time for Siri and reading out my messages.
That would really make people laugh.
Very Exciting!
These upcoming features sound very cool, pun literally intended. Regarding Braille on Apple products, I still haven't managed to get everything up and running on my eReader, but that is on my to-do list. The feature that will let people record their own voice for loved ones to hear is going to be another game-changer in and of itself. Things like this are what make me proud to be an Apple customer!
Another bone
Once again Apple throws people a nice and juicy bone and people wag their tails; they did it with Eloquence and people were so happy. For the money I pay for my 13 Pro, I want VoiceOver bugs addressed, pronunciation issues with Alex fixed, and the Braille issues faced by those who use it addressed. There are more to list, but I am sure AppleVis will get Sarah to come and talk about how wonderful Apple is, and those in the Apple camp will pray to the sky with glory, glory to Apple. Before you jump on me, I have used iPhones since the iPhone 4: 4s, 5, 7, XS, 11 Pro Max, 12 Pro, and 13 Pro. I also use an iPad 9 and my Series 7. I purchased my devices; no one gave them to me. I spend my money on them. At this time Android is not as accessible compared to iPhones. I only ask that Apple focus their time on bugs and on making VoiceOver more responsive to us instead of adding new features. I asked Bella the cat her thoughts; she responded "meow, meow and meow." I am not sure I want to know what she said.
Kill the bugs! Please!
Dear Apple,
I really do love your new features. Big happy smile. Your innovations are amazing! And they are really helping people who really need help. There's only one thing.
For the love of God, would you please kill the bugs! I totally agree with Andy Lane. The bugs present in iOS 16.4.1 (including ones carried over from earlier versions) are destroying the usefulness of my iPhone, and believe me, I use my iPhone all the time. Now it's not so useful, because it takes all my time just to get the stupid thing to be of use!
Thank you, Apple. Big happy smile. Your new stuff is great. Scowl. The bugs suck.
Bruce
Accessibility Features
The Live Speech and new Siri voices sound interesting. I wouldn't enjoy using my own voice for VoiceOver. Point and Speak is also interesting, but it looks like I will need to upgrade from the 12.
As others have said, fix the bugs.
I know there's nothing we can do, apart from emailing apple and hoping for the best, but as I said in another comment, I'm not impressed.
I'm sure some of these features will help people with disabilities but they really need to start fixing bugs before adding new stuff that will probably break.
Bugs on iOS
I have been using my iOS device for a long time now. I personally have not experienced many bugs. What bugs are you experiencing?
NY city
Apple is in a race to see who has more bugs, NY or Apple. NY has bed bugs and Apple has VO bugs. Both are bad.
For Oliver
You make an interesting point about the Personal Voice feature and its possibilities. As a medical lawyer I am concerned about the impact this might have on our ideas relating to personhood. Essentially, personhood is not really the same as being alive. Your body can be biologically alive without your having the degree of cognitive and psychological continuity that you once had. The extent of continuity required for personhood is contentious, but most theories agree that there needs to be some degree of it. The expression that X or Y has 'become a different person' since A or B happened might well be a colloquialism, but actually it captures the essence of the problem quite well. Suppose that X has dementia or some such degenerative illness, and suppose he preserves his voice using Apple's Personal Voice feature, coupled with the AI capabilities you talk about, to analyse his conversations and, in effect, generate an AI-generated X that has learnt how to be the biological X. Now suppose the biological X is so ill that he loses what psychological continuity he formerly had - or, to put it another way, he loses his sense of self. Now suppose we get to the point where the biological X is still alive, but the AI-generated X thinks, talks, and reacts like X to a significantly greater extent than the biological X does. Towards which X do we feel greater affinity? Which X do we like better? Can the AI-generated X make decisions about what the biological X would have wanted, supposedly on the biological X's behalf? Does the biological X become a different person or, even worse, not a person at all? This is one that, seriously, I believe the judges of the Court of Protection might have to grapple with in the not too distant future.
In the meantime, it's good to see the changes for those with speech difficulties. Granted, I'm not one of them, but let another disability have a keynote moment in the sun. I hope it works for the people who depend on it. Nothing that exciting for blindies, but over on another thread that's pretty much what we're asking for: fix the bugs and don't worry about aught else. Well, looks like we have half of that. Bingo looks forward to the other half.
PS: Yeah, yeah - I know about Siri voices. They'll probably be all American anyway: they usually are.
to Holger Fiallo and all the negative people
First, Apple does more than any other company for accessibility! Second, YOU haven't seen iOS 17, so how do you know that the bugs aren't killed?
@paras shah and the bugs
This is the actual issue I have with Apple. If there is a bug, it will only be on some iPhones, and often the bugs will be randomly occurring and impossible to catch with the screen recording the Apple accessibility team seems to want as proof of an issue lately.
Meanwhile, there are other iPhones that are apparently bug free, therefore no bugs exist. Or that is how it comes across from the outside.
I'm not a computer programmer, nor a hardware engineer. I'm probably not even calling them the correct titles. But I know when my phone is doing weird things, like losing focus over, and over, after I've updated it. I also know when I'm getting frustrated at being pushed and told I should update to a new version that lots of people are reporting has a horrible focus bug because of security issues. It leads me to use my iPhone just about only to check the time on the lock screen and answer phone calls.
Dennis Long
Bella the cat is proud of you. It does not change the fact that VO has bugs and more bugs. Now you are going to say, all OSes have bugs! That was not an answer, by the way. Microsoft does the same with support and addressing issues, but they are not perfect. I do not want bugs, and Bella the cat does not like them either.
who cares what your cats likes
Who cares what your cat wants or likes. All operating systems have bugs.
Bella
Sad, picking on a cat. How low. Well, check below for 9to5's take on this. Not Bella, but the new features.
New Live Speech, Personal Voice and Point & Speak Features for iOS 17 | What You Need To Know!
no link
There is no link.
Paras
You ask what bugs I've been experiencing. The worst is VoiceOver seemingly resetting itself every few seconds or minutes, resulting in focus going to the top of the screen. Imagine what it's like to try to read a news article, for example, but every few seconds focus goes to the top? Then you have to hunt for where you left off? And back to the top you go, before you find it? Or after a few seconds more reading once you did find it?
Now, imagine you're in Safari, going through a Google search page, hunting. Bang! VoiceOver resets. Back to the top you go! Where did you get interrupted? Well, you don't know. You have to hunt for it, and maybe while you're hunting, or maybe once you've found it and you continue hunting, bang! Back to the top you go!
Paras, you have every reason to count yourself fortunate if you do not have this bug.
I agree with Dennis
You always seem to include your cat in everything. If your cat likes you so much, create an account on here for it. You will need it!
thoughts
I didn't expect to see this until later in the week. The new announcements sound kind of interesting, but not enough to make me run out and upgrade to a 14 Pro. The demo use case they showed for the new Point and Speak feature sounds like it will be a lot more complicated to make work reliably. It sounds like you would have to hold your phone in one hand and try to focus it while touching the buttons on the microwave with the other, which does not sound easy given that the Pro iPhones are heavier. My initial enthusiasm dropped after an admittedly brief play with Door Detection last year.
I hope these new features don't end up sherlocking a bunch of other apps, although I don't think that will happen. I usually like using Apple apps, but I find myself going to something like Seeing AI rather than trying to use text detection in the camera app. Options are a good thing!
This forum is just like any other Apple enthusiast forum; half the people want Apple to do nothing but fix bugs and call it iOS 17, but then the other half will complain that iOS has become boring and that they "can't innovate anymore" Phil Schiller's ***. I don't know which side of the fence I fall on, but I do agree that a lot of cleanup needs to happen.
I also agree with an above post; not every new accessibility feature has to be about us. Perhaps that personalized voice thing is that community's equivalent of VoiceOver Activities, Braille Screen Input, or a wider choice of TTS voices.
For Paras & Dennis.
If you are happy with the current state of VO then that's great for you. Maybe you don't have the bugs, maybe you have enough vision that they aren't an inconvenience, and maybe you are more tolerant and patient than some of us. I find the bugs highly annoying, which becomes highly depressing over time, and as I've taken the time to understand what could be going on for you that they don't bother you, perhaps you'd want to try and understand why they might bother some of us. An example beautifully illustrated above is the focus jumping. I have no vision, which means every minute or less I sometimes have to find where I was. Sometimes I don't even get back to where I was meant to be before, boom, it happens again. While typing this message the focus has gone to the preview button at least 8 times, requiring me to find this box again. When I read anything with social media APIs, I can't get past them; embedded video, nope, can't get past it and often can't play it. If the box for GDPR pops up, or any other box that I can't find or didn't hear pop up, the entire site behaves badly until I realise, or stumble across the button to accept cookies. If I try to reply on YouTube, there's no send reply button after I've typed out a well thought through reply. If I want to write or edit a text document, I can't, because the cursor is all over the place and doesn't speak random parts of the text. If I try to buy anything online, some of the controls can't be interacted with. If I answer the phone, VO won't talk until I reset it. How about connecting a Bluetooth speaker and having my ear blown off with VO volume slamming itself to full while I have it quietly next to my ear? What about all the other bugs that I can't even remember, or make everyone suffer through in an overlong comment, that just make me want to not bother using my device that I've paid over £1000 for, in an ecosystem I have spent almost £50k on in the last 15 years?
Yeah, there are problems. If you aren't having them then honestly that's great, but some of us are, and they are affecting our ability to use our devices and subsequently our lives. Wow, that ended up being mostly for Dennis, but Paras, these are some of the bugs we are having to deal with.
Getting around the looping thing with media players on the web.
I find that if you tap a bit lower on the screen it sometimes gets you past those bits, and the same thing works for ads.
I'm not experiencing that many bugs either, just the stopping speech thing really, so I'll see what iOS 17 brings to the table.
If my phone doesn't support it then I won't care about it until I'm forced to upgrade.
Thanks Brad.
Cheers for the advice. I usually do try something like that, but the problem is that at the very least I've lost the thread of what I was listening to and concentrating on, and at worst I just give up, because I often can't make the end match the beginning and don't know how much I've missed. I think if this had happened once, or even 10 times, I'd think very differently to where I am with it now.
Dennis Long
Sometimes when I paste a link, it does not show up. Copy it, open YouTube, paste, and search. It is by 9to5. Also, for Dominic regarding Bella the cat: you can skip it and not read it. I am sure she would not care. I just want Apple to focus on bugs, and all those who are having bad experiences want the same. Same with Braille. I do not use it, but many do. Long live the Apple.
Reading and Writing with VO
The one basic thing we expect a screen reader to do is to read what is on the screen.
Having VO start reading down an article in Apple's own News app and then suddenly and randomly stop at links or other breaks in the text is not only irritating, but should be unacceptable for any screen reader.
But that isn't even as annoying as the bug that occurs when VO just goes off into the weeds for a long time, or until restarted, and there is no feedback at all for the blind user. Is that a screen reader?
Also, note the various threads on AppleVis that talk about problems with simply reading a text document with VO by arrowing up and down lines, selecting/highlighting text, etc. That has never been reliable for me using a Bluetooth keyboard. How can a blind person get a feel for exactly what is on the screen or how it is formatted when VO feedback is erratic at best, and sometimes just wrong?
Yes, it is nice that Apple keeps introducing new accessibility features, but if the bugs in the basic features remain, it makes it not only less enjoyable to use our devices, but less reliable.
Hope some of these bugs can be squashed in iOS 17, but if the past is any indicator, we will continue to have to live with these issues.
I do understand that all modern programs and operating systems have bugs, but at least make the experience for blind users as good and reliable as it is for sighted users.
--Pete
bugs
In my experience, newer devices have fewer bugs. I have not experienced any focus issues in Safari. On an older Mac, VoiceOver would sometimes say "busy," but there have been no major issues on iOS.
To Paras.
Wow, you are very lucky indeed. I am using a 14 Pro Max and Safari crashes almost constantly. Every minute is good going, and each time it does, I lose focus and start back at the top, or even in another random place. The preview button on this page, for example. Perfectly timed: it just happened then, and another twice to get to here. Lol. It's not fun at all.
iOS and Mac bugs.
Can you all give some examples of where you are seeing issues? The name of a webpage and a link would be good. No beta software. Thanks. For me, the experience has been fairly good.
did those having issues contact apple accessibility?
Did you contact apple accessibility and do a screen share?
Did I misunderstand?
I think Personal Voice is the feature that lets us record our voice for the VoiceOver voices, right?
My current understanding of…
My current understanding of personal voice is that the voice you create won't be supported as a VoiceOver voice.
Interesting new features
These are interesting features. A lot of the new features aren't specifically made for the blind, but I'm really happy. When other people with disabilities can move more independently through their world, we are all elevated. Also, some of these bugs people are complaining about aren't bugs, but quirks in the voice or synthesis engine Apple is using. And sometimes the reason VoiceOver jumps focus in certain situations is that the screen refreshed or something was loading.
Just one very decent AI voice will make me happy
I don't know if I'm alone here; all I want from iOS 18 is a really decent AI voice so I can read Kindle books with greater enjoyment. I don't care about wallpapers, I don't care about sticky notes or spinning UI elements or yet another way of achieving something you already do by a currently satisfactory means. I just want to be able to read a Kindle book with an AI voice that is hard to distinguish from a human reader. Oh yes, it would also be nice if Apple fixed so many of those VoiceOver bugs that we've had to put up with in iOS 17.
Personal voice
Fun stuff. With each such announcement, I get more and more excited about WWDC.
To create a personal voice, it sounds like the user must read text displayed on the screen. I wonder how someone totally blind might accomplish this. As a writer who occasionally does public readings of my own work, I find that I must rehearse multiple times, using VoiceOver as a prompter, and still I'm dissatisfied with the results.
Someone above mentioned that Personal Voice would not be usable as a VoiceOver voice. From an engineering perspective, this seems inefficient. Apple doesn't have a special-purpose machine learning API for every AI application, or a special-purpose rendering library for every rendering application. But they seem to have special-purpose text-to-speech solutions for every application that needs to talk. The result is more code, greater memory usage, and inconsistency in features, capabilities, and issues.
@grumpy
Unless Apple adds more Siri voices, do ya see Apple partnering up with ElevenLabs in some way?
An example on the Android side of things, can be found here:
https://accessibleandroid.com/elevenlabs-api-support-in-reader-apps-the-ultimate-reading-experience-can-now-be-possible/
On that note, even Siri voices could use better inflection.
If TruVoice on Windows and DECtalk can do it, why can't Apple's own voices?
Grumpy old techie
What I want is for Apple, on accessibility day, to focus on fixing bugs. Just fix bugs and most of us would be happy. I'd prefer that over a feature that most will not even use. I still have not used any of them yet.
I believe they cited…
I believe they cited security and privacy as the reason Personal Voice hasn't come to other applications, the danger being that someone could clone your voice and get up to all sorts of mischief with it. I'm guessing this could still be done with the current version, but it would be harder.
Great point about the kindle books though. It would, I fear, make audio book narrators a thing of the past though which would suck.
My fear with Apple and AI is that they are going to move too slowly. They've always been behind the curve, choosing to do things well instead of doing things fast, but I'm not sure that playbook is going to fly here. They failed with Siri, and unless they have some really deep integration with their OS, being able to do pretty much everything hands off and eyes free, they're already losing. OpenAI will release their device, and it could be game over for bloated, fiddly OSes.
Personal Voice
It is totally possible for VoiceOver users like me and others posting on here to make a personal voice, and I am one of those who have already created their own.
Have created one as well
I went through the process several times, actually, to get the best recording possible. It is just as someone suggested above: you listen to VoiceOver speak the phrase and then speak it into a microphone. Pretty simple. All you must do is ensure the microphone does not hear what VoiceOver is reading.
Am I correct in thinking you…
Am I correct in thinking you can only use it for very limited tasks and preset phrases, the idea being it is a replacement for those who have lost the ability to speak?
You can have it read out anything you type.
You can use your personal voice during calls and even bring up the keyboard by enabling Live Speech and selecting the newly-added option from the list that appears when you activate the accessibility shortcut.