In this edition of the AppleVis Extra, Dave Nason and Thomas Domville are joined by Sarah Herrlinger, Director of Global Accessibility Policy and Initiatives at Apple; and Dean Hudson, Accessibility Evangelist at Apple. Topics covered in this podcast include a look at some of the new accessibility features coming later this year to Apple's platforms, as well as a broader look at Apple's approach to making their products accessible to as many people as possible.
Full transcript of podcast
Please note: this transcript was created solely for communication access. It is not a certified legal transcript and is not entirely verbatim.
Audio: An AppleVis original.
Dave: Hello and welcome to the AppleVis Extra Podcast. My name is Dave Nason, and I am joined by Mr. Thomas Domville. How are you, Tom?
Thomas: I'm doing great, Dave. It's good to be with you again. This is going to be a fun podcast.
Dave: Yeah, this is one of our more exciting podcasts that we get to do every now and then. We didn't do it last year, but we did it two or three years before that, and they're back. It's Sarah Herrlinger and Dean Hudson from Apple's accessibility team. Great to have them back.
Thomas: I know, right. It's been a couple years, so I can't wait to see what they have to say and offer to us in terms of accessibility for this year. I'm pretty excited.
Dave: Is there anything in particular that stood out for you before we jump into it?
Thomas: I know everybody in the community has been talking about the new voices, especially Eloquence. The voices, I think, are probably the biggest hit out there to date. Would you agree with that?
Dave: Yeah, I think so. I think it's definitely been the biggest news of this year's cycle, so yeah. Let's see what they have to say about that. Should we go ahead and jump into it?
Thomas: Yeah. Let's do it.
Dave: Sarah and Dean, you are so welcome back to the AppleVis Podcast. Thanks for coming.
Dean: Thank you.
Sarah: Well, thank you guys very much. It's wonderful to be here.
Dave: Yeah, it's been two years, I think. We missed last year. So it's great to have you back and talking about everything that's new in the accessibility world with Apple.
Dean: Wow. Two years-
Sarah: Yes. I-
Dean: ... that's very quick.
Sarah: I know. I was just thinking the same thing. It feels like the tumultuous nature of the last two years threw a wrench into everything, down to even doing podcasts. So it's great to be back with you guys.
Dave: Absolutely. And I'm guessing this was probably the first WWDC in a few years where you actually had people there as well.
Sarah: It was. We had a hybrid model this year, but we did kick it off with the keynote and the State of the Union. And some of those major things, like the design awards that happen on day one, were done in a way that allowed more people to be on campus. So it was a great opportunity to reconnect with a lot of developers and share the message of accessibility.
Dave: Yeah. Amazing. And there is lots to talk about in accessibility. Thomas, I think you'd agree with me here that one of the biggest stories since VoiceOver itself launched all those years ago is that Eloquence is now on iOS, along with a bunch of other voices. But-
Thomas: Amazingly, I'm just shocked in some ways that that was the number one thing everybody's talking about compared to all the other new things we have.
Dave: Yeah, absolutely. Can you tell us a bit about how that all came about and-
Sarah: Yeah, it's an interesting one just in general. I mean, certainly we are always looking at including more languages. For a little bit more background, we added 20 additional locales and languages, including everything from Bengali, to Bulgarian, to Catalan, to Ukrainian, which was a big ask as well, and Vietnamese. And that means that we're already up to over 150 voices available for VoiceOver across 60 languages and locales. And when we did this, our goal really was about maximizing the ways that our users can tailor their devices to whatever is their preference. So whether somebody wants to explore some of the other new voices that have been added across many of the different languages, or prefers a voice that they're familiar with, like Eloquence for instance, we just want to make sure that we have lots of opportunity for people to use the device in the way that works best for them.
Dave: Yeah, absolutely. Eloquence, you're right. There are a lot of new voices, and we shouldn't forget that. A few very nice voices, and Zoe is a very popular one. I think that's new, isn't it, Tom?
Thomas: Mm-hmm. We got Zoe, we got Nathan, we got Evan. So we got lots of voices, right? And there are so many new languages that came aboard. So I am curious, what was the process like? I mean, what led to the decision to add these new languages?
Sarah: Well, I think one of the things that we always try and do year over year is expand the population of people who can use Apple products. And so, when you think about that from the new languages perspective, we're always striving to add more languages and make sure that individuals who live in specific parts of the world, or who were raised with a language as a first language, or even a second language that they're learning, have the opportunity to access iOS, iPadOS, macOS, to really get the depth and breadth of what Apple provides in terms of technology.
Dave: Absolutely. And I noticed one of the comments that came up quite a bit on applevis.com, especially around Eloquence, was that this shows Apple is listening to the community. Did that feed into the decision around Eloquence? Was it about seeing people keep asking for this, or was it more of an internal thing?
Sarah: I think I'd almost take that at a broader level, which is to say, we're always trying to listen. For all the feedback that comes in, in every format that it comes, whether it's through the accessibility@apple.com account, or people using our call centers, people who go to retail stores, people who just come up to us at conferences, or whatever it might be, we're always trying to listen and make sure that we expand in ways that support the community and what they want.
And sometimes what they want is the thing they didn't even know they wanted. We're always trying to innovate and do new things, but in that same vein, we're always listening to hear what are the core problems that our technology might solve.
Dave: Absolutely. Dean, can I ask you, what's your voice of choice now with the new suite?
Dean: Well, I grew up on the JAWS side of the world back in the day. And yeah, actually, even earlier, I started with DECtalk, I don't know if you guys remember that speech synthesizer. But anyway, it was a tough choice when I tried out Eloquence, because it's like, "Should I go back?" But no. Mine is Alex. Pretty much mine's either Alex or Samantha. But Eloquence does sound great, it really does.
Dave: Okay.
Thomas: Now I know Dave is probably going to ask this question, but I know a lot of people have been asking. They love the way Alex sounds, and they wonder if we will get any new voices like Alex. I know one of the things they're talking about is a female version of Alex.
Sarah: I guess, I would phrase that in a... We never talk about what isn't live today, but we appreciate getting the feedback. So I know that's fairly enigmatic as we sometimes like to be when we want to surprise and delight, but it's good feedback for us to hear.
Dave: That's fair. Worth asking, isn't it, Thomas?
Thomas: It was. We'll just poke and see what happens.
Dave: Exactly. Another topic that comes up a lot around improvements is braille as well. Is there any news on the braille side lately? Because obviously, we have a lot of users who are very into using their devices with braille displays.
Sarah: Yeah. Certainly, braille is a very critical part of the array of accessibility features that we support in relation to VoiceOver, and something where we're always striving to make progress. So I would say it is a never-ending effort on our part to keep moving forward with braille in everything we do. And a good example of that, I think, in driving into new areas, is actually the new API that we added around support for tactile two-dimensional braille displays. We did this a couple of months ago, so it's actually already live, and there is a product by a company called Dot Inc. that is already using that API. And we are super excited for that. So that, I think, is one of those ways that we're trying to look to the future of braille and make sure that we stay at the forefront of supporting really cool new devices. I don't know whether you've ever had a chance to play around with one of the tactile displays, but they're really great for images, and shapes, and emojis, and being able to get visual content in a way that you wouldn't otherwise be able to if you were just using a single-line display.
Dave: I can imagine there'd be great applications for that.
Sarah: Yeah. I remember when they first came in and we started seeing how it could work with things like emojis, and it is a really neat thing to see how much tactile feedback you can get, and the layering, the pins coming up at different levels, so you really start to get the feel of different elements and images on the screen. It's not a binary up and down. And so, I think it's going to be a really interesting medium to see where it goes.
Dave: Yeah. I'm even thinking like mapping and indoor mapping and things like that. If you wanted to show somebody the layout of a room or a building or something.
Sarah: Yeah. And so, I'm excited to see with that new API where it goes from here, what other products may take advantage of that, and what they'll be able to build.
Dean: Education's a huge one too. Having gone through tons of math courses, just being able to feel what y squared equals x looks like, without someone having to describe, "Well, it looks like an upside down this, imagine this." You could actually feel it.
Dave: Yeah, that sounds really cool.
Thomas: Mm-hmm. That's very interesting.
Dave: And then, people may remember that a couple of months back, on Global Accessibility Awareness Day, you announced some other quite interesting things. One, I think, was it part of GAAD or was it just after? It was around the Unity engine and a program Apple has now in terms of trying to make that accessible. And for people who don't know, Unity is a development engine, and indeed you're better qualified to explain what it is than I am, I think. But it's an engine that developers use to build their apps, primarily games, and it allows you to build a game for multiple platforms. Is that the idea? But typically these kinds of engines haven't worked very well with assistive technology.
Dean: Yeah. It's a very popular platform that people use to develop games. And we actually started looking at this about three, four years ago: what could we do to make this accessible? And so, we did announce at GAAD a little plug-in that you can use, and we have lots of documentation, so go to the site, download it, read through how to get started. But it opens up a lot of resources and accessibility for games. And blind people, people with low vision, like to play games as well. And the nice part is, we can take advantage of all of our other access tools such as Switch Control and AssistiveTouch, as well as some of the haptic stuff that's built in, obviously, along with VoiceOver and Zoom as well. So, yeah, we're very excited, and we're really hoping that this is something that will grab developers and that they'll start using it.
Sarah: I'll give it a quick shout-out. The link is actually github.com/apple/unityplugins. And we are talking to a lot of developers through our developer relations team, making sure that they know that that's there. So anyone who uses Unity as the foundation of their apps, we're getting it on their radar how easy it is to use this plug-in and help make their apps more accessible to so many different users, with different types of ways that they want to play games. As Dean said, this is really built to support a wide range of disability types, so that hopefully it's really going to bust down the doors of gameplay.
Thomas: I hope so. That is fantastic. Is this all done in-house or is that open source?
Dean: It's our initiative.
Thomas: Is it?
Dean: Yeah. So we definitely worked with the Unity folks to make this come together.
Thomas: That's awesome. I can't wait, Dave, for new games to come out with this.
Dave: Oh, yeah. There are even games that are maybe text-based. I'd love to play Football Manager, or Soccer Manager to you guys, and I've never been able to play it since I lost my sight to a certain level. It's largely a text-based game, but I still can't play it because it's built on an engine like this.
Thomas: Mm-hmm.
Dean: Yeah.
Thomas: Well, this will be a new turnaround. Okay. So I got a big one here for you guys. Now, back on Global Accessibility Awareness Day, you also-
Dave: Nobody going here.
Thomas: ... made the announcement... Here we go: Door Detection, which was a big thing. And so, I wondered if you guys could explain that to our listeners, and what equipment you need, like whether you need a LiDAR or non-LiDAR phone to make this work?
Dean: Yeah. So Door Detection is a good example of marrying really unique hardware from Apple with software and machine learning. And as you mentioned, LiDAR exists on the iPhone 12 Pro and iPhone 13 Pro, and that technology is what makes Door Detection a reality for us. But to put it in context, you really have to go back a few years. A few years ago we introduced image descriptions, as well as being able to recognize elements on the screen. So looking at a button and saying, "Okay, it doesn't have a label, but it looks like a button. It must be a button." So there was that, and there was Screen Recognition as well.
Go forward another year or so, when COVID hits, right? Now we're all inside. What do we do? Well, the world doesn't stop. For people who had to get out and buy food, there were tasks that had to get done, and if you were blind, you were extremely vulnerable. Think about grocery store lines in the early days of COVID. You don't want to get too close, and you can't see the gap in the line in front of you. So there were all these things, and getting weird looks when you did get too close. So the community really wanted, and we wanted as blind users, a way that we could venture outside safely. So we put in People Detection, right? That's great.
But then we took it a little bit further. I started noticing, getting rideshares, that sometimes if you got a nice driver, he might direct you to where the door is, or tell you it's over there to the right. But after that door closed, he's gone. You could ask for help in the past, but maybe you don't want to, and in the COVID era, just doing sighted guide could be risky. So, what do we do? How do we solve that? How do we handle the last part of your journey? And that's where we introduced Door Detection.
Door Detection recognizes many types of doors. It also recognizes if the door is closed or if it's open. If it is closed, what type of action causes the door to open? Is it a sliding door? Is it a push door? All things you could probably find out once you got to the door, but it'd be nice to have a clue about what's there with this door. And so, we use both haptics and speech, which you can totally customize to your needs, for things like how far away the door is as you're approaching it. Bigger than that, I think, is we can also detect text and signs on the doors. And I actually did not know this: I've been going to this one store for many years, and I decided to turn on Door Detection, and there was this whole sign about, yes, you must wear masks inside, and blah, blah, blah. I had no idea that was there. I was just going in, no mask. And so, that's where we come full circle, using machine learning combined with hardware to get great experiences like that.
Dave: It's funny. It seems like such a simple idea in some ways, but actually, it's so good. We've all been on a street and we're like, I know I'm pretty much there, I know that the door is very, very close, but I just can't quite find it, and that little bit of help is amazing.
Dean: Well, I haven't traveled out of the States much since COVID, but what I really want to check it out on is resorts. Because it's pretty consistent in the United States that in hotels you'll get braille and tactile information on the door, but not so in other countries; it's hit and miss. And if you're staying at a resort and trying to find your room, if they don't have anything tactile, you'll resort to things like counting the doors to the end of the hallway, and that can be weird.
Dave: Absolutely.
Thomas: Now I'm curious. You're absolutely right, Dean. I like that aspect. So the room numbers are on there, so it'd be able to detect that, read it, and tell you the room number, and that's the door. So, what would happen with... Let's just say iPhone 11 and below that don't have LiDAR? Would they have the same experience other than the distance?
Dean: No, they would not have the same experience. It does require a specific hardware with LiDAR.
Thomas: Okay. So the 12, the 13, and the new 14 coming this year will support that.
Dean: Yeah. The Pros, right.
Dave: Okay.
Dean: The Pro and the Pro Max.
Thomas: Yes. Yeah.
Sarah: And worth noting, it's not just those iPhone models. It's also the Pro models of iPad that support this. But it is something where we're really trying to explore the limits of what our hardware can do. And so, we're utilizing those features like LiDAR that are often seen as being built for the mainstream but have such great applicability when you think out of the box and can come up with features like this.
Dave: Definitely. And related to that, there are improvements to Maps for visually impaired users as well, I believe?
Dean: Yeah. This one is... like with walking directions, and you guys have probably tried this, you'll say start, and it'll say proceed to the starting location. And I don't know where that is. So now we provide sound feedback and the direction of where you have to go to get to the starting location. So that's also a really cool feature. It really rounds out the trip: getting to the starting position, getting there, finding the door. So it's a really good complementary feature.
Dave: Definitely. I think that's always been a difficulty a lot of people have had with mapping and navigation apps. That starting point, it's like there's an arrow on the screen pointing which way to start.
Dean: Right. Right.
Dave: There you go, amazing.
Thomas: Yeah. That feature is amazing. It's amazing how something so small, that can be so automatic for those with sight, but for us, trying to figure out the starting point, it's like, "Oh, this is amazing. I love that."
Dean: Yeah. And it's very useful. It tells you which street you're facing, all those things that you have to know.
Thomas: Nice.
Dave: Definitely, yeah. And shouldn't we give the Mac a bit of attention as well? There are some improvements to text editing, I believe. It gives you information about the text, is that right? I've forgotten the detail.
Dean: Yeah. Yeah. So this is about blind VoiceOver Mac users being able to take a little bit more control of our work, our content. And so, it will identify things like double spacing, double capitalization, that type of stuff in the text detection. Also, what's cool is we can identify the number of tabs or the indentation at the beginning of a line, and that is extremely useful for coders. So all of this has been added to macOS. It works in just about every place you can imagine: Pages, TextEdit, Mail. Notes is another one it works in. And it's a really cool feature.
I turned it on one time and instantly caught some double spacing that I had no idea was there, or the beginning of a line that was supposed to start with a capital letter, but I had a space in there. It would've taken me forever to find that out otherwise. So I think it's a great step going forward. And I'll add to that that there are some improvements for things like large documents and PDFs that you guys might notice, and our work isn't done there. PDFs and the web can be the wild wild west, but you should notice some very good improvements when reading long documents or filling out PDF forms. So that's another area where we need to make some improvements, and we'll continue to grow in that area.
Dave: Yeah-
Thomas: Nice.
Dave: ... that's great. I know Alex on our team is very excited about it because he does coding. But I think even for just general professional work, like you say, Word documents, or Pages documents, or emails, it's good just to know that you've got that extra layer of detection for those little details.
Dean: Yeah, yeah.
Sarah: Yeah. Just as an additional little thing on where to find it, it actually is in the Keyboard Commander within the VoiceOver Utility settings on the Mac. So for anybody, when you download Ventura, that's where you'll find it.
Dave: Cool. So that's a lot of stuff that's coming this autumn in iOS 16 and macOS Ventura. But of course, there was another improvement earlier this year for people who have iOS 15, which is still the majority of people, and who are maybe running tvOS 15 as well, and that's in Fitness. When Apple Fitness launched, people asked the question, "Can I use it as someone who can't see the screen?" And you've made a big leap forward with that this year.
Dean: We did. We did. And we want to continue to do more, but we added audio hints, because when you're trying to follow along during an exercise or a dance, it's really tough for you to know what's going on on the screen. So we've added VoiceOver hints to describe what pose the person is in and where their right hand is, whether it's raised or not. And what's cool about these hints is you can adjust the speed. So if it's too fast and you really need to slow it down, because this is the first time you've tried this exercise, you can do that. So it's very customizable, and we want to continue doing more in that space, because we believe fitness is something that's very important to everyone, and everyone should be able to have access to it.
Dave: Yeah, it's cool. I tried it a couple of times with cycling ones, which-
Dean: Hmm.
Dave: ... yeah. You just get that little bit of extra, you know what I mean? Because the instructor is probably giving you a bit of information visually through her gestures or whatever, and you're just getting that as an audio hint. Have you tried it, Thomas?
Thomas: I have not. I need to. I am curious if there's anything else, other features that you want to talk about?
Sarah: Well, one other one... No, no. That's a good question. We have so many things. One other one that I would say is more relevant for your listeners who fall into the low vision category, but within Books, we are adding in some new ways to customize the visuals of books for one's own personal needs. So for example, being able to change things like the letter, word, and line spacing, so that you can choose to have, for example, fewer words per line and more space between them, which in some cases can be valuable for low vision. It's also great for individuals with different types of reading challenges. So we're trying to find ways, as always, to support a spectrum of vision loss and make tools that are helpful for different people, wherever they are in that process.
Thomas: How about you Dean?
Dean: I will clarify one thing that I heard. If you have not tried Door Detection, I encourage you to try it. You might not realize where it is, but it's in the Magnifier app. In the Magnifier app you can click on the button for detection mode, and there you will find People Detection, Door Detection, even image descriptions. So I just want to point that out. I can't think of anything else that we haven't talked about. All of this stuff, like you said, sounds basic when you speak it, but when you actually get in and use it, it just makes a huge, huge difference. So that's about all I can really say.
Thomas: Now, Sarah-
Sarah: I got one.
Thomas: Yeah.
Sarah: Hold on.
Thomas: Okay.
Sarah: I'm just thinking about another thing that could be relevant for people who use braille, which is one of the other things that we are adding this year in beta in the fall: Live Captions.
Thomas: Oh, definitely.
Dave: Yeah.
Sarah: Yeah. And so, that will be supported for users of braille displays as well. And I think that will be a huge benefit for our users who rely on braille and in particular members of the deaf-blind community.
Dave: Definitely. Have you seen a big leap forward with machine learning and things like that in terms of accuracy around live captions? Because I'm sure live captioning is not an easy thing to do.
Sarah: Well, it certainly is something that we've put a lot of time and energy into developing, and trying to think about it from an accessibility standpoint. The fact that we have multiple products and the ecosystem that we do means having captioning that works across the whole platform, so system-wide captions in iOS, in iPadOS, and on the Mac. And there are a lot of elements that come together, and machine learning is a huge, huge piece of it. But we're excited to get it out there, and we're excited to get feedback. So we hope people will give it to us.
Thomas: That's exciting. I think that's just really huge. Live captions in FaceTime, everything. I think that's just going to make a world of difference. So I understand there are also new Magnifier activities for those with low vision. I think that's going to be pretty big, as VoiceOver activities were for us blind users.
Dean: Yes. Yeah. Activities have been really, really popular, and this year we've added them to the Magnifier. And so, we realized that, again, with customization, someone who's low vision wants things tailored to their functional sight. And so, you may want a different type of contrast when using Mail versus a different type of magnification when using Books. And so, giving someone activities where they can customize the level and intensity of magnification based on what they're doing just continues the theme of making accessibility customizable.
Dave: Yeah, I think you're right. That's great. Like you said, we have it for VoiceOver, so that same concept makes perfect sense for the low vision folks as well. That's brilliant. Guys, I'm conscious you've been very, very generous with your time, so thank you so much. One thing to maybe finish on: we've all mentioned feedback a little bit, and Sarah, you said you love to get feedback. So do you want to remind people how they can give feedback to Apple, and maybe as well, how they can give feedback to developers, or where they can send developers if they're trying to advocate as well?
Sarah: Yeah, absolutely. And yes, we do love feedback. So there are a number of different ways to do so. One thing that I want to point out upfront is that the public betas for iOS 16, iPadOS 16, and Ventura for the Mac are all out now. So for individuals who choose to download those betas and provide feedback, we thank you very much for taking the time to do so and helping us make those betas better.
But even just day-to-day, we have the accessibility@apple.com email address. We get a lot of feedback through that, and we love it. We want to have a dialogue with the community. We want to know where you have questions, where you find things that... If you find a bug, tell us. We also have our call centers; we have specific accessibility call lines. So please feel free to reach out to us there as well. And then, if you are a beta user, there are feedback opportunities built directly into those betas to communicate with us. So let us know what you're thinking.
Dave: Fantastic. Yeah. I would definitely second that, and you would too, Thomas. If you're running those betas, make sure you are putting every piece of feedback you can in there, because it all gets to the right people. So great to have you guys on. Thank you so much again. Yeah. We really appreciate your time.
Thomas: Thank you guys. Take care.
Dean: [inaudible].
Thomas: Well, thank you very much.
Audio: This AppleVis Podcast has been brought to you by the community of applevis.com. For the latest news, resources, and tips and tricks to get the best experience from your Apple device, visit www.applevis.com.
Comments
Thank you for the transcript!
You guys did a great job on the transcript. I have nothing against listening to a podcast, but I find the transcript easier to read, and it takes up less time.
More Transcripts Please
Hi all. I just got through reading the text transcript for this episode, and am wondering if these could be made available for everything on here? Very nice job, and in general I enjoyed this one as well. So great to have Sarah and Dean back to talk about everything new and awesome in Apple's accessibility department.
They didn't give us more context on Eloquence
Hey guys, hope you are fantastic. I just felt kind of sad that Sarah and the other guy seemed more biased towards the other voices than Eloquence, because they wanted to promote them. I'm really happy that other languages were added, but I was also sad that Eloquence did not get enough attention.