In this edition of the AppleVis Extra, Dave Nason and Thomas Domville speak with Sarah Herrlinger, Director of Global Accessibility Policy and Initiatives at Apple, and her colleague Dean Hudson, who is also a VoiceOver and braille user himself.
On 22 June, Apple’s WWDC keynote highlighted the key changes and updates coming in iOS 14, iPadOS 14, watchOS 7 and macOS Big Sur, from a largely mainstream perspective.
This conversation with Sarah and Dean gives us the opportunity to learn much more about what’s new in the area of accessibility.
Full Transcript of Podcast
Please note: this transcript was created solely for communication access. It is not a certified legal transcript and is not entirely verbatim.
[Music]
Announcer: An AppleVis original.
[Transition sound]
Sarah Herrlinger: Part of our job is to work on making sure that everything that everyone makes at Apple is accessible. It's also about making sure that there are some really fun things to surprise and delight as many of the communities that we support as we can each year, and finding new ways and new communities as we move along. I think this year, hopefully, is going to be another fantastic year.
[Transition sound]
Dean Hudson: On the technical side of things, when you're buried in the year just sorting out details and what features will do what, you kind of get lost in the trees. Coming out for Dub Dub DC, you look back and say, "Wow, these are some great features." You sort of put things in a broader perspective.
[Transition sound]
Sarah Herrlinger: Our end goal is always to make sure that whether it is a young student being asked, "What do you want to be when you grow up," and saying, "I want to be an engineer," or someone who is already an adult looking at going into development, we want to make sure that there's a next generation of developers out there that are members of the blind community.
[Music]
Dave Nason: Hello and welcome to AppleVis Extra. This is episode number 75, and we are still in WWDC week. This is our third annual interview with Sarah Herrlinger, and again she is joined by Dean Hudson. My name is Dave Nason, and I'm joined in the hot seat by my co-host Thomas Domville. Thomas, how are you today?
Thomas Domville: I'm doing great. What a week, boy. This has been a busy week, but we got lots of cool stuff to cover, for sure.
Dave Nason: It really has, yeah. We did a pretty long podcast on Monday night after the keynote, but it was largely focused on what we'd seen in the keynote, the mainstream stuff, so hopefully today we'll get a bit more information about what's changed for us in the accessibility space.
Thomas Domville: Absolutely.
[Transition sound]
Dave Nason: Sarah and Dean, thank you so much for joining us again. It's great to have you back.
Sarah Herrlinger: Well, it's so wonderful to be here. We always enjoy being able to spend some time with you both.
Dean Hudson: Yeah, it's fantastic.
Dave Nason: I know. I hope you're both keeping well in these strange times.
Sarah Herrlinger: Definitely. It has been some interesting months we've been going through and I hope everyone out there is safe and well, but we are still making sure that everything keeps moving forward here at Apple.
Dave Nason: What a week it's been, I'd say. It's always a hectic week at WWDC, I'm sure, and this year was kind of different. How has it looked for you guys in comparison to previous years?
Sarah Herrlinger: Well, we're really excited. I think it's another great year. The team has really been firing on all cylinders, and even working remotely, people have been able to create some really amazing new features, and that's always our goal. As much as part of our job is to work on making sure that everything that everyone makes at Apple is accessible, it's also about making sure that there are some really fun things to surprise and delight as many of the communities that we support as we can each year, and finding new ways and new communities as we move along. I think this year, hopefully, is going to be another fantastic year of that as well.
Dave Nason: Yeah, cool. Actually, I've heard people commenting that it's felt like a more inclusive WWDC this year, because everybody's in the same position. Everybody's joining remotely because there's nobody on site, so it's actually a very inclusive WWDC, I think.
Sarah Herrlinger: Yeah, I definitely agree. I think it's been a new world for us, but with things like the keynote, I think they really worked hard to try and make it a really interesting, fun experience for everyone in remote locations.
Dave Nason: It was a jam-packed keynote, I think it's fair to say. I suppose from an accessibility point of view, what are the big-ticket items for you guys this year?
Sarah Herrlinger: Yeah. There are a lot of different areas. I mean, once again we're trying to hit a number of different groups, and I want to have Dean kick it off, I think, and talk a little bit about the work that the team has done on VoiceOver and some other features for blind and low vision users.
Dean Hudson: One of the interesting things is, when you're on the technical side and buried in the year just sorting out details and what features will do what, you kind of get lost in the trees. Coming out for Dub Dub DC, you look back and think, "Wow, these are some great features." You sort of put things in a broader perspective. But VoiceOver, well, we've had a pretty big step this year.
Dean Hudson: We've been working on utilizing the machine learning and neural network technology in the new hardware, and trying to figure out how we can make VoiceOver take advantage of that. We started last year, I don't know if you guys remember, with automatic button labels. That was a technology where we used some heuristics to determine if a developer did not put labels on buttons, whether they forgot or whatever. We would try to auto detect that and would say, "possibly play button" or "possibly order button."
Dean Hudson: This year we took it quite a bit further. Now we can use that technology to auto identify elements. If a developer did not put an attribute trait on a button or a table or a slider or a scroll area, then we would try to detect what that is through what we call screen recognition. That had a lot of debate back and forth, because, well, then that means developers don't have to do much work.
Dean Hudson: But that's not at all what it's for. It's really for situations where a developer could have very good intentions but just forgot to label something, or forgot to identify an attribute, or just didn't know how to make something accessible and therefore wasn't able to do it. We didn't want our users to have to experience that.
Dean Hudson: I'm sure you guys know this from certain apps that you use all the time. I use the grocery apps to order food, or the Lyft app to get a ride someplace. It's so annoying if they come out with an update and you go and put all these items in your cart and you're ready to check out, but you can't get to the order button or checkout button. That can be a very frustrating experience.
Dean Hudson: We didn't want our users to have to experience that, so we will try to identify that stuff for you. But we're definitely still encouraging our third party developers to make their apps accessible, and we think this technology will further that. When they turn this on with VoiceOver and go through their app, and we encourage developers to do this, and they navigate over an item and it says "possibly slider," well, the engineer's going to say, wait, that's not possibly, that is a slider, so let me go into my code and fix it so that it says what it needs to say.
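To make the fix Dean describes concrete, here is a minimal UIKit sketch of what labeling a button involves; the checkout button and its strings are illustrative, not taken from any app discussed in the interview.

```swift
import UIKit

// A hypothetical image-only button -- the kind of control screen
// recognition would otherwise have to guess at ("possibly order button").
let checkoutButton = UIButton(type: .custom)
checkoutButton.setImage(UIImage(named: "cart"), for: .normal)

// Tell VoiceOver what the control is for.
checkoutButton.accessibilityLabel = "Check out"
checkoutButton.accessibilityHint = "Places your order"
// UIButton already carries the .button trait; a plain UIView standing in
// for a button would also need: view.accessibilityTraits = .button
```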
Dean Hudson: It offers our users a layer of protection so that they don't have to run across mistakes. We're talking software, and with software there are going to be bugs; that's just the reality of software.
Dean Hudson: Another thing we did is better image descriptions. Currently, if you navigate over an image of a couple eating dinner, we might say "two people, table with plates," but now we'll give a more contextual description of the image: "two people sitting at a table enjoying a meal," something like that. Because our eyes see something and we get the image, but it's our brains that interpret that into context. That's one area: VoiceOver recognition, with element recognition and better image descriptions.
Dean Hudson: Another huge area that we're extremely proud of and very, very hopeful about is software development. We have done a ton of work in making the whole Xcode experience not just better, but allowing our users to be proficient when using Xcode. We've included a bunch of stuff in the rotor that allows you to easily navigate through huge chunks of code. As a developer, one of the things you have to do is navigate other people's code and your own code very quickly. You can't just go line by line; you've got to jump to different sections. That whole editing experience is much, much improved.
Dean Hudson: The other area is Swift Playgrounds coding, making the live previews accessible. You can type in a chunk of code, and then on the right side of the screen you can navigate and actually see your UI live, which just wasn't there before; that's all accessible now. Things like code completion and navigating breakpoints, all those things are crucial when you're trying to do real software development. We really think this is going to be a huge opportunity for people who are blind, whether they just want to learn something about coding or they actually want to be a software engineer. This is going to be great for them, so we're really excited about that.
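As a rough illustration of the kind of code a learner might type into a playground and then explore in the now-accessible live preview, here is a small SwiftUI sketch; OrderView and its labels are hypothetical, not from the interview.

```swift
import SwiftUI

// A small, self-contained view of the sort one might inspect in a
// live preview with VoiceOver.
struct OrderView: View {
    @State private var quantity = 1

    var body: some View {
        VStack(spacing: 16) {
            Stepper("Quantity: \(quantity)", value: $quantity, in: 1...10)
            Button("Place order") {
                print("Ordered \(quantity) item(s)")
            }
            // SwiftUI controls come with sensible accessibility by default,
            // but you can refine what VoiceOver announces:
            .accessibilityHint("Submits your order")
        }
        .padding()
    }
}
```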
Dean Hudson: Some smaller items, but huge: we've made some big braille improvements for both desktop and iOS. Auto panning for braille displays is great for fast braille readers. I have it on; I set mine to, I don't know, every three or four or five minutes, it depends, because it takes me forever to read braille.
Dave Nason: I know that pain.
Dean Hudson: But some users are really fast, so we wanted to bring that experience to Apple's products. Then some other improvements for things like the languages. We added a bunch of languages last year, I think we're up to more than 82, but now we include it in the rotor. If you quickly need to switch languages, I should say braille tables, on the fly, you can do that with the rotor. That's a really cool improvement.
Dave Nason: That's cool.
Dean Hudson: Yeah, and sort of the last thing is we've now included accessibility settings in the Watch setup. If you need large fonts, speech, all of that is now included when you go to set up your Watch. Then we are bringing the rotor to the Watch as well.
Dave Nason: Very welcome.
Dean Hudson: Yeah, thanks. There are some really cool things there.
Dave Nason: Yes, so there's quite a lot there for accessibility this year, isn't there. I think myself and Thomas were talking just beforehand about the whole recognition thing. It's got to be huge, because as much as we were saying we want developers to do the work and to make their apps accessible, to even have a level of accessibility in apps that were previously completely unusable to us is a huge leap forward, isn't it?
Dean Hudson: Yeah. I mean, like I said, I've personally experienced this, and it's just nice to know that, okay, there's a backup plan; I can order my groceries.
Dave Nason: Like you were saying, to that last point, you've got the whole fast food order or whatever it might be put together, and then there's no order button.
Thomas Domville: We were talking about how we were simply amazed by the new text and image recognition as an improvement, mostly in our photos and App Store photos we were never able to see before. Those were like, oh my gosh, this is really clear and understandable.
Thomas Domville: I've got a question about the screen recognition, if you don't mind. Are these the missing elements that we typically find, where there would just be clicks and nothing speaking to us? Will it actually just kind of recognize what's on the screen, such as elements like you mentioned, the sliders and things that were not spoken to us before? Or is it the elements we can already see that VoiceOver will describe a little better?
Dean Hudson: It's both. When you swipe to some item and it clicks and doesn't say anything, that means there's an element there that the developer did not promote in accessibility. They didn't say, this needs an accessibility trait, and if they don't provide that, then VoiceOver will not see anything there. So this does both. It sees that, hey, there's a button here but AX, the accessibility system, doesn't know about it, so we're going to pull it up. We think this is a button, or we think this is a slider, and so we're going to tell you that it's a slider.
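For readers who build apps, here is a minimal sketch of what "promoting" an element to accessibility looks like in UIKit; VolumeSlider is a hypothetical custom control, not something from the interview.

```swift
import UIKit

// Without isAccessibilityElement, VoiceOver has nothing to speak here --
// the "click with no speech" Dean describes.
final class VolumeSlider: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true       // promote the view to VoiceOver
        accessibilityTraits = .adjustable   // announce and behave like a slider
        accessibilityLabel = "Volume"
        accessibilityValue = "50 percent"
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}
```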
Thomas Domville: Wow, that's huge. I can't begin to tell you how much progress that is. Like you mentioned before, we just kind of touched on this last year; it was kind of an emphasis then, and now it's starting to grow. Being able to do this is just absolutely fantastic.
Thomas Domville: I really love the new Xcode accessibility options you mentioned, because that was kind of a big thing we talked about last year, making it more and more accessible for those that want to code and program with it. You mentioned the rotor, and I'm guessing that's for the Mac; you said you added some new rotor items in there for Xcode?
Dean Hudson: Yeah, these are specific to Xcode. Jump to the next member function; there's a bunch of them, jump to, I can't remember offhand, but next scope. A lot of times, especially if you're working with a new API, you need to look at the header files and see the functions that are in there and how they're supposed to work. It's really useful for jumping around code, getting to code quickly versus going line by line.
Dean Hudson: Like, I did a project, some coding stuff in C++, not even Objective-C or Swift, and man, it was like working with code that was probably written in the nineties. It was pretty tedious back then. But today, that experience would be a lot better if I had had these improvements in Xcode. We're really excited; we've gotten a lot of requests from people who just want to dip their toe in the pool and try out Swift and see what they can do. Everyone can learn to code, and now we can offer them a path to do that.
Thomas Domville: Are you going to come out with your own apps now?
Dean Hudson: I'm working on it, man. I'm writing my own OS.
Thomas Domville: I love that.
Dave Nason: About the screen recognition, is that available on particular devices? Because I know iOS 14 goes right back to the iPhone 6s, but does that particular feature...
Dean Hudson: The recognition starts from the XS up, so the 2018 and 2019 models, and obviously 2020.
Dave Nason: Yeah, so that's obviously the very impressive machine learning capabilities of those devices I guess.
Dean Hudson: Yeah.
Thomas Domville: I'm really excited to hear about the Apple Watch; you've got some big improvements there. You mentioned being able to set up the Apple Watch solely by itself, so it's like the iPhone, having VoiceOver baked in from the get-go in the setup, and introducing the rotor. It'll be exciting to see what's included, and that's a nice feature to have.
Dean Hudson: You guys might've seen David Woodbridge did a video; we're also bringing braille support to the watch.
Thomas Domville: Tell us about that.
Dean Hudson: Because the watch is starting to become more autonomous, you can now make phone calls independently of the phone. We realized that some folks who are deaf-blind may want to use their watch to do certain things, like call 911 if they fall, or monitor their heart. There are things that are independent on the watch that are not on the phone. We wanted people who couldn't hear or couldn't see, so the deaf-blind community, to be able to take advantage of those functions.
Sarah Herrlinger: It's a great extension of how, year over year, we just try and look at what's an area that we haven't covered yet and make sure that we cover that off. With braille being available on so many of our other products, this was a natural extension to move into this area.
Thomas Domville: Definitely. Among the braille improvements, you mentioned the braille panning and being able to use a braille display with the Apple Watch. Are there other braille improvements?
Dean Hudson: Other than a lot of performance things and just bug fixes, I don't know if you guys use the onscreen braille for iPhone?
Thomas Domville: Yes, the BSI, Braille Screen Input?
Dean Hudson: Yeah. That's been improved, with a lot of bug fixes and a lot of performance fixes in there. Hopefully the experience will be even better when using that feature. And braille input, that's always a big one. Having separate tables for input and output, we've now introduced that to both iOS and the desktop. Those are the main ones, I think. The other one, I don't know if you've had a chance to play with, is the math input stuff for braille in Pages. That's another big area. We're constantly iterating on feedback we get from some really, really top notch braille users, and that's what we've always tried to improve on.
Sarah Herrlinger: Really quickly, on the Xcode work that we've done. We now know that Dean in his off hours is building his own operating system, but we definitely encourage others to really jump into this as well, and more importantly to give us feedback. I think our end goal is always to make sure that whether it is a young student being asked, "What do you want to be when you grow up," and saying, "I want to be an engineer," or someone who is already an adult looking at going into development.
Sarah Herrlinger: We want to make sure that there's a next generation of developers out there that are members of the blind community, both from the perspective of employment in the community, but also from what that type of diversity, coming into any company in the world, can do to help anyone make a better app.
Sarah Herrlinger: We want to make sure that the work we're doing is valuable and works well for the community. If there are ways that we can continue to improve that, then let us know. I just want to make sure we have that plug in there that we want this to be something that turns into another generation or a new generation of great coders out there.
Dean Hudson: With any sort of job you get today, it's just one more thing: if you have any kind of coding experience, it's just going to help you. I know with my kids, they do some small coding-related activities, but it's very helpful to have in today's workforce. You don't have to be an engineer, but at least have some knowledge of how it works.
Dave Nason: It's great, because it is a topic that pops up on the website fairly often; you see an Xcode-related thread on AppleVis.com, so there's definitely a community out there wanting to do this. Again, I love that kind of acknowledgement that we're not just using devices to consume; we're using devices like everyone else in society to work, or wanting to work. You know what I mean? It's an acknowledgement of that as well.
Sarah Herrlinger: Absolutely. Beyond VoiceOver, one of the other features that's kind of come up as a great sleeper hit this year is a feature we have called Back Tap. The idea behind Back Tap is that by literally just doing a double tap or a triple tap on the back of the phone, you can trigger an action. This one is iPhone only, and it also falls into that iPhone X and beyond arena. I think in both of these cases there are hardware needs in order to make things like VoiceOver recognition and Back Tap work; that's why these become X and beyond.
Sarah Herrlinger: It triggers some sort of action, and you can choose from a lot of different things that are out there. It can be anything from turning VoiceOver or any of the other accessibility features on and off, to accessing the accessibility shortcut if you use more than one feature. Or you can do a lot of mainstream, general-use things. I have mine set up so that a double tap takes a screenshot and a triple tap locks the phone. But if you were a Switch Control user, you could do Switch Control actions, and as a VoiceOver user you can do different types of gestures and such.
Sarah Herrlinger: Dean, I know you have yours set up differently than mine. What are you using these days?
Dean Hudson: Yeah, I might change it, but I have it so double tap brings up the notifications and then triple tap brings up the control panel.
Dave Nason: That's the same as what I'm looking at doing as well, Dean. I think it's because it can be tricky, if you're using the iPhone one-handed, to reach up to the top of the phone and do those gestures. This gives me an alternative method to...
Dean Hudson: Exactly.
Dave Nason: ...to get those, yeah. I think that's great. It's such a small thing and such a big thing at the same time, the Back Tap feature it's great.
Sarah Herrlinger: Yeah, it's been fun to see how it's been picked up so much in articles across the board. We actually had a developer the other day who posted a video about setting up Back Tap to play a Rickroll video. That's another thing: you can actually use your Siri Shortcuts, and if you create a shortcut to do something, you could set that up as well.
Sarah Herrlinger: You could take a complex workflow and turn it into a single gesture of a double tap or a triple tap.
Dave Nason: Yeah, that's so good. Actually, that reminds me, I think we've spoken in the past about how accessibility is customization, you know what I mean, and this is a great example of that. It's in accessibility, but there's probably barely a person out there who couldn't use this or find it useful. It's customization for everybody, and it is an accessibility feature.
Dean Hudson: The tip is, if you want to find the important features of iOS, just look in Accessibility.
Sarah Herrlinger: Definitely. I think our real goal is allowing you to use your device in the way that works best for you. Really, that element of customization is huge for us. We find there are accessibility features that are built for one community but have great applicability across the board, and get used for productivity by other groups. We see all kinds of things like that happen with accessibility features. As Dean said, it's really, I think, the heart and soul of what we build, and it's really just about making sure that you get the most out of our devices.
Thomas Domville: It's nice to see a feature like that in accessibility that can also be used by people like yourself, Sarah, who are sighted. It goes both ways. Which is similar to the captions that you can now add to photos: that's more of a mainstream feature, but it really is a big deal for accessibility folks as well.
Sarah Herrlinger: I think there's a lot of really cool mainstream stuff that certainly does have accessibility implications and can be great for other communities too.
Dave Nason: You also, I think, wanted to talk about AirPods and the increased importance that they're having for people.
Sarah Herrlinger: There's a number of things we've done this year in support of, I guess, quote unquote, hearing, meaning everything from supporting those who are all the way in the deaf community to those who just want to get more out of using our products. Regardless of who you are, you just want to have better usage of our technology, and some of those things fall into the realm of the AirPods.
Sarah Herrlinger: There's a couple of different things that we've done this year in that area. One of them is something called Headphone Accommodations. It's a new accessibility setting designed to adjust certain frequencies, which can either amplify or dampen particular sounds, so that you can better tune audio for each individual's unique hearing needs.
Sarah Herrlinger: This can be applicable for music, movies, phone calls, FaceTime calls, podcasts, whatever it is you use your device for to get sound. The way that it works is that in the accessibility settings, you'll find something called Custom Audio Setup. Within that, you can do two things. One, if you have your own personal audiogram, you could put that in there and incorporate it. But you can also go through a series of little tests that ask, do you hear this better? Do you hear that better? Once you've gone through that process, you can create up to nine unique profiles based on your personal sound preferences, which are really three amplification tunings with three varying strengths.
Sarah Herrlinger: From that, by choosing which one you want, you can get your music and movies and phone calls and such played in a way that is better set up for your personal hearing needs. That is available on AirPods Pro, the second generation of AirPods, some of the Beats headphones, and EarPods. But for AirPods Pro, there's also an additional little feature which is great. With the AirPods Pro, when you're using them, you can move between noise canceling and transparency.
Sarah Herrlinger: So you can have it either way: I want no outside sound and I just want to immerse myself in whatever audio is coming through the device, or, I want to know what's going on around me. I have the music playing or VoiceOver playing or whatever it might be, but I want to make sure that I have some contextual awareness around me. With Headphone Accommodations, the way that you have set up your custom audio is now available in Transparency Mode.
Sarah Herrlinger: With that, quiet voices can be more audible and outside environmental sounds can be more detailed. If you, for example, are a VoiceOver user who is out in the world and you want a little bit more contextual awareness of your surroundings, it's tuned a little more to you than it would be if you just had Transparency Mode on in its baseline setting.
Dave Nason: As someone who listens to a lot of podcasts while walking, I can definitely see myself there.
Thomas Domville: I've turned this on and it works quite well. I mean, it's much better than the default. You'll be amazed at how much context you are given with this on.
Sarah Herrlinger: Another great feature that we've developed in support of the deaf and, in some cases, deaf-blind community, though I think there's applicability in some ways for the blind community too, is a feature called Sound Recognition.
Sarah Herrlinger: So much of the world is sound based: we have all these alerts and alarms and notifications and things going on around us that are not always accessible to members of the deaf or deaf-blind communities. We added a feature called Sound Recognition, which will give a visual alert and notification, and obviously works with VoiceOver, to a user on an iPhone or iPad.
Sarah Herrlinger: It gives you that alert when a particular sound is detected. Imagine there's a smoke alarm going off, or the doorbell chimes; you would get a visual notification that says there has been an alert, it could be a doorbell. We're looking at realms around human sounds, shouting, or a dog barking, things like that. Even for someone in the blind community who hears something but might wonder what it is, having the ability to get a VoiceOver announcement or an alert telling you it seems like that might be X can be beneficial. But certainly for the deaf community and the deaf-blind community, it gives them much more information and awareness of what's going on around them.
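The Sound Recognition feature itself is a system setting rather than a public API, but for developers curious about the general shape of on-device sound classification, Apple's SoundAnalysis framework illustrates the approach. This is a rough sketch under assumptions: DoorbellClassifier stands in for a hypothetical Core ML sound classifier you would supply yourself, and it is not something Apple ships.

```swift
import AVFoundation
import CoreML
import SoundAnalysis

// Streams microphone audio into a sound-classification request and
// reports high-confidence matches. DoorbellClassifier is hypothetical.
final class SoundListener: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)
        self.analyzer = analyzer

        // Wrap the (hypothetical) classifier in a classification request.
        let model = try DoorbellClassifier(configuration: MLModelConfiguration()).model
        try analyzer.add(SNClassifySoundRequest(mlModel: model), withObserver: self)

        // Feed microphone buffers into the analyzer as they arrive.
        input.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
            analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
        }
        try engine.start()
    }

    // Called whenever the request produces a classification result.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.8 else { return }
        print("Heard: \(top.identifier)")  // e.g. post a local notification here
    }
}
```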
Dave Nason: That's going to be huge for things like fire alarms and smoke alarms.
Dean Hudson: Babies crying.
Dave Nason: If you're asleep, you're not going to see the flashing light on your phone, so having an actual notification...
Thomas Domville: Actually, for myself, I kind of get tuned out with my headphones on, and with this item on, it was quite nice when the doorbell rang and I got the notification and heard it. It was nice to see that.
Sarah Herrlinger: We're really excited at the potential of this one to really just make things a lot easier for anyone and everyone to be able to better interact with the audible things around them.
Sarah Herrlinger: Another thing that I think folks might be interested in hearing about is the fact that we have done a pretty significant upgrade to Magnifier. We know Magnifier is one of our most popular and beloved accessibility features, and it's an invaluable tool really for everyone.
Sarah Herrlinger: I mean, certainly for those who have some level of vision loss. I am an eyeglass wearer, and I feel like I am fully reliant on Magnifier in my life, even with my glasses on, because there's so much tiny print out in the world that I can't get. We know that it gets used by everyone. We did some work this year on a number of different fronts. One of them is redoing the UI, making it so that you have a little bit more control and flexibility in how much real estate the Magnifier controls take up on your screen as you're using it, but also, based on your preferences, what shows up. You can choose what the primary controls on the screen are when you're using Magnifier.
Sarah Herrlinger: For example, if the way that you use it, zoom is the most important part for you, that can be the principal control on the screen. But if, for example, you always go to the same level of zoom but you need to play around with the brightness in order to figure out what's going on, you can make that kind of change, so that that control ends up being the primary one for you.
Sarah Herrlinger: Just in terms of how it sits on the screen, there's some work that's been done there to make it a better user experience. But you can also magnify more of the area that you're pointing to, as well as being able to capture multi-shot freeze frames. That freeze frame element has always been a huge, huge feature that people have loved.
Sarah Herrlinger: Being able to just capture an image and then zoom in and out on it; with this, you can work with those individual frames in their own way. Maybe you want to filter one but not another, or eliminate images for better clarity. You can do a lot more to move from one to the next and get more out of whatever you're looking at.
Sarah Herrlinger: One of the other things that is really cool, if you're an iPad user, is that Magnifier is now also supported in multitasking. It could be a great tool, particularly for students using it in a classroom. You could have one side of the screen, perhaps on an educational webpage, where you're using Zoom to zoom in on that. Then you have Magnifier running on the other side of the screen when you're doing, for example, a science experiment, and you need to magnify whatever it is you are doing out in the real world.
Thomas Domville: How interesting.
Dave Nason: Or simply take notes or anything, that'll be amazing.
Sarah Herrlinger: Yes, a lot of really cool things going on with Magnifier; I'm really excited about that one. Then there are a lot of other things that we have worked on as well that may be of interest to folks, things like Group FaceTime. For a while, we've had a feature where if you are speaking, it will detect that you're speaking and make you the principal image or principal bubble on the screen.
Sarah Herrlinger: Now we've done the same thing for sign language. When a participant is using sign language, it detects that and makes that person the prominent person in the call as well. It really helps the deaf community in Group FaceTime; we know FaceTime is really highly used by the deaf community for communication, and this just brings another level of usability for that community.
Sarah Herrlinger: Another thing to bring up that we're really excited about, and I feel like every time I turn around I'm saying "we're really excited," because that is our whole week; we are just really excited about everything we've been doing for the last year. But for people who are gamers, Apple Arcade has become such a great experience, letting people immerse themselves in these unbelievably in-depth games that are amazing.
Sarah Herrlinger: A lot of those gamers end up using controllers. We've been supporting some of the Xbox and PlayStation controllers and such, and we've expanded our controller support this year. One of the controllers we're now supporting is the Xbox Adaptive Controller. For individuals with physical motor limitations, that has become just a mainstay of being able to play games at the caliber of others using different types of controllers. We're really excited that this one's now come to the Apple Arcade platform too.
Dave Nason: I think that's really cool. I was delighted when I read that, although I admit I can't play too many of the games myself. I don't know, are you aware of any more VoiceOver-friendly games, I suppose, coming into Arcade or onto the Apple TV in general? It's probably one of the most difficult areas we still have in accessibility terms as the blind community, I think.
Sarah Herrlinger: To your point, it is a difficult area, and one where, as we work with developers much in the same way we do with apps, we always make sure to evangelize for it within the developer community. I think there's room for things to get better. Certainly with all the work we've done this year around VoiceOver recognition, there's a lot of improvement we've been able to make for apps. Games are a world all their own, but we'll always continue, both through the developer community and through our own work, to see what's feasible.
Dean Hudson: One example of that is a game out on the App Store called SongPop, which is completely based in Unity, which is like a bridge that a lot of developers use to make things multi-platform. We've done some VoiceOver recognition, screen recognition, on that, and it's completely accessible. That's a good example of how this recognition stuff is starting to improve even third party apps.
Dave Nason: I'll definitely be downloading that this evening, because it's great to have an app like that to really get a good view of screen recognition, I suppose, how it is at its best. That'd be cool to see.
Thomas Domville: You guys introduced a lot of features, which is outstanding. It is awesome to have our listeners hear all these features, even the small ones; like you said, the Back Tap is such a big thing. I noticed too, in Voice Control, that we also have some new voices for United Kingdom and India users out there, and I think that's going to be a great benefit too. Just lots of little gems in there that might be hidden away, but well done.
Dave Nason: Yeah, that's correct.
Sarah Herrlinger: Thank you, we'll be sure to share that back with the team because they have certainly been working really hard.
Dave Nason: I suppose one other quick question to ask, which might be important to people in our community, would be around bugs and that kind of thing. I think iOS 13, certainly compared to iOS 12, was maybe one of the bumpier releases we've had in terms of things like VoiceOver and braille and bugs. I don't know, how's iOS 14 looking in that respect?
Sarah Herrlinger: Well, I think starting off with our beta one, we're getting a good response on stability. Our team spends, well, year-round, but certainly the summer timeframe, squashing as many bugs as we possibly can. There are hundreds and hundreds of bugs that get dealt with over the course of the summer.
Sarah Herrlinger: Then through the course of the year, we continue to make sure that as we catch new ones, we're always getting those. One of the big things about VoiceOver is that it is so customizable; it's often that we get feedback from people who say, here's my particular configuration and I've found something. We work to make sure we get all of the different nooks and crannies, and we're going to continue to do that over the course of this summer.
Sarah Herrlinger: Definitely, I always encourage people to download the betas and provide us with feedback, and let us know where there are places we can make sure we hit those before they go live. We're going to continue to keep working and doing everything we possibly can.
Dave Nason: Great. I suppose if people are joining the public beta when that happens over the summer, the key message is to send in all those reports to Feedback Assistant, and if you're using a live version, to use that accessibility@apple.com email address, I guess.
Dean Hudson: Yes.
Sarah Herrlinger: Absolutely.
Dean Hudson: Report, report, report, report them all. Testing, testing, testing.
Dave Nason: We hope not to hear that sentence too much anymore. Great stuff. I think that about wraps it up. Again, I'd like to thank you both, Sarah and Dean, for joining us. We really appreciate the time you give us each year, now three years in a row. Thank you so much.
Dean Hudson: Happy to do it.
Sarah Herrlinger: Yes absolutely. We really enjoy talking to you all, thank you.
[Transition sound]
Dave Nason: Well, that was Sarah Herrlinger, Apple's Director of Global Accessibility Policy and Initiatives, and her colleague Dean Hudson. It was really good to talk to them again, wasn't it, Thomas?
Thomas Domville: It's always a treat to have both of them. They are both so insightful, and I love the excitement and enthusiasm that they both offer. Dean, you could just give him the mic all day; he could talk about this all day long. And I love all the new features they discussed. There's so much more that they didn't talk about, but they definitely included a lot of the bigger features.
Thomas Domville: I'm really interested in the screen recognition a lot more. That sounds like an amazing feature that I think is going to be well received by those of us who have apps that are not very accessible.
Dave Nason: Yes, the screen recognition is definitely one I'm very interested to see. I'm trying to think of some apps that I tried using and kind of wasn't able to, that I can go back now and try it with. Trying to sort of think of what...
Thomas Domville: I know, right.
Dave Nason: Because I've probably deleted all those apps, you know what I mean; I gave up on them a long time ago. Now it's like, okay, can I go back and try these apps again with this? Things like the photo recognition, just getting that context when somebody sends you a photo, or it's even a photo you took yourself a few weeks ago, to be able to get that kind of context.
Thomas Domville: Oh, definitely, and I think our users are going to absolutely love that feature. I know that Dave and I have been using it, and we can tell you without a doubt that it is amazing; a big difference, I thought, in terms of what is being recognized and spoken to us. I mentioned to Dave earlier that in the App Store, for example, some of the photos for the apps were never spoken before, and now it describes them, even with text recognition of what's on the screen itself. I thought, wow, this is really fantastic.
Dave Nason: It's funny, it's not like last year, where we had the big-ticket item of Voice Control as the big new accessibility feature. I suppose Screen Recognition is the closest we have to that this year, maybe. But there are a lot of features, a lot of just improvements, I think, in accessibility this year that hopefully we'll see.
Thomas Domville: I totally agree with you. I think we were spoiled last year when we got some significant features, and not that these aren't big features, but I have to agree they're more in the moderate to minor range. There are lots of them, though, which is very welcome. Back Tap, for example, that's cool.
Dave Nason: Yeah, a cool little thing like that. It sounds like there's a real focus on hearing as well; maybe they think, okay, we focused a lot on things like VoiceOver in recent years...
Thomas Domville: That's true.
Dave Nason: And we need to put a little bit of time into the deaf community. And you know, not least, yourself and Scott on our team are members of the deaf-blind community; there is a lot of crossover there.
Thomas Domville: That includes low vision too. I mean, you just heard that they made a major UI change and lots of little tweaks and bells for Magnifier. Our low vision folks also got some love and attention.
Dave Nason: Yeah, definitely. I'm sure we'll be discussing it in bits and bobs over the summer and on your other podcasts, and obviously once this software goes out, probably in September if we're on a normal schedule, though who knows, it might be October this year, in the year that we are in...
Thomas Domville: Whatever it is.
Dave Nason: But whenever it happens, we'll have loads and loads of content, both in written and podcast form, on everything that's new and how to use it.
Thomas Domville: Yes, stay in touch with AppleVis throughout the summer. I'm sure that on the Unleashed podcast I will be going through all the features that were not discussed here in accessibility terms. I also will be covering all the mainstream features that are in iOS 14 and such. Give that a listen, and watch the website as well for any updates and news.
Dave Nason: Great stuff. Thanks again for joining me, Thomas, and thanks everybody for listening.
Thomas Domville: Thank you, Dave. Bye bye.
Dave Nason: Bye bye.
[Music]
Announcer: Thank you for listening to this episode of the AppleVis Extra. To learn more about us, visit our website at www.applevis.com, follow us on Twitter @AppleVis, and like us on Facebook.
Comments
Nice job
I like that there are many accessibility features regarding the hearing impaired, like Sound Recognition and Back Tap. I did hear that in the first iOS 14 beta, battery life is affected by both; maybe future versions will address it. I hope the 12 Pro has a larger battery than the 11 Pro. I also hope bugs are addressed. I did see many YouTube videos that are happy with the stability and performance of iOS 14. I will cross my fingers that 14 will be lucky 14. We already had unlucky 13.
Xcode improvements
Greetings,
The improvements to Xcode really got me excited, especially the part about playgrounds in Xcode being made accessible. A lot of the Swift tutorials out there use this feature to teach code. This makes me want to pay the 99 dollars to become a developer so I can really start learning Xcode and Swift by getting the developer betas for my MacBook.
excellent interview
hi,
excellent interview. I cannot wait to test out the improvements whenever the public beta goes live.
I congratulate you for the
I congratulate you for the informative work you have done.
I wish you success in your work.
Hi everyone so I just listen
Hi everyone. I just listened to the podcast, and one thing that really disappointed me to hear is that the screen recognition and the back tap feature are not going to be available on anything under the iPhone X. So what I'm wondering is, will these features be available on the new iPhone SE 2020?
They didn't really explain why this feature won't be on anything lower than the iPhone X, and I know the iPhone SE 2020 is like the iPhone 8 but with the better chip. So I'm just wondering, is it the better chip that provides all of those things, like the back tap and the screen recognition, or is it the design of the phone itself?
I would really appreciate any information that anyone can give me on this. Thank you.
Great Stuff
Thank you as always for a wonderful interview. Thanks also to Apple once again for their continued great work with accessibility. I think I'm going to get rid of the earbuds which a neighbor gave me shortly after he moved into my building. They've been falling out a lot lately, so I think on my Christmas list this year will be a pair of Apple's headphones with the enhanced audio. But the other access features sound good too!
...
It was a nice conversation. The features are great, but there is nothing about the Mac except Xcode. Although Catalina is a very good version for me, I hope they will fix existing bugs at least.
Devices
Thanks all for the kind words. We could happily have talked all day but there was only so much time :)
Regarding the older devices not supporting some features, it is a hardware limitation. Only the newer devices have the processing power required for the screen recognition. For that reason, though I'm not 100% sure, I believe the 2020 iPhone SE will support it, as it has the same internals as the iPhone 11.
The back tap is related to the sensors in the device. So again, I would assume that the 2020 iPhone SE should be included.
OK awesome I’m going to be
OK awesome. I'm going to be getting the new iPhone SE soon, so I hope that's true and the back tap and the other stuff is supported on it.
dear team applevis
dear team applevis
thank you so much for this podcast, I really enjoyed it. I'm quite disappointed that the new features will not support any iPhone under the X... but still, I'm excited for it. I have a 6s, and I cannot change it as I only bought it last year. I'm happy with it; it's my first iPhone.
I wonder whether we'll be able to use the new Siri voice with VoiceOver? The US English one which came with iOS 13?
thank you so much once again, you all are amazing :)
Wow
Hi. I just watched the episode. Thank you so much for making this interview for all of us to enjoy! Can't wait for the new features.
Back tap may not be on iPhone SE
On further study, I'm now not sure that the back tap feature will work on the new iPhone SE after all. According to a video from Rene Ritchie, it uses the same sensor that enables the tap-to-wake feature on the Face ID range of iPhones. Therefore, if the SE does not have tap to wake, then it may not have back tap either.
Disappointing
Well, if that's true then that is extremely disappointing. I hope that's not true and that it does support it, because there really wasn't much in the way of VoiceOver features, I feel, that they're adding to the update. It would be nice if at least those two things worked, but if not, oh well, I guess. Whatever.