Apple Launches Redesigned Accessibility Portal and Speaks About Its Efforts to Make Its Products Accessible to Users With Disabilities

By AppleVis, 4 December, 2020

Member of the AppleVis Editorial Team

Apple marked International Day of Persons with Disabilities on 3 December by launching a major redesign of the Accessibility portal on its website, releasing additional resources for users of the accessibility features of its products, and taking part in a virtual interview at the Sight Tech Global conference.

Apple's Accessibility portal showcases the range of accessibility features available on devices such as the iPhone, Mac, and Apple Watch. Each feature page lists the devices on which that feature is available and directs users to guidance on setting it up on each. Pages typically also link to more detailed or specific information on the feature.

Of particular note for the AppleVis community is that Apple's Accessibility portal continues to recommend us, along with the Hadley Institute for the Blind, as an additional resource for users of its vision-related accessibility features.

Apple has also added a number of new videos to its accessibility playlist on YouTube.

There are now 24 videos in total, each explaining how to use a particular accessibility feature. Topics include taking a selfie using Voice Control, getting started with Magnifier on iPhone, using Hover Text to display larger text on a Mac, and organizing apps with the VoiceOver rotor on iPhone and iPad.

In an interview during the Sight Tech Global conference - a virtual event dedicated to fostering discussion among technology pioneers on how rapid advances in AI and related technologies will fundamentally alter the landscape of assistive technology and accessibility - Apple's Sarah Herrlinger and Chris Fleizach spoke about the company's efforts to make its products accessible to users with disabilities.

Speaking with TechCrunch's Matthew Panzarino, Herrlinger and Fleizach detailed the origins of accessibility at Apple, where it currently stands, and what users might expect in the future - offering up some interesting anecdotes and historical nuggets along the way.

Fleizach, who serves as Apple's accessibility engineering lead for iOS, offered background on how accessibility options landed on the company's mobile platform. Going back to the original iPhone, which lacked many of the accessibility features users have come to rely on, he said the Mac VoiceOver team was only granted access to the product after it shipped.

"And we saw the device come out and we started to think, we can probably make this accessible," Fleizach said. "We were lucky enough to get in very early — I mean the project was very secret until it shipped — very soon, right after that shipped we were able to get involved and start prototyping things."

Fleizach also suggests that the VoiceOver for iPhone project gained traction after a chance encounter with late co-founder Steve Jobs. As he tells it, Fleizach was having lunch (presumably at Apple's campus) with a friend who uses VoiceOver for Mac, and Jobs was seated nearby. Jobs came over to discuss the technology, at which time Fleizach's friend asked whether it might be made available on iPhone. "Maybe we can do that," Jobs said, according to Fleizach.

Herrlinger, Apple's Senior Director of Global Accessibility Policy & Initiatives, says the accessibility team is now brought in early on a variety of projects.

"We actually get brought in really early on these projects," she said. "I think as other teams are thinking about their use cases, maybe for a broader public, they're talking to us about the types of things that they're doing so we can start imagining what we might be able to do with them from an accessibility perspective."

"We want to keep building out more and more features, and making those features work together," Herrlinger said. "So whatever is the combination that you need to be more effective using your device, that's our goal."

As an indication of what might be available to iOS users in the coming years, Panzarino cites a recently published video showing software that applies VoiceOver Recognition to the iPhone's camera to offer a description of the world in near real time.
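
To make the idea concrete, here is a minimal Swift sketch of that kind of camera-to-description pipeline. It is not Apple's VoiceOver Recognition implementation, which is not publicly exposed this way; instead it uses the public Vision framework's general image classifier (VNClassifyImageRequest) as a stand-in, classifying live camera frames on device.

```swift
import AVFoundation
import Vision

// Sketch only: classify frames from the camera and report a description in
// near real time. A real app needs an NSCameraUsageDescription entry in
// Info.plist and would hand the result to speech output rather than print it.
final class LiveDescriber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
        session.addOutput(output)
        session.startRunning()
    }

    // Called for every captured frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // VNClassifyImageRequest runs entirely on device (iOS 13+). Creating
        // one request per frame keeps the sketch short; production code would
        // reuse it and throttle the frame rate.
        let request = VNClassifyImageRequest { request, _ in
            guard let best = (request.results as? [VNClassificationObservation])?
                .first(where: { $0.confidence > 0.5 }) else { return }
            print("I see: \(best.identifier)")
        }
        try? VNImageRequestHandler(cvPixelBuffer: pixelBuffer).perform([request])
    }
}
```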

The full interview is available on YouTube at https://www.youtube.com/watch?v=v47mD60ertI

In a separate article, TechCrunch spoke with Chris Fleizach and Jeff Bigham from Apple's AI/ML accessibility team about the origins of Screen Recognition, a new feature in iOS 14 that uses machine learning to identify and label buttons, sliders, and tabs automatically.

“We looked for areas where we can make inroads on accessibility, like image descriptions,” said Fleizach. “In iOS 13 we labeled icons automatically — Screen Recognition takes it another step forward. We can look at the pixels on screen and identify the hierarchy of objects you can interact with, and all of this happens on device within tenths of a second.”
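
As a rough illustration of the pixels-to-elements idea Fleizach describes - not Apple's private implementation - the Swift sketch below runs a Core ML object-detection model over a screenshot and exposes each detection as a synthetic accessibility element. The model passed in is assumed to be a hypothetical detector trained on UI element classes such as "button" and "slider".

```swift
import CoreML
import Vision
import UIKit

// Sketch: turn detected on-screen controls into accessibility elements.
// `model` is a hypothetical Core ML detector of standard UI elements.
func accessibilityElements(for screenshot: UIImage,
                           in container: UIView,
                           using model: VNCoreMLModel) throws -> [UIAccessibilityElement] {
    guard let cgImage = screenshot.cgImage else { return [] }

    var elements: [UIAccessibilityElement] = []
    let request = VNCoreMLRequest(model: model) { request, _ in
        for case let detection as VNRecognizedObjectObservation in request.results ?? [] {
            guard let label = detection.labels.first else { continue }

            // Convert the normalized bounding box into view coordinates and
            // expose the detection as a synthetic accessibility element.
            let box = detection.boundingBox
            let element = UIAccessibilityElement(accessibilityContainer: container)
            element.accessibilityLabel = label.identifier  // e.g. "button", "slider", "tab"
            element.accessibilityFrameInContainerSpace = CGRect(
                x: box.minX * screenshot.size.width,
                y: (1 - box.maxY) * screenshot.size.height,  // Vision's origin is bottom-left
                width: box.width * screenshot.size.width,
                height: box.height * screenshot.size.height)
            elements.append(element)
        }
    }
    try VNImageRequestHandler(cgImage: cgImage).perform([request])
    return elements
}
```

The returned elements could then be assigned to the container view's accessibilityElements property so VoiceOver can navigate them, which is roughly the effect Screen Recognition produces automatically for otherwise inaccessible apps.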

It wouldn’t have been possible just a couple of years ago — the state of machine learning and the lack of a dedicated unit for executing it meant that something like this would have been extremely taxing on the system, taking much longer and probably draining the battery all the while.

But once this kind of system seemed possible, the team got to work prototyping it with the help of their dedicated accessibility staff and testing community.

This was done by taking thousands of screenshots of popular apps and games, then manually labeling the controls in each as one of several standard UI element types. The labeled data was fed to the machine learning system, which soon became proficient at picking out those same elements on its own.
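
Apple has not published what that labeled training data looks like, but a plausible minimal schema, sketched in Swift, might pair each screenshot with bounding boxes tagged with standard element types (all names here are illustrative guesses):

```swift
import Foundation

// Hypothetical annotation format for one hand-labeled screenshot.
struct LabeledScreenshot: Codable {
    enum ElementKind: String, Codable {
        case button, slider, tab, textField, toggle, icon
    }

    struct Annotation: Codable {
        let kind: ElementKind
        let x: Double, y: Double, width: Double, height: Double  // pixels
    }

    let imageFile: String          // e.g. "app_screenshot_0001.png"
    let annotations: [Annotation]
}

// Decoding a single hand-labeled example like the thousands described above:
let json = """
{"imageFile": "app_screenshot_0001.png",
 "annotations": [{"kind": "button", "x": 24, "y": 610, "width": 120, "height": 44}]}
"""
let sample = try JSONDecoder().decode(LabeledScreenshot.self, from: Data(json.utf8))
print(sample.annotations.count, "labeled element(s)")
```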

The TechCrunch article closes by telling us not to get our hopes up yet about Screen Recognition coming to the Mac, as “the model itself is not generalizable to desktop apps, which are very different from mobile ones.”

Also to coincide with International Day of Persons with Disabilities, Apple released an Instagram post highlighting the work of Jordan Nicholson, a photographer born with TAR syndrome (thrombocytopenia absent radius syndrome). The post features several portraits, shot and edited by Nicholson on an iPhone 12.


Comments

By Unregistered User (not verified) on Thursday, December 24, 2020 - 18:13

I remember Dean Hudson speaking at the Braille Institute of LA about the conception of VoiceOver. It was a fascinating event with lots of vendors and interesting tales.

Although there's much to be done for accessibility, I'm glad that they're putting it front and center. Hopefully that's something the other tech companies start to copy.

By Khushi on Thursday, December 24, 2020 - 18:13

Interesting read.
My main reason to buy an Apple iPhone was because they really gave thought to accessibility and things were more accessible on this side of the coin. And I can honestly say yes, that's true.
Work still needs to be done, but this is amazing and I'll always love Apple for this.

Unfortunately it's on the pricey side of things, which again proves accessibility costs a lot, which is actually unfair. Everything which makes our lives a bit easier as disabled people is high in cost, at least in India.
Anyway, interesting read :)
Thank you AppleVis

By Ekaj on Thursday, December 24, 2020 - 18:13

I heard only part of the interview with Sarah and Chris, but it sounded really good. I'm going to go back and listen to it in its entirety this week. The revamped accessibility portal looks really nice. It is most certainly a good time to be blind, and I can't wait to see what Apple comes up with in the future. They've certainly demonstrated time and again that accessibility should not just be an afterthought. There's still work to be done, but we could all use some improvement now and then. You go Apple!