Hello everyone, I’m a master’s student working in Artificial Intelligence at McGill University, Canada.
For one of my courses, I’m looking into improving the accessibility of the standard iPhone note-taking app by implementing interactions that currently lack dictation support.
What I noticed is that the app lacks voice-command support for several interactions commonly wanted in note-taking apps:
- you can’t edit the middle of an existing note; you can only append something at the end
- you can’t extract a specific piece of information from an existing note; Siri can only read the entire note
- you can’t create folders and organize your notes with voice commands
I would be delighted if you could confirm or deny that there’s a benefit to implementing these interactions. Also, please tell me if other types of interactions in the app should have my focus instead.
Thanks in advance,
Romain
Comments
If the subject is visually impaired
I think it's a bit outdated to focus on voice control when making improvements for the visually impaired — for example, reading books aloud, controlling navigation, and processing text aloud. We're already able to interact with devices through screen readers, much as sighted people interact with their screens. I'm in favor of optimizing existing features instead of developing new, extra accessibility features.
Some people have multiple disabilities
So adding support for both Voice Control and VoiceOver would be really cool in that respect, I think, as long as they don’t conflict. I myself have somewhat limited use of my hands, and use Voice Control as well as VoiceOver. (Voice Control even has some commands for controlling VoiceOver.) The more accessibility the merrier, in many cases; nobody has to use voice commands if they don’t want to.