VoiceOver vs. Talkback: My Time on the Other Side

By mehgcap, 27 November, 2019

A Fair Look at Talkback and VoiceOver

Hello there, reader. Did you come here because you're the world's biggest Apple fan, and are excited to join in some Android bashing? Are you in love with Android, with visions of finally hearing someone put those Apple idiots in their place? Well, my goal is to do neither. You see, I've used iOS for years, and recently spent some time learning Talkback. I found it an interesting experience. I want to compare VoiceOver and Talkback, because each has strengths and shortcomings, and each could learn some major lessons from the other. Don't worry, though: there is a winner.

What This Article Is and Is Not

I want to be very clear about this: my goal with this article is not to provide details on how to use Talkback or Android. It's not to offer a handy list of Android resources. It's not to explain the ins and outs of VoiceOver. I assume you have at least a basic familiarity with VoiceOver on iOS, and an understanding of touch screen gestures and other mobile screen reader concepts. Finally, I don't want the comments to turn into a free-for-all. Keep things respectful and helpful. And again, don't ask me for step-by-step details on Talkback, because that's not why we're here. I will sometimes describe how a feature works in the text below, but only in enough detail for you to follow what I'm referencing; I'll try not to go beyond those basics.

The Hardware and Software

I've used an iPhone 7 since the model came out in 2016. My Android device for this experiment was a refurbished Pixel 1, a phone released that same year. I got the blue one. The Pixel seems to be in perfect working order, with a good battery, functional parts, and no cosmetic problems that I, or the sighted people who have examined it, could find.

For this experiment, I was using Android 9 with the latest Talkback and other Google accessibility updates installed. My iPhone was running iOS 13.

A Quick Pixel Review

Skip this section if you're here for Talkback. You'll miss nothing important. If you're curious about the Pixel, though, stick around.

My Pixel is slightly larger, in all three dimensions, than my iPhone 7. However, it isn't uncomfortable or poorly designed. It's not as sleek and nice-feeling as my iPhone, but it's not bad at all.

The Pixel has an aluminum body with a glass-feeling material on the upper half of the back, and the fully glass front of any modern smartphone. It has a large chin, which feels odd as there is no home button or other control there. That doesn't impact performance, though, so it doesn't bother me. The fingerprint reader is on the rear, and seems to do a good job. In fact, adding my fingerprint was faster and smoother than on iOS.

The speaker grill is on the right side of the bottom edge, facing you if you hold the phone flat in your hand with the screen facing up. On the left side of this face is an identical grill, where I presume a microphone is hidden. Between the two grills is the USB-C port. A headphone jack is on the upper edge, about where it is on most iPads. The right edge holds the lock button, with the volume buttons below it. One nice touch is the texture: the lock button is ridged, making it rough under your finger, while the volume rocker is smooth. This makes it easy to feel which button you're about to press. Opposite these buttons is the nano SIM slot.

The Pixel's performance felt somewhat slower than my iPhone's, but apps open fast enough to avoid frustrating me, and Google Assistant starts listening about as fast as Siri does. There's a delay between performing a gesture and having Talkback respond, which I imagine is partly the phone and partly the software. Benchmark comparisons show that the Pixel lags behind the iPhone 7, but as a test phone, the Pixel is more than sufficient. It wouldn't kill me to use it daily at its current speed, but it could definitely be faster. Remember, though, that this is still a 2016 phone.

My main complaint is the speaker, which sounds tinny and weak next to the iPhone's dual-speaker setup and better bass response. If I had to come up with an aesthetic annoyance, it would be the sticker on the lower part of the phone's back. It has some codes, numbers, and so on. It's a thick, obvious rectangle that ruins the look (I asked a sighted person) and feel of the smooth aluminum shell.

Basically, the Pixel is fine. Not great, or sexy, but... it's fine. It has a few things my iPhone doesn't, but it's slower and bigger. Still, if you want a cheap Android phone with no third-party modifications to the software, even something as old as the Pixel 1 is a great choice for the money. There's nothing here that makes me hesitate to recommend it, especially since it can be had for under $150 from places like eBay, or even cheaper if you go for a refurb.

A Brief Talkback Talk

I won't give you a full Talkback primer here. There is one aspect of the screen reader, however, that is essential to understand: it only intercepts some gestures.

VoiceOver on iOS intercepts every touch you make on the screen, then reacts. This is why some apps have special areas where VoiceOver doesn't interfere with gestures, while other apps ask you to turn VO off entirely. In contrast, Talkback only involves itself in touch gestures made with one finger. Anything using two or more fingers is simply passed through, with Android itself reacting instead. Drag two fingers down from the top of the screen, and Talkback isn't opening your notification shade; Android is. TB has no idea what just happened; it only knows it has new content to read.
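
For the technically curious, here's a rough Kotlin sketch of the accessibility service model I just described. This is not TalkBack's actual code; the class name and the gesture-to-action mappings are made up for illustration. The point is that the framework only ever hands a service IDs for single-finger gestures (including the angle and back-and-forth ones), while multi-finger touches never reach it at all.

```kotlin
import android.accessibilityservice.AccessibilityService
import android.view.accessibility.AccessibilityEvent

// Hypothetical service; only single-finger gestures ever reach onGesture().
class SketchScreenReaderService : AccessibilityService() {

    override fun onGesture(gestureId: Int): Boolean {
        return when (gestureId) {
            // Angle gestures arrive as their own IDs...
            AccessibilityService.GESTURE_SWIPE_DOWN_AND_RIGHT ->
                performGlobalAction(AccessibilityService.GLOBAL_ACTION_NOTIFICATIONS)
            AccessibilityService.GESTURE_SWIPE_DOWN_AND_LEFT ->
                performGlobalAction(AccessibilityService.GLOBAL_ACTION_BACK)
            AccessibilityService.GESTURE_SWIPE_UP_AND_LEFT ->
                performGlobalAction(AccessibilityService.GLOBAL_ACTION_HOME)
            // ...and so do the back-and-forth gestures.
            AccessibilityService.GESTURE_SWIPE_RIGHT_AND_LEFT ->
                true // e.g. scroll the focused list
            // Note what's missing: there is no ID for any two-finger gesture.
            // Android consumes those itself; the service only learns about the
            // result through the accessibility events it receives afterwards.
            else -> false
        }
    }

    override fun onAccessibilityEvent(event: AccessibilityEvent) {
        // e.g. read the notification shade that Android just opened
    }

    override fun onInterrupt() {
        // stop speech
    }
}
```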

This explains why Talkback lacks support for customizing gestures that include two or more fingers, and why it uses back-and-forth and angle gestures so much. The developers needed a way to pack a lot of commands into one finger, because they couldn't use more.

It also explains a quirk of Talkback: if you touch the screen to explore it, you must pause for a small amount of time. If you place a finger and begin moving it right away, Talkback reads the movement as an attempt at a gesture instead of telling you what's under your finger. This is rarely an issue, as the delay needed is brief, but I've run into it a few times.

Knowing that Talkback can't read multi-finger gestures explains a lot. It won't, however, stop me from holding this shortcoming against the screen reader. It's a problem, no matter why it exists.

What I Like About Talkback

Let's begin with the positives, as I usually do. There are aspects of Talkback I really, really like, and that VoiceOver would do well to consider borrowing. Note that I'm not saying VoiceOver can't already do some of the below. This section is a mix of things VO lacks, and things VO can do that Talkback handles in a simpler way.

Super-powered Fingerprint Sensor

Talkback supports "fingerprint gestures", which are commands you issue by swiping a finger over the fingerprint reader. Not all devices have this, but my Pixel is one that does. I can swipe one finger up, down, left, or right over the sensor on the back of my phone, and Talkback will respond. I can either use this as a menu for speech rate, volume, and other settings, or assign each of the four movements to any of Talkback's commands.

I don't use this often, mostly because of my grip. I hold phones in such a way that none of my fingers are near the reader, so I have to shift my grip anytime I want to use this option. This often requires two hands, so by the time I'm in position to issue a fingerprint gesture, I may as well have used a normal one. But I just need to work on holding my phone differently, and I can see this becoming a very helpful tool. I was hoping to also be able to assign multiple taps on the sensor to commands, but Google hasn't gotten there yet.

Ironic Menus

At any time, you can open one of two menus with Talkback. One is contextual, offering options specific to what you're doing. If you are editing text, you'll have editing commands; if you're on a notification, one of the options will be to view actions for that notification. You get it.

The other menu is also full of options, but these are global ones. They include things like opening Talkback's settings, spelling the most recent utterance, copying the most recent utterance to the clipboard, and the like.

I love this idea. In VoiceOver, you have to remember that the "copy last spoken phrase to clipboard" gesture is tapping three fingers four times, or you're out of luck. What if you have a hard time memorizing all the gestures VO uses? Or you've assigned something you use more often to that gesture? A menu of possible actions makes perfect sense. Just bring it up, choose the function you want, and you're done.

I promised you "ironic menus", and here's the irony: Apple already did this! On macOS, you can press VO-H, then H again, and you'll be in a searchable menu of every possible VoiceOver command you could ever want. Mouse movement, navigation, speech, and plenty more. For some reason, though, the feature never made it to the mobile world... Except it did. Google implemented it. Now, the only one missing out is the iOS version of VoiceOver.

Circle Menus

Talkback lets you use circular menus if you want to. The idea is that, instead of a list of items you can swipe through, you place a finger on the screen and move it in a circle. As you move, Talkback will speak various menu options, as though each were on the rim of the circle you are drawing. To select an option, just lift your finger when you hear what you want.

While you can turn this functionality off and use a regular list of menu items instead, I've come to like the circles. They are faster to use in most cases, even more so once you know where specific items are. For instance, I can dismiss a notification by drawing the angle gesture to open the actions menu, touching the screen, sliding my finger down a bit, and lifting up. I don't have to swipe or touch through a list to find the option and then double tap. Yes, it only saves me one extra gesture, but it feels faster and just as intuitive as the other method. And hey, that's one more gesture I don't have to worry about.

Speaking Passwords

The question of whether a screen reader should allow you to hear what you type as you enter a password is a long-standing one. One side argues that since a blind user can't see the keys they touch, having the confirmation that they typed what they meant is useful. If they don't want it in some situations, they can turn it off. Most of the time, though, they are somewhere where no one will overhear the characters of their password. The response to this is that apps and operating systems sometimes offer users the ability to view their passwords as they type. If this is available, the user can use it. If not, the screen reader shouldn't override the system's choice and speak the password anyway.

On iOS, VoiceOver won't echo the characters you type unless the field offers a way to view the password. Talkback has a clever idea here, though. It follows VoiceOver's model, but lets you choose whether to speak the characters you type if you have headphones connected. Presumably, headphones mean only you will hear the audio anyway, so the security of not speaking what you type is unnecessary. While I can still see both sides of this debate, I tend to favor this implementation over Apple's firmer stance.

The Double-Edged Sword of Navigation

VoiceOver lets you navigate by swiping up and down to move by character, word, heading, link, and the like. You change what these swipes move by with the rotor. This makes sense once you grasp it, but the rotor gesture can be hard for people to master, and it requires two fingers plus an odd wrist movement.

Talkback solves this by letting you swipe a finger up or down to change what left/right swipes move by. It's like a simplified rotor. This system makes it much easier to, say, go from jumping around by heading to moving by character or link. It's all done with one finger and no rotor motion.
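
For those who want to peek behind the curtain, here's a hedged Kotlin sketch of the accessibility API concept this maps onto, not TalkBack's implementation: the screen reader keeps a current "granularity" that up/down swipes cycle through, and left/right swipes ask the focused node to move by it. The function names and the granularity list are illustrative.

```kotlin
import android.os.Bundle
import android.view.accessibility.AccessibilityNodeInfo

// Illustrative list: the setting that up/down swipes would cycle through.
private val granularities = listOf(
    AccessibilityNodeInfo.MOVEMENT_GRANULARITY_CHARACTER,
    AccessibilityNodeInfo.MOVEMENT_GRANULARITY_WORD,
    AccessibilityNodeInfo.MOVEMENT_GRANULARITY_LINE,
    AccessibilityNodeInfo.MOVEMENT_GRANULARITY_PARAGRAPH,
)
private var current = 0

// An up or down swipe just moves the pointer through this list.
fun cycleGranularity(forward: Boolean) {
    current = (current + if (forward) 1 else granularities.size - 1) % granularities.size
}

// A left or right swipe asks the focused node to move by the current unit.
// (Headings and links in web content go through a different action,
// ACTION_NEXT_HTML_ELEMENT, which is why they behave like just another "unit".)
fun moveForward(node: AccessibilityNodeInfo): Boolean {
    val args = Bundle().apply {
        putInt(
            AccessibilityNodeInfo.ACTION_ARGUMENT_MOVEMENT_GRANULARITY_INT,
            granularities[current]
        )
    }
    return node.performAction(AccessibilityNodeInfo.ACTION_NEXT_AT_MOVEMENT_GRANULARITY, args)
}
```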

Say It How You Want

It's no secret that Android is more open than iOS. This means developers can release Android apps that would never survive Apple's review process. These include speech synthesizers, and even replacement screen readers. Talkback can speak using Google TTS, Eloquence, Acapela, Nuance, eSpeak, and others. You need only purchase the app you want, then change a setting. Or, you can install a different screen reader and tell Android to use it instead of Talkback. For my testing, I stuck with Talkback as my screen reader, and a mix of Google's own voices and eSpeak for my speech.
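
The mechanism behind this is that TTS engines are just installable apps, and Android's platform TextToSpeech API lets a client request an engine by package name. Here's a hedged Kotlin sketch; the class name and sample phrase are made up, and while "com.google.android.tts" is Google's engine package, in practice you'd pick a purchased engine in Talkback's or Android's settings rather than in code.

```kotlin
import android.content.Context
import android.speech.tts.TextToSpeech

// Hypothetical demo: ask for a specific installed TTS engine by package name.
class EngineDemo(context: Context) : TextToSpeech.OnInitListener {

    private val tts = TextToSpeech(context, this, "com.google.android.tts")

    override fun onInit(status: Int) {
        if (status == TextToSpeech.SUCCESS) {
            tts.speak("Hello from a specific engine", TextToSpeech.QUEUE_FLUSH, null, "demo")
        }
    }
}
```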

Clever Gestures

I've touched on some of the gestures unique to Talkback, and the reason they exist (one finger maximum, remember). I want to highlight them, though, because they're really cool.

You can use angle gestures by moving your finger in a straight line, then in another, perpendicular straight line. You basically draw a right angle. This gives you eight gestures to play with: up then right, up then left, right then down, down then left, and so on. By default, you have one-finger access to the local and global menus, notifications, the overview (sort of like the iOS app switcher), home, and back. Some people hate these, either because they can be finicky to get right at first, or because they can interfere with exploring the screen by touch. But I don't mind them.

The other set of gestures are what Google calls back-and-forth ones. To perform one of these, you move your finger up, down, left, or right. Once you've drawn a short line, you reverse the direction, going back along the line you just drew. To jump to the top of the screen, for instance, you move one finger up, then back down. To scroll a list, you move right then left, or left then right.

Both of the above are very clever ways to do more with one finger, adding twelve more possible gestures. I'd love to see VoiceOver implement some of these, particularly the back-and-forth ones. If Apple offered those with multiple fingers, users could do a lot.

Getting Volume Keys in on the Action

Android makes much more use of physical keys than iOS does. You can press power and volume up to mute, for instance, or press power twice to bring up the camera from anywhere. Talkback supports the volume keys as well. You can use them to change the value of a slider control, such as the speech rate, and you can toggle Talkback on and off by pressing and holding both volume keys at the same time. While less useful than other features I've talked about, this is still a handy way to do things. I didn't like it at first, but I'm coming around. Android being Android, I imagine I could find apps to let me change tracks, activate buttons, and perform other actions with these buttons.

Impressively Specific Volumes

Android has several volume levels, all of which can be changed independently. There's the speech volume, media volume, alarm volume, and ringer volume. There may be more I'm missing, too. Talkback even tells you which volume has been changed; if you press a volume key while the TTS is speaking, you hear that speech volume has changed, whereas pressing a button while music is playing tells you that the media volume has changed. This lets you mix the volumes of speech and media together, to get them balanced just how you want.

Also, you can set your ringer and alarms to different levels than your music and speech. You might want alarms to always sound at 100%, while music plays at 45% and your ringer is at 70%. This is all easily done.
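
Here's a minimal Kotlin sketch of the per-stream volume model I'm describing, using Android's AudioManager. The percentages mirror my example above and are purely illustrative; on some Android versions, changing the ring or alarm streams may require "Do Not Disturb" access.

```kotlin
import android.content.Context
import android.media.AudioManager

// Each stream (alarm, media, ring, screen-reader speech) has its own level,
// so they can be mixed independently of one another.
fun mixVolumes(context: Context) {
    val audio = context.getSystemService(Context.AUDIO_SERVICE) as AudioManager

    fun setPercent(stream: Int, percent: Int) {
        val max = audio.getStreamMaxVolume(stream)
        audio.setStreamVolume(stream, max * percent / 100, 0) // 0 = no volume UI flags
    }

    setPercent(AudioManager.STREAM_ALARM, 100)        // alarms always at full volume
    setPercent(AudioManager.STREAM_MUSIC, 45)         // media
    setPercent(AudioManager.STREAM_RING, 70)          // ringer
    setPercent(AudioManager.STREAM_ACCESSIBILITY, 80) // screen-reader speech (Android 8+)
}
```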

I know that iOS lets you change the ringer volume independently of other volumes, but it can't let you mix media and speech the way Android does. It also won't report the new level, or which volume was changed. These announcements can be irritating at times, but the idea behind them is still great.

What I Don't Like

Sorry Google, but now it's time for the negatives. I found plenty to like about Talkback, but I also found plenty of irritations and missing features.

Missing Gestures

Talkback fails to support a shocking number of gestures. When I first went to the setting that would let me customize its touch gestures, I was very surprised to find that a lot of obvious ones aren't present. There are no triple or quadruple taps at all, and nothing with two or more fingers: no three-finger swipes, no two-finger double tap, nothing. Just a limited set of one-finger gestures.

We've already discussed Talkback's unique gestures, and how I think they're a good idea. Yet I can come up with twenty more gestures just off the top of my head which TB doesn't have: triple and quadruple taps with one, two, three, or four fingers; single and double taps with three or four fingers; and swiping three or four fingers up, down, left, or right. The list gets much larger still if you count performing the back-and-forth gestures Talkback already supports with more than one finger. And this still doesn't touch the position-based commands, such as the two-finger swipe from the top of the screen that shows notifications. Why not allow us to customize those, and others like them?

Not a single command in the previous paragraph is available to be customized. Most aren't present at all. I know why this is (remember that Talkback only intercepts one-finger gestures, not all touch input). As I said when we talked about that model earlier, though, I'm still counting this against Google.

The other major missed opportunity here is the magic tap. This is a VoiceOver feature that lets you double tap two fingers to do a surprisingly wide array of tasks without needing to find a specific control on the screen. Normally, these tasks relate to audio, so you don't need to hear speech mixed with other sounds as you try to find a pause button or answer a call. Instead of a simple gesture to accept an incoming call, Talkback's official manual says that you must place one finger "about three quarters of the way down the screen", then swipe right, left, or up. VoiceOver's two-finger double tap, which can be performed anywhere on the screen, certainly seems simpler.

The caveat here is that I was unable to test phone calls during my time with Android. I use iMessage a lot, and I didn't want Apple to find my phone number no longer associated with an iPhone. I can still use iMessage with my email addresses, but I wasn't sure what it would take to re-associate my number. The potential hassle didn't seem worth it. Still, I'm going off the official documentation, so I hope I have accurate information.

If nothing else, a universal "stop the music" gesture is great to have. Audio ducking isn't good at all on my Pixel: music quiets down well after speech begins, and comes back to full volume before speech has even finished. This not only gives the ducked audio a choppy sound, it makes the speech harder to hear. I didn't realize how nice the magic tap was until it wasn't there.

A Lack of Actions and Options

Let's assume for now that Talkback did have the same set of gestures as iOS, plus its angle and back-and-forth ones. What would you assign to all those gestures? Would you use four of them to always have heading and form element navigation available? Maybe set some to spelling the last utterance or copying it to the clipboard? Too bad.

You see, Talkback has a surprisingly small number of actions. You can assign a gesture to move to the next or previous navigation option (similar to turning the rotor in VoiceOver), but you can't set a gesture to actually move by one of these options. In other words, you can customize how you get to moving by headings, but not set up a way to simply move by headings anytime you want. If you move to a heading on a webpage and then want to keep going to read what's under it, you have to first move back through the navigation options to "default". If you don't, swiping right on the heading will simply jump to the next one. I found this slowed me down a lot. After all, VoiceOver always moves by what Talkback calls "default", regardless of the rotor setting. Land on a heading, and you need only swipe right to have VO read what follows that heading. Not so with Talkback. This is odd, because you can choose which keyboard shortcuts will move you by link, heading, and other options. I have no idea why you can't do the same for touch gestures.

Speaking of actions, VoiceOver long ago introduced the custom actions rotor. On emails, message threads, notifications, app icons, files, links, some buttons, and countless other places across iOS, you can simply swipe up or down to find actions. Share a file, delete an email, clear or view a notification, and on and on. Simply swipe one or three times, double tap, and you're done.

Android has actions as well, but Talkback hides them inside the actions menu. To open the menu, you have to perform a gesture. I realize this is only one extra step, but trust me, you feel it. Instead of swiping up or down, you must do the gesture that opens the actions menu, wait for the menu to appear, find the action, and double tap it. It doesn't add more than a second, but those seconds add up. Oh, and the best part: the gesture to open the actions menu is unassigned by default! You read that right: unless I've missed something, you have to open the actions menu by first opening the local context menu and finding the right item. Now we're up to three gestures at minimum just to get to something VoiceOver offers automatically.

This translates to more than just an annoyance when handling notifications, though. It means there are more buttons all over the place, and they are sometimes not convenient to find. Each notification, for instance, has a button to expand or collapse it. That's an extra swipe per notification as you move through the list. On iOS, actions are just that: actions, tucked away in the rotor. To be fair, this can hurt discoverability on iOS, so for new users, Talkback's approach may be better. But once you know how the rotor works, you don't need extra buttons cluttering up your navigation.

Let's get back to the other kind of annoying button. In the Gmail app, you must swipe twice per email: once for the message itself, and once for the control that selects or de-selects the message. Once you select a message, you have to go to the top of the screen and find the action button (delete, archive, move, etc.) you want. Once done, you have to get back to the list of emails and find your place again. If you've used VoiceOver, you know that deleting a message is as simple as swiping up and double tapping. No selecting with an extraneous control, no leaving the list of messages and then going back to it. Again, VoiceOver's rotor is only good if you know how to use it, so having on-screen buttons isn't a bad thing in itself. But having those buttons be the only way to manage emails makes Android a less efficient way for me to get things done.

No Help Mode

Every screen reader, braille notetaker, talking book player, and other blindness-specific technology I can think of includes a special help mode. When active, the press of a key or activation of a gesture will announce what should happen, but not do anything. This way, you can try something to confirm that it will do what you expect, or work through a set of new commands you've just learned, without actually affecting the device you're using. I consider myself a power user on macOS, iOS, and Windows with NVDA, but I still use this help mode.

Talkback doesn't have one, as far as I can tell. You can go into the settings and review the assigned gestures, but that's as far as it goes. You can't tell your phone to speak what a gesture would do; you can only try the gesture and see what happens, or hunt for it in Talkback's settings. This is especially disappointing given the unique gestures Talkback offers. Rather than getting frustrated trying to learn angle or back-and-forth motions and getting only random focus movements for my trouble, it would be great to have a place to practice, where the only feedback is silence or an announcement that what I just did matches the gesture I was going for. I know I'm not alone in having used VoiceOver's help mode to learn the rotor, how quickly to double tap, how far to move when swiping, and so on. I can't imagine why Talkback doesn't offer a similar mode.

The Double-Edged Sword of Navigation

I said earlier that Talkback's method of swiping up and down to change what left and right swipes do is far simpler than VoiceOver's rotor. It is, but I've also told you how much of a problem that can be. When you move through a website by heading, then want to read the text past a heading, your only option is to swipe up or down to set your navigation back to default. You can't rely on swiping left and right to always move by element, regardless of which element it is. Not having this always-present navigation is another way Talkback hurts efficiency.

Imagine using the screen reader on a Windows or Mac computer to move around a website. You'd likely press h to move by heading, or another single key to move by link or landmark. Once you got where you wanted to be, you'd probably use your down arrow to read, right? With Talkback, it's as though the arrow keys are all you have. Press up and down to cycle through the different elements by which left and right can move. Arrow up to headings, press right, and try to read. You can't, not without pressing up or down several times so that left and right will move through the text of the page again.

The analogy isn't perfect, but hopefully it gives you an idea. If you're still confused, just trust me that Talkback's method is objectively slower than that of any other screen reader I can think of, including VoiceOver on iOS.

Bye Bye Braille

In iOS 8, Apple introduced system-wide braille input, available right on the touch screen. Suddenly, I could type text with ease, instead of poking at the on-screen keyboard. I don't exaggerate when I say this feature was life-changing, either. With it, I've moved to doing far more on iOS than I ever expected. Social media, emails, texts, writing beta testing feedback, writing reviews, and more are all just a rotor twist away. My phone has become my primary computing device, and braille screen input is largely why.

Talkback has no such option, and that's the worst part of the experience for me. I'll give Google this: their automatic suggestions are better than those on iOS. Still, they're no substitute for the almost effortless input of braille screen input. BSI has profoundly impacted my mobile computing experience for the better, and Android can't offer it. For a heavy user of this feature, its absence is difficult to ignore. I realize I'm in a minority, but if you're a braille user like me, this is going to be one of the largest factors to consider when you look at Android.

Other Android Thoughts

While this post is about Talkback and VoiceOver, I wanted to take a moment to acknowledge the other aspects of Android I appreciate. Some are specific to accessibility, others are not.

Moving Apps

I'm using the Pixel Launcher, since it came with my phone and the only other one I tried, Nova, didn't let me move apps around. The Pixel Launcher has a pretty neat way of moving apps, a way that's easier than on iOS.

First, you touch the app to be moved. Next, you open its actions and choose "move app". Third, you touch where you want it to go. When you touch the screen in this mode, Talkback says "move to row 4, column 3" (for example) if you touch an empty app slot. If you touch another app, you instead get, "create folder with [other app]". It's simple, intuitive, and does everything you need. Again, other launchers will vary.

Widgets

I didn't appreciate widgets until I tried them. My favorites are the weather, which lets you have the temperature right on your home screen, and the Apple Music widget, which allows me to skip tracks or play/pause without opening the app itself. I haven't used other widgets yet, but what I've seen so far makes me curious to explore more. It also makes me wish Apple hadn't relegated its version of widgets to a screen that takes extra gestures to get to.

Vibrations Everywhere

One thing I've never understood about Apple, from the first time I booted my first iPhone, is the company's aversion to using vibration to indicate, well, anything. On my Pixel, there's vibration feedback when the phone restarts, when I should lift my finger while saving a fingerprint, and more. I like this, both because it feels more slick than using speech for simple notifications like those, and because it's more accessible.

Split Screen

I don't use an iPad often, but when I do, I usually place my two top apps side by side so I can access them both without switching between them. It's very cool to have this same ability on a phone. Sure, it might seem visually cramped, but I don't care. I can touch one app or the other to put my focus there, and that's all Talkback needs. Managing them isn't as easy as it is on iOS, but it's at least an option. By Apple's decree, I can't do this at all on my iPhone.

Who Wins?

You won't be surprised by my conclusion: VoiceOver is better in almost every way I care about, so it wins, hands down. It could certainly borrow third-party speech synthesizers, new gestures, and menus from Talkback. But what it lacks in those areas is more than made up for by what it offers that Talkback doesn't. Talkback has no global braille input, no help mode, far fewer commands that can be assigned, fewer gestures to which to assign the commands it does have, a less efficient navigation system, and no quick way to pause media playing in the background. Even answering phone calls requires you to put your finger in exactly the right place.

That said, I can absolutely see why people prefer Android. Note that I said Android, not Talkback. Widgets are awesome, using split screen on a phone is great, assigning my own apps for my browser, mail client, and other functions is quite nice, and placing apps anywhere I want is helpful. Also, I love having the ability to install any speech synthesizer I care to.

On the flip side, I'm missing a lot of apps I use all the time. Seeing AI, Overcast, Twitterrific, various games, and even first-party apps like Apple's Mail are all ones for which I've not found Android counterparts that come close to being as good. Android's use of buttons instead of swipe actions is painful at times, such as having to swipe four times just to move from one tweet to the next, or twice to move between emails in Gmail.

I hope you found this useful and informative. Please don't decide based on my experience alone, though. Try it for yourself, or, at the very least, seek out other people's experiences. I've read posts that echo my own, and I've read posts that talk about how much better Talkback is. Each person has their own preferences, so your mileage will vary. Feel free to leave a comment telling me what you thought of my coverage of this topic, or what I might have missed or gotten wrong. I'm sure there's plenty to learn, and short of switching to Android for a month with no fallback iPhone at all, I'm not going to get to know all the details. If I've mischaracterized something, or omitted an important point, tell me about it. As I said at the start, though, please keep things civil and fair, and please don't use this as an excuse to post detailed Talkback or Android content. We're still focused on Apple here on AppleVis, and while discussion of "the other side" is good, we don't want to wander so far into the weeds that we get lost.

Comments

I’m mostly with you, I think, though I’m a little confused about the low vision vs. blind idea here. I’m totally blind, and spatial awareness on a touch screen is a little new but totally manageable. If you're talking about braille display support and smart watch access, I can only agree, though I personally feel comfortable with that. I don’t currently own a smart watch, nor do I plan to get one any time soon, and my BrailleNote Touch Plus is now more powerful than before thanks to my always-on mobile hotspot. I needed that more than I needed a braille display for my phone, though I’ll admit that if I could go back and start over with totally new braille devices, that might be a different story. As it stands now, though, my Touch Plus with constant internet access is all the mobile braille I need.

By Unregistered User (not verified) on Tuesday, August 25, 2020 - 08:15

In reply to Trenton Matthews

That's good to know. To clarify, that was my experience on the OG iPhone SE with a home button. From what I understand, iPhones with Face ID now use swipe gestures. Do you use those with VoiceOver as well for pulling out Control Center and the notification shade?

By Unregistered User (not verified) on Tuesday, August 25, 2020 - 08:15

In reply to Holy Diver

Glad to hear your setup is working out for you. Let me clarify what I meant. I believe that Talkback is better suited to someone with low vision because, in my experience, I had a much easier time with Talkback back when I could still see a little. I couldn't see enough to read, but I could see enough to spot UI elements and text. Not to read it, but I could at least infer where it was. That's no longer the case, and I rarely look at my screen anymore. Apps like Reddit are the exception and particularly problematic because they feature a lot of tightly-packed, small elements. While the iPhone felt a little less versatile, using scroll gestures and whatnot, I felt much more comfortable not looking at the screen than with my Galaxy.

Maybe it's just me being silly. I guess everyone is a little different.

No, I get you, I think. It's definitely new for me, and I can see how, if you're used to spatial awareness coming from vision, Talkback might be harder if you're totally blind. Hell, it's harder anyway; I've got no qualms saying so. iPhones are great at what they do, and I'm probably gonna hold onto my iPhone 8 for apps like Seeing AI. BTW, have you had any luck getting the Samsung cameras to be better at autofocusing on faces, print text, etc.? That's another area where iOS seems a ton better for us blind folks.

You can (if you wish) do a three-finger swipe up or down while on the status bar to open Control Center or Notification Center, instead of the swipe-and-hold method.
The above also works on iPads (even ones with a home button).

By SeasonKing on Tuesday, August 25, 2020 - 08:15

Since this thing was posted, there have been a few updates to Talkback.
Braille screen input is now possible and works well. Yes, it only works in "screen faced away" mode, and there's no tabletop mode. But it is reliable.
In Android 11, Google is testing multi-finger gestures, and yes, the magic tap is coming. So you can do various things with a two-finger double tap. Not only that, you can customise these gestures to your liking.

By Holy Diver on Tuesday, August 25, 2020 - 08:15

In reply to SeasonKing

About time, and I say that as someone who really likes Android for what it is. I know I'm skirting the edge of what's allowed on AppleVis, but is this all in the Android 11 beta?

By Teresa on Tuesday, August 25, 2020 - 08:15

Wow, this is a great article! I have had an older Android phone for two years, and also a number of Apple iOS devices. I am much more comfortable using iOS, so my next phone will be an iPhone. I'm glad I tried Android, though. It's quite usable; just not my preference. One amusing thing I noticed is that I never realized how much I used the iOS split-tap gesture until I found myself trying to use it with TalkBack.

By Dan TeVelde on Tuesday, August 25, 2020 - 08:15

This was a good post. I am a heavy Braille user, so I have completely ruled out Android. Whenever I express my concerns on lists where there are Android users, I am accused of lying. Yesterday I had a fight with someone on the Hims discussion list. The person accused me of spouting off. I posted about the iOS versus Android Braille issue last year on AppleVis and got a lot of positive comments. I no longer have a Pixel phone, so I can't compare the June 10, 2020 version of BrailleBack with the one I used in my previous job. Despite the fact that there have been numerous issues with Braille support in iOS, I still say iOS wins with regard to Braille support. I don't see much action on Google's part to improve Braille support.

By Dan TeVelde on Tuesday, August 25, 2020 - 08:15

In reply to Devin Prater

I appreciate AppleVis because I'm not attacked whenever I post a comment. There has been an ongoing discussion on the Hims user list comparing the Polaris with native Android. I don't have an interest in getting the Polaris; it doesn't meet my needs. I expressed my concerns about the lack of Braille support in native Android, and one person on the Hims list accused me of spouting off and lying. I don't normally respond to attacks like that, but the person wouldn't let up. There was an update to BrailleBack on June 10, 2020 which I have not tested. My previous job ended, so I no longer have access to an Android device. I was using the 2019 version of BrailleBack and found it sorely lacking. I just wish Google devotees would stop treating Android like a religion and attacking anyone who disagrees.
There are always rumors about forthcoming improvements to Braille support in Android, but so far they are just rumors. Google made a big deal about the Braille keyboard they added this year, but iOS has had Braille screen input for a long time. I've never gotten iOS Braille screen input to work for me, but I think it's a great idea.

Cool, there's an update. I'll test it out and see. For what it's worth, I've seen plenty of rabid fans on both sides of the iOS/Android divide, and I find it silly. They both do different things better, and both have their place. As for braille support, yeah, I'm a heavy braille user but never really got used to braille over Bluetooth; I'm one of those weirdos who still prefers a dedicated notetaker, so I don't need my mobile device to have great braille support. I'm all for both platforms getting better in that regard, though; we all should be.

By Holy Diver on Tuesday, August 25, 2020 - 08:15

In reply to a king in the north

I'm glad you've been able to make that work. I never tried that hard, though; I'm glad it works on iOS, but I think I prefer Google's Assistant overall. How do you set Siri up to do this?

By CJ on Tuesday, August 25, 2020 - 08:15

I thought this was a good, even-handed post, but I wanted to point out a few things. First, when this review was written, Android 9 was being compared to iOS 13. Android 11 and iOS 14 are both now in beta, and some of these issues have been addressed in Android 11 and Talk Back 8, such as the new virtual braille keyboard and support for multi-finger gestures.

Second, you can assign gestures in Talk Back, and you could when this review was originally posted. In particular, you can assign a gesture to bring up the custom actions menu, so using custom actions in Talk Back doesn't have to be as cumbersome as described in this review.

Finally, I've been using Android since 2016, when it became my primary device until this year when I switched back to an iPhone 11. I still pick up my S7 though, especially to control media and consume content. Prior to getting my S7 in 2016, I had an iPhone 4S since 2012, and my employer has provided me an iPhone since 2014.

By CJ on Tuesday, August 25, 2020 - 08:15

In reply to Devin Prater

Hmm, one wonders who the fanboy is here. I agree the post was mostly accurate when it was written, so Gary's plea to get the facts accurate was unwarranted, but he is making a valid point: we've had a major upgrade to Android and Talk Back recently that addresses some of the more significant concerns here. It's a pity that Gary's post had to be so quickly discounted as just coming from a fanboy. Until then, I thought it was a nice and balanced read, both the post and the comments that followed.

By Brandt on Tuesday, August 25, 2020 - 08:15

Hi,

As a nerd who would run anything from a Windows machine to a MacBook to a Linux box, I have to say that, according to testing I have done with Braille input in TalkBack 8.2, Google's implementation is simply better in all ways.

Firstly, the dots don't ever, and yes, I mean ever, move, so once you know where they are, you will never have any issues.

The fact that you have to hold the device with the USB port to the right is good, since you know this and thus don't have to calibrate the dots at all.

By Devin Prater on Friday, June 25, 2021 - 08:15

So, I spent a few months with Android, and many of my original comments still stand. I went with a Samsung Galaxy S20 FE. No, not the "Final Empire," but the "Fan Edition." First, typing in braille is a mess. Yes, you can actually do the FOR sign, all six dots at once, but if you don't touch it just right, you'll get extra dots, like a dot 5 after what you meant to type. And there are no suggestions, no using it on the home screen to quickly open apps without having to summon the voice assistant, and no using it on the web for navigation.

Next, multi-finger gestures... sometimes work. The fewer fingers involved, the more likely it is to work. But now, look at this. When I double-tap with two fingers, it takes about half a second to respond. Why? Because TalkBack still treats it like an angle gesture. Sure, VoiceOver has to process gestures too, but that happens far faster on iOS. If you try to swipe with three fingers, Android is more likely to see that as holding down three fingers instead. Now, I do like that we get a few more options, especially in the realm of text editing. And the whole need to switch from headings to default web navigation is gone, to the dismay of old-time Google fans who happen to be blind, but one pretty big interaction still isn't there. We can't freaking spellcheck!

Yep, no spellchecking for blind Android users, at least with TalkBack and GBoard. Sure, GBoard may autocorrect what you write, but if you're using the TalkBack Braille Keyboard, you're out of luck. Also, the novelty of new speech options kinda wore off. Google TTS is good, but still sounds lower quality than any Apple voice. Espeak has never appealed to me. Eloquence is good, but not even supported anymore, so new Android users will have to get it from somewhere besides the Play Store. Also, the way it says days of the week really shows Codefactory's inattention to regular expressions. I mean, if an unofficial NVDA addon can do this much better, having Eloquence not crash on finding a word that usually crashes it, without messing up normal words, then surely a company can do better. Also, Acapela sounds... really not so great compared to even Apple's Siri voices, and we get Vocalizer built in. Don't get me started on how great Alex is.

Now, Android is great for gaming... um, well, in the sense that I can easily get emulators from the Play Store, and JIT, a way for code to be run faster, is available for Android apps, but not for iOS apps outside of the App Store. So, PSP games? Yep, ran fine on Android. GameCube games? Ran fine on Android. Wii games? Ran fine on Android. To be fair, if Apple allowed it, these games would probably run even better on iPhone. But besides workarounds that don't even offer the full power of iOS, like JIT, you can't play these on iPhone.

Except there's no OCR, and no screen recognition. On Android, TalkBack has none of this. Even after Apple released their list of accessibility updates this year, which is what made me switch back to iPhone, Google did nothing. This, more than anything, shows me, at least, that Google couldn't care less about Android accessibility. Also, yeah, BrailleBack is still crap. No excuse. With Snapdragon chips being as powerful as they are now, there's also no excuse for TalkBack to not have even a rudimentary OCR system. How is it that a hardware and services company, Apple, can beat Google, an AI and software company, in accessibility? By actually putting resources into it. And, while accessibility on iOS isn't perfect, I'm writing this comment, using Braille Screen Input, on my iPhone XR. I couldn't even imagine doing that, with this level of precision and lack of stress from slow typing or phantom braille dots, on the Galaxy S20 FE. Oh yeah, typing on iPhone is like literal Investiture... magic... compared to Android's slow slog through the keyboard. One almost must have a Bluetooth keyboard in order to type quickly. That's pretty darn bad.

By Gabriel Oria on Friday, June 25, 2021 - 08:15

Nice work on this article. I use Android as my daily main phone, but I also have an iPhone. My Samsung Galaxy S20 FE works great with TalkBack, and I have enjoyed it so far. But there is one thing for sure: I am not ready to dump iOS completely as of yet. Android 11 works pretty well; they definitely made TalkBack better than it used to be.

By Gabriel Oria on Friday, June 25, 2021 - 08:15

You probably know this by now, but Google just implemented multi-touch gestures in newer versions of TalkBack. I do find they are quite flaky to use, though.

By Mister Kayne on Friday, June 25, 2021 - 08:15

I am a bit taken aback by the comment that said VoiceOver or Talk Back can be used right out of the box. I can agree with the former, but the latter is a mystery after I got my daughter an Android tablet. I was unable to turn Talk Back on after starting the device; I checked all the forums, which said to do a few gestures or press both volume keys together, and yet nothing happened.
It took the help of my sighted wife to turn Talk Back on, and I was surprised to find an option controlling whether Talk Back can be turned on from the lock screen, along with another weird option that really puts the "dis" in the ability of a blind user.
I would like to put my foot down on this ridiculous setting where the choice of using Talk Back is in the hands of a sighted person. Many times my daughter, who is 4, messes up the settings, and I can't get Talk Back started from the lock screen and have to cry out for help. It's easy to say "press both volume buttons to start Talk Back," but doing this over the lifetime of your device will end up damaging the device and these buttons, which are fragile for lack of a better word.
In conclusion, I am not at all motivated to pick up my daughter's tablet and play with it, even to learn things about Talk Back.
To others who think otherwise: go fish. I put in a lot of time trying to like the screen reader, but it failed to impress me, unlike the other applications from Google like YouTube, Maps, Pay, Gmail, etc.