Anyone here use Point and Speak on Magnifier?

By Louise, 22 October, 2023

Forum: iOS and iPadOS

Hi all:

I like the 4-finger triple-tap to bring Magnifier up, but I can't seem to get the hang of the Point and Speak text recognition. I was hoping to use it for the microwave or the touch pad on my stove.
Would welcome any tips.


Comments

By Karina Velazquez on Tuesday, October 24, 2023 - 14:22

Hi, while reading your post, I decided to try it with the desk phone here in my office, which has a lot of buttons that I never use but could if I knew what they were for.
The results were something of a mess: no matter whether I choose the up or down option for the finger position, it reads whatever is nearby. For example, I know that a certain button is the one to activate/deactivate the speaker, but it says "SP phone," along with "mute" and "hold," which I infer are the labels for the buttons beside it.

I haven't used it with my microwave, but I will try it. Maybe it works better there, where the distance between buttons is larger.

Best regards.

By Louise on Tuesday, October 24, 2023 - 14:22

I tried it with my microwave, and it didn't work. It even said that the text was upside down a few times.
I tried it on paper, and it just doesn't work well. It's a shame, because I had hoped to use it for spreadsheets, menus and such.
Hopefully, it gets an upgrade soon. The concept is really cool.

BTW, I have an office phone with a huge display screen, and about a billion buttons. It sits unused all day while I use my cell. I really ought to dump it.

By Ekaj on Tuesday, September 24, 2024 - 14:22

I got a bit curious, as is often the case, so I decided to try out the Magnifier app on my iPhone 14, now running iOS 18. I opened Live Recognition and it told me to re-download Magnifier, which I did. I enabled the detection mode, which seems to include Point and Speak. I actually got double-speaking with VoiceOver speech on, which was kind of cool but a bit redundant, so I turned VO speech off with a 3-finger double-tap, and Magnifier read out a bunch of things at once. I know they were accurate because I've lived in this particular apartment since the end of 2022. I'm certainly going to have more fun with this, and hopefully I can get it to read more ingredients in my meals and other grocery items. It seems that Apple once again did not disappoint. It'd be cool if somebody could do an audio walkthrough comparing and contrasting all the scanning options that are available to us, including the third-party apps. I know that Thomas demonstrated some of it a while back on here, and he did a wonderful job as usual.

By Ollie on Tuesday, September 24, 2024 - 14:22

Nah, it's useless. Like everyone else, I think, I tried it, found it wanting, and never tried it again. A half-hearted concept Apple could use to virtue signal... Same goes for door detection. All very PR-friendly, pointless in practice.

By Manuel on Tuesday, September 24, 2024 - 14:22

I tried it out after I got my iPhone 15 Pro Max last October and found that it worked well. Maybe it's more precise with the LiDAR scanner. Also, you definitely need a bit of practice, which means you won't have the best experience at the very beginning.

By Ollie on Tuesday, September 24, 2024 - 14:22

What use cases? I'm on the iPhone 15 Pro, and even using it on a flat surface, pointing at symbols, it was erratic at best.

I know some familiarisation with these technologies is required, but this is so difficult to use that the steepness of the learning curve renders it pointless. Boom boom... Pointless! Get it?

By Brian Giles on Tuesday, September 24, 2024 - 14:22

Agreed. How is there no good documentation on how to actually make this feature work? Of course Apple demoed it under perfect conditions.

Interesting that the new Live Recognition in VO actually uses the Magnifier app. I also don't like that it uses its own speech instead of just VoiceOver, even though it seems to be using VO speech settings.

I wonder if point and speak is primarily meant for people with usable vision.

By Tyler on Tuesday, September 24, 2024 - 14:22

Member of the AppleVis Editorial Team

My understanding is that Point and Speak is intended mainly as an augmentative aid for those who have enough usable vision to know what they're pointing their finger at, since it seems you have to be pointing directly at something for it to be spoken.