Hi all,
I'm currently in the process of upgrading my iPhone 12 Pro and debating between regular 16 and 16 Pro. I got the 12 Pro because of the Light IR.
I'm just curious - for short text detection, do you prefer Seeing AI or the text detection built into the Pro iPhones?
I find Text Detection works almost the same as Google Lookout on Android, which I love because of how it reads (as in, it actually reads price tags in order). Seeing AI doesn't read price tags in order. I'm just frustrated that I have to spend more to get just one feature, but switching to Android would be too difficult at this point (parents & friends use iMessage and we use Find My, plus I love my Apple Watch and AirPods Pro 2).
I definitely need to upgrade because I need a better, longer-lasting battery. USB-C would also be nice to have, especially so my partner (Android user) and I can share a charger when travelling, plus I have an Apple gift card to use.
Comments
One man's opinion...
Hi Chamomile,
I have the text detection on my SE3. I seldom use it. While it does have a simple gesture to activate/deactivate it, I can get comparable functionality from Seeing AI and a shortcut or three.
. . .and the shortcuts are already built in to Seeing AI. We can also assign shortcuts to gestures, which makes shortcuts a blessing to work with in Seeing AI.
To answer your question: I would say the text detection is nice; it is quite responsive and easy to get to in a tight situation. However, I enjoy what Seeing AI offers over this new Recognition feature.
Perhaps I am a creature of habit, because Seeing AI ends up being my "go to" app. 🤷
These are, of course, just my own observations and opinions.
Your mileage may vary. 🙂👌
Thanks Brian
Thanks for sharing your thoughts, Brian :) I didn't know Text Detection was on non-Pro iPhones - I do wonder if Light IR increases accuracy?
I do tend to go for Seeing AI first, but it sometimes reads things as a garbled mess, and Text Detection does the same. I just wish it would be like Lookout and slow down and read things in order. Seriously, sometimes I swear Seeing AI & Text Detection reading garbled text is a representation of my mind lol. I'm sure I can create an activity to slow down the speech slightly when opening the app.
I do really like other features of Seeing AI - I was going through one of mum's cookbooks and scanning recipes, which worked great, and the World feature can be handy, which I believe also uses Light IR. And I'll have to look into the Shortcuts.
LiDAR
Light Detection and Ranging (LiDAR) is a distance-measuring technology. I believe it is used here for the door detection feature, which sadly is only available on Pro models. So if you wanted a reason for going Pro, that would be one, I suppose.
Not sure if there are any other pro-exclusive features with regards to recognition at this point in time. 😓
HTH.
Yeah
I don't use people or door detection at all. I hate having my phone out most of the time because I'd be juggling a cane, a Miniguide, and a phone. Unfortunately I'm not an octopus :P And they don't turn off in Detection Mode even though I thought I'd turned them off, which is frustrating.
In that case...
In that case you would do well with either the iPhone 16 standard or even the SE4 coming soon to a cellular provider near you. 😁👍
Since I installed iOS 18 I…
Since I installed iOS 18 I've started using text detection more, and I really like it. The way Live Recognition takes the detection options from the Magnifier app and makes them easier to get to is nice. The four-finger triple tap makes it easy to toggle on and off, and then the options for what you want to detect are along the bottom of your screen, with a stop button below them. Putting Live Recognition in the rotor would make for another quick way to get at what you need, though I haven't tried it yet.
There are six Live Recognition options; with a regular iPhone you get half of them -- text detection, scenes, and Point and Speak. The Pro also gives you people, door, and the new furniture detection, because those require LiDAR. I don't know if the LiDAR sensor would make the other options more accurate. To me, it wouldn't make sense. Someone who has both a standard and a Pro iPhone should do a comparison podcast. lol
Interesting
I'm not sure how often I'd use the other detection modes. The furniture detection sounds similar to World detection in Seeing AI, which also uses Light IR. I usually have a Miniguide with me, so that handles obstacle detection for me, but those other detection modes did come in handy when my Miniguide broke and I needed to order a new one. How does the scene detection in Live Recognition work?
LiDAR alternatives?
So this is a little outside of the topic, but I will try to stay within those boundaries. Recently, somebody on here posted about a new app similar to Be My Eyes, called "Speakaboo". I mention it because you can ask it a question like, "How far away is the table, chair, etc.?" and you will get a result something like, "The coffee table is approximately 1 m away," etc.
As I mentioned above, I have an iPhone SE3, and (as far as I know) do not have a LiDAR-capable device.
So, my question is: is this feature in the Speakaboo app just random, is it accurate, and do we really need LiDAR?
Thoughts?
Wow
I'll have to check that app out, that sounds really cool.
I've decided on the iPhone 16 (in pink, if anyone cares). I'm just worried because I don't think the battery life is as good as the Pros.
Speakaboo doesn't seem to work for me.
Whenever I try to take a photo in the Speakaboo app, it says something like "I'm a text-based AI, so I can't help you with images."
Has anyone faced this issue?
Keep in mind that LiDAR is…
Keep in mind that LiDAR is not only for distance measurement. LiDAR helps in instantly modelling the real world, including text, which should give you more precise output in text-detection mode.
My sister, for example, uses a non-Pro iPhone 14, and its text detection is significantly slower compared to my 15 Pro Max, which utilizes LiDAR.
On my end, the result is almost instant and very precise, whereas on my sister's iPhone 14, finding text takes a lot more time and effort.
Nevertheless, it was a great decision by Apple to bring text detection, Point and Speak, and so on to the non-Pro models as well.
detection features
Just to clarify: have text detection, Point and Speak, and scene detection been available on non-Pro iPhones all along and I've just assumed these features were Pro-only, or is this new in iOS 18? The reason I ask is that these features are available on my iPhone 14 running iOS 18. Bear in mind that this is an iPhone 14, not a 14 Pro or Pro Max.
I'm not totally sure, but I…
I'm not totally sure, but I think those features were added to non-Pro iPhones with the release of iOS 18.
iOS 18 feature
I am running an iPhone SE3 and did not have this in iOS 17.
Does not work on an SE 2020
Has anyone got this working on an iPhone SE 2020? When I do the gesture, nothing happens, and if I activate it in the rotor it says it has activated, but when I check again it still says it is off.
This might be a language-specific issue as well as I've seen reports from others in Sweden who are having the same problem, so it would be nice if someone here could clarify if it works in English.
Overall I'm a bit confused about this, since everything Apple writes about it on their support pages suggests that this is a LiDAR-only feature, but since you lot got it working on non-LiDAR phones, that does not seem to be the case.
I think those features would work much better with a phone with…
I have a 13 mini and they don't work all that well for me. Pretty useless, as a matter of fact, but if I had a Pro model, I'm pretty sure I could get a lot more out of them.