Comments
Reading Appliance Displays
The best app I've seen for doing this is Seeing AI. If you use the Short Text setting, it uses the live video to capture text, so as you move the phone and it sees something different, or the display changes, Seeing AI should pick that up.
The problem is that if there are several items on the display, you might not know exactly where the camera is pointing or which element is being read. But give it a try; it might help in your situation.
--Pete
Reading appliance display with Seeing AI
Hi Peter,
I tried Seeing AI; it reads almost everything printed on the appliance, but it can't read the LCD display no matter what I do.
LCD/7 Segment Displays
Hi. Seeing AI is okay; however, it does not do a great job with text that is frequently updated on an LCD display, like what you'd find on an amplifier or thermostat.
It also does not read seven-segment displays, like those found on microwaves or clock radios, very well.
If you're looking for an app to do this, I'd recommend Aira or Be My Eyes, as a human can interpret what's happening much better than the software in your phone.
Aira and Be My Eyes
Yes, if Seeing AI can't do it for you, you might try Aira to get human assistance, as the previous responder suggested. The first 5 minutes of each Aira call are free.
Another alternative that provides human assistance for free is Be My Eyes. Fortunately, we have alternatives these days.
If you have a friend, relative, etc., you can even try apps like Skype, Zoom, or FaceTime.
Hope that helps.
--Pete
Nothing beats the human eye
I'm familiar with those volunteer services; they're very useful for those who live alone.
That's not the case with me; I have my wife, who can read the displays for me.
I just find it fascinating how machine vision and AI are still struggling with something that seems so simple.
But not so fast: this is more complicated than anyone thought, experts say.
Apps like Be My Eyes shouldn’t be the only solution
First, let me just say that I discovered the Be My Eyes app about a year ago, and it has proven extremely helpful in several situations. That said, I still see it as a fallback option for when no other existing adaptive technology will get the job done. Being able to access the displays of everyday household appliances seems like a basic daily living need that current low vision tech has yet to address. We should insist on more: more from the app makers, more from the designers of low vision tools, and more from the manufacturers of these products that aren't the least bit accessible. There's got to be a better solution than having somebody with vision read it for us.
I mean, machine learning and artificial intelligence are doing exponentially more complex and amazing things than reading LCD displays. Why the heck can't they get this figured out?
Regarding Luke's post
It is easy to say that it should be simple to do. If it were, I figure it would already have been done. Luke: got any ideas on how to accomplish the task? This isn't meant as a criticism, but I'm betting that developers would love good input.
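Not a developer answer, but for anyone curious, here is a rough sketch of how the seven-segment part of the problem is often approached: binarize the image so lit segments stand out, then check which of a digit's seven segment regions are on and look the pattern up. To be clear, this is only an illustration with made-up thresholds; it is not how Seeing AI or any of the other apps mentioned here actually work, and it assumes you already have a cropped, roughly upright photo of a single bright-on-dark digit.

# Illustrative sketch of seven-segment digit recognition.
# Assumptions: a grayscale crop of one digit, lit segments brighter
# than the background (use THRESH_BINARY_INV for dark-on-light displays).
# Region sizes and the 0.5 "lit" threshold are made-up values.
import cv2

# Lit-segment patterns (top, top-left, top-right, middle,
# bottom-left, bottom-right, bottom) mapped to digits.
SEGMENT_PATTERNS = {
    (1, 1, 1, 0, 1, 1, 1): 0,
    (0, 0, 1, 0, 0, 1, 0): 1,
    (1, 0, 1, 1, 1, 0, 1): 2,
    (1, 0, 1, 1, 0, 1, 1): 3,
    (0, 1, 1, 1, 0, 1, 0): 4,
    (1, 1, 0, 1, 0, 1, 1): 5,
    (1, 1, 0, 1, 1, 1, 1): 6,
    (1, 0, 1, 0, 0, 1, 0): 7,
    (1, 1, 1, 1, 1, 1, 1): 8,
    (1, 1, 1, 1, 0, 1, 1): 9,
}

def read_digit(gray):
    """Guess a single seven-segment digit from a grayscale crop."""
    # Binarize so lit segments become white and background black.
    _, thresh = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    h, w = thresh.shape
    dh, dw = int(h * 0.25), int(w * 0.3)  # rough segment thickness

    # Rectangles covering each of the seven segments: (x0, y0, x1, y1).
    segments = [
        (0, 0, w, dh),                                    # top
        (0, 0, dw, h // 2),                               # top-left
        (w - dw, 0, w, h // 2),                           # top-right
        (0, h // 2 - dh // 2, w, h // 2 + dh // 2),       # middle
        (0, h // 2, dw, h),                               # bottom-left
        (w - dw, h // 2, w, h),                           # bottom-right
        (0, h - dh, w, h),                                # bottom
    ]

    on = []
    for x0, y0, x1, y1 in segments:
        roi = thresh[y0:y1, x0:x1]
        # A segment counts as "lit" if enough of its area is white.
        on.append(1 if cv2.countNonZero(roi) / float(roi.size) > 0.5 else 0)

    return SEGMENT_PATTERNS.get(tuple(on))  # None if the pattern is unknown

Of course, the hard parts in a real app are finding the display in a live camera feed and coping with glare, viewing angle, and polarized LCD panels, which is probably a big part of why this still doesn't "just work" on our phones.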
Envision AI
Although it is still kind of hit and miss, I've sometimes had luck getting Envision AI to read my oven's display and my AC's temperature display.