Introducing New App: Object Voice

By E7 Company, 12 February, 2024


Hey everyone,

I'm thrilled to announce the launch of my latest iOS app, which I've been working on tirelessly for the past few months. This app performs object and animal detection using AI and machine learning.

https://apps.apple.com/es/app/object-voice/id6477760948

Object Voice can identify over 1000 different objects and animals simply by pointing your iPhone camera at them. What sets this app apart is its seamless integration of AI technology with the native capabilities of iOS. When the app detects an object in the camera view, it announces its name using the synthesized voice of your iPhone, providing an intuitive and accessible user experience.
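
For the technically curious: under the hood the idea is simply camera frame in, label out, speak the label with the system voice, all on-device. Below is a simplified Swift sketch of that flow; it is not my production code, and the "ObjectClassifier" model name is only a placeholder.

```swift
import Vision
import AVFoundation

// Simplified sketch of the frame-to-speech idea; "ObjectClassifier" is a
// placeholder for a bundled Core ML model, not the app's real class name.
final class FrameAnnouncer {
    private let synthesizer = AVSpeechSynthesizer()

    private lazy var request: VNCoreMLRequest = {
        let model = try! VNCoreMLModel(for: ObjectClassifier().model)
        return VNCoreMLRequest(model: model) { [weak self] request, _ in
            guard let best = (request.results as? [VNClassificationObservation])?.first,
                  best.confidence > 0.5 else { return }          // ignore weak guesses
            self?.speak(best.identifier)
        }
    }()

    // Called for each camera frame; everything runs on-device, so no internet is needed.
    func process(_ pixelBuffer: CVPixelBuffer) {
        try? VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:]).perform([request])
    }

    private func speak(_ label: String) {
        guard !synthesizer.isSpeaking else { return }            // don't talk over ourselves
        synthesizer.speak(AVSpeechUtterance(string: label))
    }
}
```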

One of the key features of Object Voice is its ability to work offline, making it perfect for use in areas with limited or no internet connectivity. Additionally, I've focused on making the app as user-friendly as possible, with a clean and intuitive interface that anyone can navigate with ease.

But that's not all – Object Voice also offers the option to detect objects from images stored on your device or iCloud, giving you even more flexibility and convenience.

I would love to receive feedback on the app so that I can continue to improve it. Your insights and suggestions are invaluable in helping me make the app even better.

Please feel free to share your thoughts and experiences with me—I'm eager to hear from you!

Thank you for your support!

Comments

By Brad on Thursday, February 8, 2024 - 21:04

I'll try the app but I honestly can't see the point of it. I mean where would you use it in a day to day situation?

We have apps like Be My Eyes and Seeing AI for pictures, and offline is nice for those who need it, but again I just don't see the point.

The only thing I can really think of is telling if a door is open or shut.

Sorry for being such a negative first comment. I'll try the app and let you know what I think.

By Brad on Thursday, February 8, 2024 - 21:04

I didn't know it was paid for and I'm sorry to say, I don't think blind people are going to use this app when we already have free versions that can do this but with more features.

If someone is willing to give it a try, it's £2 something on the app store in the UK, so probably about $3 something.

By Siobhan on Thursday, February 8, 2024 - 21:04

Seriously, I get that things aren't "for you," as you so eloquently put it. Can any app, paid or not, just get a "you're doing a good job, hope it works out"? Instead, it's "it would be better if...", inserting a plethora of suggestions which may or may not help the developer. Not everyone has to like something. It'd be nice to see positivity, though. Not just your comment, others on here. Myself included, I've tried not to jump to conclusions quickly. Guess I ask too much for someone to think before they speak. That's my mistake. :)

By Siobhan on Thursday, February 8, 2024 - 21:04

Hello. If you are the same person who has been behind Color Voice and, I think, one or two more I can't remember, I have one suggestion. It's a personal annoyance of mine to have something like "Voice" in an app title. Not because I am trying to be discouraging; because we use VoiceOver in general, I guess it's just not something I'd put in. Again, this is a personal annoyance, nothing to worry about. Having said that, I have one more question. You state you can detect over a thousand objects. Will updates increase this amount? Or is there a way, maybe when online, for the user to check for new objects within the app? I'm not talking about updating the app from the store. I'm suggesting something like a notification for, say, another thousand objects every month or so. It might also be helpful for the person to hear the categories, say birds, reptiles, household objects. I have not used your app, so these are just off the top of my head.

By Dennis Long on Thursday, February 8, 2024 - 21:04

$3 is a bit high when there are free options that will do a lot of the same thing. I was going to try it until I saw the price.

By Cobbler on Thursday, February 8, 2024 - 21:04

I can only imagine how disheartening it must feel as an app developer to pour months into creating something to help people, only to have it met with negativity and complaints about pricing. Developing any app takes an incredible amount of time, effort, and resources - it is far from easy or free.

While I understand the desire for free apps, we have to realize that developers are people too who need to support themselves. The one-time $3 price seems more than reasonable for an assistive app that offers to help its users in their daily lives. It's hard to imagine that the developer is going to become rich off the back of it.

I find it so frustrating when our community immediately meets new apps and services with negativity rather than considering how they might offer value or interest to others. It's perfectly reasonable that an app doesn't offer anything to you personally, but having choice and options in our toolboxes is no bad thing. Supporting developers allows them to continue creating apps that only add to that choice.

E7 Company, congratulations on the release of your new app! 🥳

By E7 Company on Thursday, February 8, 2024 - 21:04

Thank you for all the feedback; there are no good or bad comments, it's all feedback, and I understand and accept it.
I realize the app is paid, which may deter some from trying it, but I believe this monetization method is preferable to using subscriptions or in-app ads, which can be more intrusive.
Also, I understand that if the app doesn't offer you anything, there's no need to try it.
I just wanted to share that I've created this app, and it's another option to consider in the store.
I know there are many similar apps, which is why I seek opinions to improve it.
I've done the same with my previous apps (QR Voice, Color Voice, and OCR Voice), and I greatly value all contributions. Thanks to user feedback, I've been able to update and add improvements.
One of the ideas for Object Voice is to continue training the AI with new objects. I can't promise 1000 more every month, but I'll update it as much as possible.
As for adding "Voice" to my apps, it wasn't my initial idea, but the name I had in mind was already in use, so I had to use this alternative word.
Thanks again!

By Holger Fiallo on Thursday, February 8, 2024 - 21:04

Nice job. Please continue to do what you want regarding this app. Sad that there are some who want things free and complain all the time. As a blind person I say thanks for taking the time to create it. I might or might not get it. I do not need it, but I want to say thanks.

By Jimmy V on Thursday, February 8, 2024 - 21:04

Thanks for this app. The fact that it works offline is going to be of help to a lot of people.
Keep up the good work!

By Erick on Thursday, February 8, 2024 - 21:04

I tried the app, and while I do thank you for adding offline accessibility, it is extremely inaccurate. So thanks, but as some people have said, we have other applications like Be My AI.

By Brad on Thursday, February 15, 2024 - 21:04

App devs need to research what's out there before reinventing something.

I do think writing that it's a paid app would help those who read this post decide if they want it or not.

I said I was going to try it because I thought it was free, I won't make that mistake again because now I feel bad that I can't give my honest feedback on the app itself.

By Brad on Thursday, February 15, 2024 - 21:04

When you say it recognises objects, what does that mean? Does it mean if I hold a box of something it will read the barcode, or will it tell me it's a box of whatever it is?

I'm sorry for being so negative but I really can't see a situation in which a blind/VI person might use this, could you give me an example?

The only one I can think of is if you drop your keys but then we have BeMyEyes.

This is why I say research is so so important before making apps that devs think we might need.

By Brad on Thursday, February 15, 2024 - 21:04

Siriusly? You want to bring my writing style into this? Would you have preferred that I write: dear sir/madam,

I regret to inform you that I will not be downloading your application from the application store. There are a plethora of applications that already perform this, and if not this, then a very similar function.

hoping you are well, Brad.

Well guess what, we're not in that time period anymore; just because I'm English does not mean I have to write like I'm Jane Austen.

By Jason on Thursday, February 15, 2024 - 21:04

E7 Company, great work! I have some feedback, but let me start by telling others here about your app.

I think this app is excellent, practical for daily use, and well thought out! Let me tell you why.

Object Voice offers real-time recognition. Apps like Be My Eyes, ChatGPT, and Envision AI, among many others, have a huge Achilles heel: they all need a stable internet connection. And unlike Object Voice's only requirement (aim the camera), the rest require user interaction too. They all require the user to line up a shot, take the picture, then they require a live internet connection to upload the picture, wait for it to be processed, then download and speak whatever was generated.

Combined, these steps take time. My personal best is 15 seconds. If you want to be inconspicuous, more power to you; that's challenging, though. With those apps, you would stick out like an even bigger sore thumb than you do with your cane or guide dog.

Though I don't mind asking for help when I need it, being able to do things for myself is always awesome! There are many things I'm told I can't do, so doing those things safely and independently always brings out my inner kid. Insert Bronx cheer.

With this app, I can see myself walking into an office, locating the counter, doing my business, locating a seat, locating the door, you get the idea. I get all this with no interaction from me. When I hold my phone in one hand and my cane in the other, I can focus on the journey, not the fiddling details and follow up questions.

Object Voice is valuable to me in so many ways. I can use it to help me find the door to nature's phone booth on a plane, find seats and doors on subways, even find my camp chair while I'm so far in the sticks that radio waves fear to follow.

Then there are emergencies. Hurricanes and tornados tend to take down cell towers, fires and blackouts disconnect power lines, and you just can't find your ... whatever important thing. I trust I've made my point, the Object Voice app is perfect for any situation where you need to find something, you may or may not have internet, and can't have, or don't want, human assistance until trying for yourself.

E7, I have some feedback for you, as you asked, but let's tackle the elephant in the room: things people are saying.

I respect all of you, we have faced many of the same challenges, and have each faced challenges that the rest of us can't imagine. Please remember that I can respect you and dislike your comment at the same time.

Respectfully then, I've read some comments on this particular thread that absolutely disgust me. The entitlement some comments show is incredible, and shows a lot of immaturity, a few chips on shoulders, and little to no empathy. There are two in particular I'd like to respond to.

I'd like to respond to this comment first: "To many products, I don't know anyting about this one, but others, are dreampt up by some sighted person who thinks he/she has solved blindness and wil be our sighted saviour."

First, how do you know if this person is sighted or not?

Second, the app has been presented as a tool, not salvation. No praying or idols here. The developer made this clear when they "asked the blind community" what we want. Such humility is hard to come by nowadays. Put yourself in E7's shoes and ask yourself this. If this is how this community treated me, why not find somewhere else that appreciates my time?

Third, things that "some sighted person drempt up" opened the world for the VI community. Here are two examples. Thomas Edison's idea for recording human speech led to talking books. The "father of speech synthesis" was in fact sighted. When sighted people began using "micro computers," his efforts meant we had an equal chance to enjoy the "digital revolution!"

Comments like these are not only rude, but needlessly hurtful. We are capable of so much more. Even if we don't like the food, let's not bite the hand that offers it, maybe next time it will offer something we do like.

Finally, as the comment says, "I don't know anyting about this" app. I respectfully submit that conversations like this belong in another post or comment. Someone respectfully and humbly asked for help. If you are visually impaired, no doubt you have asked for help. Instead of offering help in the same way each of us would like to be helped, we roasted the person who needed, and deserved, kindness and assistance. Whether the app was good or bad, this person put their heart into it, hoping it would be helpful. Once we heard about it, we all rushed to tell E7 Company how ugly their baby is. Make no mistake, when you put your heart into something for months, it feels like it's your baby, and when you release it, it feels like taking the kiddo to school the first day and hearing gossip about how terribly you dressed them. We can respect a person's time, respect the app, and still not like it. We can be tactful in our responses.

I have only one more thing to say. I've read several responses saying the price was too high. It is $2.99, about half the price of a cup of coffee. Unless you go to McDonalds, in which case it's about the same price. I read several comments saying: "I was going to try it but it costs too much." Really? First, Apple has a refund policy, you can get your money back if you don't like the app. Second, a measure of additional independence is worth missing at least one cup of Joe to me.

E7 Company, keep rocking it! You did a very good job, this app does fill a need for me, and thus for others who may be in similar circumstances or situations, and I thank you for making it. I was happy to pay $3. Blindness is a niche market, you could've created a flappy bird remix and made more money, but you chose to focus on helping others instead. Thank you.

There are a few improvements I'd like to suggest, these are free ideas I'm voluntarily sharing, so take them, burn them, or otherwise do with them what you will.

I don't know if something is visible in the window other than the "Open Document" and "Info" buttons, but there isn't anything that VoiceOver can detect. I suggest you put a label, maybe a status message or a pause/play button, something we can easily find, on the screen. When the app starts, VoiceOver (VO from now on) took me to the status bar where the time and battery info is. That means there is no way for people who aren't used to finding controls by touching their screen to navigate to the app. Well, there is a way, but it involves the VO rotor, which some newbies struggle with. Giving us a big area we can touch, even if it's invisible, enables non-touchers to get access by tapping anywhere on the screen. Setting focus to this control whenever the app starts or the user returns to this screen would also be awesome.
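
In case a concrete example helps, here is a rough Swift sketch of what I mean; the class name, label text, and layout are made up purely for illustration.

```swift
import UIKit

// Sketch only: expose one large, easy-to-find accessibility element and move
// VoiceOver focus to it whenever the screen appears.
final class DetectionViewController: UIViewController {
    private let statusLabel = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        statusLabel.text = "Detecting objects"
        statusLabel.isAccessibilityElement = true
        statusLabel.accessibilityTraits = .updatesFrequently  // hints that its text changes
        statusLabel.frame = view.bounds                        // big touch target, easy to find
        view.addSubview(statusLabel)
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Move VoiceOver focus off the status bar and onto the app's content.
        UIAccessibility.post(notification: .screenChanged, argument: statusLabel)
    }
}
```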

The design with the two buttons on the top is fine, but it would be easier to find if they were tabs along the bottom of the phone. Several apps do this, especially VI apps, so we naturally start there. I'd suggest: Now, Photos, Help, and Settings. Now for the current main screen of the app. Photos for opening photos. Help, some basic documentation. One very accessible way to show docs is to use a WebView. Settings, well, see below.

The app uses the default speed, voice, volume, and pitch when it speaks. Many users, including myself, don't know where to change that in iOS, so it can be frustrating to listen to. Daily TTS users often speed up the voice output, allowing time for more information. I suggest either a button that opens the correct area in the iOS Settings app, or controls so the user can customize the voice for your app. I have a voice I use for VoiceOver, and I like to assign a different voice to apps so I know which one is talking. If you go this route, please save the settings in iCloud, but not in a way that changes all devices. Once the bigger issues are resolved, a feature that allows the user to load settings from iCloud for another device would be cool. Note that even if the user loads the settings from another device, they should not be synced, but saved separately.
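
Again, purely as an illustration of the idea, something like the sketch below; the key names and the per-device storage scheme are only examples, not a claim about how your app stores anything.

```swift
import Foundation
import AVFoundation

// Sketch: per-app voice settings, stored per device so loading another
// device's values stays an explicit, separate action rather than a sync.
struct VoiceSettings: Codable {
    var rate: Float = AVSpeechUtteranceDefaultSpeechRate
    var pitch: Float = 1.0
    var voiceIdentifier: String? = nil
}

func utterance(for text: String, settings: VoiceSettings) -> AVSpeechUtterance {
    let utterance = AVSpeechUtterance(string: text)
    utterance.rate = settings.rate
    utterance.pitchMultiplier = settings.pitch
    if let id = settings.voiceIdentifier {
        utterance.voice = AVSpeechSynthesisVoice(identifier: id)
    }
    return utterance
}

func save(_ settings: VoiceSettings, deviceID: String) {
    // One key per device in the iCloud key-value store, so nothing is overwritten globally.
    if let data = try? JSONEncoder().encode(settings) {
        NSUbiquitousKeyValueStore.default.set(data, forKey: "voiceSettings.\(deviceID)")
    }
}
```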

Just two more ideas. First, I couldn't get the app to identify people. That would be very useful. I don't mean recognizing who a person is, but recognizing a human. Why?

If I'm walking to a chair the app has identified, I'd hate to be called a pervert for sitting on someone's lap. There is quite a difference between: "chair," "empty chair", and "person in a chair." I know this from first hand embarrassment ... umm I meant experience.
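
For what it's worth, iOS can already do a generic "is there a human in the frame" check on-device with the Vision framework. Here is a sketch of one possible approach, not a claim about how your app is built:

```swift
import Vision

// Sketch only: count generic "human" detections in a frame so the app could
// distinguish "empty chair" from "person in a chair".
func countPeople(in pixelBuffer: CVPixelBuffer, completion: @escaping (Int) -> Void) {
    let request = VNDetectHumanRectanglesRequest { request, _ in
        completion(request.results?.count ?? 0)
    }
    try? VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:]).perform([request])
}
```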

Finally, I suggest giving us a way to pause the app. Continuous speech is awesome while we need it, but may become overwhelming when we don't. It does for me. Locking the screen works, but the button idea would be more elegant.

Many apps respond to the "magic tap." That's two fingers double tapping the screen. Choosing a different tab should pause also. I noticed that the app kept speaking when I brought up the "Info" screen. Thank you again, E7! Whoever is still reading after all that text, thank you too.
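
The magic tap is also easy to hook up on the developer side; roughly something like this sketch, with the class name and the detectionPaused flag made up for illustration:

```swift
import UIKit

// Sketch: VoiceOver's two-finger double tap calls accessibilityPerformMagicTap()
// on the focused element or up the responder chain, so the main screen can
// toggle detection with it.
final class DetectionScreen: UIViewController {
    private var detectionPaused = false

    override func accessibilityPerformMagicTap() -> Bool {
        detectionPaused.toggle()
        UIAccessibility.post(notification: .announcement,
                             argument: detectionPaused ? "Detection paused" : "Detection resumed")
        return true   // true tells VoiceOver the gesture was handled
    }
}
```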

By Andy Lane on Thursday, February 15, 2024 - 21:04

So much ugliness in this thread. We can, should, and must do better than this. Someone has spent their time and effort on designing an offline, inexpensive app to try to solve problems for our community, and this is how we thank them. The pricing is very affordable; the app may or may not offer genuine advantages, but let's find out and provide feedback. I'm frankly horrified by the vitriol our community has shown toward someone's hard work. Absolutely disgusting behaviour. If people treat hard work and consideration like this, then we don't deserve to have people trying to improve our lives.

By mr grieves on Thursday, February 15, 2024 - 21:04

Thank you so much for posting that. I've been reading the comments on this thread with a lot of discomfort. A similar thing happened on the TypeAhead thread.

Your description of the app was great as it makes me understand what it is about and why it might be useful compared to the competition. I don't see why devs should have to give away apps for nothing and the cost of this is really nothing much. So I'll be happy to give it a try when I have a moment.

I love Applevis and always have a good experience here, but I don't see how this negativity is helpful. I think it is reasonable to ask why the app is going to be better than the free alternatives, and Jason has kindly and eloquently provided that answer. But if it's not of interest, no one is forcing you to buy it.

As someone still struggling to come to terms with my sight loss, I realise that there can be some friction between the sighted and the blind worlds. Not all help is appreciated and it can be a bit of a sensitive subject. But this isn't the same as someone grabbing you and pushing you about without asking. Regardless of the motives for developing it, I don't really see this as something being thrust on us. If you don't like it, feel free to ignore it. But I struggle to think of any way that the existence of this app can be a bad thing.

So, thank you dev - I will give it a try and post on here if I have any feedback. I do agree about person detection. I'm always worried about sitting on someone's lap in a waiting room. Usually I just make a joke of it but I'd rather be able to figure it out myself.

Given the app is using AI, how are the hallucinations? I always think back to one experience I had with Be My AI, which told me where something was, so I tore the room apart trying to find it, and it had just invented it. I'd hope that maybe this is a bit more reliable. (I should say that I do love Be My AI - I just needed to change my expectations a little.)

By Holger Fiallo on Thursday, February 15, 2024 - 21:04

Makes me feel like some are entitled. Blind people, what can you say; no social skills or an inability to provide respectful feedback. Now do not jump on me for my view. This is what I observed here. Thanks.

By Brad on Thursday, February 15, 2024 - 21:04

I honestly must admit, I completely forgot about Apple's refund policy. I'm going to give this a shot and give you guys my feedback.

By Ash Rein on Thursday, February 15, 2024 - 21:04

I don’t think people are realizing that something like this can be used with Vision Pro. Instead of taking pictures, we’re just looking around and it’s telling us what it sees. I did buy the app myself and it works extremely well. It’s very accurate and provides the information in a very concise way.

Definitely an app worth having when Vision Pro or other smart glasses become more prevalent. Glad that I got to buy this. I hope the development continues and it can be part of my overall tool kit. Live description apps are what we need.

By David Goodwin on Thursday, February 15, 2024 - 21:04

I want to thank everyone who has thoughtfully pointed out reasons why we as a community should be receptive when developers share new apps, even if those apps don't interest us personally. It is so important that we create a positive and welcoming environment that encourages constructive dialogue.

In particular, I appreciated Jason sharing his personal experience using the app and providing feedback on future improvements – that is immensely valuable for potential users and for the developer and demonstrates constructive dialogue at its best. Even if an app does not suit someone's exact needs today, getting insight and ideas from this community undoubtedly can help shape better versions for tomorrow.

By Brad on Thursday, February 15, 2024 - 21:04

I did buy the app and got a refund but I honestly completely forgot that refunding was an option.

I shouldn't have been so negative in my comments, and I should have tried the app first.

Unfortunately, I found that the detection was bad. I pointed it at a door and it told me it was a chest of drawers, a quill, and other things (I made sure to remove the stuff on the door first), so for me, at the moment, the description isn't good.

It might have been because I didn't turn on the light, as I don't do that often. It did recognise my quilt, though I wasn't pointing at it fully.

By mr grieves on Thursday, February 15, 2024 - 21:04

I've given it a quick go and so far I'm not quite as convinced as the others.

I started out pointing it vaguely at my Mac and it said "library" a few times.

I then moved it around the room and it said bookshelf and couch (well, sofa, but I'll let it off :)), radiator and sliding door. I got "studio" a lot.

It kept saying shoe store and shoe shop a lot for some reason. Not sure why - as far as I know there are no shoes about.

And vacuum cleaner, which I don't have in this room. But it's convinced there is one in here.

It got very excited about my desk and kept repeating it over and over again. It ignored everything on the desk, but at least it found it, and my chair.

I tried again and it told me there was a taxi cab in my office. And a matchstick apparently.

As I put my phone down it said "Projectile, missile!" which was a little alarming.

The other thing I noticed is that it seemed to go quiet a bit - maybe it just couldn't decide which bit of clutter in my room to comment on or maybe it had nodded off, not sure. But there's plenty of stuff about.

So at the far side of the room I pointed it at the dog bed. I was curious to see if it would tell me if my dog was in it or not.

It said "sliding door", then "forklift". I think it has a particular problem if you move the phone quickly - it will just seemingly blurt out random words. It said mouse trap at my desk.

I don't want to be negative, particularly after my last post, but so far it has been pretty unable to tell me about any object I wanted to know about and is hallucinating like mad.

Maybe as a tool it works better if you have a specific task for it to perform and you do it very very carefully. So maybe just messing about in my office isn't a realistic test, but so far I think it needs an awful lot of work to be useful.

By E7 Company on Thursday, February 15, 2024 - 21:04

Thank you for all the feedback received for my app Object Voice.

Every contribution is incredibly valuable and appreciated on my end, as it helps me improve the app. I understand that I can't know everything that users would like from the app, which is why I've created this post to gather feedback from the community and enhance the app.

A special thank you to Jason for his opinions, motivation, and contributions; they make all the work worthwhile. Regarding the feedback, there are several points I'll address for the next version. Firstly, the interface—I acknowledge that it's not the best and I need to better adapt it for VoiceOver. The idea of tabs at the bottom of the app seems fitting, and I regret not thinking of it earlier.

Additionally, apart from the current buttons, I'll add two more: one to stop object detection and another to directly open the device settings to select the voice and other attributes. Regarding the button to stop and restart object detection, I'm considering whether a button or tapping directly on the screen is better. What do you think?

Apart from that, as I mentioned in my previous response, I need to improve the AI by training it to detect more objects. I know that sometimes, due to lack of light or sudden camera movements, errors in detection can occur. It's not an exact science, but I'll strive to improve it.

I believe it's a great idea to start with detecting people, and I'll aim to have it available in the next version. However, training machine learning models is quite costly.

Furthermore, one idea I'm considering is the ability to test the app with Apple Vision Pro, but unfortunately, it's currently not feasible for me.

Again, thank you for all the contributions.

For me, there's no good or bad feedback—I understand and respect any comments.

That's why I created this post; my goal is to keep improving the apps, and every comment, contribution, or idea helps me immensely.

So, thank you for everything!!!

By Brad on Thursday, February 15, 2024 - 21:04

Can you tell me how you imagine us using the app? I mean what would be your most ideal scenario?

By Morgan Watkins on Thursday, February 15, 2024 - 21:04

Thanks for the work that went into Object Voice! It's been a very long time since I personally wrote production software, but I absolutely understand how much effort has to go into any software product. Thanks for your efforts.

I decided it was interesting enough that it was worth spending the $3 to find out if it might prove useful to me. I happen to be totally blind and I was curious to see if it might do some extra work for me.

So, my initial impressions are a bit shallow right now. I am reacting after having played with it for only a few minutes, in between cups of morning coffee, so I will likely experiment with Object Voice a little later and hopefully offer more useful feedback.

First, Object Voice was easy to find on the App Store, easy to purchase, and easy to open. Object Voice began to speak right away with what it believed I was pointing my iPhone at. Now, I was sitting in my comfy chair in my living room and it first identified it as a shoe store. I might have had my shoe in the picture, but I think that is the only shoe it might have seen. When I pointed it toward my fireplace, it did identify the fireplace and the fireplace screen. That was nice. When I likely caught a bookcase in the frame, it said "library" at first, but then did say "bookcase." When I pointed the camera at the ceiling, it first said something like "bi-plane," but then correctly identified it as a ceiling fan. When I pointed the camera at my wrist, it did not immediately identify the object as my Apple Watch. I tried moving my wrist in the frame and it did not say a thing. However, when I flipped my wrist over, it said "band-aid." When I pointed the camera at the little table next to my chair, Object Voice made a lot of guesses as to what was sitting on the table, but I don't remember any of the guesses as being correct. Even so, it might have done better if I had worked on focusing the camera on singular objects. I suspect there is a learning curve required for me, and perhaps for the code.

Again, understanding this was only a first impression of the app, I did feel like it just kept talking, after I had already heard what it had to say. But, I appreciate what you are trying to do and I will give it more time to see how it might work for me. And, I hope you are very successful in making it a truly useful addition to my personal toolkit.

Thanks again for all your time and best wishes,

Morgan

By Holger Fiallo on Thursday, February 15, 2024 - 21:04

I think with time and more work on it by the developer it will get better. I hope so. It's nice that the app is there to be tested.

By Rusty Perez on Thursday, February 15, 2024 - 21:04

I have not purchased this app but I just want to applaud you for your work and for recognizing the fact that an app that works in real-time is important.
One of the things I like about Seeing AI is how it works in real-time reading text around me and indicating if the edges of a document are in view.
So going forward, developers need to incorporate this method.
About the app being discussed.
I know these are early days of AI, but having an individual app for recognizing objects, and another one to recognize colors, etc., is just so tedious.
In fact I have the same comment about Seeing AI though it is easy enough to flip the channel and put them in order of importance.

I know, what I really want is for the app to intuit what I want when I want it, and that's not possible yet. So I'm just sharing a little aspirational thinking right now.

Again, I applaud you for your work! I'm sure it will find its users and your work will help improve AI tech.

By Missy Hoppe on Thursday, February 15, 2024 - 21:04

First of all, I'd like to thank the developer of this app for his time and effort. Even if the app isn't something that everyone out here wants to use, which is understandable, a lot of time and effort went into it, and at the very least, that deserves recognition and appreciation. I've always been taught to go by the old saying "if you can't say something nice, don't say anything at all." Sometimes, it's much easier said than done, and God knows I've messed up more times than I'd care to count, but at the same time, to the best of my knowledge, I've never knowingly criticized or tried to diminish the value of someone's hard work.
I couldn't even bear to read all of the hurtful comments directed towards this app developer. This is why the blind community can't have nice things, or, if we do get nice things, they eventually end up going away because of complaints, whether they're legitimate or not. It always makes me sad and ashamed when I come on here and find negative, sometimes even unnecessarily rude comments. If that kind of negativity makes me uncomfortable, I can't even imagine how discouraging they are to any developers who are trying to make helpful apps. In fact, I felt so bad reading some of these comments that I just purchased all of this developer's apps. The way I see it, I can never have too many vision assistant apps. Sure, I mostly use seeing AI, but I'd be willing to bet that I'll find uses for these new apps as well. At the very least, I'm looking forward to checking them out. I'm going to go hide under my rock now.

By Holger Fiallo on Thursday, February 15, 2024 - 21:04

Well put.

By Louise on Thursday, February 15, 2024 - 21:04

Well, I might just be contrary, but after reading all the negative comments, and the complaining that someone might want to get paid for their hard work, I decided to buy the app. I thought it would be nice to have an app where, if I'm walking into a building, I could point my phone around discreetly and spot the counter, elevator, etc.

Unfortunately, I did find that it wasn't accurate. It called my foot a sleeping bag, my coat tree a quilt, and my dog a puff. That being said, I'm going to keep the app, so the developer gets a chance to improve it.

The developer asked about what people might like for pausing object recognition. I personally like a nice big button just above the bottom tabs. Another thought would be if it's possible to assign a Back Tap command, but I don't know if that works.

My other suggestion would be to see if it's possible for users to teach the app objects. It could have a button that says "what's this" and I could tell it what that object is. It would be fun if it would remember custom objects for just my device.

I could show it my dog, and rather than say black lab, it would say whatever I tell it. That way, I could point my phone around my office, and hear Jess when it spots her.

Interestingly, I think it would also be cool if it had a search mode. I could go into the list of objects I taught it, and select the one I want to find. In this mode, it would ignore all but the thing I'm looking for.

I don't know if any of these suggestions are possible, but if they help, I'm glad to share ideas.

By the way, someone made a comment earlier about differentiating between empty chairs and occupied ones. If only I could tweak my guide dog's coding. LOL, she just assumes my only goal in life is to meet new people. . .

By Brad on Thursday, February 15, 2024 - 21:04

Reinventing the wheel isn't worth it if you don't know what the community you're reinventing something for wants.

Maybe this app will improve, that's great, but we have a right to say that we've already got X, so why do we need Y when X works just as well if not better?

People say comments I and some others have made are disgusting or hurtful, but that's from your point of view; the dev actually seems to want these types of comments to see where they can improve, which is a great sign.

I'd rather be critical of a thing than just praise people constantly for making something I, as a blind person, don't need.

By mr grieves on Thursday, February 15, 2024 - 21:04

This whole conversation has really stirred up a lot of contradictory feelings in me.

I agree with a lot of what Charlotte is saying, but not necessarily how it was said.

I find it a little odd when apps like this come along with no particular use cases and no stated motives for making them. We've no idea what the background of the dev is, and whether they have been toiling away with a strong desire to do good for us, or were just messing about with AI because it's new and fun and couldn't think of what else to do with it.

However, I don't think it necessarily matters. At the end of the day, if I got an app that made my life better then I don't really care how it came about.

I'm a developer myself and often I am given no requirement and asked to build a feature that I know nothing about. So what I do is invent the use cases and requirements and try to build the simplest thing that can satisfy it.

Actually giving someone the wrong thing can help enormously with their ability to tell you what they actually want. And I think if you accept that then it's fine. I think that's what agile is all about - fail quickly and iterate.

There are a number of difficulties with an app like this.

Firstly, AI is notoriously unreliable. If I can't trust what the app is saying, what use is it? This is in contrast to Be My AI, where I know not to trust it but it still adds a lot of value for me.

I hate to say it but after I used the app it was laughably bad. If it is telling me that there is a black cab, forklift truck and shoe shop in my little room and that missiles are raining down on me, how can I really trust what it says?

Secondly, it is going to be hard for the app to know what I am actually wanting to find. So it was happy to tell me over and over again about my desk, but not what was on it. Maybe I am looking for the desk in a reception or something, or maybe I'm trying to locate where I left something. If it spoke out every single object in my room then it would never stop.

So I think for this app to work the AI needs to be a hell of a lot better and maybe there needs to be a way for me to try to tell it what kind of information I am wanting. I'm not really sure exactly how best it would do it. I think it might be good to be able to look for a specific thing, and maybe have a list of favourites for things I tend to look for. Or maybe some sort of categorisation so you can give it a context - am I out and about just trying to navigate an indoor space, or am I looking for something.

Maybe it needs a way to drill down - so it can tell me desk, table, book shelf, and I can ask it what is on them. Or a slider for the level of detail - so either it's set at the level where I just get desk, bookshelf etc, or I can slide it down and it will just describe the smaller objects.

But the app should decide what it wants to be. What problems is it solving? It's also OK if it just wants to specialise and become just a way to navigate indoor spaces.

I do wonder if AI is good enough to solve these problems right now, but would be very, very happy to be proven wrong.

Regarding this being an advert - I guess it is and I don't know about applevis policy, but how else am I going to hear about it? And if we can have a polite and constructive discussion about it here then I personally am fine with it.

Going back to the negativity, I think it is healthy and helpful for us to question where an app comes from and what expectations the developer has as long as we can keep it polite and constructive. I think most comments on here could be justified if they were worded a little more tactfully.

By Morgan Watkins on Thursday, February 15, 2024 - 21:04

Hello, again,

I must admit that I was intrigued by one of the suggestions. I, too, would find it handy to be able to tell the app what I was looking for and then wait for it to spot it as I panned the environment. For instance, my keys, my dog's harness, my wife's cell phone, a package on my porch, my favorite coffee cup, a leash, that darn bottle of Tylenol, and my mandolin case in a hotel room.

I have had a chance to experiment with the app, and the number of incorrect guesses does make it a bit too chatty with incorrect information. But, I choose to be hopeful. I hope Object Voice becomes more accurate and does less rapid-fire guessing. And, I would love the two-finger double-tap to silence it when I need to pay attention to something else, but still have the app waiting to be restarted with a repeat of the same gesture.

And, absolutely yes, it would be handy to know if there were people in a room, or my dog. Admittedly, I have not really tested that yet, but I did deeply appreciate the request that the app identify whether a chair was empty or occupied. I've only sat on someone else's lap once, close to fifty years ago, and I am still embarrassed by it. (I left the theatre.)

Best wishes,

Morgan

By Panais on Thursday, February 15, 2024 - 21:04

This last comment regarding sitting on someone’s lap was hilarious. Best thing I’ve read on this site I think.

By Dave Nason on Thursday, February 15, 2024 - 21:04

Member of the AppleVis Editorial Team

First up, thanks to the dev for trying to create an app that could be of use, and for reaching out about it here. Of course, nobody is obliged to buy it if they don't want to. It's perfectly fine to question its usefulness and make criticisms too as long as we're nice and polite about it.
I agree that developers, and indeed anyone planning to invest their time and or money in something, should do their research first. I would not though jump to the conclusion that this dev did not do so. They do appear to have a USP in mind at least.
That said, from what I've read, I'm struggling to see what the app would offer me. As well as the above mentioned apps like Seeing AI and Be My Eyes, the Camera app on the iPhone already does a pretty decent job of live object recognition. Simply open the Camera on your iPhone or iPad and put VoiceOver focus in the view finder window. It will speak the items it sees. Based on what I've read here so far, this app is not improving on that, and in fact may be worse as it's hallucinating. Admittedly I haven't tried it yet though.
Apple don't call it AI, as, let's face it, that's a bit of a buzzword at the moment, but it is machine learning and I suspect we could see it continue to improve. It also works offline. The Magnifier app of course also incorporates these kinds of features.
I'd agree that being able to specify an object that you are looking for could be extremely useful. Am I mistaken or does EnVision AI have a feature like this?
On the pricing question, we can't have it every way. When developers come to us with a subscription model, people complain and say they prefer to just buy apps outright. When this developer does that, and at a very reasonable price, people complain that it's too much. It's less than half a pint of beer where I am. A free trial would be nice, but unless I'm mistaken, Apple don't allow free trials with apps that are purchased outright.
I wish the developer well however, and would be more than happy to find I want this app after all.

By E7 Company on Thursday, February 15, 2024 - 21:04

Hi everyone,

I'm excited to announce that a new version of the Object Voice app has been uploaded to the store!
This update brings a host of improvements based on your feedback and suggestions.

https://apps.apple.com/es/app/object-voice/id6477760948

In this new version, I've focused on enhancing the user interface to provide better compatibility with VoiceOver. I've also introduced a convenient button for pausing and resuming object detection, and you can now also initiate this action by double-tapping the screen. Furthermore, there's now direct access to language settings for voice preferences.

I want to extend my sincere gratitude for all the contributions and feedback I've received. Your input is invaluable in driving the evolution of the app. Looking ahead, I'm committed to continuing to improve the app's AI, with plans to enhance person detection and potentially introduce a feature for filtering objects by search criteria. I'm also eager to hear your impressions and opinions on the new UI.

As always, I welcome any new ideas or suggestions for further improvements. Together, we can continue to make Object Voice even better with each update.

Thank you all so much for your support and assistance.

Best regards.

By Holger Fiallo on Thursday, February 15, 2024 - 21:04

Did you address the issues people were having with not giving the correct description of objects?

By Missy Hoppe on Thursday, February 15, 2024 - 21:04

I love that this app can do object recognition offline. I had fun playing around with it a little bit last night. My only suggestion would be an option to speed up the speech a bit, and it sounds like this has been addressed in the new update. If I have any additional thoughts once I get the update, I'll let you know, but as I said previously, at least in my mind, there's no such thing as too many vision assistants, as, at least in my experience, they all have subtle differences.

By Tristo on Thursday, February 15, 2024 - 21:04

This is a great app, but not all objects are accurate, so there's some room for improvement. I don't know how it would go because of privacy, but it would be great if you could add someone's face, and then when you scan and the camera finds their face, it would tell you. Good job though.

By Justin Harris on Thursday, February 15, 2024 - 21:04

Dave,
I'm curious to know more about how you get the camera to do object recognition, because with my iPhone 13 Pro, all it does is tell me how many faces are in frame, centered or not, but that's about it. Say, for example, I want to get a pic of my dog? That's most likely gonna be a no-go. Are there any settings I should change?
To the dev of this app, at this point I don't know if it is something I have need of right now, but I think it's awesome that you have released and are trying to improve it. One thing I might like, assuming I can't get the regular camera app to describe things that show up in the camera, would be, once we have found something we were looking for, let us also snap a pic of said item. Some might say this isn't needed, as the camera app already describes what should show up in a picture we're about to take, but all I can get it to tell me about are number of faces and where they are in the frame, but no info about other objects. So for me, once I've found what I'm looking for, being able to then snap a pic of said thing would be handy.

By Brad on Thursday, February 15, 2024 - 21:04

This is why people like Charlotte and me get annoyed.

I've asked you multiple times now I believe to let me know where you think we'd use this app and so far you haven't. That tells me that you don't actually know a good use case for something you created.

Some may enjoy this app as a novelty, but I really can't see people going outside with this thing and actually getting accurate info about what is around them.

Others may think I'm being harsh and you can think that, but the dev hasn't told us why the app was made, what they envision a blind person doing with it, and stuff like that.

At the moment, as harsh as this may sound, this is a toy, a very smart toy, but a toy. A blind person will not be able to go outside and find landmarks for example.

By mr grieves on Thursday, February 15, 2024 - 21:04

It did a slightly better job. It managed to see my Border Terrier as a Norwich Terrier, which I presume is close. My whippet cross was sleeping in a dog bed with a blanket over her - it said sleeping bag a lot and then identified her as an Italian Greyhound, which was close enough.

But I still have a black cab in my office, now joined by a couple of go karts and a sawmill I think.

I'm not sure about my own suggestion of a slider now I think about it. I guess the app should ideally be able to tell how far away from something you are and decide what level of object to describe based on that, but maybe that's asking too much.

But I personally wouldn't bother trying to add any new features until the accuracy has improved. If it's right only occasionally it's not going to be useful.

Brad and Charlotte - we don't know the motives of the dev as he doesn't seem to want to share them. Maybe it is just an excuse to play with AI, who knows. But ultimately it doesn't matter if we do get a useful app at the end of it. Time will tell if that ends up being the case here.

By Jason on Thursday, February 15, 2024 - 21:04

Hi again. I've been keeping up with this post and its comments, and I appreciated the responses I've seen. I appreciate feedback, especially polite feedback, so thank you. I'm glad my comment was received as it was intended, respectful constructive feedback, not an attack. I'd like to answer some things that were written.

@Charlotte

Thank you for responding. I agree with many things you said, especially that checking in with the community before putting a lot of hard work into a project would help. It helps us because we can understand how the developer imagines the app being used. It helps the dev because they can get confirmation on whether the app, or specific features of the app, will be helpful. My objection was never to what you said, but to the way it was said. I agree some like to play the "sighted savior," but there are also those who have a genuine desire to help. I was just upset that so many of us in this community went carnivore on E7 Company.

Regarding this being an advertisement, this post was asking for feedback, but we had to pay $3 to give it. If someone wants feedback, don't charge for the privilege; give a free trial, even if it's only a three or four hour trial. I love apps that use in-app purchasing to pay for a license after trying it out. Paying without trying means taking a leap of faith. I've been burned before, and I'm sure many of us have been. It just seemed people objected to the price itself. I do appreciate Apple's refund policy; it keeps me from getting burned again. Maybe I read something into the comments that wasn't there. If so, I'm sorry.

I regret missing your other posts. I'm not the most active member of AppleVis; I just happened to see this because I had a question I was going to post, and this one was near the top at the time. I'll give your articles a read, and thank you for contributing. Though I quoted parts of your comment, there was nothing personal; you were only one of many saying similar things. To be honest, I thought your original comment was more abrasive than it needed to be, and I appreciated your later responses acknowledging that and clarifying what you meant. I'm sorry you and/or your previous posts received more negative feedback than positive. I hate for anyone to be misjudged or mistreated.

I wish we could all treat each other better, but we're all imperfect. I only wrote because many responses here broke my heart. I was so sad and disgusted I felt I had to say something. I'm sorry if I made you, or anyone, feel bad or picked on. My goal was only to encourage more positivity.

I found this comment: "It serves me right for forgetting that contributing content to this site is a waste of my time." Contributing to this site is not a waste of your time; you have valuable input to give. All I ask of you and everyone is that we be kind while giving that input. Your later comments clarified to me what you were trying to say. Also, sometimes people are just shy, maybe they don't have an opinion, who knows. Lack of response does not mean lack of interest. You are part of this community, and whether responses are given or not, you do help shape it.

@Missy Hoppe

The feelings you described are what motivated me to comment. I'm usually more of a listener than a talker. Thank you for sharing.

@Louise

Nobody told you? All you have to do is plug your dog's tail into the USB-C connection on your computer. Just kidding, but I got a kick out of that.

The largest room:

Thank you all for being awesome. We all have rough edges, and we can each help smooth them. A good friend of mine had a saying: "the largest room in the world is the room for improvement." I feel sure they locked me up in there and threw away the key. Let's keep the positivity going!

By Brad on Thursday, February 15, 2024 - 21:04

@Charlotte, if you want to stop writing I can't and won't stop you, but just because people have different opinions to you does not mean it's a waste of your time to write here.

We're not going to agree with one another and that's fine, in fact that's great! If we did, life would be boring.

The thing to remember is: we're not always right, and sometimes we need to grow as people. For example, I've learnt that I didn't need to write as many replies on here and could do with some self-reflection. I still don't see the point of this app, but I could have said that once, cut my five or so posts down to two, and gotten on with my life.

@mr grieves, I completely agree with you that improving the accuracy should be the devs top priority at the moment.

By E7 Company on Thursday, February 15, 2024 - 21:04

I'm excited to announce the release of Object Voice 1.2!

In this update, I have added the ability to detect people, along with improvements in object detection accuracy.

Yes, Louis, in the next version I hope to have the app available in many more languages, including French, and I'm also considering adding a feature to filter detected objects in the next version. I'd love to hear your thoughts on whether a text input field or a selection list would be better for this purpose, given the extensive list of over a thousand detectable objects.
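
Whichever input style you prefer, the core of the feature would simply be restricting announcements to labels that match the chosen term. A rough sketch of that idea, with names that are illustrative rather than final:

```swift
import Foundation

// Sketch: announce a detected label only if it matches the user's filter term.
func shouldAnnounce(_ label: String, filter: String?) -> Bool {
    guard let term = filter, !term.isEmpty else { return true }   // no filter set: announce everything
    return label.localizedCaseInsensitiveContains(term)
}

// Example: with the filter "chair", "office chair" would be spoken and "desk" skipped.
```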

The creation of Object Voice was motivated by the desire to assist and empower this community. Whether it's helping you choose fruits at the market or finding a lost remote control or phone, Object Voice aims to simplify daily tasks. Also, it can serve as an educational tool for teaching children object-word associations. With the upcoming release of Apple Vision Pro, Object Voice could potentially offer even more benefits and applications.

I believe offering a one-time purchase is the best option for the app, rather than using intrusive ads or subscription models. If the app doesn't meet your expectations, you can always request a refund.

I appreciate all the support and feedback I have received and remain committed to continual improvement.

Knowing that Object Voice can make a difference in someone's life is truly gratifying.

Thank you very much to everyone!

By Louis on Thursday, February 15, 2024 - 21:04

Hello, I would like to have a search field to be able to find a specific object. And possibly sections which would include the type of objects to search for. Thank you in advance for your effort to further improve the application.

Kind regards, Louis

By mr grieves on Thursday, February 15, 2024 - 21:04

Thanks for the quick updates. I had been meaning to try out some tasks with it this weekend but today has flown by and I'm out all tomorrow, but I would like to try simulating some dropped objects I'd like to find and see if it helps. I did find a potter's wheel in my room just now which was a surprise!

If the app gets a search maybe it could also remember the last few things searched for to make it easy to find them again.

One other suggestion - is it possible to stop the phone going to sleep when using the app? Or at least when it's actively speaking to me. I find when it does it also takes the phone a little time to become responsive so I can unlock it again. I do get this sometimes anyway post ios 17 but it does interrupt the flow a little.
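
(If it helps, I believe keeping the screen awake is basically a one-liner on the developer side; something like the sketch below, though I obviously don't know how the app is currently structured.)

```swift
import UIKit

// Sketch: keep the screen awake while detection is running, and restore the
// default idle-timer behaviour when detection stops or the view disappears.
func setDetectionRunning(_ running: Bool) {
    UIApplication.shared.isIdleTimerDisabled = running
}
```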

By Dave Nason on Thursday, February 15, 2024 - 21:04

Member of the AppleVis Editorial Team

Hi Justin. The Camera app describes what’s in the viewfinder essentially by default. You just need to make sure that image recognition is enabled in the app.
Dave