Welcome to the March 2026 installment of AnonyMouse's App Pick of the Month, where each month I highlight a noteworthy new or updated app. This month, I’m picking CurbToCar, an app designed to help blind and low-vision users find their ride using only their phone’s camera.
According to its App Store description, CurbToCar helps users locate vehicles with object detection, real-time tracking, audio feedback, and distance estimation. It is built around that last stretch between the curb and the car, which is such a simple idea on paper, but one that could make a real difference in the right situation. The app is designed to help take away some of the uncertainty that can come with waiting for a rideshare, a friend, or another pickup.
What stood out to me right away is how specific and practical the idea behind CurbToCar is. We have so many AI tools now that can describe photos, read text, identify products, and do all kinds of useful things, but CurbToCar feels like it is solving an everyday problem for which the solution has thus far been elusive.
What I like about CurbToCar is that it feels like another useful tool to add to the toolkit. In my own experience, waiting for a ride can sometimes be frustrating, awkward, and even embarrassing when you are not fully sure whether the car pulling up is actually yours. That moment of hesitation can create confusion for both rider and driver, and that is exactly why CurbToCar caught my attention. It is not trying to do everything. It is focused on solving one specific problem, and I really appreciate that.
Another part of the appeal of CurbToCar is that I have not really seen anything quite like it before. That alone made it stand out this month. Based on my own impressions, I can also see this being especially useful outside your home, at an apartment complex, or in a less crowded pickup area after work. In extremely busy places, I would personally expect locating the correct car to be more challenging, but the concept itself still feels smart, fresh, and genuinely helpful.
There is also something exciting about seeing accessibility-focused apps continue to grow in ways that feel practical and empowering. CurbToCar was co-designed with blind users through the MIT Assistive Tech Club, and that comes through in the way the app is described. It sounds like the app was built with a clear purpose, and that purpose is what made it memorable for me.
Why I’m Picking CurbToCar This Month
I’m picking CurbToCar this month because it offers a unique idea, addresses a practical problem, and feels like a meaningful accessibility tool rather than just a novelty. It may be a great fit for blind and low-vision users who want more confidence when meeting a ride. At the same time, it may not be ideal for every environment, especially very crowded pickup zones, but the concept and potential are what really made it stand out to me.
Download the App
CurbToCar is available now on the App Store.
Platform: iOS (iPhone)
Price: Free
App Store: https://apps.apple.com/us/app/curbtocar/id6748715225
Now It’s Your Turn
Have you ever been in that difficult position yourself, standing outside and wondering if the car pulling up was really your ride? I know that awkward feeling, and that is part of why this app stood out to me. If you have tried CurbToCar, or if you have ever found yourself in a similar situation, I’d love to hear your thoughts and experiences.
Comments
A Basic Question
How well do you need to describe the vehicle you want this app to detect and track?
Glasses
I hope the devs are looking at Meta integration as this would be great to be able to use hands-free. I'm never keen on holding my phone out in public. Sounds like a really nice app though - not something I'd use much but it's good to know it's there if I need it.
meta integration, as well as a question
That's a really good point and something I hadn't thought of, but yes, the few times I've used this app I felt weird about holding my phone out like that.
For those who use it, I did have a question. Do you have it read every car it detects, or just the one you're looking for?
I originally had it set up to just detect the car I'm looking for, but I have to keep hitting the rescan button so that gets a little annoying.
Single car
Because when I had it announce all cars, I found that the app started to get slower and more confused, and it eventually crashed!
More Thoughts
I thought I’d leave a little more detail here for anyone who still has questions or is curious about my suggestion.
Unfortunately, this app is probably not ideal in super busy or congested areas where there are lots of cars around, like airports for example.
I live in a residential area, so I do get several cars going by when I’m waiting for a ride. If it’s really quiet, I can usually tell it’s my ride just by hearing the vehicle pull up. But sometimes a neighbor comes home and parks nearby, or even across the street, and that’s when I’m not always sure if it’s actually my ride or not. Some drivers also don’t identify themselves, and that’s where this app really comes in handy for me. Every now and then, a car kind of sneaks up on me before I notice it.
Like with any app that uses your camera, it really depends on what the camera can see. You have to give the AI something to look for, which is why it helps to have at least some kind of description of the car. So if I tell it to look for something like a silver Kia Seltos, it needs to be able to pick up on things like the color, the car logo, and other identifying details. Just saying “silver” usually won’t be enough, because there are so many cars in common colors like black, white, silver, and gray. It really helps if the app can see the logo on the front or back of the vehicle, and maybe even the license plate if you’re lucky.
I’m not totally sure how well it can tell the difference between specific car models yet. I still want to experiment with that and see whether saying something like sedan, SUV, 2-door, or 4-door helps at all. That’s definitely something I want to test more.
Someone also asked about distance, and again, it really comes down to getting a good view. The app has to clearly see what it’s looking for before it can give you the pop-up saying your ride is here. What I usually do is keep the app running, but paused. Then when I sense my car is close or hear a vehicle coming up, I unpause it so it can get a better look once the vehicle is nearby.
Is it perfect? Nope. Most of the time I get pretty good results, but some days are better than others, and every once in a while it just doesn't pop up even when the car is right there in front of me.
For the person who asked about taxis or cars where you have no idea what the vehicle looks like, unfortunately the app just won’t know what to look for if you can’t give it any description. I completely understand that with something like a taxi, you may not know what kind of car is coming. Sadly, in that situation this probably won’t work very well for you.
The only thing I’d suggest, and only if you’re in a very quiet area, is that there’s a setting you can turn on to announce every car. That could help, but I really wouldn’t recommend it in places with lots of traffic, like parking lots or busy pickup areas, because it would probably be way too much.
One nice thing about the app is that it’s free, and I really do recommend downloading it and trying it for yourself. That’s honestly the best way to answer a lot of your questions, because everyone’s experience is going to be a little different depending on where they live and how they use it.
I totally agree that this would be amazing on smart glasses. That would make it so much easier. But for now, I just lift my phone up and unpause the app when I hear or sense a vehicle coming. Also, make sure to give it a few seconds, because like most AI, there can be a little delay before it recognizes something that matches your description.
I also highly recommend giving the developer as much feedback as you can. That only helps all of us in the long run, because I’m sure the app can keep being improved and refined over time.
What I like about this app is that there are already tons of AI apps out there, but I haven’t really seen one that identifies vehicles like this, which is why it caught my attention. I just thought it was something interesting to share with the community in case others wanted to try it too.
Hopefully someday soon, we’ll have more AI with live features where I can just give one prompt and have everything work in a single app. That would be much more ideal than opening different apps for different tasks. But honestly, I still can’t complain too much, because this works on your device, it’s free, and it doesn’t rely on the cloud or a server to work.
I’m really hoping that in the near future, more AI tools will run directly on our devices, because I think that would make them even more useful, more affordable, and more practical for the blind community.
vehicle description
I thought I'd share my experience with this app. The first time I used it, it worked well on the first try!
I was actually waiting for my paratransit vehicle. The service gives the vehicle type (not the color) and vehicle number. So, for example: Chrysler Voyager, vehicle number 9576. I also added "minivan" at the end, though I wasn't sure if that was necessary.
Immediately after it pulled up, the app announced that it was on my left.
More details
CurbToCar does require an internet connection. The app identifies cars locally on the phone and then isolates those cars and uploads them for a more detailed identification of make and model. The tech used is like that used at toll booths to identify cars that don't pay tolls. It's that expert identification that gives CurbToCar reliability beyond simple image recognition models. This is why it can slow down a bit if there are a lot of cars in the image. That said, it does indeed do quite well in crowded situations. I've had great success at both Nashville and LAX where there are multiple lanes for ride share pickup, so don't count it out in busy environments.
As for the car description text, it was designed so you could cut and paste the color, make, model, and license plate line from the Uber app, but it uses AI to match results, so it's very flexible and does its best to figure out what you meant. Even just "Blue Honda" will generate results.
So, the intended workflow for the app is: copy the ride description from Uber using VoiceOver's "copy last spoken phrase" feature, and when the ride is two minutes out, switch to CurbToCar, paste the description from the clipboard, and start looking. I simply hold my phone against my cane, held vertically like a post, and aim it in the general direction I expect to find the approaching car.
And yes, Meta AI glasses support is currently in testing. Apple's and Meta's bureaucracy is what will keep it from reaching the App Store sooner rather than later.
And please do send feedback via the contact info in the settings screen of the app, even if only to say it worked. Everyone's stories can help make the app better and surface weird edge cases.