Unless there's some setting I don't know about, the iOS Photos app gives no direct feedback when cropping images in the Edit>Crop and Rotate function. I assume the image is on the screen, but it isn't described by VO. Instead, there are "Handles" at the edges and corners of the image, which are spoken by VO, and activated by a tap and hold, then slide. You can check the Info tab to find out if the pixel counts have changed, assuming the app has updated them after you are done editing. Sometimes it doesn't show the change until you leave the photo and return.
Over on Seeing AI, in the tab for describing photos, there is an option to explore by touch. So I have been getting an idea of where the subject I want to crop around is located on the screen. I go through the Seeing AI app to do this because my phone gets stuck if I try to explore by touch in an image I have recognized with Seeing AI through the Share Sheet.
I get the sense that the image on the cropping screen is smaller than the one on the Seeing AI screen, but I can still tell that, for example, I might need to move the left handle a long way toward the center if the subject is entirely on the right side of the screen.
From there I switch between the Photos app and Seeing AI, having it reanalyze the image each time after saving the edit. Asking Seeing AI questions about the location of the subject is also helpful. Unfortunately, the whole process does require asking a sighted person if the picture looks all right. It usually needs a little more work, but I did get one picture done without any need for further edits.
Working with the handles is something I'm starting to get a feel for. I haven't yet needed to use the corner handles, *which I think tilt the image. Over-cropping is easy to catch, even with the VO description in the Photos app, but the tiny adjustments required to get something reasonably centered can take several cycles through the process, including the sighted feedback. Still, after just a couple of days experimenting, I think I will be doing this regularly for my photos.
* Was wrong. Corner handles don't tilt the image, just move the corner to a new place on the image.
By OldBear, 23 June, 2025
Comments
Later on
I'll do one more report on cropping, in case someone comes searching about the topic. It's gotten a lot easier with practice over the last week or so. One frustration with cropping in the Photos app is that you have to tap Done and end the editing session to check the pixel counts. The problem is that if you overshoot your goal, which is very easy to do, you have to revert the photo to the original and start again. And obviously, you have to end the editing session to check the result with Seeing AI or whatever AI describer is being used.
I've found that the corner handles are useful if you have the aspect ratio locked and you move the crop by dragging your finger diagonally. For example, if you have something in the bottom-left quadrant that you want centered, you drag the top-right handle at an angle toward the center. Otherwise, having the aspect ratio unlocked and using the top/bottom and left/right handles works much better for trimming the edges off something. That's where the frustration with checking pixel counts comes in, both with the aspect ratios and with the resolution for printing.
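The aspect-locked drag described above can be sketched in a few lines of Python. This is purely a hypothetical model for illustration, not Apple's actual code: the crop box is a simple (left, top, right, bottom) tuple, and dragging the top-right handle inward shrinks the box while the width-to-height ratio stays fixed, which is why something in the bottom-left quadrant ends up more central.

```python
# Hypothetical model of an aspect-ratio-locked corner drag.
# The crop box is (left, top, right, bottom) in pixels; dragging the
# top-right handle inward shrinks the box but preserves width/height,
# with the bottom-left corner staying anchored.

def drag_top_right_locked(box, dx, dy):
    left, top, right, bottom = box
    ratio = (right - left) / (bottom - top)   # locked aspect ratio
    new_right = right + dx                    # follow the finger horizontally
    new_width = new_right - left
    new_height = new_width / ratio            # height follows from the ratio
    new_top = bottom - new_height             # top edge moves to keep the ratio
    return (left, new_top, new_right, bottom)

# Start with a 4:3 box and drag the top-right corner 100 px toward the center:
print(drag_top_right_locked((0, 0, 400, 300), -100, 0))  # → (0, 75.0, 300, 300)
```

The point of the sketch is just that with the ratio locked, one finger motion moves two edges at once, so a diagonal drag toward the center is the natural gesture.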
One example of using the cropping is when I've taken a picture of something like jewelry that I've placed in a Scanner Bin box on an 8x10 gray card, and the camera takes a picture of a much wider area. I had always intended to get help cleaning up the pictures, cropping out the cardboard borders around the gray card, but it's something I can mostly do on my own now.
Cropping photos
Oh wow, I don’t think I have the patience to crop photos the way you can. I’ve always wondered if this was possible on the iPhone though, especially if you’re totally blind. I wonder why we don’t have AI handling some of this—like, you could essentially paste the picture into an editor with a simple prompt like “crop out the dog from this picture” or “replace the sunset with a sunrise,” or something to that effect.
I wonder if screen sharing would work—like, if you shared a photo in real time, maybe on ChatGPT or Gemini, and then you continually ask for feedback while you’re moving around the screen. I don’t know if that would work with VoiceOver on, because I never know if there’s an arrow moving around when I’m trying to select elements on the screen. Basically, I don’t know what the AI is able to see when I’m moving or pointing, like where I am navigating.
So the program could say, “You’re above the dog, move down a little bit, move a couple inches to the right, then stop.” The problem with this kind of technology is that it constantly requires you to be interacting with the program. It never tells me in real time what is happening—like, “Stop, you’re right over the dog, you can crop the image now.” I always have to be asking for feedback, and by the time I get the warning, I’ve either gone too far or I’m too close, which makes the picture choppy. Very interesting.
An idea for iOS
It is an exhausting process, almost as much as setting up a good shot in the first place.
If VO recognized the image that is in the cropping controls, it might help somewhat.
If you were able to flick the cropping handles as adjustments, and it announced the pixel count or something like that, it would be much easier. Pages does this, and that might be another option: just paste the image as an object in a Pages document. You have to deal with the margins that Pages seems to require, but borders around the images aren't too big of a deal.
Need to play with it a lot more before I really know what would make for good suggestions.
Re: Corner handles
Hey OldBear,
Unless things have changed in the last couple of decades, the corner handles are for moving both the horizontal and vertical edges of the frame, depending on which corner you grab, so I don't think they tilt anything. Of course I could be mistaken; this is iOS, after all.
In my experience, from when I had eyesight, that is how the handles worked. When you grab a handle on the left side, for example, you are moving the left frame edge to show or conceal parts of that area of the image, and vice versa for the right-side handle. When you grab the top handle, you are moving the top (horizontal) frame edge up and down to show or conceal that area of the original image, and vice versa for the bottom handle. Finally, when you grab a corner handle, you are moving both a horizontal and a vertical frame edge, depending on which corner you grab.
For example, grabbing the upper-right corner will move the top horizontal frame edge and the right vertical frame edge at the same time. If you grab the bottom-right corner, you'll move the right vertical and bottom horizontal frame edges.
Vice versa for the top left and bottom left corners.
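Brian's description of the unlocked handles can be modeled as a tiny sketch (again my own hypothetical illustration, not Apple's implementation): each corner handle moves exactly the two frame edges that meet at that corner, and the opposite edges stay put.

```python
# Hypothetical sketch of unlocked crop handles: the crop box is
# (left, top, right, bottom), and each corner drag moves exactly the
# two frame edges that meet at that corner.

def drag_corner(box, corner, dx, dy):
    left, top, right, bottom = box
    if corner == "top-right":
        return (left, top + dy, right + dx, bottom)
    if corner == "bottom-right":
        return (left, top, right + dx, bottom + dy)
    if corner == "top-left":
        return (left + dx, top + dy, right, bottom)
    if corner == "bottom-left":
        return (left + dx, top, right, bottom + dy)
    raise ValueError(corner)

# Dragging the upper-right corner inward moves only the top and right edges:
print(drag_corner((0, 0, 400, 300), "top-right", -50, 40))  # → (0, 40, 350, 300)
```

Since the left and bottom edges never change in that example, the result is still a right angle at every corner, which matches the observation that the handles reposition a corner rather than tilt the image.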
HTH. :-)
Editing photos with ChatGPT
Hi,
Has anybody tried editing photos with ChatGPT? So, uploading a photo and then asking it to crop stuff out? I tried editing screenshots quite a few months back. The first few times I did this, it didn't follow my instructions at all. It totally changed the font, even though I didn't mention anything about the font; all I asked it to do was remove all my desktop icons and the taskbar from the shot and just leave the Word document. It didn't crop the desktop icons and taskbar out properly; half of it was still there. However, when I tried again a few months later, later in 2024 I think, I asked it to remove all the desktop icons and the taskbar, and it seemed to work. I put the image through Aira's AccessAI, and it just described the Word document, and that was all, no desktop or taskbar icons. I asked it if there were any desktop icons or anything else in the image, and it said no. So I'm presuming it worked. It might be worth trying ChatGPT out with this. It might give you absolute rubbish, but it's worth a shot, I think.
@Brian and @Tara
Ya, @Brian, I realized that after I fooled with it for a while. You're moving that corner to a different place, but it stays a right angle. Not sure if you're supposed to move your finger diagonally to the place you want it to be, or up/down, then left/right, or if it matters as long as your finger drops it where you want it.
@Tara, I'm waiting for the AI editing, or at least the frenzy about it, to ripen and the bad seed to fall away before I start relying on it.
When I was a kid, I got to do cropping with a negative in an enlarger machine, where you physically moved a frame with paper behind it, or you moved the thing holding the negative in a different machine, I can't remember the names. There were even microfilm machines for newspapers and print, all kinds of different combinations. I figure it's much the same, but maybe easier with the digital... if you're sighted.
I've ordered a zoom lens to attach to my phone. Don't know how it will work out, but I remember using those with sight.
OldBear
You move up/down, or left/right, on the corners, it doesn't really matter, as the corner will move diagonally automatically. It's just how it's been coded.
OK
I think I get it now, and it makes sense: you just drag the corner where you want it and lift. I have come across complaints from sighted users that if you don't do it a certain way with the iOS cropping, it also moves the opposite corner. I'll fool with it and try to figure out what happens between straight down and across, or diagonally.
don't crop it blind, i wouldn't even try
https://apps.apple.com/us/app/ai-background-eraser-cleancut/id6566187176
Interesting app
That's not what I'm doing, though. I want to crop it blind and still have the background.
Anyway, it seems like I get much better results with the corner handles if I do right-angle motions rather than a diagonal straight line, like up then over. In fact, I just did a bird sitting on a branch and nailed it to the center with two edits and reanalysis with Seeing AI. The photo isn't very good because the branch blocks a lot of the bird and it has its back to me, but the cropping was good, going by Seeing AI.