As of the latest update to the ChatGPT app by OpenAI for iOS, the app appears to be unusable with VoiceOver. No matter where you touch on the screen, VoiceOver just announces, “close menu.” I did notice that if you single-finger swipe left or right, the actual elements of the app are still there, but trying to get around this way would be nearly impossible. I am on an iPhone SE, latest generation. Is anyone else experiencing this? I’ve tried restarting the app as well as fully deleting and reinstalling it, all with no luck. This is a serious accessibility problem.
Edit: Since posting, here are a few updates:
1. Screen recognition does restore partial access, though as anyone who has used VoiceOver's screen recognition knows, it's a bit flaky and unreliable, so I don't consider this an adequate solution.
2. OpenAI's support email address is support@openai.com, though when I emailed this address I got an auto-response directing me to use the support chat feature on their support website. So I'm unclear whether this inbox is ever actually reviewed by a human.
3. The support website, where you can find the chat intercom at the bottom of the page, is at https://help.openai.com/en/
4. They also have a developer forum, but it seems people submit general issues there, too. It's at https://community.openai.com.
I have submitted this bug through the email inbox, the live chat intercom and the developer forum, so hopefully I get some traction. If you're experiencing the issue too, I encourage you to report it through as many of these avenues as you can. If you solve the issue, please let me know how. Thanks!
Comments
Have you tried
Hi, have you tried the app with Screen recognition on?
Screen recognition helps. Not an acceptable workaround
Hi there. Yes, actually, shortly after posting this I tried screen recognition. It does restore some of the accessibility, but as we all know, screen recognition is a workaround at best, and not always a reliable one. Clearly something to do with accessibility was broken in this latest ChatGPT update. Trying to find a way to actually submit feedback to them is nearly impossible. If anybody comes across a link or email, let me know. Thanks.
Screen recognition
I'd love to know what is going on in the app development world lately that is causing so many apps to require screen recognition in order to provide basic accessibility. Besides the app discussed here, I recently had to use screen recognition to accept the changed terms and conditions in the Uber app; there was no way to access the checkbox otherwise. And even the player controls in Apple's native iOS Podcasts app are partly inaccessible: I can swipe right to get to them, but can't find them by dragging my finger on the screen unless screen recognition is enabled.
I can see screen recognition being useful for accessing apps that are not accessible. What I can't understand, and what I'd like to see stopped immediately, is new apps being designed in such a way that they depend on screen recognition, as if it's a new standard for accessibility that everyone must adopt.
Please contact OpenAI
Hey there, and thanks for the reply. So it looks like I’m not alone in experiencing this bug. I would highly encourage you to get in touch with OpenAI to report it to them. The more of us who report this, the more likely they are to address it quickly. I left some information in my original post on how to reach them. Thanks!
A suggestion
Have you all tried the 'Move In and Move Out' gestures in iOS? Think of interacting on macOS.
By default, the gestures are a two-finger swipe right and left, respectively.
HTH. 🤔
The move-in gesture
Hi Brian. Thank you for the reply. I wasn’t aware of that gesture, though now that I hear the sound effect it makes, I realize I’ve accidentally triggered it many times. Lol. I’m not a macOS user, but I’m guessing that’s something like NVDA’s object navigation, where you can move inside a parent element and navigate its child elements? Something like that, maybe? Anyway, it was definitely worth a shot, but unfortunately no dice when it comes to the ChatGPT issue. I welcome folks keeping the suggestions coming.
So far, the only way I’ve found to work with the app in its current state is using screen recognition and good old single-finger left and right swiping, though it can take a while to get to what you want.
Hopefully a fix is forthcoming. Remember, the more of us who report this to them, the more likely it is they’ll fix it :)
Well...
There goes my superhero delusion of grandeur. 🦸🏼♂️
My workaround
Greetings and salutations, everyone. I, too, have encountered this bug. What I do, and it seems to work, is start at the top of the app and use a one-finger swipe to the right to move down the page, going past the close button that never seems to go away. After swiping a few times, you will come to the text box where you enter a new chat. Once you dictate or type what you want, I swipe until I reach the send button. Sometimes it doesn’t appear, so I have to start at the bottom of the page and swipe left, moving up the page, until I find it, or start at the top and work my way down again with one-finger swipes to the right. A lot of times, when it’s done generating, you can’t see the response, so what you have to do is start at the top of the page and swipe right with one finger until you either reach the bottom or land on the question you asked, followed by the answer. If you have any follow-up questions, it’s a wash, rinse and repeat kind of deal. This isn’t comfortable, since it’s more work, but it’s the easiest and most reliable way I’ve found to get my answers.
Relieved to have auto-update disabled.
You never know when an app's accessibility will be totally ruined overnight while you sleep in peace. Anyway, I might be wrong on this, but I assume the two-finger gestures mentioned above are applicable if the navigation style is set to "grouped".
Experiencing it as well, and another workaround
Once you launch the app, do a four-finger tap on the bottom half of your screen; this will take you to the last item, and if you swipe once to the left from there, it should bring you to the place to type. For typing I use Braille Screen Input, and when I'm done typing, I do a three-finger swipe up for the quick action, which sends the text, so you don't have to go searching for the send button. However, as not everyone uses Braille Screen Input, you could assign a touch gesture for the quick action instead.
My OpenAI forum post
I appreciate the replies. I’m sharing the link to the post I made on the OpenAI forum. Even if you don’t have the time to file a proper bug report with them, just replying to my post with a “having the same issue” would help raise visibility:
https://community.openai.com/t/chatgpt-ios-app-no-longer-accessible-for-blind-users-with-latest-update/554146?u=kyle-reese
About app updates…
I hear the point about disabling auto-update, and I was doing that for a while. Slowly but surely, though, I got confident enough to turn updates back on, because in general I think you want the latest versions, if nothing else because updates often include security fixes. That being said, when I look through my update history in the App Store, I don’t even see the ChatGPT app actually being updated anytime recently, and certainly not around the time it started misbehaving. I wonder if they have some weird backdoor method of pushing updates dynamically where it doesn’t actually show up as an update? Anyway, interesting if nothing else. Clearly something changed.
Fixed?
Version 1.2023.340 has just been released to the App Store and appears to be accessible.
In response to Luke's strange question
Another weird thing is that I do find some apps I can update under the relevant heading, but a refresh with a three-finger swipe down adds more of them. So why isn't that list refreshed when I open the App Store or double-tap the My Account button? Yet another oddity is that some apps that can actually be updated don't show up on that list at all.
Thanks for the tip about refreshing the recently updated section
Thank you! I was just looking under my account in the App Store and didn’t see any updates for ChatGPT today. I refreshed, and there it was. I had to manually click Update, as it hadn’t started yet. Fingers crossed; it sounds like this might do the job. That would be wonderful!
No improvement here with latest version 😞
Well, my burst of excitement about the new update was pretty short-lived. Lol. I just upgraded and am still experiencing the bug. I tried deleting and reinstalling the app just to be sure, but no change. To the user who said they are not experiencing the problem anymore with the update, was there something else you did that got it working?
Re: No improvement here with latest version 😞
I hadn't used the app recently, so hadn't experienced the accessibility issues with the previous release.
However, when this new version dropped, I figured that I would check it out, and everything works as expected. There are still the minor niggles that I've seen previously - such as the button to close the menu being found by VoiceOver despite the menu not being open - but I was able to send a new message and switch to voice mode.
I don't have screen recognition enabled.
I don't know why my experience is different.
Let's see if the new update is accessible to anyone else.
Hi Cobbler
I just want to make sure I’m understanding you correctly. The bug you mentioned, where the close menu button is always present even when the menu isn’t open, is actually the bug we are referring to. The reason this makes the app hard to use is that no matter where you touch on the screen, all you hear is VoiceOver announcing the presence of that button. It is possible to swipe left and right, and others have posted some good strategies for making the most of that, but not being able to actually touch elements on the screen and activate them in the conventional way is a huge limitation. Are you saying you are actually hearing the other individual elements of the user interface as you drag your finger around the app?
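For any developers following along, here's a purely hypothetical sketch of the kind of implementation slip that can produce exactly this symptom. I have no idea what OpenAI's actual code looks like, so treat the class and method names below as made up for illustration. The idea is that a slide-out menu's full-screen backdrop stays in the view hierarchy after the menu is "closed" and remains exposed to VoiceOver as a single "close menu" element, so touch exploration always lands on it while flick navigation can still reach the elements underneath.

```swift
import UIKit

// Hypothetical example only; not OpenAI's code. A full-screen backdrop for a
// slide-out menu, exposed to VoiceOver as a single "close menu" element.
final class MenuBackdropView: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        backgroundColor = UIColor.black.withAlphaComponent(0.4)

        // The whole backdrop acts as one tappable element for closing the menu.
        isAccessibilityElement = true
        accessibilityLabel = "close menu"
        accessibilityTraits = .button
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    // The bug pattern: "dismissing" only fades the dimming color out, so the
    // view (and its full-screen accessibility element) is still sitting on top
    // of the content, swallowing every touch-exploration hit.
    func dismissMenu() {
        UIView.animate(withDuration: 0.2) {
            self.backgroundColor = .clear
        }
        // A correct dismissal would also do one of the following:
        // isHidden = true
        // removeFromSuperview()
        // isAccessibilityElement = false
    }
}
```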
Re: Hi Cobbler
Okay, my bad, I misunderstood the issue that people were talking about.
When I opened the app, I simply flicked through the page, activated a few items, performed a query, and switched and used voice mode. Everything worked as expected. I was expecting things to be far worse having seen the title of this thread describing the app as “almost totally inaccessible.”
When exploring by touch, I now see the issue that people are talking about.
This will teach me to pay closer attention to a discussion!
I've never explored by touch using this app in the first place.
I send my messages, and when the conversation gets a little long, I just swipe up with two fingers, then quickly swipe right until I find the type box again. Given I've always done it this way, I was unaware that this issue even existed. Oh well, hope you guys can get it sorted.
No worries :)
No problem at all, Cobbler. You’re right that the single-finger swipe gestures can still get you around the UI, but having to go through every element in order, when in some cases there are numerous elements, chat messages, etc., can become excruciating pretty quickly. Lol. I titled it that way because simply touching your finger to the app and not being able to interact with any of its elements in the usual way is a major accessibility problem. Thanks again.
I heard from OpenAI
Good news! I received a private message from an OpenAI employee in response to my post on their community forum. He said he will be bringing this up with the team either this Thursday or next, so while a bug fix may not be on the immediate horizon, at least we know somebody at the company is actually taking ownership and will hopefully get this in front of the right people. I will keep you posted as I hear more. In the meantime, please continue to provide them with helpful data points by submitting your own forum posts, emailing the support team at the address I provided in my original post, or even just replying to the post I already made on their community indicating that you are experiencing the same thing. Also, feel free to keep the workaround suggestions coming! Thanks, everybody.
Hi DMNagel
I think this just highlights how we all have different approaches and strategies for using apps. I still think being able to slide your finger across the display and locate elements the traditional VoiceOver way is important, though. It’s good that there’s more than one way to do things in iOS, for sure! Thanks.
Landscape Mode
Here is a workaround.
If your phone is in portrait mode, the problem with exploring the screen shows up: no matter where you move your finger, you see “close button.” Now, change the orientation of your phone to landscape. Explore your screen and the behavior returns to normal.
Generally, I keep my phone’s orientation locked, but I decided just for the hell of it to see what happens if I put it in landscape. Give that a try.
Re: landscape mode
Oh, interesting! Yes, landscape mode does seem to restore access to the other controls, though oddly you still hear the close menu button announced in between the elements. Something funky is still going on, but this definitely makes it far more doable. Great catch, and thank you for sharing!
Bug has been fixed, should release soon
An OpenAI staff member responded to the community thread a few minutes ago, confirming that the team has actually fixed the bug and that the fix should appear in an upcoming app update. They didn’t specify exactly which release it will land in, but just knowing they’ve taken this huge step and that we will see improvements soon is amazing! I certainly couldn’t have done this myself, so I want to thank everybody here for pitching in with suggestions and workarounds and for reaching out to OpenAI about the issue.
Latest version, same issue as you
It was working until yesterday. Weirdly enough, I set up ChatGPT for a friend of mine a week ago and the app had this bug then, but I didn't think much of it. I assumed it was just his phone, since it's an old iPhone.
OpenAI and accessibility
Nice to know that OpenAI is responsive to accessibility concerns. Also nice of them to offer their services for free to Be My Eyes.
While we're on the topic, however, I do wish they would label the several unlabeled buttons on their web interface. I wrote to them about that a long time ago and seem to remember getting one of those bot responses, so I don't know if my comments ever got through. In either case, the buttons are still unlabeled, and unless one has experience with the UI, someone using a screen reader has no idea what those buttons do!
--Pete
Peter
So, tell us, what do the unlabeled buttons do?
Get the latest update. It’s fixed!
Looks like the fix has already shipped in the latest app update. Yay!
@Peter and @Bruce
I reported the unlabeled buttons just two days ago through the chat interface (I didn't know about the forum then; I might post it there too). The model selector is also pretty inaccessible: it doesn't present as a combo box, a menu, or anything else that makes sense.
As for what the buttons do, I believe there are buttons below each response for reporting whether it is good or bad, but there are more buttons whose purpose I don't know, and I'm not inclined to just try them out.