Before you check your Settings app, note that this update is currently available only on Apple Intelligence-capable devices (iPhone 15 Pro/Pro Max and all iPhone 16 models).
This update includes many things, such as ChatGPT integration with Siri and the redesigned Mail app.
Also included is visual intelligence for iPhone 16 models.
My iPhone 16 Pro Max won't get here for about a week, so I can't play with these updates yet, but my Mac just finished installing the macOS 15.2 beta with these features as I write this.
From what I've heard, this update is a really big one when it comes to Apple Intelligence. You can set it up so that you simply say "ask ChatGPT" in front of your Siri request, and anything you say will go to ChatGPT.
Also, Apple Intelligence expands to a few more languages and regions.
I’ll keep you posted on anything I find about these updates.
Also, you can now set default apps much more easily in the Apps section of the Settings app.
These are all the features and changes I know of right now, but I will keep you posted on anything I find in macOS, and then iOS when my iPhone 16 Pro Max gets here.
By Levi Gobin, 23 October, 2024
Forum: Apple Beta Releases
Your Feedback is Important to Improving Accessibility on Apple platforms
Don't assume that Apple is aware of a bug or that your report won't make a difference - submitting bug reports directly to the company helps them reproduce and resolve the issue faster. Your report may provide crucial information. The more reports they receive, the higher the priority they give to fixing the bug.
If you're using a beta version, use the Feedback Assistant on your device to submit feedback and bug reports. If you're using a public release, follow the instructions on this page.
Comments
Yeah just watched yt video…
Yeah, just watched a YouTube video on the ChatGPT integration. I wonder how VoiceOver will handle the new Genmoji characters, since they're not a standard part of Unicode. I'm especially impatient to test Siri with ChatGPT on my Mac. I don't care enough about AI yet to buy a newer iPhone, as I got my 14 just last year, but the little iPad mini is really, really becoming attractive to me. I hope Writing Tools are still as good as they were with VO on this version :) I used to use them to summarize the Apple license at update time.
Type to Siri
Saw that people can now add it in Control Center. Since accessing it with VO is hard, it will be nice that people can go to Control Center and open Type to Siri from there.
Downloading 15.2 beta on my…
Downloading the 15.2 beta on my beta partition on my M2 Pro. Coming from the 15.1 RC, it's still about a 15 GB download, so beware :)
ChatGPT, it's awesome!
ChatGPT, it's awesome!
Oh my gosh, that's all I'll say, they've killed it. Just wow! You'll very soon forget the ChatGPT app and only use the integration on the Mac. It's... I'm speechless.
How does the Chat GPT integration work?
How does the chat GPT integration work?
Is it similar to the actual ChatGPT app?
Sadly I can't test Apple Intelligence since I still have an iPhone 14 Pro.
It's on the mac I can't…
It's on the Mac; I can't comment for iPhone, as I too only have an iPhone 14.
Kushal Solanki
From what I heard, you need to go to Settings and turn it on, and Siri will ask you, or you can ask Siri to use ChatGPT directly.
My experience with ChatGPT Integration
I will just say this up front: I love this! I was able to get ChatGPT to describe an image on the Mac without VOCR, but this could also be used in combination with VOCR.
You can enable ChatGPT integration by going to Apple Intelligence & Siri in Settings; under the "Extend Apple Intelligence & Siri" heading, you will find a ChatGPT button which you can click to set it up. I can't wait until I get my iPhone 16 Pro Max!
How did you get the…
How did you get the description, and did you use Writing Tools or another method? This is the first thing I thought of but haven't tried yet. I'd love that.
Voices?
I still thought VoiceOver was getting more natural-sounding voices in iOS 18?
I'll watch the video on this now. I use the ChatGPT app when I need it, so there's no difference for me at the moment; it uses those voices to read back to me, which I assume iOS does not.
Visual Intelligence?
Is it only for iPhone 16?
How does it work?
VI
Yes. I also think it's for the 15 Pro and Pro Max.
It takes a screen shot and…
It takes a screenshot and sends it to ChatGPT. You can log in with your paid account, but I'm not sure if it uses your preferences, as in getting it to describe things in specific ways as you can with ChatGPT directly. The nice thing on the Mac is that Type to Siri reads out its reply with Siri's voice. It's not as expressive as all the ChatGPT voice stuff, but you can only use that if you speak to it too. I like typing and getting audio feedback.
Chat GPT integration
Just received my 16 Pro Max. It's in the box waiting for a window of time when I can set the bleeder up. I, too, am very keen to hear more about ChatGPT integration.
One specific question I have: Olly said you can log in with your paid account. Do I take it that a paid account is still necessary to use this feature fully, then? Even when integrated into iOS, if I don't have a paid account, will I be cut off after a bit, as now? I don't necessarily begrudge that, but I am not at all clear on what the position is.
Chat GPT
@Bingo Little, you actually don't need to log in with your ChatGPT account. The advantage of being logged in with your free ChatGPT account is that you can keep your history of queries. As for the paid account, it's simply a plus, but it has little bearing on the functionality of the Siri and ChatGPT integration.
VoiceOver buggy on this beta
I have reported this to Apple, but VoiceOver is buggy when swiping through your phone. I am using an iPhone 15 Pro Max with the South African Tessa voice. Basically what happens, throughout almost all apps, is that when swiping from left to right, VoiceOver will not read some elements on the screen.
For example, in Outlook, VoiceOver will read the subject of the first message, but it will then just say "button" on the second message and read the third message. I have to swipe from right to left again in order for VoiceOver to read the second message.
This bug is fairly consistent across many apps, including Apple's native apps.
Corporate Greed At Its Finest
Currently on an iPhone 14 Pro and I'm getting bored. Since the redesigned Mail app wasn't mentioned in the Apple Intelligence section at WWDC, it would make sense that it would come to older devices. They did talk about the ChatGPT integration while discussing Apple Intelligence, but since ChatGPT runs entirely over an Internet connection, it would be outright stupid of them not to include that on older devices as well. If you think about it, there was a time in iOS history when Siri had to run over an Internet connection, and for the most part, it still does (except for a few tasks). This could simply be another scenario where that has to be the case.
Not good enough internals
For older phones, you don't have good enough internals to run the new stuff. Just the facts of life. Wait a few months and get the new iPhone SE.
Paid vs free
The difference is in the number of requests and the GPT version you have access to. I don't think it makes a huge difference at this point.
Also agree that VoiceOver has become quite a lot buggier in this release, with some issues with Braille Screen Input and snagging on navigation in general.
Siri, contrary to what I posted earlier, doesn't read out the full responses on the Mac when you use Type to Siri. It did once, but now I think it just says, “This is the answer,” and dumps the output in a field as text, and navigating the Siri app, as with all Apple in-house apps, is highly awkward, with no shortcut I can find to jump between fields.
Regarding older phones, I get the frustration, but as I've said before, your phone doesn't become less because it doesn't get the latest features. They are experimental at best, and it very much feels like Apple is feeling its way through and trying to find a use for them. For me, the most useful aspect of this is proofing, which is best used on a Mac, and all Macs with M1 and up get it. Siri is still pretty rubbish, the image stuff probably isn't really relevant for us, and though the screenshot-and-upload-to-ChatGPT is cool in principle, it's nothing you can't do with the stand-alone ChatGPT app. I do agree there is some splitting of hairs and obfuscation of the truth with needing 8 GB internals when the real heavy lifting has to be sent off-device anyway. In short, you can do 90 percent of what Apple Intelligence can do with ChatGPT and the like on a phone that is two years old. The deep integration is still a way off.
Running iOS 18.2 beta versus sticking to released version
Lots of comments on here about working on a Mac, so I'm a bit confused about using this new beta on an iPhone.
My main question is, are these betas stable enough and/or free of enough bugs that it is worth putting onto my only iPhone and iOS device? I consider almost all software these days to be a kind of beta since no software is ever free of bugs and problems. I have been running JAWS betas for decades on Windows with no problems, but, of course, that doesn't affect the operating system.
Anyway, just wondering if I should wait until the official release or try installing this beta to first of all play around with it, but also to provide feedback to Apple on problems so that they can be fixed in the final release.
--Pete
VoiceOver on iOS is…
VoiceOver on iOS is certainly buggier on 18.2 than it was on the 18.1 beta, but that's probably because it is at the start of its cycle.
If I could go back to this morning, I'd have not installed the beta on my iPhone but would still have installed it on my Mac... if that's any help?
On mac people who want beta…
On the Mac, people who want to beta test and report bugs will have an easy time doing that; Apple provides documentation on how to have two partitions with two different versions of macOS. Even with 256 GB, macOS doesn't take up that much space. That's what I do on my base model M2 Pro.
Thanks Ollie
Thanks. Although I don't mind minor bugs as long as I can work around them reasonably well on my iPhone, if VoiceOver is pretty buggy, perhaps I'll wait until a bit later in the 18.2 beta cycle.
Thanks for your feedback.
--Pete
Playground app?
I got on the waiting list, but unlike the waiting list for Apple Intelligence, it's been 2 days and I'm still not approved. I'm not complaining in the least, so please don't tell me to be patient and chill. I'm totally cool. It may just take a bit. It's all good! I'm just wondering, though, if anyone else has been approved yet, and if so, is it at least accessible enough to get it to generate an image based on a prompt?
Secondly, can someone explain how to get Siri to snap a photo and then describe it? I have a 15 Pro Max, so I'm a little confused about which features I'm never going to get on the 15 that are only for the 16. Was hoping someone could explain. I still don't get this whole Visual Intelligence thing. Is that part of the image generation feature? That's what I really want to see in action. I can do it directly with ChatGPT, but I want to see how iOS does it directly. And yes, I do have a paid ChatGPT account.
Great point regarding Mac…
Great point regarding the Mac and the beta; yes, create a separate disk for testing... I've not, but I'm not encountering many issues on the Mac, just a few on iOS.
Yes, I'd wait a while. I don't exactly regret getting 18.2 beta 1, but if I could easily downgrade, I would.
Regarding looking at images, once you have ChatGPT set up in Settings > Siri > ChatGPT and you have the photograph open, you can just ask Siri something like "describe this picture", and it will send it out to ChatGPT and come back with a description. It's a bit annoying, as it describes the entire screen, e.g. "this appears to be a screenshot of a phone", etc. There may be a way of doing just the image; I've not worked it out yet, though. As far as I can tell, it only uses screenshots at the moment.
Regarding Playgrounds, it's thought the waiting list is going to be up to a month as they roll it out. I've applied but not been accepted yet.
Another way to describe images
Another way I can describe images is by opening a text field, using Compose with ChatGPT, and then selecting an image from the photo library. You can also attach any other type of document, so it could maybe describe inaccessible PDFs or something.
Visual Intelligence is really cool. You can hold the camera button down, and to get ChatGPT to describe the image, you press the button that VO reads as 'Comment Lines'. You then hear what I think sounds like the Siri sound but lower in pitch, and then it will speak the description out loud. If it doesn't, find the unlabeled item at the bottom and use VO image descriptions, preferably with a custom gesture.
Visual Intelligence does not require being accepted for Image Playground, and neither does the ability to have Siri generate an image with ChatGPT.
You can see the Image Playground interface on Mac
I saw this in a YouTube video. If you are not accepted for image creation, you can still look at the Image Playground interface. When you are on the "early access requested" screen, press Escape before you hit Done. It may seem like it's not doing anything, but when you hit Done, you can see the interface. I've not gotten it to generate anything, though.
Thought I would let you know in case anyone was interested.
This doesn't work on iOS. I tried the VO scrub gesture, and nothing happened. The app still took me to the home screen.
Image Playground
I am in South Africa, and yeah, I am still waiting for approval to use or experiment with Image Playground.
Differences between developer beta and "normal" beta
I signed up for iOS betas to try out the 18.2 beta and provide feedback on accessibility before the final release.
Although my iPhone says that I have betas turned on in the Software Update section, this beta doesn't seem to be available and I am told that my iPhone is up to date with 18.1.
I'm thinking there may be a difference between signing up for the public beta versus the developer beta. Alternatively, maybe I just have to wait to be accepted into the beta cycle? Don't know.
Anyway, if there is a difference with the developer betas, what is it and how does that work versus what might be the "normal" betas?
Thanks.
--Pete
Explanation of betas
If you have registered for an account at developer.apple.com, you will have access to the developer beta inside Settings under the Beta Updates option.
If not, you will only see the public beta option that you registered for at beta.apple.com.
If you want 18.2, you need a free developer account in order to receive the beta. You will also need an Apple Intelligence-capable device (15 Pro/Pro Max and all iPhone 16 models) in order to even see the update.
Re: Explanation of betas
Thanks for the clarification about the different stages of betas. I thought that might be the case.
So is the developer beta an earlier version than the ordinary beta, where the ordinary beta would presumably be expected to be a bit more mature, with more of the bugs wrung out?
--Pete
How can I see "all" e-mails?
There are several buttons in the Mail app to show Primary, Transactions, Updates, and Promotions.
However I don't see a button or any way of showing "All" mail items in my Inbox.
I have some unread items in my Inbox that do not show up in my iPhone's Inbox at all under any of these categories.
Is there something I am missing or is this a bug?
--Pete
Seeing all e-mails
I found the trick. At the top of the screen in Mail there is a "More" option. In that dialog you can choose the "List" button. When this is activated, all of the buttons for special sorting of mail disappear, and all e-mails show up as they did in 18.1. This is the "normal" view.
--Pete
VoiceOver bug fixed in iOS 18.2 beta 2
I have to say that in beta 2 the VoiceOver bug has been fixed. At least VoiceOver reads all elements on the screen.
Image Playground.
Oh yes, Image Playground is now available in South Africa. I have yet to play around with the feature.
Peter
Will remember this tip because I will not want my email in categories the way they are now. Thanks.
On the latest beta of 15.2…
On the latest beta of 15.2 as of writing, the built-in script to speak Wi-Fi status once again announces the SSID the Mac is connected to. The VO-N shortcut to list the notifications currently on screen works again. Moreover, and this is a very nice change that may have been there previously too, but I am not sure: on the web, when you read by word or by character with the rotor and you have a custom verbosity setting, such as for links and headings to be announced after the actual text of those links and headings, these are now announced in the right order as per your verbosity settings. This is all I was able to see in 10 minutes after the update. I didn't even have to reboot to correct some random problem with web browsing in Safari, for example, as was the case for previous updates since the first beta of Sequoia, if I remember correctly. The menu bar bug is still there, as is the volume adjustment bug with Command + trackpad rotor swipe counterclockwise/clockwise.
Apple, please don't break what already works
After accessing the Dock with VO-D or double tapping with two fingers at the bottom edge of the trackpad, pressing Escape no longer focuses the cursor back in the app
When I access the Dock with either the VO-D shortcut, as per the Apple VoiceOver user guide, or by double tapping at the bottom edge of the trackpad, in all previous releases I was able to move focus back to the previous context/application by pressing Escape. For example, if I had TextEdit open, I could access the Dock as described above, and pressing Escape would have shifted the cursor back to where it previously was, in this case TextEdit. This no longer works, however, even with the two-finger scrub gesture on the trackpad. The only workaround is to press VO-Escape.
FB15839086
Safari seems remarkably…
Safari seems remarkably stable on ChatGPT so far, which was a major point of frustration in the past. I know it doesn't mean anything due to the very short testing time, but this dev beta is definitely more stable than 15.1 overall for me.
Did they just fix the…
Did they just fix the horrible three-finger gesture issue with the trackpad and VoiceOver? There is no misinterpretation now, and I haven't changed anything about how I perform them. Like, scrolling is actually really possible now!
Edit: it seems that it was just luck. Why are these so complicated on the trackpad, while on iPhone I never get false positives? Scrolling is something we can't live without.
Never mind. On ChatGPT when…
Never mind. On ChatGPT, when your thread hits a certain length, VoiceOver goes crazy with sn, and on other browsers as well. This is only a VO problem, not experienced at all on Windows or ChromeOS, for example.
When in a Finder Window,…
When in a Finder window, it is impossible to navigate the desktop content with arrow keys after pressing VO-Shift-D
FB15899517
To trigger this bug, I must be on the desktop.
I start by opening a folder or folder alias, after moving to it with the arrow keys (all forms of Quick Nav in VO Utility off), using Cmd-Down Arrow or Cmd-O.
Depending on VoiceOver verbosity and Finder settings, it says something like this once I open the folder by one of the methods mentioned above:
“Cal I window, column view, browser, 1 item selected. Cal I group”
with “Cal I” being the folder I just opened, in column view (Cmd-3 in Finder).
Then I press VO-Shift-D to go to the desktop, after having moved around a little in the folder content, or not.
VoiceOver does not announce anything, nor does any of the cursors actually move to the desktop, except for the mouse cursor (VO-Fn-5).
Then, with VO-Fn-3 (Describe item under the cursor), it says “nothing is in the VoiceOver cursor”. VO-Fn-5 says “Desktop group is under the mouse”.
Otherwise, if I press Cmd-` (US layout keyboard) to move to the desktop after having opened a folder from the desktop as described above, it's still possible for me to navigate the desktop normally.
Please eliminate "writing tools in action menu" announcement
For those of you on the macOS beta, could you please put in a Feedback request to eliminate the VoiceOver announcement "writing tools in action menu", or at least have that announcement tied to VO verbosity or speech hints? Hopefully it's not too late to eliminate this distracting and unnecessary announcement. Thanks.
Apple Music
What about Apple Music? Is it fixed in the new betas?
Thanks.