Intro
Today, Apple told us what to expect in the next round of software updates, coming this fall. While the focus was, unsurprisingly, AI, there is more than just that to be excited about. Let's dive into what you can look forward to on your iPhone, iPad, Apple Watch, Mac, and Apple TV later this year. Or next year, in some cases; as has become common in the last few years, Apple announced features it won't be rolling out until well after the software's initial release.
Siri and Apple Intelligence
Siri got a lot of attention this year, and for good reason. AI is being used to upgrade and enhance Siri in a lot of big ways. Many of the improvements can be summarized quite simply: real, natural interactions. Basically, the goal seems to be to let users ask for something to happen in natural speech, and Siri will do it. It feels a lot like Apple's old "it just works" philosophy--no thinking about how to phrase a request, no having to know what actions Siri can and can't do. Just ask that this contact's name be changed, or this photo be set as your wallpaper, and it happens.
Siri can even handle mistakes. If you ask, "Hey Siri, what's the weather in San Diego? Wait, I mean San Francisco," Siri will correctly identify that you wanted the weather in San Francisco. It'll even keep the conversation in mind for future queries, so if you then ask, "How long would it take to drive there?" Siri will know that "there" is San Francisco.
Siri will also be able to keep track of, well, pretty much everything. If you and a friend agreed to have lunch in a text message thread, Siri knows about those lunch plans. If your boss sends you an email with a time-sensitive task, Siri knows it. Ask it to play "that song Jim texted me about", and Siri will figure out which song you mean. This even extends to photos and video--Siri can handle a command like, "show me pictures of my husband from that trip we took last year where he's wearing a hat."
Apple Intelligence (which abbreviates to AI, which definitely won't get confusing to talk about) is behind all this. Apple Intelligence runs locally, noting everything it can about you and your life. Does that sound creepy? Apple was careful to emphasize that this is all local. It'll never send your aggregated information anywhere outside of Apple's own infrastructure, or sell it, or do anything else with it. If Siri can't handle a request on your phone, it'll use a purpose-built server for the job. These servers run Apple Silicon, and are designed to never save data or allow anyone to read any data while a request is being processed. Apple even open-sourced the code they use on these servers so it can be independently audited.
AI (Apple Intelligence, not artificial intelligence) can handle pictures and audio, too. It can generate emoji-like images from text prompts, which Apple calls Genmoji. It can generate images inside notes, or clean up images you draw yourself on iPad. It can automatically OCR images and PDFs, and save transcriptions of audio from voice memos or audio you record in the Notes app. You can even record phone calls now and have transcriptions available almost immediately--transcriptions Apple Intelligence can add to Siri's data set for use later.
The tricks don't stop there, though. Apple has integrated ChatGPT into its Siri revamp. If Siri determines that ChatGPT is better able to help you, it'll ask for your permission, then send your question (anonymized) to GPT-4o.
What else can AI do? It can summarize emails, and show those summaries in place of the message previews we have now. It can categorize your emails by topic (newsletters, receipts, marketing, important, personal, and so on). It can generate text, and help you rewrite text you type yourself, in just about any text field you might use. It can decide if a notification is important enough to show you when you have a specific focus mode enabled. It can summarize a bunch of notifications, such as a text thread that blew up, so you can get the gist of what was discussed without having to read a bunch of messages. It can do a lot more, too, or will be able to eventually. Within the next year, Apple says Siri will be able to understand what's on your screen, and take natural language commands about what you want to do. Delete this email, copy this summary and put it into a note, save this document with this name, you get the idea.
All these features are coming to iPhone, iPad, and Mac. We don't yet know exactly what can run locally on which devices, or which features might not work at all on older products.
iOS 18
Apart from all the Siri and Apple Intelligence news, iOS 18 brings plenty of other new features.
First up, the home screen and Control Center are getting a makeover. You can now put app icons wherever you want on your home screen, instead of having to use a grid that fills in from the top left. If you want all your apps on the edges of your screen, or the bottom, you can do that. You don't even need to keep them in a grid anymore if you don't want to. App icons will now change color when dark mode is on, or you can choose your own color that will tint all your icons.
Control Center now has multiple pages, similar to the lock screen. Perhaps more exciting is the ability for app developers to create their own Control Center widgets. For example, if Uber takes advantage of this, you might one day be able to add a widget that gets you a ride home. Open your Control Center, tap the widget, and your ride is on the way. These new widgets can also be tied to the action button on iPhone 15 Pro, letting you issue any Control Center command with the push of a physical button. You can also replace the camera and flashlight buttons on the lock screen with any Control Center widgets you want.
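For the developers reading, here's a minimal sketch of what such a control might look like, built on the new controls support in WidgetKit and App Intents. The ride-hailing names here are hypothetical, and the exact API details may differ; treat this as an illustration rather than Apple's documented interface.

```swift
import WidgetKit
import SwiftUI
import AppIntents

// Hypothetical intent that would ask a ride service for a car.
struct RequestRideHomeIntent: AppIntent {
    static var title: LocalizedStringResource = "Get a Ride Home"

    func perform() async throws -> some IntentResult {
        // A real app would call its ride-hailing backend here.
        return .result()
    }
}

// A Control Center button that runs the intent when tapped.
struct RideHomeControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "com.example.ride-home") {
            ControlWidgetButton(action: RequestRideHomeIntent()) {
                Label("Get a Ride Home", systemImage: "car.fill")
            }
        }
    }
}
```

Because controls like this are built on App Intents, the same intent could presumably also be triggered from the action button or the lock screen, which would explain the "one widget, many surfaces" behavior described above.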
At long last, you can lock any apps you choose behind authentication. Some apps already offer this, like journals or banking apps, but now you can decide. If you want to make it so someone has to use Face ID or Touch ID before they can get into your book player, you can do that. You can even take it a step further and hide the app completely. No one will even know the app exists without authenticating first. This is perfect for hiding apps you find embarrassing or overly personal, so you can hand your phone to someone else without worrying that they'll ask awkward questions.
Messages is getting some new features, in addition to the Genmoji we've already talked about. There are more quick reactions available, you can apply bold, underline, and italics to text, and you can use special effects on parts of your text. RCS is finally on the way, letting non-iMessage conversations take advantage of typing indicators, read receipts, and other features. You can schedule a message to send later, and if you have a compatible iPhone, you can exchange messages via satellite.
We've been over some of the Mail app's new features, but there's more. I told you Mail can automatically categorize messages and offer AI-generated replies or edits to replies you type. Mail can also group mail by business, into what Apple calls a digest. All the messages from a particular company, for instance, might make up a digest. You can act on all these messages at once--that is, move or delete them.
When Mail puts a message into a particular category, you can manually move it if the AI was wrong. Apple didn't say if this will train the sorting of future messages, but I hope it will. One category to pay attention to consists of messages Apple Intelligence thinks are important. Requests for you to do something, messages you should reply to, that kind of thing.
Photos is the other app that got a huge update in iOS 18. The top of the screen shows the familiar grid of photos, with the ability to filter by objects, places, and more. You can also limit the grid to a particular timeframe--all time, this week, and so on--just as you could before.
Below the grid is the new library, where you can select a collection of pictures. Collections are put together by Apple Intelligence. It might be a trip you took, or pictures of someone specific, or some other commonality. You can choose collections to pin for easy access, and make slideshows (aided by on-device AI) complete with music. You can share collections with others if you want to.
There are other changes, too. Filter out "clutter", such as pictures of receipts. Hide screenshots. Make AI-assisted edits, such as removing a distracting object from the background of a shot. The presentation didn't describe the layout of the new app that well, though, so I can't really tell you what to expect beyond that.
There are some smaller, but no less useful, changes coming across the OS.
- Tap to Cash will let you send or receive money through Apple Cash with NFC, by putting your phone near someone else's.
- There will be topographical maps of 63 U.S. national parks, useful for planning hikes. These can be used on watchOS as well.
- Game mode will focus your phone's power on the game, minimizing background operations, and will decrease the latency on AirPods.
- Games can build in support for personalized spatial audio on certain AirPods models.
- With AirPods Pro 2, you will be able to nod or shake your head in silent response to yes/no questions from Siri, and voice isolation will be improved.
- There's now a dedicated app for managing your saved passwords. It is coming to iPhone, iPad, Mac, and Windows.
- The Journal app will have writing streaks, a search function, a mood tracker, and some other new features.
- Reminders have been integrated into Calendar. There's still a Reminders app, but you can add reminders while inside the Calendar app as well.
- You have the option of choosing larger Home Screen icons.
watchOS 11
With watchOS 11, Apple continues its fitness focus. You can set up your activity rings by day, such as giving yourself harder goals on Sundays when you have some extra time. You can pause your goals, so you can have a rest day every so often or not have an illness break your move streak. You can also customize the summary screen shown in the Fitness app on your watch, focusing on the metrics you care about most.
The new Vitals app tracks your overnight metrics--temperature, heart rate, breathing rate, and more--and alerts you if anything is out of the ordinary. It can also help you link these aberrant values to possible causes, such as suggesting that alcohol consumption is costing you sleep, or that your heart rate has been high overnight because your body temperature was also elevated.
Training load is the other big change. Now, you can get an understanding of how your workouts are affecting you. This is especially useful for those on a specific training plan or who are working toward a goal, but it can help anyone better understand their fitness. Some workout types will generate an effort rating from 1 to 10. If a workout can't auto-generate a rating, you can enter one manually. You can also adjust an automatic rating, such as bumping it up if you were sore during strength training and you feel that Apple's estimate is too low. Eventually, these numbers will be used to help you get insights. For instance, if you logged five days in a row with effort ratings of 8-10, maybe you should give yourself an easier day. You will also be able to compare your most recent 7 days to the previous 28, to see if your effort has been below, above, or about the same in the last week compared to the last month.
While fitness changes are the big ones, watchOS 11 has other improvements I'm looking forward to.
- The Smart Stack is more intelligent. It offers more widgets, and uses time, location, and other factors to bring up widgets that may be useful.
- Check In, a feature that can be used to tell friends and family when you get home or alert them if you didn't arrive when expected, is now available on Apple Watch. It's also integrated into the Workout app, so someone can get an alert if you don't make it home from a workout.
- Live Activities will work on watchOS now. This makes it much easier to track flights, rides, sports scores, and other events on your watch.
- In the Cycle Tracking app, you can mark that you are pregnant. This will start tracking your gestational age, and will suggest things for you to do. It might ask if you want to increase your high heart rate threshold, or remind you to check your mental wellbeing more often.
- Apps can now be told when you perform a double tap. Finally, third-party apps will be able to do things when you double tap while they're open.
iPadOS 18
In addition to all the Apple Intelligence, Mail, Messages, and other features we went over in the iOS 18 section, iPad is getting some new tricks of its own. We'll start with the one many of you probably won't care about: Apple Pencil.
iPad can figure out your unique handwriting style. When you write, it can keep that style in place, but clean up your text to make it more legible. If you write with Apple Pencil and then paste text from another app, the text will be transformed to look as though you wrote it. A similar automatic cleanup process can be applied to sketches you make in the Notes app.
Math Notes is a feature that was presented as though it's tied to Apple Pencil, but I don't think it is. I think it works if you type, too. Math Notes basically detects when you're writing math-like text, and will auto-fill answers. If you write an equation, then add an equals sign, iPad will solve the equation and add the answer to your text--write "12 × 4 =", for example, and 48 appears. This feature will come to macOS as well, but it's unclear if iOS will get it.
Speaking of math, iPad finally gets a calculator app! It has never had one since its launch back in 2010, and the missing calculator has been something of a running joke around the internet for years. Now it has one, with everything from basic math to scientific functions to unit conversions.
SharePlay has been updated. You can now take control of the other person's device during a screen sharing session, to help them do something. You can also point with your Apple Pencil, and the other person will see the pointer on their screen.
The only other feature that was discussed is one I didn't quite understand. Tab bars now float, and can transform into sidebars and back again automatically. I don't quite know what this is about, but Apple seems very excited about it, so I guess it's a good thing. Hurray for the magic tab bars!
macOS Sequoia
macOS Sequoia isn't a major update, but it has a headline feature you'll definitely want to hear about. It also gains Apple Intelligence, Math Notes, the passwords app, and the same topographical maps that the other Apple devices get.
The big new thing in macOS is iPhone mirroring. Choose this from the macOS Control Center, and a life-sized image of your iPhone screen appears on your Mac. You can use the mouse/trackpad to perform gestures on the virtual screen, and the keyboard to type. Your phone's audio comes through your Mac's speakers. All this happens while your actual iPhone stays locked and tucked away--this is all wireless. Your Mac will now show iOS notifications; clicking one will open iPhone mirroring with the notifying app already displayed.
Other features Apple mentioned are:
- Safari can pull highlights from a webpage, such as the contact information for a business, and show them to you. It can also summarize a page and show an auto-generated table of contents when Reader Mode is activated.
- Presenter Preview works with video conferencing apps to show you what your camera is about to share with everyone else. You can replace your background with a premade one, or one generated from one of your own photos.
Other Announcements
There were a few other things covered by Apple that I didn't feel warranted their own sections. My apologies to all you Apple Vision Pro owners in the audience.
- visionOS 2.0 will increase the resolution of the Mac displays that can be mirrored on Apple Vision Pro.
- Machine learning algorithms can turn any photo into a spatial one, suitable for viewing on Apple Vision Pro, and SharePlay can now be used to let a group of people all look at the same photos on their headsets.
- There are new visionOS gestures for common tasks, such as checking the battery.
- If you're watching an Apple TV Plus production, you can get a live feed of which actors are on screen, their characters' names, and the name of the song that's playing.
- Apple TV gains wider speaker support, improved dialogue enhancement, and automatic subtitles when you skip back or mute the sound.
- Apple TV has the ability to use your own photos as screen savers, or clips from Apple TV Plus content you've been watching.
There You Go
And there you have it. A huge focus on AI, which isn't a surprise, with Apple's typical focus on user privacy. But we also got some great new features on all of Apple's devices. I'm tentatively optimistic about AI and Siri, and I'm looking forward to seeing how good it winds up being. I'm thrilled by the fitness features on watchOS, intrigued by Math Notes and some other changes on iPad, and would be more excited by iPhone Mirroring on macOS if I used a Mac more.
How are you feeling? Does all this AI stuff creep you out? Will you find iPhone Mirroring useful? Are you looking forward to being able to arrange your icons however you want? Are you one of our rare Apple Pencil users who's excited to get your hands on iPadOS 18?
Comments
My AI Hurts
If it actually happens that way, great.
Great summary
Thank you, mehgcap. I actually just got out of a meeting where I gave a WWDC summary in a 6 minute talk. Your summary was way better than mine. LOL.
I installed iPadOS 18
Omg this is prob my favorite iPadOS update ever.
Moving apps wherever is awesome. The new Control Center is buggy with VoiceOver, but for a first beta it's OK.
I will work on reporting that bug and any others I find in the Feedback app.
iPad calculator
My wife will be thrilled to have a calculator on her iPad.
Caught Just a Bit of Today's Event
One of the maintenance workers was doing a couple of repairs in my bathroom, because I requested that on the tenant portal yesterday. He and I were chatting a bit, so I didn't catch most of Apple's event, but what I did hear sounded great. First off, I'm so grateful to the company for starting to have these events audio-described, and today of course was no exception. One iOS-specific thing which I did hear was that the journaling app is being expanded. I've started keeping track of what I eat and drink throughout the day using this app, and it is working very well. I plan on going back to listen to the archived presentation from today.
Great post
Thanks for this. While you touched on quite a few features coming to iOS 18, I am curious as to how many things those of us with older model iPhones will receive. 🙂
I'm so envious! I don't think I'll get any of these features.
I was hoping I'd at least get something, because my 13 isn't exactly ancient. However, I'm not in a position to get another phone just now, especially since I just paid it off this spring and need some debt-free time.
I'm not that interested.
Honestly, nothing really caught my eye. The AI stuff might be cool--maybe there will be no more "OK, I found thing x on the web, take a look"--but apart from that, I don't see anything that interesting.
I'm hoping there's some new VoiceOver stuff in this update, but apart from that, I don't care.
Call recording
Wow, given how strict Apple is about privacy, I never thought this feature would be implemented. Never saw this coming, that's for sure. All parties will be notified that their calls are being recorded, but there are always exceptions to the rule, and people will always try to circumvent it. Either way, common sense should tell you not to say anything on the phone that could be taken out of context, but phone calls can always be edited after the fact, so who knows. Does this work on FaceTime, or only direct phone calls? Plus, what's stopping somebody from recording you with a secondary device without your knowledge, even without this feature? I also wonder what the quality is going to be like. The devil is in the details, I suppose. I wonder where the recordings of the phone calls get saved? Do they get saved as videos in the camera app, or as recordings in the Voice Memos app?
Keynote on accessibility?
Who can tell me when this is happening? Given the way in which AI is implemented in iOS 18, I'm curious as to how it will figure in the accessibility scheme of things.
Hello, any update on…
Hello, any update on VoiceOver-related new features in iOS 18?
New voices
Hello everyone.
Are there any new VoiceOver voices? I've been especially interested in new languages.
Further revelations from WWDC presentations and early testers
Hello everyone. Quite often, people miss out on pretty substantial accessibility items if all they pay attention to is the main keynote speech. Already, people have grabbed the iOS 18 beta and found that we're at last getting a tutorial for VoiceOver. Also, a new and much better audio ducking system allows people to set VoiceOver volume to be a steady amount louder than everything else. This will make enjoying things like music a lot easier while working.
I'll be trying to glean what other information I can from the remaining events. This will hopefully reveal more details which I can begin using to update the third edition of my book over the summer. For one thing, I'm hoping for more clarity on exactly which features will be exclusive to iPhone 15 Pros and newer devices with M-series chips in them. I suspect that for older devices, there may eventually be an option to tap into Apple's secure cloud computing servers to use some of this new AI advantage. Memory and processing ability seem to be vitally important to running large language models on device, but cloud computing may soften the blow for people who, like me, upgraded last year to the regular iPhone 15. This morning at around 11 AM Eastern, if I'm reading the schedule right, there will be a presentation about catching up with accessibility. As far as I can tell, that's the only directly accessibility-related item on the schedule. However, I don't think all of the events have been revealed yet. Also, there are doubtless one or more labs for accessibility questions from developers. As a non-developer, I don't think I can access those; I can just access the presentations and resources. In my experience doing this year after year, there are often details revealed in other presentations, but you have to wade through the programming terminology, which often eludes me. I'm a mere English major and not a programmer of any sort. They dished out some of that terminology in their state of the platform presentation yesterday that made my head do a complete 360.
@Danil if you have read the…
@Danil, if you read the earlier post in May, Apple has already mentioned that there will be new voices. But I would like to hear a real demonstration and also more detailed information about it.
@Michael Feir ok great,…
@Michael Feir ok great, waiting for your update.
Thoughts
I think the majority of the keynote seemed to have been leaked beforehand, which meant there wasn't anything there that felt like a massive surprise.
The AI stuff is potentially going to be very good. I think being able to find photos by description will be smart, but I'm hoping it might also be able to give me a brief description of a photo as I swipe through it. Like what it currently can do, but with some actual detail.
I still feel a little reluctant to trust AI, but I suppose once it gets integrated into everything then it will start to slowly earn it.
I think part of me will miss simple Siri. Like when you ask "Hey S, when will it stop raining?" and the reply is "Yes, it is raining right now." OK, OK, I will ask one of the grown-up voice assistants instead. You go and play with your toys.
Shame about the hardware. My Mac will run the AI stuff fine, but I am quite anxious about upgrading my Mac since my terrible experience with Sonoma. I know, I know, my willpower is likely to last about 3 seconds after it gets released, but hopefully common sense will take over for one time in my life. Considering how Sonoma felt like it offered nothing new at all and yet was still a total mess, this feels much more radical, which makes me nervous.
And my iPhone 13 Pro Max will run iOS 18 without the AI. Not sure if I can use ChatGPT stuff but guess the whole thing will be out. I don't really feel ready to upgrade my phone now. It's the most expensive one I've owned and 3 years is not a long time for me to own a phone. I guess if I do give in to the mac temptation then I can try a lot of it out first.
Live scores on the watch feels like something it should already be able to do, but now we have Apple Sports maybe it's the right time. I like that idea.
I think Apple is very good at making me want things I don't need, and time will tell if any of these features become part of my day to day interactions with my phone or Mac. I do use ChatGPT a fair bit via Skype and the idea of using Siri instead is appealing if the interface works well.
But having said all that, if they had said "nothing new this year", then "oh wait... one more thing! This time it will actually work!", that would have made me a lot more excited.
Siri on older devices.
It would be nice to know whether the iPhone 15 and older get stuck with dropped-on-its-head Siri or whether they get a better version. We know the AI fun is only coming to iPhone 15 Pro and later, as well as M1 iPads and later, but if anyone could find out whether Siri is getting any less differently abled on older devices, I'd love to know.
I believe head gestures to…
I believe head gestures to answer or decline calls are limited to AirPods Pro.
OK, I'm interested again.
An actual tutorial--we get an actual tutorial! Finally. It took them what, at least 10 years to catch up to Android on that, but finally they did it!
In same boat as Jo
I'm disappointed that my iPhone 13 pro probably won't get many of the new features, but if I'm being honest, most of them didn't sound like anything I must have to survive. I'll pay attention to the hardware event in September, of course, but finances dictate that holding on to my 13 pro for another year would be the wisest course of action. Besides, at least based on all the rumors floating around out there, I'm not overly keen on the changes they're making to iPhones; I definitely don't want a device bigger than what I already have. Kind-a wishing I would have gotten an iPhone mini when they were a thing, but I know the battery life would have been significantly reduced. I'm just going to enjoy any audio demos I can get for the new features. Maybe they'll be more tempting next year, and even more maybe, I'll be tempted by the iPhone 17.
head gestures
Is that for both AirPods Pro? Or just 2?
AirPods Pro
only; not sure if it's Pro 2 or just Pro.
New Voices
You have new Kazakh and Lithuanian voices, and the ability to use VoiceOver with personal voices you create, which does not appear to have been mentioned by anyone on AppleVis ever since the beta was rolled out.
The magnifier stuff works on my iPad 9
The Magnifier stuff that was only on devices with a lidar sensor works on my iPad.
Well, it is there; it doesn't work very well from what I could tell.
The Coolest Announcement from the Platform SOTU
It's not that Swift is coming to Windows and Linux, although that's pretty cool.
It's not that Apple is rewriting many of its core components written in C, C++, and Objective-C in Swift, although that's pretty cool.
It's that Swift has evolved to the point where it can be used to write operating system kernels and microcontrollers!
An OS kernel is the glue between the hardware and software of the computer. It's what determines what happens when your camera is activated or the Thunderbolt receives a connection. Typically, these kernels are written in languages like C or even Assembly and can look just plain WEIRD if you read the code.
A microcontroller is a baby kernel that works on one piece of hardware instead of many. Your thermostat, dishwasher, microwave oven, massage chair and surround receiver often have a programmed microcontroller inside. In the old days, these would have been written in Assembly or maybe C.
I imagine that it's now possible to write these kernels and microcontrollers using all of Swift's accessibility frameworks and APIs which means some entrepreneurial person reading this is going to say, "Bring on Bosch, or Whirlpool, or Samsung and I'll destroy them with an accessible interface built off of a Swift microcontroller."
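To make that idea concrete, here's a minimal sketch of what a microcontroller-style main loop might look like in Embedded Swift. The Led type and the delayMilliseconds helper are hypothetical stand-ins for a vendor's hardware layer, not a real API.

```swift
// Illustrative only: a bare-metal-style blink loop in Swift.
// `Led` and `delayMilliseconds` are hypothetical; real firmware would
// talk to the chip's GPIO and timer registers through a vendor HAL.

struct Led {
    let pin: UInt32

    func set(on: Bool) {
        // Hypothetical: write the GPIO output register for `pin` here.
    }
}

func delayMilliseconds(_ ms: UInt32) {
    // Hypothetical: spin on a hardware timer for `ms` milliseconds.
}

@main
struct Blink {
    static func main() {
        let statusLed = Led(pin: 13)
        var isOn = false
        while true {
            isOn.toggle()
            statusLed.set(on: isOn)
            delayMilliseconds(500)
        }
    }
}
```

The point is that this reads like ordinary Swift--structs, a main entry point, a plain loop--rather than the register-poking C or Assembly such code traditionally required.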
Happy 10th birthday Swift!
wow love it
wow love this!!!!
The AI bits are okay. More than anything, I think the OCR stuff, if accurate, will help blind folks a lot in navigating inaccessible documents.
And even though the company has its faults, I know it still remains one of the most accessible options out there, and I can see it helping other disabled folks too.
I would, however, read all the privacy statements carefully this time. And I love that it's on-device.
My thoughts
Hi.
I think I've mentioned in another post, but I decided to delve into this one.
The updates for iOS and iPadOS are very welcome, especially the huge amount of artificial intelligence they've implemented. Unfortunately, I won't be able to use it because I have my SE 2022, but I still value and appreciate Apple for it. I find the limitation a bit unwarranted, but 'I understand.'
As for macOS, it just made me more eager to migrate to a Mac. Being able to use the iPhone without having to touch it has been a long-time dream of mine.
As for watchOS, I have an Apple Watch, but I haven't touched it in over a year; it's simply not for me. And I don't use or know about the others, so...
In summary, I think it was a good event.
AI
Can Siri describe pictures now? If so, Apple just killed any other app that does it. They might have already done that to 1Password with the Passwords app. I heard it will also be available for Windows when it comes out.
Pixie genocide
Although it might mean that there is less need to use other image description apps, I think there is likely still room for things like Piccy bot given that they have other AI models and options. At least for now.
Do we know if photo descriptions are definitely coming and how they will work?
What I want is a way to browse my photo gallery and get an idea of what each photo is as I swipe through them, then ask either Siri or the pixies or whoever to give me a lot more detail about it if I choose to. And ideally the same for videos. If iOS 18 can do that, then it would make me very tempted to upgrade my phone.
I think given how quickly AI is moving forwards and how the big companies are now integrating it with absolutely everything I'm sure we are going to see a number of casualties. There has been a rush to get as many chatbot/assistant style apps launched as possible and they aren't all going to survive. I'm still a little dubious that they are all needed already.
mr grieves
How Apple deals with privacy will do it. If what they stated about the cloud, protecting data, and making sure it's secure is true, it will do it. Others just talk and talk. I think I trust Apple more than the others.
Passwords app
This is so true. I've been a happy 1Password user for two years. If the Windows app is decent, then I don't have any reason to use 1Password, as I already have a Mac. It's weird, though, because they are partners with 1Password, but then they launched their own app? We'll see.
As for macOS... see my other comments. VoiceOver aside, in retrospect Sonoma was a small, unwanted, and buggy upgrade for most people, from what I can see on the worldwide web.
As for Vision Pro, I live in Canada, and I'll really go to my local store and spend two or three hours on it with VoiceOver whenever it's released.
Same for secret vault stuff etc.
There are apps that let you hide or lock content like photos and notes, and apps. There are also Screen Time tricks that let you lock your apps with Touch ID or Face ID, by setting a timer and having the device require Touch ID or Face ID in order to access locked apps. Now these will all be redundant, and since security software for iOS already differs from that for other operating systems by lacking anti-virus functionality, it will now lack these other features I mentioned as well.
Re. New voices
Thank you so much, Enes Deniz. Glad to hear about new VO languages. The first ever Baltic language along with the second Turkic — that's quite nice.
Although I've been waiting for the dedicated Serbian Vocalizer TTS. Croatian Lana is not the way you expect to read Serbian Cyrillic text at all.
And yes, what about the Chinese Vocalizer voices? Do they still sound like compact ones?
Welcome.
Thing is, Kazakh has been among the languages, or dialects I'd say, supported by iOS for years, and you can already have eSpeak-NG read out text in Kazakh. But even with the new Kazakh voice (Aru), VoiceOver still switches to the default Russian voice to read out Kazakh content in apps that are set to display their content/interface in Kazakh.