According to a New York Times article, Apple will unveil a redesigned SIRI on June 10, based on generative AI. SIRI will be more conversational and versatile. Sources for the article were two people not authorized to speak publicly. But this has been long anticipated and isn't really a surprise.
The article described a large-scale company reorganization to accommodate development of this new feature. The article didn't mention accessibility, but it seems reasonable to conclude that the company's AI focus was responsible for many accessibility issues receiving inadequate attention. The article stated that Apple's self-driving cars were one casualty of the company's AI focus.
The article concerned the iPhone primarily and emphasized that the AI processing would take place on the phone, rather than on a remote server as with OpenAI and other current technology. This might indicate that new hardware will be required, though nothing specific was mentioned in the article. Let's hope the new SIRI is also available on macOS. That might get me to upgrade my 2018 Mac Mini.
The article didn't go into detail on SIRI's new capabilities, but it did mention that SIRI would be better at the things it already does, like scheduling calendar events and making grocery lists. If that's all, it would be a huge disappointment. What we need is an AI that is capable of interacting with inaccessible UIs and that can change system settings in response to plain spoken commands.
The article went on to describe Apple's less-than-stellar reputation in the AI field and why it has been unable to attract industry-leading AI researchers.
Comments
New Hardware
I strongly believe that the new AI-powered Siri will need new hardware with better hardware acceleration. The reason the M4 was released first in the iPad instead of the Mac is, at least in my opinion, to have a test device ready for the new iOS/iPadOS features throughout the beta cycle. When the M4 comes to the Mac and the A18 builds on the M4, AI will come to the new iPhone 16 (Pro) and the Mac this fall.
And to think. . .
A good portion of you all purchased an iPhone 15x just 8 months ago. Here's hoping that particular hardware is capable of handling the 'new and improved' Siri. 😵
Oh God please!
I got an iPhone 14 Pro almost a year ago. I seriously hope I'll be able to take advantage of what the new Siri has to offer. Looking at hardware requirements Apple has used in the past, I'm thinking at the very least one of the later iPhone 13 models will be the absolute minimum, and I hope that's the case.
The thing with on-device AI…
The thing with on-device AI is that the models consume a lot of memory. The iPhone 13 has only 4 GB of RAM; the 13 Pro, however, has 6 GB, if I remember correctly. The 15 Pro models have 8 GB of RAM, which could be the minimum needed to run the model. Once the model is in memory, you need a lot of computational resources. Such resources are provided by the GPU and Neural Engine, as they offer the capabilities AI models need, like parallel computing and fast floating-point operations. I'm not sure whether the current devices are capable of all that.
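To make the memory point concrete, here is a rough back-of-the-envelope sketch of the RAM a language model needs just to hold its weights. The 3-billion-parameter size and the precisions are illustrative assumptions, not anything Apple has announced; real usage would be higher once activations and OS overhead are counted.

```python
def model_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    """Memory in GB needed to hold the model weights alone."""
    return params_billions * 1e9 * bytes_per_param / (1024 ** 3)

# A hypothetical 3B-parameter on-device model at different precisions:
for label, bytes_per_param in [("fp16", 2.0), ("int8", 1.0), ("4-bit", 0.5)]:
    print(f"{label}: {model_memory_gb(3, bytes_per_param):.1f} GB")
```

At fp16 such a model would already eat most of a 6 GB phone's memory, which is why aggressive quantization (int8 or 4-bit) is usually assumed for on-device inference.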
Manuel
Agreed. Another benefit might be image descriptions. I understand the reason picture descriptions are inconsistent is that the images have to be sent off-device, which affects privacy. If the processing is done on the phone, that may no longer be an issue. It looks promising, but promises aren't always kept. We'll see what happens.
I started to worry
My phone is the wrong model, too old, and too under-powered, and won't get the new SIRI. Then I realized I had already stopped using SIRI for anything other than making phone calls to people in my contact list. I seem to remember asking SIRI the name of a song playing in a shopping center a long, long time ago, and getting an answer. Also, trying to make an appointment on the calendar once. If I don't use it, do I need it?
SIRI usability, and inside Apple
I find that SIRI works well for what I use it for. But some of the most simple things don't work, for no apparent reason. For example, I can tell SIRI, "open accessibility settings," and SIRI opens Settings on the Accessibility screen. But if I say, "open phone settings," it simply opens settings at the main screen.
I wouldn't even need to get to the phone settings if I could simply tell SIRI, "block unknown callers," or, "don't block unknown callers," which is what I need to do in phone settings 99% of the time. I can use SIRI to turn VoiceOver on and off. Why can't I use SIRI to toggle blocking unknown callers? The inconsistency is baffling.
I think the most revealing thing about the article was the peek behind the Apple curtain. It surprises me that a company of this size and market cap has so much trouble staying on top of current technology and research. SIRI's flaws and design limitations have been known for years, and Apple took no steps to rectify them. But the advent of generative AI has caused the company to overhaul SIRI with a priority high enough to nix planned features and ignore known issues, as if the long-disregarded SIRI suddenly became the company's raison d'être. Because this overhaul was a knee-jerk reaction, rushed to market to match the WWDC schedule, my fear is that the result will be nowhere near as good as if it were part of a long-range vision.
I wonder what it might take for Apple to restore priority to VoiceOver.
Simple Things
Sometimes, the simple or basic functions can be the most useful... Is this why they get ignored?
I use basic text files to record appointment dates, do all my budget and bill tracking (mostly in a single file for each month), and even a lot of my writing, as convoluted as that is. Two things that completely changed my iOS experience were when Apple made the Files app, and when Dropbox incorporated a direct text editor into its iOS app.
I look at the iOS calendar when I need to figure out on which day of the week a date falls. I suppose I could ask SIRI, and the next time I need that, I might try it.
Siri
Now I notice that she is able to open web pages. On Tuesday, I asked her to go to apple.com and she did. Also, if I want an app, I can ask her to find it on the App Store, and she does well with that. The only issue is that she has problems with messages: when I ask her to send a message, she sometimes stops reading it in the middle and it doesn't work. She needs to have a brain. Hope Apple does a brain transplant and she gets better.
You can also spell words with Siri.
Just ask: "Hey Siri, how do you spell x," x being the word, and Siri will spell it.
Don't use the Irish voice, because for some reason it doesn't mention h's at all. It's odd, and I've reported it with no luck.
Cloud and M4 Ultra chip
I heard that to make AI better, Apple will be using a new chip that it is developing or already has. It's supposed to be faster and better than any chip on the market.