Apple's highly anticipated annual Worldwide Developers Conference (WWDC) kicked off today with the customary keynote presentation. During the event, Apple unveiled the next major updates to all its software platforms and introduced the much-awaited mixed-reality headset and some new Mac models.
Our editorial team member Alex (‘mehgcap’), who typically provides detailed event summaries, regrets being unable to do so this time due to work commitments.
Summarizing a two-hour, information-dense technical event with accuracy and speed, while also teasing out what's of particular relevance and interest for our community and adding personal opinions, requires skills and abilities that I frankly do not possess. Instead, I will provide direct links to reliable sources where you can find accurate and up-to-date information about all the news announced today. These resources include Apple's official press releases and the previews they have released for their upcoming major software platform updates. I will update this post with additional resources as they become available.
We are eager to hear your thoughts on today's announcements and updates from Apple! Please feel free to share your opinions by leaving a comment below.
Apple Press Releases
- Introducing Apple Vision Pro: Apple’s first spatial computer
- iOS 17 makes iPhone more personal and intuitive
- iPadOS 17 brings new levels of personalization and versatility to iPad
- macOS Sonoma brings all‑new capabilities for elevating productivity and creativity
- Introducing watchOS 10, a milestone update for Apple Watch
- tvOS 17 brings FaceTime and video conferencing to the biggest screen in the home
- Apple introduces the 15‑inch MacBook Air
- Apple unveils new Mac Studio and brings Apple silicon to Mac Pro
- Apple introduces M2 Ultra
- AirPods redefine the personal audio experience
- Apple announces powerful new privacy and security features
- Apple provides powerful insights into new areas of health
Apple Operating System Previews
Media
- Watch WWDC23 keynote replay on Apple website
- Watch WWDC23 keynote replay on YouTube
- You can also watch the WWDC23 keynote in the Apple TV app on any supported device.
WWDC Sessions
- Create accessible spatial experiences: Learn how you can make spatial computing apps that work well for everyone. Like all Apple platforms, visionOS is designed for accessibility: We'll share how we've reimagined assistive technologies like VoiceOver and Pointer Control and designed features like Dwell Control to help people interact in the way that works best for them. Learn best practices for vision, motor, cognitive, and hearing accessibility and help everyone enjoy immersive experiences for visionOS.
- Extend Speech Synthesis with personal and custom voices: Bring the latest advancements in Speech Synthesis to your apps. Learn how you can integrate your custom speech synthesizer and voices into iOS and macOS. We'll show you how SSML is used to generate expressive speech synthesis, and explore how Personal Voice can enable your augmentative and assistive communication app to speak on a person's behalf in an authentic way.
- Build accessible apps with SwiftUI and UIKit: Discover how advancements in UI frameworks make it easier to build rich, accessible experiences. Find out how technologies like VoiceOver can better interact with your app's interface through accessibility traits and actions. We'll share the latest updates to SwiftUI that help you refine your accessibility experience and show you how to keep accessibility information up-to-date in your UIKit apps.
- Perform accessibility audits for your app: Discover how you can test your app for accessibility with every build. Learn how to perform automated audits for accessibility using XCTest and find out how to interpret the results. We'll also share enhancements to the accessibility API that can help you improve UI test coverage.
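The audit session above describes running automated accessibility checks from a UI test. A minimal sketch of what such a test might look like, based on the API named in the session (Xcode 15's `performAccessibilityAudit()` on `XCUIApplication`); the class and test names here are illustrative, not from Apple's sample code:

```swift
import XCTest

final class AccessibilityAuditTests: XCTestCase {
    func testCurrentScreenPassesAccessibilityAudit() throws {
        let app = XCUIApplication()
        app.launch()
        // Runs Apple's built-in accessibility audit on the current screen,
        // failing the test if issues such as missing element labels or
        // insufficient contrast are detected.
        try app.performAccessibilityAudit()
    }
}
```

This runs as part of a normal UI test target, so the audit happens on every build that runs your tests, as the session description suggests.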
Mainstream Coverage
- The Verge: Apple Vision Pro first look: the mixed reality future is (almost) here
- MacRumors: Apple Reveals 'Vision Pro' Headset and visionOS
- MacRumors: iOS 17 Compatible With iPhone XS and Newer, Available in Beta Today
- MacRumors: watchOS 10 Compatible With Apple Watch Series 4 and Later, iPhone XS/XR or Later Also Required
- MacRumors: macOS Sonoma Drops Support for These Macs
- MacRumors: Apple's Vision Pro Headset Uses 'Optic ID' Iris Scanning Authentication
- 9to5Mac: Siri no longer requires 'Hey' command to activate
- 9to5Mac: Which iPads support iPadOS 17? Here's the list
- 9to5Mac: Apple Vision Pro: US-only at launch, some eye prescriptions not supported, age requirement, AR hardware preview
- The Verge: I wore the Apple Vision Pro. It’s the best headset demo ever.
- TechCrunch: First impressions: Yes, Apple Vision Pro works and yes, it’s good.
- 9to5Mac: Vision Pro: I just tried Apple’s first spatial computer, and here’s what I think
- MacRumors: WWDC 2023 Recap: Everything Apple Announced Today
- Six Colors: The features that didn’t get discussed onstage at WWDC
- MacStories: WWDC 2023: A First Look at Messages in iOS 17
Comments
Nothing exciting for me except the Vision Pro
A headset with cameras all over the place, LIDAR, all the sensors and a connection to an AI assistant could end up being the perfect assistive tech. Like having a sighted personal assistant who understands what I need to know and narrates it to me with the option for a conversation about what I need to know. I'm very excited to see where it goes. No word yet on accessibility though.

The rest of the keynote was pretty meh for me. All I want is for VoiceOver to work properly on all my devices, and it's currently none of them; then we can start talking about what features I'd like.

I was very surprised Apple still don't have any story to tell about AI. Nothing, zip, nada. MS and Google are racing ahead getting it into everything and the only change to Siri was the trigger word. I was pretty surprised and disappointed. If Apple don't get into this tech, they'll be left behind. I don't know when the window closes and everyone finds their product but I thought Apple would have something to say at least.
Andy Lane
Siri would have to be very smart. Right now, she could not even make change for a burger.
The vision pro does have potential if accessible with voiceover
The Vision Pro does have some pretty good use cases if it is accessible with VoiceOver.
Like I can see it being useful for things like Aira and Be my eyes.
No eloquence improvements in iOS 17?
Hi,
Are there any eloquence improvements in iOS 17? I guess not.
Accessibility
There is an iOS visual impairment feature listed there under accessibility called Point and Speak. It said something about objects with labels. That might be interesting to look into, though it would not be available for my model of iPhone.
I would like to try the Vision Pro
I would like to try the Vision Pro and hope we can learn more about the accessibility of visionOS in the near future.
Apple Vision Pro accessibility
I've now added a number of mainstream articles to the above post, including some early hands-on experiences of the Apple Vision Pro.
I've also added links to some accessibility-focused developer sessions; the 'Create accessible spatial experiences' session is of particular note.
A few words on Eloquence
In the first Beta of iOS 17, Eloquence sounds worse than ever in German. I've not tested it yet for other languages.
Hearing Pro.
The headset uses something similar to a Bose Frames style of open speakers. They are two-way, so probably sound better, but one of the promos had someone using AirPods, so I'm guessing the onboard audio isn't great.
Immersion with Vision Pro
I agree that the Vision Pro should have very important and cool accessibility use cases, like describing things in the visual environment with AI more conveniently than using a phone, and maybe even in motion like describing stop lights, intersections, etc. while walking, which would be amazing! However, it definitely seems like the immersion aspect of the device will be far less for blind people.
The WWDC keynote says that users will control the device using their eyes and fingers, and it literally creates a new visual environment around the user or adds things to the existing environment using the really advanced display technology that the keynote describes. The WWDC session about accessibility in visionOS is not available yet, but I would guess that VoiceOver would describe objects in the virtual world, perhaps also with accompanying sound effects, and would probably be controlled by hand movements, gestures, and maybe head movements. However, this is not the same degree of immersion as the visual experience, since the real objects are not simulated the way people would interact with them in real life.
For example, one feature mentioned in the keynote is the ability to add a 4K monitor into the virtual environment that displays your Mac and lets you interact with it, I'm guessing by eye and finger movements. If VoiceOver responds to gestures for this similar to how it does on a Magic Trackpad, and plays the audio from the Mac sounding like it's in a certain position, or even if it allows users to make typing motions to type on a virtual keyboard that isn't there, this would not be the same level of immersion. For the same level of immersion, I can imagine reaching out with my arm, making a special hand gesture, and pulling a keyboard out of the air. Then the keyboard could hover as I type on it, simulating how typing feels on a real-life keyboard, and if it's the wrong size I could pull the sides out to make it bigger or push the sides in to make it smaller. Also I could pull a braille display out of the air with however many cells I want, and reposition the sound by pushing a hovering speaker around with my hand.
And for the example they gave of watching a movie on a mountain top, this experience for blind people would probably be very similar to regular spatial audio, maybe with some background sound. But for this experience to be as authentic as the visual experience, I should be able to hear the complete open space around me, feel the breeze on my body, and be able to move around and explore the terrain and find rocks and bushes and things like that.
For running iPad and iPhone apps, maybe a sheet of plastic could materialise next to me with braille and tactile graphics on it representing the app's layout, and if I press hard on the text for a button or on an icon, that would activate the control and update the braille and graphics with the new layout. And I could make the plastic sheet as big or small as I want just like the keyboard.
For video games and movies, I can imagine the whole environment being simulated instead of just the sound effects. For blind people, sound is definitely an important part of the environment, but there is so much more. It would simulate touch, not just touching things with your hand but how the ground feels to walk on, how it feels sitting down, etc. And for sound, the layout of a space can be sensed by how the sound echoes, which I have never heard simulated digitally, although the Vision Pro might include this aspect for all I know. For example, you could actually feel like you're walking down a hallway, unlocking a door, etc., or even like you're in a battle if that's what the game is, like hear exactly where your enemies are around you and feel a sword in your hand with how the grip feels and its weight as you swing it, and feel the struggle of holding a shield up as someone tries to push it down.
This probably isn't possible with current technology; perhaps the only way this would be possible is to hook a machine up to your brain and have it stimulate different neurons and read data from the brain somehow. But until we get this technology, I doubt that the Vision Pro or future AR headsets can ever be nearly as immersive for the blind as they are for sighted people.
I'm not sure about everyone…
I'm not sure about everyone else here, but personally, I have reservations about wearing something on my head that supposedly resembles a ski mask or a visor helmet.
It seems like we won't have the chance to experience it firsthand until next year, possibly towards the end of the year, if the information I've heard is accurate.
Moreover, the price doesn't seem justifiable for a first-generation product. I'm not inclined to spend that much on it.
Plus, walking down the street with this thing could potentially lead to trouble.
Eye movement
For those with artificial eyes, or who are totally blind, I'm not sure this product would be beneficial. My understanding is that one has to control their eye movement to use this product, as the lenses need to be the correct size. Is that correct? The accessibility of the Vision Pro is interesting.
My albeit limited…
My albeit limited understanding is that VoiceOver on visionOS will allow you to control the device entirely with your hands, using a series of pinch gestures to navigate interfaces and activate items.
Vision Pro
Well, initially I found it hard to get excited by something called Vision Pro which costs £3000 and has batteries that last 2 hours. But after listening to how excited they got on the Double Tap podcast about it, I'm maybe rethinking.
For starters, they played a clip that confirmed VoiceOver support. They also suggested that maybe you could charge it on the go, in which case the battery could last much longer. Anyone know anything about this?
They also made the point that £3000 for speciality tech would be a bargain, so looking at it through the lens of what else I could buy at this price point and it maybe doesn't seem so bad.
I would absolutely love a device that could combine the Soundscape + Bose frames experience I have now (but not for much longer) with all the stuff the Envision can do. But tied up into a more mainstream interface could have a number of interesting benefits.
I'm also wondering how wearing a ski mask out and about will look. Not sure I could go banking wearing it! (Not that this has happened for many years...)
Really looking forward to seeing where this one goes.
Very hard to be even slightly interested in anything else they had to say.
Vision Pro Battery
On Apple's promo page it says the battery lasts up to 2 hours, but all day if you leave it plugged in (well, duh...)
Maybe it's possible to use a portable charger to keep it going longer when on the go.
2 hours doesn't seem like a lot. I've found the Bose Frames battery a bit poor, but I think that lasts something like 3.5 hours.
I'm guessing a lot of that battery drain will be coming from the monitor that most of us won't really care much for.
Must catch up on all the links posted in this article. I'm only dipping in and out of this as I get a few mins here and there.
I'm really interested to know how all the apps will work together. If I'm walking about, it could give me directions, let me know what's around me, maybe even alert me to obstacles my cane isn't going to find, then detect the door when I get there, maybe use a cheeky bit of OCR to tell me what the signs say, and then once I get inside give me a Clew-like way to navigate my surroundings. Or would we forever be switching apps? We'd get some funny looks walking about with ski goggles on, waving our hands all over the place in front of us.
As long as journeys don't last more than 2 hours of course.
My other worry for this is that it will be too expensive and too niche and won't catch on with the mainstream, but I really hope it finds a general audience.
Ultrahaptics
I looked into Ultrahaptics, and they were purchased by a company that makes hand tracking devices; the merged company is now called Ultraleap. The only end-user products available so far are hand tracking devices, but they do allow developers to test and build on their haptics software and hardware. From what I've read, they have been making progress, and I will definitely try to obtain an evaluation kit! The website mentions it being able to simulate the sensation of textures and complex shapes by creating complex configurations of ultrasound waves against your hand, and to coordinate with their hand tracking device. I think blindness technology companies should definitely look into this; I can imagine many great uses for it, for example creating tactile graphics easily without expensive machines, and allowing users to easily manipulate the graphics to get exactly what they want. I will see what it's like and what I can do with it if I can get a kit.
Vision Pro without a monitor
Stupid question - but when you use Vision Pro I presume you are looking at a screen at all times. So I guess if you were to turn the monitor off, you wouldn't see anything at all in front of you. My vision doesn't do much for me these days, but I can see some changes in light still and occasionally some things stand out - like I might be able to see the bars of a gate if the lighting is right. I usually wear shades because I am sensitive to the light. So I guess if I could turn the screen off, then I would be walking around in total darkness? Maybe there might be a benefit to that because it might help focus the other senses, but I'm not sure if I'm ready or not.
A few years ago I tried some smart glasses called OxSight which were just that, and even after a few minutes my eyes were feeling tired looking at a screen. Even though they aren't so great any more, I'm not sure how I'd feel about it now.
I still really, really want to try this and see how it works. It's a shame that Apple aren't the sort of company to allow other companies to take their tech and run with it. A version specific for blind and low vision users would be really interesting.
If you watched the keynote,…
If you watched the keynote, you would have noticed that all the promo videos were for indoor settings such as hotels, offices, and homes. If this was a pair of glasses, I wouldn't have a problem walking around with it on. However, a ski mask would just scream 'come and rob me!' People will see it and think that the person must be loaded if they're willing to walk around with something that costs $5288 in Australia.
Indoor only
Yes maybe you are right that this isn't going to be that suitable for wearing outdoors. This does limit the usefulness for the likes of us I think.
I like the idea of having glasses that can do some of the things my phone can do, like the OCR or be my eyes etc. But the envision is cheaper for that and I'd already decided that was too expensive for what is a small extra convenience.
Hopefully when we get a few iterations down the line it will start to shrink down to something a bit more subtle.
It would be great if a specialist company could take this and turn it into something designed purely for us, but I know Apple doesn't work that way.
Apple Vision Pro - 20 Things You NEED to Know!
This video from the 'Zone Of Tech' YouTube channel should clear up the question of who this first-ever 'spatial computer' is meant for, especially since, as of now, only around 100,000 of these will most likely be manufactured.
Watch here:
https://www.youtube.com/watch?v=A-tG8vuJjUI
Question
How much is the Vision Pro in Australian dollars, US dollars, and UK pounds?