WWDC 23 Keynote: Apple Unveils Platform Updates and Introduces Apple Vision Pro Headset and New Macs

By David Goodwin, 5 June, 2023

Apple's highly anticipated annual Worldwide Developers Conference (WWDC) kicked off today with the customary keynote presentation. During the event, Apple unveiled the next major updates to all its software platforms and introduced the much-awaited mixed-reality headset and some new Mac models.

Our editorial team member Alex (‘mehgcap’), who typically provides detailed event summaries, regrets being unable to do so this time due to work commitments.

Summarizing a 2-hour, information-dense and highly technical event with accuracy and speed, while also teasing out what's of particular relevance and interest to our community and adding personal opinions, requires skills and abilities that I frankly do not possess. Instead, I will provide direct links to reliable sources where you can find accurate and up-to-date information about all the news announced today. These resources include Apple's official press releases and the previews they have released for their upcoming major software platform updates. I will update this post with additional resources as they become available.

We are eager to hear your thoughts on today's announcements and updates from Apple! Please feel free to share your opinions by leaving a comment below.

Apple Press Releases

Apple Operating System Previews

Media

WWDC Sessions

  • Create accessible spatial experiences: Learn how you can make spatial computing apps that work well for everyone. Like all Apple platforms, visionOS is designed for accessibility: We'll share how we've reimagined assistive technologies like VoiceOver and Pointer Control and designed features like Dwell Control to help people interact in the way that works best for them. Learn best practices for vision, motor, cognitive, and hearing accessibility and help everyone enjoy immersive experiences for visionOS.
  • Extend Speech Synthesis with personal and custom voices: Bring the latest advancements in Speech Synthesis to your apps. Learn how you can integrate your custom speech synthesizer and voices into iOS and macOS. We'll show you how SSML is used to generate expressive speech synthesis, and explore how Personal Voice can enable your augmentative and assistive communication app to speak on a person's behalf in an authentic way. (See the speech synthesis sketch after this list.)
  • Build accessible apps with SwiftUI and UIKit: Discover how advancements in UI frameworks make it easier to build rich, accessible experiences. Find out how technologies like VoiceOver can better interact with your app's interface through accessibility traits and actions. We'll share the latest updates to SwiftUI that help you refine your accessibility experience and show you how to keep accessibility information up-to-date in your UIKit apps. (See the SwiftUI sketch after this list.)
  • Perform accessibility audits for your app: Discover how you can test your app for accessibility with every build. Learn how to perform automated audits for accessibility using XCTest and find out how to interpret the results. We'll also share enhancements to the accessibility API that can help you improve UI test coverage. (See the audit sketch after this list.)
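
For the speech synthesis session, here is a minimal sketch of how SSML input and Personal Voice might be wired up with AVFoundation. It assumes the iOS 17-era API names (AVSpeechUtterance(ssmlRepresentation:), requestPersonalVoiceAuthorization, and the isPersonalVoice voice trait); check the session and documentation for the exact spellings before relying on them.

```swift
import AVFoundation

// A minimal sketch only: the SSML initializer and the Personal Voice
// authorization/trait APIs are assumed from the session description and
// the iOS 17-era SDKs; verify the exact names against the documentation.
let synthesizer = AVSpeechSynthesizer()

func speakExpressively() {
    // SSML describes prosody, emphasis, and pauses declaratively.
    let ssml = """
    <speak>
      Welcome to the app.
      <break time="300ms"/>
      <prosody rate="90%">Let's take a quick tour.</prosody>
    </speak>
    """
    // The initializer is failable; it returns nil if the SSML is invalid.
    guard let utterance = AVSpeechUtterance(ssmlRepresentation: ssml) else { return }
    synthesizer.speak(utterance)
}

func speakWithPersonalVoice(_ text: String) {
    // Personal Voice requires explicit authorization from the user.
    AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
        guard status == .authorized else { return }
        // Use the first Personal Voice the user has created, if any.
        let personalVoice = AVSpeechSynthesisVoice.speechVoices()
            .first { $0.voiceTraits.contains(.isPersonalVoice) }
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = personalVoice
        synthesizer.speak(utterance)
    }
}
```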
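
For the SwiftUI and UIKit session, the sketch below shows the standard SwiftUI accessibility modifiers the abstract refers to: labels, values, traits, and actions. These modifiers apply across Apple's platforms, including visionOS; the PlaybackView and its properties are invented purely for illustration.

```swift
import SwiftUI

// A hedged sketch of the accessibility modifiers the session covers.
// PlaybackView and its state are hypothetical; the modifiers themselves
// (accessibilityLabel, accessibilityValue, accessibilityAddTraits,
// accessibilityAdjustableAction) are standard SwiftUI API.
struct PlaybackView: View {
    @State private var isPlaying = false
    @State private var volume = 0.5

    var body: some View {
        VStack(spacing: 16) {
            // A custom tappable image: give it a spoken label and the
            // button trait so VoiceOver treats it like a real button.
            Image(systemName: isPlaying ? "pause.fill" : "play.fill")
                .onTapGesture { isPlaying.toggle() }
                .accessibilityLabel(isPlaying ? "Pause" : "Play")
                .accessibilityAddTraits(.isButton)

            // Expose the volume readout as an adjustable element so
            // VoiceOver users can swipe up or down to change it.
            Text("Volume \(Int(volume * 100))%")
                .accessibilityLabel("Volume")
                .accessibilityValue("\(Int(volume * 100)) percent")
                .accessibilityAdjustableAction { direction in
                    switch direction {
                    case .increment: volume = min(volume + 0.1, 1)
                    case .decrement: volume = max(volume - 0.1, 0)
                    @unknown default: break
                    }
                }
        }
    }
}
```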
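
For the audit session, this is a minimal sketch assuming the performAccessibilityAudit API that XCTest gained alongside Xcode 15. The audit categories and issue-handler details shown are assumptions based on the session description, so treat them as a starting point rather than a definitive reference.

```swift
import XCTest

// A minimal sketch assuming the performAccessibilityAudit API added in
// Xcode 15; option and property names here are best-effort assumptions.
final class AccessibilityAuditTests: XCTestCase {
    func testFullAccessibilityAudit() throws {
        let app = XCUIApplication()
        app.launch()

        // Audits the current screen for issues such as missing labels or
        // clipped text, and fails the test if any are found.
        try app.performAccessibilityAudit()
    }

    func testAuditIgnoringAKnownIssue() throws {
        let app = XCUIApplication()
        app.launch()

        // Limit the audit to specific categories and filter out a known,
        // accepted issue so it does not fail the build.
        try app.performAccessibilityAudit(for: [.dynamicType, .contrast]) { issue in
            // Returning true tells XCTest to ignore this particular issue.
            issue.auditType == .contrast && issue.element?.label == "Decorative banner"
        }
    }
}
```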

Mainstream Coverage

Comments

By Andy Lane on Tuesday, June 27, 2023 - 04:14

A headset with cameras all over the place, LIDAR, all the sensors and a connection to an AI assistant could end up being the perfect assistive tech. Like having a sighted personal assistant who understands what I need to know and narrates it to me, with the option for a conversation about it. I'm very excited to see where it goes. No word yet on accessibility though. The rest of the keynote was pretty meh for me. All I want is for VoiceOver to work properly on all my devices, which it currently doesn't on any of them; then we can start talking about what features I'd like. I was very surprised Apple still don't have any story to tell about AI. Nothing, zip, nada. Microsoft and Google are racing ahead getting it into everything, and the only change to Siri was the trigger word. I was pretty surprised and disappointed. If Apple don't get into this tech, they'll be left behind. I don't know when the window closes and everyone finds their product, but I thought Apple would have something to say at least.

By Holger Fiallo on Tuesday, June 27, 2023 - 04:14

Siri would have to be very smart. Right now, she could not even make change for a burger.

By OldBear on Tuesday, June 27, 2023 - 04:14

There is an iOS visual impairment feature listed there under accessibility called Point and Speak. It said something about objects with labels. That might be interesting to look into, though it would not be available for my model of iPhone.

By ming on Tuesday, June 27, 2023 - 04:14

I would like to try the Vision Pro, and I hope we can learn more about the accessibility of visionOS in the near future.

By David Goodwin on Tuesday, June 27, 2023 - 04:14

I've now added a number of mainstream articles to the above post, including some early hands-on experiences of the Apple Vision Pro.

I've also added links to some accessibility-focused developer sessions. The following session is of particular note:

  • Create accessible spatial experiences: Learn how you can make spatial computing apps that work well for everyone. Like all Apple platforms, visionOS is designed for accessibility: We'll share how we've reimagined assistive technologies like VoiceOver and Pointer Control and designed features like Dwell Control to help people interact in the way that works best for them. Learn best practices for vision, motor, cognitive, and hearing accessibility and help everyone enjoy immersive experiences for visionOS.

By Manuel on Tuesday, June 27, 2023 - 04:14

In the first Beta of iOS 17, Eloquence sounds worse than ever in German. I've not tested it yet for other languages.

By Andy Lane on Tuesday, June 27, 2023 - 04:14

The headset uses something similar to a Bose Frames style of open speakers. They are two-way, so they probably sound better, but one of the promos had someone using AirPods, so I'm guessing the onboard audio isn't great.

By emassey on Tuesday, June 27, 2023 - 04:14

I agree that the Vision Pro should have very important and cool accessibility use cases, like describing things in the visual environment with AI more conveniently than using a phone, and maybe even in motion, like describing stop lights, intersections, etc. while walking, which would be amazing! However, it definitely seems like the immersion aspect of the device will be far less for blind people.

The WWDC keynote says that users will control the device using their eyes and fingers, and it literally creates a new visual environment around the user or adds things to the existing environment using the really advanced display technology that the keynote describes. The WWDC session about accessibility in visionOS is not available yet, but I would guess that VoiceOver would describe objects in the virtual world, perhaps also with accompanying sound effects, and would probably be controlled by hand movements, gestures, and maybe head movements. However, this is not the same degree of immersion as the visual experience, since the real objects are not simulated the way people would interact with them in real life.

For example, one feature mentioned in the keynote is the ability to add a 4K monitor into the virtual environment that displays your Mac and lets you interact with it, I'm guessing by eye and finger movements. If VoiceOver responds to gestures for this similarly to how it does on a Magic Trackpad, and plays the audio from the Mac so it sounds like it's in a certain position, or even if it allows users to make typing motions to type on a virtual keyboard that isn't there, this would not be the same level of immersion. For the same level of immersion, I can imagine reaching out with my arm, making a special hand gesture and pulling a keyboard out of the air. Then the keyboard could hover as I type on it, simulating how typing feels on a real-life keyboard, and if it's the wrong size I could pull the sides out to make it bigger or push the sides in to make it smaller. Also I could pull a braille display out of the air with however many cells I want, and reposition the sound by pushing a hovering speaker around with my hand.

And for the example they gave of watching a movie on a mountaintop, this experience for blind people would probably be very similar to regular spatial audio, maybe with some background sound. But for this experience to be as authentic as the visual experience, I should be able to hear the complete open space around me, feel the breeze on my body, and be able to move around and explore the terrain and find rocks and bushes and things like that.

For running iPad and iPhone apps, maybe a sheet of plastic could materialise next to me with braille and tactile graphics on it representing the app's layout, and if I press hard on the text for a button or on an icon, that would activate the control and update the braille and graphics with the new layout. And I could make the plastic sheet as big or small as I want just like the keyboard.

For video games and movies, I can imagine the whole environment being simulated instead of just the sound effects. For blind people, sound is definitely an important part of the environment, but there is so much more. It would simulate touch, not just touching things with your hand but how the ground feels to walk on, how it feels sitting down, etc. And for sound, the layout of a space can be sensed by how the sound echoes, which I have never heard simulated digitally, although for all I know the Vision Pro might include this. For example, you could actually feel like you're walking down a hallway, unlocking a door, etc., or even like you're in a battle if that's what the game is, like hearing exactly where your enemies are around you and feeling a sword in your hand, with how the grip feels and its weight as you swing it, and feeling the struggle of holding a shield up as someone tries to push it down.

This probably isn't possible with current technology; perhaps the only way it would be possible is to hook a machine up to your brain and have it stimulate different neurons and read data from the brain somehow. But until we get that technology, I doubt that the Vision Pro or future AR headsets can ever be nearly as immersive for the blind as they are for sighted people.

By kool_turk on Tuesday, June 27, 2023 - 04:14

I'm not sure about everyone else here, but personally, I have reservations about wearing something on my head that supposedly resembles a ski mask or a visor helmet.

It seems like we won't have the chance to experience it firsthand until next year, possibly towards the end of the year, if the information I've heard is accurate.

Moreover, the price doesn't seem justifiable for a first-generation product. I'm not inclined to spend that much on it.

Plus, walking down the street with this thing could potentially lead to trouble.

By Alana on Tuesday, June 27, 2023 - 04:14

For those with artificial eyes, or who are totally blind, I'm not sure this product would be beneficial. My understanding is that one has to control their eye movement to use this product, as the lenses need to be the correct size. Is that correct? The accessibility of the Vision Pro is interesting.

By Tyler on Tuesday, June 27, 2023 - 04:14

Member of the AppleVis Editorial Team

My albeit limited understanding is that VoiceOver on visionOS will allow you to control the device entirely with your hands, using a series of pinch gestures to navigate interfaces and activate items.

By mr grieves on Tuesday, June 27, 2023 - 04:14

Well, initially I found it hard to get excited by something called Vision Pro which costs £3000 and has batteries that last 2 hours. But after listening to how excited they got on the Double Tap podcast about it, I'm maybe rethinking.

For starters, they played a clip that confirmed VoiceOver support. They also suggested that maybe you could charge it on the go, in which case the battery could last much longer. Anyone know anything about this?

They also made the point that £3,000 for speciality tech would be a bargain, so looking at it through the lens of what else I could buy at this price point, maybe it doesn't seem so bad.

I would absolutely love a device that could combine the Soundscape + Bose Frames experience I have now (but not for much longer) with all the stuff the Envision glasses can do, all tied up in a more mainstream interface; that could have a number of interesting benefits.

I'm also wondering how wearing a ski mask out and about will look. Not sure I could go banking wearing it! (Not that this has happened for many years...)

Really looking forward to seeing where this one goes.

Very hard to be even slightly interested in anything else they had to say.

By mr grieves on Tuesday, June 27, 2023 - 04:14

On Apple's promo page it says the battery lasts up to 2 hours, but all day if you leave it plugged in (well, duh...).

Maybe it's possible to use a portable charger to keep it going longer when on the go.

2 hours doesn't seem like a lot. I've found the Bose Frames battery a bit poor, but I think that goes something like 3.5 hours.

I'm guessing a lot of that battery drain will be coming from the monitor that most of us won't really care much for.

Must catch up on all the links posted in this article. I'm only dipping in and out of this as I get a few mins here and there.

I'm really interested to know how all the apps will work together. If I'm walking about, could it give me directions, let me know what's around me, maybe even alert me to obstacles my cane isn't going to find, then detect the door when I get there, maybe use a cheeky bit of OCR to tell me what the signs say, and then once I'm inside give me a Clew-like way to navigate my surroundings? Or would we forever be switching apps? We would get some funny looks walking about with ski goggles on, waving our hands all over the place in front of us.

As long as journeys don't last more than 2 hours of course.

My other worry for this is that it will be too expensive and too niche and won't catch on with the mainstream, but I really hope it finds a general audience.

By emassey on Tuesday, June 27, 2023 - 04:14

I looked into Ultrahaptics, and they were purchased by a company that makes hand-tracking devices; the merged company is now called Ultraleap. The only end-user products available so far are hand-tracking devices, but they do allow developers to test and build on their haptics software and hardware. From what I've read, they have been making progress, and I will definitely try to obtain an evaluation kit! The website mentions it being able to simulate the sensation of textures and complex shapes by creating complex configurations of ultrasound waves against your hand, and coordinating with their hand-tracking device. I think blindness technology companies should definitely look into this; I can imagine many great uses for it, for example creating tactile graphics easily without expensive machines, and allowing users to easily manipulate the graphics to get exactly what they want. I will see what it's like and what I can do with it if I can get a kit.

By mr grieves on Tuesday, June 27, 2023 - 04:14

Stupid question - but when you use Vision Pro I presume you are looking at a screen at all times. So I guess if you were to turn the monitor off, you wouldn't see anything at all in front of you. My vision doesn't do much for me these days, but I can see some changes in light still and occasionally some things stand out - like I might be able to see the bars of a gate if the lighting is right. I usually wear shades because I am sensitive to the light. So I guess if I could turn the screen off, then I would be walking around in total darkness? Maybe there might be a benefit to that because it might help focus the other senses, but I'm not sure if I'm ready or not.

A few years ago I tried some smart glasses called Oxsight which were just that, and even after a few minutes my eyes were feeling tired looking at a screen. Even though they aren't so great any more, I'm not sure how I'd feel about it now.

I still really, really want to try this and see how it works. It's a shame that Apple aren't the sort of company to allow other companies to take their tech and run with it. A version specific for blind and low vision users would be really interesting.

By kool_turk on Tuesday, June 27, 2023 - 04:14

If you watched the keynote, you would have noticed that all the promo videos were for indoor settings such as hotels, offices, and homes. If this was a pair of glasses, I wouldn't have a problem walking around with it on. However, a ski mask would just scream 'come and rob me!' People will see it and think that the person must be loaded if they're willing to walk around with something that costs $5288 in Australia.

By mr grieves on Tuesday, June 27, 2023 - 04:14

Yes, maybe you are right that this isn't going to be that suitable for wearing outdoors. That does limit its usefulness for the likes of us, I think.
I like the idea of having glasses that can do some of the things my phone can do, like the OCR or Be My Eyes etc. But the Envision glasses are cheaper for that, and I'd already decided those were too expensive for what is a small extra convenience.
Hopefully when we get a few iterations down the line it will start to shrink down to something a bit more subtle.

It would be great if a specialist company could take this and turn it into something designed purely for us, but I know Apple doesn't work that way.

By Dominic on Thursday, July 27, 2023 - 04:14

How much is the Vision Pro in Australian dollars, US dollars, and UK pounds?