Apple Adding Customizable People Detection Capability to LiDAR-Equipped iPhones and iPad Pro with Upcoming Releases of iOS 14.2 and iPadOS 14.2

By AppleVis, 30 October, 2020

Member of the AppleVis Editorial Team

In the upcoming releases of iOS 14.2 and iPadOS 14.2, Apple is adding a new People Detection feature to its Magnifier app on LiDAR-equipped devices that will enable blind and low vision users to determine the distance between themselves and nearby people.

With the introduction of People Detection, Apple is delivering the first example of how the LiDAR Scanner's ability to quickly and accurately scan and map nearby objects can be harnessed to offer significant added value to blind and low vision users of iPhone 12 Pro, iPhone 12 Pro Max, and the 2020 iPad Pro.

The LiDAR Scanner on these devices sends out laser light and measures how long it takes to bounce back. Because light travels at a constant speed, the round-trip time can be translated into a precise distance estimate. By repeating this process across a two-dimensional grid, the device generates a three-dimensional map of the objects around you.
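To make the arithmetic concrete, here is a minimal sketch of the time-of-flight calculation in Swift. The scanner performs this in dedicated hardware; the function and figures below are purely illustrative.

```swift
// Illustrative only: the time-of-flight arithmetic behind LiDAR ranging.
let speedOfLight = 299_792_458.0 // meters per second

/// Converts a measured round-trip time (in seconds) into a distance (in meters).
/// The pulse travels to the object and back, so the round trip is halved.
func distance(forRoundTripTime t: Double) -> Double {
    (speedOfLight * t) / 2.0
}

// A pulse returning after roughly 13.3 nanoseconds corresponds to about 2 meters.
let meters = distance(forRoundTripTime: 13.3e-9) // ≈ 1.99 m
```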

Using on-device machine learning and ARKit, People Detection analyses this data to identify people framed in the viewfinder and calculate the distance between the user and the closest person in the frame.
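Apple has not said how People Detection is implemented, but ARKit's public API hints at the general approach. The sketch below shows how a third-party app could combine ARKit's person segmentation with its estimated depth data; the class and its logic are our assumptions, not Apple's code.

```swift
import ARKit

/// A hypothetical sketch of combining person segmentation with depth in ARKit.
final class PersonDistanceEstimator: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Ask ARKit for per-pixel person segmentation plus estimated depth.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) else {
            return // Not available on this device.
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics = .personSegmentationWithDepth
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // segmentationBuffer marks which pixels belong to a person;
        // estimatedDepthData holds a depth value for those pixels.
        guard let personMask = frame.segmentationBuffer,
              let depthMap = frame.estimatedDepthData else { return }
        // Scanning both buffers for the smallest depth value on a "person"
        // pixel would yield the distance to the closest person in the frame.
        _ = (personMask, depthMap)
    }
}
```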

When your device detects that people are nearby, People Detection can let you know using sound, speech, and haptic feedback. The configurable options for feedback are:

  • The distance to the person, spoken in feet or meters;
  • An audible tone to alert you that a person is nearby, which can be configured to rise in pitch when they enter a user-defined proximity;
  • Haptic feedback that pulses faster the closer a person comes; and
  • The distance to the person displayed on screen, along with an arrow that points to the person being tracked.

Each of these can be used individually or in combination.
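As a sketch of how speech and haptic feedback could be driven from a distance reading, the Swift below announces the distance in either unit and fires haptic pulses; the names and structure are illustrative assumptions, not Apple's implementation.

```swift
import AVFoundation
import UIKit

/// Hypothetical feedback driver: speaks distances and fires haptic pulses.
final class ProximityFeedback {
    private let synthesizer = AVSpeechSynthesizer()
    private let haptics = UIImpactFeedbackGenerator(style: .heavy)

    /// Announce the distance in the user's preferred unit.
    func speak(distanceInMeters distance: Double, useFeet: Bool) {
        let text = useFeet
            ? String(format: "%.1f feet", distance * 3.28084)
            : String(format: "%.1f meters", distance)
        synthesizer.speak(AVSpeechUtterance(string: text))
    }

    /// Fire a single haptic pulse; a caller would shorten the interval
    /// between calls as the person gets closer, so the pulses feel faster.
    func pulse() {
        haptics.impactOccurred()
    }
}
```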

When used with AirPods Pro, People Detection takes advantage of Spatial Audio to place the feedback relative to the location of the person. So, if they are to your left, the feedback will be played through the left AirPod.
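A plain stereo pan is the simplest approximation of that effect. The snippet below is illustrative only; Apple's Spatial Audio performs full head-tracked placement rather than a simple left-right pan.

```swift
import AVFoundation

// Illustrative only: pan a feedback tone toward the side the person is on.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: nil)
// (To actually hear anything, the engine would need to be started and a
// tone buffer scheduled on the player.)

/// -1.0 plays fully in the left ear, 1.0 fully in the right.
func panFeedback(towardHorizontalOffset offset: Float) {
    player.pan = max(-1.0, min(1.0, offset))
}
```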

Feedback becomes more frequent as the person gets closer to you, and your device must be in Ring mode for the tone or the spoken distance to be heard.

People Detection is effective at a range of up to 5 meters and works regardless of whether a person is standing or sitting. As a person moves closer or farther away, the feedback updates in real time; and if a person moves out of the frame, People Detection shifts its focus to the next-closest person in the frame.
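That tracking behaviour amounts to always following the minimum distance among the people currently in frame, as in the hypothetical helper below.

```swift
/// Given estimated distances for every person currently in frame, return
/// the closest one within People Detection's roughly 5-meter range. When
/// that person leaves the frame, the minimum naturally shifts to the
/// next-closest person. Purely illustrative.
func closestPersonDistance(in distances: [Double]) -> Double? {
    distances.filter { $0 <= 5.0 }.min()
}
```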

Sample use cases for People Detection include knowing whether people are approaching while you are walking; identifying empty seats on public transit; and maintaining social distancing in situations such as the checkout line at a store.

Apple states that People Detection should not be used for navigation or in circumstances where you could be harmed or injured. Additionally, People Detection will not work in pitch darkness.

Our expectation is that People Detection will be an edge-case tool: one that people may not find practical or comfortable to use on a regular basis, but one which may have great value in specific situations. Accordingly, it's likely to become another tool in the iOS and iPadOS toolbox of many blind and low vision users, alongside apps such as Seeing AI, Envision AI, Be My Eyes, and BlindSquare.

While the technology is very likely still in its infancy, People Detection offers a taste of the potential value that the LiDAR Scanner can deliver to blind and low vision users when combined with the other features of iOS and iPadOS. It will be interesting to see what the developers of Seeing AI, Envision AI, and similar apps are able to do with the LiDAR Scanner in the future.

It seems reasonable to assume that much of what Apple has been developing and shipping publicly with machine learning, ARKit, and now the LiDAR Scanner is essentially a test bed for these technologies as part of the development work on the long-rumoured Apple Glass. If People Detection is indeed a taste of what Apple is working on behind closed doors, then we hope that Apple Glass soon switches from being a rumour to being a confirmed product with a release date.

Even if you don't currently have a LiDAR-equipped device, we would encourage you to take a few moments to explore the Magnifier app, as it is significantly improved in iOS 14 and iPadOS 14 and also now incorporates VoiceOver's new recognition features.

You can now configure the app so that the controls you routinely need are easily accessible, and it's clearer what each tool does. There are options for adjusting brightness and contrast, or for applying a filter that changes what's being magnified to a color scheme that works best for your personal circumstances.

New to the Magnifier app in iOS 14 and iPadOS 14 is a multi-shot option that allows you to capture multiple photos in one session, such as the different pages of a leaflet, and then review them all at once.

The Magnifier app also makes use of the VoiceOver Recognition features of iOS 14 and iPadOS 14 to deliver scene descriptions that are often very impressive, frequently exceeding those offered by third-party apps. Used in conjunction with People Detection, this can offer a good level of contextual awareness.

It's not there yet, but it's possible to see how, with a little more development and fuller use of the capabilities of the software and hardware, the Magnifier app could become a viable alternative to apps such as Seeing AI, Envision AI, and Supersense.

To access the Magnifier app, you first need to enable it in Settings > Accessibility.

Once enabled, Magnifier can be added to the Accessibility Shortcut, opened using Siri, or accessed via the App Library on iOS.

People Detection is only supported on iPhone 12 Pro, iPhone 12 Pro Max, and the 2020 iPad Pro.

There is no confirmed release date for iOS 14.2 and iPadOS 14.2, but the GM versions have been released to developers and beta testers today, which suggests that the public releases will probably be next week.

Comments

By Missy Hoppe on Tuesday, November 24, 2020 - 20:07

For financial and color reasons, I have been leaning towards getting the regular iPhone 12, but the more I'm hearing about the LiDAR sensor, the more I'm starting to believe that I should choose the accessibility benefits of the iPhone Pro over my other, less sensible considerations. Still not planning to buy any time soon, so I will be alert to any posts, podcasts or whatever that actually demonstrate the usefulness of the LiDAR and other Pro features.

By Unregistered User (not verified) on Tuesday, November 24, 2020 - 20:07

I look forward to the other benefits to come when I upgrade in a few years.

By WellF on Tuesday, November 24, 2020 - 20:07

Now I want one of these new iPhones. Not that I can afford even a third of the price of one. Maybe when they launch the iPhone SE 3rd generation.

By Holger Fiallo on Tuesday, November 24, 2020 - 20:07

Although it is nice, the battery will be affected. Sadly, the battery on the 12 Pro is not that good. Maybe when Apple releases iOS 14.2 next week, it will be better.

By Esan on Tuesday, November 24, 2020 - 20:07

People Detection was available in the first four iOS 14.2 betas and worked on the XS and 11 iPhones. It used the AR camera and was very accurate. This is Apple's way of making us buy the new phone even though the older ones were capable of handling this feature. The new 12 Pro phones may be a little more accurate than the 11 and XS, but it was a great feature while I had it.
Maybe if we all complain to Apple, we can get this feature back.