In case you missed it yesterday, Apple just gave us our first official look at iOS 16. Of course, it didn't actually call it that: as it did last year, the company highlighted a number of upcoming accessibility improvements to its operating systems, saying only that they're coming "later this year with software updates across Apple platforms." That's basically code for iOS 16, iPadOS 16, watchOS 9, and macOS 13.
While the features Apple announced are fantastic improvements for those with various vision, speech, or motor impairments, they also point to some broader advances, particularly in AI and machine learning, that we're likely to see throughout the next generation of Apple's operating systems. Reading into the announcements, here are a few of the biggest developments we can expect to see in iOS 16:
Live Captions = Better speech recognition
Android has had a live captions feature since version 10, and now Apple is getting it three years later. With this setting enabled, your iPhone or Mac (if it has Apple silicon) will automatically produce captions in real time for virtually any audio content, including videos, FaceTime calls, phone calls, and more. It's a natural extension of the on-device speech processing introduced last year in iOS 15, but it points to a big improvement in the sophistication of that feature.
We hope this means an improvement in Siri's understanding of your commands and dictation, but one could easily see these capabilities showing up in other places. Take the Notes app, for example, where one can imagine a "transcribe" feature that creates text from any audio recording or video. If Apple is billing it as an accessibility feature, Live Captions' transcription will need to be rock-solid, and that opens up a world of possibilities for the rest of iOS 16.
Apple Watch mirroring = AirPlay enhancements
Another accessibility feature coming later this year will let you mirror your Apple Watch on your iPhone and use your iPhone's display to control your watch. It's designed to make elements easier to manipulate for those with motor control issues and to let disabled users take advantage of all of the iPhone's additional accessibility features.

Apple will enable Apple Watch mirroring later this year, thanks to new AirPlay advancements.
Apple
Still, Apple Watch mirroring has intriguing implications. Apple says the feature "uses hardware and software integration, including advances built on AirPlay." That doesn't necessarily mean we'll see something like AirPlay 3, but it sounds like improvements are coming to AirPlay, probably in the form of new frameworks for developers.
Notably, this seems to let devices communicate control intent in a way that AirPlay doesn't right now. AirPlay pushes audio and video out to devices and allows for simple controls (play/pause, volume, and so on), but letting AirPlay-compatible devices send advanced touch controls back seems new, and it could lead to some incredible new features.
Here's a killer scenario: if Apple can mirror your Apple Watch to your iPhone and let you fully interact with it, it could probably mirror your iPhone to your Mac or iPad and do the same. That alone would be a game-changing feature.
Door Detection = Real-world AR object recognition
Apple has been quietly improving its object recognition for some time now. For example, you can search for all kinds of things in the Photos app and get pictures containing them, and iOS 15 added a neat Visual Lookup feature that uses the camera to identify plants and animals, famous landmarks, artwork, and other objects.
Now Apple has announced that it will add the ability to detect doors in real time using the Magnifier app, including judging their distance and reading text on them. It's only for devices with LiDAR (which is how it measures range), but it points to a broader improvement in object recognition.

The iPhone's camera will soon be able to detect whether doors are open.
Apple
The most obvious use case is augmented reality glasses or goggles, which aren't expected to launch until next year at the earliest. But Apple already has a robust ARKit framework for developers, which is used for AR apps and includes the ability to recognize and track certain everyday objects. And it wouldn't be out of character for Apple to preview new technology that won't launch for a while.
It seems reasonable to assume that the Door Detection feature is a natural extension of the work Apple is already doing in augmented reality scene and object detection. So don't be surprised if you see a demo at WWDC of new ARKit framework features for developers. It may start in iOS 16 with new AR apps, but it's also bound to show up in much bigger projects as Apple continues to march its AR software tools toward eventual integration in AR glasses.