WWDC might be a month away, but Apple has revealed a bunch of new accessibility features ahead of the developer conference – and one in particular has taken the internet by storm. No longer confined to the realm of Vision Pro, eye tracking has come to iPad and iPhone, letting users control the devices using their eyes.
Likely to be released as part of iOS 18, which is expected to debut at WWDC, the new features use Apple silicon and AI to "further Apple’s decades-long commitment to designing products for everyone."
Without the need for any additional hardware or accessories, eye tracking uses the front-facing camera to set up and calibrate in seconds. According to Apple, the tool lets users navigate through the elements of an app and use Dwell Control to "activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes."
Along with eye tracking, Apple also revealed Music Haptics, which uses the Taptic Engine to play taps, textures, and refined vibrations in time with the audio of a song, helping users who are deaf or hard of hearing experience music. And Vehicle Motion Cues can help reduce motion sickness, with animated dots on the edges of the screen representing changes in vehicle motion "to help reduce sensory conflict without interfering with the main content."
It's curious that Apple opted to announce these features via a press release rather than waiting for WWDC, but it could mean there's a whole host of super impressive updates on the way. And we'd bet money on them involving AI.