Ahead of Global Accessibility Awareness Day on May 16, 2024, Apple introduced a host of new accessibility features for iPhone, iPad, Mac, and Vision Pro. Eye Tracking heads the long list of new features, letting you control your iPhone and iPad with just your eyes.
Eye Tracking, Music Haptics, Vocal Shortcuts, and Vehicle Motion Cues will arrive on eligible Apple devices later this year. These new accessibility features will most likely launch with iOS 18, iPadOS 18, visionOS 2, and the next version of macOS.
These accessibility previews have become an annual tradition for Apple. Typically, the curtain rises a few weeks before WWDC (the Worldwide Developers Conference), which begins on June 10, 2024. That should be the event where Apple shows off its next generation of major operating systems and artificial intelligence features.
Eye tracking looks really impressive
Eye Tracking is a key way to make the iPhone and iPad even more accessible. As stated in the release and captured on video, you can navigate iPadOS as well as iOS, open apps, and even control on-screen elements, all with just your eyes. It uses the front camera, artificial intelligence, and on-device machine learning throughout the experience.
You can look around the interface and use “Dwell Control” to interact with a button or other element. Gestures are also handled purely by eye movement: look at Safari, Phone, or another app, hold your gaze, and it will open.
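Apple hasn't published implementation details, but the core idea behind dwell selection is easy to illustrate: if the estimated gaze point stays inside an element's bounds for a set duration, treat that as a tap. Here's a minimal sketch of that timer logic in Swift; the `GazeSample` type and the one-second threshold are illustrative assumptions, not Apple's API.

```swift
import Foundation
import CoreGraphics

// Hypothetical gaze sample; Apple has not published an eye-tracking API.
struct GazeSample {
    let point: CGPoint          // estimated gaze location in screen coordinates
    let timestamp: TimeInterval
}

/// Minimal dwell-selection logic: fire an action when the gaze stays
/// inside a target's bounds for `dwellDuration` seconds.
final class DwellDetector {
    let dwellDuration: TimeInterval = 1.0   // illustrative threshold
    private var dwellStart: TimeInterval?

    func process(_ sample: GazeSample, target: CGRect, onSelect: () -> Void) {
        guard target.contains(sample.point) else {
            dwellStart = nil                // gaze left the target; reset the timer
            return
        }
        if let start = dwellStart {
            if sample.timestamp - start >= dwellDuration {
                onSelect()                  // held long enough: treat as a "tap"
                dwellStart = nil
            }
        } else {
            dwellStart = sample.timestamp   // gaze just entered the target
        }
    }
}
```

Resetting the timer whenever the gaze leaves the target is what separates a deliberate dwell from a passing glance.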
Most importantly, all setup and usage data stays on the device, so you can get set up with just your iPhone; no accessories are needed to use Eye Tracking. It is designed for people with physical disabilities and builds on other accessible ways to control an iPhone or iPad.
Vocal Shortcuts, Music Haptics, and Live Captions on Vision Pro
Another new accessibility feature is Vocal Shortcuts, designed for iPad and iPhone users with ALS (amyotrophic lateral sclerosis), cerebral palsy, stroke, or “acquired or progressive conditions that affect speech.” It lets you record a custom sound that Siri can learn and identify to launch a specific shortcut or execute a task. It arrives alongside Listen for Atypical Speech, designed for the same users, which opens voice recognition to a broader group.
These two features build on ones introduced in iOS 17, so it's great to see Apple continue to iterate. With Atypical Speech specifically, Apple is using artificial intelligence to learn and recognize different types of speech.
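Vocal Shortcuts is a system feature, and Apple hasn't said how it's built, but the general idea of reacting to a custom spoken trigger can be sketched with the public Speech framework. The `TriggerPhraseListener` class below is a hypothetical illustration, not Apple's implementation, and a real app would also need to request speech-recognition and microphone permission first.

```swift
import Speech
import AVFoundation

/// Illustrative sketch only: listens to the microphone and fires an action
/// when a custom trigger phrase appears in the live transcription.
final class TriggerPhraseListener {
    private let recognizer = SFSpeechRecognizer()
    private let audioEngine = AVAudioEngine()
    private let request = SFSpeechAudioBufferRecognitionRequest()

    /// Starts listening and calls `action` whenever `phrase` is heard.
    func listen(for phrase: String, action: @escaping () -> Void) throws {
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        // Feed microphone buffers into the recognition request.
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            self.request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        _ = recognizer?.recognitionTask(with: request) { result, _ in
            guard let text = result?.bestTranscription.formattedString else { return }
            if text.lowercased().contains(phrase.lowercased()) {
                action()   // trigger the custom task
            }
        }
    }
}
```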
Music Haptics on iPhone is designed to let deaf and hard-of-hearing users experience music. The Taptic Engine, the built-in motor that drives the iPhone's haptics, will play different vibrations, such as taps and textures, that mirror the audio of a song. At launch it will work with “millions of songs” in Apple Music, and there will be an open API for developers to make music from other sources accessible.
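The open Music Haptics API Apple mentions wasn't documented at announcement time, so here's a conceptual sketch of the kind of tap-and-texture patterns the feature describes, built with the existing Core Haptics framework; the beat timing and intensity values are made-up examples.

```swift
import CoreHaptics

/// Conceptual sketch using the existing Core Haptics framework, not the
/// unreleased Music Haptics API: four sharp "taps" over a soft "texture".
func playBeatPattern() throws {
    let engine = try CHHapticEngine()
    try engine.start()

    var events: [CHHapticEvent] = []
    // Sharp transient taps, one every half second (illustrative tempo).
    for beat in 0..<4 {
        events.append(CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
            ],
            relativeTime: Double(beat) * 0.5))
    }
    // A low-intensity continuous rumble underneath the taps.
    events.append(CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.3),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2)
        ],
        relativeTime: 0,
        duration: 2.0))

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```

Transient events give the sharp “taps” on each beat, while the low-intensity continuous event supplies the underlying “texture.”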
Additionally, Apple previewed a few other features and updates. Vehicle Motion Cues, coming to iPhone and iPad, aims to reduce motion sickness with animated dots on the screen that change as vehicle motion is detected, all without blocking what you see.
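Apple hasn't detailed how Vehicle Motion Cues senses the vehicle, but the underlying signal is the sort of thing the public Core Motion framework exposes. This hedged sketch reads user acceleration (gravity removed), which an app could map to its own on-screen cues; the 0.1-second update interval is an arbitrary choice.

```swift
import CoreMotion

/// Illustrative only: reads the public Core Motion signal an app could
/// use to drive its own motion cues; not Apple's implementation.
final class MotionCueSource {
    private let manager = CMMotionManager()

    /// Reports user acceleration (gravity removed) roughly 10x per second.
    func start(onUpdate: @escaping (CMAcceleration) -> Void) {
        guard manager.isDeviceMotionAvailable else { return }
        manager.deviceMotionUpdateInterval = 0.1
        manager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let motion else { return }
            // userAcceleration excludes gravity, so sustained values
            // indicate the vehicle speeding up, braking, or turning.
            onUpdate(motion.userAcceleration)
        }
    }

    func stop() { manager.stopDeviceMotionUpdates() }
}
```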
A major addition coming to visionOS, the software that powers Apple Vision Pro, is system-wide Live Captions. These let you see captions for dialogue in FaceTime conversations and for audio from apps, right in front of you. Apple's release notes that the feature was designed for deaf and hard-of-hearing users, but like all accessibility features, it can be found in Settings.
Since these are live captions on an Apple Vision Pro, you can move the window containing the captions and adjust its size like any other window. Visual accessibility within visionOS will also gain Reduce Transparency, Smart Invert, and Dim Flashing Lights functionality.
As for when they will ship, Apple notes in the announcement that the “new accessibility features [are] arriving later this year.” We'll be monitoring this closely and expect them to ship with the next generation of operating systems like iOS 18 and iPadOS 18, meaning people with a developer account should be able to try out these features in upcoming beta builds.
Considering that some of these features are powered by on-device artificial intelligence and machine learning, accessibility is just one of the areas where Apple believes AI can make an impact. We're likely to hear the tech giant share more about artificial intelligence and consumer-ready features at WWDC 2024.