Apple is making waves in accessibility with new features coming to iPhones and iPads. Designed to empower users with diverse needs, these features include eye-tracking control, voice-activated shortcuts, and an enhanced haptic music experience. The announcement comes just ahead of Global Accessibility Awareness Day.
Previously, Apple’s iOS and iPadOS supported eye-tracking, but it required extra hardware. Now, for the first time, Apple is introducing built-in eye-tracking that doesn’t need additional accessories.
This feature utilizes the front-facing camera and AI to understand where a user is looking and what actions they intend (like swiping or tapping). Additionally, Dwell Control detects when a user pauses their gaze on an element, signifying selection intent.
“Vocal Shortcuts” takes Apple’s voice control to the next level. Users can now assign custom sounds or words to trigger shortcuts and complete tasks.
Imagine launching an app with a simple “Ah!” after setting that sound as a shortcut with Siri.
Apple also introduced “Listen for Atypical Speech”, which uses machine learning to recognize unique speech patterns. This feature is designed to help people with conditions that affect their speech, like cerebral palsy, ALS, or stroke.
Building on previous advancements like last year’s “Personal Voice” feature, Apple is now introducing “Music Haptics”.
This innovative feature lets users experience the vast library of Apple Music through a series of taps, textures, and vibrations – a boon for those who are deaf or hard of hearing. Additionally, Music Haptics will be available as an API, allowing music app developers to integrate this new and accessible way of experiencing music into their apps.
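Apple has not yet published details of the Music Haptics API, but the idea builds on the kind of haptic playback developers can already drive on iPhone with the existing Core Haptics framework. The sketch below is a minimal, hypothetical illustration (not Apple’s Music Haptics API) that plays a short rhythm of transient “taps” of alternating intensity, roughly the sort of tactile beat Music Haptics layers over a song.

```swift
import CoreHaptics

// Illustrative Core Haptics sketch: four transient "taps" of alternating
// intensity, half a second apart. This is not Apple's Music Haptics API,
// which had not been documented at the time of writing.
func playTapPattern() throws {
    // Haptic playback requires supported hardware (recent iPhones).
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // Build four beats, alternating strong and soft intensity.
    let events: [CHHapticEvent] = (0..<4).map { beat in
        let intensity = CHHapticEventParameter(
            parameterID: .hapticIntensity,
            value: beat % 2 == 0 ? 1.0 : 0.5
        )
        let sharpness = CHHapticEventParameter(
            parameterID: .hapticSharpness,
            value: 0.6
        )
        return CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [intensity, sharpness],
            relativeTime: Double(beat) * 0.5
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```

In a real music app, a pattern like this would be synchronized to the audio track; Apple’s forthcoming API is presumably what handles that synchronization for Apple Music content.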