New Apple Accessibility Update: Eye Tracking, Music Haptics, and Vocal Shortcuts

Apple has announced a range of new accessibility features set to launch later this year. Among these is Eye Tracking, which enables users with physical disabilities to control their iPads or iPhones using their eyes. Music Haptics will provide a new way for deaf or hard-of-hearing users to experience music through the Taptic Engine in iPhones.

Vocal Shortcuts will let users perform tasks by making custom sounds. Additionally, Vehicle Motion Cues will help reduce motion sickness when using an iPhone or iPad in a moving vehicle.

More accessibility features will also be added to visionOS. These updates demonstrate Apple's ongoing commitment to creating inclusive products, leveraging its hardware and software capabilities, including Apple silicon, artificial intelligence, and machine learning.

Eye Tracking Comes to iPad and iPhone

Apple is introducing Eye Tracking for iPad and iPhone, designed for users with physical disabilities. Powered by artificial intelligence, the feature lets users control their device with eye movements alone.

Eye Tracking is set up and calibrated in seconds using the front-facing camera, and with on-device machine learning, all data used for the feature stays on the device and is not shared with Apple.

Eye Tracking works across iPadOS and iOS apps and requires no additional hardware or accessories. Users can navigate app elements and use Dwell Control to activate them, performing actions such as pressing buttons, swiping, and other gestures, all with their eyes.

Music Haptics Makes Songs More Accessible

Music Haptics is designed to help deaf or hard-of-hearing users experience music on the iPhone. This feature, when activated, uses the Taptic Engine to provide taps, textures, and vibrations in sync with the music.

Music Haptics works with millions of songs in the Apple Music catalog and will be available as an API for developers to integrate into their apps.
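
Apple has not published the developer-facing details of the Music Haptics API in this announcement, so the following is only a minimal sketch of the underlying idea using the existing Core Haptics framework: play a transient tap on the Taptic Engine at each beat timestamp. The BeatHaptics class name, the beat list, and the intensity and sharpness values are illustrative assumptions, not part of the announced API.

```swift
import CoreHaptics

// Minimal sketch: drive the Taptic Engine with Core Haptics, playing one
// transient "tap" per beat. This is not the Music Haptics API itself.
final class BeatHaptics {
    private var engine: CHHapticEngine?

    init() throws {
        // Only proceed on hardware with a haptics-capable Taptic Engine.
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        engine = try CHHapticEngine()
        try engine?.start()
    }

    /// Plays one tap for each beat timestamp, in seconds from now.
    func playTaps(atBeats beats: [TimeInterval]) throws {
        let events = beats.map { time in
            CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
                ],
                relativeTime: time
            )
        }
        let pattern = try CHHapticPattern(events: events, parameters: [])
        let player = try engine?.makePlayer(with: pattern)
        try player?.start(atTime: CHHapticTimeImmediate)
    }
}
```

A real Music Haptics integration would presumably take its timing data from the new API rather than a hand-built beat list, but the playback side of such an app would look broadly like this.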

New Features for a Wide Range of Speech

Vocal Shortcuts allow iPhone and iPad users to assign custom sounds that Siri can recognize to launch shortcuts and perform tasks. Listen for Atypical Speech is another new feature that improves speech recognition for various speech patterns.

This feature, using on-device machine learning, is designed for users with conditions affecting speech, such as cerebral palsy, ALS, or those recovering from a stroke. These enhancements build on features in iOS 17 for users who are non-speaking or at risk of losing their ability to speak.

"Artificial intelligence can significantly improve speech recognition for millions with atypical speech," said Mark Hasegawa-Johnson, principal investigator of the Speech Accessibility Project at the Beckman Institute for Advanced Science and Technology, University of Illinois Urbana-Champaign. "We are excited that Apple is introducing these new accessibility features. The Speech Accessibility Project was established as a community-backed initiative to help companies and universities enhance speech recognition, and Apple has been a key supporter of this effort."

Vehicle Motion Cues Can Help Reduce Motion Sickness

Vehicle Motion Cues is a new feature for iPhone and iPad designed to reduce motion sickness for passengers. Motion sickness often occurs due to a sensory conflict between what a person sees and what they feel, making it hard to use an iPhone or iPad in a moving vehicle.

Vehicle Motion Cues uses animated dots on the screen edges to represent changes in vehicle motion, helping to reduce this sensory conflict without affecting the main content. This feature uses built-in sensors to detect when the user is in a moving vehicle and responds accordingly. It can be set to show automatically or be toggled on and off in the Control Center.
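
Vehicle Motion Cues is a system-level feature, and the announcement does not describe a developer API for it. As a rough illustration of the kind of on-device signal it could rely on, the sketch below uses the existing Core Motion framework to report when the device detects automotive motion; the VehicleDetector name and callback shape are assumptions made for the example.

```swift
import CoreMotion

// Rough sketch: observe whether the device reports automotive motion.
// Vehicle Motion Cues itself is handled by the system, not by app code.
final class VehicleDetector {
    private let activityManager = CMMotionActivityManager()

    /// Calls `onChange` with `true` while motion activity reports a vehicle.
    func start(onChange: @escaping (Bool) -> Void) {
        guard CMMotionActivityManager.isActivityAvailable() else { return }
        activityManager.startActivityUpdates(to: .main) { activity in
            guard let activity else { return }
            onChange(activity.automotive)
        }
    }

    func stop() {
        activityManager.stopActivityUpdates()
    }
}
```

Motion activity access requires the user's permission (an NSMotionUsageDescription entry in Info.plist), another reminder that signals like this are processed on the device.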

CarPlay Gets Voice Control and More Accessibility Updates

New accessibility features are coming to CarPlay, including Voice Control, Color Filters, and Sound Recognition. Voice Control lets users operate CarPlay and its apps with their voice. Sound Recognition alerts users who are deaf or hard of hearing to car horns and sirens. Color Filters make the CarPlay interface easier to use for colorblind users, alongside additional visual accessibility options such as Bold Text.

Accessibility Features Coming to visionOS

visionOS will soon include new accessibility features such as systemwide Live Captions, so users who are deaf or hard of hearing can follow spoken dialogue in live conversations and in audio from apps. Live Captions for FaceTime in visionOS will enable more users to connect and collaborate using their Persona.

Apple Vision Pro will also add the ability to move captions during Apple Immersive Video, as well as support for additional hearing devices and cochlear implants. Other vision accessibility updates include Reduce Transparency, Smart Invert, and Dim Flashing Lights for users who have low vision or want to avoid bright or flashing lights.

Additional Updates

  • VoiceOver Improvements: For users who are blind or have low vision, VoiceOver will introduce new voices, a flexible Voice Rotor, custom volume control, and customizable keyboard shortcuts on Mac.
  • Magnifier Enhancements: Magnifier will feature a new Reader Mode and an easy way to launch Detection Mode using the Action button.
  • Braille Screen Input Advancements: Braille users will benefit from faster control and text editing with a new way to start and stay in Braille Screen Input. This update includes Japanese language support for Braille Screen Input, multi-line braille with Dot Pad, and options to select different input and output tables.
  • Low Vision Features: Hover Typing will display larger text in the user's preferred font and color when typing in a text field.
  • Personal Voice and Live Speech: Personal Voice will be available in Mandarin Chinese for users at risk of losing their ability to speak, and those who have difficulty with full sentences can create a Personal Voice using shorter phrases. Live Speech will now include categories and work simultaneously with Live Captions for nonspeaking users; a brief sketch of the related Personal Voice API appears after this list.
  • AssistiveTouch and Switch Control: For users with physical disabilities, Virtual Trackpad for AssistiveTouch will allow control of the device using a small, resizable region of the screen. Switch Control will include the option to use iPhone and iPad cameras to recognize finger-tap gestures as switches.
  • Voice Control Updates: Voice Control will support custom vocabularies and complex words, making it easier for users to operate their devices.
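
For the Personal Voice item above, here is a minimal sketch of how an app can already request access to a personal voice and speak with it through AVSpeechSynthesizer (available since iOS 17). The function name is illustrative, and whether a personal voice exists depends on the user having created one in Settings and granted the app access.

```swift
import AVFoundation

// Minimal sketch: request Personal Voice access and speak with it if granted.
// Falls back to the default system voice when no personal voice is available.
func speakWithPersonalVoice(_ text: String, using synthesizer: AVSpeechSynthesizer) {
    AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
        guard status == .authorized else { return }
        // Pick the first voice the user has created as a Personal Voice.
        let personalVoice = AVSpeechSynthesisVoice.speechVoices()
            .first { $0.voiceTraits.contains(.isPersonalVoice) }
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = personalVoice
        synthesizer.speak(utterance)
    }
}
```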

Conclusion

Apple's new accessibility features are set to make a significant impact, offering innovative solutions for users with diverse needs. From Eye Tracking and Music Haptics to the enhanced capabilities of VoiceOver, Magnifier, and Braille Screen Input, these updates demonstrate Apple's commitment to inclusivity and user-friendly technology. The advancements in visionOS and CarPlay, along with the new features for speech and physical disabilities, underscore Apple's ongoing efforts to create accessible and empowering tools for everyone.

Stay updated on the latest tech news and accessibility innovations by subscribing to my newsletter. Follow me for more insights and updates on how technology continues to evolve and improve our lives.
