Greg Morris

Apple’s New Features For Cognitive Accessibility

Apple has announced new accessibility features and tools to support people with cognitive, vision, hearing, and mobility needs. The tools are designed to be easy to use and rely on on-device machine learning to protect user privacy, continuing Apple’s longstanding commitment to accessibility and to making products for everyone.

Assistive Access is a feature that distills apps and experiences into their essential features, making them easier to use for people with cognitive disabilities. The feature focuses on activities that are foundational to iPhone and iPad, such as connecting with loved ones, capturing and enjoying photos, and listening to music. Assistive Access also includes a distinct interface with high-contrast buttons and large text labels, as well as tools to help supporters tailor the experience for the individual they support.

For users who are nonspeaking or at risk of losing their ability to speak, Apple has introduced Live Speech and Personal Voice. Live Speech lets users type what they want to say and have it spoken aloud during phone and FaceTime calls, as well as in in-person conversations. Personal Voice goes a step further: by recording themselves reading a set of text prompts, users can create a synthesized voice that sounds like them, which integrates with Live Speech so they can speak with their own voice.
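Apple also exposes Personal Voice to third-party apps through AVFoundation’s speech synthesis APIs on iOS 17 and later. Here is a minimal Swift sketch of the pattern: request authorization, look for a Personal Voice among the available voices, and fall back to the system default if none is available. The `TypedSpeech` wrapper is a hypothetical name for illustration, not part of Apple’s API.

```swift
import AVFoundation

// A minimal sketch: speak typed text aloud, preferring the user's
// Personal Voice when one exists and access has been granted (iOS 17+).
final class TypedSpeech {
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        // Apps must request permission before Personal Voices become visible.
        AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
            let personalVoice = AVSpeechSynthesisVoice.speechVoices()
                .first { $0.voiceTraits.contains(.isPersonalVoice) }

            let utterance = AVSpeechUtterance(string: text)
            if status == .authorized, let voice = personalVoice {
                utterance.voice = voice  // sounds like the user
            }                            // otherwise the system default voice is used
            self.synthesizer.speak(utterance)
        }
    }
}
```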

Point and Speak, part of Detection Mode in Magnifier, makes it easier for users with vision disabilities to interact with physical objects that carry several text labels. For example, while using a household appliance, Point and Speak combines input from the Camera app, the LiDAR Scanner, and on-device machine learning to announce the text on each button as the user moves their finger across the keypad.
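Point and Speak itself is a system feature rather than a public API, but the kind of on-device text recognition it builds on is available to developers through Apple’s Vision framework. A minimal sketch, assuming `cgImage` holds a frame from the camera:

```swift
import Vision

// Recognize printed text in an image entirely on device and return
// the best candidate string for each piece of text found.
func recognizeLabels(in cgImage: CGImage) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate  // slower but better for small labels

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    return (request.results ?? []).compactMap {
        $0.topCandidates(1).first?.string
    }
}
```

Feeding those strings to a speech synthesizer as the user’s finger moves over the object approximates the announce-on-point behaviour described above, though Apple’s implementation also fuses LiDAR depth data.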

These new features and tools build on Apple’s long history of making technology accessible, so that everyone has the opportunity to create, communicate, and do what they love. The groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways.
