iPhone screen with braille symbols
Image Credits: Apple

Apple announces package of new accessibility features for 2025

Apple announced this Tuesday (13) a series of accessibility features coming later this year, designed to empower users with diverse needs across its ecosystem. The upcoming updates include new tools for discovering accessible apps, improved navigation and interaction methods, and expanded support for communication and environmental understanding.


A key announcement is the introduction of Accessibility Nutrition Labels on the App Store. These labels will give users detailed information about the accessibility features an app supports, such as VoiceOver, Voice Control, Larger Text, Sufficient Contrast, Reduce Motion, and captions, enabling more informed download decisions.

A magnifier to enlarge the physical world

For users who are blind or have low vision, a new Magnifier app for Mac will use the computer's camera system (including Continuity Camera and connected USB cameras) to magnify the physical world, supporting features such as reading documents with Tabletop View and offering customizable views for better visibility.

A significant update for braille users is Braille Access, which turns iPhone, iPad, Mac, and Apple Vision Pro into integrated braille note takers. The feature includes a built-in app launcher, support for taking notes and performing calculations in braille, the ability to open Braille Ready Format (BRF) files, and Live Captions transcribed directly onto braille displays.


Live Captions on Apple Watch

A systemwide Accessibility Reader is also being introduced on iPhone, iPad, Mac, and Apple Vision Pro. This new reading mode offers extensive customization options for font, color, and spacing, plus support for Spoken Content, making text easier to read for users with dyslexia or low vision. It will also be integrated with the Magnifier app so users can interact with real-world text.

Live Captions are also coming to Apple Watch, providing real-time transcriptions of audio from Live Listen sessions initiated on a paired iPhone. The Apple Watch will serve as a remote control for these sessions.

visionOS on Apple Vision Pro will receive enhanced accessibility features, including powerful updates to Zoom that use the main camera to magnify the user's surroundings. Live Recognition in visionOS will use on-device machine learning to describe surroundings, identify objects, and read documents for VoiceOver users. A new API will also allow approved apps, such as Be My Eyes, to access the main camera for live visual assistance.


Additional updates announced include:

  • Background Sounds become more personalized with new EQ settings, scheduled stops, and automations in Shortcuts.
  • Personal Voice creation becomes faster and more natural, requiring only 10 recorded phrases, and gains support for Spanish (Mexico).
  • Vehicle Motion Cues comes to Mac, with new customization options for the on-screen dots on iPhone, iPad, and Mac.
  • Eye Tracking is improved on iPhone and iPad with the option to use a switch or dwell to make selections, along with improved keyboard typing on iPhone, iPad, and Apple Vision Pro.
  • Head Tracking lets users control iPhone and iPad more easily with head movements.
  • Support for Brain-Computer Interfaces (BCIs) via Switch Control on iOS, iPadOS, and visionOS for users with severe motor disabilities.
  • An updated Assistive Access with a customized Apple TV app and an API for developers to create tailored experiences.
  • Music Haptics becomes more customizable on iPhone, allowing haptics for an entire song or just the vocals, with adjustable intensity.
  • Sound Recognition gains Name Recognition to alert users when their name is called.
  • Updates to Voice Control, including a developer mode in Xcode, vocabulary syncing across devices, and expanded language support.
  • Expanded language support for Live Captions.
  • Updates to CarPlay, including support for Large Text and improved Sound Recognition to notify drivers or passengers who are deaf or hard of hearing of sounds like a crying baby, horns, and sirens.
  • Share Accessibility Settings, a new feature to quickly and temporarily share accessibility preferences between iPhones or iPads.
