Every spring, Apple gives a taste of the accessibility features in its upcoming software updates for iPhone, iPad, Mac and Apple Watch. This year it has announced its first macOS 16 and iOS 19 features, including Magnifier for Mac, a special reading mode and Live Captions for Apple Watch.
With Magnifier for Mac, users can point their iPhone or iPad camera at, for example, a book and display its text larger on the Mac screen. In addition, the App Store gets accessibility labels, a new system-wide reading mode is introduced, and Live Captions come to Apple Watch. The new accessibility features will become available later this year together with iOS 19, iPadOS 19, macOS 16 and watchOS 12.
First iOS 19 features
In a press release, Apple has announced new accessibility features that will become available later this year. Although Apple does not mention iOS 19 and iPadOS 19 by name, past years teach us that the announced features will arrive with the major software updates this fall. We list all the new accessibility features for you.
Accessibility labels in the App Store
Accessibility labels add a new section to App Store product pages that highlights the accessibility features of apps and games. The labels give users a new way to learn whether an app will be accessible to them before they download it, and give developers the opportunity to better inform users about the features their app supports. These include VoiceOver, Voice Control, Larger Text, Sufficient Contrast, Reduced Motion, captions and more.
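To give an idea of what such a label covers: whether an app supports VoiceOver or Larger Text comes down to how it is built. The SwiftUI sketch below is our own hypothetical illustration, not something from Apple's announcement:

```swift
import SwiftUI

// Hypothetical example: the kind of support an app would report
// under the new accessibility labels.
struct PlayButton: View {
    var body: some View {
        Button(action: { /* start playback */ }) {
            Image(systemName: "play.fill")
        }
        // VoiceOver speaks this label instead of describing the icon.
        .accessibilityLabel("Play song")
        // Dynamic Type: the symbol scales with the Larger Text setting.
        .font(.body)
    }
}
```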

Magnifier for Mac
Since 2016, the Magnifier app on iPhone and iPad has given users who are blind or have low vision tools to zoom in, read text and detect objects around them. With macOS 16, Magnifier comes to the Mac to make the physical world more accessible to users with low vision. The Magnifier app for Mac connects to a camera so that users can zoom in on their surroundings, such as a screen, book or whiteboard. Magnifier works with Continuity Camera on iPhone and with connected USB cameras, and supports reading documents using Desk View.
With multiple live session windows, users can multitask by viewing a presentation with a webcam while simultaneously following along in a book using Desk View. With customized views, users can adjust brightness, contrast, color filters and even perspective to make text and images easier to see.

Custom views can also be captured, grouped and saved to return to later. In addition, Magnifier for Mac is integrated with another new accessibility feature, Accessibility Reader, which transforms text from the physical world into a custom, readable format.
Introduction of Accessibility Reader
Accessibility Reader is a new system-wide reading mode designed to make text easier to read for users with a wide range of disabilities, such as dyslexia or low vision.
Accessibility Reader is available on iPhone, iPad, Mac and Apple Vision Pro and gives users new ways to customize text and focus on the content they want to read, with extensive options for font and color, as well as support for Spoken Content.

Accessibility Reader can be launched from any app and is built into the Magnifier app for iOS, iPadOS and macOS, so users can interact with real-world text, such as in books or on dining menus.
Live Captions come to Apple Watch
For users who are deaf or hard of hearing, Live Listen controls come to Apple Watch with a new set of features, including real-time Live Captions. Live Listen turns the iPhone into an external microphone that streams content directly to AirPods, Made for iPhone hearing aids or Beats headphones. When a session is active on the iPhone, users can view Live Captions of what their iPhone hears on a paired Apple Watch while listening to the audio.
The Apple Watch serves as a remote control to start or stop Live Listen sessions, or to jump back within a session to catch something that may have been missed. With Apple Watch, Live Listen sessions can be controlled from across the room, so there is no need to get up in the middle of a meeting or during class. Live Listen can be used together with the hearing health features available on AirPods Pro 2, including the first-of-its-kind clinical-grade Hearing Aid feature.
A new Braille experience
Braille Access is a completely new experience that turns iPhone, iPad, Mac and Apple Vision Pro into a full-featured braille note taker that is deeply integrated into the Apple ecosystem. With Braille Screen Input or a connected braille device, users can easily enter braille text. With Braille Access, users can quickly take notes in braille format and perform calculations with Nemeth Braille, a braille code often used in classrooms for math and science.
Users can open Braille Ready Format (BRF) files directly from Braille Access, unlocking a wide range of books and files previously created on braille note-taking devices. And an integrated form of Live Captions enables users to transcribe conversations directly on braille displays in real time.
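As background: BRF is a plain-text format in which each byte is an ASCII character standing for one six-dot braille cell. The Swift sketch below is our own illustration of that mapping, with a deliberately partial table; a real converter covers all 64 cells:

```swift
import Foundation

// Partial table from North American Braille ASCII to Unicode braille
// patterns (U+2800 block). Illustrative only; not Apple's implementation.
let brailleASCII: [Character: Character] = [
    "A": "\u{2801}", // dot 1
    "B": "\u{2803}", // dots 1-2
    "C": "\u{2809}", // dots 1-4
    "L": "\u{2807}", // dots 1-2-3
    " ": "\u{2800}", // blank cell
]

// Render one line of a BRF file as visible Unicode braille.
func unicodeBraille(fromBRF line: String) -> String {
    String(line.uppercased().map { brailleASCII[$0] ?? "?" })
}

print(unicodeBraille(fromBRF: "ABC L")) // ⠁⠃⠉⠀⠇
```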
Other updates
- For users who are blind or have low vision, visionOS will expand its vision accessibility features using the advanced camera system on Apple Vision Pro. With powerful updates to Zoom, users can magnify everything in view. For VoiceOver users, Live Recognition in visionOS uses on-device machine learning to describe surroundings, find objects, read documents and more.
- Background Sounds becomes easier to personalize with new EQ settings, the option to stop automatically after a set time, and new actions for automations in Shortcuts. Background Sounds can help minimize distractions and increase a sense of focus and relaxation, which, according to some users, can help with symptoms of tinnitus.
- Vehicle Motion Cues, which can help reduce motion sickness when riding in a moving vehicle, comes to Mac, along with new ways to customize the animated on-screen dots on iPhone, iPad and Mac.

- Eye Tracking users on iPhone and iPad now have the option to use a switch or dwell to make selections. In addition, typing on a keyboard with Eye Tracking is made easier on iPhone, iPad and Apple Vision Pro.
- With Head Tracking, users can more easily control iPhone and iPad with head movements, similar to Eye Tracking.
- Sound Recognition adds Name Recognition, a new way for users who are deaf or hard of hearing to know when their name is being called.
- Updates to CarPlay include support for Large Text. With updates to Sound Recognition in CarPlay, drivers or passengers who are deaf or hard of hearing can now be notified of the sound of a crying baby, in addition to sounds outside the car such as horns and sirens.
- Share Accessibility Settings is a new way for users to quickly and temporarily share their accessibility settings with another iPhone or iPad. This is great for borrowing a friend's device or using a public kiosk in a setting such as a cafe.
- Music Haptics on iPhone becomes more customizable, with the option to experience haptics for a whole song or for vocals only, as well as the option to adjust the overall intensity of taps, textures and vibrations.