Apple announces first iOS 18 accessibility features including eye tracking

Every spring, Apple previews new accessibility features coming to iPhone and iPad. This year, the company announced its first iOS 18 features, including Eye Tracking, Music Haptics, and a feature to reduce motion sickness.

Eye Tracking allows users with physical disabilities to control iPad or iPhone with their eyes. Music Haptics lets users who are deaf or hard of hearing experience music in a new way through the iPhone's Taptic Engine. The new accessibility features will arrive later this year with iOS 18 and iPadOS 18.

First iOS 18 features

In a press release, Apple announced new accessibility features arriving later this year. Although Apple does not mention iOS 18 and iPadOS 18 by name, history suggests that the announced features will ship in this fall's major software updates. Below, we list all the new accessibility features.

Eye tracking on iPad and iPhone

Eye Tracking is an AI-powered feature that lets you operate an iPad or iPhone with just your eyes, built in as standard. Designed for users with physical disabilities, it can be set up and calibrated in seconds using the front-facing camera.


Eye Tracking works in iPadOS and iOS apps, without any additional hardware or accessories. With Eye Tracking, you navigate through the elements of an app and use Dwell Control to activate each element. This gives you access to additional functions, such as physical buttons, swipes, and other gestures, with nothing but your eyes.

Music Haptics

Music Haptics allows users who are deaf or hard of hearing to experience music in a new way on iPhone. When this feature is enabled, the iPhone's Taptic Engine plays taps, textures, and refined vibrations that follow the audio of the music. Music Haptics works across millions of songs in the Apple Music catalog and will also be made available as an API, so that all developers can make the music in their apps more accessible.


New voice support features

With Vocal Shortcuts, you can assign custom utterances on iPhone and iPad that Siri can understand to launch shortcuts and perform complex tasks. Another new feature, Listen for Atypical Speech, enhances speech recognition so that a wider range of speech patterns is recognized.

On-device machine learning is used to recognize the user's speech patterns. Designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, ALS, or the effects of a stroke, these features offer entirely new options for customizing and controlling devices. They build on features introduced in iOS 17 for users who are nonspeaking or at risk of losing their ability to speak.

Vehicle Motion Cues against motion sickness

Vehicle Motion Cues on iPhone and iPad can help reduce motion sickness for passengers in moving vehicles. Research shows that motion sickness is often caused by a sensory conflict, where what a person sees does not match what they feel. This can make it uncomfortable for some users to use iPhone or iPad while riding in a moving vehicle.

Vehicle Motion Cues displays animated dots at the edges of the screen that track the vehicle's movements. This reduces the sensory conflict without interfering with the main content on the screen. Using built-in sensors, iPhone and iPad detect when the user is in a moving vehicle so the display can respond accordingly. The feature can be set to appear automatically on iPhone, or turned on and off in Control Center.

CarPlay gets Voice Control

CarPlay will be expanded with Voice Control, Color Filters, and Sound Recognition. Voice Control lets users navigate CarPlay and control apps with just their voice. With Sound Recognition, drivers or passengers who are deaf or hard of hearing can enable alerts for sounds such as car horns and sirens.


Thanks to Color Filters, colorblind users can use the CarPlay interface more easily. Additional visual accessibility features are available as well, such as Bold Text and Large Text.

Other updates

  • For users who are blind or have low vision, VoiceOver gains new voices, a flexible Voice Rotor, custom volume control, and the ability to customize VoiceOver keyboard shortcuts on Mac.
  • Magnifier gains a new Reader Mode and the option to quickly launch Detection Mode with the Action button.
  • Braille users benefit from a new way to start and stay in Braille Screen Input, allowing them to work faster and edit text more easily. In addition, Braille Screen Input gains Japanese language support, multi-line braille is supported via Dot Pad, and users can choose from multiple braille input and output tables.
  • Users with low vision benefit from Hover Typing, which shows larger text, in a different font and color, when typing in a text field.
  • For users at risk of losing their ability to speak, Personal Voice becomes available in Mandarin Chinese. Users who have difficulty pronouncing or reading full sentences can create a Personal Voice using shortened phrases.
  • For nonspeaking users, Live Speech gains categories and is now also compatible with Live Captions at the same time.
  • Users with physical disabilities can control their device with Virtual Trackpad for AssistiveTouch, which turns a small region of the screen into a resizable trackpad.
  • Switch Control adds the option to use the cameras in iPhone and iPad to recognize finger-tap gestures as switches.
  • Voice Control gains support for custom vocabularies and complex words.
  • This year, visionOS will be enhanced with several accessibility features, including systemwide Live Captions. Support is also added for more Made for iPhone hearing devices and cochlear hearing processors.
