
Apple unveils new features to aid people with disabilities, introduces eye tracking


Mathures Paul, Calcutta | Published 16.05.24, 09:18 AM
Representational image (file image)

Apple has announced a number of new features to help users with physical disabilities navigate the iPad and the iPhone. These include “Eye Tracking”, an artificial intelligence-powered feature that lets users navigate the iPad and iPhone with just their eyes.

Eye Tracking takes advantage of the front-facing camera and combines it with on-device machine learning to work across iPadOS and iOS apps without the need for additional hardware or accessories.


Users will be able to navigate through the elements of an app and use Dwell Control (which triggers an interaction when the eyes rest on a control for a set amount of time) to activate each element and access functions such as physical buttons, swipes and other gestures. The data used to set up and control the feature will be kept on the device.

The Cupertino-headquartered company is also working on “Music Haptics”, which will help users who are deaf or hard of hearing experience music on the iPhone.

Once the feature is enabled, the Taptic Engine (a component that uses haptic technology to provide users with tactile feedback) on the iPhone plays taps, textures and refined vibrations in time with the audio of the music. Music Haptics will work across millions of songs in the Apple Music catalogue, and it will be available as an API for developers to make music more accessible in their apps.
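
Apple has not yet published details of the Music Haptics API itself. As a rough illustration of how an app drives the Taptic Engine today, here is a minimal sketch using the existing Core Haptics framework; the timings and intensity values are invented for illustration, not taken from Music Haptics.

```swift
import CoreHaptics

// Minimal Core Haptics sketch: play two transient taps through the
// Taptic Engine. The pattern values below are illustrative only.
func playTapPattern() throws {
    let engine = try CHHapticEngine()
    try engine.start()

    // Two sharp taps, 0.25 seconds apart, mimicking a simple beat.
    let events = [0.0, 0.25].map { time in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6)
            ],
            relativeTime: time
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```

Presumably, the Music Haptics API would generate patterns like this automatically, matched to each song’s audio, rather than requiring developers to hand-author them.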

Also coming later this year will be “Vocal Shortcuts”.

With this feature, iPhone and iPad users can assign custom utterances that the company’s voice assistant, Siri, can understand to launch shortcuts. For users with acquired or progressive conditions that affect speech (cerebral palsy, amyotrophic lateral sclerosis or stroke), there will be a new feature, “Listen for Atypical Speech”, which uses on-device machine learning to recognise the user’s speech patterns.
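
Apple has not spelled out how Vocal Shortcuts hooks into third-party apps, but shortcuts themselves are typically exposed through the existing App Intents framework. Below is a minimal sketch of such a shortcut; the intent name, phrase and dialog are hypothetical.

```swift
import AppIntents

// A hypothetical intent a user could bind to a custom utterance
// via Vocal Shortcuts. The name and behaviour are illustrative.
struct StartReadingSessionIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Reading Session"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-specific work would go here, e.g. opening a reading view.
        return .result(dialog: "Starting your reading session.")
    }
}

// Registering the intent as an App Shortcut makes it available to
// Siri and the Shortcuts app without any manual user setup.
struct ReaderShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: StartReadingSessionIntent(),
            phrases: ["Start a reading session in \(.applicationName)"],
            shortTitle: "Start Reading",
            systemImageName: "book"
        )
    }
}
```

A user could then record a custom utterance in Vocal Shortcuts and point it at a shortcut like this one.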

The Tim Cook-led company also announced Vehicle Motion Cues for the iPhone and iPad, which can help reduce motion sickness, a condition that usually arises from a sensory conflict between what a user sees and what they feel while riding in a vehicle. Animated dots on the edges of the screen represent changes in vehicle motion, helping reduce that conflict without interfering with the main content.

The features have been announced ahead of Global Accessibility Awareness Day, which falls on May 16. Apple and Google are responsible for the software that powers nearly all of the world’s smartphones.

With generative AI dominating the conversation, there is an ongoing dialogue about how AI models could offer added capabilities to people with disabilities.

visionOS, which powers the Apple Vision Pro spatial computing headset (yet to launch outside the US), is also receiving accessibility updates.

CarPlay users are also getting accessibility features such as Voice Control.

Voice Control will let users navigate CarPlay and control apps with just their voice. For the safety of drivers or passengers who are deaf or hard of hearing, the Sound Recognition feature will display alerts for sirens and car horns.
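
Sound Recognition in CarPlay is a system-level feature rather than a developer API, but the on-device technique it relies on resembles what Apple’s existing SoundAnalysis framework exposes. The sketch below uses the system’s built-in sound classifier to flag high-confidence detections from the microphone; the class name and confidence threshold are illustrative, and the app would need microphone permission.

```swift
import AVFoundation
import SoundAnalysis

// Illustrative listener that classifies microphone audio on-device
// using Apple's built-in sound classifier (iOS 15 and later).
final class SirenListener: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let format = engine.inputNode.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)

        // Request classification with the system's built-in model.
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self)
        self.analyzer = analyzer

        // Feed microphone buffers into the analyzer as they arrive.
        engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
            analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
        }
        try engine.start()
    }

    // Called by the analyzer as classification results arrive;
    // classifications are sorted by descending confidence.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.8 else { return }
        print("Heard: \(top.identifier) (confidence \(top.confidence))")
    }
}
```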
