Smartphones have become more useful for people with a range of physical abilities, thanks to tools like screen readers and adjustable text sizes.
With the release of Apple’s iOS 16 and Google’s Android 13 software, even more accessibility features have arrived. Your phone can now send you a visual alert when a baby is crying, for example, or an audio alert when you’re approaching a door. Here’s a tour.
On an iPhone or Android phone, open the Settings app and select Accessibility to find all the tools and features. For a full reference, Apple’s and Google’s websites have dedicated Accessibility sections, but your exact features will vary by software version and phone model.
Alternative navigation
Swiping and tapping to navigate doesn’t work for everyone, but iOS and Android both provide other ways to move through screens and menus, including quick-tap shortcuts and gestures that perform tasks. These controls (like Apple’s AssistiveTouch tools and its Back Tap function) are in the iOS Touch settings.
Android’s accessibility shortcuts offer similar options. One way to find them is to open Settings, select System, then choose Gestures and System Navigation.
Both platforms also support navigation with third-party devices like Bluetooth controllers, or with the camera, which can recognise facial expressions assigned to specific actions. These devices and actions can be configured in the iOS Switch Control and Head Tracking settings, or in Google’s Camera Switches and Project Activate apps for Android.
There are also tools for those who can’t see the screen. Apple’s iOS software offers the VoiceOver feature and Android has TalkBack, both of which provide audio descriptions of what’s on your screen. Turning on the iOS Voice Control or Android’s Voice Access option lets you control the phone with spoken commands. Enabling the iOS Spoken Content or Android’s Select to Speak setting has the phone read aloud what’s on the screen, which can be helpful for audio-based proofreading.
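If you’re curious how the read-aloud piece works under the hood, Apple exposes the same kind of text-to-speech to app developers through its AVFoundation framework (Android has an analogous TextToSpeech class). Here’s a minimal Swift sketch; the sample sentence and voice are placeholders, not anything Apple ships:

```swift
import AVFoundation

// A minimal sketch of on-device text-to-speech with Apple's
// AVSpeechSynthesizer, the same kind of engine behind read-aloud
// features like Spoken Content. The text and voice are placeholders.
let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: "You have one new notification.")
utterance.voice = AVSpeechSynthesisVoice(language: "en-GB")  // any installed voice works
utterance.rate = AVSpeechUtteranceDefaultSpeechRate
synthesizer.speak(utterance)
```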
Don’t forget a few classic methods of hands-free interaction with your phone. Apple’s Siri and the Google Assistant can open apps and perform actions with spoken commands. And Apple’s Dictation feature (in the iOS Keyboard settings) and Google’s Voice Typing function let you write text by speaking.
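At its core, dictation is speech-to-text, and both platforms expose that capability to developers as well. Below is a minimal Swift sketch using Apple’s Speech framework, assuming a recording at a hypothetical audioURL (Android’s SpeechRecognizer class plays a similar role):

```swift
import Speech

// A minimal sketch of dictation-style speech-to-text with Apple's
// Speech framework. `audioURL` is a hypothetical recording, and the
// user must grant speech-recognition permission first.
func transcribe(audioURL: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(), recognizer.isAvailable else { return }
        let request = SFSpeechURLRecognitionRequest(url: audioURL)
        _ = recognizer.recognitionTask(with: request) { result, _ in
            // Partial results stream in; print only the final transcription.
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            }
        }
    }
}
```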
Visual assistance
In their Accessibility settings, iOS and Android include shortcuts for zooming in on sections of the screen. For bigger, bolder text and other display adjustments on an iPhone, open Settings, choose Accessibility and select Display & Text Size. On Android, go to Settings, then Accessibility, and choose Display Size and Text.
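On the Apple side, those size settings reach individual apps through a system called Dynamic Type. A minimal Swift sketch of a label that honours the reader’s chosen text size (the label copy is a placeholder):

```swift
import UIKit

// A minimal sketch of Dynamic Type support in UIKit: this label picks
// up the size chosen in Display & Text Size and rescales automatically
// whenever the user changes it.
let label = UILabel()
label.text = "Bigger, bolder text"  // placeholder copy
label.font = UIFont.preferredFont(forTextStyle: .body)
label.adjustsFontForContentSizeCategory = true
label.numberOfLines = 0  // let enlarged text wrap instead of truncating
```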
Apple’s Magnifier app, which enlarges objects in the camera’s view, has been upgraded to help people who are blind or have low vision; its results are spoken aloud or displayed in large type. The door-and-people detection uses the device’s LiDAR scanner to calculate distance, so it requires a LiDAR-equipped model such as the iPhone 12 Pro or a later Pro model. Apple’s website has a guide to setting up the app on the iPhone and iPad.
Google’s recently updated Lookout assisted-vision app can identify currency, text, food labels, objects and more. Google introduced Lookout in 2018, and it works on Android 6 and later.
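Apps like Magnifier and Lookout are built on on-device image recognition. As a rough illustration of the text-reading piece, here is a minimal Swift sketch using Apple’s Vision framework; cameraImage stands in for a hypothetical frame captured from the camera:

```swift
import Vision

// A minimal sketch of the on-device text recognition that
// assisted-vision apps build on, using Apple's Vision framework.
// `cameraImage` is a hypothetical frame from the camera.
func recognizeText(in cameraImage: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        for observation in observations {
            // Each observation carries ranked candidates; take the best guess.
            if let candidate = observation.topCandidates(1).first {
                print(candidate.string)
            }
        }
    }
    request.recognitionLevel = .accurate
    let handler = VNImageRequestHandler(cgImage: cameraImage, options: [:])
    try? handler.perform([request])
}
```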
Auditory aids
Both platforms offer controls to amplify speech through your headphones. In iOS, go to the Audio/Visual section for Headphone Accommodations; in Android, visit the Sound Amplifier setting.
Apple offers Live Captions, a real-time transcription feature that converts audible dialogue into text as it plays. Android’s Accessibility toolbox has a similar Live Caption setting that captions videos, podcasts and video calls.
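Captioning of this kind boils down to feeding a live audio stream into a speech recogniser. A minimal Swift sketch with Apple’s Speech and AVFoundation frameworks follows; permission prompts and audio-session setup are trimmed for brevity:

```swift
import AVFoundation
import Speech

// A minimal sketch of live-caption-style transcription: microphone
// buffers feed a streaming recognition request, and partial results
// print as a rolling caption. Authorisation checks are omitted.
func startLiveCaptions() throws {
    guard let recognizer = SFSpeechRecognizer(), recognizer.isAvailable else { return }
    let engine = AVAudioEngine()
    let request = SFSpeechAudioBufferRecognitionRequest()
    request.shouldReportPartialResults = true

    let format = engine.inputNode.outputFormat(forBus: 0)
    engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        request.append(buffer)  // hand each microphone buffer to the recogniser
    }

    _ = recognizer.recognitionTask(with: request) { result, _ in
        if let result = result {
            print(result.bestTranscription.formattedString)  // rolling caption
        }
    }
    try engine.start()  // a real app would keep `engine` and `request` alive
}
```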
Google’s free Live Transcribe & Notification app for Android converts nearby speech to on-screen text, and can also provide visual alerts when the phone recognises sounds like doorbells or smoke alarms. The Sound Recognition tool, found in the Hearing section of the iPhone’s Accessibility settings, offers similar sound alerts.
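Those alerts rest on on-device sound classification. Here is a minimal Swift sketch with Apple’s SoundAnalysis framework and its built-in classifier; the confidence threshold is an arbitrary choice, and the label strings vary by classifier version:

```swift
import AVFoundation
import SoundAnalysis

// A minimal sketch of doorbell- and alarm-style sound recognition
// with Apple's SoundAnalysis framework and its built-in classifier.
final class SoundAlertObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.7 else { return }  // arbitrary threshold
        print("Heard: \(top.identifier)")  // label strings vary by classifier version
    }
}

func startSoundRecognition() throws {
    let engine = AVAudioEngine()
    let format = engine.inputNode.outputFormat(forBus: 0)
    let analyzer = SNAudioStreamAnalyzer(format: format)
    let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
    let observer = SoundAlertObserver()
    try analyzer.add(request, withObserver: observer)

    engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
        analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
    }
    try engine.start()  // keep `engine`, `analyzer` and `observer` alive in real code
}
```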
NYTNS