Amid the many announcements made during Apple’s WWDC21 keynote, you might have missed out on upcoming iOS and watchOS accessibility features that support users with hearing, mobility, visual or cognitive impairments.
Although the tech industry has become more inclusive and accessible since 2020, there are still areas that need special attention. The spectrum of disabilities is wide, ranging from vision, hearing, mobility and speech to aspects such as learning disabilities and other similar conditions. That’s a lot of ground for tech companies to cover.
That said, while Apple didn’t feel the need to revamp the UX design for iOS 15 or change much in watchOS 8, a wave of accessibility features for Apple users was announced during the Worldwide Developers Conference last week.
Read more | Top 10 takeaways from Apple’s WWDC21 keynote
On Global Accessibility Awareness Day, Apple had actually pre-announced a few of these features coming to iOS 15 and watchOS 8. But here, we dive deeper into the range of new features supporting users with hearing, mobility, visual or cognitive challenges.
For a long time, watch users with limb differences had to tap the watch screen with their nose to interact with the device. Enter AssistiveTouch, which supports people with limb differences such as amputees.
AssistiveTouch is controlled by hand gestures such as double-clenching the fist, pinching the index finger to the thumb, hovering a hand over the screen, shaking the hand and so on, each of which can be assigned to a task on the watch. These gestures can also be combined, much like in a game, so that a routine of actions can be strung together.
Visual aid tools
VoiceOver, although an existing feature on iOS and iPadOS, has been updated for watchOS 8. The built-in screen reader now has an option for hand gestures, mirroring what is happening with AssistiveTouch, and becomes a tool for the visually impaired community. So what does it actually do? It plays a role similar to that of a cane or a service animal for a visually impaired person moving through a space. While the feature is unlikely to replace those aids entirely, it serves as an additional tool for this group of users.
VoiceOver will also be used for image descriptions on iOS and iPadOS, drawing on Apple’s machine learning systems to deliver detailed descriptions. When a visually impaired user selects a photo in the Photos app, the device reads out a description in the voice selected for Siri, for example: “A group of people smiling and laughing, posing in front of a house. Maybe Akash.” Wait, where does the name come from? One of the existing features of Photos is the ability to tag recognised faces with a contact, and VoiceOver will be integrated to use this functionality for accessibility as well.
Additionally, iOS 15 will allow you to enter Image Explorer Mode, allowing you to hover over certain elements of a given image, and VoiceOver will read detailed descriptions such as “a girl with long brown hair wearing sunglasses, smiling” or “a lamppost.”
It’s for everyone …
- Background Sounds is another new feature, letting users stay calm with familiar ambient noises such as ocean, wind and dark noise. These can play while other media is playing, masking external noise to aid focus. Thankfully, the sounds are quite subtle.
- One of the biggest revisions across iOS 14 and 15 has been custom app settings. For the new batch of operating systems, Apple introduces per-app settings. For example, a user can customise their Calculator app to have larger text, inverted colours and other such adjustments.
- Those who watched the WWDC21 keynote would have noticed the large number of Animoji and Memoji, which have become a staple for Apple. In the accessibility space, people can now personalise their Animoji or Memoji with an oxygen tube, a cochlear implant, a soft helmet and more for a more personal experience.
VoiceOver will also work with alt text. Users can manually enter their own descriptions for a photo, such as “Avantika Kumar on her 17th birthday in front of her cake at Theobroma Bakery, Mumbai”. This is accessible through the Markup function for a given photo. When the image is sent to someone else, that person can also see the alt text description, because it is saved as part of the asset’s metadata.
For better mobility
People with severe motor limitations can expect an improved Switch Control, which lets them use their device via switch access. This should be a robust feature, as it allows users to make full use of their device with the help of adaptive switches and other assistive devices. Launched with iOS 14, the feature sequentially highlights and activates on-screen elements through a range of inputs such as tapping, moving one’s head in front of the front camera, or pressing adaptive switches. Users can also use point scanning and the aforementioned gestures like pinch to zoom. In addition, users will be able to use multiple switches.
For iOS 15, sound has been incorporated into this accessibility feature. If a user is unable to verbalise a voice command, they can use assigned sounds or phonemes such as “mmuh” or “paa” for certain navigation and actions.
As mentioned, users can operate external switches, and a new addition is game controllers. With Apple Arcade’s continued success, it seems Apple wants gaming, much like Xbox and Sony do, to be as accessible as possible. There are game controllers on the market, such as the Xbox Adaptive Controller, designed for people with limited mobility. By hooking up devices already lying around their homes, users can make one their main switch and avoid having to move or unplug devices.
Additionally, Voice Control, which allows users to verbally control their Apple devices in English and Spanish dialects, will also become available in French, German and Chinese. While India remains a huge market for Apple devices, the company seems to have a long way to go in terms of vernacular accessibility here.
Apple will also support Eye Gaze systems, among the world’s pioneers in advanced eye-tracking technology. And, yes, this is third-party support through Apple’s MFi (Made for iPhone, iPod, iPad) program, which indicates that Apple will be working with more organisations on accessibility features at all levels.
Hearing improvement tools
Apple will also introduce Headphone Accommodations, which allows hearing-impaired users to customise the way audio passes from their device through their headphones or earphones. Currently, AirPods Pro, AirPods (2nd generation), AirPods Max, Powerbeats and Powerbeats Pro support this function. This customisation option evaluates which kinds of tones work best for a user. The headphone settings can also take in audiograms, including paper audiograms; for the latter, a user can apply custom settings via optical character recognition.
By the end of 2021, hard-of-hearing users can also expect support for a new type of MFi hearing aid: bi-directional hearing aids, in which the hearing aid’s microphones can serve as the primary microphone, turning the hearing aid itself into a hands-free device.