Over the last month, Apple has released information about upcoming changes to its operating systems for iPhones and iPads. The operating system is like a scaffold: it supports all the apps we use on our devices. The latest updates will include several exciting improvements to accessibility. Some of the new features include the following:
- SignTime will provide live ASL interpretation for signers.
- AssistiveTouch on the Apple Watch will allow users to use small movements to control a pointer on the watch screen.
- Support for third-party eye-tracker bars will make eye gaze possible on the iPad.
- Digital car keys and IDs will be available right in your Apple Wallet.
- Apple Maps will offer better support for navigation. Point your camera at a street corner and Maps will give you the walking route to your destination.
- Voice search for the web will come to the Safari browser.
- Use VoiceOver to search photos for people, objects, and text!
- Read that text aloud!
An important question: How will these new features help people who use augmentative and alternative communication (AAC)? A few ideas come to mind.
Physical Disabilities
For some people with complex bodies, AssistiveTouch on the Apple Watch should allow on-the-spot communication when AAC devices are not available. Several AAC apps have Apple Watch folders, including Proloquo2Go, Proloquo4Text, Posco, and Go Talk WOW. The new feature lets you use subtle muscle and tendon movements, along with distinct motions such as a pinch or a clenched fist, to control a pointer on the Watch screen.
Support for third-party eye gaze bars will open up the iPad as an AAC device to a greater range of people, including folks with paralysis or motor neuron disease. It also stands to lower the price of eye gaze communication. Dedicated devices can cost thousands of dollars; buying just the eye gaze bar will be much less expensive. I look forward to testing it out. In addition, Sound Actions for Switch Control will allow people to use mouth sounds, such as a tongue click or pop, to operate their device. How cool is that?
Other Types of Access
Some of the new VoiceOver features may also have an impact on AAC users. The ability to read text found in photos is a great support for many disability types. It can help those who do better hearing new information than reading it. Take a picture of a menu and have it read aloud. This new feature will even translate from other languages.
For AAC users who multitask on their devices, the new Focus feature may be helpful. I plan to use it! Focus allows you to selectively mute notifications, so you won't be interrupted. Still love Candy Crush? You can select which apps will appear on the home screen, decreasing distractions. Friends who message you are automatically alerted that you are busy.
Apple Maps is also making it easier to get around in cities. Point your camera at the buildings around you and artificial intelligence (AI) will pinpoint your location. Maps will then plan your route. This can be great for increasing independent navigation of the environment, especially for those who need to use mass transit. Just out of the subway? Point your camera and you will know where to go.
With each iOS update, Apple adds new supports to help increase independence and functional capacities for people with disabilities. Increased independence means greater access to work, recreation, and human connection. This gives people who use AAC a greater chance at an everyday life. Everyone should have that right.