The next iOS update is now in beta testing. iOS 13 promises some big changes that will have an impact on people with disabilities.
Right off the bat, Accessibility is no longer hidden under General in Settings. Just go to Settings and scroll down. Under Physical and Motor, one of the biggest changes is the addition of Voice Control, which makes it possible to perform a range of actions on your iPhone or iPad without physical touch. Here are some of the basic commands.
- GO HOME (takes you to the home screen of your device)
- OPEN _______ (choose the app you want)
- WAKE UP (activates Voice Control)
- GO TO SLEEP (temporarily disables Voice Control)
- OPEN CONTROL CENTER
- TAP (NAME OF ITEM)
- SHOW ME WHAT TO SAY (gives you hints as to how to phrase a command)
One very important command is SHOW NUMBERS. Unless you have enabled a continuous overlay, this command brings up a numbered grid or assigns a number to each actionable item on the screen. This allows you to do things such as take a selfie or send a text message, all with your voice. You can even FaceTime with someone without having to touch your phone.
There are also gestural commands that can be used to help you navigate your device. You can SWIPE LEFT or RIGHT, ZOOM IN, or LONG PRESS. iMore has a great, comprehensive list of all the types of commands.
But, what about people who use AAC? Can they still use Voice Control?
I have been experimenting with this. As with Siri, you can’t effectively run both Voice Control and your AAC app on the same device. Yes, you can use your vocabulary to say a command, but once you leave your communication app, you are stuck. It comes down to needing two devices: one for AAC and one for everything Apple.
Given that, the next challenge is to set up a vocabulary folder with the commands needed by the AAC user. I am working on this and will keep editing.
One of the possible challenges has to do with the selection of your AAC voice. My first attempts suggest that Voice Control is a bit more responsive to the male “Ryan” voice than to the female Acapela voices. You also have to compose your whole message first, then speak it from the speech bar; Voice Control does not recognize a command if the words are separated in time. This also means you have to clear the speech bar each time you want to give a new command.
Let me paint a picture. Someone with ALS has their iPad Pro (3rd gen, running TouchChat) and their iPhone (with iOS 13) mounted to their wheelchair. They use head pointing in TouchChat HD with WordPower to speak commands to their phone. They command the phone to OPEN CAMERA, then SHOW NUMBERS. They say the number that reverses the camera, take a selfie, and send it to someone they love.
Then, that person can OPEN MUSIC and chill to their favorite playlist. Later, they can use TouchChat to command Alexa on a Fire TV Cube to turn down the lights and play their favorite show on Netflix.
It is now possible for someone with a complex body to use either their voice or AAC to command their iPhone. Using the SHOW NUMBERS command allows them to activate anything on the screen.
This is a significant increase in personal independence, all using consumer electronics! No, the equipment is not cheap, but it is available without having to file an insurance request (or an appeal when that request is denied).
The world of consumer electronics is becoming more and more accessible. If the past is any guide, when the world becomes more accessible, it benefits everyone.