Working with What You Have: iOS 11

Photo of iOS 11

We often look to the App Store to “solve” problems. After all, “there’s an app for that.” It is easy to overlook the features that are built into the iPhone and iPad. Over the years, Apple has responded to consumers (and the competition) by adding features to each update of the iOS operating system. Some of these are designed to support individuals with disabilities, including those who use AAC. Some are just general features that we can adapt to our own uses.

I know that, for many, some of this will be “review”,  but many changes have come with the rollout of iOS 11.  Of course, what works well will depend on the needs of the individual.  

You can now contact emergency services by pressing the Sleep/Wake button five times. Go to Settings to enable this to happen automatically. You can also add selected people from your contacts. This may be very useful for those with limited mobility but good upper-body fine motor control.

You can also enable typing to Siri, so you won’t need to talk with her if you either don’t want to, or can’t. Go to Settings → Accessibility → Siri and enable Type to Siri. When you hold down the Home button, a keyboard will appear. For anyone with impaired intelligibility, or who uses text-to-speech AAC, this will be a great addition.

You no longer need a third-party app to mark up PDFs. When viewing a webpage or other document, tap the share icon and select “Save as PDF.” Use the new markup tools to add text to your document before sending it to someone. If you tap the + sign, you can add text in several fonts. Long-press the word “Text” and select Edit to pull up the keyboard. Take a picture or screenshot of any worksheet, and you can save it as a PDF, fill it in, and send the completed work back to your teacher or colleague. Cool! Note that third-party apps for completing/filling PDFs may have other features that make them a better choice for some, but it is great to have this set of features built into your iPad or iPhone!

Create PDFs from a screenshot.
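
For the app developers among our readers, here is a rough sketch of how an app might do the same thing using UIKit’s UIGraphicsPDFRenderer. The function name, file name, and screenshot image are just placeholders for illustration, not anything Apple ships.

```swift
import UIKit

// A minimal sketch: turning a screenshot (UIImage) into a single-page PDF,
// roughly what the built-in "Save as PDF" option does for you.
func makePDF(from image: UIImage, named name: String = "worksheet.pdf") -> URL? {
    let pageRect = CGRect(origin: .zero, size: image.size)
    let renderer = UIGraphicsPDFRenderer(bounds: pageRect)
    let url = FileManager.default.temporaryDirectory.appendingPathComponent(name)
    do {
        try renderer.writePDF(to: url) { context in
            context.beginPage()
            image.draw(in: pageRect)   // draw the screenshot onto the PDF page
        }
        return url
    } catch {
        print("Could not write PDF: \(error)")
        return nil
    }
}
```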


Also, in Notes, you can scan documents, take a photo, or add a sketch. Tap the + icon to scan or add a photo. Notes are now searchable for handwritten text! Notes can also be shared with others, which lets them see changes to the note.

Screenshot of Notes app

There is now a built-in QR code reader. I know, QR codes seem to be going the way of the dinosaur… but they may make a resurgence. Anywhere you find a QR code, you can now point your camera right at it. A notification will drop down from the top of the screen, offering to take you to the website in question. You no longer need a third-party app. This could prove very useful for those who have difficulty typing a URL into their web browser. Imagine being a student on a field trip to a museum: you can access all the information associated with an exhibit and read it later, when you need to write (or dictate) that homework assignment.
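
If you are curious how this works under the hood (or want QR scanning inside your own app), here is a rough sketch using AVFoundation’s metadata output. The class name is hypothetical, and it assumes camera permission has already been granted.

```swift
import AVFoundation

// A minimal sketch of QR detection with AVFoundation's metadata output.
final class QRScanner: NSObject, AVCaptureMetadataOutputObjectsDelegate {
    let session = AVCaptureSession()

    func configure() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureMetadataOutput()
        if session.canAddOutput(output) { session.addOutput(output) }
        output.setMetadataObjectsDelegate(self, queue: .main)
        output.metadataObjectTypes = [.qr]          // only look for QR codes
        session.startRunning()
    }

    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        for case let code as AVMetadataMachineReadableCodeObject in metadataObjects {
            print("Found QR payload: \(code.stringValue ?? "")")   // e.g. a museum URL
        }
    }
}
```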

iOS 11 also features optical character recognition. When VoiceOver is enabled and you view an image with text in it, your device will automatically scan the image for text and read it aloud. It will also, to some degree, describe the objects visible in the scene.
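For developers, the public piece of this is Apple’s Vision framework, which arrived with iOS 11. The sketch below only shows text *detection* (finding where text is in an image); the reading-aloud and scene descriptions VoiceOver users hear are handled by the system itself, not by this call.

```swift
import UIKit
import Vision

// A rough sketch of Vision's iOS 11 text detection. It reports regions of
// text in an image; it does not recognize or speak the characters.
func detectText(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }
    let request = VNDetectTextRectanglesRequest { request, _ in
        let boxes = request.results as? [VNTextObservation] ?? []
        print("Found \(boxes.count) regions of text")
    }
    request.reportCharacterBoxes = true   // also report individual character boxes
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```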

When you enable Speak Selection or Speak Screen, you can now customize the colors used to highlight the text. This can help struggling readers.

Other features can help those with executive function challenges. I am having a great time using geofencing to set reminders. This works as long as you have a location entered in your contacts. “Hey Siri, when I get to work, remind me to write that article…” Imagine how this could support those with executive dysfunction or ID. It has the potential to increase independence and decrease the amount of outside prompting needed for task completion.

Screenshot of geofenced Reminder via Siri.

Of course, the camera can be used to take pictures of the whiteboard in class, or the homework assignment.  Combine that with a geofenced reminder:  “Hey Siri, when I get home, remind me to check the photo of my homework assignment.”  You can also share a reminder with someone on your contacts list.
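For anyone building their own tools, the same kind of location-based reminder that Siri creates can also be made in code through EventKit. The sketch below is only an illustration; the coordinates, 100-meter radius, and the “Home” label are placeholders.

```swift
import EventKit
import CoreLocation

// A minimal sketch of a "geofenced" reminder created through EventKit.
func addArrivalReminder(title: String, latitude: Double, longitude: Double) {
    let store = EKEventStore()
    store.requestAccess(to: .reminder) { granted, _ in
        guard granted else { return }
        let reminder = EKReminder(eventStore: store)
        reminder.title = title
        reminder.calendar = store.defaultCalendarForNewReminders()

        let location = EKStructuredLocation(title: "Home")   // placeholder label
        location.geoLocation = CLLocation(latitude: latitude, longitude: longitude)
        location.radius = 100                                 // meters

        let alarm = EKAlarm()
        alarm.structuredLocation = location
        alarm.proximity = .enter          // fire on arrival, not departure
        reminder.addAlarm(alarm)

        try? store.save(reminder, commit: true)
    }
}
```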

Apple Maps now includes INDOOR NAVIGATION for major malls and airports!  This could really help support independent navigation of these public spaces.  

Working with split screen on the iPad, you can now drag and drop an image from your camera roll or a website over to another app, such as Notes. Very cool! You can drag a worksheet over to your note and mark it up there as well.

Split screen view of image search and Notes app.
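For the developers among us, here is a rough sketch of the iOS 11 drop API that makes this image-to-Notes drag possible. The class name and `targetView` are placeholders for a hypothetical app of your own that accepts dropped images.

```swift
import UIKit

// A minimal sketch: let an image view accept images dragged in from
// another app in split screen, using the iOS 11 drag and drop API.
final class ImageDropHandler: NSObject, UIDropInteractionDelegate {
    let targetView: UIImageView

    init(targetView: UIImageView) {
        self.targetView = targetView
        super.init()
        targetView.isUserInteractionEnabled = true
        targetView.addInteraction(UIDropInteraction(delegate: self))
    }

    func dropInteraction(_ interaction: UIDropInteraction,
                         canHandle session: UIDropSession) -> Bool {
        return session.canLoadObjects(ofClass: UIImage.self)
    }

    func dropInteraction(_ interaction: UIDropInteraction,
                         sessionDidUpdate session: UIDropSession) -> UIDropProposal {
        return UIDropProposal(operation: .copy)   // copy the image into this app
    }

    func dropInteraction(_ interaction: UIDropInteraction,
                         performDrop session: UIDropSession) {
        _ = session.loadObjects(ofClass: UIImage.self) { items in
            self.targetView.image = items.first as? UIImage
        }
    }
}
```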

Using split screen with AAC apps has its limits. You can open another app, but it “floats” above the AAC app, as opposed to going into full split screen. This is not necessarily a bad thing: for those who rely on automaticity/motor planning to access their voice, the icons remain in the same place! It still gives you some potential to talk about what you are playing or working on. You can drag the second app to either side of the screen.

One thing I could not get to work was using the speech bar in the AAC app to provide input to the microphone for speech-to-text in Notes. That would be really nice.

Control Center has become more customizable. If you go to Settings → Control Center → Customize Controls, you can choose what you want to appear when you open Control Center. Choices include the timer, calculator, accessibility shortcuts, Guided Access, the Magnifier, and Voice Memos.

View of new control center on iPad.

Another possible choice is Screen Recording. You can easily make a video to use for modeling or presentations. Imagine the possibilities for teaching! I only wish that button presses were highlighted in the video across all AAC apps; this seems to work for apps that include button-press animation, such as Speak For Yourself or Proloquo2Go. And there’s more! You can use 3D Touch to pull up a menu that lets you activate an external mic, so you can record your voice while creating the video. This will be great for creating how-to videos. I see a lot of potential for sharing these videos with teachers and classroom staff. You could create a mini “how to” library to support students using AAC, or any other features of iOS devices. Load these videos onto the teacher’s iPad, and school-based SLPs may be able to streamline the time they spend supporting school personnel for the device users on their caseloads.
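
If you build apps of your own, the same recording (microphone included) is available through ReplayKit. The sketch below is only an outline; a real app would need more careful error handling and a way to dismiss the preview sheet.

```swift
import UIKit
import ReplayKit

// A minimal sketch of the ReplayKit calls behind the Screen Recording button.
func startHowToRecording() {
    let recorder = RPScreenRecorder.shared()
    recorder.isMicrophoneEnabled = true          // capture your narration too
    recorder.startRecording { error in
        if let error = error {
            print("Could not start recording: \(error)")
        }
    }
}

func stopHowToRecording(from viewController: UIViewController) {
    RPScreenRecorder.shared().stopRecording { previewController, _ in
        // The preview sheet lets you trim and save or share the finished video.
        if let preview = previewController {
            viewController.present(preview, animated: true)
        }
    }
}
```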

All in all, I have found much to like about this new iOS. I welcome your comments, tips & tricks, and any other thoughts you have about this change.

Kathryn Helland

Kathryn is a certified speech-language pathologist and works with children and adults with complex communication needs. She has been with the TechOWL team since 2015 and is currently working on her doctorate. She would like to examine how to best support AAC users in higher education.
