Image of Jabberwocky app icon over a keyboard.

At long last, apps are beginning to appear that take advantage of Apple’s TrueDepth camera to provide eye gaze control on the latest iPhone or iPad Pro. There are actually two cameras: your usual selfie camera and an infrared camera that maps your face.

Eye gaze for Apple devices has been a long-wished-for accessibility feature. Dedicated eye gaze bars that hook up to a computer usually cost around $2,000. Access to eye gaze and face tracking on the iPad is a big deal!

News of this technology came out at Apple’s WWDC in June of 2018, when Apple’s ARKit 2.0 introduced an eye tracking feature. Folks quickly realized that this tool was not just for advertisers; it could really benefit people with disabilities, such as ALS. It took a few months for the first apps to follow.

I will mention a few of the apps that are making use of this. I fully expect the list to grow.

Hawkeye Access allows you to browse the internet using your eyes. When the app is opened, it takes the user through a five-point calibration process. There is a brief tutorial that teaches you how to navigate: e.g., scroll, go back, or return to the home page.

This app does rely on the user being able to speak.  Dictation is used to perform a search or provide input to text fields (such as your Twitter login). This poses a challenge for people who need AAC, or whose speech is not clear.

You can use “Hey Siri” to open the app, so the user is not dependent on outside help. The app is customizable, so you can adjust the sensitivity, pick your selection mode, and configure your timing. Pretty cool.

Image of Hawkeye Access app icon

There are now some options out there for those who need AAC. One app is called I Have Voice. It is set up for the basic communication of physical needs and does not provide access to generative language.

Image of I Have Voice AAC screen.

The app is aimed at adults with ALS or other types of motor neuron disease. You can use your eyes to select between 12 “Actions” (messages). You can choose between using dwell or a blink to activate the icons. There is no calibration procedure, so it takes a while to get a feel for how to make small motions to control the cursor on the screen. You can now customize your actions/messages as needed.
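For readers curious about what “dwell” activation actually involves under the hood, the core idea can be sketched in a few lines of code. This is a minimal, hypothetical sketch (not I Have Voice’s actual implementation, and all names and timings are illustrative assumptions): the cursor must stay inside a target’s bounds for a set amount of time, and leaving the target resets the clock.

```python
DWELL_TIME = 1.0  # assumed: seconds the cursor must stay on a target

def dwell_select(samples, target, dwell_time=DWELL_TIME):
    """Hypothetical dwell-activation logic.

    samples: list of (timestamp, x, y) cursor readings.
    target: (x_min, y_min, x_max, y_max) bounding box of an icon.
    Returns the timestamp at which the target activates, or None.
    """
    enter_time = None
    for t, x, y in samples:
        inside = target[0] <= x <= target[2] and target[1] <= y <= target[3]
        if inside:
            if enter_time is None:
                enter_time = t            # cursor just entered the target
            elif t - enter_time >= dwell_time:
                return t                  # dwelled long enough: activate
        else:
            enter_time = None             # leaving the target resets the timer
    return None
```

The reset-on-exit behavior is why small, involuntary head movements make dwell selection tricky without calibration or smoothing, which matches the learning curve described above.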

Image of Jabberwocky (dragon) app icon.

For those who need text-to-speech, another new app is called Jabberwocky. This app uses face tracking to access a keyboard with predictive text. It prompts you to calibrate by blinking three times.

The iPhone camera translates your head motion into movement of the cursor on the screen. The app allows you to speed up your input by using “Draw” mode. The cool thing with the standard “Type” mode is that you see how your head movements track across the screen.
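Conceptually, turning head motion into a cursor position can be as simple as a linear mapping: small yaw and pitch angles are scaled by a sensitivity gain and offset from the center of the screen. The sketch below is an illustration under assumed values (the gain, screen size, and function names are all made up for this example), not Jabberwocky’s actual code.

```python
SCREEN_W, SCREEN_H = 1024, 768   # assumed screen size in points
GAIN = 2000.0                    # assumed sensitivity: pixels per radian

def head_to_cursor(yaw, pitch, w=SCREEN_W, h=SCREEN_H, gain=GAIN):
    """Map head yaw/pitch (radians, 0 = facing the camera straight on)
    to a cursor position, clamped to the screen bounds."""
    x = w / 2 + gain * yaw       # turning the head right moves the cursor right
    y = h / 2 - gain * pitch     # tilting the head up moves the cursor up
    x = min(max(x, 0), w)        # keep the cursor on screen
    y = min(max(y, 0), h)
    return (x, y)
```

A higher gain means less head movement is needed per pixel but makes fine targeting harder, which is exactly the kind of sensitivity trade-off these apps expose in their settings.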

Image of the Jabberwocky app screen.

The app saves your recent, favorite, and frequent utterances in your history. There is a pre-set “I use this computer to help me talk” sentence to allow the user to ask for more time. This app offers a free, thirty-day trial but does require a Lifetime License or subscription after that. Full purchase is $250.

Image of Predictable Text to speech app icon

Predictable AAC has also added head tracking. This is a text-to-speech app that has symbol-supported folders for topical communication. Thanks, Rebecca Gadgil, for the update!

Image of the home page of TouchChat HD AAC app

Another soon-to-be-released option will be face tracking with Saltillo’s TouchChat HD with WordPower! The company anticipates releasing the new feature this spring. As with all of these apps, it will only work on a new iPad Pro (or iPhone X).

This is NOT an in-app purchase or a new app that you are required to buy. The feature will be made available to those who already own TouchChat in a coming app update. As long as you have the TrueDepth camera on your device, you will be able to go into settings and simply turn it on. How awesome is that?

We got to try this out at ATIA, the conference of the Assistive Technology Industry Association. It was easy to use and lets you choose among five different access methods: raised eyebrow, open mouth, tongue protrusion, dwell, or blink.

I fully expect that eye gaze/face tracking on the iPad will be a game changer. For the first time, eye gaze access for AAC will be possible on a consumer electronic device. This is just the beginning. I can’t wait to see what will be possible in six months, or a year. In the meantime, I am having a great time exploring this.

Image of Kathryn sticking out her tongue

I am having fun, but this is not a game. This technology has the potential to truly make a difference for people with complex bodies. My hope is that Apple will add eye control to its accessibility settings in the future. Give people the power to access the entire device.

Kathryn Helland

2 comments on “Face Tracking and Eye Gaze for the iPhone and iPad: Access and AAC”

  1. This news is beyond exciting! This coming from a person into a 4+ year ALS diagnosis typing with my last usable finger. As a Mac/Apple user since 1984 I’ve wondered why such a hi-tech company has not stepped up – eye gaze technology is hopefully in my remaining time for use on my iPad Pro!

    1. Yes, very exciting! As long as you have the newest iPad Pro model, or an iPhone X or beyond, you will be able to use this feature on the apps mentioned in the post. I would try out the apps and give your feedback to the developers!
      Kathryn
