Face Tracking and Eye Gaze for the iPhone and iPad: Access and AAC

Image of Jabberwocky app icon over a keyboard.

At long last, apps are beginning to appear that take advantage of Apple’s TrueDepth camera to provide eye gaze control on the latest iPhone or iPad Pro.  There are actually two cameras: the usual selfie camera and an infrared camera that maps your face.

Eye gaze for Apple devices has been a long-wished-for accessibility feature. Eye gaze bars that hook up to a computer usually cost around $2,000.  Access to eye gaze and face tracking on the iPad is a big deal!

News of this new technology came out at Apple’s WWDC in June of 2018, when Apple’s ARKit 2.0 introduced an eye tracking feature.  Folks quickly realized that this tool was not just for advertisers but could really benefit people with disabilities, such as those with ALS. It took a few months for the first apps to follow.

I will mention a few of the apps that are making use of this. I fully expect the list to grow.

Hawkeye Access allows you to browse the internet using your eyes. When the app is opened, it takes the user through a five-point calibration process.  There is a brief tutorial that teaches you how to navigate: scroll, go back, or return to the home page.

This app does rely on the user being able to speak.  Dictation is used to perform a search or provide input to text fields (such as your Twitter login). This poses a challenge for people who need AAC, or whose speech is not clear.

You can use “Hey Siri” to open the app, so the user is not dependent on outside help. The app is customizable, so you can adjust the sensitivity, pick your selection mode, and configure your timing. Pretty cool.

Image of Hawkeye Access app icon

There are now some options out there for those who need AAC.  One app is called I Have Voice. It is set up for the basic communication of physical needs and does not provide access to generative language.

Image of I Have Voice AAC screen.

The app is aimed at adults with ALS or other types of motor neuron disease. You can use your eyes to select between 12 “Actions” (messages). You can choose between using dwell or a blink to activate the icons. There is no calibration procedure, so it takes a while to get the feel for how to make small motions to control the cursor on the screen.  You can now customize your actions/messages as needed.

Image of Jabberwocky (dragon) app icon.

For those who need text-to-speech, another new app is called Jabberwocky. This app uses face tracking to access a keyboard with predictive text. It prompts you to calibrate by blinking three times.

The iPhone camera translates your head motion into movement of the cursor on the screen. The app allows you to speed up your input by using “Draw” mode.  The cool thing with Type mode is that you see how your head movements track across the screen.

Image of the Jabberwocky app screen.

The app saves your recent, favorite, and frequent utterances in your history. There is a pre-set “I use this computer to help me talk” sentence to allow the user to ask for more time. This app offers a free thirty-day trial but does require a Lifetime License or subscription after that. Full purchase is $250.

Image of Predictable Text to speech app icon

Predictable AAC has also added head tracking.  This is a text-to-speech app that has symbol-supported folders for topical communication. Thanks, Rebecca Gadgil, for the update!


Image of the home page of TouchChat HD AAC app

Another soon-to-be-released option will be face tracking in Saltillo’s TouchChat HD with WordPower! The company anticipates releasing the new feature this spring.  As with all of these apps, it will only work on a new iPad Pro (or iPhone X).

This is NOT an in-app purchase or a new app that you are required to buy. The feature will be made available to those who already own TouchChat in a coming app update.  As long as you have the TrueDepth camera on your device, you will be able to go into settings and simply turn it on. How awesome is that?

We got to try this out at ATIA, the conference of the Assistive Technology Industry Association.  It was easy to use and let you choose among five different access methods: raised eyebrow, open mouth, tongue protrusion, dwell, or blink.

I fully expect that eye gaze/face tracking on the iPad will be a game changer. For the first time, eye gaze access for AAC will be possible on a consumer electronic device. This is just the beginning. I can’t wait to see what will be possible in six months, or a year. In the meantime, I am having a great time exploring this.

Image of Kathryn sticking out her tongue

I am having fun, but this is not a game. This technology has the potential to truly make a difference for people with complex bodies.  My hope is that Apple will add eye control to its accessibility settings in the future. Give people the power to access the entire device.   


Kathryn Helland

10 comments on “Face Tracking and Eye Gaze for the iPhone and iPad: Access and AAC”

  1. I just saw at IFA in Berlin a German start-up called eyeV that had an eye tracker for the iPad Pro. It worked pretty well for me, but the downside is that you have to pay more…

    1. I will be interested to test this out once it comes to the app store here. I was hoping that more AAC apps would hop on this access modality and update their software, as TouchChat has done.

  2. Very exciting. I have a 5-year-old daughter who is completely paralyzed from the neck down and has a tracheostomy, but thankfully she can talk. This will hopefully allow her to navigate Netflix and YouTube and to play games like Roblox independently. Can’t wait to try it out. Must get to saving!!!!

  3. Can someone please help me? I have a patient with Rett Syndrome who is non-verbal. Mom would like to see if this iPad/eye gaze device is covered under her Medicaid insurance. Can someone provide me information on this or how she would go about this? She said she might need a code, but I am not sure. Thanks!

    1. Hi Daniella,
      It may be possible to get this device approved through Medicaid if her daughter has an AAC evaluation (necessary for the insurance report) and applies through a company that will “bundle” the device as a dedicated communication device. Many app companies will do this now. Two companies that allow you to choose which app to load on the device are Ablenet and ACCI. Check with your state AT Act program to see if you can borrow the device/app to try it out! You can find your state AT program at AT3Center.net.

  4. I’ve just tried Hawkeye Access and I Have Voice on my iPhone X. I don’t have accessibility issues, but my son is tetraplegic and cannot speak, though he can point with his eyes accurately for a few seconds at a time.
    Question is: the position detection doesn’t seem very accurate, even after repeated calibrations (in Hawkeye Access). Do you have better experiences with the paid apps, or with the free ones on an iPad Pro?

  5. This news is beyond exciting! This coming from a person 4+ years into an ALS diagnosis, typing with my last usable finger. As a Mac/Apple user since 1984, I’ve wondered why such a hi-tech company has not stepped up. I hope eye gaze technology arrives on my iPad Pro in my remaining time!

    1. Yes, very exciting! As long as you have the newest iPad Pro model, or an iPhone X or beyond, you will be able to use this feature on the apps mentioned in the post. I would try out the apps and give your feedback to the developers!

      1. Kathryn, I have now purchased an iPad Pro. Is there any way to test the TouchChat HD with Wordpower before spending $299? I have advancing ALS and need something now.

        1. Hi Cris,
          We are adding an iPad Pro to our lending library. If you are a Pennsylvania resident, you should absolutely borrow the device, request the TouchChat app and “try before you buy”. The website is: TechOWLpa.org.

          If you live in another state or territory, check out AT3Center.org to Find Your State and contact your local AT Act program.
          Once you borrow, feel free to contact me with any questions about how to set up the Facetracking in settings. My address is: tug34189@Temple.edu.
