Face Tracking and Eye Gaze for the iPhone and iPad: Access and AAC

Image of Jabberwocky app icon over a keyboard.

At long last, apps are beginning to appear that take advantage of Apple’s TrueDepth camera to provide eye gaze control on the latest iPhone or iPad Pro. There are actually two cameras: your usual selfie camera and an infrared camera that maps your face.

Eye gaze for Apple devices has been a long-wished-for accessibility feature. Dedicated eye gaze bars that hook up to a computer usually cost around $2,000. Access to eye gaze and face tracking on the iPad is a big deal!

News of this technology came out at Apple’s WWDC in June of 2018, when ARKit 2.0 introduced an eye-tracking feature. Folks quickly realized that this tool was not just for advertisers, but could really benefit people with disabilities, such as those with ALS. It took a few months for the first apps to follow.
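For the technically curious: ARKit publishes a face anchor whose gaze estimate apps can read directly. Here is a minimal sketch in Swift of what that looks like; it is my own illustration of the public API, not code from any of the apps below.

```swift
import ARKit

// Minimal sketch of reading ARKit 2's gaze estimate from the TrueDepth camera.
// Requires a device with a TrueDepth camera (iPhone X or the 2018 iPad Pro).
class GazeReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else {
            print("This device has no TrueDepth camera.")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // lookAtPoint estimates where the two eyes converge,
            // expressed in the face's own coordinate space.
            print("Gaze target (face space): \(faceAnchor.lookAtPoint)")
        }
    }
}
```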

I will mention a few of the apps that are making use of this. I fully expect the list to grow.

Hawkeye Access allows you to browse the web using your eyes. When the app opens, it takes the user through a five-point calibration process. A brief tutorial then teaches you how to navigate: scroll, go back, or return to the home page.

This app does rely on the user being able to speak.  Dictation is used to perform a search or provide input to text fields (such as your Twitter login). This poses a challenge for people who need AAC, or whose speech is not clear.

You can use “Hey Siri” to open the app, so the user is not dependent on outside help. The app is customizable: you can adjust the sensitivity, pick your selection mode, and configure your timing. Pretty cool.

Image of Hawkeye Access app icon

There are now some options out there for those who need AAC. One app is called I Have Voice. It is set up for basic communication of physical needs and does not provide access to generative language.

Image of I Have Voice AAC screen.

The app is aimed at adults with ALS or other types of motor neuron disease. You can use your eyes to select among 12 “Actions” (messages), choosing between dwell or a blink to activate the icons. There is no calibration procedure, so it takes a while to get a feel for how to make small motions to control the cursor on the screen. You can now customize your actions/messages as needed.
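If you are curious how dwell activation works under the hood, the idea is simple: a selection fires once the cursor has rested on a target for a set time. Below is an illustrative Swift sketch of that logic; it is my own, not I Have Voice’s actual code, and the default dwell duration is an invented value.

```swift
import Foundation
import CoreGraphics

// Illustrative dwell selection: an icon activates once the cursor has
// stayed inside its frame for a set dwell time.
final class DwellSelector {
    let dwellTime: TimeInterval
    private var enteredAt: Date?
    private var currentTarget: CGRect?

    init(dwellTime: TimeInterval = 1.0) {
        self.dwellTime = dwellTime
    }

    /// Call on every cursor update; returns true when a selection fires.
    func update(cursor: CGPoint, target: CGRect, now: Date = Date()) -> Bool {
        guard target.contains(cursor) else {
            enteredAt = nil          // cursor left the icon; reset the timer
            currentTarget = nil
            return false
        }
        if currentTarget != target {
            currentTarget = target   // new icon; start timing from now
            enteredAt = now
        }
        if let start = enteredAt, now.timeIntervalSince(start) >= dwellTime {
            enteredAt = nil          // fire once, then require re-entry
            return true
        }
        return false
    }
}
```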

Image of Jabberwocky (dragon) app icon.

For those who need text-to-speech, another new app is called Jabberwocky. This app uses face tracking to access a keyboard with predictive text. It prompts you to calibrate by blinking three times.

The iPhone camera translates your head motion into movement of the cursor on the screen. The app allows you to speed up your input with “Draw” mode, while the cool thing about Type mode is that you can see how your head movements track across the screen.
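Under the hood, head tracking of this kind boils down to reading the face anchor’s orientation each frame and scaling it onto the screen. The sketch below shows one plausible way to do that in Swift; it is my illustration, not Jabberwocky’s implementation, and the gain constant is an invented tuning value.

```swift
import ARKit
import UIKit
import simd

// One plausible mapping from head pose to an on-screen cursor (illustrative
// only). The face anchor's +Z column points outward from the face, so its
// horizontal and vertical angles track where the head is pointing.
func cursorPosition(for faceAnchor: ARFaceAnchor,
                    in bounds: CGRect,
                    gain: CGFloat = 3.0) -> CGPoint {
    let c = faceAnchor.transform.columns.2
    let forward = simd_normalize(simd_float3(c.x, c.y, c.z))
    let yaw = atan2(forward.x, forward.z)   // left/right head turn
    let pitch = asin(forward.y)             // up/down head tilt

    // A real app would subtract a calibrated "neutral" pose first;
    // here we just scale the raw angles and clamp to the screen.
    let x = bounds.midX + CGFloat(yaw) * gain * bounds.width
    let y = bounds.midY - CGFloat(pitch) * gain * bounds.height
    return CGPoint(x: min(max(x, bounds.minX), bounds.maxX),
                   y: min(max(y, bounds.minY), bounds.maxY))
}
```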

Image of the Jabberwocky app screen.

The app saves your recent, favorite, and frequent utterances in your history. There is a pre-set “I use this computer to help me talk” sentence to allow the user to ask for more time. The app offers a free thirty-day trial but requires a Lifetime License or subscription after that. Full purchase is $250.

Image of Predictable text-to-speech app icon

Predictable AAC has also added head tracking. This is a text-to-speech app that has symbol-supported folders for topical communication. Thanks, Rebecca Gadgil, for the update!

 

Image of the home page of TouchChat HD AAC app

Another option, soon to be released, is face tracking in Saltillo’s TouchChat HD with WordPower! The company anticipates releasing the new feature this spring. As with all of these apps, it will only work on a new iPad Pro (or iPhone X).

This is NOT an in-app purchase or a new app that you are required to buy. The feature will be made available to those who already own TouchChat in a coming app update. As long as you have the TrueDepth camera on your device, you will be able to go into settings and simply turn it on. How awesome is that?

We got to try this out at ATIA, the conference of the Assistive Technology Industry Association. It was easy to use and let you choose among five different access methods: raised eyebrow, open mouth, tongue protrusion, dwell, or blink.
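These gesture options map naturally onto the blend shape coefficients ARKit already publishes for the face. As a hedged illustration (the thresholds are my guesses, not Saltillo’s values), detection could look something like this, with dwell handled as in the earlier sketch:

```swift
import ARKit

// Illustrative mapping from ARKit blend shapes to access-method triggers.
// Each coefficient runs from 0.0 (neutral) to 1.0 (fully expressed);
// the thresholds below are invented for the sketch.
enum FaceTrigger {
    case raisedEyebrow, openMouth, tongueOut, blink
}

func detectTrigger(in faceAnchor: ARFaceAnchor) -> FaceTrigger? {
    func value(_ key: ARFaceAnchor.BlendShapeLocation) -> Float {
        faceAnchor.blendShapes[key]?.floatValue ?? 0
    }
    if value(.browInnerUp) > 0.6 { return .raisedEyebrow }
    if value(.jawOpen) > 0.7 { return .openMouth }
    if value(.tongueOut) > 0.5 { return .tongueOut }   // added in ARKit 2
    if value(.eyeBlinkLeft) > 0.8, value(.eyeBlinkRight) > 0.8 { return .blink }
    return nil
}
```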

I fully expect that eye gaze/face tracking on the iPad will be a game changer. For the first time, eye gaze access for AAC will be possible on a consumer electronic device. This is just the beginning. I can’t wait to see what will be possible in six months, or a year. In the meantime, I am having a great time exploring this.

Image of Kathryn sticking out her tongue

I am having fun, but this is not a game. This technology has the potential to truly make a difference for people with complex bodies. My hope is that Apple will add eye control to its accessibility settings in the future and give people the power to access the entire device.

 

Kathryn Helland

15 comments on “Face Tracking and Eye Gaze for the iPhone and iPad: Access and AAC”

  1. I am a Speech-Language Pathologist and have a student who just recently got approved to receive an iPad Pro with TouchChat-WordPower after trialing it. The student is in a general education classroom because, cognitively, the student is on grade level. But the student is minimally verbal, hence the need for a voice-output program, and has no volitional control for fine motor tasks (hands), hence eye gaze/facial tracking. We would like to help our student be more independent with math, but need to be able to provide access to a calculator, especially since the student can’t write anything down. Right now it’s a laborious process, with our student telling the aide which numbers and signs to input. Do you know of any calculator apps that currently allow, or will in the future allow, facial tracking and selection? Or whether the Hawkeye app would allow access/selection of an online calculator? I came across this list while trying to find something that might work better than our current situation. Thank you for any input or guidance!

    1. Hi Barbara, I have not yet tried to access a calculator using head tracking, but will try to find a way to do so! You mention Hawkeye, but have you tried the Jabberwocky web browser? It uses head tracking and I have found it to be pretty easy to set up. My other, out-of-the-box thought: if you have a second device running iOS 13, your student could use their device to interact with Voice Control on the second device. Using a series of commands, such as “Tap 5” or “Tap multiply,” they could use the calculator independently.

      I will let you know what I find out!
      Kathryn

      1. Thank you so much for your suggestions and getting back to me quickly! I actually didn’t know that Jabberwocky has a browser app, too! I only knew about their AAC app. That could potentially work for our student with the Desmos online calculator (one that the teacher has mentioned). I’m going to follow up on that, and we will hopefully try it out! Unfortunately, the voice command would not work for our student, as intelligibility is greatly decreased. Vowels are distinct, but consonants often are not. We adapt any multiple choice homework, tests, or quizzes that have letters so that numbers are used instead. “bee,” “cee,” and “dee” all sound the same, but “one,” “two,” “three,” and “four” all have different, distinct vowels. And utterances longer than 2-3 syllables are very unclear.

        1. Hi. Voice commands might work coming from the AAC device to another iPad or iPhone running iOS 13. You would need to program a page of commands. I asked the folks at Jabberwocky, and they thought this calculator site might work well with their browser:

          https://www.desmos.com/scientific

          Good luck! Kathryn

          1. I see what you mean now! Sorry! Initially, I didn’t pick up on second iPad following voice command from first iPad. Yes- that could potentially work. I shared the information with our AT lead this afternoon, and we are going to try using the Jabberwocky Browser with the Desmos calculator. On the loaner if we can (their browser is not installed, but their AAC app is) or as soon as we get the permanent one in. Thank you so much for helping us out with this!

  2. I just saw, at IFA in Berlin, a German start-up called eyeV that had an eye tracker for the iPad Pro. It worked pretty well for me, but the downside is that you have to pay more…

    1. I will be interested to test this out once it comes to the app store here. I was hoping that more AAC apps would hop on this access modality and update their software, as TouchChat has done.

  3. Very exciting. I have a 5-year-old daughter who is completely paralyzed from the neck down and has a tracheostomy, but thankfully she can talk. This device will hopefully allow her to navigate Netflix and YouTube, and maybe even play games like Roblox independently. Can’t wait to try it out. Must get to saving!!!!

  4. Can someone please help me? I have a patient with Rett Syndrome who is non-verbal. Mom would like to see if this iPad/eye gaze device is covered under her Medicaid insurance. Can someone provide me information on this or how she would go about it? She said she might need a code, but I am not sure. Thanks!

    1. Hi Daniella,
      It may be possible to get this device approved through Medicaid if her daughter has an AAC evaluation (necessary for the insurance report) and applies through a company that will “bundle” the device as a dedicated communication device. Many app companies will do this now. Two companies that allow you to choose which app to load on the device are Ablenet and ACCI. Check with your state AT Act program to see if you can borrow the device/app to try it out! You can find your state AT program at AT3Center.net.

  5. I’ve just tried Hawkeye Access and I Have Voice on my iPhone X. I don’t have accessibility issues myself, but my son is tetraplegic and cannot speak; he can point with his eyes accurately enough for a few seconds.
    My question is: the position detection doesn’t seem very accurate, even after repeated calibrations (on Hawkeye Access). Do you have better experiences with the paid apps, or with the free ones on an iPad Pro?

  6. This news is beyond exciting! This is coming from a person 4+ years into an ALS diagnosis, typing with my last usable finger. As a Mac/Apple user since 1984, I’ve wondered why such a hi-tech company has not stepped up; hopefully eye gaze technology will arrive in my remaining time for use on my iPad Pro!

    1. Yes, very exciting! As long as you have the newest iPad Pro model, or an iPhone X or beyond, you will be able to use this feature on the apps mentioned in the post. I would try out the apps and give your feedback to the developers!
      Kathryn

      1. Kathryn, I have now purchased an iPad Pro. Is there any way to test the TouchChat HD with Wordpower before spending $299? I have advancing ALS and need something now.

        1. Hi Cris,
          We are adding an iPad Pro to our lending library. If you are a Pennsylvania resident, you should absolutely borrow the device, request the TouchChat app and “try before you buy”. The website is: TechOWLpa.org.

          If you live in another state or territory, check out AT3Center.org to Find Your State and contact your local AT Act program.
          Once you borrow, feel free to contact me with any questions about how to set up the Facetracking in settings. My address is: tug34189@Temple.edu.

