Face Tracking and Eye Gaze for the iPhone and iPad: Access and AAC

Jabberwocky app icon over a keyboard

Updated June 25th, 2021

At long last, apps are available that take advantage of Apple’s TrueDepth camera to provide head tracking on the latest iPhones and the iPad Pro. These devices actually have two front-facing cameras: your usual selfie camera and an infrared camera. An infrared emitter projects 30,000 points of light, which are ‘photographed’ so your device can recognize the topography of your individual face. This is what lets your device unlock without a passcode.

Head tracking for Apple devices has been a long-wished-for accessibility feature. An eye gaze bar that hooks up to a computer costs around $2,000. A dedicated communication device with eye gaze can cost upwards of $10,000. Access to face tracking on the iPad is a big deal!

News of this new technology came out at Apple’s WWDC in June of 2018, when Apple’s ARKit 2.0 introduced an eye tracking feature. Folks quickly realized that this tool was not just for advertisers, but could really benefit people with disabilities such as ALS and other motor neuron diseases. It took a few months for the first apps to follow.
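For the curious, this is what app developers build on: ARKit reports a “face anchor” for each camera frame, including an estimated gaze point. A minimal sketch (class and property names are from Apple’s ARKit API; this requires a real device with the TrueDepth camera, so treat it as illustrative):

```swift
import ARKit

// Minimal sketch: run a face-tracking session and read the estimated
// gaze point each time ARKit updates the face anchor.
class GazeReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking is only supported on TrueDepth-equipped devices.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint is in the face anchor's coordinate space;
            // an app maps it onto screen coordinates to move a cursor.
            print("Gaze point:", face.lookAtPoint)
        }
    }
}
```

Apps like the ones below layer calibration, smoothing, and dwell/blink selection on top of this raw signal.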

I will mention some of the apps that are making use of this feature. The list will hopefully continue to grow.

Hawkeye Access app icon

Hawkeye Access allows you to browse the internet using your eyes. When the app is opened, it takes the user through a nine-point calibration process.  There is a brief tutorial that teaches you how to navigate: i.e., scroll, go back, or return to the home page.

This app does rely on the user being able to speak.  Dictation is used to perform a search or provide input to text fields (such as your Twitter login). This poses a challenge for people who need AAC, or whose speech is not clear.

You can use Hey Siri to open the app, so the user is not dependent on outside help. The app is customizable, so you can adjust the sensitivity, pick your selection mode, and configure your timing. Pretty cool.

I Have Voice app icon

There are now some options out there for those who need AAC. One app is called I Have Voice. It is set up for the basic communication of physical needs and does not provide access to generative language.

I Have Voice AAC screen

The app is aimed at adults with ALS. You can use your eyes to select between two pages of “Actions” (messages). It is possible to edit and create new actions.  There is a limited set of icons that you can choose to go with your written text.

You can choose between using dwell or a blink to activate the icons. There is no calibration procedure, so it takes a while to get a feel for how to make small motions to control the cursor on the screen. It is possible to change the sensitivity of the cursor, your blink, or your dwell time.

Some other apps that offer similar functionality are Pigio and SpeakProse.

Jabberwocky (dragon) app icon

For those who need text to speech, another app is called Jabberwocky. This app uses face tracking to access a keyboard with predictive text. Calibration is as simple as looking at one point on the screen.

The iPhone camera translates your head motion into movement of the cursor. The app allows you to speed up your input by using “Draw” mode. The cool thing with the standard Type mode is that you can see how your head movements track across the screen.

Jabberwocky app screen

The app saves your recent, favorite, and frequent utterances in your history. There is a pre-set “I use this computer to help me talk” sentence to allow the user to ask for more time. While the app used to cost $250, it is now FREE! That is the kind of app update I like.

Jabberwocky also offers a web browser. You can use head tracking and speech to navigate the internet.  This will also work for those who use AAC with a bit of tweaking.  You would need to browse the web on a different device than you use for speech. You would also want to set up a folder of frequently used phrases and sentences for web browsing on your AAC device.

Predictable app icon

Predictable AAC has also added head tracking. This is a text to speech app that has symbol-supported folders for topical communication. There are some updates coming: you will soon be able to see your “Recents” and other frequently used messages at the top of the screen.

Dialogue AAC on iPad and iPhone

Update 06/25/21: PRC-Saltillo has launched a new text to speech app for the iPad. Dialogue AAC is based on the Essence Vocabulary System and is accessible via head tracking on the iPad Pro or iPhone with the TrueDepth camera. You can also import a synthesized version of your own voice through Voice Keeper, creating a voice that is unique to you, and use Message Banking to save recorded phrases.


TouchChat HD AAC app home page

Another option is PRC-Saltillo’s TouchChat HD with WordPower. Head tracking works with the app on both the iPad Pro (2018) and the iPhone X or later. I have tried it out and it works great! You simply turn on head tracking within the settings in the TouchChat app. As long as you have the TrueDepth camera on your device, it is immediately available without any additional in-app costs. How awesome is that?

TouchChat also allows you to set your cursor shape, tracking speed, and off-screen indicator. A great feature is that you can choose between seven different Trigger Actions, such as raised eyebrows, smile, or tongue out. This allows you to further customize head tracking based on the individual’s abilities.

Sesame Enable

Sesame Talking Keyboard on an iPad

Sesame had been offering head tracking on Android devices and Windows computers. Their business model became difficult to sustain, and the company closed in the fall of 2019. Instead of ending access to their software, they made it free! On the iPad, you can download the Sesame Talking Keyboard. It works with the TrueDepth camera to allow head tracking for typing. It does not offer the ability to save frequently used messages, but it is great for on-the-spot, hands-free communication.

Eye Gaze for the iPad

Skyle iPad case with eye gaze bar

A new device is now being offered by Inclusive TLC that pairs the iPad with an eye gaze bar. It is called the Skyle and runs around $3,000. This is much cheaper than other dedicated communication devices with eye gaze. Skyle offers a choice of AAC apps, access to social media, email, and environmental control (using the Environ app to control Pretorian smart home devices). The protective case comes with a built-in mounting plate for wheelchair users. It works with the Skyle eye tracker utility app.

Update 06/16/21: In the fall of 2021, Apple will release iOS 15 and a new iPadOS. With this update to the operating system, Apple will allow third-party eye tracking bars to integrate with the iPad. This will open up even more routes to access for those who need eye tracking/eye gaze.

The above technology has the potential to truly make a difference for people with complex bodies. By making alternate access available on consumer electronics, it brings the price point way down. My hope continues to be that Apple will add head tracking to its accessibility settings in the future. Let’s give more people the power to access the entire device!


Kathryn Helland

Kathryn is a certified speech-language pathologist and works with children and adults with complex communication needs. She has been with the TechOWL team since 2015 and is currently working on her doctorate. She would like to examine how to best support AAC users in higher education.

24 comments on “Face Tracking and Eye Gaze for the iPhone and iPad: Access and AAC”

  1. Hi Kathryn
    Do you know of an eye or face gaze app for yes/no that would work on the split screen on an iPad? I have a competent AAC user, who is finding the motivation to move to eye gaze difficult so was hoping to have both on the screen.

    1. Hi Kathryn,
      I just attempted this on my 2018 iPad Pro with the following apps:

      TouchChat
      Predictable
      Jabberwocky

Jabberwocky does not support split screen. The other two apps allow you to use head tracking on the iPad. In both cases, when I entered split screen mode, the camera no longer read my face to move the cursor on the screen. It is possible that there are options for this if you use a third-party eye gaze bar with the iPad Pro. You might also want to check with the company that makes the Skyle eye gaze device (which uses the iPad). I have not checked this, but they may have dealt with this question before.

I do know that companies like Tobii, PRC, and Smartbox, which create dedicated communication devices, have been including access to more and more outside websites and apps. It is possible that your client might find one of those devices motivating if they can access Netflix, social media, etc.

      If you are a Pennsylvania resident, you can borrow some of these devices through our AT Lending Library at TechOWLpa.org. If not, visit AT3center.org to find your state’s AT Act program. You can also contact device vendors directly for information about device loans and demonstrations.

      I hope this helps! Kathryn

        1. Hello Fayyaaz,
Yes, any size iPad Pro, 2018 or later, has the TrueDepth camera. The key is that there is no longer a Home button: the iPad is able to use facial recognition to unlock. This camera has then been used by some app developers for face tracking, and several AAC apps have added this feature. These include TouchChat HD with WordPower, Jabberwocky, and Predictable AAC.

          I hope this helps! Feel free to email me at Kathryn.Helland@Temple.edu with any further questions.

          Kathryn

2. I have a friend with advanced ALS – eye movement only, no speech – who needs a way to summon caregivers via text messaging or phone call. Essentially she wants an eye gaze controlled version of the emergency call button used by the elderly who live alone. I have looked in vain for such. Can you help?

    1. Hi Christine,
Since your friend has only eye movement, head tracking probably would not work. However, there is now the Skyle eye gaze device from Inclusive TLC. There are also dedicated communication devices by companies like Tobii and PRC-Saltillo. If you get an AAC system set up, you can always use it to talk to Alexa. There are Alexa skills that allow you to send a message to people on your contact list. One is “Call my Buddy”. That might be one way to set up emergency messages to caregivers. Good luck!

3. Do you know where to go in the settings to turn on the eye tracking? I can’t seem to find a site that gives the directions for turning it on (before someone asks, yes, the device I’m trying to use is the eye tracking enabled iPad).

1. Hi Nicole, sorry for replying so late! For TouchChat, you need to go into the settings within the app. Open a vocab set, look for the menu on the right-hand side, and go to settings. Scroll down until you see head tracking as an option. It is different for each app, and there is no setting on the iPad itself. This may come with future iOS updates, but they don’t have it yet. Some of the other apps, such as Jabberwocky, calibrate the head tracking automatically.

4. I am a Speech-Language Pathologist and have a student who just recently got approved to receive an iPad Pro with TouchChat-WordPower after trialing it. The student is in a general education classroom because, cognitively, the student is on grade level. But the student is minimally verbal, hence the need for a voice-output program, and has no volitional control for fine motor tasks (hands), hence eye gaze/facial tracking. We would like to help our student be more independent with math, but we need to be able to provide access to a calculator, especially since the student can’t write anything down. Right now it’s a laborious process, with our student telling the aide which numbers and signs to input. Do you know of any calculator apps that currently allow, or will in the future allow, facial tracking and selection? Or if the Hawkeye app would allow access/selection of an online calculator? I came across this list while trying to find something that might work better than our current situation. Thank you for any input or guidance!

1. Hi Barbara, I have not yet tried to access a calculator using head tracking, but will try to find a way to do so! You mention Hawkeye, but have you tried the Jabberwocky web browser? It uses head tracking and I have found it to be pretty easy to set up. My other, out-of-the-box thought: if you have a second device running iOS 13, your student could use their device to interact with Voice Control on the second device. Using a series of commands, such as “Tap 5” or “Tap multiply”, they could use the calculator independently.

      I will let you know what I find out!
      Kathryn

      1. Thank you so much for your suggestions and getting back to me quickly! I actually didn’t know that Jabberwocky has a browser app, too! I only knew about their AAC app. That could potentially work for our student with the Desmos online calculator (one that the teacher has mentioned). I’m going to follow up on that, and we will hopefully try it out! Unfortunately, the voice command would not work for our student, as intelligibility is greatly decreased. Vowels are distinct, but consonants often are not. We adapt any multiple choice homework, tests, or quizzes that have letters so that numbers are used instead. “bee,” “cee,” and “dee” all sound the same, but “one,” “two,” “three,” and “four” all have different, distinct vowels. And utterances longer than 2-3 syllables are very unclear.

        1. Hi. Voice commands might work coming from the AAC device to another iPad or iPhone running iOS 13. You would need to program a page of commands. I asked the folks at Jabberwocky, and they thought this calculator site might work well with their browser:

          https://www.desmos.com/scientific

          Good luck! Kathryn

          1. I see what you mean now! Sorry! Initially, I didn’t pick up on second iPad following voice command from first iPad. Yes- that could potentially work. I shared the information with our AT lead this afternoon, and we are going to try using the Jabberwocky Browser with the Desmos calculator. On the loaner if we can (their browser is not installed, but their AAC app is) or as soon as we get the permanent one in. Thank you so much for helping us out with this!

5. I just saw at IFA in Berlin a German start-up called eyeV that had an eye tracker for the iPad Pro. It worked pretty well for me, but the downside is that you have to pay more…

    1. I will be interested to test this out once it comes to the app store here. I was hoping that more AAC apps would hop on this access modality and update their software, as TouchChat has done.

6. Very exciting. I have a 5-year-old daughter who is completely paralyzed from the neck down and has a tracheostomy, but thankfully she can talk. This device will hopefully allow her to navigate Netflix and YouTube, and hopefully allow her to play games like Roblox independently. Can’t wait to try it out. Must get to saving!!!!

7. Can someone please help me? I have a patient with Rett Syndrome, non-verbal. Mom would like to see if this iPad / eye gaze device is covered under her Medicaid insurance. Can someone provide me information on this or how she would go about this? She said she might need a code, but I am not sure. Thanks!

    1. Hi Daniella,
      It may be possible to get this device approved through Medicaid if her daughter has an AAC evaluation (necessary for the insurance report) and applies through a company that will “bundle” the device as a dedicated communication device. Many app companies will do this now. Two companies that allow you to choose which app to load on the device are Ablenet and ACCI. Check with your state AT Act program to see if you can borrow the device/app to try it out! You can find your state AT program at AT3Center.net.

  8. I’ve just tried Hawkeye Access and I Have Voice on my iPhone X. I don’t have accessibility issues, but my son is tetraplegic and cannot speak, but he can point with his eyes accurately enough for a few seconds.
    Question is: the position detection doesn’t seem very accurate, even after more calibrations (on Hawkeye Access). Do you have better experiences with the paid apps or with the free ones but on an iPad Pro?

  9. This news is beyond exciting! This coming from a person into a 4+ year ALS diagnosis typing with my last usable finger. As a Mac/Apple user since 1984 I’ve wondered why such a hi-tech company has not stepped up – eye gaze technology is hopefully in my remaining time for use on my iPad Pro!

    1. Yes, very exciting! As long as you have the newest iPad Pro model, or an iPhone X or beyond, you will be able to use this feature on the apps mentioned in the post. I would try out the apps and give your feedback to the developers!
      Kathryn

      1. Kathryn, I have now purchased an iPad Pro. Is there any way to test the TouchChat HD with Wordpower before spending $299? I have advancing ALS and need something now.

        1. Hi Cris,
          We are adding an iPad Pro to our lending library. If you are a Pennsylvania resident, you should absolutely borrow the device, request the TouchChat app and “try before you buy”. The website is: TechOWLpa.org.

          If you live in another state or territory, check out AT3Center.org to Find Your State and contact your local AT Act program.
          Once you borrow, feel free to contact me with any questions about how to set up the Facetracking in settings. My address is: tug34189@Temple.edu.
