Over the years, Netflix has encouraged its developers to experiment with different ways to interact with the platform. For example, several years ago the company created its own “Netflix and chill” button, and a couple of years ago it explored the idea of a virtual reality video store where you could “walk around” and browse videos like the good old days.
This year’s Netflix Hack Day produced several interesting experiments, one of which leverages Apple’s ARKit and the TrueDepth camera system that powers Face ID to bring eye tracking to the app. According to Netflix, the feature (created by Ben Hands, John Fox, and Steve Henderson) was designed with accessibility in mind.
Many of us take our hands and fingers for granted, but some people are missing hands or fingers, or have a disability that prevents them from using their hands, and eye tracking would let them navigate the app. The feature uses the front-facing camera to follow your eye movement, and when it notices that your gaze has lingered on a particular area of the screen for an extended amount of time, it registers that as a tap.
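Netflix hasn’t published how the hack works under the hood, but the dwell-to-tap behavior described above can be sketched as a small piece of logic: keep feeding in gaze coordinates (on iOS these could plausibly come from projecting ARKit’s `lookAtPoint` onto the screen), and fire a “tap” once the gaze has stayed inside the same small region long enough. The class name, thresholds, and coordinate handling below are all illustrative assumptions, not Netflix’s actual code.

```python
class DwellDetector:
    """Hypothetical sketch of dwell-based 'gaze taps': a gaze point held
    within the same small screen region for dwell_threshold seconds is
    treated as a tap. Not Netflix's actual implementation."""

    def __init__(self, dwell_threshold=1.5, region_size=40):
        self.dwell_threshold = dwell_threshold  # seconds of steady gaze required
        self.region_size = region_size          # pixel tolerance for "same spot"
        self._anchor = None                     # (x, y) where the gaze settled
        self._anchor_time = 0.0                 # when it settled there

    def process(self, x, y, t):
        """Feed one gaze sample (screen coordinates plus a timestamp).
        Returns True when the sample completes a dwell and should act as a tap."""
        if self._anchor is not None:
            ax, ay = self._anchor
            if abs(x - ax) <= self.region_size and abs(y - ay) <= self.region_size:
                if t - self._anchor_time >= self.dwell_threshold:
                    self._anchor = None         # reset so one dwell fires one tap
                    return True
                return False                    # still dwelling, not long enough yet
        self._anchor = (x, y)                   # gaze moved: restart the timer here
        self._anchor_time = t
        return False
```

A real version would also need to smooth the noisy gaze signal and show the user some visual feedback (such as a shrinking ring) while the dwell timer runs, so that taps don’t fire by surprise.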
The team also incorporated facial gesture recognition, such as sticking your tongue out to dismiss a screen. That being said, it is unclear whether this feature will make it into Netflix’s apps in the future, but here’s hoping that it will!