
How Eye Tracking is Driving the Next Generation of AR and VR

AR and VR are gearing up for a giant leap forward thanks to advancements in eye-tracking technology.

The industry has been experiencing a boom in recent years with hundreds of startups and heavy investment from tech giants including Google, Apple, Samsung, and Facebook. Despite all the activity, AR/VR hardware remains relatively crude. Most interfaces take cues from head movement and manual inputs. Graphics often appear artificial, and can be harsh on the eyes due to low resolution and slow frame rates.

Eye-tracking systems, which monitor eye movements in real time, promise to change this.

Historically, the technology has been used to collect information for scientific and business applications, such as market research and medical diagnostics. Because humans primarily use vision to navigate their environment, the eyes can reveal volumes about what’s happening in the mind. They can tell a device what the user is focusing on and how they’re responding. In computing, eye tracking helps lay the groundwork for a revolution in human-to-machine relationships by allowing the two control centers, brain and processor, to “talk” to each other without manual inputs such as buttons, controllers, or a mouse. A smartphone or laptop that responds to eye movement and verbal commands, for example, works more closely with the human mind than one that requires touch or a mouse and keyboard. The evolution of the Internet of Things (IoT), including driverless cars and smart appliances, relies on these kinds of relationships.

Thanks to tiny, powerful components, including compact infrared light-emitting diodes (IREDs), companies are finally integrating eye-tracking sensors into their products. When done well, these systems could enable virtual displays that respond to natural, even subconscious, cues from the user. It could be the beginning of a truly immersive virtual experience.

Industry Is Changing

In January of 2017, FOVE, a Japanese VR startup, released the first eye-tracking VR headset.

Meanwhile, a host of mid-to-high-end AR/VR hardware companies have been working to retrofit existing headsets with some form of eye tracking and to build it into future models. Tobii and other eye-tracking developers have begun licensing their technology for consumer AR/VR products, while tech giants like Google and Facebook have absorbed some of the most promising startups in the space.

Growing enthusiasm for consumer VR/AR is having a profound impact on the market. Already, nine percent of Internet-connected households in the U.S. expect to purchase a VR headset in the next year, and 24 million households worldwide will do so by the end of 2017, according to Parks Associates. UBI Research expects total shipments to exceed 65 million units by 2021, and International Data Corporation (IDC) forecasts a five-year compound annual growth rate of 108.3% for AR/VR headsets.

In China alone, the consumer VR/AR market could reach $8.5 billion in the next four years, reports Bloomberg Technology. Meanwhile, MarketWatch believes the eye tracking market will reach $1.4 billion by 2023 due, in part, to its role in VR/AR products.

How Eye Tracking Supports Immersion

The long-range goal for AR/VR is full immersion—the ability to deliver a user from the real to the virtual world without sensory interruption. It’s one of the most complex undertakings in the history of modern computing. To complete the illusion, hardware will need to shed manual inputs and enable the user to interact with the virtual world as they do with the real one.

Getting there will take time. Here’s how eye tracking facilitates the development of immersive features:

Foveated Rendering

We perceive visual detail at the center of our field of vision through the fovea, the small region of the retina responsible for sharp visual acuity, which sits directly behind the pupil on the eye’s visual axis. Objects outside the foveal range, in the periphery, appear blurry. Rendering every part of every frame at the same resolution, as most AR/VR systems do, is therefore wasteful. Foveated rendering, a digital imaging technique, mimics how we see by showing the foveal target in high definition (HD) while lowering the resolution of the surroundings. Eye tracking is essential for locating this focal point as the eye bounces around the display.


The technique is substantially gentler on our eyes, and on the hardware. Because it reduces the overall pixel count without altering image integrity, foveated rendering cuts rendering load and power consumption significantly. This frees up resources that can be redirected to other operations, such as raising the frame rate. As a result, the user experiences less lag and motion sickness while the fidelity of the display improves. A video demonstration by The Eye Tribe notes dramatic changes to the system’s benchmarks, including Graphics Processing Unit (GPU) load dropping from 80-90% to 30-40%, clock rate falling from 1,200 MHz to 800 MHz, and thermal design power (TDP) dropping from 70% to 40%.
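
As a rough illustration of how this works in code, here is a minimal sketch, independent of any particular headset SDK, that picks a rendering resolution for a screen tile based on its angular distance from the tracked gaze point. The band radii and resolution fractions are illustrative assumptions, not figures from any vendor.

```python
import math

# Illustrative eccentricity bands (degrees from the gaze point) and the
# fraction of full resolution rendered inside each band. These values are
# assumptions for demonstration purposes only.
FOVEATION_BANDS = [
    (5.0, 1.00),            # foveal zone: full resolution
    (15.0, 0.50),           # near periphery: half resolution
    (float("inf"), 0.25),   # far periphery: quarter resolution
]

def angular_distance_deg(gaze_dir, tile_dir):
    """Angle in degrees between the gaze ray and the ray through a screen tile.
    Both directions are unit 3-vectors in eye space."""
    dot = sum(g * t for g, t in zip(gaze_dir, tile_dir))
    dot = max(-1.0, min(1.0, dot))  # guard against rounding outside [-1, 1]
    return math.degrees(math.acos(dot))

def shading_rate(gaze_dir, tile_dir):
    """Return the resolution fraction to use for the tile around tile_dir."""
    eccentricity = angular_distance_deg(gaze_dir, tile_dir)
    for max_ecc, rate in FOVEATION_BANDS:
        if eccentricity <= max_ecc:
            return rate

# Example: a tile 20 degrees away from the gaze direction is rendered at 25%.
tile = (math.sin(math.radians(20)), 0.0, math.cos(math.radians(20)))
print(shading_rate((0.0, 0.0, 1.0), tile))
```

When the tracker reports a new gaze direction, only the band centers move; the renderer simply re-evaluates the rate per tile, which is why a fast, accurate gaze signal is the critical input.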

Precise Interpupillary Distance (IPD)

The user engages with AR/VR through a set of lenses, often called “goggles.” These must align with the user’s pupils to fully convey three-dimensionality in the display. If the alignment is off, the eyes strain and the image becomes flattened and obscured.

Finding the proper position can be challenging. The distance between the pupils, known as the interpupillary distance (IPD), ranges from 5.1 cm to 7.7 cm in adults. The same variation is apparent in binoculars, microscopes, and telescopes, which must be adjusted to fit the viewer’s IPD before they transmit optimal detail and depth.

Most AR/VR headsets overcompensate with extra-wide goggles or manual adjusters. Thanks to eye-tracking systems, which locate the pupils within the device, new products can calculate IPD with precision. The right fit allows the user’s eyes to relax and fully engage in the experience, and it improves the overall feel of the display.
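
As a simple illustration, the sketch below shows how a headset might turn two tracked pupil positions into an IPD measurement and a pair of lens offsets. The coordinate convention, function names, and clamping behavior are assumptions made for this example.

```python
import math

# Typical adult IPD range cited above, in centimeters.
IPD_MIN_CM, IPD_MAX_CM = 5.1, 7.7

def measure_ipd_cm(left_pupil_mm, right_pupil_mm):
    """Estimate IPD from the 3-D pupil centers reported by the eye tracker.

    Positions are assumed to be (x, y, z) tuples in millimeters within a
    shared headset coordinate frame, a simplification for this sketch.
    """
    return math.dist(left_pupil_mm, right_pupil_mm) / 10.0  # mm -> cm

def lens_offsets_cm(ipd_cm):
    """Split the measured IPD symmetrically into per-lens offsets from the
    headset centerline, clamped to the expected adult range."""
    ipd_cm = max(IPD_MIN_CM, min(IPD_MAX_CM, ipd_cm))
    half = ipd_cm / 2.0
    return -half, half  # left lens, right lens

# Example: pupils tracked 63 mm apart give an IPD of 6.3 cm and 3.15 cm offsets.
ipd = measure_ipd_cm((-31.5, 0.0, 12.0), (31.5, 0.0, 12.0))
print(ipd, lens_offsets_cm(ipd))
```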

Natural User Interface (UI)

When the display responds to eye movements—not just head positioning—the user gains a more natural field of view. Eye tracking can empower the user to select virtual objects, such as a ball or a weapon in a game, or teleport to a new place within the virtual environment, simply by looking.
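
One common way to build such an interface, sketched below under assumed scene and data conventions, is to cast a ray along the gaze direction, find the nearest object it intersects, and confirm the selection only after the gaze has rested on that object for a short dwell time. The object format, dwell threshold, and function names are illustrative assumptions.

```python
import time

# Illustrative dwell time before a gazed-at object counts as "selected".
DWELL_SECONDS = 0.6

def gaze_ray_pick(gaze_origin, gaze_dir, objects):
    """Return the nearest object whose bounding sphere the gaze ray hits.

    Each object is assumed to be a dict with 'name', 'center' (x, y, z),
    and 'radius'; gaze_dir is assumed to be a unit vector.
    """
    best, best_t = None, float("inf")
    for obj in objects:
        oc = [c - o for c, o in zip(obj["center"], gaze_origin)]
        t = sum(a * b for a, b in zip(oc, gaze_dir))           # distance along the ray
        closest = [o + t * d for o, d in zip(gaze_origin, gaze_dir)]
        miss_sq = sum((c - p) ** 2 for c, p in zip(obj["center"], closest))
        if t > 0 and miss_sq <= obj["radius"] ** 2 and t < best_t:
            best, best_t = obj, t
    return best

class DwellSelector:
    """Confirm a selection only after the gaze rests on the same object."""
    def __init__(self):
        self.target, self.since = None, None

    def update(self, hit):
        now = time.monotonic()
        if hit is not self.target:
            self.target, self.since = hit, now
            return None
        if hit is not None and now - self.since >= DWELL_SECONDS:
            return hit  # selection confirmed
        return None

# Example: one object straight ahead of the user is picked by the gaze ray.
scene = [{"name": "ball", "center": (0.0, 0.0, 2.0), "radius": 0.2}]
print(gaze_ray_pick((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), scene)["name"])
```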

This eliminates the need for icons that clutter the screen and reduces the mental energy required for navigation. In augmented reality, a gaze-directed interface might allow a user to select items for purchase at a brick-and-mortar store without having to interact with store employees.

Social Response

The eyes are our first instrument of communication. Long before we learned to talk, we perceived emotions through subtle facial movements. By monitoring eye signals, immersive hardware supports software programs that recognize and respond to the user with increased sensitivity. The technology could help bring virtual characters to life by endowing them with the ability to perform authentic social cues. New avatars could relay the eye movements of their human counterparts, bringing a new dimension to video games and other virtual content.

Seamless Identification

An individual’s iris is as unique as their thumbprint. Through iris scanning, enabled by eye tracking, AR/VR headsets can eliminate the need for passwords, making the system both more accessible and more secure.

Eye Tracking & Innovation

The development of high-impact technologies tends to accelerate after a major breakthrough. Now that FOVE has released the first eye-tracking VR headset, we can expect this functionality to soon come standard in a range of products, from affordable smartphone headsets to high-end systems.

“The view that eye tracking will be a key part of second generation headsets is shared by a large number of VR HMD vendors,” Tobii Business Unit President Oscar Werner explained to TechCrunch. “This drives technology development and innovation.”

As a point of reference, Tobii’s sales nearly doubled from 2014 to 2016, shortly after the company began investing in VR technology.

To take full advantage of the changing market, hardware developers will need to integrate eye-tracking systems capable of keeping up with the human eye. The technology works by beaming invisible infrared (IR) light onto the eye. The light passes through the pupil undetected and reflects off the surrounding iris, revealing the edge of the pupil to camera sensors.
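
In the camera frame, that contrast appears as a dark pupil surrounded by a brighter iris. As a rough illustration of the image-processing step, the sketch below estimates the pupil center from a plain grayscale array by thresholding and taking a centroid; the threshold value and frame layout are assumptions, and production trackers use far more robust methods.

```python
import numpy as np

def pupil_center(ir_frame, dark_threshold=40):
    """Estimate the pupil center in an 8-bit grayscale IR frame.

    Under IR illumination the pupil appears as the darkest region while the
    iris reflects the light, so a simple intensity threshold followed by a
    centroid gives a first estimate. The threshold of 40 is an assumed value;
    real systems adapt it per frame.
    """
    dark = ir_frame < dark_threshold               # mask of dark pixels
    ys, xs = np.nonzero(dark)
    if xs.size == 0:
        return None                                # no pupil candidate found
    return float(xs.mean()), float(ys.mean())      # (x, y) centroid in pixels

# Example with a synthetic 120x160 frame: bright iris, dark disc for the pupil.
frame = np.full((120, 160), 180, dtype=np.uint8)
yy, xx = np.ogrid[:120, :160]
frame[(xx - 80) ** 2 + (yy - 60) ** 2 < 15 ** 2] = 10
print(pupil_center(frame))  # roughly (80.0, 60.0)
```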

A saccade, the rapid jump the eye makes between fixation points, is one of the fastest movements in the human body: the eye can shift at 0.9 degrees per millisecond and initiate a movement within 200 ms. The eye-tracking systems employed in AR/VR must match this pace, especially for foveated rendering, where the image changes with each eye movement. In addition to powerful memory and tiny, accurate sensors, eye tracking requires an efficient, lightweight source of IR light. For this purpose, Osram Opto Semiconductors developed the Firefly SFH 4055, an IRED designed for eye tracking at near range (less than 5 cm between the eye and the sensor). The Firefly SFH 4055 is angled for side installation, maximum reflection, and minimal energy consumption.
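
To see why those figures matter, the short calculation below estimates how far the gaze can drift during one end-to-end update (tracking plus rendering plus display) at the quoted peak velocity, and compares that drift with an assumed high-resolution radius around the gaze point. The latency values and the 5-degree radius are illustrative assumptions.

```python
# Back-of-envelope latency budget for gaze-driven foveated rendering.
PEAK_SACCADE_DEG_PER_MS = 0.9  # peak eye velocity quoted above
HIGH_RES_RADIUS_DEG = 5.0      # assumed radius of the full-resolution zone

def gaze_drift_deg(total_latency_ms):
    """Worst-case angular drift of the gaze during one update cycle."""
    return PEAK_SACCADE_DEG_PER_MS * total_latency_ms

# Compare a few assumed end-to-end latencies.
for latency_ms in (2, 5, 10, 20):
    drift = gaze_drift_deg(latency_ms)
    status = "inside" if drift <= HIGH_RES_RADIUS_DEG else "outside"
    print(f"{latency_ms:>3} ms latency -> {drift:4.1f} deg drift ({status} the assumed high-res zone)")
```

In practice the constraint is softened by rendering the high-resolution region with a generous margin, but the arithmetic shows why tracker latency and update rate are critical parameters for foveated rendering.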

Eye tracking affords a remarkable opportunity to AR/VR developers: a chance to build machines capable of interacting with the human mind. As the technology advances, ultra-compact IR emitters will play a critical role in hardware innovation.

About the Scout

Eric Kuerzel

Eric Kuerzel is a Product Marketing Manager for infrared products at Osram Opto Semiconductors North America. Kuerzel joined Osram Opto Semiconductors in 2006 and has worked with the display backlighting, camera flash, and mobile device markets. He holds a master’s degree in Business Engineering (MBE) from Steinbeis University in Berlin, Germany and an Engineering degree in Micro Systems from the OTH in Regensburg, Germany.
