OREANDA-NEWS. July 22, 2016. NVIDIA researchers are using SMI's latest eye-tracking technology to develop a new technique that matches the physiology of the human eye to heighten visual fidelity in VR.

The demo we're bringing to the annual SIGGRAPH computer graphics conference in Anaheim, Calif., July 24-28, is simple. Strap on a head-mounted display with integrated eye tracking. Look around the virtual scene of a school classroom with blackboard and chairs. Looks good, right?

Now gaze at the teacher's chair, turn off the eye tracking and look around again. Only the area around the chair is rendered in detail. In your periphery, the demo was rendering a less detailed version of the image, and you couldn't tell.

This new perceptually based foveated rendering technique is more than just a neat trick. It promises to put computing resources to work where they matter most, letting developers create more immersive virtual environments.

What Makes NVIDIA's Foveated Rendering Technique Special

Human vision can be thought of as having two components: foveal and peripheral vision. The small region of your retina called the fovea is densely packed with cones, a type of photoreceptor cell that provides sharp and detailed vision. Peripheral vision covers a much wider field of view but lacks acuity.

This acuity difference has inspired foveated rendering systems, which track the user's gaze and seek to increase graphics performance by rendering with lower image quality in the periphery. However, foveated rendering taken too far will lead to visible artifacts, such as flicker, blur or a sense of tunnel vision.
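To make the basic idea concrete, here is a minimal sketch of gaze-driven foveation in Python. It is not NVIDIA's algorithm; the field-of-view, resolution and eccentricity thresholds are illustrative assumptions. The idea is simply to estimate how far a pixel is from the tracked gaze point, in degrees, and render more coarsely the farther it falls into the periphery.

```python
import math

DISPLAY_FOV_DEG = 100.0   # assumed horizontal field of view of the HMD
DISPLAY_WIDTH_PX = 2160   # assumed panel width in pixels

def eccentricity_deg(pixel_x, pixel_y, gaze_x, gaze_y):
    """Approximate angular distance (degrees) between a pixel and the gaze point."""
    deg_per_px = DISPLAY_FOV_DEG / DISPLAY_WIDTH_PX
    dx = (pixel_x - gaze_x) * deg_per_px
    dy = (pixel_y - gaze_y) * deg_per_px
    return math.hypot(dx, dy)

def shading_rate(ecc_deg):
    """Map eccentricity to a relative rendering quality (1.0 = full quality)."""
    if ecc_deg < 5.0:      # foveal region: full resolution
        return 1.0
    elif ecc_deg < 20.0:   # near periphery: moderate reduction
        return 0.5
    else:                  # far periphery: aggressive reduction
        return 0.25

# Example: a pixel about 30 degrees from where the user is looking
print(shading_rate(eccentricity_deg(2000, 1200, 1080, 600)))  # -> 0.25
```

Pushing the thresholds above too hard is exactly what produces the flicker, blur or tunnel-vision artifacts described here.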

Our researchers used SMI's prototype eye-tracking HMD to perform a careful perceptual study of what people actually see in their peripheral vision in VR. Our researchers then used those insights to design a new rendering algorithm that enables much greater foveation, or reduction in rendering effort, without any discernible drop in visual quality.

The NVIDIA Research team, led by Anjul Patney, Joohwan Kim, and Marco Salvi, discovered that existing foveated rendering techniques tend to generate either blur or flicker in the peripheral vision. So the team worked to understand the details that humans pick up in the periphery, such as color, contrast, edges and motion.

For example, they found that the traditional approach of rendering lower-resolution images for our peripheral vision results in distracting flickering if the foveation is too aggressive. They also found that simply blurring images picked up by the eye's peripheral region reduces contrast, inducing a sense of tunnel vision.

However, by combining blur with contrast preservation, they found that users could tolerate up to twice as much blur before seeing differences between the foveated and non-foveated images.
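The sketch below illustrates one way "blur plus contrast preservation" could look in code. It is an illustrative approximation, not the algorithm from the research paper: it blurs a grayscale image, then rescales the remaining local detail so local contrast roughly matches the original, counteracting the washed-out, tunnel-vision effect of plain blur.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def contrast_preserving_blur(image, blur_sigma=4.0, local_sigma=8.0, eps=1e-6):
    """Blur an image in [0, 1], then restore its local contrast."""
    blurred = gaussian_filter(image, blur_sigma)

    def local_stats(img):
        # Local mean and local standard deviation (a proxy for local contrast).
        mean = gaussian_filter(img, local_sigma)
        var = gaussian_filter(img * img, local_sigma) - mean * mean
        return mean, np.sqrt(np.maximum(var, 0.0))

    orig_mean, orig_std = local_stats(image)
    blur_mean, blur_std = local_stats(blurred)

    # Boost the blurred image's detail so its local contrast matches the original's.
    gain = orig_std / (blur_std + eps)
    enhanced = blur_mean + (blurred - blur_mean) * gain
    return np.clip(enhanced, 0.0, 1.0)

# Example: apply to placeholder "peripheral" image data
periphery = np.random.rand(256, 256)
result = contrast_preserving_blur(periphery)
```

Under this kind of scheme, the low spatial frequencies that cause flicker are still removed, while overall contrast in the periphery stays closer to the original image.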

"That's a big deal," explains Patney, "because with VR, the required frame rates and resolutions are increasing faster than ever."

Eye-Tracking's Potential

The demonstration wouldn't be possible without our partners at SMI, the Germany-based maker of computer vision applications. Its technology works by surrounding the edge of each of a head-mounted display's lenses with infrared lights. Combined with SMI's software, this allows computers to detect where your eye is looking, precisely and with incredible speed.
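As a rough sketch of how a renderer might consume that tracking data, the snippet below projects a gaze direction onto display pixel coordinates, which is the input a foveated renderer would need. The coordinate convention, field-of-view and resolution values are assumptions for illustration; SMI's actual SDK and data formats are not shown.

```python
import math

PANEL_W, PANEL_H = 2160, 1200        # assumed per-eye panel resolution
FOV_X_DEG, FOV_Y_DEG = 100.0, 110.0  # assumed horizontal and vertical fields of view

def gaze_dir_to_pixel(dir_x, dir_y, dir_z):
    """Project a unit gaze direction (+z forward) onto panel pixel coordinates."""
    # Angles off the optical axis, in degrees.
    yaw = math.degrees(math.atan2(dir_x, dir_z))
    pitch = math.degrees(math.atan2(dir_y, dir_z))
    # Map angles to pixels, with (0, 0) at the panel's top-left corner.
    px = (yaw / FOV_X_DEG + 0.5) * PANEL_W
    py = (0.5 - pitch / FOV_Y_DEG) * PANEL_H
    return px, py

# Example: the user looks slightly up and to the right of center.
print(gaze_dir_to_pixel(0.1, 0.05, 0.99))
```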

Foveated rendering is just one application for SMI's eye-tracking VR technology. It also promises to enable personal display calibration, barrier-free natural interaction, social presence and new analytical insights.

Learn More

For all the details, visit our project page. To see our demonstration for yourself, visit our booth in the Emerging Tech area at SIGGRAPH. And follow the latest from the show at #SIGGRAPH2016.