The range the human eye can see is limited. If you look at the wall clock above a blackboard, your eyes focus on it while everything around it automatically blurs. We live in a 360-degree world, but at any moment humans can see only within a roughly 120-degree field of view, and the eyes truly focus on a small area of no more than 6 degrees; everything around it is blurred. That is how we see the world.
Nvidia has been developing eye-tracking technology that reproduces this human visual behavior in virtual reality, and yesterday it announced the latest progress, which promises to make the virtual world more realistic.
VR hardware vendors' pain points
The problem currently facing VR hardware manufacturers is that players' own computer hardware cannot meet the demand for high-definition rendering on display devices. The Oculus Rift requires a PC costing more than $1,000 just to render at roughly 1K resolution. To match the resolution of the real world, rendering would have to reach 8K per eye; that hardware requirement alone is enough to give manufacturers headaches.
The most widely recognized way to solve this problem is localized (foveated) rendering combined with eye tracking: rendering effort is concentrated on the part of the screen the user's eyes are actually looking at, which greatly reduces the load on the GPU.
Over the past nine months, Nvidia's David Luebke and his researchers have attempted to simulate this phenomenon in virtual reality: the device fully renders the area where the user's gaze is focused, while the resolution of out-of-focus areas is reduced. As the player's gaze moves across the screen, the eye-tracking system constantly adjusts the rendering focus. To fully render a frame at 90 FPS (the lowest acceptable frame rate), a 4-megapixel image must be drawn nearly 100 times per second; rendering at full quality only where the user is looking greatly reduces the computational load.
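The savings described above can be illustrated with some back-of-the-envelope arithmetic. The 4-megapixel and 90 FPS figures come from the article; the size of the foveal window and the peripheral pixel density are assumptions chosen purely for illustration:

```python
# Back-of-the-envelope pixel budget: full rendering vs. foveated rendering.
# FULL_PIXELS and FPS are the article's figures; the foveal fraction and
# peripheral density below are illustrative assumptions, not Nvidia's values.

FULL_PIXELS = 4_000_000       # pixels per frame (article: 4 megapixels)
FPS = 90                      # minimum acceptable frame rate

full_budget = FULL_PIXELS * FPS  # pixels shaded per second, naive rendering

foveal_fraction = 0.1         # assumed share of the frame near the gaze point
peripheral_scale = 0.25       # assumed pixel density outside the fovea

foveated_pixels = FULL_PIXELS * (foveal_fraction
                                 + (1 - foveal_fraction) * peripheral_scale)
foveated_budget = foveated_pixels * FPS

savings = 1 - foveated_budget / full_budget
print(f"full: {full_budget:,.0f} px/s, foveated: {foveated_budget:,.0f} px/s")
print(f"savings: {savings:.0%}")
```

Even with these conservative assumptions, well over half of the per-second pixel work disappears, which is why foveated rendering is so attractive for hitting 90 FPS on modest GPUs.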
The application of existing eye tracking technology
Eye tracking is not new in virtual reality research. Common implementation approaches include:
1. Tracking changes in the eyeball and its surrounding features
2. Tracking changes in the angle of the iris
3. Actively projecting infrared or other beams onto the iris and extracting features
In the virtual reality field, Japan's FOVE has developed the first virtual reality head-mounted display to use eye tracking. FOVE embeds two infrared cameras at eye level inside the helmet; placed below the lenses, they do not obstruct the field of view while tracking the player's pupil movement.
Tobii has taken eye-tracking hybrid technology further, teaming up with Starbreeze to build eye tracking into the StarVR headset, which offers 5K image quality and a 210-degree field of view.
German eye-tracking company SensoMotoric Instruments (hereinafter SMI) demonstrated a remote eye-tracking system developed with Sony's Magic Lab at last year's developer conference. In collaboration with Samsung, the company also launched the SMI Mobile Eye Tracking HMD Observation, which observes and records visual behavior in a virtual environment in real time, and the SMI Mobile Eye Tracking HMD Analysis Pro, which provides a simple analysis package for gaze behavior.
Neither the HTC Vive nor PlayStation VR uses eye tracking yet. Oculus founder Palmer Luckey said in an interview that eye tracking is the "most critical part of future VR technology," but added that current eye-tracking solutions are still fairly primitive.
How to solve the picture delay
As Palmer Luckey noted, although many companies already have this technology, the speed of eye tracking often cannot keep up with the speed of human eye movement, and rendering speed is imperfect, so the displayed image always lags, making the user uncomfortable.
In May last year, Nvidia launched multi-resolution shading (MRS) to speed up rendering. With MRS, VR rendering no longer draws the entire image at one resolution; the frame is divided into several regions. The focus region is rendered at full resolution, while the edges of the image are rendered at lower quality and then warped, saving 25%-50% of the pixels and theoretically doubling rendering speed.
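The pixel savings from the region-based split above can be sketched with a simple calculation. The 3x3 tiling matches MRS's general idea, but the tile sizes, edge density, and render-target size below are illustrative assumptions, not Nvidia's actual parameters:

```python
# Rough sketch of multi-resolution shading (MRS) pixel savings, assuming a
# 3x3 split of the viewport: the centre tile keeps full resolution and the
# eight edge tiles are shaded at reduced density. All numbers are
# illustrative assumptions, not Nvidia's real implementation values.

WIDTH, HEIGHT = 2160, 1200     # assumed per-frame render target

center_w, center_h = 0.6, 0.6  # assumed fractional size of the centre tile
edge_scale = 0.5               # assumed edge-tile pixel density (per axis)

full_pixels = WIDTH * HEIGHT

center_pixels = (WIDTH * center_w) * (HEIGHT * center_h)
edge_pixels = (full_pixels - center_pixels) * edge_scale ** 2

mrs_pixels = center_pixels + edge_pixels
savings = 1 - mrs_pixels / full_pixels
print(f"pixels saved: {savings:.0%}")
```

With these assumed parameters the savings land inside the 25%-50% band the article quotes; shrinking the centre tile or the edge density pushes the figure toward the top of that range.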
Recently, Nvidia and SMI unveiled the latest eye-tracking virtual reality display, achieving accurate, low-latency eye tracking at a sampling rate of 250 Hz. Luebke said this is the first time their eye-tracking devices can keep up with the eyes.
Coordinating out-of-focus resolution
Although the latency problem has been solved, the picture quality still falls short. The Nvidia team must spend considerable effort on precise calculation to push the resolution of the out-of-focus region as low as possible before the viewer notices. While the resolution is being reduced, any flicker causes interference, and peripheral vision is especially sensitive to flicker. If the out-of-focus picture is too blurry, it produces a tunnel-vision effect, like looking through a telescope.
To solve this problem, Nvidia's researchers recently discovered that the human eye can be "deceived" by increasing the contrast of the out-of-focus image while its resolution is reduced.
In the future, Nvidia hopes this discovery will lead mainstream virtual reality device manufacturers to build eye tracking into their products.