
Eye tracking for VR: A game changer

Why we need it:

* Foveated Rendering
To achieve presence, high resolution and a wide field of view are necessary.
But our eyes resolve fine detail only in a tiny portion of the view (about 2°).
Foveated rendering dramatically reduces the rendering load, making 16K VR easily feasible.
(e.g. a 100x rendering cost reduction at 70° FOV)
[Image: foveated rendering speedup graph. Source: Microsoft Research]
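To make the saving concrete, here is a minimal sketch (my own illustration, not from the Microsoft paper; the falloff constants are invented): keep full resolution inside the ~2° fovea and drop it off with angular distance from the tracked gaze point.

```cpp
#include <algorithm>

// Hypothetical illustration: pick a render-resolution scale for a screen
// region from its angular distance (eccentricity) to the tracked gaze point.
// Full detail inside the ~2 degree fovea; acuity falls off roughly
// hyperbolically outside it. The floor keeps the far periphery renderable.
float resolutionScale(float eccentricityDeg)
{
    const float fovealRadiusDeg = 2.0f;        // the ~2 degrees of sharp vision cited above
    if (eccentricityDeg <= fovealRadiusDeg)
        return 1.0f;                           // native resolution at the gaze point
    float scale = fovealRadiusDeg / eccentricityDeg;
    return std::max(scale, 0.05f);             // never drop below 5% resolution
}
```

Since pixel count scales with the square of this factor, integrating such a falloff over a wide field of view leaves only a few percent of the pixels a uniform image would need, which is roughly where figures like a 100x reduction come from.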

* Eye-relative motion blur
While our eyes have limited temporal resolution, temporal aliasing artifacts are visible even at 10,000 Hz (e.g. strobing, the wagon-wheel effect, chronostasis, ...).
Eye-relative motion blur removes temporal aliasing without decreasing real detail, making 120 Hz indistinguishable from the real world.
Motion blur vs wagon wheel effect: https://www.youtube.com/watch?v=iXg_7Ckv_io
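As a rough sketch of what "eye-relative" means (my own construction, assuming a tracker that reports gaze velocity in screen space): blur each object by its velocity relative to the eye, so a smoothly pursued object stays sharp while the background smears, just as on the retina.

```cpp
struct Vec2 { float x, y; };

// Sketch: per-object blur vector computed relative to the tracked gaze motion.
// An object the eye is pursuing (object velocity == gaze velocity) gets zero
// blur; a stationary background gets blurred by the full pursuit speed.
Vec2 eyeRelativeBlur(Vec2 objectVelPx,   // object's screen-space velocity, px/s
                     Vec2 gazeVelPx,     // gaze velocity from the eye tracker, px/s
                     float exposureSec)  // simulated shutter, e.g. 1.0f / 120.0f
{
    return { (objectVelPx.x - gazeVelPx.x) * exposureSec,
             (objectVelPx.y - gazeVelPx.y) * exposureSec };
}
```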

* Depth of field modulation (with virtual retinal display)
Disparity between scene depth and focus depth causes discomfort and can reduce presence (for very close objects).
Eye tracking combined with virtual retinal displays removes this disparity.
Virtual retinal displays also eliminate the need for glasses and remove the screen-door effect.
(light field rendering isn't necessary)
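A minimal sketch of how the focus depth could be driven (assuming a linear depth buffer and a gaze point in pixel coordinates; the smoothing rate is a made-up tuning value):

```cpp
#include <cmath>

// Sketch: read the scene depth under the gaze point and ease the focus plane
// towards it. The smoothed value would drive a depth-of-field effect or, with
// a varifocal/retinal display, the physical focus distance. Smoothing avoids
// focus "popping" during saccades.
float updateFocusDepth(const float* linearDepth, int width,
                       int gazeX, int gazeY,
                       float currentFocus, float dtSec)
{
    float target = linearDepth[gazeY * width + gazeX]; // metres at the gaze point
    const float rate = 12.0f;                          // hypothetical smoothing rate, 1/s
    float blend = 1.0f - std::exp(-rate * dtSec);      // framerate-independent ease
    return currentFocus + (target - currentFocus) * blend;
}
```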

* New interactivity
The game can now react to your gaze, opening up new possibilities.
Examples: NPC reactions (smiles, annoyance, etc.), new types of user interface and control.
Infamous Second Son demo: https://www.youtube.com/watch?v=kKYr9MaZw3I&hd=1
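To illustrate the interactivity side (purely hypothetical, not how the Second Son demo works): test whether the gaze ray hits an NPC and let a short dwell time trigger a reaction.

```cpp
// Hypothetical gaze-reaction logic: an NPC notices being stared at after a
// short dwell time and reacts, e.g. with a smile or annoyance animation.
struct Npc {
    float gazedSec = 0.0f;   // how long the player's gaze has rested on this NPC
    bool reacting = false;
};

void updateNpcGaze(Npc& npc, bool gazeRayHitsNpc, float dtSec)
{
    if (gazeRayHitsNpc)
        npc.gazedSec += dtSec;
    else
        npc.gazedSec = 0.0f;                  // reset when the player looks away

    if (!npc.reacting && npc.gazedSec > 1.5f) // 1.5 s dwell threshold (tuning value)
        npc.reacting = true;                  // trigger the reaction animation here
}
```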

What we need:
The eye has two principal modes of motion: smooth pursuit at up to 30°/s and saccades at up to 800°/s.
The goal is to have a blurry screen during big saccades (saccadic masking) and a sharp screen during smooth pursuit.
With foveated rendering, any mismatch between the real gaze point and the rendering gaze point provides enough blur, so tracking only needs to be fast enough for smooth pursuit.

Latency and inaccuracy decrease the effectiveness for input, but everything else still works reasonably well (e.g. the 2° full-detail region must grow by about 4° for each frame of latency @ 120 Hz).
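The 4° figure is simple arithmetic; a quick check (using a ~500°/s saccade, which is within the range quoted above):

```cpp
#include <cstdio>

// Back-of-the-envelope check: how far can the gaze move during one frame of
// tracking latency at 120 Hz? The full-detail region must be padded by this
// much per frame of latency so the fovea never lands on a low-detail area.
int main()
{
    const float frameSec   = 1.0f / 120.0f; // one frame of latency at 120 Hz
    const float pursuitDps = 30.0f;         // smooth pursuit, deg/s (from above)
    const float saccadeDps = 500.0f;        // mid-range saccade speed, deg/s

    std::printf("pursuit drift per frame: %.2f deg\n", pursuitDps * frameSec); // ~0.25
    std::printf("saccade drift per frame: %.2f deg\n", saccadeDps * frameSec); // ~4.17
    return 0;
}
```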

The SensoMotoric eye tracking system used in the Infamous: Second Son demo is more than sufficient (it even enables saccade tracking), with:
0.03° resolution
0.4° accuracy
500 Hz sampling rate
4 ms latency (half a frame @ 120 Hz)
Link: http://www.smivision.com/en/gaze-and-eye-tracking-systems/products/red-red250-red-500.html

Now to fit it inside a head-mounted display...

Update 1: A working solution
 

Man

Member
Surely to be part of the PS5 VR set. Main reason for Sony coming out early with that GDC eye-tracking demo.
 

Somnid

Member
You need a game to change first. At this rate VR is going to be in beta for the next decade; they don't need more hardware distractions, they need software and real cultural feedback. This is 5 years off at least.
 

vpance

Member
Surely to be part of the PS5 VR set. Main reason for Sony coming out early with that GDC eye-tracking demo.

No, there's no point demoing that if it won't show up till PS5. They're clearly evaluating it for Morpheus. A Sony engineer already said they could miniaturize the tech.
 

Man

Member
No, there's no point demoing that if it won't show up till PS5. They're clearly evaluating it for Morpheus. A Sony engineer already said they could miniaturize the tech.
1) It will make the Morpheus too expensive.
2) The PS4 can comfortably handle its 1080p resolution.
3) Figuring out foveated rendering with current game engines will take R&D and time.
4) Screen resolution will outpace silicon hardware performance upgrades in the coming years.

It's all about picking your battles and this is one steeeeep hill to climb.
 

Nafai1123

Banned
If they manage to fit this into Morpheus it would be a true game changer. I personally think it will probably be included in PS5, which will have VR by default.
 
You need a game to change first. At this rate VR is going to be in beta for the next decade; they don't need more hardware distractions, they need software and real cultural feedback. This is 5 years off at least.

Sure. You're an expert at launching new consumer electronics products.

There are several real hurdles to overcome for VR to deliver on the promise. At least for Oculus Rift, the fear is about releasing a sub-optimal product and then being dismissed as another gimmick and losing the opportunity to iterate before people move on.
 
I know at least 3 people with heterotropia... how does that work for them?
Well, eye tracking works for each eye separately. I think adjusting camera position and rotation to simulate the effect of corrective glasses shouldn't be too hard, but I'm not sure I completely understand the problems with heterotropia.
 

Man

Member
This will be key in releasing a standalone Virtual Reality headset, as the hardware inside really doesn't have to be all that.

The PlayStation 5 will be a Virtual Reality headset. No external box.
 

vpance

Member
1) It will make the Morpheus too expensive.
2) The PS4 can comfortably handle its 1080p resolution.
3) Figuring out foveated rendering with current game engines will take R&D and time.
4) Screen resolution will outpace silicon hardware performance upgrades in the coming years.

It's all about picking your battles and this is one steeeeep hill to climb.

How expensive is it, really? I'm not sure how that makes Magic Labs look if they're wasting their time playing with unfeasible technologies for PS4. People are stringing together IR, webcams and open source software to make writing/drawing with your eyes possible. As far as I can see, the HMD integration is the main challenge.
 

Platy

Member
Well, eye tracking works for each eye separately. I think adjusting camera position and rotation to simulate the effect of corrective glasses shouldn't be too hard, but I'm not sure I completely understand the problems with heterotropia.

I am thinking of options that turn off tracking for one eye, but even this doesn't sound like a perfect answer.
 

quetz67

Banned
This will be key in releasing a standalone Virtual Reality headset, as the hardware inside really doesn't have to be all that.

The PlayStation 5 will be a Virtual Reality headset. No external box.

Why does eye tracking matter for where the images are rendered?
 

Orayn

Member
Can't wait for someone to skim the OP and loudly complain about motion blur, not realizing that the lack of it is a big component of why we need high framerates in the first place.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Yes. I've often thought about it but I struggle to think how a rendering engine could be constructed to take advantage of this by only rendering a sharp image at the center and then progressively reducing complexity and rendering cost towards the edges in a seamless and smooth way.

A very tough nut to crack.
 

Mindlog

Member
And then right after that GVS :]
It's all coming together nicely. Pretty amazing to see how far the field has come in just a few years, and the promising stuff right around the corner.
 
Yes. I've often thought about it but I struggle to think how a rendering engine could be constructed to take advantage of this by only rendering a sharp image at the center and then progressively reducing complexity and rendering cost towards the edges in a seamless and smooth way.

A very tough nut to crack.

That's what I've always thought about this tech, seems like it'd be a nightmare from a QA perspective. I don't know jack about programming or engine development but I'd think you'd need some highly specialized algorithms to pull off something like this, as you said, seamlessly. We're probably a ways off from seeing something like this ready for prime time.
 
Can't wait for someone to skim the OP and loudly complain about motion blur, not realizing that the lack of it is a big component of why we need high framerates in the first place.
The problem is false motion blur. For example, during smooth pursuit the tracked moving object should be sharp and the background blurred, not the other way around.
Now our eyes have trouble focusing on the blurred moving object, causing eye strain. (Hence we need "eye-relative" motion blur.)

For objects rotating on screen (e.g. wagon wheel) you don't need eye tracking because the amount of eye rotation around the visual axis is generally negligible.
 

Nafai1123

Banned
Would it theoretically be possible to apply the same kind of LOD that anisotropic filtering uses? Instead of LOD being determined by the distance from the player, it would be determined by the focal point of the player. I realize that AF doesn't give a huge performance gain at this point, but maybe the same concept could be applied to other visual effects.
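A minimal sketch of that idea (my own, with invented constants, not from any shipping engine): add a gaze-dependent bias on top of the usual distance-based mip selection.

```cpp
#include <algorithm>
#include <cmath>

// Sketch: extra texture mip bias from angular distance to the gaze point,
// applied on top of normal distance/anisotropy-based LOD. Zero bias inside
// the fovea, roughly one coarser mip per doubling of eccentricity, capped.
float gazeMipBias(float eccentricityDeg)
{
    const float fovealRadiusDeg = 2.0f;  // full texture detail where the eye resolves it
    if (eccentricityDeg <= fovealRadiusDeg)
        return 0.0f;
    float bias = std::log2(eccentricityDeg / fovealRadiusDeg);
    return std::min(bias, 4.0f);         // cap at +4 mips so the periphery isn't mush
}
```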
 

Man

Member
I wouldn't be surprised if Sony has set aside resources to become first to market with this. The GDC demo + Morpheus demos nearby should alert most devs to what path they are on.

From SensoMotoric Instruments website: http://www.smivision.com/en/gaze-an...s]=653&cHash=6e3dab75caf4be1818fac2788a969492
Christian Villwock, Director of SMI's OEM Solutions Business: "It's exciting to see SMI eye tracking used in the Infamous: Second Son PS4 game and how natural it feels for the gamer. We are proud that global players like Sony rely on our long-term experience in eye tracking technology. We have listened to their requirements over the past couple of years and our next generation platform, the RED-n, outperforms earlier eye tracking solutions in all relevant dimensions while meeting mass market cost and form factor need."
 
Will eye tracking give the 3D effect more depth?
For depth perception we need depth cues.

Motion parallax was added with DK2/Morpheus (positional tracking). Depth from motion will be in CV1/PS4VR if it isn't already.
Accommodation is still missing. You could do it with eye tracking and virtual retinal displays. But you could also do light-field rendering.

Here is a list of depth cues:

Works with classic 3D:
Stereopsis; Convergence
Perspective, Relative size, Familiar size, Elevation, Occlusion

Works, depends on engine:
Shadow Stereopsis
Retinal images with no parallax disparity but with different shadows are fused stereoscopically, imparting depth perception to the imaged scene
Kinetic depth effect
If a stationary rigid figure (for example, a wire cube) is placed in front of a point source of light so that its shadow falls on a translucent screen, an observer on the other side of the screen will see a two-dimensional pattern of lines. But if the cube rotates, the visual system will extract the necessary information for perception of the third dimension from the movements of the lines, and a cube is seen.
Aerial perspective
Due to light scattering by the atmosphere, objects that are a great distance away have lower luminance contrast and lower color saturation.
Texture gradient
Fine details on nearby objects can be seen clearly, whereas such details are not visible on faraway objects.
Lighting and shading
The way that light falls on an object and reflects off its surfaces, and the shadows that are cast by objects provide an effective cue for the brain to determine the shape of objects and their position in space.
Curvilinear perspective (distortion shader)
At the outer extremes of the visual field, parallel lines become curved, as in a photo taken through a fisheye lens.

Needs high framerate, low persistence:
Depth from motion
When an object moves toward the observer, the retinal projection of an object expands over a period of time.

Needs positional tracking:
Motion parallax
When an observer moves, the apparent relative motion of several stationary objects against a background gives hints about their relative distance.

Needs depth modulation:
Accommodation
 
Depth of field control is the single biggest thing I find lacking in current 3D/VR implementations, and if someone manages to get this right (for movies OR games) it'll be huge for me. As it is, I find my eyes unnaturally locking to a specific focal plane in order to mimic the movie's focal plane, and it feels awkward at best.
 

Desty

Banned
You could put $100 into adding some kind of eye tracking, but why not just put that $100 into a faster GPU and render everything at higher fidelity?
 

Nikodemos

Member
You could put $100 into adding some kind of eye tracking, but why not just put that $100 into a faster GPU and render everything at higher fidelity?
Because the $100 worth of eye tracking has a better cost-benefit ratio than the $100 of video card, evidently.
 

vpance

Member
Eye tracking will be going in Morpheus, no doubt. If they have to delay it to end of 2015 or 2016 due to cost or implementation, it will be worth it. Unless they're willing to split their VR user base in two with a Morpheus v2 down the line, which would be monstrously stupid and make software support of the feature even less likely.
 

Man

Member
Wow, I didn't know this. I thought this was farther away. I could see this being ready in 2015.
Yoshida saying 2014 is out of the question for Morpheus was initially frustrating to me but if they manage to fit this in...

The Oculus & Morpheus are already considered a revolution. But if they manage to include foveated rendering in addition it will be a Revolution x Revolution... *mindmelt*

It's funny because in theory... Sony could release the talked-about VR glasses for PS4 this year and it would be amazing, but... they could just release eye-gazing VR glasses for PS3 or Wii U and they would *destroy* the PS4's graphics in every way.
Foveated rendering could really deliver 'Avatar-like' graphics today if combined with VR.
 

Binabik15

Member
I followed a link to this thread and I have to say it's pretty amazing. It touches on a lot of things I half thought about when thinking about VR displays, and then a ton more. It's a bit frustrating that I get the anatomical parts (those boring ophthalmology lessons we were tortured with had a payoff, I can't believe it, and on GAF of all places) but not the software-side techniques and the mathematical models you need to work out the solutions, though.

Now any VR unit I buy will likely disappoint, haha, but you have to start somewhere.

And I hope they upgrade the surgery machines/microscopes with this tech before I get to use them; I hate eye strain but am very susceptible to it.
 

Durante

Member
Eye tracking is a game changer, the only question is when. I think we'll see it in a consumer HMD product at the very earliest in 2016.

And even more so than the switch to regular VR from screens, you'll need to get devkits out very early -- changing your engine to effectively perform foveated rendering can potentially be a lot more work than "just" integrating rendering for "normal" VR.
 

Coolwhip

Banned
How does eye movement work in VR anyway? When I look around, a big part of the movement of my field of vision comes from eye movement, not head movement. How does that work when you have 2 tiny screens in front of your eyes?
 

gofreak

GAF's Bob Woodward
How does eye movement work in VR anyway? When I look around, a big part of the movement of my field of vision comes from eye movement, not head movement. How does that work when you have 2 tiny screens in front of your eyes?

The ideal is that the field of view of the headset is large enough to allow you to move your eyes left<->right, up<->down, and the extremes of your own field of view are accommodated by the headset's. The optics manipulate the view to make these small little screens appear to envelop your field of view - or as much of it as possible. In reality, at the extremes of one's eye movement you may see a little black at the periphery. The current FOVs quoted for OR and Morpheus may not be enough to avoid that completely.
 

Durante

Member
The ideal is that the field of view of the headset is large enough to allow you to move your eyes left<->right, up<->down, and the extremes of your own field of view are accommodated by the headset's. The optics manipulate the view to make these small little screens appear to envelop your field of view - or as much of it as possible. In reality, at the extremes of one's eye movement you may see a little black at the periphery. The current FOVs quoted for OR and Morpheus may not be enough to avoid that completely.
They are not enough, not even close. However, you do pretty quickly adjust to moving mostly your head instead of your eyes. Actually, I wouldn't even call it "adjusting", it's more subconscious than that.
 

quetz67

Banned
The ideal is that the field of view of the headset is large enough to allow you to move your eyes left<->right, up<->down, and the extremes of your own field of view are accommodated by the headset's. The optics manipulate the view to make these small little screens appear to envelop your field of view - or as much of it as possible. In reality, at the extremes of one's eye movement you may see a little black at the periphery. The current FOVs quoted for OR and Morpheus may not be enough to avoid that completely.

But that problem can't be fixed by eye tracking, unless you move the screen (and the image at the same time) with the eye movement.

Actually it is just for performance reasons. I don't really understand why that would be dependent on where the image is rendered, but I assume it is about being able to render the images in the device because of the reduced performance needed.
 
Variations on the temporal upscaling tech from Shadow Fall's multiplayer could be useful here for cases where your eyes move faster than can be accounted for.

But seeing as this tech relies on enormously high-resolution screens, I think it will be a while before it happens.
 

gofreak

GAF's Bob Woodward
But that problem can't be fixed by eye tracking, unless you move the screen (and the image at the same time) with the eye movement.
Correct.

Actually it is just for performance reasons. I don't really understand why that would be dependent on where the image is rendered, but I assume it is about being able to render the images in the device because of the reduced performance needed.

I think the idea of foveated rendering and the win for performance is that you can reduce per-pixel quality in the parts of the screen the eye is not focussed on, and thus either bank the framerate gain or increase per-pixel complexity where the eye is focussed. Or selectively anti-alias only parts of the image, or decrease resolution away from the eye's target and increase it where it is targeted. Etc. etc.

However I don't know how close we are to that being viable in consumer devices. The latency on the tracking would need to be very low indeed for that application.

It may not need to be so low for control input, though - and that might be worthy enough of an application to pursue first, on its own.
 
I'm excited about the new interactive features this could allow as well as the performance increase.

Imagine reading a book (let's say a horror book), and the edges of your view darken as you read a tense scene, or the word blood gets subtly highlighted a brownish red as you read it (like house being highlighted blue in House of Leaves). The sound of a knife slashing something plays as you read about someone being murdered.

Hnnnng
 

Tuxmascot

Neo Member
I think this is a great idea; however, I feel that the VR industry should focus on making an excellent VR device first. Maybe make a few games for it too.

Once we get the device out and a majority of its flaws ironed out, I would welcome such a development.

I think the community should have as few distractions as possible right now, especially at such a crucial time as this.
 
Imagine reading a book (let's say a horror book), and the edges of your view darken as you read a tense scene, or the word blood gets subtly highlighted a brownish red as you read it (like house being highlighted blue in House of Leaves).

Sounds terrible and distracting. Messing with text that people are reading is rarely a good idea. It's actually a really good way to break immersion and take someone out of the moment. You see something like that, you stop reading and pay more attention to the color of the words than the prose.

Besides, are people going to read a book with an HMD on for some colored text and sound gimmicks?
 

Marc

Member
I'm excited about the new interactive features this could allow as well as the performance increase.

Imagine reading a book (let's say a horror book), and the edges of your view darken as you read a tense scene, or the word blood gets subtly highlighted a brownish red as you read it (like house being highlighted blue in House of Leaves). The sound of a knife slashing something plays as you read about someone being murdered.

Hnnnng

Great ideas! This changes every medium going; it's going to be impossible to get people away from their headsets. They will have to put dual cameras in and do AR, imagine making everything look cooler. You get an avatar of whomever you want to be like, as do others, and the experience is shared.

Jesus, crazy times ahead.
 