
Sony GDC eyetracking tech demo [using eyes to aim inside Infamous Second Son]

Wow, that sounds really good. I had a Canon 50E SLR with eye-controlled AF point selection through the optical viewfinder. That was back in ~1995, almost 20 years ago. It actually worked quite well for me, but it only covered 3 focus points, so of course nothing high-res.

The later model EOS 3 had 45 selectable AF points:

"The EOS-3 inherited a refined version of the Eye-Control system of the EOS-5. This system, when calibrated to a given user, allowed for picking one of the 45 points of the autofocus system simply by looking at it through the viewfinder. An infrared transmitter and receiver mounted around the eyepiece monitored the position of the iris, thus "knowing" where the photographer was looking and focusing on that point. The system has its limitations, however; notably, eyeglasses and occasionally contact lenses would confuse the system. This feature was never rolled forward to the later 1V body."

http://en.wikipedia.org/wiki/Canon_EOS-3
 

twobear

sputum-flecked apoplexy
mess at sony providing more compelling reasons to own a pseye than MS ever have with kinect

just pack it up and go home MS, you're not needed any more
 

stryke

Member
I'm not sure how this would work for consoles (apart from being integrated into VR). That camera was really securely taped down, and the guy had to sit very still. That's fine sitting at a desk, but consoles are much more randomly placed in a bedroom or living space. I very much doubt the PS4 Camera has the accuracy to do this, so it would require yet another peripheral to work with a TV.

I can't see it going into the VR headset (at least not in the PS4 version) if they want to make the thing remotely affordable.

I don't think it's intended for couch use. Looks like it requires a special sensor too.
 
It would be interesting to see how the eye tracking could work in something like a competitive FPS.

Imagine playing Counterstrike using this to aim.
 
How many enjoyed the Infamous footage more

StudentRaisingHand.gif
 

DemonNite

Member
But how? Not sarcasm, I'm really trying to wrap my head around how this could do anything inside a VR headset.

I am superman

I turn my head to see you above me

I burn you up with my laser eyes without having to keep turning my head (to keep you in the center of the display)
 

DieH@rd

Banned
But how? Not sarcasm, I'm really trying to wrap my head around how this could do anything inside a VR headset.

It could not only change the rendering of the game [which would be invisible to the user], but could also change how gameplay elements are made:

- a pretty girl smiling when you look her in the eyes
- an NPC getting annoyed if he notices you looking at his face scar too much
- a crazy Nazi torturer seeing which of your NPC friends you do NOT want to die
- out of 3 mannequins in the ruined asylum, only the ones you are not looking at open their eyes
- using your Superman eye lasers to burn enemies
- marking things with your eyes [HUD]
- solving puzzles with your eye gaze


As DemonNite said, with eye tracking, "center of the screen" can finally be decoupled from being the main way to select or interact with things.
 

StuBurns

Banned
Eventually we'll have eye tracking with laser retina projection, and a VR headset makes perfect sense for that, but it's a very long way off.
 

KOHIPEET

Member
It could not only change the rendering of the game [which would be invisible to the user], but could also change how gameplay elements are made:

- a pretty girl smiling when you look her in the eyes
- an NPC getting annoyed if he notices you looking at his face scar too much
- a crazy Nazi torturer seeing which of your NPC friends you do NOT want to die
- out of 3 mannequins in the ruined asylum, only the ones you are not looking at open their eyes
- using your Superman eye lasers to burn enemies
- marking things with your eyes [HUD]
- solving puzzles with your eye gaze


As DemonNite said, with eye tracking, "center of the screen" can finally be decoupled from being the main way to select or interact with things.

Homer-simpson-thinking.gif


Thanks. I understand now.

Too much Homer.
 

syko de4d

Member
But how? Not sarcasm, I'm really trying to wrap my head around how this could do anything inside a VR headset.

Control the UI with eye tracking. Want to change your headgear? Open the inventory, look at your new helmet, press and hold X, look at your old helmet, and release X, and the helmets get swapped. Or writing on a virtual keyboard: look at a letter, press X, look at the next letter, press X, etc. With some training it could be as fast as touchscreen typing. It would be your own mouse pointer, with 100% accuracy even for people without aiming skills :D

Interaction with NPCs or other players online would be even more like real life. Or in a Skyrim-like VR game you could throw a fireball at enemies just by looking at them; it would feel like being a real mage. And there's plenty more, like some neat graphics tricks you can pull off when you know where people are looking.

But it will take some years until eye tracking gets small enough to put inside an HMD.
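That "look, hold X, look, release X" swap boils down to a tiny state machine. A minimal sketch in Python (all names hypothetical, with the eye tracker feeding in whichever item the gaze currently rests on):

```python
class GazeSwap:
    """Hypothetical gaze + button item swap: press X to grab what the
    eye rests on, release X to swap it with what the eye rests on now."""

    def __init__(self):
        self.grabbed = None

    def press_x(self, hovered_item):
        # Grab whatever the eye tracker says we are looking at.
        self.grabbed = hovered_item

    def release_x(self, hovered_item, inventory):
        # Swap the grabbed slot with the slot under the gaze, if both exist.
        if self.grabbed is not None and hovered_item is not None:
            i = inventory.index(self.grabbed)
            j = inventory.index(hovered_item)
            inventory[i], inventory[j] = inventory[j], inventory[i]
        self.grabbed = None
        return inventory
```

The same pattern covers the virtual keyboard: each X press just commits whatever key the gaze is resting on.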
 

DemonNite

Member
Mmmm..

homer-thinking-o.gif


I'm beginning to comprehend.

I am Gabriel Logan

I am in stealth mode avoiding enemies

I do not see them in real life (via gaze tracking) so they do not appear on my map

This is true line of sight rather than assuming the player saw that enemy because the main camera did
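That "true line of sight" check amounts to testing whether an enemy falls inside a narrow cone around the gaze ray, instead of inside the whole camera frustum. A rough Python sketch, assuming a made-up 10-degree foveal cone, a unit-length gaze direction, and no occlusion testing:

```python
import math

def gaze_sees(eye_pos, gaze_dir, target_pos, cone_deg=10.0):
    """True if the target falls inside a narrow cone around the gaze ray.
    The 10-degree cone is an assumed figure; occlusion is ignored."""
    delta = [t - e for t, e in zip(target_pos, eye_pos)]
    dist = math.sqrt(sum(c * c for c in delta))
    if dist == 0:
        return True
    # Dot product of unit vectors = cosine of the angle between them.
    cos_angle = sum(g * c for g, c in zip(gaze_dir, delta)) / dist
    return cos_angle >= math.cos(math.radians(cone_deg))

def update_minimap(eye_pos, gaze_dir, enemies, revealed):
    """Only enemies the player's gaze actually crossed get marked."""
    for name, pos in enemies.items():
        if gaze_sees(eye_pos, gaze_dir, pos):
            revealed.add(name)
    return revealed
```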
 

Thorgal

Member
This could also have uses outside of gaming.

Cameras, for example.

You know how many times you've seen something and wished you'd taken a picture of it, but it happened so fast there was no time to aim the camera? Imagine this tech miniaturized to fit on your glasses, continuously taking pictures of everything your eyes are drawn to. No matter how fast it happened, if your eyes saw it, there will be a picture of it.
 
Having very fast eye tracking integrated into Morpheus could allow not only new ways of controlling games and giving games information about the user's state [a horror game that knows where you are looking could be so scary], but could also let devs create brand new engines.

I can imagine a new Fatal Frame game taking advantage of this technology. The game would track your eyeballs and use them to aim the Camera Obscura at the ghosts coming for you. Sounds like one heck of a scary experience to me!
 
I am Gabriel Logan

I am in stealth mode avoiding enemies

I do not see them in real life (via gaze tracking) so they do not appear on my map

This is true line of sight rather than assuming the player saw that enemy because the main camera did

Do not do this to me man.
 
I think depth of field and focus control will probably benefit immersion and presence more than aiming and shooting will; aiming with your eyes is almost like needing to turn your head to aim, back when early VR stuff just mapped mouse control to headset tracking.
 
I didn't believe it would work before I watched the video.

I watched the video.

I am a believer. It will only get better. What times we live in, it's madness!
 
It could not only change the rendering of the game [which would be invisible to the user], but could also change how gameplay elements are made:

- a pretty girl smiling when you look her in the eyes
- an NPC getting annoyed if he notices you looking at his face scar too much
- a crazy Nazi torturer seeing which of your NPC friends you do NOT want to die
- out of 3 mannequins in the ruined asylum, only the ones you are not looking at open their eyes
- using your Superman eye lasers to burn enemies
- marking things with your eyes [HUD]
- solving puzzles with your eye gaze


As DemonNite said, with eye tracking, "center of the screen" can finally be decoupled from being the main way to select or interact with things.

Developers ....DEVELOPERS!!!!!!!!! listen to this man :)
 

dose

Member
As impressive as the video is, when I'm playing I don't always focus on what I'm shooting at; sometimes I'm glancing at scenery, other characters/objects, etc. I'm sure I'm not the only one ;) Doesn't the Infamous example become limiting by making you constantly look where you want to shoot?
 

StuBurns

Banned
As impressive as the video is, when I'm playing I don't always focus on what I'm shooting at; sometimes I'm glancing at scenery, other characters/objects, etc. I'm sure I'm not the only one ;) Doesn't the Infamous example become limiting by making you constantly look where you want to shoot?
The eye tracking is just the reticle; you can look at whatever you want. When in combat, your reticle would ideally be on enemies anyway: it's not the time to be sightseeing.
 
If they can get eye tracking in Morpheus and it works regardless of the game as a right-analog replacement, that's a game changer, bigger in my view than VR itself.
 

stryke

Member
As impressive as the video is, when I'm playing I don't always focus on what I'm shooting at; sometimes I'm glancing at scenery, other characters/objects, etc. I'm sure I'm not the only one ;) Doesn't the Infamous example become limiting by making you constantly look where you want to shoot?

.....They just demoed him looking at the Space Needle?
 

stiktis

Member
The eye tracking is just the reticle; you can look at whatever you want. When in combat, your reticle would ideally be on enemies anyway: it's not the time to be sightseeing.

Hm... But what happens when I check my health bar/mana/ammo? Or while shooting at one guy I quickly scan the area for the next possible threat? I am sure there are workarounds I just can't think of right now...
 

vpance

Member
It could not only change the rendering of the game [which would be invisible to the user], but could also change how gameplay elements are made:

- a pretty girl smiling when you look her in the eyes
- an NPC getting annoyed if he notices you looking at his face scar too much
- a crazy Nazi torturer seeing which of your NPC friends you do NOT want to die
- out of 3 mannequins in the ruined asylum, only the ones you are not looking at open their eyes
- using your Superman eye lasers to burn enemies
- marking things with your eyes [HUD]
- solving puzzles with your eye gaze


As DemonNite said, with eye tracking, "center of the screen" can finally be decoupled from being the main way to select or interact with things.

lol I love those examples.
 

PJV3

Member
Hm... But what happens when I check my health bar/mana/ammo? Or while shooting at one guy I quickly scan the area for the next possible threat? I am sure there are workarounds I just can't think of right now...

Maybe it can detect if you are looking at HUD elements etc, I'm not sure how precise it is, but if you're able to pick out targets it must be pretty good.
 

StuBurns

Banned
Hm... But what happens when I check my health bar/mana/ammo? Or while shooting at one guy I quickly scan the area for the next possible threat? I am sure there are workarounds I just can't think of right now...
The game should be able to tell you're trying to view the HUD, not the geometry currently under it, and leave the reticle where you left it so your eye can pick it back up when you look at the simulation again.
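One way that disambiguation might work is a simple hit test: route the gaze point through the HUD rectangles first, and only let it move the reticle when it misses all of them. A Python sketch under those assumptions (all names hypothetical):

```python
def route_gaze(gaze_xy, hud_rects, reticle):
    """Hypothetical gaze router: if the gaze point lands inside a HUD
    rectangle, freeze the combat reticle where it was; otherwise the
    reticle follows the eye."""
    x, y = gaze_xy
    for name, (left, top, right, bottom) in hud_rects.items():
        if left <= x <= right and top <= y <= bottom:
            return reticle, name      # looking at HUD: reticle unchanged
    return gaze_xy, None              # looking at the world: reticle follows
```

Glancing at a health bar would then report which element was read without yanking the reticle off the enemy.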
 

Zaptruder

Banned
This would be fantastic in VR. Among other reasons:

- multiplayer emotivity: combining gaze tracking with full-body motion gives players a great deal of emotive quality. If they also do facial expression tracking, then you have a system that can track a person's body language, conveying much of what matters in face-to-face communication virtually.

- reactive UI: hide UI elements until a player looks for them. Simple, but it upgrades the immersiveness of the overall presentation substantially; it puts them into a space that isn't cluttered by elements that force a person into the mindset of being in a game. Example: you see the PM icon up in the top right of your screen? Imagine looking at it and having it expand to show you the headline contents. Combine that with a context-driven voice command ('open') and it'll open the title you're reading.

So... this shit is next-level future stuff, shortening the gap between desire, intention and action dramatically.

OTOH, gaze tracking is some bullshit for aiming and camera control, unless of course they give the player a button they can hold down to activate it, rather than just having it active all the time. It makes more sense for eye lasers than hand fireballs, though.
 
Having very fast eye tracking integrated into Morpheus could allow not only new ways of controlling games and giving games information about the user's state [a horror game that knows where you are looking could be so scary], but could also let devs create brand new engines.

Our eyes only read high-detail data over a very small area. Half the screen could be rendered at half detail, and the rest with far less.
Basically LOD to the extreme. It could be a roundabout way of increasing performance for VR games without actually increasing the performance of the hardware. But it would be trippy being an outside observer watching Morpheus gameplay on an external display. It would be like watching those movie effects where the audience sees through the vision of another character or animal. Peripheral vision is extremely out of focus.
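That falloff can be sketched as a few resolution tiers keyed to angular distance from the gaze point. A toy Python version, where the 5°/15° boundaries and the pixels-per-degree figure are all assumed numbers, not anything from the demo:

```python
import math

def render_scale(pixel_xy, gaze_xy, ppd=20.0):
    """Hypothetical foveated-rendering falloff: full resolution inside
    ~5 degrees of the gaze point, half out to ~15 degrees, a quarter
    beyond. ppd = assumed pixels per degree of the display."""
    dx = pixel_xy[0] - gaze_xy[0]
    dy = pixel_xy[1] - gaze_xy[1]
    ecc_deg = math.hypot(dx, dy) / ppd   # rough angular eccentricity
    if ecc_deg <= 5.0:
        return 1.0                        # fovea: full detail
    if ecc_deg <= 15.0:
        return 0.5                        # near periphery: half detail
    return 0.25                           # far periphery: far less
```

A renderer would use this scale to pick the shading rate or render-target resolution per region, which is why the external-display view would look so strange.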
 
This would be fantastic in VR.
I think the best use for it in VR would be to simulate focus - at the moment everything is in focus in VR, and you get the depth perception from your eyes converging or diverging, but they don't ever change their focus. In VR, if you look at an object up close, you can feel your gaze converging (going cross-eyed), and if you look past the object to something in the distance, your eyes diverge again. But if you close one eye, you'll see that both objects were always in focus, only that the closer one is more displaced relative to the other view.

(You can feel this convergence if you hold up a finger close to your face and stare at it with both eyes, and then look past it to something in the distance.)

But in reality, you can close one eye and look at your finger, and the stuff in the distance goes out of focus, and then look at the stuff in the distance and your finger goes out of focus. This currently doesn't happen in VR, only the two-eye convergence stuff does. But if it could track your eyes, it could simulate this with subtle depth of field effects.

This might feel really weird, however, because your eyes would still not be physically changing focus: they are still locked to the horizon, where the screen remains constantly in focus. It would also require extremely precise and responsive tracking.
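The effect being described is gaze-contingent depth of field: read the scene depth under the gaze point, then blur each pixel by how far its depth differs from that. A toy Python sketch (the formula and "aperture" strength are illustrative assumptions, loosely modelled on circle-of-confusion size, not a real renderer's math):

```python
def dof_blur(depth_at_gaze, pixel_depth, aperture=2.0):
    """Hypothetical gaze-driven depth-of-field: blur grows with the
    depth difference between a pixel and the fixated depth, scaled by
    an arbitrary 'aperture' strength."""
    if depth_at_gaze <= 0 or pixel_depth <= 0:
        return 0.0
    # Relative defocus: zero at the fixated depth, growing with distance.
    return aperture * abs(pixel_depth - depth_at_gaze) / pixel_depth
```

Fixating a finger at 0.3 m would leave the finger sharp while the horizon blurs, and shifting the gaze to the horizon would invert that, which is exactly the one-eye focus behaviour described above.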
 

vpance

Member
Basically LOD to the extreme. It could be a roundabout way of increasing performance for VR games without actually increasing the performance of the hardware.

It's a logical and efficient way. Waiting for 16K panels and 10 TFLOP GPUs to go mainstream will be fun, but foveated rendering can get us much more impressive results in the meantime.
 

Raoh

Member
I posted this video in another forum a while back but sadly no one seemed to care, glad to see the technology moving forward.

http://www.youtube.com/watch?v=AztH_YVQN-k&feature=share&list=PL13420013BB0F0538&index=21

This was built using components from the PS3's PSEye.


The EyeWriter is low-cost, eye-tracking glasses & free, open-source software that allows writers and artists with paralysis to draw and communicate using only their eyes.

In April 2009, The Ebeling Group, GRL, openFrameworks, and F.A.T. Lab collaborated with Tony "Tempt" Quan, a legendary urban artist and ALS sufferer, to make the first prototype of the EyeWriter.

Check out www.eyewriter.org or www.notimpossiblefoundation.org for more information.
 

rjinaz

Member
Fire, fire... FIRE!!!!!!!!

Devices like these always fell short. They seem so cool, and the idea behind them always seemed so cool, but then you get them home, quickly realize it isn't what you expected or wanted, and it gets put on a shelf somewhere, never to be touched again.

But now we're actually living in the age where the concepts these devices were trying to mimic are actually possible. I'm excited about video games again, much like I was as a kid in the late '80s and '90s. I can't wait to shoot down fighter jets with my eyes!
 

Zaptruder

Banned
I think the best use for it in VR would be to simulate focus - at the moment everything is in focus in VR, and you get the depth perception from your eyes converging or diverging, but they don't ever change their focus. In VR, if you look at an object up close, you can feel your gaze converging (going cross-eyed), and if you look past the object to something in the distance, your eyes diverge again. But if you close one eye, you'll see that both objects were always in focus, only that the closer one is more displaced relative to the other view.

(You can feel this convergence if you hold up a finger close to your face and stare at it with both eyes, and then look past it to something in the distance.)

But in reality, you can close one eye and look at your finger, and the stuff in the distance goes out of focus, and then look at the stuff in the distance and your finger goes out of focus. This currently doesn't happen in VR, only the two-eye convergence stuff does. But if it could track your eyes, it could simulate this with subtle depth of field effects.

This might feel really weird, however, because your eyes would still not be physically changing focus: they are still locked to the horizon, where the screen remains constantly in focus. It would also require extremely precise and responsive tracking.

That'd be pretty neat, but you're right, the actual muscles that contract the lens wouldn't shift at all.

Also, it wouldn't work with only a single eye either (whereas it does in reality).

But this idea kinda falls into the realm of foveated rendering: it's closely related and likely to happen as a matter of course in doing foveated rendering right (i.e. using a Z-depth mask to determine what should be shown at higher resolution in the foveated area).
 