
CES Oculus Rift Reactions


NinjaFridge

Unconfirmed Member
I'd love to try this out. The reactions sound incredible, but it's something you would need to try yourself.

Sony or MS will attempt their own, flawed copycat versions. Like they did with Kinect and Move.

Sony already has that HMZ headset. Adding Rift-like tech to it would seem like a natural progression.
 

Mr.Green

Member
This is priceless.

Originally, this was supposed to be a straight review of Sony Corp.’s updated Personal 3D Viewer. Then things changed.

I went hands-on with the original model for the National Post last year, describing it as “a head-mounted visor that makes the user look like the futuristic lovechild of a Battlestar Galactica Centurion and Geordi La Forge.”

You can connect it to your PC or PlayStation 3, and use it to play games and watch movies in 3D.

The new model, the HMZ-T2, is lighter, and you can now use your own pair of headphones (before, you were stuck with the ones Sony built in). But the general head-mounted visor idea remains the same.

What differs is the competition. And its name is Oculus Rift.
 

3phemeral

Member
I imagine that people will get addicted to this, become acclimated to the world controls, then become incredibly depressed when they can't press forward to travel in reality.

Rates of videogame death will increase exponentially. Just release an Endless Ocean type of game centered around James Cameron's Pandora.

 
Nope. Stereoscopy is just one of many things that have to be done for Rift integration, and even for that NV's proprietary stuff isn't really usable for anyone else.

Yeah, the biggest hurdle, it seems, will be decoupling the camera from movement/aiming, like tank controls with a swivel view. That's hard to do since view and control logic like that is usually hardcoded into most engines.
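
To make it concrete, here's a rough Python sketch of what that decoupling amounts to, just the math with no engine specifics; the variable names and conventions are made up for illustration, not from any actual SDK:

```python
import math

def view_and_move_vectors(body_yaw, head_yaw, head_pitch):
    """Return (view_dir, move_dir) with head look decoupled from movement.

    body_yaw:   character facing from the stick/mouse, in radians
    head_yaw:   yaw reported by the HMD tracker, relative to the body
    head_pitch: pitch reported by the HMD tracker
    """
    # The camera uses body orientation *plus* the tracked head orientation...
    view_yaw = body_yaw + head_yaw
    view_dir = (math.cos(head_pitch) * math.sin(view_yaw),
                math.sin(head_pitch),
                math.cos(head_pitch) * math.cos(view_yaw))
    # ...while movement (and "tank style" aiming) follows only the body,
    # so looking around doesn't steer the character.
    move_dir = (math.sin(body_yaw), 0.0, math.cos(body_yaw))
    return view_dir, move_dir
```

In most engines those two directions are literally the same variable, which is why bolting this on after the fact is painful.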
 
For anyone who owns Sony's TV headset: are there any side effects from staring at a screen 3 inches away from your face for hours at a time?

Nah, I was fine even after a few hours. However, the unit's weight was the real problem. My face was destroyed after like 15 minutes. Really painful.
 
I have plans to buy the consumer version, but I have one question:

Can you output a non-stereoscopic image to a monitor while retaining the 3D image in the head-mounted display?

I ask because I plan to livestream or record gameplay with this thing, and I want to make sure people can perceive the image without 3D head mounts or glasses.
 

beast786

Member
Sony already has that HMZ headset. Adding Rift-like tech to it would seem like a natural progression.

I own the HMZ and it is very different from the Rift. They are honestly not comparable at all. One is just a regular screen in front of you, giving the impression that you are watching a bigger TV, while the other is VR with motion for games only.

Apples and oranges
 
So this is just VR but for home?


As someone who lived through and wasn't impressed with VR in the 90s, what's the big deal? Honest question.


ed


Racing games could benefit but I'm not seeing much else.
 
Did I post in that thread? I don't recall.



Lemme do that if I didn't.



ed



Eh. The megathread isn't active and this one is talking about implementation/hands-on impressions.
 

Sinatar

Official GAF Bottom Feeder
I'm just picturing playing something truly horrifying like Amnesia with this and soiling my computer chair...


Day 0.
 

jarosh

Member
So this is just VR but for home?


As someone who lived through and wasn't impressed with VR in the 90s, what's the big deal? Honest question.


ed


Racing games could benefit but I'm not seeing much else.

You're on the internet, you're in *this* thread. There's a video in the OP and additional links to interviews on the first page. Why not actually *watch* and *read* first before hitting the reply button?
 

syko de4d

Member
I'm just picturing playing something truly horrifying like Amnesia with this and soiling my computer chair...


Day 0.

Imagine you play something like Amnesia with the headset on and someone in real life touches you on the shoulder while playing.

I would jump out of the chair screaming like a girl :D
 
Theoretically, with games on a 2D plane, they could just disable the motion tracking and this thing would work like a high-FOV Sony HMZ, right?
 

Piggus

Member
Man, racing sims with a wheel and the Rift. It'd be pants-shittingly real.

I already have a cockpit and a nice wheel, so I'm just waiting on the Rift for dat complete experience. All you'd be missing is the g-forces.

Sony absolutely needs to support this thing, if just for GT6. I mean the various PC sims are cool but driving nothing but race cars gets a little boring.
 

pestul

Member
So this is just VR but for home?


As someone who lived through and wasn't impressed with VR in the 90s, what's the big deal? Honest question.


ed


Racing games could benefit but I'm not seeing much else.
Just about any 3D game world you can immerse yourself in will benefit from this.
 
I own the HMZ and it is very different from the Rift. They are honestly not comparable at all. One is just a regular screen in front of you, giving the impression that you are watching a bigger TV, while the other is VR with motion for games only.

Apples and oranges
If they adjust the optics, add the relevant motion tracking, and optimise for low latency, they could start working with SCE to bring a VR headset very similar to the Rift to the PS4. The HMZ isn't designed with all this in mind, so the experience is very different. But they have a starting point. The Rift prototype was made with all off-the-shelf components and bags of enthusiasm from one dude. I'm sure Sony R&D could knock something up pretty quickly if they see the potential.
 

msv

Member
I own the HMZ and it is very different from the Rift. They are honestly not comparable at all. One is just a regular screen in front of you, giving the impression that you are watching a bigger TV, while the other is VR with motion for games only.

Apples and oranges
I wouldn't say it's 'VR with motion for games only'. It can do everything the HMZ does and then some (except for the resolution and OLED tech, of course). Headtracking purely adds to the features; it doesn't change the fact that it's still an HMD.

Headtracking and latency are where I'm guessing the biggest difference is going to be. I haven't tried it, but it seems very uncomfortable to have an 'immovable' screen in front of your eyes. I'd think watching movies with the Oculus, on the other hand, on a virtual theater screen that stays fixed in its virtual position (as in, you can look away from it), would make a huge difference.
 

TheExodu5

Banned
As someone who lived through and wasn't impressed with VR in the 90s, what's the big deal? Honest question.

Watch Carmack's QuakeCon panel. This is not like VR from the 90s. VR from the 90s was terrible. This is a whole other beast.

I wouldn't say it's 'VR with motion for games only'. It can do everything the HMZ does and then some (except for the resolution and OLED tech, of course). Headtracking purely adds to the features; it doesn't change the fact that it's still an HMD.

Well, it's an HMD with warped optics, so while it may work as an HMD, the image will be distorted in one way or another, even with a filter in place to reverse the lens's fisheye effect. Also, such a large FOV would not work too well without head tracking, since you wouldn't be able to comfortably see the corners of the image. In the end, you have a product that is really designed with VR in mind, not movie watching.
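
That kind of filter is typically a radial warp applied to the texture coordinates in a post-process. A toy Python sketch of the general shape of it (the coefficients here are made-up placeholders, not actual Rift values):

```python
def prewarp_texcoord(x, y, k1=0.22, k2=0.24):
    """Radially scale a texture coordinate in lens-centered space (-1..1).

    The rendered frame gets warped so that the lens's own distortion
    roughly cancels it out. k1 and k2 are placeholder coefficients.
    """
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# Points near the center are barely touched; points near the edge move a lot,
# which is also where any leftover distortion is most visible.
print(prewarp_texcoord(0.1, 0.0))   # ~(0.100, 0.0)
print(prewarp_texcoord(0.9, 0.0))   # ~(1.202, 0.0)
```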
 
I wouldn't say it's 'VR with motion for games only'. It can do everything the HMZ does and then some (except for the resolution and OLED tech, of course). Headtracking purely adds to the features; it doesn't change the fact that it's still an HMD.

Headtracking and latency are where I'm guessing the biggest difference is going to be. I haven't tried it, but it seems very uncomfortable to have an 'immovable' screen in front of your eyes. I'd think watching movies with the Oculus, on the other hand, on a virtual theater screen that stays fixed in its virtual position (as in, you can look away from it), would make a huge difference.

Imagine if there was a program that took your video files and rendered them in a virtual movie theater. You could simulate having a home theater.
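
The geometry side of that would be pretty trivial. A rough numpy sketch of the idea (sizes and conventions invented for illustration): the screen is just a quad fixed in world space, and only the camera follows the head tracker:

```python
import numpy as np

# The "theater screen" is a quad parked at a fixed spot in world space,
# four metres in front of the default seating position.
SCREEN_CORNERS = np.array([
    [-2.0,  1.0, -4.0],
    [ 2.0,  1.0, -4.0],
    [ 2.0, -1.0, -4.0],
    [-2.0, -1.0, -4.0],
])

def yaw_rotation(theta):
    """Standard rotation matrix about the Y (up) axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def screen_in_view_space(head_yaw):
    """Transform the fixed screen corners into the head-tracked camera's view space.

    Only the camera rotates (driven by the tracker), so turning your head
    moves the screen out of view, just like a real theater screen would.
    """
    # Row-vector form of applying the inverse of the camera's yaw to each corner.
    return SCREEN_CORNERS @ yaw_rotation(head_yaw)

print(screen_in_view_space(0.0)[0])              # ~[-2, 1, -4]: corner ahead of you
print(screen_in_view_space(np.radians(90))[0])   # ~[ 4, 1, -2]: swung off to the side
```

From there it's just drawing a video frame onto that quad every frame.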
 
Definitely seems like it's doing what it's supposed to. We're closer to full virtual reality, which is where we'll end up in 50 years (at the consumer level). Gotta see it myself to see if there are any critical drawbacks, like with autostereoscopic 3D or most motion controller tech today, before I get on board. People tend to really oversell these gimmicks in articles.

The one problem no one seems to have really tackled yet is focus, or depth of focus to be more precise. While this tech simulates eye position and movement coupled with visual cues to make the 3D immersion better than it likely has ever been before, there's still the issue that you're looking at a screen on which all objects are constantly in focus. Real eyesight allows one to selectively focus on objects based on physical distance, ocular convergence, and ocular lens/light ray convergence. With this tech, you're still looking at a flat (though slightly curved) surface, and thus all objects on screen are still at the same physical distance relative to your eyes (though again, mostly diminished by the brain working out the difference in viewpoints and ocular distance to extrapolate perceived distance). You can see some of the consequences of focus manipulation in tilt-shift photography. So while the Rift will be generally acceptable and even spectacular with just movement through environments, the effect will likely greatly diminish when faced with more complex up-close scenes/objects with varying distances layered on top of one another. Such a problem, though, is far more important in cinema, which deals mostly with closeups.

Half the equation is already solved on the digital side since a virtual engine can easily be controlled to refocus on any particular object, and on the analog side there's plenoptic imaging/light field photography which is slowly gaining traction and allows for refocusing independent of the time and state of imaging. But in both arenas, there's no way for the user to control the focus naturally, relying entirely on manual adjustment. I expect we'll need to invent some kind of ocular/retinal tracking in order to simulate this in a more immersive way, but that seems immensely impractical. We'll likely need an entirely new display paradigm, either volumetric projection or, much much further down the line, direct neural stimulation, both of which will likely come with their own unique issues to solve.

Unless I'm interpreting something wrong in all of this *shrug*. In the end, though, with this device I would say they've got 85% of the problems of reproducing natural human vision solved.
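
On the "digital side is easy" point: if you had the eye tracking, the refocusing itself is a small step in a depth-of-field pass. A toy Python sketch (the formula is a simplified stand-in, and the eye tracker is hypothetical):

```python
def blur_amounts(object_distances, focus_distance, aperture=0.35):
    """Toy depth-of-field: relative blur for each object, given where a
    (hypothetical) eye tracker says the player is looking.

    Simplified circle-of-confusion: blur grows with the gap between an
    object's distance and the focus distance, scaled by its distance.
    """
    return {name: aperture * abs(d - focus_distance) / d
            for name, d in object_distances.items()}

# Pretend the tracker reports the player fixating on a doorway 6 m away.
scene = {"hand": 0.5, "desk": 1.5, "doorway": 6.0, "mountains": 800.0}
print(blur_amounts(scene, focus_distance=scene["doorway"]))
# doorway -> 0.0 (sharp); everything else gets some blur, most of all the near hand.
```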
 
Oculus Rift is like the geek version of the Sybian. I remember seeing women riding the Sybian for the first time and being transported to another world. Oculus Rift is the same thing for the tech/geek set.

The porn/adult toy industry needs another revelation like the Sybian.
 

Quasar

Member
I have to admit I wonder about this, given the low-resolution display being up so close to the eyes.

Is this a matter of it being a first-step Kickstarter project, or an issue of making better displays at a reasonably affordable price?
 

Sinatar

Official GAF Bottom Feeder
Oculus Rift is like the geek version of the Sybian. I remember seeing women riding the Sybian for the first time and being transported to another world. Oculus Rift is the same thing for the tech/geek set.

The porn/adult toy industry needs another revelation like the Sybian.

So how long until Oculus Rift POV porn starts getting made?
 
Someone should experiment with combining this and the next version of Kinect (if that's a thing). Oculus Rift for looking around, a pedal or something at your feet for walking, Kinect 2.0 tracking your hands. Maybe throw in some voice controls too!
 

Kinitari

Black Canada Mafia
The one problem no one seems to have really tackled yet is focus, or depth of focus to be more precise. While this tech simulates eye position and movement coupled with visual cues to make the 3D immersion better than it likely has ever been before, there's still the issue that you're looking at a screen on which all objects are constantly in focus. Real eyesight allows one to selectively focus on objects based on physical distance, ocular convergence, and ocular lens/light ray convergence. With this tech, you're still looking at a flat (though slightly curved) surface, and thus all objects on screen are still at the same physical distance relative to your eyes (though again, mostly diminished by the brain working out the difference in viewpoints and ocular distance to extrapolate perceived distance). You can see some of the consequences of focus manipulation in tilt-shift photography. So while the Rift will be generally acceptable and even spectacular with just movement through environments, the effect will likely greatly diminish when faced with more complex up-close scenes/objects with varying distances layered on top of one another. Such a problem, though, is far more important in cinema, which deals mostly with closeups.

Half the equation is already solved on the digital side since a virtual engine can easily be controlled to refocus on any particular object, and on the analog side there's plenoptic imaging/light field photography which is slowly gaining traction and allows for refocusing independent of the time and state of imaging. But in both arenas, there's no way for the user to control the focus naturally, relying entirely on manual adjustment. I expect we'll need to invent some kind of ocular/retinal tracking in order to simulate this in a more immersive way, but that seems immensely impractical. We'll likely need an entirely new display paradigm, either volumetric projection or, much much further down the line, direct neural stimulation, both of which will likely come with their own unique issues to solve.

Unless I'm interpreting something wrong in all of this *shrug*

Retinal tracking is pretty interesting; a few attempts have been made, and some are soon to be commercialized, a la the positional gaze tracking Microsoft has been working on. I've heard a few things about using retinal biometrics for security as well, at the consumer level.

I imagine these fields will see some convergence.
 

Kurdel

Banned
I have to admit I wonder about this, given the low-resolution display being up so close to the eyes.

Is this a matter of it being a first-step Kickstarter project, or an issue of making better displays at a reasonably affordable price?


From the Verge:

But the immersion trumps all, even despite the Rift's relatively low resolution.

I am sure in a few years we will see pricier, higher-PPI displays.
 
Did the guy presenting say that the dev kit is going to be different from what is seen in the video? I thought I heard him say the display is faster in the dev kit.
 

DieH@rd

Banned
Did the guy presenting say that the dev kit is going to be different from what is seen in the video? I thought I heard him say the display is faster in the dev kit.

Yes, they are using the old display in all presentations. The March devkit will have a nice plastic body and a better 7" display.
 