
GDC Expo: hands on impressions/media of Project Morpheus (Sony VR)

Speaking of the Kotaku piece, I think this quote from Shu probably explains why they didn't have a Driveclub demo available yet

"We had a demo for a driving game with the tech, and that was really great when I drove slowly," Yoshida said. "I could view the scenery. It's no longer racing. It's driving, driving through scenery. But when I put on the gas and started driving [fast], it was too much." He said that "many people" who tried it felt ill.

Need to get that latency down before showing it off. Don't wanna make your conference attendees puke.
 

gofreak

GAF's Bob Woodward
CVG article - with an interesting little bit of info about the current prototype:

http://www.computerandvideogames.com/455003/features/project-morpheus-ten-key-questions-answered/

A Sony R&D tech told CVG on the show floor that the Morpheus - at least presently - can receive an external video source via HDMI. But, we were informed, any video source not specifically designed for Morpheus VR will simply be displayed on the Morpheus as a 2D plane in front of the user, and you move your head to look around the flat view. There'll also be no stereoscopy.

It's unclear at this point if universal HDMI-in support is a feature only present in the prototype for testing and development, or if it's planned for the final retail version, although we suspect the latter.

Hopefully that will feature in the consumer unit too.
 
More resolution is always going to be preferable theoretically, but I take it to mean you're saying why not use that additional performance just for super sampling or whatever, and yeah, that's certainly a fair point.
Sure, and I'm not disputing that, but at the same time, "moar graffx" is always going to be preferable, theoretically. That's why we get all of the lol@ps2gfx posts whenever PSVR comes up. There's lots of stuff you can do to make the picture prettier apart from increasing the resolution: better AA, more polys, better lighting, etc. All of that stuff will improve the look of your world and, by extension, increase believability. They also all (AFAIK) effectively come at the cost of resolution, by which I mean, if you double your pixel count, you may no longer have time to compute the improved AA and stuff.
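To put that tradeoff in rough numbers (a sketch with illustrative figures, not real hardware specs): at a fixed frame rate, doubling the pixel count halves the time the GPU can spend shading each pixel, time that could otherwise go to better AA, lighting, and so on.

```python
# Rough per-pixel frame budget: doubling pixel count halves the time
# available per pixel at the same frame rate. Illustrative only.
def per_pixel_budget_ns(pixels, fps):
    """Nanoseconds of frame time available per pixel."""
    return (1.0 / fps) / pixels * 1e9

print(f"1 MP @ 75 Hz: {per_pixel_budget_ns(1_000_000, 75):.2f} ns/pixel")
print(f"2 MP @ 75 Hz: {per_pixel_budget_ns(2_000_000, 75):.2f} ns/pixel")
```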

I think the GTA thing hits at the very center of the debate really. GTA might seem like the perfect VR game to some people, it presents a huge world, lovingly crafted. But in reality, it would be the very worst choice for VR.
I wasn't really cheerleading for GTA:VR in particular; it was just recognizable and easy to type. lol It does bring up an interesting point though.

They're right about games needing to be designed from the outset for VR. Now, everyone seems to be assuming that means saying goodbye to all the franchises that we know and love, but that's not necessarily the case. In GTA you assume the role of a street punk who hits it big in the underworld (I assume). While you're likely correct about GTA5 making for a terrible VR game, that doesn't mean there can't be a great GTA VR game. I'd argue that if R* makes a game where you become a punk with dreams of taking over the mob, then that's GTA, whether it's a GTA5-style game, or a VR experience that makes you literally feel like you're the punk. In fact, the series has already undergone a similar re-imagining, when it transitioned from top-down to what we have now.

Speaking of best practices though, I think MMOs may become really huge in VR.
How's that for a segue?
So we're supposed to rethink the way we design our games. Let's try that for a while.

The name of the game is presence. We want the player to literally feel like they've been transported into the world we've created for them. We've already established some rules for establishing and maintaining that effect. Head motion is law. Latency is king. Sound matters. These are mostly just technical issues though. What about the game design itself? What can we do there to strengthen presence and minimize immersion breaks?

I think one thing most gamers will agree can kill the mood in an instant is shitty NPC interaction. At best, they're crappy robots. At worst, they're little more than buttons. Since they can't think like a real person, they can't respond like a real person, which is a huge limiter to immersion. In GTA, when you're captured, beaten, and dragged up before the Don who explains how he's going to kill you, as a player, you laugh and say, "Yeah, yeah. Blow me, bitch."

Clearly, that player doesn't feel present at that moment. So how do we avoid it? The obvious solution would be to remove all of the NPCs, but we can't have every game be about the player wandering around alone in the world either. So why not populate the world with other players? Imagine you're playing GTA:VR and you get dragged before the Don, but this Don is a real person who built himself up EVE-style to king-like status in the game. When he gets up in your face, are you still gonna laugh at him, or are you gonna be the one whose knees are about to buckle?

At his speech on designing good VR experiences, Palmer was saying simply hanging out with other avatars — even if you're not really doing anything — can be a very compelling experience in VR. Hanging out with other people and not really doing anything? I don't know about you, but that sounds like Home to me. :p

If managed correctly, I think MMOs and stuff like Home could become pretty huge, because shared experiences are more powerful, since they give you external reinforcement. Being able to glance over at your buddy, Bob, even if he's a big, purple ogre, causes you to subconsciously accept what's going on around you. Yup, must be real. Bob the Ogre is here, and he clearly sees it too. Imagine you're walking home late one night and you see a flying saucer flit by overhead. If you were alone, you'd be inclined to disbelieve it, but if Bob saw it too, then shit just got real.

Media Molecule told us they wanted to let us record our dreams. Magic Lab introduces us to Morpheus, god of dreams. Imagine a set of tools from MM that allowed you to create your own little pocket universe, all built with a pair of wands like Anton in those multi-touch demos. Okay, I need a block of wood here… but it needs to be this big… *puts hands out to stretch block to size* Still not much of a workbench though… *pulls out light saber and carves the block down a bit* You know, I think we should put some mountains here… When you've created your domain, invite your friends over to explore and rule over it with you. Or explore domains others have created. Or jump over into the Star Wars MMO.

I think stuff like this is what Sony may have been hinting at when they said that "social" was going to be an important aspect of VR.
I am not Agent Phil.

Sony are in a great position to limit what is supported in VR, and to provide a set of expectations for what things should and shouldn't be in VR. Oculus are trying hard to do that too; Palmer has openly talked about needing VR-exclusive games, and not retrofitting, but Rift is on PC, and people are going to do it anyway, no matter what Oculus suggest.
/shrug Both sides have their merits. Yes, being inside a curated ecosystem means you get more consistent quality, but being out in the wilds means you get a little more variety and sometimes see some really interesting shit.


No worries about the misunderstanding.
:)

Okay, here's where I'm coming from. Let's say that DK2 is good enough to evoke presence. Abrash says it should be — based on his experiments with his prototype — and people are reporting it with both DK2 and Morpheus. So Sony baselines VR on PS4 at 1MP and 75 Hz, which is a nice balance between blur reduction and performance demands. Oculus are looking to baseline at 90 Hz — a 20% increase in performance demand — and 2MP, doubling the already increased demands.

So let's say Ubi release some big VR game for PS4 and PC. They tune the game to look nice on PS4 and run great. Let's say on the PC side of things, the PS4 would be equivalent to Medium settings, with most effects off. But on the PC, thanks to the high specs for the Rift, you'd need a GPU 2.4x as powerful as the PS4's to run those same Medium settings. I took a look, and the Titan is almost exactly 2.4x as powerful as the PS4. It's also more than $1000 on Amazon. Which do you think will impress potential customers more: running the game on Medium with minimal effects, just like the $350 PS4 except at 2MP on a $1000 GPU, or running it on Ultra with all effects at 1MP?
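The 2.4x figure checks out as straight pixel-throughput arithmetic, assuming the 1MP/75Hz and 2MP/90Hz numbers discussed above:

```python
# Ratio of raw pixel throughput between the assumed Oculus target
# (2 MP @ 90 Hz) and the assumed Sony baseline (1 MP @ 75 Hz).
ps4_rate = 1_000_000 * 75    # pixels per second, Sony baseline
rift_rate = 2_000_000 * 90   # pixels per second, Oculus target

print(rift_rate / ps4_rate)  # → 2.4
```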

And what about the poor schmuck who actually has a PS4-level PC rather than a Titan? He'll need to turn his settings down to Asstastic, if he's even able to run the game at all. All just because they doubled the resolution of the display. That's what I mean by leaving customers out in the cold. Is it not better to have the same resolution with equal or better graphics and reaching a broader range of customers than have only a tiny percentage of your market be able to manage even equal graphics, much less better?

The Steam numbers indicate powerful GPUs aren't terribly common. You dismiss them and assure me the average PC is really quite powerful and getting faster all the time, but that's not much of a rebuttal. :p You cite flawed methodology and imply that for some reason Valve may want to hide just how powerful the average PC is, but you present me with no alternate source of data beyond, "Trust me; there's lots."

If Valve do know how secretly powerful the average PC is, maybe we should look to them for recommendations for a minute. If you read Abrash's blog, while he does talk about 8MP at 1000 Hz and stuff like that, he also says none of that will happen in his lifetime, so it makes more sense to discuss what can reasonably be achieved over the next year or two, taking into account not only the ability to source displays, but also the ability to drive them with reasonable performance and IQ. The reference design he presented was exactly that: a design potent enough to evoke presence in most users, yet with performance demands in reach of the average gaming rig, all for a device releasing in 2015. Would Abrash agree that 2MP panels are superior to 1MP? Without question, but he doesn't think it's a reasonable design goal for a device releasing next year.

If Oculus set 1MP as their baseline performance, they can offer PS4-level VR to users with PS4-level boxen, and they can offer PS5-level VR to users with PS5-level boxen. And they can maybe even offer PS3-level VR to users with PS3-level boxen. If they go to 2MP, they can only offer PS4-level VR to guys with Titans, and the guys with 7850s will get a gimped experience if they get anything at all.

See what I'm saying? I'm not saying they're not allowed to be better than PS4. I'm saying that visuals will already be taking a noticeable hit to support VR, and at least today, bumping those settings back up a few notches may be more noticeable than bumping resolution, and it's a preferable solution since it's scalable, while a resolution bump effectively raises the performance floor substantially.

Why not release a 1MP display in 2015 as Abrash recommends, build up a sizable and vocal community of supporters, and then release a 2MP model in 2017 for the Titan++ guys who were already running Ultra on Rift 1? It just seems like they're jumping the gun here. I know Oculus have "the best minds in VR" now, but is one of them Tim Taylor or something? Why the rush to double the minimum system requirements? PC will be able to show its superiority at any given resolution, so why set the bar for performance so high?


Don't forget that PC/OR isn't locked at max spec: you can tune it to your liking within whatever ceiling you have available. That's the infamous strength of the platform. You are not forced to render at 1440p/90Hz (or 4K/120Hz, whatever comes in the near future). If you have less powerful hardware, turn down the details and run it at 1080p/60Hz, then upgrade later for proper presence.
I don't think upscaling is a particularly good idea for VR, especially if you're expecting everyone with less than a Titan to do it.


I disagree with this. This is different than other mediums. The experience is absolutely crucial. If there are other features in a competitor's headset, they can market it all they want as an advantage over the other. However, if that experience is flawed there is a serious risk of disorientation and sickness. On top of that, the sensation of presence is lost and the unique experience of VR is lost.
Why do you assume Sony's experience would be flawed? By all accounts it's quite nice, and it ticks all of Abrash's checkboxes for presence.

This isn't an either/or proposition. Sony are perfectly capable of delivering a solid experience visually, and then strengthening that experience with things that pull you still further into the simulation, like realistic audio, solid motion-based interaction, and strong community ties.
 
This isn't an either/or proposition. Sony are perfectly capable of delivering a solid experience visually, and then strengthening that experience with things that pull you still further into the simulation, like realistic audio, solid motion-based interaction, and strong community ties.

If you think of all the experience Sony has with TV, audio, gaming and whatnot, they might be the best candidate to bring it all together for successful VR.
 

FleetFeet

Member
It's an alluring project name.

Honestly, it would be best if they kept that name... just replace Project with Playstation and we're good to go. I was truly stunned that they actually had a great name, I was not expecting that whatsoever, they gotta ride that wave.

If you think of all the experience Sony has with TV, audio, gaming and whatnot, they might be the best candidate to bring it all together for successful VR.

Someone posted this link to a binaural audio demo yesterday... I was truly blown away by this, especially considering this can be done with ANY stereo headphones. I would shut my eyes, and I would have the weirdest feeling having this matchbox shake around my head like it is really there... Sony isn't kidding when they mean audio will seal the deal.

You have to use headphones...

https://www.youtube.com/watch?v=HprNPCRyP40
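For the curious, one of the main cues a binaural recording captures is the interaural time difference: sound reaches the far ear slightly later than the near one, and the brain uses that delay to place the source. A toy sketch using Woodworth's classic approximation (head radius and speed of sound are assumed textbook values, nothing measured from the demo itself):

```python
import math

# Woodworth's approximation of interaural time difference (ITD),
# one of the cues binaural recordings capture.
HEAD_RADIUS_M = 0.0875   # assumed average head radius
SPEED_OF_SOUND = 343.0   # m/s in air

def itd_seconds(azimuth_deg):
    """Extra travel time to the far ear for a source at the given azimuth
    (0 = straight ahead, 90 = directly to one side)."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + math.sin(theta))

for az in (0, 45, 90):
    print(f"{az:3d} deg -> {itd_seconds(az) * 1e6:6.0f} microseconds")
```

Even the largest delay is under a millisecond, which is why the effect survives any ordinary pair of stereo headphones.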
 

Mindlog

Member
Would Abrash agree that 2MP panels are superior to 1MP? Without question, but he doesn't think it's a reasonable design goal for a device releasing next year.
Which interview can this be sourced to?
I'd love to read/watch it. I'm really interested if those estimates are based on the consistent attempts to tie PS4/PC power curves or to actual screen availability. For what it's worth I'd prefer even Morpheus to go with simpler graphics at the best reasonably priced screen available. That's a hard sell to their 'core' audience that drooled over NBA 2K14, but complexity isn't as key to the greater number. The sense of scale and lack of illness will be more important than Harden's sweat.
Someone posted this link to a binaural audio demo yesterday... I was truly blown away by this, especially considering this can be done with ANY stereo headphones. I would shut my eyes, and I would have the weirdest feeling having this matchbox shake around my head like it is really there... Sony isn't kidding when they mean audio will seal the deal.

You have to use headphones...
I need to rehost some of the stuff I have on Youtube. It's pretty great. TrueAudio from AMD should also be part of what the PS4 is using. Here's a teaser from Lichdom.

It's nice to finally get back to pushing tech we had at the start of the century before Creative came along and killed it.
 

Man

Member
Someone posted this link to a binaural audio demo yesterday... I was truly blown away by this, especially considering this can be done with ANY stereo headphones. I would shut my eyes, and I would have the weirdest feeling having this matchbox shake around my head like it is really there... Sony isn't kidding when they mean audio will seal the deal.

You have to use headphones...

https://www.youtube.com/watch?v=HprNPCRyP40
Okay that is really impressive (using iPhone earpods).
 

Salex_

Member
Someone posted this link to a binaural audio demo yesterday... I was truly blown away by this, especially considering this can be done with ANY stereo headphones. I would shut my eyes, and I would have the weirdest feeling having this matchbox shake around my head like it is really there... Sony isn't kidding when they mean audio will seal the deal.

You have to use headphones...

https://www.youtube.com/watch?v=HprNPCRyP40

Holy shit! I was smiling throughout the video. Having that kind of audio with VR would be amazing.

EDIT: Found this video using more examples, this is crazy. https://www.youtube.com/watch?v=uzFswCpJPqg
 

Kalren

Member
Honestly, it would be best if they kept that name... just replace Project with Playstation and we're good to go. I was truly stunned that they actually had a great name, I was not expecting that whatsoever, they gotta ride that wave.



Someone posted this link to a binaural audio demo yesterday... I was truly blown away by this, especially considering this can be done with ANY stereo headphones. I would shut my eyes, and I would have the weirdest feeling having this matchbox shake around my head like it is really there... Sony isn't kidding when they mean audio will seal the deal.

You have to use headphones...

https://www.youtube.com/watch?v=HprNPCRyP40

AMD introduced their TrueAudio hardware solution last year. The PS4 has this.

GAF Thread

There are some worthwhile demos linked in the thread.
 

FleetFeet

Member
I need to rehost some of the stuff I have on Youtube. It's pretty great. TrueAudio from AMD should also be part of what the PS4 is using. Here's a teaser from Lichdom.

It's nice to finally get back to pushing tech we had at the start of the century before Creative came along and killed it.

The great thing is that it benefits all games no matter the medium, but it seems like it will really push the envelope in conjunction with VR. I'm honestly disheartened that this technology has been around for so long and has been so neglected... imagine where we'd be if it was the industry standard today. And holy shit at how dirty Creative was... my god. That is some dastardly shit...

Okay that is really impressive (using iPhone earpods).

Holy shit! I was smiling throughout the video. Having that kind of audio with VR would be amazing.

EDIT: Found this video using more examples, this is crazy. https://www.youtube.com/watch?v=uzFswCpJPqg

Yeah I was not expecting that type of feeling in the least bit. It was beyond surreal! It is going to be something else when all the components to proper VR are met, it is going to be massive, and not just for gaming.

Edit:
AMD introduced their TrueAudio hardware solution last year. The PS4 has this.

GAF Thread

There are some worthwhile demos linked in the thread.

I was hoping that this was what AMD and Sony had collaborated on... that's good to know!
 

chubigans

y'all should be ashamed
did they release the video for the Conference?

since i saw some clips used in the FOX10 News video, maybe they did release the video? but i couldn't find it anywhere, and i guess the project name did half the marketing for sony hehe



FOX10 News - Sony's Project Morpheus
https://www.youtube.com/watch?v=4jtcvp-JfYY

Ah nice find! Huh, maybe Sony just released media to the AP and news sources? hmmm
 
Audio is something that completely escaped my mind until Sony brought it up at GDC. Imagine playing a multiplayer game where the voice of a teammate is actually coming from their in-game character. Really cool possibilities there.
 

vdo

Member
Looking at the device it seems that the screen is not that far away from your eyes. How close is it? A couple of inches?

Does that mean you would have to, for example, be able to read a magazine at the same distance from your eyes as this screen for it not to be blurry? Maybe more people will need to wear glasses while wearing this than I first thought, if they have any farsightedness, unless I am misunderstanding how this works.
 
Looking at the device it seems that the screen is not that far away from your eyes. How close is it? A couple of inches?

Does that mean you would have to, for example, be able to read a magazine at the same distance from your eyes as this screen for it not to be blurry? Maybe more people will need to wear glasses while wearing this than I first thought, if they have any farsightedness, unless I am misunderstanding how this works.

The lenses help with that, plus the fact that the stereoscopic view causes your eyes to naturally focus at the distance projected by the difference in left/right eye picture. So basically it won't feel like you're focusing at a screen directly in front of you.
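The vergence cue described above can be sketched with simple trigonometry: the angle between the two eyes' lines of sight shrinks toward zero as the rendered distance grows, so distant objects feel distant even though the panel itself is inches away. The IPD here is an assumed average, not a Morpheus spec:

```python
import math

# Vergence angle between the two eyes' lines of sight for an object
# rendered at a given virtual distance. Smaller angle = feels farther.
IPD_M = 0.063  # assumed average interpupillary distance

def vergence_deg(distance_m):
    return math.degrees(2 * math.atan((IPD_M / 2) / distance_m))

for d in (0.5, 2.0, 10.0, 1000.0):
    print(f"{d:7.1f} m -> {vergence_deg(d):.3f} degrees")
```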
 

vdo

Member
The lenses help with that, plus the fact that the stereoscopic view causes your eyes to naturally focus at the distance projected by the difference in left/right eye picture. So basically it won't feel like you're focusing at a screen directly in front of you.

Ahh..ok. Thanks for the explanation.
 

Mr.Green

Member
Looking at the device it seems that the screen is not that far away from your eyes. How close is it? A couple of inches?

Does that mean you would have to, for example, be able to read a magazine at the same distance from your eyes as this screen for it not to be blurry? Maybe more people will need to wear glasses while wearing this than I first thought, if they have any farsightedness, unless I am misunderstanding how this works.

The optics (lenses) make your eyes focus at infinity. It's actually more comfortable for your eyes than to look at a monitor.
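A thin-lens sketch of why this works: put the display just inside the lens's focal length and the virtual image lands far away; at the focal plane exactly, it goes to infinity, which is where relaxed eyes focus. The focal length and display distance below are made-up illustrative numbers, not Morpheus specs:

```python
# Thin-lens equation 1/f = 1/do + 1/di: a display just inside the focal
# length produces a distant virtual image (infinity at the focal plane).
def virtual_image_mm(focal_mm, display_mm):
    """Magnitude of the virtual image distance for a display placed
    inside the lens's focal length."""
    if display_mm == focal_mm:
        return float("inf")  # display at focal plane -> image at infinity
    return abs(1.0 / (1.0 / focal_mm - 1.0 / display_mm))

print(virtual_image_mm(40, 38))  # 2 mm inside f -> image ~760 mm away
print(virtual_image_mm(40, 40))  # at the focal plane -> inf
```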
 

vdo

Member
The optics (lenses) make your eyes focus at infinity. It's actually more comfortable for your eyes than to look at a monitor.

Hmm...maybe I could use this technology for work, too :) - instead of side by side physical monitors like a lot of people do, have a few virtual monitors that size themselves into a larger view as you look towards them.
 

fasTRapid

Banned
So, please excuse my maybe dumb question, as I don't really know how far augmented reality glasses technology has come in the last few years, but would it be theoretically possible to outsource the HUD of a game to something like Google Glass, since you can wear glasses inside the Morpheus?
Is this theoretically possible?
 

Hoo-doo

Banned
Honestly, it would be best if they kept that name... just replace Project with Playstation and we're good to go. I was truly stunned that they actually had a great name, I was not expecting that whatsoever, they gotta ride that wave.



Someone posted this link to a binaural audio demo yesterday... I was truly blown away by this, especially considering this can be done with ANY stereo headphones. I would shut my eyes, and I would have the weirdest feeling having this matchbox shake around my head like it is really there... Sony isn't kidding when they mean audio will seal the deal.

You have to use headphones...

https://www.youtube.com/watch?v=HprNPCRyP40

Holy shitballs.

Was using my sennheiser headphones with this and was kind of taken aback how realistic this was. Closing my eyes I actually felt someone walk circles around me.
 
Kind of redundant. If you want to simulate a HUD on a "Google Glass", just render it in the VR.

There's already a few HUDs designed for VR that provide a seamless experience. The best one I've seen so far is from Time Rifters. The holographic style just looks so cool in 3D and works perfectly for things like weapons selection, ammo counters etc. Gotta give Dead Space props for one of the first ones like this despite not being designed for VR. They did it right.

https://www.youtube.com/watch?v=esIkG7tkLkU
 
That binaural audio stuff is pretty impressive. I imagine it will be all the more so when I can turn my head to actually see what's making the noise, and have the sound respond appropriately.


Which interview can this be sourced to?
I'd love to read/watch it. I'm really interested if those estimates are based on the consistent attempts to tie PS4/PC power curves or to actual screen availability. For what it's worth I'd prefer even Morpheus to go with simpler graphics at the best reasonably priced screen available. That's a hard sell to their 'core' audience that drooled over NBA 2K14, but complexity isn't as key to the greater number. The sense of scale and lack of illness will be more important than Harden's sweat.
He talked about it at Dev Days this year. When he presents his summary slide with what I've been referring to as the reference design, it's titled, "Feasible 2015 consumer HMD." In the section on resolution, he says that 1080p is sufficient to invoke presence, but obviously more resolution is better. He talks about how motion blur is distracting, and strobing helps to eliminate it, but that introduces flicker. To combat the flicker you have to increase the frequency. He eliminated the flickering by going to 95 Hz, but concedes it could be eliminated at lower frequencies as well. Oculus seem to have done a good job of combatting motion blur and flickering at 72-75 Hz, FWIW.

Since Abrash himself says more resolution is better — until you reach the limit of human vision — I don't know why he wouldn't have said 2MP was "feasible" for 2015, if he thought it was. Must be he thought it was infeasible.
 
D

Deleted member 22576

Unconfirmed Member
This is all so exciting. I can't wait to try VR out.
I feel like a kid. I want it so badly. I have to have it.
 

Nafai1123

Banned
New info http://www.dualshockers.com/2014/03/21/sonys-morpheus-vr-headset-for-ps4-has-three-different-processors-could-still-get-an-oled-screen/ Credit goes to Winternet for finding it, but I figured this would be the more appropriate thread since it's tech focused.

3 different processors (DSP, FRC, image correcting) are built into the headset. Undetermined if they will end up in the consumer version as they can use the DSP or GPU on the PS4 to achieve the same results

120hz motion interpolation is being used currently to decrease image persistence of LCD

Also some interesting tidbits about the LCD used

Yoshida-san also mentioned that the LCD used with Project Morpheus is an especially made panel that isn’t the common vertical scan type, but a horizontal scan type that is much faster at updating horizontally adjacent pixels with a side scrolling image, and can dramatically reduce the afterimage effect of the picture.
 

DieH@rd

Banned
New info http://www.dualshockers.com/2014/03/21/sonys-morpheus-vr-headset-for-ps4-has-three-different-processors-could-still-get-an-oled-screen/ Credit goes to Winternet for finding it, but I figured this would be the more appropriate thread since it's tech focused.

3 different processors (DSP, FRC, image correcting) are built into the headset. Undetermined if they will end up in the consumer version as they can use the DSP or GPU on the PS4 to achieve the same results

120hz motion interpolation is being used currently to decrease image persistence of LCD

Also some interesting tidbits about the LCD used


Wow, I did not expect that! So the screen is refreshing at 120 fps. That interpolation chip has to be very fast and artifact-free to make this viable.

So what is causing the reported blurring? Interpolation, or Sony not even bothering to use low-persistence mode in this first prototype [8.3ms per frame for Morpheus vs 2-3ms for DK2]?
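As a back-of-envelope illustration of why persistence matters, here is the retinal smear you would get during a head turn at the two persistence figures quoted above (the head speed and pixels-per-degree values are assumed for illustration):

```python
# Motion-blur smear from display persistence: the image sweeps across
# the retina while a frame stays lit during a head turn.
def smear_px(head_speed_deg_s, persistence_ms, px_per_deg=10.0):
    """Pixels of smear for a given head speed and frame persistence."""
    return head_speed_deg_s * (persistence_ms / 1000.0) * px_per_deg

print(smear_px(100, 8.3))  # full-persistence frame: ~8.3 px of smear
print(smear_px(100, 2.0))  # low-persistence frame:  ~2 px
```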
 

vivftp

Member
New info http://www.dualshockers.com/2014/03/21/sonys-morpheus-vr-headset-for-ps4-has-three-different-processors-could-still-get-an-oled-screen/ Credit goes to Winternet for finding it, but I figured this would be the more appropriate thread since it's tech focused.

3 different processors (DSP, FRC, image correcting) are built into the headset. Undetermined if they will end up in the consumer version as they can use the DSP or GPU on the PS4 to achieve the same results

120hz motion interpolation is being used currently to decrease image persistence of LCD

Also some interesting tidbits about the LCD used

Ahh, I was wondering if they'd use motion interpolation on the headset. I wonder if it's identical to their motionflow tech on the TVs, or something else.

IIRC, older motionflow tech would take the original signal and create brand-new complete frames between the existing ones to produce a smoother image, while newer motionflow techniques break the image up into portions so that the overall end result is smoother.
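For what it's worth, the crudest possible sketch of what a frame-rate converter (FRC) does is to synthesize an in-between frame from two source frames. Real motion interpolation estimates per-block motion vectors rather than blending pixels, so this is only a conceptual stand-in:

```python
# Naive in-between frame synthesis: a plain linear blend of two frames
# (flat lists of pixel intensities). Real FRCs use motion estimation.
def blend_frames(frame_a, frame_b, t=0.5):
    """Linear blend of two same-sized frames at interpolation factor t."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

f0 = [0, 10, 20, 30]
f1 = [10, 20, 30, 40]
print(blend_frames(f0, f1))  # midpoint frame: [5.0, 15.0, 25.0, 35.0]
```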
 

luffeN

Member

The last paragraphs are somewhat worrying?

Kotaku said:
The Morpheus is a real headset. Sony really is releasing devkits to developers so they can make VR projects for PS4. It's still not guaranteed, though, that Morpheus will ever actually come out.

Yoshida did clarify that there's no chance Morpheus will go on sale for gamers in 2014. But beyond that? Will gamers be able to use them on PS4s of their own? Yoshida chuckled. Tough question, he said.

I wasn't asking to trick him, wasn't trying to get him to spill a release date he wasn't prepared to offer. I just wanted to know how real this was.

He sounded determined and said, "we really, really, really, really want this to be a possible project."
 

Leb

Member
It's still not guaranteed, though, that Morpheus will ever actually come out.

He sounded determined and said, "we really, really, really, really want this to be a possible project."

I can really only think of a few things that would prevent PM from coming out with this generation:

1.) Console users generally have very little tolerance for/patience with experiences which aren't straightforward and polished. Modern VR is still a very new, very experimental technology and Sony may be concerned that they won't be able to effectively manage expectations for what will, inevitably, remain a decidedly imperfect technology for some time to come.

2.) Sony feels they can't deliver a device of sufficiently high quality at a price point that will guarantee the degree of adoption necessary for the VR ecosystem to flourish.

In this respect, Oculus may have it easier as PC users are generally willing to put up with a lot more bullshit and are willing to spend a lot more money on things which don't always work perfectly.
 
Honestly, it would be best if they kept that name... just replace Project with Playstation and we're good to go. I was truly stunned that they actually had a great name, I was not expecting that whatsoever, they gotta ride that wave.



Someone posted this link to a binaural audio demo yesterday... I was truly blown away by this, especially considering this can be done with ANY stereo headphones. I would shut my eyes, and I would have the weirdest feeling having this matchbox shake around my head like it is really there... Sony isn't kidding when they mean audio will seal the deal.

You have to use headphones...

https://www.youtube.com/watch?v=HprNPCRyP40

Holy shit! I was smiling throughout the video. Having that kind of audio with VR would be amazing.

EDIT: Found this video using more examples, this is crazy. https://www.youtube.com/watch?v=uzFswCpJPqg

Sounds amazing, but I never got any sense of sound actually in front of me; it was always at the back and around the back/sides, not the front or out in the distance ahead.

Anyone else?
 

gofreak

GAF's Bob Woodward
New info http://www.dualshockers.com/2014/03/21/sonys-morpheus-vr-headset-for-ps4-has-three-different-processors-could-still-get-an-oled-screen/ Credit goes to Winternet for finding it, but I figured this would be the more appropriate thread since it's tech focused.

The original 4Gamer interview asks a lot of good questions. I'm using Google Translate but you can get the gist of it.

http://www.4gamer.net/games/251/G025118/20140321014/

A basic summary:

Why use RGB leds instead of IR?

They did use IR in an earlier version of the prototype. But they found it was easier to make tracking more accurate with visible LEDs, because they can change the color of the LEDs independently to improve accuracy. They can get by with 6 visible LEDs, with IR you need many and it's not quite as easy.

Why only 2 LEDs on the back when there are 4 on the front?

Marks says that they found 2 on the back to be 'good enough'. They use the internal sensors to help here, and even when someone turns around, parts of some of the front LEDs appear and disappear in and out of view and that can be used to help the tracking.

What's in the processor unit? It's relatively large.

(See Dualshocker's report - but I'd add that they acknowledge that the FRC/motion interpolation adds a little latency, but that the team thought the trade off was worth it to reduce afterimage)

OLED is not being used?

The possibility of using OLED in the final product is being studied. The LCD panel is a custom one with horizontal scanning. They found this dramatically reduces after-image effects with lateral motion of the HMD, which apparently is a significant issue (vs vertical motion, I guess). The choice between LCD and OLED hasn't been made yet.

Is Project Morpheus exclusive to PS4?

With the development unit it is not exclusive to PS4, but you need a PS4 unit. Think Move.me. For example, EvE was running off a PC. The Morpheus in this case was connected to a PS4, and exchanging data via a LAN connection to the PC. This won't necessarily be the case with the final unit, though, and so basically the article's author thinks of it as only for PS4 at the moment.

They talk about some other things, like cooperation with Oculus on development environment - nothing in motion, but Yoshida suggests that the differences between the two on a developmental level are already at a level of just tuning. They're also asked if Morpheus will be at TGS 2014 - the answer seems to be yes.
 