
VR Rendering summary for AMD/NVIDIA/Valve tech talks

Road to VR live blogs:

AMD
http://www.roadtovr.com/amd-on-low-...vr-and-graphics-applications-live-blog-330pm/

LiquidVR SDK

-Latest Data Latch (latency reduction, better parallelism and performance)
-Affinity Multi-GPU (latency reduction, better CPU performance)
-Direct to Display
-"Racing the beam" during rolling shutter for 1ms motion-to-photon latency
-Working with Oculus/Valve
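The "racing the beam" item is about synchronizing rendering with the display's rolling scan-out: instead of sampling the head pose once per frame, you render the image in horizontal slices, each just before the raster reaches it. A toy latency model in Python (all timings here are my own illustrative assumptions, not AMD's figures):

```python
# Toy model of "racing the beam": split a 90 Hz frame's scan-out into
# horizontal slices and render each slice just before the raster reaches it.
# All numbers are illustrative, not from AMD's talk.

FRAME_MS = 1000.0 / 90.0   # ~11.1 ms scan-out per frame

def slice_latencies(num_slices, render_ms_per_slice):
    """Motion-to-photon latency for each slice: time from sampling the
    head pose (just before rendering the slice) until that slice is lit."""
    latencies = []
    for i in range(num_slices):
        scanout_time = (i + 1) * FRAME_MS / num_slices   # slice finishes lighting up
        pose_sample_time = i * FRAME_MS / num_slices - render_ms_per_slice
        latencies.append(scanout_time - pose_sample_time)
    return latencies

# Whole-frame rendering: pose sampled once at frame start, last line lit ~11 ms later.
whole_frame = slice_latencies(1, 0.0)[0]
# Beam racing with 8 slices, each taking ~0.5 ms to render: ~1.9 ms per slice.
raced = slice_latencies(8, 0.5)
```

With more slices (and faster per-slice rendering) the per-slice latency keeps shrinking toward the sub-millisecond figure AMD quotes.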


NVIDIA
http://www.roadtovr.com/vr-direct-h...mproving-the-vr-experience-live-blog-2pm-pst/

VR Direct

-Late Latching constants
-Driver-level asynchronous time warp, priority over "normal" rendering
-VR SLI (lower latency)
-"can't expect to see perfect 2x scaling here because it depends on the overlap of rendering between frames. If you see 40-50% increase in performance with SLI, you're doing well"
-lower latency GPU distortion shader, future compatibility with Oculus SDK
-"hot out of the oven", "will be in flux for awhile"
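Late latching, as summarized above, means re-sampling input (like the head pose) as late as possible before the GPU actually consumes it, so the data reaching the screen is fresher. A back-of-the-envelope sketch of why that matters (the timings and head speed are my own assumptions, not NVIDIA's numbers):

```python
# Toy illustration of late latching: the later the head pose is sampled
# before scan-out, the smaller the orientation error on screen.
# Numbers are illustrative, not from NVIDIA's talk.

HEAD_SPEED_DEG_PER_MS = 0.2      # head turning at ~200 deg/s
SCANOUT_MS = 16.0                # time from frame start to photons

def display_error(pose_sample_ms):
    """Angular error at scan-out if the pose was sampled pose_sample_ms
    after the start of the frame (no prediction applied)."""
    pose_age = SCANOUT_MS - pose_sample_ms
    return HEAD_SPEED_DEG_PER_MS * pose_age

early = display_error(0.0)    # pose latched when the CPU builds the frame
late = display_error(14.0)    # constants latched just before the GPU uses them
```

In this toy model the early-latched pose is 3.2 degrees stale at scan-out versus 0.4 degrees when latched late; in practice timewarp and prediction shrink both further.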


Valve
http://www.roadtovr.com/valve-talks-advanced-vr-rendering-live-blog-5pm-pst/

Advanced VR rendering


-HTC Vive specs (90Hz, 110 FoV, 1080x1200 per eye, room-scale tracking)
-instancing to double geometry setup (half API calls, improved cache coherency)
-DX11 Multi-GPU extensions: nearly doubled framerate on AMD, have yet to test NVIDIA implementation but will soon.
-Low persistence global display, panel only lit for 2ms after raster scan finished
-"running start" for lower latency and better parallelism
-"Aliasing is your enemy". 4xMSAA minimum, Valve use 8xMSAA. Jittered SSAA is the best
-New method for filtering normal maps.
-Roughness mip maps
-Tangent-space axis-aligned anisotropic lighting, "It's so critical that you have good specular in VR", "We think AA specular in VR looks so good that we want to make sure everyone knows."
-Geometric specular anti-aliasing "hacky math"
-More efficient mip map encoding using a hemi-octahedron
-"Noise is your friend." Gradients are awful in VR and they pop, banding is horrible. Every pixel should have noise. Absolutely do this.
-Stencil out areas that you can't see through the lens. 15% performance improvement
-Warp mesh, only warp rendered area (15% performance cost reduction)
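Valve's "every pixel should have noise" advice is essentially dithering: adding sub-quantum noise before quantization turns wide visible bands into fine-grained noise that averages out to the true gradient. A minimal 1-D sketch in Python (my own toy illustration, not Valve's actual shader code):

```python
import random

LEVELS = 8  # deliberately coarse quantization to make banding obvious

def quantize(v):
    """Snap a [0,1] value to one of LEVELS evenly spaced output values."""
    return round(v * (LEVELS - 1)) / (LEVELS - 1)

def transitions(xs):
    """Count how often adjacent output values differ (band edges)."""
    return sum(1 for a, b in zip(xs, xs[1:]) if a != b)

rng = random.Random(0)
step = 1.0 / (LEVELS - 1)
gradient = [i / 999 for i in range(1000)]  # smooth ramp from 0 to 1

banded = [quantize(v) for v in gradient]
dithered = [quantize(min(1.0, max(0.0, v + rng.uniform(-0.5, 0.5) * step)))
            for v in gradient]

# The plain quantizer produces a handful of wide bands (few edges);
# the dithered version breaks them into noise that averages to the ramp.
```

In a real renderer the same idea is applied in screen space: a small noise or dither pattern added before the framebuffer is quantized to 8 bits per channel.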

 

mhayze

Member
Road to VR live blogs:
AMD
-"Racing the beam" during rolling shutter for 1ms motion-to-photon latency
NVIDIA
-Driver-level asynchronous time warp, priority over "normal" rendering
Valve
-HTC Vive specs (90Hz, 110 FoV, 1080x1200 per eye, room-scale tracking)
-instancing to double geometry setup (half API calls, improved cache coherency)
-Low persistence global display, panel only lit for 2ms after raster scan finished
-"running start" for lower latency and better parallelism
-Stencil out areas that you can't see through the lens. 15% performance improvement
-Warp mesh, only warp rendered area (15% performance cost reduction)

Interesting stuff across the board.

The resolution is one of the things that caught my attention, since it isn't an even divisor of any common display resolution. But the last two quoted items make sense: due to the desired shape of the projected display and probably the optics, parts of the display are masked out (15% as indicated). This may be a 2560x1440 display with some masking (i.e. a split 1280x1440 masked down to 1080x1200, so about 100px all around per eye), or it could just be some new display resolution?
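That masking guess is easy to sanity-check with a few lines of arithmetic (this is just me working through the poster's 2560x1440 hypothesis, not a confirmed panel spec):

```python
# Per-eye panel area if a 2560x1440 display is split down the middle
# and masked to the reported 1080x1200 per-eye resolution.
panel_w, panel_h = 2560 // 2, 1440      # 1280 x 1440 per eye
eye_w, eye_h = 1080, 1200

margin_x = (panel_w - eye_w) // 2       # border on the left and right
margin_y = (panel_h - eye_h) // 2       # border on the top and bottom

masked_fraction = 1 - (eye_w * eye_h) / (panel_w * panel_h)
```

So the borders wouldn't be quite "100px all around": about 100px at the sides but 120px top and bottom, with roughly 30% of each half-panel masked off.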

Sounds like AMD has a solid head start in this stuff, but I'm also very interested to see what Nvidia does here (at a driver/software level).
 

CTLance

Member
Thanks for the thread.
Sounds like AMD has a solid head start in this stuff, but I'm also very interested to see what Nvidia does here (at a driver/software level).
So it wasn't just my bias. Phew. Seems like team green is either keeping their cards incredibly close to their chests... or they got caught on the wrong foot with this VR thing. How odd.
 

The Llama

Member
Thanks for the thread.

So it wasn't just my bias. Phew. Seems like team green is either keeping their cards incredibly close to their chests... or they got caught on the wrong foot with this VR thing. How odd.

There have been rumors AMD has been ahead on VR for a while now. IIRC someone on here (who I forget) posted that they were working closely with Valve on this stuff and nVidia wasn't, or something like that.
 

Gumbie

Member
Nice OP, thanks for all the info! It's interesting that AMD mentions they are already working with Oculus/Valve and Valve already has the Multi-GPU extensions working with AMD. Wonder why Nvidia seems to be behind in the VR game?
 

Sciz

Member
-DX11 Multi-GPU extensions: nearly doubled framerate on AMD, have yet to test NVIDIA implementation but will soon.
-"can't expect to see perfect 2x scaling here because it depends on the overlap of rendering between frames. If you see 40-50% increase in performance with SLI, you're doing well"
You're doing it wrong, Nvidia.
 

Seiru

Banned
I'm going to be incredibly nervous if I end up having to buy an AMD card for my VR rig. I've been burned by them so many times in the past.
 

Durante

Member
Road to VR live blogs:

AMD
[...]
NVIDIA
Awesome. Never expected things to progress so quickly on the vendor side when I backed the Oculus KS back in 2012.
Also, I wouldn't read too much into any perceived performance difference at this point. What's important is that both are actively working on improving VR rendering.

-"Aliasing is your enemy". 4xMSAA minimum, Valve use 8xMSAA. Jittered SSAA is the best
Couldn't agree more. On DK1 and DK2, HL2 without AA versus with 8xSGSSAA is a massively different experience.
 
Nice OP, thanks for all the info! It's interesting that AMD mentions they are already working with Oculus/Valve and Valve already has the Multi-GPU extensions working with AMD. Wonder why Nvidia seems to be behind in the VR game?

A little while ago I saw a post from an Oculus dev saying they hadn't heard a peep from Nvidia after their VR Direct unveiling last year, but that they had been working with AMD and were very pleased with the results.

Nvidia gonna Nvidia. Probably just playing their cards close to their chest as usual.

Though Crossfire clowning SLI so soundly seems off to me, I never thought of the two technologies as so different that a gap this wide would be possible. But it's early still, so.
 

Dr. Kaos

Banned
Good news. Ati seems to be ahead on this, but we'll see.


I do think that VR deserves its own hardware-level optimizations as well. If Ati/nVidia/intel have anything in the planning stages, they cannot announce it or some customers will hold off on buying a new GPU, which costs them sales.
 

syko de4d

Member
Sounds like my future PC will run two GPUs

If VR really gets tons of buyers, AMD and Nvidia will sell so many GPUs. They are the real winners in the battle between Oculus and Valve :D
 

Durante

Member
Finished reading the NV slides. I like that they already have high priority context support not just for DirectX but also for Android. I bet some people at Oculus wish Samsung used Tegra :p

Also, they seem to think that upsampling frame rate with async timewarp is a bad idea, though it's one of the Morpheus operating modes.

Interesting stuff about the dedicated copy engine for SLI. Also VR SLI drivers for DX11 are already available under NDA.
 
Finished reading the NV slides. I like that they already have high priority context support not just for DirectX but also for Android. I bet some people at Oculus wish Samsung used Tegra :p

Why use an entire context just for timewarp, though? I don't understand.

Nvidia GPUs don't support asynchronous compute?
 
So I'm starting to think planning my build around a mITX board with room for only one GPU is a bad idea, since the world of VR is going to revolve around two?
 

viveks86

Member
Glad to see AMD/Nvidia stepping in to optimize for VR already! I guess I'm doing SLI after all. Thinking 980 SLI or 980 Ti SLI (depending on price and when that gets released).
 

kami_sama

Member
Honestly, all VR development should be focused on DX12 and Vulkan. DX9- and DX10-class hardware isn't performant enough, and neither are the APIs.

Also, I think they will drop DX11 soon enough. Lots of cards already support most if not all DX12 features, I think.
 
Glad to see AMD/Nvidia stepping in to optimize for VR already! I guess I'm doing SLI after all. Thinking 980 SLI or 980 Ti SLI (depending on price and when that gets released).

There is no 980 Ti. Titan X is a month away, and the 1000 series (or whatever they rename it as) is probably 5-8 months away.
 
Also, I think they will drop DX11 soon enough. Lots of cards already support most if not all DX12 features, I think.

Sounds like a weird thing to do. DX11 is still in active development. DX12 is for when programmers want to do more low-level programming, and DX11 is for the way we do things now, which is "easier".
 

viveks86

Member
There is no 980 Ti. Titan X is a month away, and the 1000 series (or whatever they rename it as) is probably 5-8 months away.

I meant the one that is rumored. It has been talked about since last year. Chances are it will get announced at GTC (unless their rumored production issues push it out to 2016). So I'm gonna wait till GTC to see if it is a real thing. If not, then 980 SLI it is. Pretty sure Titan X is gonna be ridiculously priced like all Titans, so that's not on the cards for me, unless its price/performance ratio is in line with the 980s.
 

mrklaw

MrArseFace
I want to see both AMD and nvidia start selling more dual GPU cards. Maybe market them as 'VR edition' or something like that. SLI is clearly of benefit for VR, but plenty of people will not want to buy two separate GPUs, and some simply won't be able to use two GPUs if they have SFF builds with only single GPU motherboards.
 

Xyber

Member
I meant the one that is rumored. It has been talked about since last year. Chances are it will get announced at GTC (unless their rumored production issues push it out to 2016). So I'm gonna wait till GTC to see if it is a real thing. If not, then 980 SLI it is. Pretty sure Titan X is gonna be ridiculously priced like all Titans, so that's not on the cards for me, unless its price/performance ratio is in line with the 980s.

The card people have been talking about will be on the full Maxwell chip, so it won't be a 980 Ti, since the 980 is already using the full GM204 chip and there's no extra stuff to enable. The card you are thinking about (unless you are actually thinking about the Pascal cards, but those are not coming for quite some time) will be the 1080 or whatever it will be called.
 
This is hella exciting as well as interesting stuff. VR is coming baby, and it's great seeing so much focus on it from devs at GDC. Hopefully this gets the creative juices flowing and we'll see a big ramp up in games in development for VR.

Also, one thing I noticed: it mentions that these are Vive dev kit specs. Nowhere does it say final consumer model.

Is it possible Valve is going to aim slightly higher in terms of FPS/resolution in the consumer model at the end of the year? I can't help but wonder if they will at least improve FPS, since stuff like Morpheus is now 120fps.
 

Gumbie

Member
I mostly buy nvidia cards but seeing how amd has been working closely with oculus and valve is making me consider the Radeon 300 series for my next card.
 

Tetranet

Member
-"Noise is your friend." Gradients are awful in VR and they pop, banding is horrible. Every pixel should have noise. Absolutely do this.


Huh, fascinating. This can prove to be quite useful in masking such shortcomings.
 
Sounds like a weird thing to do. DX11 is still in active development. DX12 is for when programmers want to do more low-level programming and DX11 for the way we do things now which is "easier".

DX11 has no place in VR long term. If you want to create a good VR experience then you need to be obsessive about latency, frame times and performance. DX12 and Vulkan are critical to the success of VR on the PC.
 
DX11 has no place in VR long term. If you want to create a good VR experience then you need to be obsessive about latency, frame times and performance. DX12 and Vulkan are critical to the success of VR on the PC.

Ah, I misunderstood. I thought he meant dropping development and support for DX11 altogether, not only for VR. Makes sense then.
 

viveks86

Member
The card people have been talking about will be on the full Maxwell chip, so it won't be a 980 Ti, since the 980 is already using the full GM204 chip and there's no extra stuff to enable. The card you are thinking about (unless you are actually thinking about the Pascal cards, but those are not coming for quite some time) will be the 1080 or whatever it will be called.

I see… well, I'm basically waiting for some flagship card that's not a Titan to be announced at GTC. If there are none, I'm settling for a 980 SLI. Unless the Titan X costs $1000 AND is faster than 980 SLI (unlikely).
 
It's a safe bet that no other card will be announced at GTC, Jen-Hsun will just expand on the Titan X, and maybe talk again about Pascal.
 

Durante

Member
More on pixel quality:
[aliasing comparison slides]


I commented on this earlier, but yeah, I found the same thing on DK1 and DK2.

So basically, what you need is more pixels, more often, with lower latency, and higher IQ. Fun!
 