
Game Graphics Technology | 64bit, procedural, high-fidelity debating

Properly? I am not sure what they may have meant by that.

It does have some oddness to it, like cutting off at screen edges, fading in slowly to become more detailed, and leaving trails (it uses samples from previous frames to increase its quality and cut down on aliasing), as well as not representing some surface properties as well as other SSRs do (namely the one in Frostbite).

But I think it is pretty great.
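If anyone wants to see the shape of the thing, here is a rough CPU-style sketch of a typical SSR loop with temporal accumulation. To be clear, this is not UE4's actual shader, just the general recipe; the GBuffer type and its samplers are made-up stand-ins, and the real thing runs on the GPU. It shows where the edge cutoff, the slow fade-in, and the trails all come from.

```cpp
// Rough shape of an SSR pass with temporal accumulation (illustrative only;
// the real thing is a GPU shader). GBuffer and its samplers are stand-ins.
struct Vec3 { float x, y, z; };

static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

struct GBuffer {
    // Stand-in samplers: depth and colour at a screen-space position.
    float depthAt(float, float) const { return 0.5f; }
    Vec3  colorAt(float, float) const { return {1.0f, 0.0f, 0.0f}; }
};

// March the reflection ray in screen space (x/y are UVs, z is depth) until it
// dips behind the depth buffer. Rays that leave the screen simply fail, which
// is exactly the cutting-off-at-screen-edges artifact described above.
static bool traceSSR(const GBuffer& gb, Vec3 p, Vec3 dir, Vec3* hitColor) {
    const int kSteps = 64;
    Vec3 step = mul(dir, 1.0f / kSteps);
    for (int i = 0; i < kSteps; ++i) {
        p = add(p, step);
        if (p.x < 0 || p.x > 1 || p.y < 0 || p.y > 1)
            return false;                     // off screen: no data to reflect
        if (p.z > gb.depthAt(p.x, p.y)) {     // ray went behind geometry: hit
            *hitColor = gb.colorAt(p.x, p.y);
            return true;
        }
    }
    return false;
}

// Temporal resolve: blend this frame's noisy result with the history buffer.
// The slow convergence is the "fades in slowly" behaviour, and stale history
// under motion is where the trails/ghosting come from.
static Vec3 temporalResolve(Vec3 current, Vec3 history, float feedback = 0.9f) {
    return add(mul(history, feedback), mul(current, 1.0f - feedback));
}
```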

Thank you.
 
So I dug through some of the old Crytek presentations and found some of the material on them. The linked slide is actually leading into more discussion of variable shadow blur based on distance (true soft shadows, similar to other percentage-closer effects, where the farther the shadow caster is from the surface, the softer the shadow), but the bokeh effect is used even when that aspect is disabled and a single penumbra blur is applied to all shadows regardless of distance.
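For reference, the variable-blur part maps onto the textbook PCSS recipe: a blocker search, a similar-triangles penumbra estimate, then PCF over a disc. The sketch below is that textbook version, not Crytek's shipping code, and shadowMapDepth() is a made-up stand-in for a shadow map fetch.

```cpp
// Textbook PCSS-style soft shadow, sketched on the CPU. shadowMapDepth() is
// a stand-in for a shadow map sample from the light's point of view.
static float shadowMapDepth(float, float) { return 1.0f; }  // stand-in sampler

// A circular (disc) kernel is what gives the round, bokeh-like penumbra.
static const float kDisc[8][2] = {
    { 0.7f,  0.0f}, {-0.7f,  0.0f}, { 0.0f,  0.7f}, { 0.0f, -0.7f},
    { 0.5f,  0.5f}, {-0.5f,  0.5f}, { 0.5f, -0.5f}, {-0.5f, -0.5f},
};

static float softShadow(float u, float v, float receiverDepth, float lightSize) {
    // 1) Blocker search: average depth of occluders around the receiver.
    float blockerSum = 0.0f; int blockers = 0;
    for (const auto& o : kDisc) {
        float d = shadowMapDepth(u + o[0] * lightSize, v + o[1] * lightSize);
        if (d < receiverDepth) { blockerSum += d; ++blockers; }
    }
    if (blockers == 0) return 1.0f;            // fully lit
    float avgBlocker = blockerSum / blockers;

    // 2) Penumbra width grows with caster-to-receiver distance (similar
    //    triangles). Pin this to a constant and you get the single fixed
    //    penumbra described above when the variable-blur path is disabled.
    float penumbra = (receiverDepth - avgBlocker) / avgBlocker * lightSize;

    // 3) PCF over the disc, scaled by the penumbra estimate.
    float lit = 0.0f;
    for (const auto& o : kDisc)
        lit += (shadowMapDepth(u + o[0] * penumbra, v + o[1] * penumbra)
                    < receiverDepth) ? 0.0f : 1.0f;
    return lit / 8.0f;
}
```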

I've also attached a few photos I found quickly (jpgs, sorry!) that illustrate the real-life effect they're faking.

Note the circular bokeh dots of light on the far walls:
nonLdCA.png


The slide is taken from their 2011 SIGGRAPH paper, available here.
There was a great blog post I read once explaining the physical reasons why this occurs, due to natural lensing between leaves as light bends. If only I could find it.
Thank you.

If someone has UE4 installed, I bet they could point out some of its finer points and quirks (that I mentioned) with some screenshots or short webm's.
 

pottuvoi

Banned
I know sebbbi mentioned he really wanted to see programmable sample patterns (for his MSAA trick to work on PC) along with control over coverage samples (you know, CSAA/EQAA). He also mentioned a few other fancy-sounding shader things. Nothing critical, I guess.
GCN can use interpolators within the pixel shader; this can be used for all kinds of things, including very fast GBAA.
I would imagine other fun little things would be possible as well.
I'd love to read that if you find it. I had figured that the tiny openings basically acted like pinhole camera 'lenses'. Doing a quick search, I stumbled across this, which seems to confirm that's generally what's happening (the pinhole camera effect, as seen during a solar eclipse). I've also noticed the effect with venetian blinds when the light is at an angle where it's able to peek through.

In any event, despite publishing some info on it, I haven't seen any other game or developer use the effect, which is a shame.
Yes.
It's just an image of an area light source. (This appears in shadow maps if you sample the shadow with a circular kernel and not just a simple PCF kernel, just like in DoF.)

It's also good to note that since they use normal shadow maps, they cannot get the strange effects that start to happen in such cases. (Each of those small circles is basically a pinhole camera: if something comes between the hole and the light source, it appears inverted in the light circle on the wall, which is the moon in the images you linked.)
 

Kronik

Banned
Generally, when you need distant shadows, it's in a landscape. For such cases, UE4's Ray Traced Distance Field Soft Shadows seem like a great alternative.

Sounds interesting; I can't really imagine how bokeh and shadows relate just from the name.

Unless you have a very particular definition of "critical": none.

Async compute? Going to be huge for GCN since it's stateless and supports resource limits on all its command processors.

Early tests suggest that compute shaders on GCN can run 60% faster and almost become "free" with async shaders, and even overall game performance can increase by 10-30%.

Maxwell, on the other hand... but by the time it matters, Pascal will be here, I guess.
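For the non-obvious part of "async shaders": in DirectX 12 terms it just means submitting work on a second, COMPUTE-type queue and syncing with a fence. A minimal sketch of that plumbing, with error handling omitted; whether the two queues actually overlap on the GPU is up to the hardware scheduler, which is exactly the GCN vs. Maxwell argument.

```cpp
// "Async compute" in DirectX 12 terms: a second command queue of type
// COMPUTE that the GPU may schedule alongside graphics work, synced with a
// fence. Minimal sketch; error handling omitted, `device` assumed to exist.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void createQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& directQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue,
                  ComPtr<ID3D12Fence>& fence) {
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;   // graphics (and anything else)
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&directQueue));

    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute-only queue
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));

    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
}

// Per frame: kick compute on its own queue, then have the direct queue wait
// on the fence (a GPU-side wait, no CPU stall) before running work that
// consumes the compute results. Anything submitted to the direct queue
// before the Wait is free to overlap with the compute work -- whether it
// actually does is up to the hardware, hence the GCN vs. Maxwell debate.
void submitFrame(ID3D12CommandQueue* direct, ID3D12CommandQueue* compute,
                 ID3D12CommandList* computeWork, ID3D12CommandList* gfxWork,
                 ID3D12Fence* fence, UINT64& fenceValue) {
    compute->ExecuteCommandLists(1, &computeWork);
    compute->Signal(fence, ++fenceValue);         // mark compute finished
    direct->Wait(fence, fenceValue);              // dependency, not a stall
    direct->ExecuteCommandLists(1, &gfxWork);
}
```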
 

dogen

Member
Async compute? Going to be huge for GCN since it's stateless and supports resource limits on all its command processors.

Early tests suggest that compute shaders on GCN can run 60% faster and almost become "free" with async shaders, and even overall game performance can increase by 10-30%.

Maxwell, on the other hand... but by the time it matters, Pascal will be here, I guess.

I think the question was whether DX12 left any critical features unexposed.
 
Async compute? Going to be huge for GCN since it's stateless and supports resource limits on all its command processors.

Early tests suggest that compute shaders on GCN can run 60% faster and almost become "free" with async shaders, and even overall game performance can increase by 10-30%.

Maxwell, on the other hand... but by the time it matters, Pascal will be here, I guess.

Pascal isn't expected to be any better at async than Maxwell. It's the same architecture.
 

Kezen

Banned
Async compute? Going to be huge for GCN since it's stateless and supports resource limits on all its command processors.

Early tests suggest that compute shaders on GCN can run 60% faster and almost become "free" with async shaders, and even overall game performance can increase by 10-30%.

Maxwell, on the other hand... but by the time it matters, Pascal will be here, I guess.

DirectX 12 exposes async compute on supporting hardware.
 

Kezen

Banned
I am a bit busy with RL at the moment but will have more than enough stuff to post in here when I am done with that.


The best way to tell, IMO, would be to have an object that takes up a lot of screen space move close to the camera. Then you could see the object motion blur in more detail to compare better to others. It does not seem really artifacty at that distance in these screens, though.

Does this shot fit the bill?
yBGOXP2.png


It does not seem as stellar as Ryse's motion blur to me, but I'm no maven.
KtFG6u3.jpg
 
Question: Is GTA 5 designed with PBR in any form? So many of the materials in the game look insanely realistic and everything seems to be very well lit. It has amazing ground graphics like this:

gta52015-12-2600-01-1xas0b.jpg


And realistic brushed metal/gold paints for cars and so on.

But it's hard to tell if it's just a very nice texture + normal map or actual material kind of stuff, and the fact that it's essentially a last-gen title has me unsure.

EDIT: Also, the way light behaves when you see motion blur smearing a reflection:
20441371309_676100cfe3_o.png


Looks exactly like what I've seen in Unreal 4.
 

Someone broke down how GTA does all of its rendering using RenderDoc.

Looking in there, there are some PBS and PBL things going on (fresnel, variable glossiness, IBL, and some cool ways to model irradiance), the things we have come to associate with that very broad term, which this thread will eventually define in a more concrete way. But at the same time, I do not see a number of other things, like coloured specular or any information on how light is attenuated.

So it is hard to say exactly what it is doing. The game's car models in general do look nice, though.
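Since fresnel came up: the term you can spot in captures like that is almost always some variant of Schlick's approximation. A tiny sketch with made-up types, which also shows why "coloured specular" matters: F0 is a colour for metals.

```cpp
// Schlick's Fresnel approximation, the term visible in captures like the
// RenderDoc breakdown above. A sketch with made-up types. F0 is reflectance
// at normal incidence; note that F0 is a colour, which is exactly the
// "coloured specular" the post says it did not find in GTA V.
#include <cmath>

struct RGB { float r, g, b; };

RGB fresnelSchlick(RGB f0, float cosTheta) {
    float f = std::pow(1.0f - cosTheta, 5.0f);  // (1 - cos(theta))^5 falloff
    return { f0.r + (1.0f - f0.r) * f,
             f0.g + (1.0f - f0.g) * f,
             f0.b + (1.0f - f0.b) * f };
}

// Typical values: dielectrics sit around F0 = {0.04, 0.04, 0.04}, while gold
// is roughly {1.00, 0.71, 0.29} -- hence coloured highlights on metals.
```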

snip 2. something about motion blur and stuff

This to me just looks like a highly specular surface, bloom, and what happens when you apply motion blur on top. It does not necessarily point to something being modeled on real-world values.

You could easily see the same effect in Crysis 2 screenshots (high specular, bloom, MB),
but we all know that game does not use a physically based or normalised shading model. It does have aspects of physically based lighting, though, and some more realistic shading than most games from that period. So it could be a similar situation in GTA V, where it has some aspects of what we now consider "physically based".
 
Nice post. Crysis 2 looks exactly like what I mentioned, so I guess that effect isn't a giveaway of something using PBR.

What would you say is more accurate: that GTA was designed with the mindset of partially PB stuff (sort of like Fallout 4), or that it's just an "old style" game with some incredible lighting and artwork?
 

Kezen

Banned
Is it possible to estimate the cost of VXAO compared to HBAO+?
The next Tomb Raider will have VXAO on PC.

Here's what I can find on the subject:
qUeMFow.png

69GOTUs.png
 

dogen

Member
Is it possible to estimate the cost of VXAO compared to HBAO+?
The next Tomb Raider will have VXAO on PC.

Here's what I can find on the subject:


Looks nice, but I thought it was using their own "Broad Temporal Ambient Obscurance"? Is this a replacement for the PC version, or an additional Nvidia effect?

Nvm, I'm sure it'll just be an optional extra.
 

Kezen

Banned
Looks nice, but I thought it was using their own "Broad Temporal Ambient Obscurance"? Is this a replacement for the PC version, or an additional Nvidia effect?

Nvm, I'm sure it'll just be an optional extra.

Yes, both can be implemented on PC.

But VXAO is not mentioned any longer; maybe it won't make it.
 

KKRT00

Member
So I dug through some of the old Crytek presentations and found some of the material on them. The linked slide is actually leading into more discussion of variable shadow blur based on distance (true soft shadows, similar to other percentage-closer effects, where the farther the shadow caster is from the surface, the softer the shadow), but the bokeh effect is used even when that aspect is disabled and a single penumbra blur is applied to all shadows regardless of distance.

I've also attached a few photos I found quickly (jpgs, sorry!) that illustrate the real-life effect they're faking.

Note the circular bokeh dots of light on the far walls:
nonLdCA.png


The slide is taken from their 2011 SIGGRAPH paper, available here.
Usage of this tech in Star Citizen :)

Screenshot by Dictator93 :)
starcitizen_2016_01_059kj1.png
 

RoboPlato

I'd be in the dick
Does anyone know if Battlefront on consoles is using Frostbite's own internal resolution scaling as opposed to the hardware scalers? Given that the IQ seems better than in other games running at the same res and using a similar FXAA technique, I'm leaning toward that being the case. Maybe someone on PC could take some screens at native 900p, and at 1080p scaled down, to compare?

Here's a bunch of .pngs from PS4 for reference
https://goo.gl/photos/MVpM2cgdgsyC8EsGA
 

onQ123

Member
Can someone explain the Alien in PlayRoom? Is it using volume rendering? Also it freaked me out when I noticed that it was casting shadows in my room a little too realistically.
 
Does anyone know if Battlefront on consoles is using Frostbite's own internal resolution scaling as opposed to the hardware scalers? Given that the IQ seems better than in other games running at the same res and using a similar FXAA technique, I'm leaning toward that being the case. Maybe someone on PC could take some screens at native 900p, and at 1080p scaled down, to compare?

Here's a bunch of .pngs from PS4 for reference
https://goo.gl/photos/MVpM2cgdgsyC8EsGA
Given how Frostbite on PC uses its own internal scaler for the resolution scaling option, I would imagine they could use it on consoles as well. Sadly, I do not own the game on PC, so I cannot take some 900p shots scaled up, some internal scaler shots, and then compare them to the PS4 ones you have. :(

Anyone?
Can someone explain the Alien in PlayRoom? Is it using volume rendering? Also it freaked me out when I noticed that it was casting shadows in my room a little too realistically.
Could you toss some screenshots or good videos out that show it off well?
Does this shot fit the bill?
yBGOXP2.png


It does not seem as stellar as Ryse's motion blur to me, but I'm no maven.
KtFG6u3.jpg
I will need to produce some screens later for comparison. It looks pretty good here, though (even if the second shot could use a much higher quality image). That said, you can relatively easily see the sample slices just in the camera/screen motion blur, which I am not sure is so obvious in something like Ryse.
----
BTW, I will be updating the thread soon enough; I just have to find the time to take some more screens for the post-processing mega-post. Sorry for taking my sweet time with it!
 

Javin98

Banned
I wish this thread were more active. There are some pretty interesting discussions in here. Anyway, I have a couple of questions:
1. So now that many people have played the Uncharted 4 beta, can we confirm whether the god rays are indeed volumetric or a simple screen space implementation? I also read on Beyond3D that it is using some form of color bleeding in the jungle map; is this true?
2. I have played MGSV for over 105 hours now, but I think I caught a glimpse of color bleeding only once. Anyway, I just wanna confirm: is this game using some form of global illumination? If so, it has to be real time, right? Since the game has a day and night cycle.
 
Some commentary!
I wish this thread were more active.
I will post here more often soon enough: I was just away from my tower for so long... so there is that.
1. So now that many people have played the Uncharted 4 beta, can we confirm whether the god rays are indeed volumetric or a simple screen space implementation? I also read on Beyond3D that it is using some form of color bleeding in the jungle map; is this true?
From what I saw in the footage and screens of the beta, it looks like it is not global (i.e. throughout the whole map) and coming from the sun light source. But it does look like, in that one area of the jungle map, they placed a volumetric spotlight, kinda like how they did it in earlier Uncharted games (in that desert section of UC3, for example, there are hand-placed ones poking through the broken rooftop). There is good evidence of bounce diffuse lighting in the beta; how it is done, though, is a complete mystery to me. I doubt it would be real time for a 60fps multiplayer game with a static time of day.
2. I have played MGSV for over 105 hours now, but I think I caught a glimpse of color bleeding only once. Anyway, I just wanna confirm: is this game using some form of global illumination? If so, it has to be real time, right? Since the game has a day and night cycle.
Indeed, MGS does have bounce lighting in different areas, but it is important to note that, at least according to the prerelease documentation they showed at GDC 2013, it is artist placed. Basically, they baked out cubemap probes for different times of day, and they update periodically as the sun changes position in the sky. So it is not really global, but it is indirect lighting.
The coolest use of it I saw in the game is at night time, from those big spot lights in Afghanistan. They seem to place a cube map or some sort of reflective probe at the contact point with the ground for a number of the huge flood lights. This then causes yellowed light to bounce beyond the cone of direct light. It looks awesome. I will snap a screen of it and other GI effects in the game when I get home.
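To make the probe setup concrete, here is roughly what "bake per time of day, blend between them" amounts to. This is just a guess at the shape of it, not Fox Engine code, and the Probe struct is a stand-in for a real baked cubemap or SH probe.

```cpp
// Rough shape of "bake probes per time of day, blend between them". An
// illustration of the idea, not Fox Engine code; Probe is a stand-in for a
// real baked cubemap or SH probe, reduced here to an ambient colour.
#include <cmath>
#include <vector>

struct Probe { float r, g, b; };

// keys = one baked probe per key time (e.g. 0h, 6h, 12h, 18h), looping over 24h.
Probe blendProbes(const std::vector<Probe>& keys, float hour) {
    int   n = static_cast<int>(keys.size());
    float t = hour / 24.0f * n;          // position along the key-frame ring
    int   i = static_cast<int>(t) % n;
    int   j = (i + 1) % n;               // wraps midnight back around to morning
    float f = t - std::floor(t);         // blend factor between the two keys
    const Probe& a = keys[i];
    const Probe& b = keys[j];
    return { a.r + (b.r - a.r) * f,
             a.g + (b.g - a.g) * f,
             a.b + (b.b - a.b) * f };
}
```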
 

Javin98

Banned
Some commentary!

I will post here more often soon enough: I was just away from my tower for so long... so there is that.

From what I saw in the footage and screens of the beta, it looks like it is not global (i.e. throughout the whole map) and coming from the sun light source. But it does look like, in that one area of the jungle map, they placed a volumetric spotlight, kinda like how they did it in earlier Uncharted games (in that desert section of UC3, for example, there are hand-placed ones poking through the broken rooftop). There is good evidence of bounce diffuse lighting in the beta; how it is done, though, is a complete mystery to me. I doubt it would be real time for a 60fps multiplayer game with a static time of day.

Indeed, MGS does have bounce lighting in different areas, but it is important to note that, at least according to the prerelease documentation they showed at GDC 2013, it is artist placed. Basically, they baked out cubemap probes for different times of day, and they update periodically as the sun changes position in the sky. So it is not really global, but it is indirect lighting.
The coolest use of it I saw in the game is at night time, from those big spot lights in Afghanistan. They seem to place a cube map or some sort of reflective probe at the contact point with the ground for a number of the huge flood lights. This then causes yellowed light to bounce beyond the cone of direct light. It looks awesome. I will snap a screen of it and other GI effects in the game when I get home.
Thanks for the answers, I was really wondering about them, especially the one about MGSV. So, from your post, the color bleeding is supposedly baked depending on the time of day? That's reasonable, I guess. Real time GI with a 60FPS target would have been too much for the consoles, especially in an open world area. I was pretty shocked when I first saw it and was wondering how in the world they managed GI at 60FPS. Great to have the answer now. It still looks great, regardless. As for Uncharted 4, I think you're right. The impression I got from pics and videos is that the god rays from the sun are volumetric. Any idea about the others? Also, from what I read on Beyond3D, an image shows that the color bleeding was also added into the PSX 2014 map, or at least for that screenshot.
 
Thanks for the answers, I was really wondering about them, especially the one about MGSV. So, from your post, the color bleeding is supposedly baked depending on the time of day? That's reasonable, I guess. Real time GI with a 60FPS target would have been too much for the consoles, especially in an open world area.
Here are those images I promised of how MGS V fakes GI with spot lights (which is awesome, btw).
You can see where the direct light of the spot light terminates, yet an orange glow is applied to all geometry facing the area where the spot light is shining (look at the rocks and building).
mgsvtpp_2016_01_14_18p2qqy.png

mgsvtpp_2016_01_14_182npko.png

Here is a daytime example of it from sunlight:
mgsvtpp_2016_01_14_19zhlcx.png

And yeah, they bake out the probes for various times of day and then blend between 'em. At least, that is what they stated they do.

As for Uncharted 4, I think you're right. The impression I got from pics and videos is that the god rays from the sun are volumetric. Any idea about the others? Also, from what I read on Beyond3D, an image shows that the color bleeding was also added into the PSX 2014 map, or at least for that screenshot.
Which image are you talking about from Beyond3d?
 

Javin98

Banned
Here are those images I promised of how MGS V fakes GI with spot lights (which is awesome, btw).
You can see where the direct light of the spot light terminates, yet an orange glow is applied to all geometry facing the area where the spot light is shining (look at the rocks and building).
mgsvtpp_2016_01_14_18p2qqy.png

mgsvtpp_2016_01_14_182npko.png

Here is a daytime example of it from sunlight:
mgsvtpp_2016_01_14_19zhlcx.png

And yeah, they bake out the probes for various times of day and then blend between 'em. At least, that is what they stated they do.


Which image are you talking about from Beyond3d?
Damn, that's pretty impressive. And with a day and night cycle, using baked GI would have been the only option. Now I feel like booting up MGSV on my PS4 just to go looking for this color bleeding.

As for Uncharted 4, here is the image:
uncharted-4--a-thiefsbps4x.gif

Credits to Clukos from Beyond3D. He has done a lot of such comparisons. Keep in mind that the BTS 2015 image is very compressed, but other improvements can also be observed. Hair, for example, looks much better IMO.
 
As for Uncharted 4, here is the image:
uncharted-4--a-thiefsbps4x.gif

Credits to Clukos from Beyond3D. He has done a lot of such comparisons. Keep in mind that the BTS 2015 image is very compressed, but other improvements can also be observed. Hair, for example, looks much better IMO.

IMO, it is a bit hard to see what improved between the two builds (although things undoubtedly have, knowing ND). If only there were better media.

I think the shirt looks much better even with the compression, but that could be chalked up to a scene lighting change or (more likely) a better asset that has folds. A higher quality image would be awesome.
It uses UE4, so it uses PBR.
Here you can find an old paper on their PBR implementation:
http://blog.selfshadow.com/publications/s2013-shading-course/#course_content

Which has seen improvements even since then! I wonder what the first games beyond Paragon will be to take advantage of the clothing, skin, hair, and eye shading they just developed.
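For anyone who does not want to dig through the course notes: the specular distribution term UE4 uses there is GGX/Trowbridge-Reitz with their alpha = roughness² remapping, which is short enough to quote in full. A sketch of just that D term:

```cpp
// The GGX/Trowbridge-Reitz normal distribution function as given in the
// linked SIGGRAPH 2013 course notes ("Real Shading in Unreal Engine 4"),
// including UE4's alpha = roughness^2 remapping. Just the D term of the
// specular BRDF, for flavour.
float distributionGGX(float NdotH, float roughness) {
    const float kPi = 3.14159265f;
    float a  = roughness * roughness;              // UE4's roughness remap
    float a2 = a * a;
    float d  = NdotH * NdotH * (a2 - 1.0f) + 1.0f; // denominator core
    return a2 / (kPi * d * d);
}
```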

Looks like normal rasterised graphics to me. I think the "volume" you are seeing is just the PS4 detecting the scene depth from the controller and separating the background image from that. The shadow and alien look like a normal shadow map and a polygon model to me.
 

tuxfool

Banned
Which has seen improvements even since then! I wonder what the first games beyond Paragon will be to take advantage of the clothing, skin, hair, and eye shading they just developed.

A lot of the licensees will use this. They're constantly integrating their Paragon additions into point releases. I'd really recommend watching all their livestreams related to Paragon.
 

Javin98

Banned
IMO, it is a bit hard to see what improved between the two builds (although things undoubtedly have, knowing ND). If only there were better media.

I think the shirt looks much better even with the compression, but that could be chalked up to a scene lighting change or (more likely) a better asset that has folds. A higher quality image would be awesome.
Well, like I said earlier, according to the guy who made this comparison, green light bleed was added in, but I don't really see evidence of it, do you? Also, the shaders on materials, especially the clothing and flashlight, have been improved significantly. The guys at Beyond3D also claim that light scattering was added on hair.
 
Well, like I said earlier, according to the guy who made this comparison, green light bleed was added in, but I don't really see evidence of it, do you? Also, the shaders on materials, especially the clothing and flashlight, have been improved significantly. The guys at Beyond3D also claim that light scattering was added on hair.

I mean, those things could all be true (although the green "light bleeding" is something I do not see at all, to be honest, and if it is referencing the frame colour temperature, it could easily just be colour grading), but I think the image is way too low quality to make almost any certain statements. The clothing does look better, but how it looks better is an extrapolation. It could just be a different and newer asset altogether (which it appears to be, IMO), which would not necessarily point toward an improved shading model, but rather just toward a better asset in general. The flashlight looking significantly improved? I do not see it; if anything, the second, more recent über-compressed image makes it look less detailed, or makes it look like it has fewer polygons, due to the image quality of the screen (the ridges on the shaft
lol
for example appear to be missing, and there does not appear to be a divot behind the lamp: all probably due to the image quality).
A lot of the licensees will use this. They're constantly integrating their Paragon additions into point releases. I'd really recommend watching all their livestreams related to Paragon.

Yeah I watched all the stuff they have streamed so far about rendering (assuming we are talking about the stream where Brian Karis came out and talked about the shading model with the lead art guy). That was awesome.
 

pottuvoi

Banned
Yeah I watched all the stuff they have streamed so far about rendering (assuming we are talking about the stream where Brian Karis came out and talked about the shading model with the lead art guy). That was awesome.
Yup, their Paragon livestreams have been great.
Characters & Hair etc.

Foliage & POM.
The foliage part is the fun/gorgeous one.
POM is old and really starting to show its age; hoping to see some alternatives someday (tracing into distance fields/brickmaps?).
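For context on why POM shows its age: it is just a fixed-step linear march through a heightfield in tangent space, so steps become visible and grazing angles alias. A minimal sketch of that classic loop; height() is a stand-in for a heightmap fetch, and sign conventions vary by engine.

```cpp
// The classic parallax occlusion mapping loop: fixed linear steps through a
// heightfield along the view ray in tangent space. A sketch; height() is a
// stand-in for a heightmap texture fetch.
float height(float, float) { return 0.5f; }  // stand-in heightmap sample (depth)

// viewX/Y/Z = normalized tangent-space view direction (z towards the viewer).
// Writes the parallax-shifted UV. The fixed step count is exactly why POM
// shows stair-stepping and aliasing at grazing angles.
void parallaxOcclusionUV(float u, float v,
                         float viewX, float viewY, float viewZ,
                         float heightScale, float* outU, float* outV) {
    const int kSteps = 32;
    const float layerStep = 1.0f / kSteps;
    float du = -viewX / viewZ * heightScale / kSteps;  // UV shift per layer
    float dv = -viewY / viewZ * heightScale / kSteps;
    float layer = 0.0f;
    float h = height(u, v);
    while (layer < h) {          // march until the ray sinks below the surface
        u += du; v += dv;
        layer += layerStep;
        h = height(u, v);
    }
    *outU = u; *outV = v;
}
```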
 

onQ123

Member
Looks like normal rasterised graphics to me. I think the "volume" you are seeing is just the PS4 detecting the scene depth from the controller and separating the background image from that. The shadow and alien look like a normal shadow map and a polygon model to me.

I'm talking about the Alien. You can see through him, you can see when he ingests objects into his body, and so on.
 

AmyS

Member
Did you guys see these new ray tracing demos running on PowerVR Wizard GPUs (GR6500) from CES?

https://www.youtube.com/watch?v=psDRzQw3v0o

'Spanish Steps' / car interior demo running on a desktop PCI system with four PowerVR Wizard GPUs. Another demo of objects with a lot of dynamic geometry running on one PowerVR Wizard GPU.

Pretty impressive.

f0HIQZc.jpg


http://blog.imgtec.com/powervr-developers/real-time-ray-tracing-on-powervr-gr6500-ces-2016

http://www.theinquirer.net/inquirer...-creator-ci40-and-powervr-ray-tracing-eyes-on

The firm showed us real-time ray tracing, a form of 3D rendering where the rays of light in an image are followed along their path as they interact with and reflect from materials to create realistic images. This was done on a PowerVR GR6500 mobile GPU, comparing it with the same technology that requires a huge desktop-class Nvidia GPU to obtain the same, impressive results.
Voica told us that the technology is likely to show up first in gaming consoles, along with desktop machines, laptops, tablets and smartphones, and we suspect that it could end up in future Apple devices, given that the firm’s PowerVR GPU drives the processors used in them. Exact availability details remain unknown.
 

onQ123

Member
Did you guys see these new ray tracing demos running on PowerVR Wizard GPUs (GR6500) from CES?

https://www.youtube.com/watch?v=psDRzQw3v0o

'Spanish Steps' / car interior demo running on a desktop PCI system with four PowerVR Wizard GPUs. Another demo of objects with a lot of dynamic geometry running on one PowerVR Wizard GPU.

Pretty impressive.

f0HIQZc.jpg


http://blog.imgtec.com/powervr-developers/real-time-ray-tracing-on-powervr-gr6500-ces-2016

http://www.theinquirer.net/inquirer...-creator-ci40-and-powervr-ray-tracing-eyes-on

Voica told us that the technology is likely to show up first in gaming consoles,

NX?
 
Did you guys see these new ray tracing demos running on PowerVR Wizard GPUs (GR6500) from CES?

https://www.youtube.com/watch?v=psDRzQw3v0o

'Spanish Steps' / car interior demo running on a desktop PCI system with four PowerVR Wizard GPUs. Another demo of objects with a lot of dynamic geometry running on one PowerVR Wizard GPU.

Pretty impressive.

f0HIQZc.jpg


http://blog.imgtec.com/powervr-developers/real-time-ray-tracing-on-powervr-gr6500-ces-2016

http://www.theinquirer.net/inquirer...-creator-ci40-and-powervr-ray-tracing-eyes-on

I'd love to be corrected by someone more knowledgeable than me, but I think we're still pretty far from seeing full-scene ray tracing. The first two demos were completely static, which is very telling. Basically, having a static scene/model means being able to precalc a whole lot of stuff and optimize things so the ray tracing algorithm can take a bunch of shortcuts.

Then the one dynamic demo they showed off had a paltry number of meshes in it, none of them with any bones or deformations to speak of, and it was still chugging. You can see it in the recording struggling to hit an even framerate for what would be an incredibly simple scene in a normal renderer.
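To expand on the "precalc" point: for a static scene you can build an acceleration structure (typically a BVH) once, offline, and every ray then skips most of the geometry through cheap box tests like the one below. Deforming meshes force you to rebuild or refit that structure every frame, which is where the chugging comes from. Toy code for flavour, not Imagination's implementation.

```cpp
// The "precalc" for a static scene is largely building an acceleration
// structure (e.g. a BVH) once, offline, so each ray can reject most of the
// scene early. Toy slab test for one bounding box; a BVH is a tree of these,
// and a miss culls everything inside the node.
#include <algorithm>
#include <utility>

struct AABB { float min[3], max[3]; };

// Standard slab test: ray origin o, precomputed 1/direction in invD.
bool rayHitsBox(const AABB& b, const float o[3], const float invD[3]) {
    float tmin = 0.0f, tmax = 1e30f;
    for (int i = 0; i < 3; ++i) {
        float t0 = (b.min[i] - o[i]) * invD[i];
        float t1 = (b.max[i] - o[i]) * invD[i];
        if (t0 > t1) std::swap(t0, t1);
        tmin = std::max(tmin, t0);  // ray must be inside all three slabs...
        tmax = std::min(tmax, t1);  // ...over a shared interval [tmin, tmax]
    }
    // Dynamic/deforming geometry invalidates the tree above this test every
    // frame; that rebuild/refit cost is what static demos get to skip.
    return tmin <= tmax;
}
```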
 
Full scene ray tracing is still a ways off, yes, but I think with another large GPU performance bump or two we could see ray tracing being used for specific effects.
Still praying every day that those Nvidia ray-traced shadows make it into games. I'd quad SLI for that.
 

Kezen

Banned
Full scene ray tracing is still a ways off, yes, but I think with another large GPU performance bump or two we could see ray tracing being used for specific effects.
Still praying every day that those Nvidia ray-traced shadows make it into games. I'd quad SLI for that.

Unreal Engine 4 has support for ray traced shadows. We might see them in high-end games sometime in the future.
https://docs.unrealengine.com/latest/INT/Engine/Rendering/LightingAndShadows/RayTracedDistanceFieldShadowing/index.html
 

Jux

Member
Unreal Engine 4 has support for ray traced shadows. We might see them in high-end games sometime in the future.
https://docs.unrealengine.com/latest/INT/Engine/Rendering/LightingAndShadows/RayTracedDistanceFieldShadowing/index.html

There is a difference between ray tracing "something" and ray tracing the whole scene. There has been ray tracing in 3D engines for a very long time, just not fully ray traced scenes. For example, screen space reflections are a kind of ray tracing; collision detection too.
Moreover, what Unreal 4 has is shadows ray traced through a distance field. You can't actually render materials with that. It's a really small subset of what you can do with ray tracing (even though it's pretty clever and works very well, though still at a cost that is prohibitive on consoles).

Full scene ray tracing is still far off.
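For the curious, the core of the distance-field shadowing Jux describes fits in a few lines: sphere trace from the surface towards the light, and let the closest near-miss set the penumbra. A sketch of the general idea; sceneSDF() stands in for UE4's precomputed mesh distance fields, and this is not Epic's actual code.

```cpp
// Core of distance-field shadowing: sphere trace from the shaded point
// towards the light; a hit means full shadow, and the closest near-miss
// sets the penumbra width.
#include <algorithm>

float sceneSDF(float, float, float) { return 1.0f; }  // stand-in: empty scene

float softShadow(const float p[3], const float toLight[3],
                 float maxDist, float k /* larger k = harder edge */) {
    float shadow = 1.0f;
    float t = 0.05f;                           // offset to avoid self-shadowing
    while (t < maxDist) {
        float d = sceneSDF(p[0] + toLight[0] * t,
                           p[1] + toLight[1] * t,
                           p[2] + toLight[2] * t);
        if (d < 1e-4f) return 0.0f;            // ray hit geometry: full shadow
        shadow = std::min(shadow, k * d / t);  // near miss: partial shadow
        t += d;                                // safe step: sphere tracing
    }
    return shadow;                             // 1 = fully lit
}
```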
 

onQ123

Member
I'd love to be corrected by someone more knowledgeable than me, but I think we're still pretty far from seeing full-scene ray tracing. The first two demos were completely static, which is very telling. Basically, having a static scene/model means being able to precalc a whole lot of stuff and optimize things so the ray tracing algorithm can take a bunch of shortcuts.

Then the one dynamic demo they showed off had a paltry number of meshes in it, none of them with any bones or deformations to speak of, and it was still chugging. You can see it in the recording struggling to hit an even framerate for what would be an incredibly simple scene in a normal renderer.

I think the point is to add the RTU as a co-processor and have it work with the normal GPU, handling the things that the ray tracing unit is better at than the GPU.
 
Fantastic thread with a lot of great posts already. Subbed.

Thx. It should get even more structured in time hopefully! I just wish more people would post :D
---
GDC 2016 talks are starting to pop up, and they are already looking pretty interesting:
10:05-11:00: Practical DirectX 12 - Programming Model and Hardware Capabilities

This session will do a deep dive into all the latest performance advice on how to best drive DirectX 12, including work submission, render state management, resource bindings, memory management, synchronization, multi-GPU, swap chains and the new hardware capabilities.

Speakers: Gareth Thomas - AMD & Alex Dunn - NVIDIA

11:20-11:50: Rendering Hitman with DirectX 12

This talk will give a brief overview of how the Hitman Renderer works, followed by a deep dive into how we manage everything with DirectX 12, including Pipeline State Objects, Root Signatures, Resources, Command Queues and Multithreading.

Speaker: Jonas Meyer - IO Interactive

12:00-12:30: Developing The Northlight Engine: Lessons Learned

Northlight is Remedy Entertainment's in-house game engine which powers Quantum Break. In this presentation we discuss how various rendering performance and efficiency issues were solved with DirectX, and suggest design guidelines for modern graphics API usage.

Speaker: Ville Timonen - Remedy Games

12:30-13:20: Lunch

13:20-14:20: Culling at the Speed of Light in Tiled Shading and How to Manage Explicit Multi-GPU

This session will cover a new technique for binning and culling tiled lights, that makes use of the rasterizer for free coarse culling and early depth testing for fast work rejection. In addition we'll cover techniques that leverage the power of explicit multi-GPU programming.

Speakers: Dmitry Zhdan - NVIDIA & Juha Sjoholm - NVIDIA

14:40-15:40: Object Space Rendering in DirectX 12

While forward and deferred rendering have made huge advancements over the last decade, there are still key rendering issues that are difficult to address. Among them, are arbitrary material layering, decoupling shading rate from rasterization, and shader anti-aliasing. Object space lighting is a technique inspired by film rendering techniques like REYES. By reversing the process and shading as early as possible and not in rasterization space, we can achieve arbitrary material layering, shader anti-aliasing, decoupled shading rates, and many more effects, all in real-time.

Speaker: Dan Baker - Oxide Games

16:00-17:00: Advanced Techniques and Optimization of HDR Color Pipelines

The session explores advanced techniques for HDR color pipelines both in the context of the new age of wide gamut HDR displays and with application to existing display technology. Presenting a detailed look at options for optimization and quality at various stages of the pipeline from eye-adaption, color-grading, and tone-mapping, through film grain and final quantization.

Speaker: Timothy Lottes - AMD
 