
Xbox Dev Demonstrates NVIDIA GeForce RTX 2080 Ti & Xbox Series X Mesh Shaders Performance With DirectX 12 Ultimate API

mesh_shader_slide.jpg



Earlier this week, Microsoft announced its latest DirectX 12 Ultimate API, which aims to give developers a unified platform for next-generation graphics on PC and consoles. One of the key features of the announcement was the addition of Mesh Shaders to the DX12 framework, and Martin Fuller, Principal Engineer at Microsoft's Xbox ATG (Advanced Technologies Group), has showcased how the new technique helps devs deliver higher graphics throughput in next-gen games.

DirectX 12 Ultimate API's Mesh Shaders Tested With NVIDIA GeForce RTX 2080 Ti & Xbox Series X - Huge Performance Gains On PCs & Consoles
Martin explains that only two platforms currently support DirectX 12 Ultimate Mesh Shaders: the NVIDIA Turing GPU lineup and the Xbox Series X with AMD RDNA 2. As a quick recap of what Mesh Shaders are and what they do: NVIDIA introduced Mesh Shaders with its Turing GPU architecture in 2018 as a means to dramatically improve performance and image quality when rendering scenes with a substantial number of very complex objects.

Take, for instance, a very complex, triangle-heavy mesh: Mesh Shaders essentially segment it into smaller meshlets, each of which ideally maximizes vertex re-use within it. Using the new hardware stages and this segmentation scheme, devs can render more geometry in parallel while fetching less data overall. An in-depth look at Mesh Shaders can be found on NVIDIA's and Microsoft's dev blogs.
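To make the segmentation concrete, here is a minimal, hypothetical C++ sketch of the data a meshlet build step typically produces; the struct names and the 128-vertex/126-triangle caps are illustrative conventions seen in public samples, not the actual DirectX 12 API:

```cpp
#include <cstdint>
#include <vector>

// One meshlet: a small, self-contained chunk of the original mesh.
struct Meshlet {
    uint32_t vertexOffset; // first entry in uniqueVertexIndices for this meshlet
    uint32_t vertexCount;  // unique vertices referenced (commonly capped at 128)
    uint32_t primOffset;   // first entry in primitiveIndices for this meshlet
    uint32_t primCount;    // triangles in this meshlet (commonly capped at 126)
};

// The segmented mesh: all meshlets plus the two index streams they reference.
// Vertex re-use is maximized because each meshlet indexes its own small,
// de-duplicated vertex list instead of the full vertex buffer.
struct MeshletMesh {
    std::vector<Meshlet>  meshlets;
    std::vector<uint32_t> uniqueVertexIndices; // into the original vertex buffer
    std::vector<uint32_t> primitiveIndices;    // 3 meshlet-local indices per triangle
};
```

Each mesh shader threadgroup then processes one meshlet, which is what lets the GPU work on many of them in parallel.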
meshlets_comparison.png


Mesh Shaders also bring the full power of generalized GPU compute to the geometry pipeline, allowing developers to build more dynamic worlds than before without compromising on performance. They enable advanced culling techniques, LOD (Level of Detail) selection, and far more procedural topology generation in a scene. An impressive demo by NVIDIA, known as Asteroids, was published a while back and can be seen below.
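As an example of the kind of culling this enables, a common technique is normal-cone (backface) culling: each meshlet stores an aggregate cone of its triangle normals, and the whole meshlet is rejected when the entire cone faces away from the camera. Below is a minimal CPU-side sketch in C++, purely for illustration; in a real renderer this test runs on the GPU (in HLSL), and the types here are assumptions, not the demo's actual code:

```cpp
#include <cmath>

struct Float3 { float x, y, z; };

static float dot(Float3 a, Float3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

static Float3 normalized(Float3 v) {
    float len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// Aggregate cone of a meshlet's triangle normals: an apex point
// (e.g. its bounding-sphere center), an axis, and cos(cutoff angle).
struct NormalCone {
    Float3 apex;
    Float3 axis;
    float  cosCutoff;
};

// True if every triangle in the meshlet is provably back-facing from
// cameraPos, so the whole meshlet can be skipped with one dot product.
bool meshletIsBackfacing(const NormalCone& cone, Float3 cameraPos) {
    Float3 view = normalized({cone.apex.x - cameraPos.x,
                              cone.apex.y - cameraPos.y,
                              cone.apex.z - cameraPos.z});
    return dot(view, cone.axis) >= cone.cosCutoff;
}
```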



The DirectX 12 Mesh Shader demo shown by Martin has the NVIDIA GeForce RTX 2080 Ti running on Windows 10 at 1440p while the Xbox Series X devkit runs at 4K. The demo includes five rooms showcasing various techniques. A normal pass-through renders the 4K scene on Xbox Series X in around 100 microseconds, which drops to just 55 microseconds with meshlet sphere culling, a more advanced culling technique. The same holds true for the RTX 2080 Ti, which shows significant render-time drops with the advanced meshlet culling techniques. You can see the demo on the RTX 2080 Ti and Xbox Series X in the video below:
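For reference, the meshlet sphere culling mentioned above boils down to testing each meshlet's precomputed bounding sphere against the view frustum and discarding the whole meshlet before any of its triangles are fetched. A hedged C++ sketch of that test follows; the demo runs this on the GPU, and these types are illustrative:

```cpp
// A view frustum as 6 inward-facing planes, each (nx, ny, nz, d) with
// nx*x + ny*y + nz*z + d >= 0 for points inside the frustum.
struct Plane  { float nx, ny, nz, d; };
struct Sphere { float cx, cy, cz, radius; };

// True if the meshlet's bounding sphere lies entirely outside any plane,
// meaning every triangle inside it is off-screen and can be culled.
bool sphereCulled(const Sphere& s, const Plane frustum[6]) {
    for (int i = 0; i < 6; ++i) {
        float dist = frustum[i].nx * s.cx
                   + frustum[i].ny * s.cy
                   + frustum[i].nz * s.cz
                   + frustum[i].d;
        if (dist < -s.radius) return true; // entirely behind this plane
    }
    return false; // visible or intersecting: keep the meshlet
}
```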



There's also a performance breakdown of the older vertex shaders versus mesh shaders, and in both cases the render time is cut almost in half on both platforms. The following is the render-time breakdown using various mesh shader + advanced culling techniques.

Microsoft-DirectX-12-Mesh-Shaders-Culling_Performance_NVIDIA-GeForce-RTX-2080-Ti-Xbox-Series-X_2-740x416.png

Microsoft-DirectX-12-Mesh-Shaders-Culling_Performance_NVIDIA-GeForce-RTX-2080-Ti-Xbox-Series-X_1-740x416.png


It is also noteworthy that the RTX 2080 Ti renders the scene in about 40 microseconds using the regular pass-through method at 1440p, whereas the Xbox Series X takes around 100 microseconds at 4K. With meshlet culling enabled, however, the Xbox Series X delivers faster render times even at 4K than the NVIDIA GeForce RTX 2080 Ti's standard pass-through, which goes to show the benefit of the new Mesh Shaders in the DirectX 12 Ultimate API being embedded in Turing and RDNA 2 GPUs.
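As a rough sanity check on that cross-resolution comparison (back-of-envelope only, since geometry passes don't scale linearly with pixel count): 4K is 3840×2160 = 8,294,400 pixels versus 2560×1440 = 3,686,400 at 1440p, i.e. 2.25× as many. Scaled naively, the Series X's ~55 microseconds with meshlet culling at 4K would correspond to about 55 / 2.25 ≈ 24 microseconds at 1440p, ahead of the 2080 Ti's ~40 microsecond standard pass-through, which is the comparison being drawn here.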

 

Armorian

Banned
Turing architecture was ahead of its time with FL 12_2 support, but so was Maxwell with 12_1 in 2014, and I don't know if there are any DX12 games using its features :messenger_grinning_squinting:
 

ZywyPL

Banned
Is there a test with an SSD and an even faster SSD on PC + NVIDIA? What's the difference there? That would be interesting to see.

The difference is virtually none, because at some point games reach diminishing returns, where even a 2, 3, or 5 times faster SSD doesn't allow for any more detail, any bigger draw distance, any less pop-in, and so on. The developers asked for a fast SSD, and that's exactly what MS gave them, whereas Sony set themselves a benchmark of 5GB/s that probably nobody asked them for. I won't be surprised if the CPU/GPU hits the wall before any dev can actually utilize that bandwidth.
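To put rough numbers on it (using the commonly cited raw figures of ~2.4 GB/s for the Series X and ~5.5 GB/s for the PS5): at 30 fps a frame lasts ~33 ms, so the drives can stream roughly 2.4 × 0.033 ≈ 80 MB versus 5.5 × 0.033 ≈ 180 MB per frame. Whether that extra ~100 MB per frame ever shows up on screen depends on the CPU/GPU being able to decompress and consume it within that same 33 ms, which is exactly the wall I mean.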
 
The difference is virtually none, because at some point games reach diminishing returns, where even a 2, 3, or 5 times faster SSD doesn't allow for any more detail, any bigger draw distance, any less pop-in, and so on. The developers asked for a fast SSD, and that's exactly what MS gave them, whereas Sony set themselves a benchmark of 5GB/s that probably nobody asked them for. I won't be surprised if the CPU/GPU hits the wall before any dev can actually utilize that bandwidth.

I find it really odd that people seem to be painting the XsX storage as slow just because it’s slower than the PS5’s. It’s still ultra fast storage. It’s like saying “The Flash is faster than Superman, so that means Superman is slow.”

The XsX storage bandwidth isn’t going to hold anything back
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
Mesh Shaders and the Geometry Engine are among the most interesting developments in GPU tech for me... crazy programmability at the vertex/mesh level not seen since the days of the PS2’s VUs. Nice 👍.
 

Riven326

Banned
I find it really odd that people seem to be painting the XsX as slow just because it’s slower than the PS5’s. It’s still ultra fast storage. It’s like saying “The Flash is faster than Superman, so that means Superman is slow.”

The XsX storage bandwidth isn’t going to hold anything back
Yes, I feel the same way. It's as if SSDs have been slow this whole time and the PS5 will finally introduce the first fast SSD. That's how people are framing the discussion, and I think it's dishonest at best.
 

pawel86ck

Banned
I find it really odd that people seem to be painting the XsX storage as slow just because it’s slower than the PS5’s. It’s still ultra fast storage. It’s like saying “The Flash is faster than Superman, so that means Superman is slow.”

The XsX storage bandwidth isn’t going to hold anything back
Agreed. Not long ago, DealerGaming tested his 970 Evo Pro (3.5GB/s) against the XSX SSD, and his fast SSD was still over 3x slower, so it's not like the XSX SSD (or should I say the Velocity Architecture) is slow.
 

Bogroll

Likes moldy games
Is there a test with an SSD and an even faster SSD on PC + NVIDIA? What's the difference there? That would be interesting to see.
And also as a test: take two PCs, put a 5700 in the one with the faster SSD and a 5700 XT in the one with the slower SSD. It won't make the 5700 perform better.
Maybe take it further and give the faster-SSD machine 16GB of RAM and the other 8GB, and maybe it would still be the same result. Give the faster-SSD machine a 3900X and the slower one a 3700X, and maybe the slower-SSD machine would still come out on top.
The results would be interesting.
And I know this isn't the specs of the consoles.
 
Last edited:

Connxtion

Member
Mesh Shaders and the Geometry Engine are one of the most interesting developments in GPU tech for me... crazy programmability at the vertex/mesh level not seen since the PS2’s VU’s days. Nice 👍.
I was thinking the same. I guess Sony's Geometry Engine is the Vulkan version of mesh shaders.
 

longdi

Banned
Mesh shaders and VRS are wow, I did not see them coming.
Present games always seem to lack a bit of smoothness still. I guess too many resources are wasted on unseen portions of the scene.

With these two, and RT, we may reach near-CG quality at last.
Can't wait for RTX 3000 and Series X.
 
Last edited:

Goliathy

Banned
The difference is virtually none, because at some point games reach diminishing returns, where even a 2, 3, or 5 times faster SSD doesn't allow for any more detail, any bigger draw distance, any less pop-in, and so on. The developers asked for a fast SSD, and that's exactly what MS gave them, whereas Sony set themselves a benchmark of 5GB/s that probably nobody asked them for. I won't be surprised if the CPU/GPU hits the wall before any dev can actually utilize that bandwidth.

Thank you, sir. Very insightful.
 
Last edited:

llien

Member
Why does the text read as if it was written by a drunk guy from the NV PR department?
"No access to connectivity", seriously? I guess it's bad.

No apples-to-apples comparison of the XSeX AMD chip vs the 2080 Ti? Oh, OK. I guess because the former somehow beats it.
 

M1chl

Currently Gif and Meme Champion
Why does the text read as if it was written by a drunk guy from the NV PR department?
"No access to connectivity", seriously? I guess it's bad.

No apples-to-apples comparison of the XSeX AMD chip vs the 2080 Ti? Oh, OK. I guess because the former somehow beats it.
Just watch the video, I think it demostrates it way better.
 
Why does the text read as if it was written by a drunk guy from the NV PR department?
"No access to connectivity", seriously? I guess it's bad.

No apples-to-apples comparison of the XSeX AMD chip vs the 2080 Ti? Oh, OK. I guess because the former somehow beats it.
You guess wrong... as usual.
 

Panajev2001a

GAF's Pleasant Genius
I was thinking the same. Guess Sony’s Geometry Engine is the vulkan version of mesh shaders.
I hope for something quirkier, just to see what they do with it. It may be one of the features Cerny mentioned that Sony co-developed with AMD, which AMD will bring to Big Navi or later: initially proprietary to AMD with a Vulkan extension... then later DX12.2?!
 

Bernkastel

Ask me about my fanboy energy!
I hope for something quirkier, just to see what they do with it. It may be one of the features Cerny mentioned that they co-developed with Sony and AMD will bring in Big Navi or later and yes initially proprietary to AMD with a Vulkan extension... then later DX12.2?!
Sony does not use Vulkan. They use their own proprietary GNM/GNMX.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
Sony does not use Vulkan. They use their own proprietary GNM/GNMX.

Listen, re-read that sentence and you'll see I was referring to the feature being part of Big Navi potentially... their desktop chips, with AMD exposing it to devs using custom extensions.

Cerny said it clearly: “some of these features will appear in AMD’s desktop GPUs and APUs one day, but they are features we co-developed with them”, so I expect something beyond DX12 Mesh Shaders.
 
Last edited:

ethomaz

Banned
Others: PS5 doesn't do that like that.
DF bias: No. Xbox does everything better.
Others: Cerny said it works like this and that on PS5.
DF bias: No. Xbox does everything better.

Why even try? :messenger_tears_of_joy:

It will be fun when their articles start showing the PS5 versions of games having the edge... let's see how he spins that.
 
Last edited:

Caio

Member
The technologies that XSX will be incorporating will in theory make the XSX closer to 4X the XB1X in actual performance.
A true Generational leap from XB1X.

Yep, and this is insanely impressive considering the Xbox One X was released in late 2017. MS really have developed a monster console. It's Day One for me. I still can't believe all the specs and optimizations they put in the box. I'm seriously impressed. And the XSX should be compared to the base model if we're talking about the generational leap, and God, it is a hell of a generational leap, one of the biggest ever.
 
Sure. Tell me more about his bias...
Sure. His bias is towards clearing up the misconceptions and preconceived notions of forum fanboys by way of rational thought, backed up by knowledge and driven by a love of technology... not fanboy console warring. You're clearly held to a different standard, as one of the aforementioned... because you're nobody important. He's in the position he's in because of his knowledge. You know... the knowledge developers constantly praise on Twitter... the knowledge developers have tipped their hats to in GDC talks.

His opinion is backed up and informed by his knowledge of said subjects... often DIRECTLY informed by sources within the industry.

Yours is basically.... "bu but Cerny is really hyping it for a reason!!!"
 

ethomaz

Banned
Sure. His bias is towards clearing up the misconceptions and preconceived notions of forum fanboys by way of rational thought, backed up by knowledge and driven by a love of technology... not fanboy console warring. You're clearly held to a different standard, as one of the aforementioned... because you're nobody important. He's in the position he's in because of his knowledge. You know... the knowledge developers constantly praise on Twitter... the knowledge developers have tipped their hats to in GDC talks.

His opinion is backed up and informed by his knowledge of said subjects... often DIRECTLY informed by sources within the industry.

Yours is basically.... "bu but Cerny is really hyping it for a reason!!!"
So his bias is more influential than mine.
Sad and unprofessional.
 
Last edited:
Because he has a long history of being biased? What a shame, since Digital Foundry are the least biased mainstream journalistic source when it comes to topics like these.
Or maybe he is reporting the facts, and those facts don't make for good reading for PS5 peeps?
If you want unbiased info, you could always read what Jason Schreier has to say...
 