
Anandtech: The DirectX 12 Performance Preview

http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm

CPU utilization:

Speaking of batch submission, if we look at Star Swarm’s statistics we can find out just what’s going on with batch submission. The results are nothing short of incredible, particularly in the case of AMD. Batch submission time is down from dozens of milliseconds or more to just 3-5ms for our fastest cards, an improvement just short of a whole order of magnitude. For all practical purposes the need to spend CPU time to submit batches has been eliminated entirely, with upwards of 120K draw calls being submitted in a handful of milliseconds. It is this optimization that is at the core of Star Swarm’s DirectX 12 performance improvements, and going forward it could potentially benefit many other games as well.

Another metric we can look at is actual CPU usage as reported by the OS, as shown above. In this case CPU usage more or less perfectly matches our earlier expectations: with DirectX 11 both the GTX 980 and R9 290X show very uneven usage with 1-2 cores doing the bulk of the work, whereas with DirectX 12 CPU usage is spread out evenly over all 4 CPU cores.
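The parallel-submission behavior the article describes can be sketched as a toy model in Python. This is not real Direct3D code; the function names (`record_command_list`, `submit_frame`) are illustrative stand-ins for the DX12 pattern of recording per-thread command lists independently and then submitting them all at once, which is why CPU usage spreads evenly over the cores.

```python
from concurrent.futures import ThreadPoolExecutor

def record_command_list(draw_call_ids):
    # Stand-in for recording draws into a per-thread command list
    # (think ID3D12GraphicsCommandList; no real API is called here).
    return [("draw", i) for i in draw_call_ids]

def submit_frame(total_draws, workers=4):
    # Split the frame's draw calls across worker threads, DX12-style:
    # each thread records its own command list independently...
    chunks = [range(w, total_draws, workers) for w in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        command_lists = list(pool.map(record_command_list, chunks))
    # ...and the main thread hands them all to the GPU queue in one
    # cheap call (the analogue of ExecuteCommandLists).
    return command_lists

lists = submit_frame(120_000)
print(len(lists), sum(len(l) for l in lists))  # 4 command lists, 120000 draws
```

Under DX11 the equivalent work is serialized in the driver on one or two threads, which is the uneven core usage the charts show.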

[AnandTech charts: batch submission time and CPU usage]


Plenty more in the article. Looking good so far. RIP Mantle?
 

Kezen

Banned
It's going to be tough for AMD to keep Mantle relevant when DX12 ships.

Impressive results compared to DirectX 11 for both lower-level APIs, but I'm really curious to know how much more expensive it is to take advantage of them compared to the thick DX11 layer.
 

Kezen

Banned
Meanwhile this setup also highlights the fact that under DirectX 11, there is a massive difference in performance between AMD and NVIDIA. In both cases we are completely CPU bound, with AMD’s drivers only able to deliver 1/3rd the performance of NVIDIA’s

Well this is interesting.
 

Nzyme32

Member
Cool. Looking forward to seeing if the new OpenGL / glNext API can keep up or surpass DX12, particularly when the same demo is likely to be shown.

But it would be better to see actual game performance from a new game built around DX12; benchmarks of things like Star Swarm can be misleading.

Off topic, but I miss Anand Lal Shimpi's discussions on such stuff.
 

Krejlooc

Banned
Day after day, I love my gtx 980 even more.

Edit: being more versed in OpenGL and having worked with it most of my professional life, I too am more excited to see what glNext has to offer. But I like those results for DX12.
 
Coming from a place of ignorance here: will these improvements be an across-the-board thing, or just for games designed with DX12 in mind? And does the card need to have DX12 support for these performance boosts to even work?
 

Nzyme32

Member
It's just one game though. We shouldn't jump to conclusions yet.
(But yeah, it seems likely it could go that way)

It's hardly a game, more of a benchmark, and comparisons to other benchmarks often give really different results. However, Star Swarm is good at demonstrating bottlenecking differences - it was specifically made to demo Mantle, so it makes for a good comparison.
 

epmode

Member
Coming from a place of ignorance here: will these improvements be an across-the-board thing, or just for games designed with DX12 in mind? And does the card need to have DX12 support for these performance boosts to even work?

Yes, the game needs explicit DX12 support. The neat thing is that DX12 works on all modern video cards, not just AMD cards, as is de facto the case with Mantle.
 

Kezen

Banned
I wonder what PC will be needed to match consoles with DX12 considering it narrows the efficiency gap significantly between PC and consoles.
 
I think there should rightly be a discussion about the implications of this and the future of the console gaming industry. Long has the reality been that consoles perform better because of less overhead and writing "to da metal". Well, it's looking like PCs no longer need more horsepower to get console-equivalent performance.

PC gaming -- which was already affordable -- becomes maybe 30%-50% less expensive if these numbers translate to real games. While offering eternal backwards compatibility with games and peripherals, free online play, less expensive games and emulation. In addition to doubling as a computer that you can actually get shit done with.

DX12 might mean you can build a desktop rig with $350 worth of parts that gives you 1080p/60 in everything you might want to play. Hell, 2015/2016 Intel Integrated GPUs might be able to produce the same results, which means most laptops $500+ would be able to outperform consoles handily. I would think that everyone except Nintendo and 1st party studios would sit up and take note. That's a lot of built-in market.
 

Elsolar

Member
What's with all these GPU comparisons? GPU performance doesn't mean shit, DX12 is all about CPU optimization.
 

LordOfChaos

Member
Holy moly, looks promising! I'll give it a more thorough read in a bit.

Seems like it would benefit APU type systems most though (as we've known for a while), what with weak CPU cores paired with decent graphics. But it will be interesting to see what this can do across the whole range of hardware.

That star swarm demo, I keep wishing a real game was something like that.
 
I wonder what PC will be needed to match consoles with DX12 considering it narrows the efficiency gap significantly between PC and consoles.
None. DX12 only allows PC CPUs to catch up with console CPU efficiency. There is still no PC GPU with 8GB RAM like the PS4.
 
Good news for PC gamers with newer hardware. I am holding off on building my X99 platform until HBM, Broadwell-E or Skylake, and a DDR4 price drop.
 

Kezen

Banned
None. DX12 only allows PC CPUs to catch up with console CPU efficiency. There is still no PC GPU with 8GB RAM like the PS4.

It's getting old, you know. And it won't be quite as efficient as console APIs, which need no abstraction, but the gap should narrow enough for the difference not to matter.

I mean, a Core i3 already gives 6-7 Jaguar cores benefiting from a much more efficient API a run for their money; imagine with DX12.

By the way, you might want to look at the 8GB R9 290X. ;)
http://www.sapphiretech.com/presentation/product/product_index.aspx?cid=1&gid=3&sgid=1227&pid=2394&psn=000101&lid=1
 

Kinthalis

Banned
I wonder what PC will be needed to match consoles with DX12 considering it narrows the efficiency gap significantly between PC and consoles.

It should lower the CPU overhead tremendously, but GPU performance will be similar to what it is today, except when it comes to compute, and maybe a few other hardware-accelerated things DX12 is doing.

So the big takeaway is that even very low-end PC CPUs should now be able to keep up with a PS4... although that's already true now, with an i3 and 750 Ti providing comparable performance...
 

mkenyon

Banned
I think there should rightly be a discussion about the implications of this and the future of the console gaming industry. Long has the reality been that consoles perform better because of less overhead and writing "to da metal". Well, it's looking like PCs no longer need more horsepower to get console-equivalent performance.

PC gaming -- which was already affordable -- becomes maybe 30%-50% less expensive if these numbers translate to real games. While offering eternal backwards compatibility with games and peripherals, free online play, less expensive games and emulation. In addition to doubling as a computer that you can actually get shit done with.

DX12 might mean you can build a desktop rig with $350 worth of parts that gives you 1080p/60 in everything you might want to play. Hell, 2015/2016 Intel Integrated GPUs might be able to produce the same results, which means most laptops $500+ would be able to outperform consoles handily. I would think that everyone except Nintendo and 1st party studios would sit up and take note. That's a lot of built-in market.
This isn't accurate, because GPUs are still largely the main bottlenecks in gaming.

What this means is that crap CPUs won't be as limiting in games as they are now, moving the bottleneck almost exclusively to the GPU. Not much more than that.

$350 in parts is still a crap system that can't run stuff at 1080p/60.

*edit*

And people, Horse Armour is trolling.
 

LordOfChaos

Member
On DX11, Nvidia does way better than AMD on the star swarm test, less driver overhead. Interesting.

Mantle does edge out DX12, but by a negligible amount, so it does seem that it will be doomed. DOOOMED.

I'm keenly interested in seeing GLNext and how it compares.
 

Nzyme32

Member
I think there should rightly be a discussion about the implications of this and the future of the console gaming industry. Long has the reality been that consoles perform better because of less overhead and writing "to da metal". Well, it's looking like PCs no longer need more horsepower to get console-equivalent performance.

PC gaming -- which was already affordable -- becomes maybe 30%-50% less expensive if these numbers translate to real games. While offering eternal backwards compatibility with games and peripherals, free online play, less expensive games and emulation. In addition to doubling as a computer that you can actually get shit done with.

DX12 might mean you can build a desktop rig with $350 worth of parts that gives you 1080p/60 in everything you might want to play. Hell, 2015/2016 Intel Integrated GPUs might be able to produce the same results, which means most laptops $500+ would be able to outperform consoles handily. I would think that everyone except Nintendo and 1st party studios would sit up and take note. That's a lot of built-in market.

This is only a demo that pushes large amounts of draw calls and demonstrates a very particular circumstance where low level APIs such as DX12 can make the biggest impact vs DX11. It is promising, sure.

"That said, any time we’re looking at an early preview it’s important to keep our expectations in check, and that is especially the case with DirectX 12. Star Swarm is a best case scenario and designed to be a best case scenario; it isn’t so much a measure of real world performance as it is technological potential."

Seeing more demonstrations, particularly of DX12 / glNext developed games / utilised engines will be a lot more interesting to see.
 
You guys seem to not understand what an absolute huge deal this is.

CPU bound games such as Arma 3, World of Warcraft and many other titles would benefit greatly from this. RTS games, anything with a ton of units on screen etc.

Take for instance Diablo 3: even on my 2500K @ 4.6 I get some major frame dips into the 20s when there are hundreds of units on screen blowing up. I imagine DX12 would be a huge improvement in situations like that, because I can see one or two cores spiking on my CPU but my GPU usage staying in the low 30s (290X).
 

Kayant

Member
That's great news; now show me actual game benchmarks and we'll see how worthwhile this will be. Outside of Fable Legends, have any games been formally announced to be using DX12 (PC)?

Edit -

What's with all these GPU comparisons? GPU performance doesn't mean shit, DX12 is all about CPU optimization.

Click the article maybe ;)

Looks like going with 4 cores will be perfectly fine :)

[AnandTech charts: DX12 CPU core scaling]
 
I think there should rightly be a discussion about the implications of this and the future of the console gaming industry. Long has the reality been that consoles perform better because of less overhead and writing "to da metal". Well, it's looking like PCs no longer need more horsepower to get console-equivalent performance.

PC gaming -- which was already affordable -- becomes maybe 30%-50% less expensive if these numbers translate to real games. While offering eternal backwards compatibility with games and peripherals, free online play, less expensive games and emulation. In addition to doubling as a computer that you can actually get shit done with.

DX12 might mean you can build a desktop rig with $350 worth of parts that gives you 1080p/60 in everything you might want to play. Hell, 2015/2016 Intel Integrated GPUs might be able to produce the same results, which means most laptops $500+ would be able to outperform consoles handily. I would think that everyone except Nintendo and 1st party studios would sit up and take note. That's a lot of built-in market.

Except that the gains aren't going to be even remotely near those 30%-50% in the real world, and you still can get more out of a single, known platform than a generic one, DX or not.

Take this "consoles are deaaad" stuff back to reddit, please.
 