
Anandtech: The DirectX 12 Performance Preview

SapientWolf

Trucker Sexologist
Despite popular belief on GAF, the shitty AMD drivers statement has always been reality and not just an internet meme. This isn't the first time this has been proven to be the case.

DirectX 12 is shaping up to be everything it promised to be. A low end quad core CPU could end up being a good choice for a budget gaming rig.
Nvidia overhauled their drivers a while back as an answer to Mantle, and we're seeing the benefits in the benchmarks.

http://www.tomshardware.com/news/nvidia-driver-update-direct3d-optimization,26381.html

Doesn't necessarily mean AMD was rocking shitty drivers. I guess they relied too heavily on Mantle when they should have been focusing on incorporating the improvements into their DX11 drivers.
 

Kezen

Banned
I hope for lots of physics stuff everywhere.
It could allow better use of GPU compute as well, because, as I understand it, that's a better fit for GPUs nowadays.
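For the curious, here's a rough sketch (mine, not from the article) of why people expect that: D3D12 exposes separate engine queues, so compute work can be submitted on its own queue and overlap with graphics. The helper name is made up and error handling is omitted.

```cpp
// Rough sketch: a dedicated compute queue alongside the graphics queue, so compute
// work (physics, particle sims, etc.) can overlap with rendering.
// CreateQueues is a made-up helper; assumes a device already exists.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& gfxQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC gfx = {};
    gfx.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;   // graphics queue (can also run compute/copy)
    device->CreateCommandQueue(&gfx, IID_PPV_ARGS(&gfxQueue));

    D3D12_COMMAND_QUEUE_DESC comp = {};
    comp.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE; // dedicated compute-only queue
    device->CreateCommandQueue(&comp, IID_PPV_ARGS(&computeQueue));

    // Work submitted to computeQueue can run alongside gfxQueue; an ID3D12Fence is
    // used to synchronise wherever one queue's results feed the other.
}
```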

I'd have to agree with some of the other posts. DX12 is hardly "multiplatform" in the conventional sense, since it is Microsoft only. It's either DX12 on Xbox running Windows 10, DX12 on Windows 10 PCs, or DX12 on Windows 10 phones.
Meanwhile, something like the OpenGL family would be a better example of "multiplatform", functioning across mobile, web and desktop platforms, despite its lack of use on consoles and its decline for the majority of PC gaming. If the complete merging and rewrite of OpenGL & OpenGL ES in glNext turns out as pictured, your idea of DX12 being "multiplatform" is almost comical.
https://www.khronos.org/assets/uplo...4-siggraph-bof/OpenGL-Ecosystem-BOF_Aug14.pdf
(image: glnext_unitytauc4.png)
I don't know if DX12 works on Windows phone but surely that would actually give credence to DX12 as a multi platform solution. OpenGL will remain the API of choice for Linux and Mac but as far as PC gaming is concerned those platforms are not as popular as Windows.

Despite popular belief on GAF, the shitty AMD drivers statement has always been reality and not just an internet meme. This isn't the first time this has been proven to be the case.
DirectX 12 is shaping up to be everything it promised to be. A low end quad core CPU could end up being a good choice for a budget gaming rig.
Definitely, it will lower the price of entry for good PC performance. It's going to be even more interesting to compare PCs and consoles now, and see how they stack up across various CPUs and GPUs running DX12.

You may all thank AMD and Mantle for forcing Microsoft to make DirectX more of a "to the metal" API.
Nvidia and MS wanted that back in 2010.
Our work with Microsoft on DirectX 12 began more than four years ago with discussions about reducing resource overhead.
http://blogs.nvidia.com/blog/2014/03/20/directx-12/

So it was in development way before Mantle was announced. We don't owe DX12 being low level to AMD.
 
OpenGL will probably benefit emulators more than PC games, given how a lot of emulators are trying to support a wide variety of devices. For PC games, I think that most developers would stick with Mantle and/or DX12 unless they were going for broad device support.
 

flux1

Member
Real happy to see all the talk might actually lead to results and not just a bunch of smoke n mirrors.

Going to Nvidia from AMD improved performance for me just from lower CPU use by the drivers. Hopefully DX12 will improve even further on that once games start implementing it.
 

thelastword

Banned
Because of their inferior DX11 drivers compared to Nvidia.
So you're saying that the drivers were the main reason for the performance gap between AMD and NVIDIA. That bodes well for AMD, since a 290X is pretty much on par with a GTX 970 in terms of performance, sometimes even eclipsing it in many of the games I've seen benchmarked. It's cheaper and benches better at higher resolutions too, and I'm thinking they have the full 4GB as well. Win-win for AMD, it seems.

ArmA was already mentioned, and DX12 should make those kinds of games easier to make and to expand in scope even more. But let me guess, that doesn't count because it's not a cinematic 30fps corridor shooter with black bars.
So you're saying that ARMA is not possible on consoles? What exactly about ARMA is not possible? What is so revolutionary about it?
 

roytheone

Member
I currently have a very good GPU (GTX 970) but my CPU is getting old (i7 870). As I understand it, DirectX 12 will decrease my CPU bottleneck?
 

Seanspeed

Banned
I thought DX12 wasn't supposed to be a huge difference in performance. Or is that just XB1?
Right. It's a huge deal for PCs, as it essentially gives PCs the same sort of low-level efficiency advantage that consoles have always had. For the XB1, it will be minor API improvements at best.

I currently have a very good GPU (GTX 970) but my CPU is getting old (i7 870). As I understand it, DirectX 12 will decrease my CPU bottleneck?
Basically, yeah. If a game is developed with DX12, it should let it make fuller use of your CPU, hopefully raising your CPU performance ceiling.
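To make that concrete, here's a minimal sketch (mine, not from the Anandtech piece) of the core idea: in D3D12 each CPU thread can record its own command list and the whole batch is submitted in one cheap call, instead of funnelling every draw through one driver thread the way D3D11's immediate context does. The worker count is arbitrary and the actual draw recording is elided.

```cpp
// Minimal sketch of the D3D12 submission model: several CPU threads record their
// own command lists in parallel, then everything is submitted in one call.
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>
using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    D3D12_COMMAND_QUEUE_DESC qdesc = {}; // defaults to a direct (graphics) queue
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&qdesc, IID_PPV_ARGS(&queue));

    const int workers = 4; // e.g. one per core on a budget quad
    std::vector<ComPtr<ID3D12CommandAllocator>> allocators(workers);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workers);
    std::vector<std::thread> threads;

    for (int i = 0; i < workers; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr, IID_PPV_ARGS(&lists[i]));
        // Each worker records its own slice of the frame's draw calls.
        threads.emplace_back([&, i] {
            // ... set state and record draws for this thread's chunk here ...
            lists[i]->Close();
        });
    }
    for (auto& t : threads) t.join();

    // One submission of everything that was recorded in parallel.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```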
 

Nzyme32

Member
Need the same graph for "relevance" in videogames. It's impossible the numbers stay the same.

The context of the presentation isn't meant to illustrate numbers in any way. Even "relevance" is in quotations. It simply shows the ability of the API across platforms and use cases as "relevance"

OpenGL being multiplatform doesn't seem to be very relevant to what's happening right now. 2/3 of the platforms for high-end games are supporting DX12. Mobile gaming? On iOS, which is the primary platform for mobile game developers, there is Metal, and it's actually attracting good support among upcoming games, so that's the direction things are going there. OpenGL is nice and all, but being multiplatform doesn't seem to be doing any magic for it at the moment, for whatever reason.

Pretty much agree, but that isn't the discussion. It's simply the ability for the API to be multiplatform / platform agnostic.

I don't know if DX12 works on Windows phone but surely that would actually give credence to DX12 as a multi platform solution. OpenGL will remain the API of choice for Linux and Mac but as far as PC gaming is concerned those platforms are not as popular as Windows.

I simply mean I don't see phone / console / PC of a single vendor's platform as being "multiplatform". That only becomes relevant if a developer wants to be exclusively on Microsoft platforms; generally that is not the case, and they have to implement another solution for other platforms. I'm talking about the potential uses of an API and calling it multiplatform based on that, rather than "well, no one uses OpenGL/glNext to make games on PC", which isn't really changing anything.
 

Kezen

Banned
So you're saying that the drivers were the main reason for the performance gap between AMD and NVIDIA.
In CPU bound scenarios. Hence the gap between AMD DX11 and Mantle.
That bodes well for AMD, since a 290X is pretty much on par with a GTX 970 in terms of performance, sometimes even eclipsing it in many of the games I've seen benchmarked. It's cheaper and benches better at higher resolutions too, and I'm thinking they have the full 4GB as well. Win-win for AMD, it seems.
I think it's a solid card and a valid alternative to the 970. But power consumption is up.
If I had to switch to the red team this would be my choice.

 

martino

Member
The context of the presentation isn't meant to illustrate numbers in any way. Even "relevance" is in quotations. It simply shows the ability of the API across platforms and use cases as "relevance"

OK, still curious to see that "relevance" analysed with a more videogame-focused view.
But I can understand why they chose to only turn the open platform/standard angle into a graph instead.
 

derExperte

Member
So you're saying that ARMA is not possible on consoles? What exactly about ARMA is not possible? What is so revolutionary about it?

The scope and size. Console CPUs would melt and this time for real. A severely limited version could work but then we're not talking about the same experience anymore (which I would argue also goes for higher framerates and better image quality but let's not have that discussion here).

I don't know if DX12 works on Windows phone but surely that would actually give credence to DX12 as a multi platform solution. OpenGL will remain the API of choice for Linux and Mac but as far as PC gaming is concerned those platforms are not as popular as Windows.

http://blogs.unity3d.com/2015/01/22/staying-ahead-with-directx-12/

Direct3D 12 is expected to run on all Microsoft devices: mobiles, laptops, desktops and Xbox One, all of which Unity already supports.
 

Nzyme32

Member
OK, still curious to see that "relevance" analysed with a more videogame-focused view.
But I can understand why they chose to only turn the open platform/standard angle into a graph instead.

Well, again, that presentation was by Unity, explaining what the intentions of the glNext project were (since that presentation was the first brief of what Khronos intends), and why that is different to what OpenGL / ES is (i.e. actually breaking compatibility with the old, merging the two, and building from scratch a modernised API for the first time in years), hence a graph of potential relevance to developers.

Who knows if it is actually going to be as fully featured and powerful as DX12. Obviously, seeing games on either of them should be interesting. OGL desperately needs to start over if it is to actually become important to PC gaming again, but it would likely take years for glNext to make much of an impact vs DX12 in any case. Regardless, GDC should be very interesting since we should see examples.
 

Kezen

Banned
God damn. Can we blame AMD's drivers now? LoL.

As I've said AMD drivers are not "bad" when you are GPU bound but it seems even the "Omega" drivers (which are more efficient in CPU limited scenarios) fail to catch up to Nvidia's.

With DX12 this will probably not matter, considering the drivers will be much, much thinner and less complex. The hardware itself will make the difference, along with developer support.
We will see which of the red or green team pulls ahead in DX12 games.
 

LordOfChaos

Member
Yeah, I'm aware of Thunderbolt's potential, but the % loss in performance varies greatly from game to game (according to an earlier test from another site using Thunderbolt 1), so I wouldn't be really comfortable with doing that.

Alienware, on the other hand, created a port that's (apparently) the full 16 lanes of PCI-e. Alas, it's proprietary, and having to buy their notebooks to get it is a big no-no.

Perhaps by the time this rig of mine starts to let me down, the current Thunderbolt will be enough.



I'd still take it, by virtue of the fact that it exists in more products than Alienware's proprietary one, and likely will continue to as Intel integrates the controller into its chipsets.

And as you said, the loss was only in a few games, and that was over TB1 - TB2 has double the bandwidth in each direction, and products like my MacBook have two full-speed TB2 ports. If it could use both at once, there would be little besides compute that would leave you bandwidth-constrained.
 

martino

Member
As I've said AMD drivers are not "bad" when you are GPU bound but it seems even the "Omega" drivers (which are more efficient in CPU limited scenarios) fail to catch up to Nvidia's.

With DX12 this will probably not matter, considering the drivers will be much, much thinner and less complex. The hardware itself will make the difference, along with developer support.
We will see which of the red or green team pulls ahead in DX12 games.

I'm currently on the red team (I switch a lot) and I'm not at all satisfied with driver performance in games (especially at release).
It takes ages for AMD to release new drivers, versus Nvidia, which puts out game-specific ones for nearly every big release.
 

Renekton

Member
It's going to be tough for AMD to keep Mantle relevant when DX12 ships
At least AMD and DICE accomplished part of what they set out to do, draw attention to CPU utilization and low-level access. Before Mantle IIRC, nobody was talking about those seriously.
 

wachie

Member
It's going to be tough for AMD to keep Mantle relevant when DX12 ships.

Impressive results compared to DirectX 11 for both lower-level APIs, but I'm really curious to know how much more expensive taking advantage of them is compared to the thick DX11.
The motivation for AMD is to entice devs/studios to use Mantle to cross-port and still reap the benefits of low(er) layers. The whole idea behind it is to leverage their position in the console landscape.
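On the quoted "how much more expensive" question: a small taste (my sketch, nothing official) of the bookkeeping a thin API hands back to the developer. In D3D11 the driver tracked when the GPU was done with your per-frame resources; in D3D12 you fence and wait yourself. The helper name below is made up.

```cpp
// Sketch of the explicit CPU/GPU synchronisation a thin API requires: wait until
// the GPU has passed the signalled fence value before reusing that frame's memory.
#include <windows.h>
#include <d3d12.h>

void WaitForGpu(ID3D12CommandQueue* queue, ID3D12Fence* fence,
                UINT64& fenceValue, HANDLE fenceEvent)
{
    queue->Signal(fence, ++fenceValue);                      // mark the end of this frame's work
    if (fence->GetCompletedValue() < fenceValue) {           // GPU hasn't reached it yet?
        fence->SetEventOnCompletion(fenceValue, fenceEvent); // wake us when it does
        WaitForSingleObject(fenceEvent, INFINITE);
    }
    // Only now is it safe to Reset() command allocators and reuse per-frame memory.
}
```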
 

Kezen

Banned
I'm currently on the red team (I switch a lot) and I'm not at all satisfied with driver performance in games (especially at release).
It takes ages for AMD to release new drivers, versus Nvidia, which puts out game-specific ones for nearly every big release.

Well, I have been looking at various benchmarks in the latest games, and AMD and Nvidia are more or less evenly matched. AMD wins in some games and loses in others, and the same goes for Nvidia. I can't say one is definitely better than the other; in pure hardware terms there is little to separate the two. But of course this only accounts for performance and not overall stability and bugs. I don't have anything to complain about regarding Nvidia drivers, but before believing the grass is greener (ahem... sorry) on the other side, take a look at this:
https://forums.geforce.com/default/topic/805564/geforce-drivers/official-nvidia-347-25-whql-driver-feedback-thread/

Red team or green team, drivers are always potentially "posing problems" (literal translation from my native language, no idea if it's grammatically correct in English).

The motivation for AMD is to entice devs/studios to use Mantle to cross-port and still reap the benefits of low(er) layers. The whole idea behind it is to leverage their position in the console landscape.
And they would be foolish not to try that. It makes a great deal of sense. Technically Mantle could be considered a "standard" in the sense that DX12 is comparable to it.
Hmm... maybe that's stretching the meaning of the word, but you get my point.

However, it still raises the question: how are they going to entice devs to support Mantle when DX12 is around? Why would devs need to cross-port when they can already bring low-level efficiency to the PC landscape at large with MS's API? Only for the sake of exploiting some niche optimization tricks ultra-specific to GCN? Being relatively close to the APIs used on consoles will not be enough to secure Mantle's future, and the fact that it's available on W7 / 8.1 / W10 is not a very strong argument when W10 is free for W7 users and up for a whole year. Think about that for a second.

I'd like Mantle to remain with us, just to keep MS from falling asleep, but I'm not optimistic.
 
I'd still take it, by virtue of the fact that it exists in more products than Alienware's proprietary one, and likely will continue to as Intel integrates the controller into its chipsets.

And as you said, the loss was only in a few games, and that was over TB1 - TB2 has double the bandwidth in each direction, and products like my MacBook have two full-speed TB2 ports. If it could use both at once, there would be little besides compute that would leave you bandwidth-constrained.

Well, if there's a box that can use dual Thunderbolt, and each Thunderbolt port on a MBP has its own controller.... well...
 

Nzyme32

Member
Well, I have been looking at various benchmarks in the latest games, and AMD and Nvidia are more or less evenly matched. AMD wins in some games and loses in others, and the same goes for Nvidia. I can't say one is definitely better than the other; in pure hardware terms there is little to separate the two. But of course this only accounts for performance and not overall stability and bugs. I don't have anything to complain about regarding Nvidia drivers, but before believing the grass is greener (ahem... sorry) on the other side, take a look at this:
https://forums.geforce.com/default/topic/805564/geforce-drivers/official-nvidia-347-25-whql-driver-feedback-thread/

Red team or green team, drivers are always potentially "posing problems" (literal translation from my native language, no idea if it's grammatically correct in English).


And they would be foolish not to try that. It makes a great deal of sense. Technically Mantle could be considered a "standard" in the sense that DX12 is comparable to it.
Hmm... maybe that's stretching the meaning of the word, but you get my point.

However, it still raises the question: how are they going to entice devs to support Mantle when DX12 is around? Why would devs need to cross-port when they can already bring low-level efficiency to the PC landscape at large with MS's API? Only for the sake of exploiting some niche optimization tricks ultra-specific to GCN?

I'd like Mantle to remain with us, just to keep MS from falling asleep, but I'm not optimistic.

Does Mantle need to stay around at all or even be much of a concern to anyone? Since unfettered access to Mantle was given to Khronos by AMD for development of glNext, I'd assume they would be more engaged with their solution that incorporates that work, and less insistent on Mantle

Here in Brazil? I envy your wallet.

I'm curious, is there any way of circumventing the crazy pricing over there?
 

Nokterian

Member
You guys seem to not understand what an absolute huge deal this is.

CPU bound games such as Arma 3, World of Warcraft and many other titles would benefit greatly from this. RTS games, anything with a ton of units on screen etc.

Take for instance Diablo 3, even on my 2500k @ 4.6 I get some major frame dips into the 20s when there are 100's of units on screen blowing up. I imagine DX12 would be a huge improvement in situations like that because I can see one or two cores spiking up on my CPU but my GPU usage staying in the low 30's (290x)

Oh yes, I am so glad about that. I have an i7-2600K with all my cores unparked and the frame rate drops are still horrendous; this will be a huge boost for Diablo 3 and WoW as well.
 

Kezen

Banned
Does Mantle need to stay around at all or even be much of a concern to anyone? Since unfettered access to Mantle was given to Khronos by AMD for development of glNext, I'd assume they would be more engaged with their solution that incorporates that work, and less insistent on Mantle
It would be interesting to see Mantle alive and well, just so Microsoft doesn't sit on their ass. Just ponder the DX11 situation: it's been 5 years since it was released, and developers and IHVs have been pressuring MS to release a more lightweight API. Yet DX12 will only release this fall. What makes Mantle attractive from a consumer standpoint is that it competes directly with MS's DX, so they could feel the need to update DX12 more regularly.
OpenGL does not seem to be an option for many devs/pubs as far as Windows gaming is concerned.
 

Nzyme32

Member
It would be interesting to see Mantle alive and well, just so Microsoft doesn't sit on their ass. Just ponder the DX11 situation: it's been 5 years since it was released, and developers and IHVs have been pressuring MS to release a more lightweight API. Yet DX12 will only release this fall. What makes Mantle attractive from a consumer standpoint is that it competes directly with MS's DX, so they could feel the need to update DX12 more regularly.

I don't see the reasoning for Mantle being attractive to a consumer right now. Mantle games are few and far between, and developers are not moving towards it with DX12 around the corner and Mantle still tied to AMD. I don't even know what games are going to support it this year. Sure, it would be nice if you are an AMD user, but the supported games are not there.

OpenGL does not seem to be an option for many devs/pubs as far as Windows gaming is concerned.

Because OpenGL has been consistently beaten by DX and has become more incomprehensible with the cruft of the past. Actually getting a new implementation that is competitive with DX, vendor-agnostic, and with a better language and features may eventually change things in the long term, in a way that current OpenGL or even Mantle simply can't as it stands.

Would we be seeing this in DX12 if not for Mantle?

Probably not. Mantle was what lit a fire under their ass.
 

Kezen

Banned
I don't see the reasoning for Mantle being attractive to a consumer right now. Mantle games are few and far between, and developers are not moving towards it with DX12 around the corner and Mantle still tied to AMD. I don't even know what games are going to support it this year. Sure, it would be nice if you are an AMD user, but the supported games are not there.
It's attractive because it's competition, even if it's only compatible with GCN hardware. I agree with you that I don't see it gaining any more traction after DX12 is released, but I would certainly like it to. With regards to supported games, Mantle had more games in one year than DirectX 11 did. Not bad.

Because OpenGL has been consistently beaten by DX and has become more incomprehensible with the cruft of the past. Actually getting a new implementation that is competitive with DX, vendor-agnostic, and with a better language and features may eventually change things in the long term, in a way that current OpenGL or even Mantle simply can't as it stands.
Well I won't complain if it leads to a more competitive place. Otherwise chances are we will be stuck with DX12 until 2020.
 

AmyS

Member
Haven't skimmed through the entire thread yet, but from the Anandtech preview, this stands out:

Also absent for the moment is a definition for DirectX 12’s Feature Level 12_0 and DirectX 11’s 11_3. Separate from the low-level API itself, DirectX 12 and its high-level counterpart DirectX 11.3 will introduce new rendering features such as volume tiled resources and conservative rasterization. While all of the above listed video cards will support the DirectX 12 low-level API, only the very newest video cards will support FL 12_0, and consequently be fully DX12 compliant on both a feature and API basis. Like so many other aspects of DirectX 12, Microsoft is saving any discussion of feature levels for GDC, at which time we should find out what the final feature requirements will be and which (if any) current cards will fully support FL 12_0.

We don't know if Maxwell 2 cards (980, 970) fully support FL 12_0, and if it is eventually discovered that they do not, will GM200 have it?

The same question about AMD's forthcoming R9-300 series.

Or will we not see full FL 12_0 support until cards that come out well after DX12 ships, such as Nvidia's Pascal?

GDC should be very interesting.
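For reference, once the final SDK is out this is the sort of check anyone will be able to run themselves. A rough sketch (mine, not from the preview) that asks the device for its maximum feature level and the FL 12_0-class features the article mentions; the exact enum and struct names follow my reading of the D3D12 headers and could differ slightly in the final SDK.

```cpp
// Rough sketch: query what the installed card/driver reports once a DX12 device exists.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device))))
        return 1; // no DX12-capable adapter/driver present

    // Highest feature level the adapter supports.
    D3D_FEATURE_LEVEL requested[] = { D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
                                      D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1 };
    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels = sizeof(requested) / sizeof(requested[0]);
    levels.pFeatureLevelsRequested = requested;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS, &levels, sizeof(levels));

    // The FL 12_0-class rendering features mentioned in the preview.
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));

    std::printf("Max feature level: 0x%x\n", (unsigned)levels.MaxSupportedFeatureLevel);
    std::printf("Tiled resources tier: %d\n", (int)opts.TiledResourcesTier);
    std::printf("Conservative rasterization tier: %d\n", (int)opts.ConservativeRasterizationTier);
}
```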
 
Does Mantle need to stay around at all or even be much of a concern to anyone? Since unfettered access to Mantle was given to Khronos by AMD for development of glNext, I'd assume they would be more engaged with their solution that incorporates that work, and less insistent on Mantle

I'm curious, is there any way of circumventing the crazy pricing over there?

Mostly no - though computer parts aren't quite as overpriced as stuff like consoles and iDevices. Most people travel abroad and cross their fingers if they bring back more than $500 in goods (above that you might have to pay taxes on the stuff you brought).

Just so you have an idea - here's the rig I built early last year.

Intel i5-4670K CPU
MSI Z87M-G43 MB
Asus GTX 780 DCUII GPU
2x Kingston HyperX Blue 4GB 1333MHz memory
Seagate Barracuda ST1000DM003‎ HDD
Corsair GS600 PSU
Corsair Carbide 300R case

I paid roughly R$5000 for the above, which today stands at about US$1800.

Right now a GTX 980 would cost you something like R$2700 - that's about US$970.
 

jmga

Member
So you're saying that the drivers were the main reason for the performance gap between AMD and NVIDIA. That bodes well for AMD, since a 290X is pretty much on par with a GTX 970 in terms of performance, sometimes even eclipsing it in many of the games I've seen benchmarked. It's cheaper and benches better at higher resolutions too, and I'm thinking they have the full 4GB as well. Win-win for AMD, it seems.

Look at the specs, not at the price. A 290X should outperform a 980. AMD drivers are shit.
 
I wonder how much gain I would see on my potato Phenom II going from my 6870 to, say, a GTX 750 or some similarly performing card...
Anyone have a guess?

I'm almost always CPU bottlenecked.

I always thought it would be pointless to benchmark GPUs in any CPU-limited scenario, but it seems we actually need a page in GPU reviews dedicated to that if AMD is this shitbad in CPU-limited scenarios.
 

Cels

Member
You guys seem to not understand what an absolute huge deal this is.

CPU bound games such as Arma 3, World of Warcraft and many other titles would benefit greatly from this. RTS games, anything with a ton of units on screen etc.

Take for instance Diablo 3, even on my 2500k @ 4.6 I get some major frame dips into the 20s when there are 100's of units on screen blowing up. I imagine DX12 would be a huge improvement in situations like that because I can see one or two cores spiking up on my CPU but my GPU usage staying in the low 30's (290x)

Diablo 3 is not a good example at all; the game is horribly unoptimized on PC. I get significant FPS lag in 4-player games and my rig is just as good as yours (4690K @ 4.5, 280X).

It gets especially bad in Rakkis Crossing and the Cesspools. A lot of other people have the same problems with these two areas as well.
 