
DirectX 12 GPU exclusive features to be shown/announced at GDC

hipbabboom

Huh? What did I say? Did I screw up again? :(
That's a lot more arrows. Which I guess means it's better / easier to relay larger amounts of commands between the CPU and GPU? That would help explain the supposed big performance increases DX12 will provide.
FTFY.

DX12 lets you control the command queue of the GPU
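In practice, "controlling the command queue" looks something like the sketch below, going by the D3D12 API as it eventually shipped (device creation, error handling and fence synchronization are left out, and device is assumed to be an existing ID3D12Device*): the application creates the queue, records command lists itself, potentially on many threads, and decides when to submit them.

#include <windows.h>
#include <d3d12.h>

// Sketch only: HRESULT checks and fence/sync code omitted for brevity.
void RecordAndSubmit(ID3D12Device* device)
{
    // The app owns the GPU's command queue instead of going through an
    // implicit immediate context as in D3D11.
    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ID3D12CommandQueue* queue = nullptr;
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

    // Command lists are recorded by the app (this part can be spread across
    // many CPU threads, which is where the CPU-side wins come from).
    ID3D12CommandAllocator* allocator = nullptr;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT, IID_PPV_ARGS(&allocator));
    ID3D12GraphicsCommandList* cmdList = nullptr;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT, allocator, nullptr, IID_PPV_ARGS(&cmdList));
    // ... record draws / dispatches / copies here ...
    cmdList->Close();

    // The app decides when the recorded work actually reaches the GPU.
    ID3D12CommandList* lists[] = { cmdList };
    queue->ExecuteCommandLists(1, lists);
}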
 

RayMaker

Banned
So mid 2016 is when XBO's potential will be unleashed and blow away PCs and PS4?

Well, I've waited this long so what's another year.

To be honest, the most I'm expecting are framerate improvements in open-world games like Assassin's Creed and GTA5; if AC: Unity had been developed on DX12, it might have been a locked 30fps.
 

FordGTGuy

Banned
Explain why that is so, please.

Much like the fabled D-GPU, Move engines, double read write of ESRAM, people hoping for hardware features being hidden until a proper unveiling by MS are being delusional.

Why in the hell would Microsoft be coy about something like this? Why would they hide hardware features? Seriously, why?

We are honestly bringing up D-GPU? Why are you trying to turn this into a strawman argument?

The Move engines and double read/write ESRAM are not fabled; they are actually part of the real hardware.

They aren't being coy, Phil Spencer said it does support DX12 and I don't consider that being coy.
 
FTFY.

DX12 lets you control the command queue of the GPU

Yeah I'm not sure exactly how all that works. I'm a PC gamer and know a decent amount about the tech itself but really nothing at all in terms of how this helps devs.

To anyone privy to this stuff: do you think MS's claims of big performance increases with DX12 have any weight? Not just on Xbone but in the PC market?
 

jmga

Member
Microsoft are the ones building DX12 and unlike Nvidia they knew what the final product would be all along.

Because it isn't 2013 hardware it is only based on 2013 hardware.

So why hide these future hardware features from developers, even in your own SDK documentation?
 

FordGTGuy

Banned
Yeah I'm not sure exactly how all that works. I'm a PC gamer and know a decent amount about the tech itself but really nothing at all in terms of how this helps devs.

To anyone privy to this stuff: do you think MS's claims of big performance increases with DX12 have any weight? Not just on Xbone but in the PC market?

On the PC market... yes.

On Xbox One Phil Spencer specifically said it will not show a "massive" improvement.

So why hide these future hardware features from developers, even in your own SDK documentation?

When I say they knew all along, I mean from when they initially sat down to figure out their plans for DX12. Nvidia without a doubt knows now what is needed, as they are currently designing hardware for it.

DX12 isn't finished yet and isn't even in beta yet.
 
W
The Move engines and double read/write ESRAM are not fabled; they are actually part of the real hardware.

They aren't being coy, Phil Spencer said it does support DX12 and I don't consider that being coy.

1. Microsoft talked about it having something like 200GB/s, right? Actual real-world answers and logic turned that into more like 120 or so... and only for stuff going through ESRAM.

2. Supporting DX12 does not mean having DX12 hardware features. If no AMD cards developed and designed before, during, or after the xb1 have DX12 hardware support, why on earth would the xb1? You are completely overestimating the amount of GPU internal design that occurs at a place like MS. They "shop around" for parts and then try to put tiny inputs into the thing's development (like the small stuff Cerny did). They do not design these GPUs; do you know how massive a task it is to design a modern GPU? Especially entirely new feature sets for a GPU?

Everything we know about the xb1 design says it was not about power, not about features, not about performance, but about cost cutting and manufacturing control.
Because that SDK didn't support DX12 yet?

They already have a hardware specific version of DX on the xb1.
 

UnrealEck

Member
I'm wondering: if Xbox One has full DirectX 12 feature support and AMD's 200 series didn't, why didn't it? If the Xbox One was in development with DX12 support being included, why would AMD's 200 cards not have had it? Or nVidia's Maxwell? I guess maybe a deal could have been done between AMD and Microsoft, leaving nVidia out for their Kepler and Maxwell cards, but that'd be a bit of a stretch I think.
 
Are you actually implying that DX12 features somehow equal more powerful hardware?

Give me a break, why don't you actually do some research on the understanding of what this does before implying something so obviously wrong.

I didn't see a single person in here implying that the XOne would somehow become more powerful than PC/PS4 because of full DX12 compatibility.

Is that honestly what is making people skeptical about this, that they want to defend the hardware power advantage of the PS4/PC? I can assure you that this isn't going to magically unlock some hidden power.



Microsoft are the ones building DX12 and unlike Nvidia they knew what the final product would be all along.

Because it isn't 2013 hardware it is only based on 2013 hardware.

No, even in this case a bottom-end desktop GPU with DX12 support will beat the Xbox One with remarkable ease.

DX12 will absolutely not turn MS's console into something it is not; it's low-tier hardware that won't achieve miracles when games stress it even more.

I was being sarcastic. I expect nothing to change in 2016 except MisterX's story.
 

FordGTGuy

Banned
1. Microsoft talked about it having something like 200GB/s, right? Actual real-world answers and logic turned that into more like 120 or so... and only for stuff going through ESRAM.

2. Supporting DX12 does not mean having DX12 hardware features. If no AMD cards developed and designed before, during, or after the xb1 have DX12 hardware support, why on earth would the xb1? You are completely overestimating the amount of GPU internal design that occurs at a place like MS. They "shop around" for parts and then try to put tiny inputs into the thing's development (like the small stuff Cerny did). They do not design these GPUs; do you know how massive a task it is to design a modern GPU? Especially entirely new feature sets for a GPU?

Everything we know about the xb1 design says it was not about power, not about features, not about performance, but about cost cutting and manufacturing control.

They already have a hardware specific version of DX on the xb1.

In everything there are theoretical maximums.

A car engine's dyno figure is the theoretical maximum for the car's performance, but real-world performance completely varies. Same goes with this technology: just because theoretical data doesn't match real performance doesn't make it mythical.

You completely missed every article and breakdown on the Xbox One SoC design. They put heavy design work into the Xbox One, and they also started building their own hardware chips at the same time. (All the chips and hardware inside the Kinect are designed in-house by Microsoft.)

Xbox One already uses DX12 features.

If it was built for the idea of cost cutting and manufacturing control it wouldn't be the largest SoC ever mass-produced.

I'm wondering: if Xbox One has full DirectX 12 feature support and AMD's 200 series didn't, why didn't it? If the Xbox One was in development with DX12 support being included, why would AMD's 200 cards not have had it? Or nVidia's Maxwell? I guess maybe a deal could have been done between AMD and Microsoft, leaving nVidia out for their Kepler and Maxwell cards, but that'd be a bit of a stretch I think.

Microsoft designed the Xbox One SoC based on AMD parts.
 
If it was built for the idea of cost cutting and manufacturing control it wouldn't be the largest SoC ever mass-produced.

It is the largest SoC ever built because ESRAM takes up ridiculous amounts of die space. Its size is a direct result of them being cheap / wanting to control its production. Its die space / performance dividend is hilariously low. The ESRAM taking up massive die space is only there because they cheaped out (DDR3) and wanted to control shrink production in the future (no EDRAM daughter die!).

A car engine's dyno figure is the theoretical maximum for the car's performance, but real-world performance completely varies. Same goes with this technology: just because theoretical data doesn't match real performance doesn't make it mythical.
The original figure given was mythical and basically impossible for all intents and purposes in game development. It was just thrown out there to obfuscate the power differential between the PS4 and xb1. Same with the whole "5 billion transistors" number.

EDIT: look at the god damn space all that "oh shit we fucked up" ESRAM takes up:
diecomparison.jpg
 

pottuvoi

Banned
It works in a similar fashion to DX11, which offers compatibility modes for DX9c and DX10 class cards (Shader Model 4, SM4 and SM5).

DX11.3 will allow some of the features; DX12 should allow more (similar to Mantle and GNM).
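For reference, the DX11-style compatibility mechanism being described works roughly like this in code: you hand D3D11CreateDevice a list of feature levels and the driver grants the highest one the card can actually do. A minimal sketch with no error handling:

#include <windows.h>
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

void CreateDeviceWithFallback()
{
    // Highest level first; DX10- and DX9c-class cards simply land on a
    // lower level instead of failing outright.
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0,   // SM5 hardware
        D3D_FEATURE_LEVEL_10_1,   // SM4.x hardware
        D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3     // DX9c-class hardware
    };
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL granted = D3D_FEATURE_LEVEL_9_1;

    D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                      wanted, ARRAYSIZE(wanted), D3D11_SDK_VERSION,
                      &device, &granted, &context);

    // 'granted' now tells you which compatibility mode you are running in.
}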
 

roytheone

Member
So if I understand this correctly, my GTX 970 will still get the performance improvements from DX12, but it is possible I will miss out on some other new features in DX12? The performance improvements are the thing that interests me the most, so I am OK if that is the case (plus, developers will not utilize those specific DX12 features a lot if only a handful of PC gamers have capable cards).
 

FordGTGuy

Banned
It is the largest SoC ever built because ESRAM takes up ridiculous amounts of die space. Its size is a direct result of them being cheap / wanting to control its production. Its die space / performance dividend is hilariously low. The ESRAM taking up massive die space is only there because they cheaped out (DDR3) and wanted to control shrink production in the future (no EDRAM daughter die!).

The original figure given was mythical and basically impossible for all intents and purposes in game development. It was just thrown out there to obfuscate the power differential between the PS4 and xb1. Same with the whole "5 billion transistors" number.

You mean like every mythical theoretical maximum talked about in every field of technology?

I don't understand why we got here from talking about DX12 support, but I think it has something to do with trying to derail the thread by saying Microsoft is too cheap to even support their own software on the hardware they put heavy design into...

Or trying to imply that some DX12 features somehow equal a mythical performance boost.

So if I understand this correctly, my GTX 970 will still get the performance improvements from DX12, but it is possible I will miss out on some other new features in DX12? The performance improvements are the thing that interests me the most, so I am OK if that is the case (plus, developers will not utilize those specific DX12 features a lot if only a handful of PC gamers have capable cards).

You will get the performance improvements, but it is up in the air whether you will get every single feature that DX12 will include.
 
It is not apparent yet if it supports ALL DX12 hardware GPU features. In fact, this question was super hardcore dodged on the PC Perspective stream yesterday.

So as not to lower sales of existing cards.

I've experienced this type of marketing doublespeak from Nvidia in the past and been burned by it, so until they say flat out "We support the new hardware features in DX12" I wouldn't believe them, or any other hardware maker for that matter.

I was looking into building a new PC in 2007, and Nvidia told enthusiast websites that the 680 series would support the upcoming Intel manufacturing process, 45nm I believe. I ended up buying a 680 series mobo and a dual-core 65nm chip, expecting to upgrade to a 45nm quad core a few years down the road. Lo and behold, when the 45nm chips came out, only Core 2 Duo dual-core processors were supported, which they unsurprisingly failed to mention to enthusiast websites. That chipset also had a BIOS update that would stealthily down-clock one of the buses as a result of Nvidia losing a patent lawsuit.

Wasn't sad to see them get out of the chipset business.

dontbelievetheirlies.jpg
 

tuxfool

Banned
Explain why that is so, please.

Much like the fabled D-GPU, Move engines, double read write of ESRAM, people hoping for hardware features being hidden until a proper unveiling by MS are being delusional.

Why in the hell would Microsoft be coy about something like this? Why would they hide hardware features? Seriously, why?

Heck, even interviews with devs about dx12 have only mentioned its CPU savings and nothing about unlocking hidden hardware stuff.

I thought it was well known that the gpu in xb1 was GCN 1.1 or maybe 1.0. The ps4 definitely seems 1.1. I've seen no mention of the XB1 supporting things like the volatile bit.
 
You mean like every mythical theoretical maximum talked about in every field of technology?

I don't understand why we got here from talking about DX12 support, but I think it has something to do with trying to derail the thread by saying Microsoft is too cheap to even support their own software on the hardware they put heavy design into...

Or trying to imply that some DX12 features somehow equal a mythical performance boost.
We got here because I mentioned that the idea that "the xb1 has hidden hardware GPU features unlocked by dx12" reminds me of "dgpu in the powerbrick" and other crazy stuff that I have heard regarding the xb1.

You argue that they were developing DX12 whilst developing the xb1, and that therefore the xb1 probably has DX12 GPU hardware features. They will then surprise devs, the public, and everyone by unveiling and unlocking this hardware through a DX12 update.

To me, this sounds like a DGPU being unlocked. It really does.
I thought it was well known that the gpu in xb1 was GCN 1.1 or maybe 1.0. The ps4 definitely seems 1.1. I've seen no mention of the XB1 supporting things like the volatile bit.

I am pretty sure Bonaire is GCN 1.1.
 

Cse

Banned
buyers of 970/980 are going to be pissed.

I would imagine the new Maxwell cards would allow one to get the full benefits of DX12. I mean, NVIDIA and Microsoft talk to one another, no? I have a hard time believing that the engineers at Microsoft that have been working on DX12 for the past ~2 years never once communicated their intentions or shared their software with NVIDIA.

If not...then would we have to wait for Maxwell's successor to see cards that take full advantage of DX12, or could the eventual 14nm FinFET Maxwell cards accomplish this?
 

tuxfool

Banned
I would imagine the new Maxwell cards would allow one to get the full benefits of DX12. I mean, NVIDIA and Microsoft talk to one another, no? I have a hard time believing that the engineers at Microsoft that have been working on DX12 for the past ~2 years never once communicated their intentions or shared their software with NVIDIA.

If not...then would we have to wait for Maxwell's successor to see cards that take full advantage of DX12, or could the eventual 14nm FinFET Maxwell cards accomplish this?

DX12 is not finalized and requires input from all hardware vendors. However, hardware requires far more dev time and validation than software. This means the hardware has to be finalized far earlier than the software, so it is highly possible it will not support all DX12 feature levels (which is why feature levels exist).
 

pottuvoi

Banned
Unlikely as 970/980 doesn't support even DX11.2 and that doesn't stop anyone from buying them.
.. really?
It supports full DX11.3 and so on. (It is Nvidia's tock, in Intel-roadmap terms.)

The only question is how low-level DX12 will go: will it support everything each new GPU allows, or will there be feature levels similar to DX11, except per vendor?
I would certainly love to see things like interpolator access in the pixel shader become available on most/all cards.
 

tuxfool

Banned
I am pretty sure Bonaire is GCN 1.1.

GCN 1.1 (ish?). The problem is that AMD never defines what they consider the various GCN versions. The numbers are assigned by the press in order to roughly describe the hardware iteration.

For example: Bonaire is 1.1, but the 'volatile bit' feature was put in the PS4 at Sony's request (it is implied that they even drew up the specification). The R9 290X, another GPU considered to be GCN 1.1, is known to contain this feature.

So the question remains: which features are necessary for a GCN version to be defined? The XB1 does not contain TrueAudio, which is another feature often considered to be part of 1.1.

On the other hand, Bonaire was released in March 2013, so the timeline could fit; presumably it was developed in parallel with the PS4 GPU (despite having separate teams).
 

pottuvoi

Banned
Hopefully this:

HZY7ZCn.jpg


Fully supports all DX12 / Direct3D12 features.

As well as AMD's equivalent (R9-3XX).
It should, like all current Maxwell2 GPUs.

Nvidia hasn't really added features between really big upgrades, and the cycle is years. (It seems like Fermi was their previous big upgrade in terms of features.)
 

jmga

Member
We can deduce there are at least 3 levels of DX 12 support.

- Level 1: GCN 1.X, Fermi, Kepler and 1st gen Maxwell. (feature level 11.0-1-2?)

- Level 2: 2nd gen Maxwell (and GCN 2.0)? (feature level 11.3?)

- Level 3: Pascal and GCN 3.0? (feature level 12.0?)

I suppose level 3 will be full support and level 2 is Maxwell partial support.
 
We can deduce there are at least 3 levels of DX 12 support.

- Level 1: GCN 1.X, Fermi, Kepler and 1st gen Maxwell. (feature level 11.0-1-2?)

First-gen Maxwell has at least 4 DX12 features (the OIT stuff, conservative rasterization, etc...)
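Which is also why, whatever the marketing tier is called, an engine ends up asking the device about each cap individually. Something along these lines, assuming the D3D12_FEATURE_D3D12_OPTIONS query as it appears in the released D3D12 headers (device is an existing ID3D12Device*):

#include <windows.h>
#include <d3d12.h>

// Sketch: query per-feature caps rather than inferring them from the GPU's name.
void CheckOptionalFeatures(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                              &options, sizeof(options))))
    {
        // Rasterizer ordered views (the "OIT stuff") and conservative
        // rasterization are reported per device, not per API version.
        const bool hasROVs = options.ROVsSupported != FALSE;
        const bool hasConservativeRaster =
            options.ConservativeRasterizationTier !=
            D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED;
        // ... choose render paths accordingly ...
        (void)hasROVs; (void)hasConservativeRaster;
    }
}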
 

dr_rus

Member
.. really?
It supports full DX11.3 and so on. (It is Nvidia's tock, in Intel-roadmap terms.)
Really.

8cdDKjH.png


11.3 is DX12 features on the DX11 "thick" API. So far there is no indication that support for 11.3 will mean support for 11.2; some features in 11.2 are optional.
Also, if it supports "full DX11.3" then it supports full DX12 as well, since these are the same things feature-wise.

The only question is how low-level DX12 will go: will it support everything each new GPU allows, or will there be feature levels similar to DX11, except per vendor?
I would certainly love to see things like interpolator access in the pixel shader become available on most/all cards.

Of course there will be feature levels; how else would all of NV's DX11+ chips support D3D12?
 

tuxfool

Banned
First gen maxwell has at least 4 DX12 features (the OIT stuff, the conservative rasterization, etc...)

Is MS going to define a feature level for a single card? Correct me if I'm wrong, but the first version of Maxwell only appeared in the 750 Ti. Unless of course DX12 starts doing it the OpenGL way.
 

pottuvoi

Banned
Really.

8cdDKjH.png


11.3 is DX12 features on the DX11 "thick" API. So far there is no indication that support for 11.3 will mean support for 11.2; some features in 11.2 are optional.
Also, if it supports "full DX11.3" then it supports full DX12 as well, since these are the same things feature-wise.
The DX11.3 and DX11.2 thing would be interesting indeed.
DX12 will also most likely be expanded in a similar fashion in the future.
Of course there will be feature levels; how else would all of NV's DX11+ chips support D3D12?
Of course there are feature levels, but will there be one for each vendor?
I.e. Nvidia, AMD and Intel each having their own path.
 

Chobel

Member
Put me in team "xb1 GPU doesn't have these exclusive features".

EDIT: I have a 970 but I'm not worried, because I doubt these new features are anything groundbreaking, and they can probably be easily emulated on Maxwell cards.
 

dr_rus

Member
Of course there are feature levels, but will there be one for each vendor?
I.e. Nvidia, AMD and Intel each having their own path.

That will completely defeat the purpose of a common API and force developers to code for each GPU architecture separately.

A feature level is a level of support which isn't tied to any vendor and is actually open to be supported for any vendor at any time.

If I had to guess, I'd say that we're looking at four feature levels in DX12: d3d11, d3d11_1, d3d11_2 and d3d11_3.

Fermi, Kepler and Maxwell1 are d3d11.
GCN 1.0 is d3d11_1.
GCN 1.1 and GCN 1.2 are d3d11_2.
Maxwell2 is d3d11_3.
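If a mapping like that holds, an app would just ask the device where it lands. A sketch, assuming the feature-level query that ended up in the released D3D12 headers (which named the top levels 12_0 and 12_1 rather than reusing the 11_x names); device is an existing ID3D12Device*:

#include <windows.h>
#include <d3d12.h>

// Returns the highest feature level the device reports from the requested set.
D3D_FEATURE_LEVEL QueryMaxFeatureLevel(ID3D12Device* device)
{
    static const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_12_1,
        D3D_FEATURE_LEVEL_12_0,
        D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_11_0,
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels = ARRAYSIZE(requested);
    levels.pFeatureLevelsRequested = requested;

    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                           &levels, sizeof(levels))))
    {
        return D3D_FEATURE_LEVEL_11_0;  // the minimum any D3D12 device supports
    }
    return levels.MaxSupportedFeatureLevel;
}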
 

AmyS

Member
When I buy a graphics card, I'll just make damn sure beforehand that it supports DirectX feature level 12 / Direct3D12, and not have to wonder if it's only got 11.2 or 11.3 in hardware.
 

Conduit

Banned
https://twitter.com/xboxp3/status/446873135594225664

"DX12 will have impact on XBOX One games written for DX12. Some DX12 features are already in XBOX One but full DX12 coming."

Xbox One is getting full DX12; Phil isn't going to lie about that when people will see for themselves later this year. That would just be stupid on his part. Plus, he hasn't lied or even spun words yet, so why would he about that? He's the one that said TR exclusivity had a duration, flat-out shot down misterxmedia, etc. He's not in the game to talk crap even if it does make the Xbox look better.

Hmmmm, very interesting. I wonder if it will introduce any real-world performance boosts, since in the brief DX12 demo they showed, they said it can supposedly offer substantial performance increases.




Well, 3 months later :

G9YxBVP.png


FTyMN1I.png
 

Naminator

Banned
Are you actually implying that DX12 features somehow equal more powerful hardware?

Give me a break, why don't you actually do some research on the understanding of what this does before implying something so obviously wrong.

I didn't see a single person in here implying that the XOne would somehow become more powerful than PC/PS4 because of full DX12 compatibility.

Is that honestly what is making people skeptical about this, that they want to defend the hardware power advantage of the PS4/PC? I can assure you that this isn't going to magically unlock some hidden power.

I think misterX said somewhere that 2016 will be the year when the hidden magic juices of the Xbox One that are steeping right now will finally be unleashed.

It's a not-so-clever little gotcha question, trying to make it seem like you're spreading misterX's BS around.
 

UnrealEck

Member
Have fun waiting if you are trying to be on the cutting edge of PC tech. A gimped 970 runs pretty much every game better than current home consoles.

I can't think of any game that doesn't.

Why doesn't Xbox One have full support for these DX12 features now? Spencer says they're coming, but I'm wondering why they weren't already in. If they're coming because they weren't finished until recently, post-Xbox One hardware finalisation, wouldn't that mean these additions are software-based? If so, why wouldn't GPUs like Maxwell have support for the same features?
 

pottuvoi

Banned
That will completely defeat the purpose of a common API and force developers to code for each GPU architecture separately.
It would make it harder, yes.
A feature level is a level of support which isn't tied to any vendor and is actually open to be supported for any vendor at any time.
DX11 was already similar, as DX11.2 was only supported by AMD and Intel.
Developers had to write code for each vendor if they wanted to take advantage of features and/or avoid pitfalls.

DX12 is a 'low level' API and when going low enough there will be more differences between GPUs.
If I had to guess, I'd say that we're looking at four feature levels in DX12: d3d11, d3d11_1, d3d11_2 and d3d11_3.

Fermi, Kepler and Maxwell1 are d3d11.
GCN 1.0 is d3d11_1.
GCN 1.1 and GCN 1.2 are d3d11_2.
Maxwell2 is d3d11_3.
Agreed, this is likely.

Cannot wait to see what the next GPUs from AMD and Intel have for us. :D
 
Oh well, I am kind of disappointed by all this. I was reading this article (http://blogs.nvidia.com/blog/2015/01/21/windows-10-nvidia-dx12/) last night and came across this: "We're more than ready. GPUs built on our Maxwell GPU architecture – such as our recently released GeForce GTX 970 and GeForce GTX 980 – fully support DX12." Only to find out now that it's only partial support. I am sad now.

It is marketing talk, as I said before: "fully support" vs "support full".

It is still BS though obviously. I would expect GM200 or GM210 (whenever that exists) to be fullllllllll dx12 though.
 