
What the fuck is the issue with DirectX 12?

64bitmodels

Reverse groomer.
DX11 performs much better in many of the games and applications that I use. Elden Ring, as great as it is, uses DX12 and runs like total shite despite not being that much more advanced visually than Sekiro, which runs far better on my machine and just so happens to use DX11. Same for Spider-Man Remastered and Miles Morales, which are just stutter central on my machine. In fact, the only Sony port I can think of without these major stutters is God of War... and that's the one using DX11.

What is it with this fucking API? On AMD hardware at least, it seems to be just inferior to Vulkan and DX11. It feels like they took a step back on efficiency and optimization in exchange for ray tracing and other announced features like SRS that no DX12 game has ever actually used. It feels like MS really shat the bed here with DX12: worse performance and stutters across the board in exchange for a feature most people aren't going to be able to use properly.

I feel like the stutter struggle with PC ports is caused entirely by this laggy, unoptimized piece of shit API. I hope MS gets their shit together and works to make it as optimized as possible, because the thought of seeing Returnal stutter thanks to DX12 sickens me to my stomach.
 

mhirano

Member
DX12 took the optimizations away from the drivers and put them on the engine.
So the GPU vendors can do little about bad performance: it’s on the game devs. And code optimization (with multiple hardware configurations) is HARD
 
I don't have any proof, BUT I think there is a massive conspiracy against PC gaming going on in the industry. Crypto was part of that conspiracy, but that con could only be sustained until every developer was using the same engine (Unreal).
Microsoft no longer has any interest in keeping PC gaming going with DX. Soon every developer is going to have to work their asses off optimizing for every card ever made, and that will KILL PC gaming off for good.
The only ones that will survive are the PC 'platform holders' (Steam Deck) that do the optimizing for the developers. OR, ya know, a console like Xbox/PS.
 

LordOfChaos

Member
wait, really? where did you get that info from?

Can confirm. Regular DX11 stacks significantly manipulate the draw calls coming out of games. They reorder commands, add/remove commands, replace shader kernels, even directly hotpatch the game's renderer. In essence they can pretty much rewrite the command queue, or even the game, at will, and they do. In the words of an ex-NVIDIA employee, pretty much every AAA game ships broken. Nvidia also put perhaps the most optimization into their DX11 codepaths, so AMD gained more in DX12 titles that do it right. Intel, in contrast, was so late to making GPUs that they have almost none of these DX11 optimizations, and their cards really only perform well on the newer APIs, now and going forward.

In contrast, the DX12/Vulkan drivers are thin shims over the hardware capabilities of the cards. They mostly just take whatever the game sends and run it.

This at least makes it clear where the blame lies. The older DX versions and OpenGL implementations were a clusterfuck of hacks to work around game bugs and performance issues; the newer low-level APIs mostly give a single party responsibility for optimization.
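The "optimizing driver versus thin shim" contrast above can be sketched in a few lines. This is a conceptual toy, not real driver code; the draw-call format and function names are invented. A DX11-style driver is free to reorder the game's submissions behind its back (here, grouping draws by shader to cut state changes), while a DX12-style driver executes exactly what was recorded:

```python
# Toy contrast: an "optimizing" DX11-style driver vs a thin DX12-style one.
# Everything here is hypothetical; no real API is being modeled.

def dx11_style_submit(draws):
    """Reorder draws to group identical shaders, the way an optimizing
    driver reduces state changes without the game knowing."""
    return sorted(draws, key=lambda d: d["shader"])

def dx12_style_submit(draws):
    """Thin shim: execute exactly what the game recorded, in order."""
    return list(draws)

def count_state_changes(draws):
    """Count shader switches, a rough proxy for submission cost."""
    changes, current = 0, None
    for d in draws:
        if d["shader"] != current:
            changes += 1
            current = d["shader"]
    return changes

game_draws = [
    {"shader": "water", "mesh": 1},
    {"shader": "rock",  "mesh": 2},
    {"shader": "water", "mesh": 3},
    {"shader": "rock",  "mesh": 4},
]

print(count_state_changes(dx11_style_submit(game_draws)))  # 2: driver grouped them
print(count_state_changes(dx12_style_submit(game_draws)))  # 4: the app's order is law
```

With the thin shim, a game that submits draws in a wasteful order pays for it directly, which is exactly why the optimization burden moves to the engine.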
 
Last edited:

64bitmodels

Reverse groomer.
In contrast, the DX12/Vulkan drivers are thin shims over the hardware capabilities of the cards. They mostly just take whatever the game sends and run it.
How do Vulkan games manage to run so much better, then? In my experience at least, Vulkan games typically have less stuttering and are more consistent. I haven't tried these Vulkan games with DirectX 12, however.
 


BeardGawd

Banned
Can confirm. Regular DX11 stacks significantly manipulate the draw calls coming out of games. They reorder commands, add/remove commands, replace shader kernels, even directly hotpatch the game's renderer. In essence they can pretty much rewrite the command queue, or even the game, at will, and they do. In the words of an ex-NVIDIA employee, pretty much every AAA game ships broken. Nvidia also put perhaps the most optimization into their DX11 codepaths, so AMD gained more in DX12 titles that do it right. Intel, in contrast, was so late to making GPUs that they have almost none of these DX11 optimizations, and their cards really only perform well on the newer APIs, now and going forward.

In contrast, the DX12/Vulkan drivers are thin shims over the hardware capabilities of the cards. They mostly just take whatever the game sends and run it.

This at least makes it clear where the blame lies. The older DX versions and OpenGL implementations were a clusterfuck of hacks to work around game bugs and performance issues; the newer low-level APIs mostly give a single party responsibility for optimization.
Oh wow so even seasoned PC developers were never coding correctly and will need additional training to get things right.
 

Crayon

Member
How do Vulkan games manage to run so much better, then? In my experience at least, Vulkan games typically have less stuttering and are more consistent. I haven't tried these Vulkan games with DirectX 12, however.

Either that's just you (I don't know, I use Linux so it's different, since everything gets pushed through Vulkan), or Vulkan is just better, or the Vulkan implementations are better. I'm leaning towards the latter: DX12 gets pushed by MS, while a studio like id electing to use Vulkan says something, given that's a studio known for top-tier technical ability.
 

PhoenixTank

Member
Shaders need to be precompiled (async or otherwise) with DX12 to avoid the #stutterstruggle. It's still a thing with DX11, but less of a shit show there - I'd be interested in knowing why.
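The precompile-or-stutter point above can be illustrated with a toy cache. This is a hedged sketch, not any real engine's API; `compile_shader`, `PipelineCache`, and the shader names are all invented. Compiling on first draw produces one hitch per shader, while precompiling during a load screen (even naively, on worker threads) hides every one of them:

```python
# Toy model of compile-on-first-use vs precompile-at-load.
from concurrent.futures import ThreadPoolExecutor

def compile_shader(source):
    # Stand-in for an expensive driver compile.
    return f"binary({source})"

class PipelineCache:
    def __init__(self):
        self.cache = {}
        self.hitches = 0  # frames that had to wait on a compile

    def precompile(self, sources):
        # Load-screen path: compile everything up front on worker threads.
        with ThreadPoolExecutor() as pool:
            for src, binary in zip(sources, pool.map(compile_shader, sources)):
                self.cache[src] = binary

    def draw(self, source):
        # Render path: a cache miss here is a visible stutter.
        if source not in self.cache:
            self.hitches += 1
            self.cache[source] = compile_shader(source)
        return self.cache[source]

shaders = ["skin", "water", "fog"]

lazy = PipelineCache()
for s in shaders:
    lazy.draw(s)
print(lazy.hitches)   # 3: every first use stuttered

eager = PipelineCache()
eager.precompile(shaders)
for s in shaders:
    eager.draw(s)
print(eager.hitches)  # 0: all compiles happened on the load screen
```

Both paths end up with the same cache; the only difference is whether the compile cost lands on a load screen or mid-gameplay.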
 

LordOfChaos

Member
Oh wow so even seasoned PC developers were never coding correctly and will need additional training to get things right.

Basically, the whole industry relied on Nvidia and AMD fixing a bunch of shit in post, and if they didn't, the GPU makers would be the ones blamed for shitty drivers. Intel never had those decades of fixing shit in drivers for the older DX versions, at least in the context of higher-end dedicated cards, so anything before 12 performs badly on their dGPU architecture.

DX12 reduces CPU overhead, but it more directly does what the game says. It's hypothetically more performant, but it needs the developers to put in the work to get there.

I can't really answer the question of why Vulkan may be faring better.
 

Guilty_AI

Member
From what I understand, DX12 was made with asynchronous compute in mind, leaving a lot of the task management in the hands of developers, whereas DX11 did this automatically but without using the machine's full potential.

The core of the issue here is that many devs ship games broken, and I honestly don't think this has much to do with DX12 itself. The broken and unoptimized games that sometimes release today would still be broken and unoptimized, with the lack of effort put into DX12's particularities just being another factor.

Try watching videos comparing DX11 vs DX12 in the same game: you'll notice the performance difference is minimal most of the time, and in a lot of them DX12 even has more stable frame pacing. The ones where there is a noticeable difference are usually already perfectly broken games like Saints Row (2022).

EDIT: Another example. It seems that in The Witcher 3 next-gen, DX12, despite having worse performance, actually offers some 'hidden' graphical advantages over DX11.

 

RoboFu

One of the green rats
It's a lower-level API, but it's also much harder to set up and optimize. Whereas before, it was much simpler to set up: the render functions had a generic, pre-optimized, hardcoded setup, but you had very little control over it. Those lower-level optimizations had to be done in the driver, so your driver wasn't just one thing; it had multiple variations that could be game-dependent.

With DX12 you can insert custom routines at almost any point of the render pipeline, dynamically change that pipeline, or allocate processing to the hardware at a much lower level than before.

Basically, for devs with the time, budget, and skill, you can do some great things and greatly optimize your render engine.

But on the flip side, if you don't have the budget or skill, you can screw your rendering engine up into something that's much harder to fix later.
 

Fredrik

Member
Shaders need to be precompiled (async or otherwise) with DX12 to avoid the #stutterstruggle. It's still a thing with DX11, but less of a shit show there - I'd be interested in knowing why.
I want to see this. Can you choose to run/install a game in DX11 over DX12 on the same hardware and with the same Windows version?
 

Buggy Loop

Member
It's someone's genius idea of taking a console's fixed-hardware, to-the-metal API approach, where shader caching is optimized to happen on the fly: since consoles are known quantities, it's very easy to compile on the fly without stutter.

I don't know who suggested this on PC. Maybe Microsoft saw the benefits on console and brought it over, or maybe AMD thought they would gain an upper hand with their architecture and so made Mantle, which then spawned DX12 and Vulkan.

But in the end, on PC it doesn't make any fucking sense. Every card and every driver will have a different shader cache to compile on the fly if devs are stupid and don't do it at the start of the game or during loading screens. It's a stupid fucking idea to do this on the fly.

Here's a super interesting discussion on the same subject for the Dolphin emulator, which ran into the same problems because of the GameCube's ATI TEV, which was really cool tech at the time. It details how complicated it is to emulate even such an old console's shaders on hardware that is much more powerful.

https://fr.dolphin-emu.org/blog/2017/07/30/ubershaders/?cr=fr

It’s a super fun read
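The "every card, every driver" complaint above is really a statement about cache keys. The sketch below is hypothetical (the key format and all names are invented): a compiled shader binary is only valid for one GPU + driver combination, which is why a console can ship its cache while every PC configuration starts cold:

```python
# Toy model: why a shader cache built for one configuration misses on all others.

def cache_key(gpu, driver_version, shader_hash):
    # A compiled binary is only reusable on the exact same GPU and driver.
    return (gpu, driver_version, shader_hash)

# A console is one known configuration, so shaders can ship precompiled.
console_cache = {}
for shader in ("terrain", "hero", "particles"):
    console_cache[cache_key("console_gpu", "1.0", shader)] = f"bin:{shader}"

def lookup(cache, gpu, driver, shader):
    return cache.get(cache_key(gpu, driver, shader))

print(lookup(console_cache, "console_gpu", "1.0", "terrain"))  # bin:terrain

# The same game on three different PCs: every configuration starts cold.
pcs = [("rtx_3080", "527.56"), ("rx_6800", "22.11.2"), ("arc_a770", "3959")]
misses = sum(
    lookup(console_cache, gpu, drv, "terrain") is None for gpu, drv in pcs
)
print(misses)  # 3: no PC can reuse the console's (or another PC's) cache
```

This is the structural reason shipping precompiled shaders works on console but not PC, independent of how lazy any particular dev is.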
 

nkarafo

Member
I have a pretty mediocre, nearly obsolete PC (i5 4670 / 1060 6GB) and I remember the two Resident Evil remasters having some stutters in certain places that I could reproduce 100% of the time, even on my VRR display. But after the recent DX12 patches, both games seem to run better for me; the stutters I used to have are gone. I played the whole RE3 remaster from start to finish at 60fps and I hardly remember any stutters at all.

Though maybe those games are the exception? I don't play that many newer games lately due to my aging hardware.
 

nkarafo

Member
Here's a super interesting discussion on the same subject for the Dolphin emulator, which ran into the same problems because of the GameCube's ATI TEV, which was really cool tech at the time. It details how complicated it is to emulate even such an old console's shaders on hardware that is much more powerful.

https://fr.dolphin-emu.org/blog/2017/07/30/ubershaders/?cr=fr

It’s a super fun read

The shader compilation issue in console emulation is a really weird one.

I was told shader compilation is mandatory if the console you are emulating demands it. Yet that's not true in the case of the N64 and the Dreamcast.

The two graphics plugins that matter for N64 emulators are GLideN64 and ParaLLEl-RDP. The first is HLE and faster; the latter is LLE and more accurate. But apparently GLideN64 requires shader compilation while ParaLLEl doesn't. It's far less noticeable with N64 games because the shaders are fewer and much smaller, so the stutters are more like a few microstutters here and there (though they become really bad if you use HD texture mods). But still, ParaLLEl-RDP doesn't do shader compiling at all.

I thought maybe it was because of the accuracy difference, but that theory goes out the window with the Dreamcast. DEMUL, which has the more accurate graphics plugin (one that doesn't even allow upscaling), has very severe shader compilation stutters. Seriously, it's so bad that I gave up on DC emulation altogether, despite the emulator being nearly 100% compatible. But in Flycast there is no shader compilation taking place, and that emulator used to be behind DEMUL in accuracy/compatibility, though now it should be on par.

So I don't understand how some emulators can completely bypass the "mandatory" shader compilation while others (like Dolphin, Cemu, etc.) can't, and have to use methods like ubershaders to make it tolerable, or have users download precompiled shaders from others.
 

twilo99

Member
Why would Microsoft trust game developers with so much on the performance optimization side? That's just a bad idea.

Is it possible they got some pushback from AMD and Nvidia so that they don't have to worry about it?

Also, I'm still confused as to which version of DX the Series X and S run.
 

Buggy Loop

Member
The shader compilation issue in console emulation is a really weird one.

I was told shader compilation is mandatory if the console you are emulating demands it. Yet that's not true in the case of the N64 and the Dreamcast.

The two graphics plugins that matter for N64 emulators are GLideN64 and ParaLLEl-RDP. The first is HLE and faster; the latter is LLE and more accurate. But apparently GLideN64 requires shader compilation while ParaLLEl doesn't. It's far less noticeable with N64 games because the shaders are fewer and much smaller, so the stutters are more like a few microstutters here and there (though they become really bad if you use HD texture mods). But still, ParaLLEl-RDP doesn't do shader compiling at all.

I thought maybe it was because of the accuracy difference, but that theory goes out the window with the Dreamcast. DEMUL, which has the more accurate graphics plugin (one that doesn't even allow upscaling), has very severe shader compilation stutters. Seriously, it's so bad that I gave up on DC emulation altogether, despite the emulator being nearly 100% compatible. But in Flycast there is no shader compilation taking place, and that emulator used to be behind DEMUL in accuracy/compatibility, though now it should be on par.

So I don't understand how some emulators can completely bypass the "mandatory" shader compilation while others (like Dolphin, Cemu, etc.) can't, and have to use methods like ubershaders to make it tolerable, or have users download precompiled shaders from others.

Found on Reddit:

  • If it has shaders => you need to recompile them. (RPCS3, Xenia, Yuzu, Ryujinx, Cemu)
  • If it's simple enough to do software rasterization (either on the CPU, or on the GPU as a compute shader), you don't need to compile a lot of shaders. (ParaLLEl-RDP does that)
  • If it has a lot of complicated fixed-function state, you either need to generate shaders which do the same thing, or write one giant ubershader that can do it all. (Dolphin can do both)
The N64 had a fixed-function pipeline with two texture combiner stages and a ton of different bits of state that would affect drawing - not a shader like we would have on PC. What some graphics plugins are doing is using shaders to emulate what the end result on N64 would look like. It's not a replica of the texture combiner.
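The two strategies in that list can be contrasted with a toy model. Everything here is invented (the state format has nothing to do with the real RDP or TEV): "specializing" generates new shader source for each fixed-function configuration, which is what triggers compiles, while an ubershader keeps one program and branches on the state at draw time:

```python
# Toy contrast: per-state shader generation vs a single branching ubershader.

def specialize(combiner_state):
    """Codegen path: each distinct fixed-function state produces (and would
    have to compile) new shader source."""
    a, b = combiner_state["src_a"], combiner_state["src_b"]
    return f"color = {a} {combiner_state['op']} {b};"

def ubershader(combiner_state, inputs):
    """Ubershader path: one shader for every state; the configuration
    arrives as 'uniforms' and is branched on per draw. No new compiles,
    but more per-pixel work."""
    a = inputs[combiner_state["src_a"]]
    b = inputs[combiner_state["src_b"]]
    return a * b if combiner_state["op"] == "*" else a + b

state = {"src_a": "tex0", "src_b": "vertex_color", "op": "*"}

print(specialize(state))  # emits brand-new shader source for this state
print(ubershader(state, {"tex0": 0.5, "vertex_color": 0.8}))  # 0.4
```

Dolphin's ubershader write-up linked earlier is essentially this trade-off at full scale: eat the branching cost to avoid mid-game compiles, then swap in specialized shaders once they're ready.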
 

Crayon

Member
The big promise of Mantle was better CPU utilization, at least that's what I remember. The shader compilation is sidestepped with Fossilize (or whatever it's called), which ships with Steam, so I have a good time. Vulkan is also a boon for Linux since OpenGL is a wreck. I think DX12 is a similar concept. It's great overall, but the shader comp teething issue has gone on for too long.
 

clampzyn

Member
Also, I'm still confused as to which version of DX the Series X and S run.
The Series consoles are moving to DX12U as far as I remember, so it's going to be easier to develop for both PC and the Series consoles (plus retro emulation). But a lot of games still run on DX11 and get ported to DX12 (for RT purposes); that's why those types of games run poorly on the Series S/X.
 

nkarafo

Member
Found on Reddit:

  • If it has shaders => you need to recompile them. (RPCS3, Xenia, Yuzu, Ryujinx, Cemu)
  • If it's simple enough to do software rasterization (either on the CPU, or on the GPU as a compute shader), you don't need to compile a lot of shaders. (ParaLLEl-RDP does that)
  • If it has a lot of complicated fixed-function state, you either need to generate shaders which do the same thing, or write one giant ubershader that can do it all. (Dolphin can do both)
The N64 had a fixed-function pipeline with two texture combiner stages and a ton of different bits of state that would affect drawing - not a shader like we would have on PC. What some graphics plugins are doing is using shaders to emulate what the end result on N64 would look like. It's not a replica of the texture combiner.

And where does the Dreamcast fall in this?

Also, how come PS2 emulation doesn't have that issue but the Dreamcast and GameCube do? The PS2 definitely seems more complex than the DC, at least.
 

PaintTinJr

Member
It's someone's genius idea of taking a console's fixed-hardware, to-the-metal API approach, where shader caching is optimized to happen on the fly: since consoles are known quantities, it's very easy to compile on the fly without stutter.

I don't know who suggested this on PC. Maybe Microsoft saw the benefits on console and brought it over, or maybe AMD thought they would gain an upper hand with their architecture and so made Mantle, which then spawned DX12 and Vulkan.

But in the end, on PC it doesn't make any fucking sense. Every card and every driver will have a different shader cache to compile on the fly if devs are stupid and don't do it at the start of the game or during loading screens. It's a stupid fucking idea to do this on the fly.

Here's a super interesting discussion on the same subject for the Dolphin emulator, which ran into the same problems because of the GameCube's ATI TEV, which was really cool tech at the time. It details how complicated it is to emulate even such an old console's shaders on hardware that is much more powerful.

https://fr.dolphin-emu.org/blog/2017/07/30/ubershaders/?cr=fr

It’s a super fun read
That's a great read, and I feel this quote from the article more than anecdotally supports the point I made in the 7900 XT/XTX review thread about Nvidia's close relationship with DX, and about how AMD hardware and OpenGL/Vulkan are second-class citizens when rendering games through MS Windows' HAL, which favours DX, with Linux + WINE/Proton exposing the HAL's throttling of non-Nvidia/DX solutions.

"NVIDIA's Compiled Shaders on OpenGL and Vulkan are Much Slower than D3D
This one is particularly frustrating as there is no great way for us to debug this. We're feeding the same shaders to the host GPU on OpenGL, Vulkan and D3D, yet, D3D ends up with shaders that are much faster than the other two backends. This means that on a GTX 760, you may only get 1x internal resolution in a particular game on OpenGL or Vulkan, but on D3D, be able to comfortably get double or even triple before seeing slowdown.

Since NVIDIA does not allow us to disassemble shaders despite every other desktop GPU vendor having open shader disassembly, we have no way to debug this or figure out why the compiled code is so much more efficient on D3D. Think about how ridiculous this is: we want to make Dolphin run better on NVIDIA and they don't provide the tools to let us even attempt it. It's a baffling decision that we hope is rectified in the future. Without the shader disassembly tools provided by other vendors, fixing various bugs would have been much more difficult.

The sad thing is, the tools we need do exist - if you're a big enough game studio. Edit: NVIDIA informed us that they only provide shader disassembly tools for Direct3D 12 (under NDA), and they are not available for other APIs regardless of NDA. Hopefully tools for other APIs will be available in the future."
 
The only two games I can think of playing with DX12 were Flight Sim and Fortnite. Any time I enabled DX12 it would result in worse performance, and in the case of Fortnite the game would crash every 5-10 minutes.

At least with those two games I think things have improved. I'm getting much better performance in Flight Sim, but I hadn't played it for about a year. The Chapter 4 update for Fortnite seems to have significantly improved DX12, as I have had it enabled since the start of the season with 0 crashes. Both games still seem to have major frame drops/hitches at times: Flight Sim can drop to <10fps for a second or two, and Fortnite averages 90-110fps but will drop to 40-50fps at times. At higher frame rates the pacing still feels off.

Witcher 3 is also running on DX12 now, but apparently it runs like shit. I only played an hour of it on my old GPU with everything cranked up, so I can't tell the difference between a 4-year-old GPU trying to max it out and engine issues. I haven't given it a shot on my 4080 yet.

Upgrade your RAM and GPU OP.
While that could help, it's not the issue. You want me to upgrade to 64GB RAM and an RTX 4090? Because that's the only way to improve my system.
 

64bitmodels

Reverse groomer.
This is actually pretty eye-opening to me.

So in reality DirectX 12 is actually very powerful and good due to being low level, but they left the optimizations to the developers.
I still feel like DX11 is better for having made those optimizations, and it's actually surprising they did stuff like that just to keep games running smoothly... but I guess the blame can't be shouldered entirely by Microsoft.
 

Zathalus

Member
How do Vulkan games manage to run so much better, then? In my experience at least, Vulkan games typically have less stuttering and are more consistent. I haven't tried these Vulkan games with DirectX 12, however.
It really depends on the developer. Both DX12 and Vulkan are closely related to each other, being derived from the Mantle API created by AMD. Games that offer both a DX12 and a Vulkan option perform very similarly to each other.

You also have plenty of games that perform just fine on DX12: Control, Cyberpunk, Forza, and Gears 5, to name some examples.
 

winjer

Gold Member
From what I understand, DX12 was made with asynchronous compute in mind, leaving a lot of the task management in the hands of developers, whereas DX11 did this automatically but without using the machine's full potential.

The core of the issue here is that many devs ship games broken, and I honestly don't think this has much to do with DX12 itself. The broken and unoptimized games that sometimes release today would still be broken and unoptimized, with the lack of effort put into DX12's particularities just being another factor.

Try watching videos comparing DX11 vs DX12 in the same game: you'll notice the performance difference is minimal most of the time, and in a lot of them DX12 even has more stable frame pacing. The ones where there is a noticeable difference are usually already perfectly broken games like Saints Row (2022).

EDIT: Another example. It seems that in The Witcher 3 next-gen, DX12, despite having worse performance, actually offers some 'hidden' graphical advantages over DX11.



DX12 in the Witcher remaster is a terrible example for anything.
It's just a wrapper on top of DX11 - a lazy hack job that causes too many performance issues.
 

rofif

Can’t Git Gud
DX12 is better than DX11 on AMD at least. Less CPU overhead.

And From Software can't code even if their lives depended on it. Love their games, but their engine is held together with duct tape.
I prefer a duct tape engine over another Unreal Engine bs.
I think the From engine is pretty great, actually.
 

REDRZA MWS

Member
The only two games I can think of playing with DX12 were Flight Sim and Fortnite. Any time I enabled DX12 it would result in worse performance, and in the case of Fortnite the game would crash every 5-10 minutes.

At least with those two games I think things have improved. I'm getting much better performance in Flight Sim, but I hadn't played it for about a year. The Chapter 4 update for Fortnite seems to have significantly improved DX12, as I have had it enabled since the start of the season with 0 crashes. Both games still seem to have major frame drops/hitches at times: Flight Sim can drop to <10fps for a second or two, and Fortnite averages 90-110fps but will drop to 40-50fps at times. At higher frame rates the pacing still feels off.

Witcher 3 is also running on DX12 now, but apparently it runs like shit. I only played an hour of it on my old GPU with everything cranked up, so I can't tell the difference between a 4-year-old GPU trying to max it out and engine issues. I haven't given it a shot on my 4080 yet.


While that could help, it's not the issue. You want me to upgrade to 64GB RAM and an RTX 4090? Because that's the only way to improve my system.
So do it
 

winjer

Gold Member
DX11 performs much better in many of the games and applications that I use. Elden Ring, as great as it is, uses DX12 and runs like total shite despite not being that much more advanced visually than Sekiro, which runs far better on my machine and just so happens to use DX11. Same for Spider-Man Remastered and Miles Morales, which are just stutter central on my machine. In fact, the only Sony port I can think of without these major stutters is God of War... and that's the one using DX11.

What is it with this fucking API? On AMD hardware at least, it seems to be just inferior to Vulkan and DX11. It feels like they took a step back on efficiency and optimization in exchange for ray tracing and other announced features like SRS that no DX12 game has ever actually used. It feels like MS really shat the bed here with DX12: worse performance and stutters across the board in exchange for a feature most people aren't going to be able to use properly.

I feel like the stutter struggle with PC ports is caused entirely by this laggy, unoptimized piece of shit API. I hope MS gets their shit together and works to make it as optimized as possible, because the thought of seeing Returnal stutter thanks to DX12 sickens me to my stomach.

DX12 is a low-level API that puts more control in the developers' hands. It's also more complicated to use.
To this we also have to consider that gaming has grown a lot and there is not enough talent to go around.
Also, game artists are now producing a lot more individual shaders, and all of these have to be compiled for the respective hardware.
On consoles it's easier, because a dev targets one platform and one firmware version, so they can precompile shaders and ship them with the game.
On PC there are many GPUs and driver versions, so shaders have to be compiled on each machine. Good devs will force a shader compilation when the game starts.
But a lot of devs just half-ass it and compile shaders during gameplay, leading to stutters. This is not a problem with DX12; it's a problem with incompetent devs.

I have not seen many complaints about stutter in Spider-Man. But the game is very heavy on the memory subsystem; it could be that you have low-performance RAM, not enough RAM/VRAM, or even not enough PCIe bandwidth.
God of War does have stutters, but they're from loading assets when entering a new area, even on high-end systems.
 

Buggy Loop

Member
DX12 in the Witcher remaster is a terrible example for anything.
It's just a wrapper on top of DX11 - a lazy hack job that causes too many performance issues.

Actually, that was debunked. A patch was shipped with test files that included this solution, but it was not kept; a newer patch simply removed those test files.
 

Dr.D00p

Gold Member
There are so few PC-centric developers these days who have the time, budget & manpower to do the API justice... all the other fuckers have taken the console coin.

It's why my days of buying top-end GPUs are over: they will never, ever be pushed to the limits of what they're actually capable of, beyond just turning up the resolution on crap console ports.
 

SF Kosmo

Al Jazeera Special Reporter
DX11 performs much better in many of the games and applications that I use. Elden Ring, as great as it is, uses DX12 and runs like total shite despite not being that much more advanced visually than Sekiro, which runs far better on my machine and just so happens to use DX11. Same for Spider-Man Remastered and Miles Morales, which are just stutter central on my machine. In fact, the only Sony port I can think of without these major stutters is God of War... and that's the one using DX11.

What is it with this fucking API? On AMD hardware at least, it seems to be just inferior to Vulkan and DX11. It feels like they took a step back on efficiency and optimization in exchange for ray tracing and other announced features like SRS that no DX12 game has ever actually used. It feels like MS really shat the bed here with DX12: worse performance and stutters across the board in exchange for a feature most people aren't going to be able to use properly.

I feel like the stutter struggle with PC ports is caused entirely by this laggy, unoptimized piece of shit API. I hope MS gets their shit together and works to make it as optimized as possible, because the thought of seeing Returnal stutter thanks to DX12 sickens me to my stomach.
DX12 demands more of the programmers. It's lower level, so getting optimal performance requires more work.
 

Panajev2001a

GAF's Pleasant Genius
DX12 took the optimizations away from the drivers and put them on the engine.
So the GPU vendors can do little about bad performance: it’s on the game devs. And code optimization (with multiple hardware configurations) is HARD
I fundamentally agree with you, but... why does Vulkan perform better?

About RT/DXR, the situation is actually the reverse: console devs have far more granular access, even on XSX|S, than they do on Windows PCs (which still lack DirectStorage in earnest).
 

M1chl

Currently Gif and Meme Champion
From what I understand, DX12 was made with asynchronous compute in mind, leaving a lot of the task management in the hands of developers, whereas DX11 did this automatically but without using the machine's full potential.

The core of the issue here is that many devs ship games broken, and I honestly don't think this has much to do with DX12 itself. The broken and unoptimized games that sometimes release today would still be broken and unoptimized, with the lack of effort put into DX12's particularities just being another factor.

Try watching videos comparing DX11 vs DX12 in the same game: you'll notice the performance difference is minimal most of the time, and in a lot of them DX12 even has more stable frame pacing. The ones where there is a noticeable difference are usually already perfectly broken games like Saints Row (2022).

EDIT: Another example. It seems that in The Witcher 3 next-gen, DX12, despite having worse performance, actually offers some 'hidden' graphical advantages over DX11.


In the Witcher it's mostly due to a DX11-to-DX12 translation layer: the game is basically translated to the newer DX, so they could add more effects while not rewriting the old code.

Actually, that was debunked. A patch was shipped with test files that included this solution, but it was not kept; a newer patch simply removed those test files.
Was it? Just because test files were removed does not mean it's not using this solution.
 

64bitmodels

Reverse groomer.
the problem is with DX12 not the hardware. unless someone is on 8GB RAM or a GTX 970/1060.
I am using 16GB of RAM but it's very slow, around 2133MHz.
I never got around to upgrading it either. I think that's the main bottleneck of my system, but games perform decently so I never really cared.
MS for DX13 should just go back to letting the driver optimize the games; it's clear developers can't do it themselves, if the releases of 2022 are any indication.
 

winjer

Gold Member
I am using 16GB of RAM but it's very slow, around 2133MHz.
I never got around to upgrading it either. I think that's the main bottleneck of my system, but games perform decently so I never really cared.
MS for DX13 should just go back to letting the driver optimize the games; it's clear developers can't do it themselves, if the releases of 2022 are any indication.

Holy shit, dude. 2133 is really slow; no wonder you are getting stutters.
DX12 implementations in some games are problematic, but expecting your PC with 2133 MT/s RAM to work well is too much to ask.
Are you even using it in dual channel? Do you have XMP enabled? How old is your CPU?

And in DX12, it's the driver that compiles and optimizes the shaders.
The issue with stutters is game-engine related, but this will be improved with things like the automatic gathering of PSOs in UE5.
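That "gathering of PSOs" idea can be sketched roughly like this. It's a hedged toy modeled loosely on the concept behind UE's PSO caching, not its actual API; every name below is invented. A first session records which pipeline states were actually used, and the next launch compiles that list during loading so the hitch never reaches gameplay:

```python
# Toy model of PSO gathering: record used pipeline states, precompile them next run.
import json

def record_session(draw_calls):
    # First run: log every distinct pipeline state the renderer used.
    used = sorted({(d["vs"], d["ps"]) for d in draw_calls})
    return json.dumps(used)

def precompile_from_log(log):
    # Next launch: compile the recorded states during loading screens.
    return {tuple(pso): f"compiled:{pso[0]}+{pso[1]}" for pso in json.loads(log)}

session = [
    {"vs": "static_mesh", "ps": "lit"},
    {"vs": "skinned", "ps": "lit"},
    {"vs": "static_mesh", "ps": "lit"},  # duplicates collapse in the log
]
log = record_session(session)
cache = precompile_from_log(log)
print(len(cache))  # 2 distinct PSOs ready before gameplay
```

The catch, as discussed throughout the thread, is that the compiled results are still per-GPU and per-driver; only the *list* of states is portable, which is why the compile step has to rerun on each machine.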
 