Has the image quality been improved??
AMD keeps winning with DX12; its DX12 performance is pretty much on par with Nvidia's DX11 performance. It's disappointing that AMD didn't get a boost with DX11, though.
I think this is a case where you can lay the blame on both the developer and Nvidia.
I'm waiting to see how those clowns in the Performance Thread are going to try to continue to justify UWP performance now. smh
Unless you can prove the difference is due to it being UWP rather than DX11 vs. DX12, I don't see how being UWP is relevant.
Have you actually had a refund request denied, or are you just guessing?
I've not heard of anyone having a request denied.
Or UWP
My deal with DX12 is that the games don't look much better. Actually, in all my tests they never look different, but they perform worse. I don't see why anyone would want to be a part of it.
My rig is an i7, 1080, 16GB, Windows 10. Tried Rise of the Tomb Raider, Hitman and something on UWP.
It's still the early days of DX12. If performance is still lackluster a year from now, when games are starting to come out that were built from the ground up on DX12, then we can declare DX12 to be a bust. It's premature to make any judgments until then.
Aren't all UWP games like this? They got made for XB1 and then got ported to UWP. Can't get more "from the ground up" than that.
Rise of the Tomb Raider is noticeably better for me in DX12: in DX11 I get constant stutter and FPS dips into the high 20s, whereas DX12 doesn't hitch or stutter at all and only dips into the 40s. Then again, that's Nixxes, and not every studio is Nixxes. Similarly, Dolphin does run a bit faster in DX12 as well. But yeah, no DX12 title has been as dramatic a leap as Vulkan was for DOOM; going from the High to the Ultra preset while getting a higher framerate was a welcome addition.
Umm, no.
Anyone with the Windows 10 Store version should be given a Steam key as well.
I can't believe I'm stuck with the gimped version. If I'd known it was coming to Steam, I would never have purchased it on the Win 10 Store; the whole thing was a nightmare.
Starting to wonder what the hell the point of DX12 is, aside from forcing people to upgrade to Windows 10.
Absolutely. I argued the same thing when DX10 came out. Until more games are built around it, there's no point.
Great post. That clarifies your point very well.
How far into DX10's life was RE5? That's the only game I actually remember using it (apart from the first two Bioshocks, which used it for sexier water), and it made a pretty nice difference performance-wise too.
I was specifically talking about the "from the ground up" statement. I mean, there's no DX11 involved in UWP games, so my argument was that UWP games are basically built from the ground up in DX12.
Recore and Killer Instinct are UWP and DX11.
I can go into Visual Studio right now and create a DX11 UWP project. Game engines are free to utilize DX11 in UWP as well. There are even a few games that use a DX11 renderer in UWP. And XB1 practically uses a custom version of DX12 anyway.
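The DX11 path really is right there in the tooling, for what it's worth. A minimal sketch of what device creation in such a UWP project looks like (illustrative only, error handling omitted):

```cpp
#include <d3d11.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// A plain D3D11 device, created the same way inside a UWP app.
ComPtr<ID3D11Device> CreateD3D11Device()
{
    ComPtr<ID3D11Device> device;
    D3D11CreateDevice(
        nullptr,                          // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr,                          // no software rasterizer module
        D3D11_CREATE_DEVICE_BGRA_SUPPORT, // needed for XAML/Direct2D interop
        nullptr, 0,                       // default feature levels
        D3D11_SDK_VERSION,
        &device, nullptr, nullptr);       // skip feature-level & context outputs
    return device;
}
```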
I'm happy to report that I was able to get a refund just now from Microsoft support. It was like pulling teeth though. I started in chat, explained in detail the situation, they stated that they couldn't do anything to help me. I had to request a supervisor on three different occasions before they finally agreed.
They requested a number to call, and the supervisor called me about 5 minutes later. I brought her up to speed on the phone, she put me on hold for another 5 minutes or so, and then told me that she could do a one-time refund as a courtesy to me.
So yeah, $63.29 refunded to my card, for my Quantum Break Win 10 copy that I purchased back in April.
Just tell me one thing:
I have a Geforce 1070.
Can I get 1080p (NO SCALING) and 60fps?
Yes, I am willing to turn down effects.
You should be able to, but at high settings, not ultra.
Should be possible on medium settings, with some of them maybe even on high.
This was my first purchase on the Windows Store.
Needless to say, I feel scammed and will never use it ever again.
In theory, it allows for better optimizations on the PC by giving developers direct access to the hardware. But you're not going to see the benefits in a quick DX11->DX12 port.
The point is to remove the large portion of the driver that basically checks up on developers' work, making sure they didn't write something that would result in a crash, an error, low performance, etc. That part of the driver also manages all the resources and memory partitions, making sure the right data is in the right place at all times.
That part of the driver is why DX11 drivers consume so much CPU: they constantly assess the code coming from the application for correctness, parse it to prepare resources, and try to "guess" what the application may need in the near future. It's basically a very complex runtime code analyzer with predictions and deep knowledge of the h/w it's running on. Removing it means a lot of CPU is freed up to do other stuff, either in graphics or in something else like AI.
Unfortunately, this also means that developers must be the ones to implement the same resource management in their renderers (validation checks are removed completely, in the hope that developers won't ship a broken game that can't even launch).
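To be fair, validation isn't gone entirely during development: D3D12 ships an opt-in debug layer you can enable before creating the device. A minimal sketch (illustrative only, not from any shipping game):

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

ComPtr<ID3D12Device> CreateDeviceWithValidation()
{
#if defined(_DEBUG)
    // Opt back into validation for development builds only;
    // retail builds run without it, as described above.
    ComPtr<ID3D12Debug> debugLayer;
    if (SUCCEEDED(D3D12GetDebugInterface(IID_PPV_ARGS(&debugLayer))))
        debugLayer->EnableDebugLayer();
#endif
    ComPtr<ID3D12Device> device;
    // nullptr = default adapter; error handling omitted for brevity.
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));
    return device;
}
```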
And this isn't a simple thing when you consider that such an implementation not only has to be tweaked for every possible PC configuration (RAM from 4GB to 32GB, VRAM from 2GB to 16GB, different PCIe speeds, different storage speeds, different background apps, etc.) but also has to differ quite a bit between GPUs, and not just between vendors but even between different GPU families from the same vendor.
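To give an idea of what that tuning looks like in practice: a renderer might query the OS-reported VRAM budget and size its memory pools from it, rather than hardcoding one number per GPU family. A rough sketch (the helper and the heuristic here are made up for illustration):

```cpp
#include <dxgi1_4.h>

// Hypothetical helper: pick a texture streaming pool size from the
// current VRAM budget that the OS reports for this adapter.
UINT64 PickTexturePoolBytes(IDXGIAdapter3* adapter)
{
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);
    // Leave headroom: the OS can shrink the budget at any time,
    // e.g. when background apps start competing for VRAM.
    return info.Budget / 2; // made-up heuristic, purely illustrative
}
```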
So if a developer manages to implement all of this properly, they get a lot more CPU headroom to play with, which can go toward either more draw calls submitted (meaning more objects on screen at any one time) or CPU spent on advanced physics or AI simulation.
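The draw-call side of that is DX12's big structural win: command lists can be recorded on several threads in parallel and then handed to the queue in a single call. A bare-bones sketch (names are mine, per-frame allocator management omitted):

```cpp
#include <d3d12.h>

// Hypothetical frame submission: each command list was recorded on its own
// worker thread; the queue takes them all in one ExecuteCommandLists call.
void SubmitFrame(ID3D12CommandQueue* queue,
                 ID3D12GraphicsCommandList* const* lists, UINT count)
{
    for (UINT i = 0; i < count; ++i)
        lists[i]->Close(); // finish recording (returns an HRESULT, ignored here)
    queue->ExecuteCommandLists(count,
        reinterpret_cast<ID3D12CommandList* const*>(lists));
}
```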
If a developer can't implement this properly, the DX12 renderer will run worse than the DX11 one, where all of this is already implemented by the IHVs in their drivers. And while it will still free up extra CPU resources, that comes at the cost of worse GPU utilization, which is hardly a good trade when most games on PC are GPU-limited.
With QB it's pretty apparent that Remedy used the console codebase for the DX12 renderer, ignoring the fact that it's a bad fit for all h/w beyond AMD's GCN GPUs. Moving the game to DX11 thus resulted in BIG gains on NV h/w (and probably Intel's, if anyone's willing to check), since the part of the DX12 renderer that was inefficient on NV h/w was replaced by NV's highly efficient DX11 driver implementation.
Interesting read... I'm not an expert on this stuff, but to me this complete removal of validation checks, as you put it, doesn't sound good, and honestly it leaves me more than a bit skeptical about DX12 being used properly if we're left at the "mercy" of developers who have to manually account for so many variables. You make DX12 sound like a "one step forward, two steps back" solution: there's potential to free up the CPU, but making use of it sounds like a lot of work.