
[HUB] NVIDIA Has a Driver Overhead Problem, GeForce vs Radeon on Low-End CPUs

Md Ray

Member



What kind of driver overhead problem, you ask?
5600 XT + 1600X combo pummels RTX 3090 + 1600X combo by being 18% ahead:


More details/benchmarks in the video.
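For anyone who wants to sanity-check how these percentages fall out of the raw fps numbers, it's just the relative average frame rate; a minimal sketch with placeholder values (not the exact figures from the chart):

```python
# Relative performance delta between two GPU results on the same CPU.
# The fps values below are placeholders; substitute the averages from the chart.
def pct_ahead(radeon_fps: float, geforce_fps: float) -> float:
    """How far ahead (in %) the Radeon result is over the GeForce result."""
    return (radeon_fps / geforce_fps - 1.0) * 100.0

# Example: 94 fps vs 80 fps works out to ~17.5%, i.e. roughly "18% ahead".
print(f"{pct_ahead(94, 80):.1f}% ahead")
```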

EDIT:
Also, this isn't a Ryzen-only issue. The same behaviour shows up even with an Intel CPU.

Also, note that 3600X + 3070 is slower than 3600X + 5700 XT, which, IMO, should not be happening.

EDIT 2:
Hardware Unboxed said:
Some additional information: If you want to learn more about this topic I highly recommend watching this video by NerdTech:


You can also follow him on Twitter for some great insights into how PC hardware works: https://twitter.com/nerdtechgasm As for the video, please note I also tried testing with the Windows 10 Hardware-Accelerated GPU Scheduling feature enabled and it didn't change the results beyond the margin of error. This GeForce overhead issue wasn’t just seen in Watch Dogs Legion and Horizon Zero Dawn, as far as we can tell this issue will be seen in all DX12 and Vulkan games when CPU limited, likely all DX11 games as well. We’ve tested many more titles such as Rainbow Six Siege, Assassin’s Creed Valhalla, Cyberpunk 2077, Shadow of the Tomb Raider, and more.
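
If you want to double-check the HAGS state on your own system before comparing numbers, the toggle is exposed in the registry; a minimal sketch (I'm assuming the usual HwSchMode DWORD under GraphicsDrivers, where 2 means enabled and 1 disabled, so verify on your Windows build):

```python
# Check the Windows 10 Hardware-Accelerated GPU Scheduling toggle (run on Windows).
# Assumption: the HwSchMode DWORD under GraphicsDrivers is 2 = enabled, 1 = disabled.
import winreg
from typing import Optional

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

def hags_enabled() -> Optional[bool]:
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
            value, _ = winreg.QueryValueEx(key, "HwSchMode")
            return value == 2
    except FileNotFoundError:
        return None  # value missing: HAGS unsupported or never toggled on this system

print("HAGS enabled:", hags_enabled())
```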


A follow-up, discussion type video:



Part 2:
 

vanguardian1

poor, homeless and tasteless
Even as a long-time ATI/AMD user, I have to say that this is kinda ridiculous...

I guess the saying is true: look for a problem long enough and you'll eventually find one. *shrugs*

.... I wonder if it's the newer GPU designs and their newer drivers, 'cause it used to be that AMD GPUs needed beefy CPUs to work competitively and Nvidia's didn't..... :-/
 

Spukc

always chasing the next thrill
Why 3090?

Just look at:
Ryzen 5 3600 + RX 5700 XT
vs
Ryzen 5 3600 + RTX 3070

5700 XT is 17% faster than 3070. Does that look normal to you?
again.
what the fuck is the point? what idiot pairs a 3090 with a 1600X?
 

Md Ray

Member
again.
what the fuck is the point? what idiot pairs a 3090 with a 1600X?
This test reveals problems with the NVIDIA driver; that's the point of this test. Forget the 1600X: even with a mid-range 3600X paired with a mid-range 3070, the issue is reproducible, and it's slower than a 5700 XT.

Why is this happening?

Shouldn't NVIDIA be questioned?
 

martino

Member
This test reveals problems with the NVIDIA driver; that's the point of this test. Forget the 1600X: even with a mid-range 3600X paired with a mid-range 3070, the issue is reproducible, and it's slower than a 5700 XT.

Why is this happening?

Shouldn't NVIDIA be questioned?
Can we really say it's a global problem with 2 games and one CPU to support the theory?
And why not question AMD, when the only Intel CPU, in mainly one test, is "only" an i3-10100 and it performs better than the 2600X? Older Intel CPUs are really lacking in this test.
So I know what the fanboys will decide here with this small, bad sample:
  • the NVIDIA ones will say AMD is crippling NVIDIA GPU performance
  • the AMD ones will say NVIDIA is crippling AMD CPU performance
All this because of another lacking, clickbait Hardware Unboxed video...

edit: oh god, I did it again... I overreacted.
 
They do great monitor reviews. I must have missed something, because I didn't know these guys were disliked as well haha.

Needs to be tested with Intel CPUs as well to confirm.
 

martino

Member
They do great monitor reviews. I must have missed something, because I didn't know these guys were disliked as well haha.

Needs to be tested with Intel CPUs as well to confirm.
And the actual PC-only, CPU-intensive games that are already out there are showing this (if it's not specific to those games)....
 

KungFucius

King Snowflake
Drivers are optimized for individual games. If games become more CPU intensive and that impacts performance at the high end, why wouldn't Nvidia change their optimization strategy?

If people have RTX cards with older CPUs they should make a big fuss over this to see what Nvidia does. I am not too concerned about some mythical future where CPU-intensive games are not considered an optimization priority by Nvidia, because my CPU should be OK for a while; but those with 3600s, which should also be OK for a while, should raise a stink.
 
Good job I game at 4K/60 then, I guess, on a 1600X for now; the upgrade from a 5700 XT to a 2080 Ti would've been humorously backwards if I was using 1080p medium settings. No need to get upset 🤷‍♂️
 

llien

Member
god knows... dude
Fair enough.

Original thought was "future proofing". The idea was that artificially getting into CPU-limited scenarios would predict how CPUs would perform in future games.
And then came Ryzen, and all of that turned out to be a rather naive take.

That being said, he was testing a range of CPU + GPU combinations; it would look weird if he skipped certain combinations.

Aren't his findings... rather curious?
From the generic "you need faster CPUs with NV's GPUs, due to the heavier driver" to concrete examples. And that difference is in the double-digit percentage range.

Isn't it surprising that a 5700 XT + 2600X is faster (e.g. in WD:L) than a 3070 + 2600X?
 
again.
what the fuck is the point? what idiot pairs a 3090 with a 1600X?
Look at all the other results; it affects the 1% lows quite a bit on all Nvidia GPUs.

Now, if I were to buy a 6-core CPU I'd probably not pair it with such an expensive GPU (assuming MSRP, which doesn't happen). But this is also true for the 3070 vs the 5600X, where both the top and the low fps are affected. Neither the Ryzen 2600X nor the 3600X are slouches.

How do you make the argument for a 3060 or 3060 Ti? Other than "I need CUDA" it would not make any sense.
 
I knew the owner of the video/article before opening the thread.

That tells you a lot.

The fact he didn't use any Intel CPUs to draw better conclusions is beyond me; it's like he is trying to find the result he wants lol

There's some data towards the end for the Core i3-10100 that shows 10-20% higher CPU utilisation when using an Nvidia GPU, with similar performance in the same scenario, in Shadow of the Tomb Raider.

It's interesting data, but kinda moot in real-world scenarios given most people aren't gaming at those settings on a 3090 or 3070.
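
If anyone wants to eyeball that utilisation gap on their own machine, simply logging overall CPU load while a benchmark pass runs is enough to see it; a rough sketch (the psutil approach, 60-second window and 1-second sampling are my own choices, not HUB's tooling):

```python
# Log average CPU utilisation over one benchmark pass (rough sketch, not HUB's tooling).
# Requires: pip install psutil
import psutil

DURATION_S = 60  # arbitrary: roughly one benchmark run

samples = []
for _ in range(DURATION_S):
    # interval=1 blocks for one second and returns utilisation over that second
    samples.append(psutil.cpu_percent(interval=1))

print(f"avg CPU utilisation: {sum(samples) / len(samples):.1f}% "
      f"(peak {max(samples):.1f}%)")
```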
 

martino

Member
Fair enough.

Original thought was "future proofing". The idea was that artificially getting into CPU-limited scenarios would predict how CPUs would perform in future games.
And then came Ryzen, and all of that turned out to be a rather naive take.

That being said, he was testing a range of CPU + GPU combinations; it would look weird if he skipped certain combinations.

Aren't his findings... rather curious?
From the generic "you need faster CPUs with NV's GPUs, due to the heavier driver" to concrete examples. And that difference is in the double-digit percentage range.

Isn't it surprising that a 5700 XT + 2600X is faster (e.g. in WD:L) than a 3070 + 2600X?
I will use your classic DLSS argument here:
how many games, and games to come, use the Decima and WD engines?
If there were tests with old Intel CPUs as well, showing this in games using Unreal Engine or Unity, this video could actually weigh in my GPU purchase decision (if stock is there one day).
As it is, it doesn't.
 
Wow, some of you really aren't all that knowledgeable when it comes to methodology. Jesus Christ....

Next to nobody will pair a 3090 with a 1600X, but doing this test to expose a driver overhead issue is valid methodology. The fact that a 3070 coupled with a 3600X (an actually very plausible and sensible combo that you can buy right now in a new pre-built PC from Dell, CyberPowerPC or iBuyPower) is slower than a 5700 XT with the same CPU says it all. You have to question this as a consumer. More effort could probably be made by team green.

As said at the end of the video, it's a known issue. No revelation here.

What I am also going to question is their methodology here, though. Why have they included a single Intel CPU (i3-10100) instead of doing the same thing as they did with the AMD CPUs? They could have made a separate chart: 10600K, 9600K, 8600K, 7600K, and on the other hand keep the 3600X in the loop for the rest of the video for the AMD chart. That would have made much more sense. Condensing and mixing all this data like this only exposes the issue with too much scramble, leading some people to question the methodology even though it's sound. They do themselves a disservice by doing so.

Also, while this problem exists, the picture in the OP is perhaps the absolute worst case possible. It's not as bad as this in other games, and the issue fades away as you start cranking the image quality, be it game details or resolution. As always, the higher you crank the resolution, the less the CPU bottlenecks the GPU. People playing at 1440p or 4K will use their GPU horsepower and feel the bottleneck of their CPUs much less than if they were playing at 1080p medium settings. If you have a 3070, you might still have a 1080p monitor, but you sure as hell won't play at medium settings lol

All that's left here is people playing at high refresh rates. Those will most assuredly have a beefier CPU, since it's pretty much a requirement unless you're only playing resource-light competitive games like MOBAs, CS:GO, Overwatch, etc.
 

llien

Member
If you are doing a CPU test, it takes any GPU bottlenecks out of the equation.
And tests what? Note how you can barely get anything practical from benchmarking tasks that were light on the CPU anyway (hence un-optimized).

Probably because there was nothing wrong with the DF review itself.
There is nothing wrong with misleading customers and kissing Huang's butt; it's just a business, I remember that.
 

FireFly

Member
There is nothing wrong with misleading customers and kissing Huang's butt; it's just a business, I remember that.
Well, I think there is a pretty clear line between a preview, which, yes, can be paid marketing ("benchmark these games, don't talk about X, Y or Z"), and a review, which should be objective and come with no restrictions. You can think DF are "selling out" to Nvidia while still being capable of providing objective analysis in the review. That line is why previews can be that way in the first place: if your reviews are inaccurate, your site's reputation will be affected, but previews are always taken with a large dose of salt.

It's the same with game "previews". It's almost expected that the previewer will try to remain positive even if the game is a piece of shit.
 

VFXVeteran

Banned



What kind of driver overhead problem, you ask?
5600 XT + 1600X combo pummels RTX 3090 + 1600X combo by being 18% ahead:


More details/benchmarks in the video.

You can't take one game and declare that the entire driver set for the Nvidia graphics boards is at fault. You would have to give multiple examples of games exhibiting the same result. Perhaps it's the game.
 
You can't take one game and declare that the entire driver set for the Nvidia graphics boards is at fault. You would have to give multiple examples of games exhibiting the same result. Perhaps it's the game.

"This GeForce overhead issue wasn’t just seen in Watch Dogs Legion and Horizon Zero Dawn, as far as we can tell this issue will be seen in all DX12 and Vulkan games when CPU limited, likely all DX11 games as well. We’ve tested many more titles such as Rainbow Six Siege, Assassin’s Creed Valhalla, Cyberpunk 2077, Shadow of the Tomb Raider, and more."

Recommended video in the comments:

 

sn0man

Member
It seems stupid on its face to point out this performance oddity.

But it also seems relevant, because the 3600 is a popular CPU (as were the previous 6-core Ryzens).

For the haters: the worst that can happen is NVIDIA looks into it, identifies/fixes an overhead or other driver quirk, and gamers benefit from a better product in the future.
 
You can't take one game and declare that the entire driver set for the Nvidia graphics boards is at fault. You would have to give multiple examples of games exhibiting the same result. Perhaps it's the game.
If you watch the video you'll see he uses more than one game. He also demonstrates the increased Nvidia driver overhead when using an Intel CPU towards the end of the video.
 
again.
what the fuck is the point? what idiot pairs a 3090 with a 1600X?
Look at all the other results; it affects the 1% lows quite a bit on all Nvidia GPUs.

Now, if I were to buy a 6-core CPU I'd probably not pair it with such an expensive GPU (assuming MSRP, which doesn't happen). But this is also true for the 3070 vs the 5600X, where both the top and the low fps are affected. Neither the Ryzen 2600X nor the 3600X are slouches.

How do you make the argument for a 3060 or 3060 Ti? Other than "I need CUDA" it would not make much sense.
If people have RTX cards with older CPUs they should make a big fuss over this to see what Nvidia does.
It's why GPUs should also be tested with low/mid-range CPUs more often.
 
It seems stupid on its face to point out this performance oddity.

But it also seems relevant, because the 3600 is a popular CPU (as were the previous 6-core Ryzens).

For the haters: the worst that can happen is NVIDIA looks into it, identifies/fixes an overhead or other driver quirk, and gamers benefit from a better product in the future.

it's not an oddity; we've seen this behaviour in lots of games in previous videos




watch the Ryzen 1600 results...

Nvidia is doing its GPU scheduling driver-side and not in hardware like AMD; that's taking up CPU cycles. In the most common scenarios that's not really a problem.
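
A crude way to picture why that only hurts when you're CPU limited: treat the driver's scheduling work as extra CPU milliseconds per frame, and the delivered frame rate as whichever of the CPU or GPU side is slower. A toy model with made-up numbers, not measurements:

```python
# Toy model: frame rate is capped by whichever of the CPU or GPU takes longer per frame.
# Driver-side scheduling shows up here as extra CPU milliseconds per frame.
def fps(game_cpu_ms: float, driver_ms: float, gpu_ms: float) -> float:
    frame_ms = max(game_cpu_ms + driver_ms, gpu_ms)
    return 1000.0 / frame_ms

# GPU-bound case (e.g. 1440p/4K): the extra driver cost hides behind the GPU.
print(fps(game_cpu_ms=8.0, driver_ms=3.0, gpu_ms=16.0))   # 62.5 fps
print(fps(game_cpu_ms=8.0, driver_ms=0.5, gpu_ms=16.0))   # 62.5 fps

# CPU-bound case (e.g. 1080p medium on a slow CPU): the same cost comes straight off the frame rate.
print(fps(game_cpu_ms=12.0, driver_ms=3.0, gpu_ms=6.0))   # ~66.7 fps
print(fps(game_cpu_ms=12.0, driver_ms=0.5, gpu_ms=6.0))   # 80.0 fps
```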
 

sn0man

Member
it's not an oddity; we've seen this behaviour in lots of games in previous videos




watch the Ryzen 1600 results...

Nvidia is doing its GPU scheduling driver-side and not in hardware like AMD; that's taking up CPU cycles. In the most common scenarios that's not really a problem.

So my “worst case, NVIDIA fixes it” take is off by quite a bit. Is the fix NVIDIA going to a hardware scheduler in a future generation?
 

Kuranghi

Member
*Me over here trying to play most games at 4K-8K*

Me: Haha, I'll upgrade my 3770K in ANOTHER 9 years...
*Then remembering how much better emulation/Milkdrop could be with a recent CPU*:

[Cat zoom-in GIF by Paul Trillo]


Infini-detail!
 