> As I see it the downsides are somewhat niche, without any big impact for most use cases.
I agree with that assessment. Core counts have ballooned while IPC and frequencies have stagnated. As long as there are excess cores and/or you're playing at a GPU-bound resolution, I see no real issue. It is a difference, to be sure, but it doesn't affect many people.
> Not sure I understand the NV cope in here. Pretty embarrassing; they just make GPUs, they don't need to be, and ought not to be, worshipped.
It's strange, but people are like that. I admit I get into a brand and stan for it sometimes, e.g. Noctua.
> When will this be? When next-gen games start to use 8 cores?
The current-gen consoles have reasonable 8-core/16-thread CPUs, so I would imagine the transition will occur quite soon. I'm waiting to upgrade my Ryzen 3600 right now. I was considering a 5600X, but we are mostly GPU-constrained, so I figured I'll focus on obtaining a 3080 FE (I know, fat chance) and wait for AM5 to see what happens on the CPU side.
> When will this be? When next-gen games start to use 8 cores?
As soon as the cross-gen period ends, which I suspect will happen by the end of this year.
> Should probably mention that in the OP then.
It's been in the OP for many hours now. You should probably watch the video before making assumptions like that; it's shared and linked for a reason.
> That is an issue with his channel... he has lost the trust to be watched.
Weird? How?
Indeed, he looks like a
BTW the test is weird no matter how you try to defend it.
> What a stupid test. A 3090 + 1600X is a terrible, terrible benchmark for anything.
And what a stupid post.
> Why would someone who can afford a 3090 put it in a low-end system? The whole basis for this thread is ridiculous.
Why does the 5700 XT outperform the 3070 by nearly 20% with a decent CPU like the 3600X? If CPU-bound, both configs should, at worst, produce near-identical results, no?
To all the people saying "this makes no sense as a test": the point of the analysis is not that gamers in the wild will realistically pair a 3090 with a 1600X. The point is to show that when Vulkan and DirectX 12 games are CPU-bound, they seem to take a bigger performance hit in configurations with a high-end Nvidia GPU than in configurations with a high-end AMD GPU.
As of now there is no game on the market that can single-handedly bring a new, high-end CPU to its knees, so if you need to sample CPU-bound scenarios you have to go with a mediocre CPU, or the conditions for the test are simply impossible to create. The takeaway is "CPU-bound game", not "crap CPU". Games will naturally become more taxing once PS4 and Xbox One are dropped, and this phenomenon could start to affect more titles and better CPUs in the years to come.
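To make the "CPU-bound" framing concrete, here is a rough frame-time model (purely illustrative, with made-up numbers; not measurements from the HUB video): the frame rate is set by whichever side finishes last, so a fixed per-frame driver cost only shows up once the CPU side, not the GPU, is the slower one.

```cpp
// Minimal sketch of a frame-time model. All numbers are illustrative
// assumptions, not data from the video.
#include <algorithm>
#include <cstdio>

// A frame cannot finish faster than the slower of the two pipelines:
// the CPU side (game logic + driver submission) or the GPU side.
double fps(double game_ms, double driver_ms, double gpu_ms) {
    double frame_ms = std::max(game_ms + driver_ms, gpu_ms);
    return 1000.0 / frame_ms;
}

int main() {
    // Hypothetical CPU-bound case: GPU finishes in 6 ms, game logic takes 9 ms.
    // Only the assumed per-frame driver cost differs between the two cards.
    std::printf("low-overhead driver:  %.0f fps\n", fps(9.0, 1.0, 6.0)); // ~100 fps
    std::printf("high-overhead driver: %.0f fps\n", fps(9.0, 3.0, 6.0)); // ~83 fps
    return 0;
}
```

Under this toy model the two configurations only diverge when the CPU side dominates; at a GPU-bound resolution the gpu_ms term wins the max() and the overhead difference disappears, which is consistent with the higher-resolution results looking identical.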
Why would someone who can afford a 3090 put it in a low-end system? The whole basis for this thread is ridiculous.
> Why does the 5700 XT outperform the 3070 by nearly 20% with a decent CPU like the 3600X? If CPU-bound, both configs should, at worst, produce near-identical results, no? I've asked this many times now but nobody wants to answer this question.
Driver overhead.
C'mon now. What do you have to say about the data they've gathered? It's all fake and HUB is conspiring against NVIDIA? Please.
> Why does the 5700 XT outperform the 3070 by nearly 20% with a decent CPU like the 3600X? If CPU-bound, both configs should, at worst, produce near-identical results, no? I've asked this many times now but nobody wants to answer this question.
You're asking the people who would buy a 1050 Ti over an RX 570 because "a 1080 Ti is better than a Vega!!"
> Why does the 5700 XT outperform the 3070 by nearly 20% with a decent CPU like the 3600X? If CPU-bound, both configs should, at worst, produce near-identical results, no? I've asked this many times now but nobody wants to answer this question.
It doesn't. You're wrong.
> There are plenty of people who pair, say, a 3600 with a high-end card. Looking at their results, it could matter if you do high-frame-rate gaming or VR. I typically game in 4K, which makes it irrelevant for me, but I also use VR (it matters less when you turn up supersampling, but still).
The VR point is a good one. I have a higher-than-1080p display, so while I appreciate the article and the thread discussion, it felt pretty irrelevant to me. I assume, though, that you can supersample before VR output to try to help things.
> As soon as the cross-gen period ends, which I suspect will happen by the end of this year.
The last-gen consoles already had 8 cores. Why would developers start using them now if they didn't use them last gen?
> It doesn't. You're wrong.
Lol, what a weird post. The data is right there in the OP. Click the spoiler tag to reveal one of the bar graphs and look at the 3600X + 3070 FPS vs the 3600X + 5700 XT.
> The last-gen consoles already had 8 cores. Why would developers start using them now if they didn't use them last gen?
Another weird post. Last-gen consoles had 8 cores/8 threads, and developers did use them, though not all 8 cores/threads went to gaming since the OS and background apps needed some of the CPU resources; games had access to roughly 7 to 7.5 cores. On PC, games already use more than 8 cores/threads, e.g. WD 2, WD Legion, Cyberpunk 2077...
> Another weird post. Last-gen consoles had 8 cores/8 threads, and developers did use them, though not all 8 cores/threads went to gaming since the OS and background apps needed some of the CPU resources; games had access to roughly 7 to 7.5 cores. On PC, games already use more than 8 cores/threads, e.g. WD 2, WD Legion, Cyberpunk 2077...
And again the question: why would they? Why spend a lot of time and money on complicated multithreading optimization if you can simply cut corners and use the now-available extra raw power of the cores to achieve the same result using only two or three cores?
PS5/XSX/S have 8 cores with SMT (16 threads), double the thread count of last-gen consoles. And according to MS, games can have access to 14 threads, leaving 1 core/2 threads for the OS and non-gaming apps.
So yeah, they doubled the thread count for a reason: games WILL actually use more than 8 threads going forward, as is already the case on PC these days.
> And again the question: why would they? Why spend a lot of time and money on complicated multithreading optimization if you can simply cut corners and use the now-available extra raw power of the cores to achieve the same result using only two or three cores?
Because as game complexity increases, they have to? And because it's already being done? Don't games like WD Legion already use more than 8 threads on PC? I don't understand your argument. Are you suggesting that next-gen open-world games from Ubisoft and Rockstar will use just 2-3 cores on next-gen consoles?
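For what it's worth, the "why would they multithread" question is usually answered with a job system: the engine splits each frame's work into tasks and fans them out across whatever hardware threads exist. A minimal sketch of that pattern (generic std::async, purely illustrative, not any specific engine's scheduler):

```cpp
// Rough sketch of fanning per-frame work across available hardware threads.
#include <cstdio>
#include <future>
#include <thread>
#include <vector>

// Placeholder per-system work; a real engine would run AI, physics,
// animation, culling, etc. here.
void simulate_chunk(int chunk) {
    volatile double x = 0;
    for (int i = 0; i < 100000; ++i) x += i * 0.5;  // pretend workload
    (void)chunk;
}

int main() {
    const unsigned workers = std::thread::hardware_concurrency();  // e.g. 16 on an 8C/16T CPU
    std::printf("spawning %u worker tasks per frame\n", workers);

    // One "frame": split the world into chunks and process them in parallel.
    std::vector<std::future<void>> tasks;
    for (unsigned c = 0; c < workers; ++c)
        tasks.push_back(std::async(std::launch::async, simulate_chunk, c));
    for (auto& t : tasks) t.get();  // join before rendering the frame
    return 0;
}
```

The practical argument being made in the thread is that once the baseline console exposes 16 threads, scaling a scheduler like this is cheaper than hoping for more single-thread performance.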
> The 5600X is a 6-core/12-thread CPU and a lot faster than the CPU in these consoles. In fact, even a quad core with 8 threads is more than enough as long as it has good single-thread performance. This thread and video are pretty useless IMO, except maybe for a newbie who is into esports and thinks high-end graphics cards will not get bottlenecked by the CPU at 1080p.
Consoles have an undervolted 3700X, no? 8C/16T.
> Consoles have an undervolted 3700X, no? 8C/16T.
Not a 3700X per se; that's a desktop Zen 2 part. The consoles' Zen 2 closely matches mobile Zen 2, i.e. the 4800H: same 8C/16T, but with a significant reduction in L3 cache (cut down from 32 MB to 8 MB).
> Because as game complexity increases, they have to? And because it's already being done? Don't games like WD Legion already use more than 8 threads on PC? I don't understand your argument. Are you suggesting that next-gen open-world games from Ubisoft and Rockstar will use just 2-3 cores on next-gen consoles?
You are trying to blow this whole Nvidia thing out of proportion. It was a very bad CPU + GPU combination in the first place. It's like putting tractor tires on your Porsche and then trying to show that the Porsche has a problem because it gets outperformed by a Golf.
Do you know that even RT can eat up a lot of CPU cycles?
> You are trying to blow this whole Nvidia thing out of proportion. It was a very bad CPU + GPU combination in the first place. It's like putting tractor tires on your Porsche and then trying to show that the Porsche has a problem because it gets outperformed by a Golf.
A 3600X + 3070 isn't a bad combination though; both are mid-range. Yet the 5700 XT, a weaker and cheaper GPU than the 3070, is nearly 20% faster than the 3070 when CPU-bound. That's significant and should not be happening; both combinations should at worst be producing identical results. Why is this so hard for you people to comprehend?
> Why is this so hard for you people to comprehend?
Because you are adding an artificial bottleneck to the system and expecting the GPU to somehow magically bypass this bottleneck.
> Because you are adding an artificial bottleneck to the system and expecting the GPU to somehow magically bypass this bottleneck.
Good job ignoring the rest of the comment and sidestepping. Almost every NVIDIA defender in this thread has been doing the same.
> reasonably
See the problem here?
> A 3600X + 3070 isn't a bad combination though; both are mid-range. Yet the 5700 XT, a weaker and cheaper GPU than the 3070, is nearly 20% faster than the 3070 when CPU-bound. That's significant and should not be happening; both combinations should at worst be producing identical results. Why is this so hard for you people to comprehend?
At the same time, you really have to stretch your imagination to esports levels to be CPU-bound.
Tell me, why shouldn't we, as consumers, ask for better from NVIDIA?
> See the problem here?
Again sidestepping, and ignoring the fact that the 5700 XT is producing better results than a more expensive GPU when they should be identical in a CPU-bound scenario. What do you have to say about that?
> At the same time, you really have to stretch your imagination to esports levels to be CPU-bound.
Today, CPU-bound = medium settings at 1080p.
Low/medium graphics, 720p/1080p etc
> Today, CPU-bound = medium settings at 1080p.
I highly doubt it. We will always be GPU-bound, aside from esports.
Tomorrow it will be 1440p that becomes CPU-bound (to some extent it already is with the bigger GPUs), as scene complexity increases with next-gen open-world games and the 16-thread CPUs in the PS5/XSX are pushed harder.
There is no problem, except for bad CPU/GPU pairing.
> Nvidia has a greater feature set
What a lame excuse...
> It was a very bad CPU + GPU combination in the first place.
Let's pretend the 2600X is a bad CPU, shall we?
> the monstrous amounts of CUDA cores they have in Ampere
The claimed number of cores is two times higher than the actual number.
> I highly doubt it. We will always be GPU-bound, aside from esports.
It depends completely on the CPU performance level developers are targeting. Stick in a netbook-class CPU and suddenly you are CPU-bound in everything. And since developers are going to be targeting Zen 2-class performance for 60 FPS, if you have a Zen 2 processor and you want to run at 120 FPS, you are going to want some low-overhead GPU drivers!
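Putting that 60 vs 120 FPS point into rough numbers (the 2 ms per-frame driver cost below is an assumed figure for illustration, not something measured in the video): the same fixed CPU-side cost eats roughly twice the share of the frame budget at 120 FPS.

```cpp
// Back-of-the-envelope frame budgets. The driver cost is an assumed,
// illustrative figure; the point is only that a fixed CPU-side cost consumes
// a much larger share of the budget as the target framerate rises.
#include <cstdio>

int main() {
    const double driver_ms = 2.0;                 // assumed per-frame driver overhead
    const double targets[] = {60.0, 120.0};       // target framerates
    for (double target : targets) {
        double budget_ms = 1000.0 / target;       // 16.7 ms at 60 fps, 8.3 ms at 120 fps
        std::printf("%6.0f fps -> %.1f ms budget, driver share %.0f%%\n",
                    target, budget_ms, 100.0 * driver_ms / budget_ms);
    }
    return 0;
}
```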
> Again sidestepping, and ignoring the fact that the 5700 XT is producing better results than a more expensive GPU when they should be identical in a CPU-bound scenario. What do you have to say about that?
Why should one GPU produce better results than another if the CPU is the limiting factor?
> Why should one GPU produce better results than another if the CPU is the limiting factor?
Why would a sane person expect the 5700 XT to beat the 3070 when using something as recent as a 2600X?
> Let's pretend the 2600X is a bad CPU, shall we?
It depends how you use it. With a 5700 it's a good CPU; with a 3090 it's a bad CPU.
> It depends how you use it. With a 5700 it's a good CPU; with a 3090 it's a bad CPU.
Even with a 3600X + 3070, the issue is seen. Wanting high-refresh-rate gaming on a 3070 doesn't seem super unreasonable.
> Should probably mention that in the OP then.
It's pretty obvious what path the OP is walking.