
Forspoken PC Requirements

Gaiff

SBI’s Resident Gaslighter
Once again I would like to say it loud and proud, chest out, so the dunces in the back can hear.

PS5 hardware is NOT one-to-one "equal" to any PC.
The APIs are different.
The hardware is custom.

To match certain low-level efficiencies of the PS5 you need to vastly over-spec your gaming PC, which of course can achieve better results, but you cannot say "oh, it's like a 3060" etc., as the PS5's hardware configuration does not exist in desktop form, nor do the systems and APIs driving it.
Is that why a 3700X/2070S/16GB combo performs like a PS5 90% of the time and beats it in RT 99% of the time?
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
tl;dr: Are you wrong? Not necessarily. 6-core CPUs tend to perform very well for the majority of their lives in high-end gaming, but since people tend to keep their CPUs quite a bit longer than their GPUs, a mid-tier CPU isn't exactly the best choice if you plan on sticking with top-tier GPUs for a while. They will often be a bottleneck.
Then I think you missed what I've been saying.
GHG and I have been talking about this since the XSX/PS5 specs were leaked and the Ryzen 3000s were a thing.
He stated way back then that you would "NEED" 8 cores, and that you'd be a fool to buy a 3600X or 10600K because XSX/PS5 games would crush those CPUs.

I've been effectively saying:
Once you are at 6 cores, the number of cores won't matter for gaming this generation as long as IPC and cache keep going up.

The 3600 was more than enough when it launched.
The 5600 was more than enough when it launched.
The 7600 was more than enough when it launched.

At no point since Ryzen 3000/Intel 10th gen have more than 6 cores proven necessary for high-end gaming.
And I'm still standing by those statements: this generation there won't be many games that actually load up more cores and see any major benefit, assuming you are comparing similar IPC.
Obviously new-generation parts are better than old-generation parts; no one is denying that, and it isn't something that needs to be stated.
But "in generation," 6-core parts have been within 5% of their higher-core siblings pretty much every time... X3D notwithstanding.

Games use a master thread and worker threads, and the master thread does most of the heavy lifting, so if your IPC is high enough, 6 cores will suffice even for high-end gaming (rough sketch below).
If we are targeting 60fps then even older generations of 6-core CPUs will be able to hold their own (the 10600K is still an absolutely stellar chip).
Once we reach 4K and really start getting GPU bound, those extra 10 cores matter even less.
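
Rough sketch of what I mean (hypothetical Python, made-up per-frame numbers, not from any real engine): if the serial master-thread work dominates, adding worker cores past a point changes nothing, and only faster per-core performance moves the frame time.

Code:
# Illustrative per-frame costs in milliseconds (made-up numbers).
MASTER_WORK_MS = 12.0    # game logic + render submission on the master thread
WORKER_WORK_MS = 40.0    # job work that can be split across worker threads

def frame_time_ms(worker_cores):
    # The frame finishes when the slower side does: the serial master
    # thread, or the worker jobs divided across the available cores.
    return max(MASTER_WORK_MS, WORKER_WORK_MS / worker_cores)

for cores in (2, 4, 6, 8, 12, 16):
    ft = frame_time_ms(cores)
    print(f"{cores:>2} worker cores -> {ft:4.1f} ms/frame (~{1000 / ft:.0f} fps)")

Past 4 or so worker cores the master thread is the limit, so the fps stops moving; only a faster master thread (IPC/clocks) changes the number.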

When it comes to the 12th-gen CPUs: the 12400 is a worse bin than the 12600K/12700K/12900K, and that binning is where the advantage really comes from, not the fact that they also have E-cores or more P-cores.
So if you are building a new system, you can save money by buying the current cream-of-the-crop 6-core and enjoy 99% of the performance of the most expensive chip.
You are in generation, so if, inexplicably, that ~10% differential in 1% lows starts to bother you in however many years, then there is an upgrade path.
Or get whatever the current generation's 6-core CPU is.
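
And since "1% lows" keep coming up: roughly speaking (exact methods vary between reviewers), it's the average frame rate over the slowest 1% of frames. Quick hypothetical Python version with made-up frame times:

Code:
# Made-up frame-time capture in milliseconds: mostly 16.7 ms with a few spikes.
frame_times_ms = [16.7] * 950 + [25.0] * 40 + [45.0] * 10

def one_percent_low_fps(frame_times):
    # Average the slowest 1% of frames and report it as an fps figure.
    worst = sorted(frame_times, reverse=True)
    count = max(1, len(worst) // 100)
    return 1000.0 / (sum(worst[:count]) / count)

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
print(f"average: {avg_fps:.0f} fps, 1% low: {one_percent_low_fps(frame_times_ms):.0f} fps")

So a capture can average ~58 fps while the 1% low sits around ~22 fps, which is exactly the kind of gap that shows up as stutter.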
 

Gaiff

SBI’s Resident Gaslighter
Then I think you missed what I've been saying.
GHG and I have been talking about this since the XSX/PS5 specs were leaked and the Ryzen 3000s were a thing.
He stated way back then that you would "NEED" 8 cores, and that you'd be a fool to buy a 3600X or 10600K because XSX/PS5 games would crush those CPUs.

I've been effectively saying:
Once you are at 6 cores, the number of cores won't matter for gaming this generation as long as IPC and cache keep going up.

The 3600 was more than enough when it launched.
The 5600 was more than enough when it launched.
The 7600 was more than enough when it launched.

At no point since Ryzen 3000/Intel 10th gen have more than 6 cores proven necessary for high-end gaming.
And I'm still standing by those statements: this generation there won't be many games that actually load up more cores and see any major benefit, assuming you are comparing similar IPC.
Obviously new-generation parts are better than old-generation parts; no one is denying that, and it isn't something that needs to be stated.
But "in generation," 6-core parts have been within 5% of their higher-core siblings pretty much every time... X3D notwithstanding.

Games use a master thread and worker threads, and the master thread does most of the heavy lifting, so if your IPC is high enough, 6 cores will suffice even for high-end gaming.
If we are targeting 60fps then even older generations of 6-core CPUs will be able to hold their own (the 10600K is still an absolutely stellar chip).
Once we reach 4K and really start getting GPU bound, those extra 10 cores matter even less.

When it comes to the 12th-gen CPUs: the 12400 is a worse bin than the 12600K/12700K/12900K, and that binning is where the advantage really comes from, not the fact that they also have E-cores or more P-cores.
So if you are building a new system, you can save money by buying the current cream-of-the-crop 6-core and enjoy 99% of the performance of the most expensive chip.
You are in generation, so if, inexplicably, that ~10% differential in 1% lows starts to bother you in however many years, then there is an upgrade path.
Or get whatever the current generation's 6-core CPU is.
Then I misunderstood. I took it as him saying that 6-cores aged worse based on the anecdote of him buying a cheaper CPU and regretting it later on. Since people keep CPUs for 4, 5, 6, or even 7 years at times, I used older 6-cores vs. older higher-end CPUs to demonstrate that the older 6-cores did in fact suffer more over time than their brethren.
 

rodrigolfp

Haptic Gamepads 4 Life
People are surprised?

This looks about what I'd expect for actual next-gen games.
This looks current-gen to you?

forspoken.jpg


Those PS2 textures on the ground around "Forspoken".
 

ACESHIGH

Banned
People are surprised?

This looks about what I'd expect for actual next-gen games.

I think most folks, myself included, are not seeing enough visual bang for the buck in this game to warrant those specs.
The specs themselves are more or less in line with what's expected for next-gen games, save for some outrageous RAM and VRAM recommendations.

It's like asking for an i5-2500K and a GTX 760 to run Tetris. Most folks are over those requirements, but they still look silly.
 

Gaiff

SBI’s Resident Gaslighter
I always understood that PS5 = RTX 2070 and Xbox Series X = RTX 2080 Super.
They both perform closer to a 2070S, but there are some scenarios where they can approach and even beat a 2080S.

In the case of PlayStation exclusives, the PS5 can actually get close to a 2080 Ti/3070 in exceptional circumstances.
 

Gaiff

SBI’s Resident Gaslighter
This looks current-gen to you?

forspoken.jpg


Those PS2 textures on the ground around "Forspoken".
To be fair, is there anything that looks "next-gen"? The best-looking games are HFW, A Plague Tale: Requiem, Forza Horizon 5, Flight Simulator 2020, TLOU Part I, Cyberpunk 2077, and a few I'm missing. Would you say any of those games look "next-gen"? They all look like they could run on a PS4 with some cut-back settings (well, except for FS2020, but that's mainly due to the CPU).
 

yamaci17

Member
High-IPC 6 cores and 16 gigs will be fine until the end of the generation with CONSOLE-equivalent settings. So much fearmongering; it will all be dispersed once the games are released and they run just fine with said specs.

It's like claiming the 4-core 4670K is going to be a shitty CPU since the PS4 has 8 cores (slow, but 8!). In reality, aside from Odyssey/Origins, that CPU STILL manages to run PS4 ports way above 45 FPS (such as GoW and Spider-Man) with playable 1% lows. Even in the case of AC Odyssey and Origins, it was still pushing above 40 FPS; it's just that people did not expect 4 cores to run below 60 FPS at that time. But that was a time when consoles targeted 30 FPS to begin with. If console games target 60 FPS, similarly specced, higher-IPC PC CPUs will have no trouble hitting 60 FPS as well. If console games stop caring about 60 FPS and directly target 30 FPS, naturally a 5600X might hit a wall at 45 FPS in some arbitrary future-gen game, and that does not make it a bad CPU. It depends on the performance level the consoles target.


Yeah, a 2013 CPU running a PS4 port in 2022 with acceptable frametimes and framerates. Let's ignore that. It's only 4 cores. Surely it should've sucked in Spider-Man. Both the 5600X and 7600X have ample extra IPC/single-thread firepower to last an entire gen above console-standard framerates. That's it. Just like the 4670K had (only this time, at least the 5600X is closer to the consoles in core count instead of being at half the amount).

Maybe the 3600 can struggle and fly really close to or fall just a bit below the PS5. That's about it.

The 5600X will be fine running Hogwarts Legacy, Forspoken, or any other game that supposedly "requires" an 8-core behemoth.

What PS5 game is going to supposedly "crush" these high-powered 6/12 CPUs? And what is the point of bringing up a 6/6 counterpart? Games have yet to properly scale across 12 threads, yet some people somehow think games will scale so well that suddenly 12+ core CPUs will fly high. C'mon people. Be realistic. Be reasonable. Just a bit.
 

GHG

Member
Then I misunderstood. I took it as him saying that 6-cores aged worse based on the anecdote of him buying a cheaper CPU and regretting it later on. Since people keep CPUs for 4, 5, 6, or even 7 years at times, I used older 6-cores vs. older higher-end CPUs to demonstrate that the older 6-cores did in fact suffer more over time than their brethren.

No, you have not misunderstood. You are bang on in terms of what I'm saying.

This discussion originated when the Ryzen 3000 series was on the market and he was advising people to get 6-core CPUs (and 16GB of RAM) even when their budget allowed for an 8-core CPU along with 32GB of RAM through a bit of price optimisation (going for a different case, motherboard, fans, etc.).

My stance is, and will always be: get the best you can for your budget at the time of building; it is more likely to serve you well in the long term if you do that. Settling for 4 or 6 cores just because "it's OK for now" is nonsense as far as I'm concerned. When you do that, you're positioning yourself in a way that makes you more likely to get hit with a game that won't run well on your system regardless of whether it's well optimised or not. CPUs of the same generation with more cores have always offered more longevity, and that's a fact. I've been on the other side of it (I was poorly advised at the time and got a 3570K over a 3770K or 3930K, which were my other two options); never again, and I wouldn't want other people to go through that when they needn't.

If you get the platform right (motherboard, CPU, RAM) then there is absolutely no reason why it can't serve you well beyond the length of a typical console generation - all you will need to do is upgrade the GPU once or twice over that period of time and you're good to go. Instead, because of poor advice, you have people running into bottlenecks, whether that be system RAM or CPU bottlenecks, which typically manifest as frame-time issues (something Black_Stride always seems to ignore when bombarding people with graphs and fake YouTube benchmarks). Smooth and hassle-free experiences are what we should seek, should we not?

It's always better to go "overkill" than it is to go under. If you undercook things then you're leaving yourself exposed the moment a demanding game comes out (or if there's a wholesale shift in console generation).

For reference here is the post and thread that started it all:

https://www.neogaf.com/threads/contemplating-going-pc-next-gen-but-need-help.1560283/#post-259641475

Lilly Singh Liar GIF by A Little Late With Lilly Singh
 

Kenpachii

Member
You buy hardware based on what your demands are.

I buy CPUs to last me 5+ years, because I hate swapping motherboards/memory and reinstalling Windows, etc.; it needs to last me. This means that when a generation shift happens, it's absolutely fucking useless to buy 6-core CPUs when consoles are featuring 8-core CPUs. This is why I spent more on a 9900K over a 9700K/9600K, and it pays off now. Same as when I bought a 4/8 solution back in the PS3 days when everybody advised 4/4 solutions, but I saw massive gains years later because of the extra threads, which made the CPU last for almost a decade.

If you buy a CPU and plan on upgrading anyway within 2 years, there is no point in going for a more future-proof solution like 8 cores; a 9600K would have done fine and you'd just lose money for nothing.

Same goes for memory: 16GB has already been the standard for almost a decade, and consoles ship with 16GB. Since PC goes above console settings, it's safe to assume more memory in general will be needed for higher settings. If you don't care about dropping settings along the way, 16GB will be fine; if you want to be done with it, get 32GB. 64GB is overkill, so a waste of money. And if you are going to upgrade to DDR5 in 2 years, there's no point in buying more memory, because you will be replacing it anyway with faster memory.

GPU-wise, it depends on what you really want and what you expect out of it. If you play at 4K and want RT, it's going to be expensive; if you want 1440p and don't really care for RT, you can cheap out a lot more on it.

But in general you also need to look at what games you really like to play. If you play MMOs, RTS games, and builders, you probably want the best CPU solution you can get, and the GPU can take a back seat.

You don't have a lot of money but want something you can upgrade later on? Best to buy good parts and keep one part weak, like I did: I bought a 9900K but kept the GTX 970. A year later I bought a 1080 Ti, and a year after that a 3080. Because I spent a bit more on the 9900K, I didn't have to replace the CPU. If I had bought a 9600K, which would have been fine with a GTX 970 at the time, it would be a massive headache with a 3080 and I would want to upgrade now.

In the end this saves me money.

Buying a PC isn't just "buy this stock solution"; different people have different demands and need different solutions.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
No, you have not misunderstood. You are bang on in terms of what I'm saying.

This discussion originated when the Ryzen 3000 series was on the market and he was advising people to get 6-core CPUs (and 16GB of RAM) even when their budget allowed for an 8-core CPU along with 32GB of RAM through a bit of price optimisation (going for a different case, motherboard, fans, etc.).

My stance is, and will always be: get the best you can for your budget at the time of building; it is more likely to serve you well in the long term if you do that. Settling for 4 or 6 cores just because "it's OK for now" is nonsense as far as I'm concerned. When you do that, you're positioning yourself in a way that makes you more likely to get hit with a game that won't run well on your system regardless of whether it's well optimised or not. CPUs of the same generation with more cores have always offered more longevity, and that's a fact. I've been on the other side of it (I was poorly advised at the time and got a 3570K over a 3770K or 3930K, which were my other two options); never again, and I wouldn't want other people to go through that when they needn't.

If you get the platform right (motherboard, CPU, RAM) then there is absolutely no reason why it can't serve you well beyond the length of a typical console generation - all you will need to do is upgrade the GPU once or twice over that period of time and you're good to go. Instead, because of poor advice, you have people running into bottlenecks, whether that be system RAM or CPU bottlenecks, which typically manifest as frame-time issues (something Black_Stride always seems to ignore when bombarding people with graphs and fake YouTube benchmarks). Smooth and hassle-free experiences are what we should seek, should we not?

It's always better to go "overkill" than it is to go under. If you undercook things then you're leaving yourself exposed the moment a demanding game comes out (or if there's a wholesale shift in console generation).

For reference here is the post and thread that started it all:

https://www.neogaf.com/threads/contemplating-going-pc-next-gen-but-need-help.1560283/#post-259641475

Lilly Singh Liar GIF by A Little Late With Lilly Singh
And the 3600X and 3700X during their lifetimes didn't prove your stance.
The 5600X and 5700X, same shit.
The 7600X and 7700X will be the same.
The price differential would be better spent on another component, noting the 3600X and 3700X had a 100-dollar price differential.
The 7600X and 7700X also have a 100-dollar price differential.
Don't even get me started on the 10- and 16-core parts.

to-this-day-deontay-wilder-meme.gif
 

yamaci17

Member
The same damn 9600K and 9900K argument. You're comparing a shafted 6/6 CHIP with a freaking 8/16 chip. That's a total of 2.67x the THREAD COUNT.

What we're discussing is the 5600X and 5800X. One has 12 threads, the other has 16. 1.33x the THREAD COUNT.
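
Just the arithmetic, nothing fancy (Python, thread counts from the spec sheets):

Code:
# Thread counts: 9600K = 6, 9900K = 16, 5600X = 12, 5800X = 16.
print(f"9900K vs 9600K: {16 / 6:.2f}x the threads")    # ~2.67x
print(f"5800X vs 5600X: {16 / 12:.2f}x the threads")   # ~1.33x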

The 3700X experiences the exact same bottlenecks as the 3600X. Maybe you get 10% better 1% lows. It's not even going to matter if you combine either CPU with something like a 3060 Ti/2070 Super; you will be so GPU bound in most cases that the CPU will stop mattering in terms of 1% lows. You share super CPU-bound, super high-end GPU benchmarks to prove some points. It makes no sense.
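
Rough model of what "GPU bound" means here (hypothetical Python, made-up frame costs): the delivered frame time is roughly whichever side is slower, so once the GPU is the slower part, a faster CPU stops showing up in the numbers.

Code:
def delivered_fps(cpu_ms, gpu_ms):
    # Simplified: the frame ships when the slower of CPU prep or GPU render finishes.
    return 1000.0 / max(cpu_ms, gpu_ms)

GPU_MS = 16.0  # hypothetical GPU frame cost on a 2070 Super / 3060 Ti class card

for cpu, cpu_ms in (("3600", 11.0), ("3700X", 10.0), ("5600X", 8.0)):
    print(f"{cpu}: {delivered_fps(cpu_ms, GPU_MS):.0f} fps (GPU bound either way)")

All three print the same ~62 fps, because the GPU is the slower part in every case.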

Upgrade your GPU with a 3700X? So you think upgrading from a 2070 to something like a 4070 Ti will be "beneficial" for a 3700X chip? NO. Hell no. The 3700X limits the likes of a 2070 as it is.

Same for the 5600X/5800X. Both are suitable for the likes of a 3080/2080 Ti/3070. NEITHER will be suitable for the likes of a 4080/4090. You will never be satisfied.

You must build a balanced rig that lasts as long as POSSIBLE together. Enduring with a single CPU last gen was maybe possible; most people did not do that either. If you had a 7700K/1080 Ti system, you simply stayed there, OR you simply went for a 9900K/2080 Ti rig.
 

Gaiff

SBI’s Resident Gaslighter
The same damn 9600K and 9900K argument. You're comparing a shafted 6/6 CHIP with a freaking 8/16 chip. That's a total of 2.67x the THREAD COUNT.

What we're discussing is the 5600X and 5800X. One has 12 threads, the other has 16. 1.33x the THREAD COUNT.
The point was comparing available 6-core CPUs to higher-end CPUs, not imaginary ones. Besides, the 3600 these days fares quite a bit worse than the 3700X.

Also

4087427-9343625450-B9283.jpg


What the fuck is this shit?
 

Fabieter

Member
Why is there an outrage over this game? System requirements are most of the time higher than what's actually needed. They just tested it with those systems, and that shouldn't cause any problems.

And at the same time, the PS5 version is 1440p/60fps with dynamic resolution scaling; it's on par with a lot of PS5 games even if it drops the resolution heavily in some scenes. I don't see a problem here either.
 

yamaci17

Member
The point was comparing available 6-core CPUs to higher-end CPUs, not imaginary ones. Besides, the 3600 these days fares quite a bit worse than the 3700X.

Also

4087427-9343625450-B9283.jpg


What the fuck is this shit?
Quite a bit worse? You're exaggerating 10-20% differences. EVEN THEN, those actually "little bit" (not "quite a bit") differences only pan out if you pair the 3700X with an unbalanced GPU such as a 3080 or better. It is imbalanced, simple as that. No one in their sane mind should combo a 3700X with a 3080. The practical limit for that CPU is a 3060 Ti, and that's pushing your luck. Neither the 3600 nor the 3700X will "fulfill" the potential of a 3080.

And GPUs below that? Both CPUs will be sleeping in most cases with GPUs such as a 2080. In the cases where the 3700X matters, both CPUs are most likely obsolete for the given circumstance anyway. You want to be GPU BOUND as much as possible.

Do you think the difference between the two here cements the 3700X as a capable CPU that deserves a GPU upgrade? HELL no. Both BOTTLENECK the hell out of the GPU.

If you had a more balanced system where the GPU is maxed out near 60-65 frames, the 1% lows would be quite similar. What is the point of having a bottlenecked system to begin with? You must balance your build OR your settings so that you're GPU bound as much as possible. The situation below is not desired nor wanted, and it is no indicator that the 3700X is worth it over a 3600X. If anything, it proves that all those cores only provide a small uplift in performance despite all that jazz. One would expect better, not this. You make it sound like it's "quite a bit," but it's not even close. It's not even 50%. It's gotta be at least 50%, LIKE it happened with the 9600K and 9900K. It will never happen the way it happened between the 4670K and 4790K, OR the 9600K and 9900K, which people keep bringing up. Those were extreme cases with huge thread-count differences.

If anything, the table below proves that it is quite pointless to have the 3700X over the 3600 in a CPU-bound scenario. Both CPUs are almost used to their maximum, yet this is the best the 3700X can achieve over a 3600. It is not even RELEVANT for GPU-bound cases.

RThWeXh.png



Can you please provide the links to these videos? I really want to see what kind of CPUs and GPUs are being pitted against each other here.
 

GHG

Member
Where the fuck was the 8-core crowd when I was asking for advice about the CPU for the new build?

GHG, Kenpachii, yeah I'm talking to you two :lollipop_squinting:

I was only in the GPU threads, unfortunately.

If you're going for the 13600K then you needn't worry too much, due to the architectural changes.

However, if you're going for one of the older AMD 6-core chips then I'm sure Black_Stride will pay for your upgrade when your games start stuttering 😉
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I was only in the GPU threads, unfortunately.

If you're going for the 13600K then you needn't worry too much, due to the architectural changes.

However, if you're going for one of the older AMD 6-core chips then I'm sure Black_Stride will pay for your upgrade when your games start stuttering 😉
I advised him on the 13600K.
 

Gaiff

SBI’s Resident Gaslighter
Quite a bit worse? You're exaggerating 10-20% differences. EVEN THEN, those actually "little bit" (not "quite a bit") differences only pan out if you pair the 3700X with an unbalanced GPU such as a 3080 or better. It is imbalanced, simple as that. No one in their sane mind should combo a 3700X with a 3080. The practical limit for that CPU is a 3060 Ti, and that's pushing your luck. Neither the 3600 nor the 3700X will "fulfill" the potential of a 3080.
But a lot of people did pair a 3700X with a 3080. The 3700X was the best AMD CPU available when the 3080 was released; the 3060 Ti came out a while later. And those 10-20% differences are quite big when they used to be 1-5%. The 3700X came out in July 2019, over a year before Ampere. I have a friend who built a rig centered around it with a 1060 and upgraded to a 3080 when it came out.
And GPUs below that? Both CPUs will be sleeping in most cases with GPUs such as a 2080. In the cases where the 3700X matters, both CPUs are most likely obsolete for the given circumstance anyway. You want to be GPU BOUND as much as possible.
Except real-life scenarios don't demonstrate that. People don't keep their CPUs for a measly 1-2 years; they keep them for 4+ years easily. That's at least 2 GPU cycles, and degradation does matter. It's often the difference between lasting a GPU cycle and being forced to upgrade because those 10-20% drops affect your experience negatively.
 

yamaci17

Member
The 3700X is a stuttering mess of a CPU, FYI.

Using the 3700X as a case example is funny when the CPU is riddled with inter-CCX stutters. The 5600X will forever provide a smoother experience than the POS 3700X will ever be able to achieve.

As for the case between the 5600X and 5800X, both will have similar stutters. If a game is properly coded, it WON'T even stutter on a modern 4/8 i3 - Spider-Man, for example.

Now, YOU CAN find stutter videos all over the internet, since PC games are finicky and will stutter at the slightest inconvenience that background Windows bloat throws at them.

Dude, I can find videos of a modern 4/8 i3 pitted against the 3700X where the 3700X has worse 1% lows :d
 

yamaci17

Member
But a lot of people did pair a 3700X with a 3080. The 3700X was the best AMD CPU available when the 3080 was released; the 3060 Ti came out a while later. And those 10-20% differences are quite big when they used to be 1-5%. The 3700X came out in July 2019, over a year before Ampere. I have a friend who built a rig centered around it with a 1060 and upgraded to a 3080 when it came out.

Except real-life scenarios don't demonstrate that. People don't keep their CPUs for a measly 1-2 years; they keep them for 4+ years easily. That's at least 2 GPU cycles, and degradation does matter. It's often the difference between lasting a GPU cycle and being forced to upgrade because those 10-20% drops affect your experience negatively.
They used to be 1-5% for GPU-bound scenarios, which they still are. Games did not change all of a sudden; you just got more powerful GPUs that pushed more CPU-bound situations.

Also, that's their mistake. No sane person should've combined a 3700X with a 3080. A 3080 deserved AT LEAST a highly clocked Intel chip, OR you've got to push some serious 4K onto it; in both cases, you should be GPU bound. Also, you're wrong: the 3080 came out in late 2020 (September?), and the 5600X/5800X released in November 2020. I'm sure you could've had the patience of one month and gotten the optimal performance out of your GPU. Zen 3 was always meant for Ampere; Zen 2 was never meant for high-end Ampere. Zen 2 is a flawed architecture to begin with.

A 3700X/1060 is also an imbalanced build, just the other way around.

Whatever, you believe whatever you want to believe.

Most 3700X and 3080 users were disgruntled and ended up upgrading to a 5600X or 5800X, which gave them QUITE a bit more CPU-bound performance (35-50%) unless they played at heavy 4K settings, that is. The 3700X simply lacks the firepower to satisfy the possibilities a 3080 brings to the table. Further GPUs would only add FURTHER salt to the wound (such as a 4070 Ti+).

As I said, send me the videos. 1% lows are random and erratic; I'm sure I can find cases in your videos where the situation is somehow reversed. Games are simply not coded to make FULL use of 12 threads, let alone 16. You people are really exaggerating this. If you open TONS OF stuff in the background, THAT'S another story; then those people really should state their use case and get the 8-core counterpart as they wish. But if someone purely plays games with minimal bloat in the background, a 6-core is fine for them. A Discord here and a Spotify there won't break the back of a CPU. Most of these Electron apps are running on the GPU anyway :D
 

Kataploom

Gold Member
The same damn 9600K and 9900K argument. You're comparing a shafted 6/6 CHIP with a freaking 8/16 chip. That's a total of 2.67x the THREAD COUNT.

What we're discussing is the 5600X and 5800X. One has 12 threads, the other has 16. 1.33x the THREAD COUNT.

The 3700X experiences the exact same bottlenecks as the 3600X. Maybe you get 10% better 1% lows. It's not even going to matter if you combine either CPU with something like a 3060 Ti/2070 Super; you will be so GPU bound in most cases that the CPU will stop mattering in terms of 1% lows. You share super CPU-bound, super high-end GPU benchmarks to prove some points. It makes no sense.

Upgrade your GPU with a 3700X? So you think upgrading from a 2070 to something like a 4070 Ti will be "beneficial" for a 3700X chip? NO. Hell no. The 3700X limits the likes of a 2070 as it is.

Same for the 5600X/5800X. Both are suitable for the likes of a 3080/2080 Ti/3070. NEITHER will be suitable for the likes of a 4080/4090. You will never be satisfied.

You must build a balanced rig that lasts as long as POSSIBLE together. Enduring with a single CPU last gen was maybe possible; most people did not do that either. If you had a 7700K/1080 Ti system, you simply stayed there, OR you simply went for a 9900K/2080 Ti rig.
I guess you meant that for people who want to push their hardware with HFR and not stay at 60 fps, right?

I'm a little anxious for this game, the DS remake, and HL to come out so I can see how things are looking for my 5600X and 6700 XT this gen. I'll be OK with 60 fps at 1440p and even some 4K games using FSR.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Well hopefully it works out (in that case it actually should).

16GB of RAM is already fast proving to be inadequate; we are only in the first half.
I'll be upgrading to 32GB of DDR5 soonish.
But from the Ryzen 3000 launch till now, 16GB has worked out just fine with no issues (as I said way back when).
The highest usage I've seen for any game was like 12GB.
 

yamaci17

Member
I guess you meant that for people who want to push their hardware with HFR and not stay at 60 fps, right?

I'm a little anxious for this game, the DS remake, and HL to come out so I can see how things are looking for my 5600X and 6700 XT this gen. I'll be OK with 60 fps at 1440p and even some 4K games using FSR.
I'm sure your system will be fine for these games; we will have to see.

My problem, however, is that some people here somehow think everyone must get an 8-core CPU, ignoring the market for 6/12 CPUs completely. The 5600X, in my opinion, will be super fine for both HL and the DS remake, and this Forspoken too. If these games end up broken in terms of CPU performance, it will be because of SINGLE-THREAD bottlenecks, which will put the 5600X and 5900X in the same basket. I really don't think games are in a state where they can bog down a 12-thread CPU. A 6700 XT and 5600X is a super balanced build in my opinion.
 

Gaiff

SBI’s Resident Gaslighter
They used to be 1-5% for GPU-bound scenarios, which they still are. Games did not change all of a sudden; you just got more powerful GPUs that pushed more CPU-bound situations.
Games got more demanding and the gap between them widened quite a bit. People used to say to just get the 3600 because it performs the same. Not so true 3 years later.
Also, that's their mistake. No sane person should've combined a 3700X with a 3080. A 3080 deserved AT LEAST a highly clocked Intel chip, OR you've got to push some serious 4K onto it; in both cases, you should be GPU bound. Also, you're wrong: the 3080 came out in late 2020 (October?), and the 5600X/5800X released in November 2020. I'm sure you could've had the patience of one month and gotten the optimal performance out of your GPU.
No, I'm not wrong. The 3080 came out on September 17th, 2020. The first Zen 3 CPUs came out on November 5th, 2020, 48 days later. Furthermore, this ignores those who already had Zen 2 CPUs in their rigs. If you built a rig in 2019 with a 3700X/2070S, a 3080 would be an absolutely worthwhile upgrade without needing a new CPU. In what kind of imaginary world do you live to think that no one paired their 3080 with a 3700X? It was a popular combination.
A 3700X/1060 is also an imbalanced build, just the other way around.
Because he was waiting for Ampere. He got the 1060 for peanuts.
Whatever, you believe whatever you want to believe.
I believe in reality. We're talking about real-life situations, not imaginary CPUs.
Most 3700X and 3080 users were disgruntled and ended up upgrading to a 5600X or 5800X, which gave them QUITE a bit more CPU-bound performance (35-50%) unless they played at heavy 4K settings, that is.

As I said, send me the videos. 1% lows are random and erratic; I'm sure I can find cases in your videos where the situation is somehow reversed.
 

yamaci17

Member
Games got more demanding and the gap between them widened quite a bit. People used to say to just get the 3600 because it performs the same. Not so true 3 years later.

No, I'm not wrong. The 3080 came out on September 17th, 2020. The first Zen 3 CPUs came out on November 5th, 2020, 48 days later. Furthermore, this ignores those who already had Zen 2 CPUs in their rigs. If you built a rig in 2019 with a 3700X/2070S, a 3080 would be an absolutely worthwhile upgrade without needing a new CPU. In what kind of imaginary world do you live to think that no one paired their 3080 with a 3700X? It was a popular combination.

Because he was waiting for Ampere. He got the 1060 for peanuts.

I believe in reality. We're talking about real-life situations, not imaginary CPUs.


Let's see, then.

Is this OK here? The 3700X has many stutters, not SO different from the 3600 here. Do those stutters look OK to you? And Odyssey was a 2018 game (so technically, it's an old game, huh? Somehow it revealed the weakness of the venerable 6-core the year it released?). Both have stutters, the kind of stutters that would startle gamers on either CPU in CPU-bound situations.



moey6zd.jpg


Again, it does not look like the venerable 3700X is impervious to the frame spikes that you would attribute to 6 cores being "insufficient".
i4ilYay.jpg



-----


Oh, the Borderlands 3 one. IT IS CLEAR that one run already has its SHADERS compiled. It is a MISTAKE on the reviewer's part. They most likely benchmarked the 3600 first, HAD BIG shader compilation stutters, and on the second run, when they swapped CPUs, the shaders were already compiled, so the 3700X had an easier time.

Stutters usually happen:

1) when the CPU is super stressed and maxed out
2) during shader compilation
3) during traversal

Do you see the 3600 being stressed or maxed out here? It literally chills at 29% usage. How is this proof of "the 3600 being bogged down because of 6 cores :("? If anything, the scene below proves that the video owner did not do their homework. They should've tested each CPU twice to make sure shader compilation stutters did not get in the way of testing. Sadly, they did, which somehow ended up as your argument for how the 6-core sucks, despite it chilling at a measly 29% usage. It is clear that only 2-3 of its cores are being used here, so what? How is this going to prove to me that 6 cores suck?
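
(The first-run/second-run effect is basically a cold vs. warm shader cache. Toy Python sketch of the idea, nothing engine-specific, numbers made up:)

Code:
import time

shader_cache = {}  # persists between runs in a real driver/game

def get_shader(key):
    # First use of a shader: compile (slow, causes a hitch). After that: cached.
    if key not in shader_cache:
        time.sleep(0.05)              # pretend compilation costs ~50 ms
        shader_cache[key] = f"compiled:{key}"
    return shader_cache[key]

for run in (1, 2):
    start = time.perf_counter()
    for key in ("rock", "grass", "water", "skin"):
        get_shader(key)
    print(f"run {run}: {(time.perf_counter() - start) * 1000:.0f} ms of compile hitches")

Run 1 eats all the hitches, run 2 is clean, regardless of which CPU is installed.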

E9rWSmv.jpg


And Warzone? This is one of the most brutally CPU-bound games out there; it can truly max out CPUs at high framerates.

Sure, the framerates are a bit lower, but it has almost the same stability. The game below is more demanding in terms of CPU-boundness than BL3: 71% usage, which means that almost 8-9 threads of the 3600 are saturated, and yes, it is fine. Yet somehow you expect me to believe that BL3 pushes the 3600 to its limits at only 30% usage? OK?
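
(That "8-9 threads" figure is just this rough conversion; aggregate usage is a crude proxy, but it's the arithmetic I'm using. Hypothetical Python:)

Code:
threads = 12          # Ryzen 5 3600: 6 cores / 12 threads
overall_usage = 0.71  # aggregate CPU usage reported by the overlay
print(f"~{overall_usage * threads:.1f} of {threads} threads' worth of work")  # ~8.5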



ozazdMZ.jpg



And this section proves how erratic PC games can be.

6YWQ0Mw.jpg



Once again, at 5:35, during the Horizon Zero Dawn benchmark, IT IS APPARENT that the game is compiling shaders. These are shader compilation hitches. This is a huge mistake on the reviewer's behalf. Once again, the CPU idles at 34% usage, yet it stutters heavily. Those are compilation stutters; I've seen tons of them. You're free to believe otherwise. This is not normal, nor should it be taken as a source or proof that 6 cores are dying.

sTUBy62.jpg


Same for the Witcher 3 part. The reviewer most likely had a new driver installed for their GPU, first did the tests on the 3600 (full of hitches and stutters from various shader compilations), and once they swapped CPUs (shaders are compatible between different CPUs; you only need new shaders if you change your driver or your GPU), the gameplay became smoother.

Sorry, but this video is flawed at its core. I did that exact same BL3 benchmark, got tons of stutters the first time, and once I re-ran the benchmark, it was way smoother.
 

T4keD0wN

Member
Yes but you're missing the point. The point is that on PC, it's very rare that a developer can write code that will run optimally on a given GPU and not be bottlenecked somewhere else. People love to attribute that to just "lazy developers" and "bad PC ports" and they have no idea just how difficult it is to make a game run optimally and smoothly on the PC platform due to just how much variance there is in the supported systems. The reality is that you can write the same code that will run "100%" on a console, but that same code will run at a wide range of sub-100% values on PCs depending on the particular PC they test on.
Weird, I can't think of any games where I had trouble getting at least 80-100% GPU usage besides Gotham Knights, which often dropped to ~50%, and I only have a 12600K. I still get 90-110fps when riding a bike and 150+fps indoors in that game, which is way more than the console version, although I didn't use ray tracing, which would make it heavier on the CPU. Most PC devs don't have trouble making their games scale well on PC and hit 90%+ GPU usage provided your CPU is strong enough. I won't count games with FPS limits for obvious reasons.
Name one AAA game released this decade or last where you can't get 90%+ GPU usage with a Pascal, Turing, or Ampere card @8K resolution outside of loading screens or menus.

That's not at all a fair comparison.
Wait up..... That's exactly what I've said.......

The point again is that a developer will never write code that will run at only 30% on a console, because there is a lot more in their control in terms of how their code will run. On a PC, there are always so many unknowns, and no developer in the world can test every HW and SW permutation to ensure that the code runs optimally in all cases. The goal here isn't to compare the theory of their performance. Limiting a console to 30% utilization when that will never be the case in an actual game is just silly and only serves to say, hey, this console is only X% of the performance of this PC card in theory. But in reality, with real app code, how that code is run and executed is going to vary greatly between console and PC. That is the point. Of course the dev would want that code to run at 100% on every platform, but they can generally only guarantee that on a console, by virtue of it being a fixed, known config. They can never say that about "the PC" as a platform, only about certain specific configs that they may have access to for testing.
Yeah, you're quite right here in saying that it's much more work and harder. The better hardware you have, the better they will run.

That's why the majority of gamers at the start of 2023 are still gaming on 1080p 60Hz monitors, with fewer than 8 CPU cores, and GPU performance that's well below a 2070 level. I could spend hours telling you about all the crazy configurations folks have in their PCs, and again, the point is that as a PC developer you HAVE to account for this, as unfortunate and silly as it is. You have to account for the fact that people who are less knowledgeable in the PC space may upgrade their GPUs without updating the rest of the system, that they may still be using standard HDDs, that they may not have an RT-capable GPU (despite the fact that Nvidia's 2xxx series came out over 4 years ago with RT and we're on our 3rd generation of RT HW), etc.
They didn't seem to account for it at all going by the requirements. I bet it runs at the same framerate when CPU bound @360p with 6 cores enabled as it does with 10 cores enabled.

It's the reality of the world and while NeoGaf members can choose to ignore it and live in their bubble of PC dominance and multi thousand $$ rigs, the average developer cannot afford to do the same (if they want to make any money that is).
Having high requirements limits the number of potential buyers (look at DOTA, Counter-Strike, and League of Legends, which run on anything and top the popularity charts). You make no sense here: if they wanted to make more money and sell a lot of copies, they would ensure it's well optimized and scales well. The lower the requirements, the more potential buyers there are.
 

GHG

Member
I love PCGaf but I will never argue with any of you guys. These boys are willing to write a 40 page manifesto about why your opinion is shit at the drop of a hat.

I'm laughing because it's true.

It's funny to me how we all somehow have different interpretations of the same data. I guess that's what drives everyone's component choices; if not, everyone would have identical builds and that would be boring.
 

Gaiff

SBI’s Resident Gaslighter
Let's see, then.

Is this OK here? The 3700X has many stutters, not SO different from the 3600 here. Do those stutters look OK to you? And Odyssey was a 2018 game (so technically, it's an old game, huh? Somehow it revealed the weakness of the venerable 6-core the year it released?). Both have stutters, the kind of stutters that would startle gamers on either CPU in CPU-bound situations.
The 3700X still performs substantially better, which is the point.
Again, it does not look like the venerable 3700X is impervious to the frame spikes that you would attribute to 6 cores being "insufficient".
No one said the 3700X is impervious to frame spikes. It's just more consistent across the board, with significantly higher lows and better averages. That matters a lot for a smooth gaming experience.
-----


Oh, the Borderlands 3 one. IT IS CLEAR that one run already has its SHADERS compiled. It is a MISTAKE on the reviewer's part. They most likely benchmarked the 3600 first, HAD BIG shader compilation stutters, and on the second run, when they swapped CPUs, the shaders were already compiled, so the 3700X had an easier time.

Stutters usually happen:

1) when the CPU is super stressed and maxed out
2) during shader compilation
3) during traversal

Do you see the 3600 being stressed or maxed out here? It literally chills at 29% usage. How is this proof of "the 3600 being bogged down because of 6 cores :("? If anything, the scene below proves that the video owner did not do their homework. They should've tested each CPU twice to make sure shader compilation stutters did not get in the way of testing. Sadly, they did, which somehow ended up as your argument for how the 6-core sucks, despite it chilling at a measly 29% usage. It is clear that only 2-3 of its cores are being used here, so what? How is this going to prove to me that 6 cores suck?

E9rWSmv.jpg
We can ignore Borderlands 3.
And Warzone? This is one of the most brutally CPU-bound games out there; it can truly max out CPUs at high framerates.

Sure, the framerates are a bit lower, but it has almost the same stability. The game below is more demanding in terms of CPU-boundness than BL3: 71% usage, which means that almost 8-9 threads of the 3600 are saturated, and yes, it is fine. Yet somehow you expect me to believe that BL3 pushes the 3600 to its limits at only 30% usage? OK?



ozazdMZ.jpg
The 3700X still performs better, when there used to be virtually no difference between them in games.
And this section proves how erratic PC games can be.

6YWQ0Mw.jpg



Once again, at 5:35, during the Horizon Zero Dawn benchmark, IT IS APPARENT that the game is compiling shaders. These are shader compilation hitches. This is a huge mistake on the reviewer's behalf. Once again, the CPU idles at 34% usage, yet it stutters heavily. Those are compilation stutters; I've seen tons of them. You're free to believe otherwise. This is not normal, nor should it be taken as a source or proof that 6 cores are dying.
I never said 6-core CPUs are dying. Why are you exaggerating? Horizon also compiles shaders every time it updates or new drivers are detected. You cannot start the game without the shaders compiling beforehand. Those frame spikes are more likely due to asset streaming or something else.
Same for the Witcher 3 part. The reviewer most likely had a new driver installed for their GPU, first did the tests on the 3600 (full of hitches and stutters from various shader compilations), and once they swapped CPUs (shaders are compatible between different CPUs; you only need new shaders if you change your driver or your GPU), the gameplay became smoother.
I've never heard of compilation stutters in The Witcher 3.
Sorry, but this video is flawed at its core. I did that exact same BL3 benchmark, got tons of stutters the first time, and once I re-ran the benchmark, it was way smoother.
The video has one game with UE4 shader compilation stutters. The rest is fine. I never claimed that 6-core CPUs are dead or belong in the trash bin. I simply agree with GHG that they age worse than their more powerful family members. If someone were to build a rig with a 4090 and intends to keep it for at least two cycles of high-end GPUs, I wouldn't recommend a 6-core CPU like the 7600X.
 

yamaci17

Member
The 3700X still performs substantially better, which is the point.

No one said the 3700X is impervious to frame spikes. It's just more consistent across the board, with significantly higher lows and better averages. That matters a lot for a smooth gaming experience.

We can ignore Borderlands 3.

The 3700X still performs better, when there used to be virtually no difference between them in games.

I never said 6-core CPUs are dying. Why are you exaggerating? Horizon also compiles shaders every time it updates or new drivers are detected. You cannot start the game without the shaders compiling beforehand. Those frame spikes are more likely due to asset streaming or something else.

I've never heard of compilation stutters in The Witcher 3.

The video has one game with UE4 shader compilation stutters. The rest is fine. I never claimed that 6-core CPUs are dead or belong in the trash bin. I simply agree with GHG that they age worse than their more powerful family members. If someone were to build a rig with a 4090 and intends to keep it for at least two cycles of high-end GPUs, I wouldn't recommend a 6-core CPU like the 7600X.

Horizon Zero Dawn changed its shader compilation model before the date of that video; it now compiles shaders on the fly, or while in the menu.

The Witcher 3 always had compilation stutters, lol. You can do a quick search for "Novigrad Witcher 3 stutter". Same for AC Odyssey. Some people think DX11 is impervious to shader compilation stutters, but nah, these two games had plenty of them.

You say a 4090 and a 7600X. When did I say I would suggest a 7600X with a 4090? A 7600X should be combined with anything from a 4070 Ti to a 3090 Ti.
 

Puscifer

Member
So it's 20% off at Green Man Gaming, lol.

Remember when entire publishers made a stink over this sort of thing in the past? Not sure what to make of Square being okay with that launch discount.
 

GHG

Member
You say a 4090 and a 7600X. When did I say I would suggest a 7600X with a 4090? A 7600X should be combined with anything from a 4070 Ti to a 3090 Ti.

And this here is the crux of the issue.

What you're basically saying is that if a person builds the PC you recommend today, then it likely won't be a platform that responds well to GPU upgrades in the future.

Today's 4090 is tomorrow's 5070.

So my advice is consistent: if you're on a budget, cut the RGB crap, really think about which motherboard features you do and don't need, think about whether the pretty glass-panel case is a necessity, and reroute that money to ensure you get at least 32GB of system RAM along with an 8+ core CPU. That way you will have a platform that is in a much better position to carry you into the future than one with just 16GB of system RAM and a 6-core CPU.

Nobody ever built a PC and said they wished they'd purchased less powerful and less capable components.
 
Doesn't really make sense: a 6700 XT for 1440p30, but a 6800 XT for UHD60... the 6800 XT isn't so much faster that it can push that many more pixels in half the frame time...

The 6700 XT will probably do 1440p60 just fine.
 

feynoob

Member
Off topic:
I have an RX 570 with a Ryzen 3 3100, which I want to upgrade slowly due to financial reasons.
Which should I upgrade first, the graphics card or the processor?
What are the cheapest options that can last me 2-4 years from now?
 

GHG

Member
Off topic:
I have an RX 570 with a Ryzen 3 3100, which I want to upgrade slowly due to financial reasons.
Which should I upgrade first, the graphics card or the processor?
What are the cheapest options that can last me 2-4 years from now?

What's your motherboard?

That's the most important thing for determining whether or not you have a viable upgrade path or if you'll need to do a new build from scratch.
 