
CPU gaming performance has come a long way in the past 3 years.

Leonidas

Member
As I was researching CPU gaming performance, I came across this graph from Techspot/HardwareUnboxed.

[Techspot/HardwareUnboxed chart: average gaming performance across 12 games]


The surprising thing to me is that right now we're actually seeing 10-20% gains per release in gaming performance vs. the previous year's best gaming CPU, based on the data from this Techspot/HUB review. The sample size of 12 games is small though, and some games may skew the results.

For the past 3 years

2020: 5950X beat the 10900K by 13-14% at 1080p and 1440p
2021: 12900K (D5) beat the 5950X (2020's best gaming CPU) by 18-19% at 1080p and 1440p
2022: 13900K (D5) beat the 12900K (D5) (2021's best gaming CPU) by 14% at 1080p and 9% at 1440p

It's also good to see that going DDR4 in 12th/13th Gen doesn't seem to hurt performance too much.

At the bottom of the graphs you see Zen2 and Intel 10th Gen. Ever since I got into PC gaming I don't recall ever seeing such gains every year in terms of gaming CPU performance vs the previous years best gaming CPU. I hope we keep seeing these kinds of yearly increases so I can have an excuse to upgrade my rig again in a few years...
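For a rough sense of what those yearly figures compound to, here's a quick back-of-the-envelope sketch (using the lower bound of each range above; these are my illustrative numbers, not figures from the review itself):

```python
# Compounding the year-over-year 1080p gains quoted above (lower bound of each range).
yearly_gains_1080p = {
    "2020: 5950X vs 10900K": 0.13,
    "2021: 12900K vs 5950X": 0.18,
    "2022: 13900K vs 12900K": 0.14,
}

cumulative = 1.0
for step, gain in yearly_gains_1080p.items():
    cumulative *= 1.0 + gain
    print(f"{step}: +{gain:.0%}")

print(f"Compound gain over the three releases: +{cumulative - 1:.0%}")
# Roughly +52% at 1080p going from a 10900K to a 13900K on this data.
```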
 
To be honest, a lot of game engines have been slow to adopt multicore coding on CPUs.
You still see it today: certain engines like id Tech, ForzaTech and IW7 have excellent CPU efficiency and maintain rock-solid framerates across multiple SKUs, while other engines struggle to run a game at a locked 30fps even when it's graphically inferior to games like Forza, Doom and COD.
I think this is also why Sony's in-house engines and studios are very consistent in performance and graphics quality. That PS3 made them very good at parallel coding.
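(Just to illustrate what "parallel coding" means here in the abstract, a minimal, hypothetical sketch of a per-frame job system follows; real engines do this in native code with their own schedulers, the task names are made up, and Python's GIL limits true parallelism for pure-Python work, so treat it as structural only.)

```python
# Toy per-frame job system: independent CPU tasks (physics, AI, animation) are
# fanned out to worker threads and joined before the frame is handed to the renderer.
# Hypothetical task names; a real engine uses native threads/fibers and a job graph.
from concurrent.futures import ThreadPoolExecutor

def update_physics(dt): ...   # placeholder work
def update_ai(dt): ...
def update_animation(dt): ...

def run_frame(pool: ThreadPoolExecutor, dt: float) -> None:
    jobs = [pool.submit(task, dt) for task in (update_physics, update_ai, update_animation)]
    for job in jobs:
        job.result()  # block until all CPU-side work for this frame is done

with ThreadPoolExecutor(max_workers=4) as pool:
    run_frame(pool, dt=1 / 60)
```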
 

Freeza93

Banned
UE is easily accessible. Steam is proof of that: 80% of the garbage there is built on UE4.

You don't need much dev knowledge to make a game with it. A shitty one, yes, but it has made plenty of people rich.

Look at Ark: Survival Evolved. It's one big pile of software garbage that runs on a years-old UE4 version. It runs terribly, looks terrible, has game-breaking bugs, crashes, stutter; it has pretty much everything a playable version of anything should not have.

No one should trash engines. You will never know what it's like to be a programmer or even remotely understand how a game engine operates. The real magicians of gaming, whom you will never hear about.

Making maps and animations is not hard lol. Time-consuming, but not hard. UE does a lot for you, A LOT. Devs would piss themselves if you set them 20 years back. 🫣
 

Rubicaant

Member
It's because new GFX cards are becoming more 4K-focused, so when you benchmark at 1080p and 1440p it reveals any CPU bottlenecks. Check 4K benchmarks and it's more like how it was in previous generations.
 

Freeza93

Banned
It's because new GFX cards are becoming more 4K-focused, so when you benchmark at 1080p and 1440p it reveals any CPU bottlenecks. Check 4K benchmarks and it's more like how it was in previous generations.
I also don't like this kind of talk. A GPU was never made for any particular resolution; that's marketing. We're simply now at the point where 4K is playable, nothing more. There's a reason Nvidia came up with "RTX": we now have more than enough power for rasterization. So how do you get people to buy new stuff, even if it's undercooked af and not really real ray tracing? You cripple performance for the next decades.

A fully ray-traced game with, let's say, Red Dead Redemption 2 graphics will take decades to run at 4K.

Look at Quake 2; we can barely run that. My 3080 shit itself in a 20-year-old game. Now look at Portal. Good luck with "hybrid fake ray tracing".
And yes, I get lost while typing on the phone 😆.
 

amc

Member
Flumps. Sugary and chewy, Edited by fuck knows who. LOL. Someone has edited posts in my conversation so I joined in. Shitbird.
 

RoboFu

One of the green rats
That just shows there isn't much difference between a $250 CPU and a $575 CPU 😵‍💫
 

Leonidas

Member
Talking the cutting edge, I don't give a fuck about 1080p. Yeah, CSGO, you the man, but the GAF majority want image and performance metrics, not 2014 codswallop resolutions.

The jump in 4K is the metric, pls. Sack the 1080p charts. Yeah, the 13900K is insane, got one.
The OP includes 1440p charts too. There are still ~10-20% gains at 1440p according to their review.

1440p is the sweet spot in terms of IQ and framerate.
 

Leonidas

Member
Exactly. 1080P is a bullshit metric that means shit in 2023 for the majority.
The fact that high-refresh 1080p displays are being made at 360 Hz and beyond means it's still relevant to include 1080p results.

The fact that the 4090 at 1440p only loses a few % vs. 1080p is pretty amazing.

If you want 4K results check out 4090 GPU reviews.
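For context on why high refresh keeps CPU results relevant, these are the frame-time budgets those refresh rates imply (simple arithmetic on my part, not data from the review):

```python
# Per-frame time budget at common refresh rates; the CPU has to finish its work
# inside this window every frame or the display can't be fed at full rate.
for hz in (60, 144, 240, 360):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame")
# 60 Hz -> 16.67 ms, 144 Hz -> 6.94 ms, 240 Hz -> 4.17 ms, 360 Hz -> 2.78 ms
```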
 

Marchizmo

Member
I'm still running a 4790k from 2014. I should really look to upgrade, but I can still play just about everything at 1080p 60fps which is all I care about.
I finally ditched mine last summer, but boy did that thing really hold up for all those years.
 

Leonidas

Member
I'm still running a 4790k from 2014. I should really look to upgrade, but I can still play just about everything at 1080p 60fps which is all I care about.
No interest in high refresh? Upgrading to a 1440p high refresh panel was the best PC gaming purchase I've made in a while.

Everything is silky smooth :goog_cool:

Leonidas I have a 4090. 1080p is nonsense with that GPU. It's not a 1080p card.

That's great, but this post is about CPU gaming performance, not the GPU.
 

LiquidMetal14

hide your water-based mammals
I went from not even thinking about upgrading this 5900X build to maybe potentially doing an X3D upgrade this year, depending on how the finances line up.
 

dave_d

Member
Ever since I got into PC gaming I don't recall ever seeing such gains every year in terms of gaming CPU performance vs the previous years best gaming CPU. I hope we keep seeing these kinds of yearly increases so I can have an excuse to upgrade my rig again in a few years...
Oh, we saw gains like this in the 90s. One year you'd have a 386 and the next you'd have a 486 running 2-3x as fast. Of course, if you tried to run some of Origin's stuff and you didn't have the latest and greatest, it was a slideshow, so you expected to upgrade constantly. (My brother literally went from 8086 -> 386 -> 486 -> Pentium from about '89 -> '96.) Huge gains back then. (But your systems went totally obsolete in a couple of years too.)
 

BlackTron

Member
I'm still running a 4790k from 2014. I should really look to upgrade, but I can still play just about everything at 1080p 60fps which is all I care about.

This chip doesn't leave much to be desired at 1080p. It's still in my desktop too, paired with the ole' 1060. Even though I have a much better laptop, I still use it to play some FPS games at 1080p/144Hz for convenience's sake.
 

diffusionx

Gold Member
Too bad these benchmarks are completely artificial. Nobody is buying a 4090 to play these games at 1080p. They do it because if you actually played the games the way you normally would, the GPU would be the actual bottleneck and all those deltas would get wiped out. The benchmarks are constructed to isolate the CPU.
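One crude way to picture that isolation (a toy model with made-up numbers, not how the benchmarks are actually produced): the frame rate is set by whichever of the CPU or GPU takes longer per frame, and lowering the resolution mostly shrinks only the GPU side.

```python
# Toy bottleneck model: frame time ~= max(CPU frame time, GPU frame time).
# All numbers below are hypothetical, just to show where the CPU deltas surface.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000 / max(cpu_ms, gpu_ms)

cpu_ms = 5.0  # CPU cost per frame, roughly resolution-independent
for res, gpu_ms in [("1080p", 3.0), ("1440p", 5.5), ("4K", 12.0)]:
    bound = "CPU" if cpu_ms >= gpu_ms else "GPU"
    print(f"{res}: {fps(cpu_ms, gpu_ms):.0f} fps ({bound}-bound)")
# At 1080p the CPU sets the ceiling, so CPU differences show; at 4K the GPU hides them.
```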
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
As I was researching CPU gaming performance, I came across this graph from Techspot/HardwareUnboxed.

[Techspot/HardwareUnboxed chart: average gaming performance across 12 games]


The surprising thing to me is that right now we're actually seeing 10-20% gains per release in gaming performance vs. the previous year's best gaming CPU, based on the data from this Techspot/HUB review. The sample size of 12 games is small though, and some games may skew the results.

For the past 3 years

2020: 5950X beat the 10900K by 13-14% at 1080p and 1440p
2021: 12900K (D5) beat the 5950X (2020's best gaming CPU) by 18-19% at 1080p and 1440p
2022: 13900K (D5) beat the 12900K (D5) (2021's best gaming CPU) by 14% at 1080p and 9% at 1440p

It's also good to see that going DDR4 in 12th/13th Gen doesn't seem to hurt performance too much.

At the bottom of the graphs you see Zen2 and Intel 10th Gen. Ever since I got into PC gaming I don't recall ever seeing such gains every year in terms of gaming CPU performance vs the previous years best gaming CPU. I hope we keep seeing these kinds of yearly increases so I can have an excuse to upgrade my rig again in a few years...

Jeez, I'm starting to think I don't need to upgrade for a long while.
My monitor is 3440x1440 @ 165Hz.
The bottom of that graph is still a 140fps average.
Man, all these CPUs are well overpowered.

Imma be on LGA1700 till the PlayStation 6, easy work.
 

skneogaf

Member
I play games with graphics on absolute maximum at 4K 120Hz, so I just need a fairly recent CPU like the i7 9700K I use, as I hit a GPU bottleneck before a CPU bottleneck.
 

winjer

Gold Member
This is only happening because we have competition, so AMD and Intel have to work hard to deserve our money.

We have to remember when Intel was alone and dominant in the CPU market: for a decade we saw little to no IPC improvement, i7s only had 4C/8T, and the only thing that would increase was prices.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I play games with graphics on absolutely maximum at 4k 120hz so I just need a fairly recent cpu like a i7 9700k that I use as I hit gpu bottleneck before cpu bottleneck.

Is it worth upgrading from a 3900X if I don't care about over 60fps?

It averages ~140fps..... so even when/if you upgrade to a 120Hz panel it'll still be a good get on average.
If you are only hunting 60fps..... then hodl for a lot longer.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I plan on keeping it for at least another 5 years, hope it will be good enough
Oooff..........in a good way!
I think by the time the PS6 is dropping, its IPC might be really, really low for the time.
So I hope devs start really parallelizing games so those cores actually get to work, and who knows how long you can keep maintaining a stable 60fps.
 

Redneckerz

Those long posts don't cover that red neck boy

Assaulty

Member
Recently put in a 13600K and it's an absolute BEAST of a CPU. I expect it to last 5-6+ years. For single player I'm fine with 60fps, and most comp shooters don't need a whole lot.
 

samoilaaa

Member
Oooff..........in a good way!
I think by the time the PS6 is dropping, its IPC might be really, really low for the time.
So I hope devs start really parallelizing games so those cores actually get to work, and who knows how long you can keep maintaining a stable 60fps.
There are a lot of people using 10-year-old CPUs and still getting 60fps, as long as the GPU is strong enough.

Unless games start to have big worlds that are actually filled with a lot of stuff happening, and NPCs without zombie AI, I don't think there will be a problem.

Look at Forspoken: it's a 2023 game filled with nothing, a few monsters here and there.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
There are a lot of people using 10-year-old CPUs and still getting 60fps, as long as the GPU is strong enough.

Unless games start to have big worlds that are actually filled with a lot of stuff happening, and NPCs without zombie AI, I don't think there will be a problem.

Look at Forspoken: it's a 2023 game filled with nothing, a few monsters here and there.
My media center has a 2500K (4.2 GHz; it's old and was a 5.0 GHz part for years) x GTX 1070 combo.
That little thing can amazingly still play modern titles at 1080p60 (with some settings tinkering, obviously).
Every now and then I load up a new title just to see if it can actually still hang on.
While it's not a perfectly stable 60, it's a totally, totally playable 60.
I recently tested Warzone 2, which has pretty high requirements depending on the situation.
And this little fucker is still pulling through.
Cyberpunk? ......oooffff, that was rough; it would need a 30 or 40fps limit. 60 is too unstable with pretty much any settings.

So realistically any modern CPU will power through most titles, easy work.
All these new parts are super powerful.
Maxing out 144Hz panels will almost always be down to the GPU more so than the CPU.
Methinks imma be on LGA1700 for a long, long while.
 

rofif

Can’t Git Gud
They also consume more power and get harder to cool. It's still better efficiency than before, but it's not ideal.
 