
7950X3D is now at the top of the food chain for gaming!

smbu2000

Member
I'm in a quandary.
Back in November I bought a prebuilt OEM Lenovo Legion with a 13700F, and then found out it's a dead-end, end-of-life motherboard.
All I have to do is stick an RTX 5000 series card in it to last me several more years. By then AM5 boards will be dead-end, end-of-life too.
I guess I'm done shopping around until whatever is next on the horizon.
Would be nice to have more oomph in my PC right now, though.
The 13700F should still be a good CPU that lasts you for a while. If you aren't planning on upgrading it anytime soon, you should be okay with the whole system. (The mobo can probably take 14th gen, but that's basically the same as 13th gen.)
 

Bojji

Member
Great news for the 3 guys playing at 1080p on a 4090.

0.01% over a 13700K at 4K (but 60% more expensive), or equal to a 13900K with PBO max and undervolt (and 6% more expensive).

[Chart: CPU gaming performance at 4K]

Based on this chart, you don't need to buy anything more expensive than a 7700X or 13600K to get great gaming performance at high resolutions.

But of course, when more CPU-demanding games arrive, and with a more powerful GPU (5090?), the differences seen on that hated 1080p chart will show up again. That's why it's done that way: to show the actual performance of CPUs when they aren't limited by the GPU.
 

Rentahamster

Rodent Whores
I hear you but (respectfully) disagree with pretty much all points.

It doesn’t tell you which CPU is best for games because 1080p is unlikely to ever be a resolution gamers will use with a state-of-the-art, latest-gen CPU. I’d actually challenge anyone to find a gamer playing on this CPU at 1080p. I fully understand how resolution at the lower end is CPU dependent and at the higher end is GPU dependent. But here I am with a 13700K and a 4090 wondering if this CPU is going to do much for me at 4K in terms of CPU bottlenecking and… well, I don’t know. It’s a review of high-end equipment at preposterous gaming resolutions that doesn’t tell you much of anything.

It seems like these reviewers are trying to sell their product (reviews) to gamers, when really they ought to be focused on productivity software. But that won’t get gamer clicks.

Yes, it does tell you which CPU is best for games, because you can take the 1080p data and extrapolate it to other situations. You can't do that with the 4K data, since the differences are less significant, which makes 4K data much less useful. You're using benchmark data wrong. You're expecting to look at a review and see your exact CPU/GPU configuration in the data. Sometimes you can do that, because the reviewer happened to benchmark the exact configuration you have. Lucky you! Many other times you cannot, because there are a shit ton of possible configurations out there in the wild, and it would be unreasonable for a reviewer to benchmark every single one of them.

How do we compromise while still retaining meaningful data? We have the reviewer run tests that actually allow people to make meaningful comparisons within the specific product category they're looking at. 1080p tests allow consumers to see meaningful differences between CPUs. 4K tests don't, because the results are mostly the same and they hide what differences there are.
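
To make the "extrapolate" part concrete, here's a rough back-of-the-envelope sketch (all numbers are made up purely for illustration, not taken from any review): your actual framerate is roughly capped by whichever is lower, the CPU's ceiling (which the 1080p/720p test approximates) or the GPU's ceiling at your resolution.

```python
# Rough mental model: actual fps ~= min(CPU ceiling, GPU ceiling).
# The 1080p (or 720p) CPU benchmark approximates the CPU ceiling;
# a GPU review at your own resolution approximates the GPU ceiling.
# All numbers below are hypothetical, for illustration only.

def estimated_fps(cpu_ceiling_fps: float, gpu_ceiling_fps: float) -> float:
    """Whichever component runs out of headroom first sets the framerate."""
    return min(cpu_ceiling_fps, gpu_ceiling_fps)

# Hypothetical CPU ceilings read off a 1080p CPU chart:
cpu_ceilings = {"CPU A": 230.0, "CPU B": 160.0}

gpu_today = 95.0      # hypothetical current GPU ceiling at 4K
gpu_next_gen = 170.0  # hypothetical much faster future GPU at 4K

for name, cpu_fps in cpu_ceilings.items():
    print(f"{name}: today {estimated_fps(cpu_fps, gpu_today):.0f} fps, "
          f"next-gen GPU {estimated_fps(cpu_fps, gpu_next_gen):.0f} fps")

# Today both CPUs land at 95 fps (GPU-bound), which is all a 4K CPU chart
# can show you. With the faster GPU, CPU B caps out at 160 fps while CPU A
# keeps scaling, which is exactly the gap the 1080p chart predicted.
```

Obviously a simplification (frametime spikes, streaming hitches, etc. don't follow it neatly), but it's why the 1080p numbers are the ones worth extrapolating from.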

Check out this line at 11:44



"Then we have the 4K data, which in my opinion is completely worthless for CPU testing, but it's often quite heavily requested, so here we are. When benchmarking the GPU, the CPU doesn't really matter in almost all games because you're GPU limited."

If you need a more in-depth explanation, here are two more videos where they cover this topic.



 
Love the 7800X3D.
AMD's best work since the Athlon 64.

I was going to get a 7800X3D until my local Micro Center had an amazing deal on a 12700K + mobo + RAM.
AM5 mobos were so expensive at the time that the Micro Center bundle was ~1/4 the total cost.
Will maybe throw in a used 13900K one day.
 

samoilaaa

Member
If you have a newer GPU then your frametimes will be taking a hard beating in modern games even if you're only targeting 60fps.

Just grab a 5700x3d if you can.
I reach high FPS in CP77, even in the Dogtown market, with a 4080 at 4K/max settings.

I don't see games becoming more CPU-demanding this gen.
 
Last edited:

Xellos

Member
I'm in a quandary.
Back in November I bought a prebuilt OEM Lenovo Legion with a 13700F, and then found out it's a dead-end, end-of-life motherboard.
All I have to do is stick an RTX 5000 series card in it to last me several more years. By then AM5 boards will be dead-end, end-of-life too.
I guess I'm done shopping around until whatever is next on the horizon.
Would be nice to have more oomph in my PC right now, though.

Even if the 7800X3D and 7950X3D are faster and more efficient, the 13700F is still a very good CPU. Pretty much any 6+ core, 12th-14th gen Intel or Zen 4 AMD CPU is going to do a great job with modern games.
 

kiphalfton

Member
Yes, it does tell you which CPU is best for games, because you can take the 1080p data and extrapolate it to other situations. You can't do that with the 4K data, since the differences are less significant, which makes 4K data much less useful. You're using benchmark data wrong. You're expecting to look at a review and see your exact CPU/GPU configuration in the data. Sometimes you can do that, because the reviewer happened to benchmark the exact configuration you have. Lucky you! Many other times you cannot, because there are a shit ton of possible configurations out there in the wild, and it would be unreasonable for a reviewer to benchmark every single one of them.

How do we compromise while still retaining meaningful data? We have the reviewer run tests that actually allow people to make meaningful comparisons within the specific product category they're looking at. 1080p tests allow consumers to see meaningful differences between CPUs. 4K tests don't, because the results are mostly the same and they hide what differences there are.

Check out this line at 11:44



"Then we have the 4K data, which in my opinion is completely worthless for CPU testing, but it's often quite heavily requested, so here we are. When benchmarking the GPU, the CPU doesn't really matter in almost all games because you're GPU limited."

If you need a more in-depth explanation, here are two more videos where they cover this topic.





What a long-winded response. Stopped reading after the straw man "there's lots of different configurations". No shit.

It is two additional charts (one at 1440p and one at 2160p) per game. Not asking much.

They gave benchmarks at 1080p, and only at 1080p, because at 1080p the difference between CPUs is most pronounced.

Typical advertising.
 
Last edited:

ssringo

Member
Jesus, I didn't think I was that subtle. I'm not actually bothered that my top-tier CPU is slightly less top-tier. I just thought it was funny, and a bit cliché, that the day after I bought PC parts they were outdated (according to a new test).
 

Rentahamster

Rodent Whores
What a long-winded response. Stopped reading after the straw man "there's lots of different configurations". No shit.

That's not a straw man. That's a literal fact.

It is two additional charts (one at 1440p and one at 2160p) per game. Not asking much.

There are better things they can do with their time, like adding additional games. Benchmarking CPUs at 4K is a waste of time because the information isn't meaningful.

They gave benchmarks at 1080p, and only at 1080p, because at 1080p the difference between CPUs is most pronounced.

So you agree with me.
 
I'm so torn on what to do in my situation. I know my 7950X3D and Asus B650E-F are semi-busted. I'm happy with the performance of this rig and bought it with the intention of it being future-proof: when a new CPU drops, I can simply slot it into the board and carry on from there. But I don't trust this board with a new CPU, and I'm tired of dealing with it failing on me. Just don't know what to do.
 

GHG

Member
I reach high FPS in CP77, even in the Dogtown market, with a 4080 at 4K/max settings.

I don't see games becoming more CPU-demanding this gen.

I'm not doubting that you reach "high" FPS; however, across most games you're leaving a lot of frames and smoothness on the table:

[Chart: minimum FPS at 3840×2160]

[Chart: relative gaming performance at 3840×2160]


Ignore the highlighted 7800X3D; compare the 5800X3D to your 3900X.
 

SoloCamo

Member
Even if the 7800X3D and 7950X3D are faster and more efficient, the 13700F is still a very good CPU. Pretty much any 6+ core, 12th-14th gen Intel or Zen 4 AMD CPU is going to do a great job with modern games.

I'd honestly say even 9th gen Intel (9900K) / 3rd gen Ryzen (3700X) still hold up quite well, even at 1080p for *most* games. It's already expensive enough keeping up with 4K's GPU demands; I never bother with high refresh because it's even more expensive to keep chasing that. With a 60Hz target, I've yet to run into any games where my 11900K (stock clocks, just paired with 32GB of dual-rank CL14-14-14 3733MHz DDR4) has let me down. My 10900 was no different.
 
Last edited:

Poppyseed

Member
Yes, it does tell you which CPU is best for games, because you can take the 1080p data and extrapolate it to other situations. You can't do that with the 4K data, since the differences are less significant, which makes 4K data much less useful. You're using benchmark data wrong. You're expecting to look at a review and see your exact CPU/GPU configuration in the data. Sometimes you can do that, because the reviewer happened to benchmark the exact configuration you have. Lucky you! Many other times you cannot, because there are a shit ton of possible configurations out there in the wild, and it would be unreasonable for a reviewer to benchmark every single one of them.

How do we compromise while still retaining meaningful data? We have the reviewer run tests that actually allow people to make meaningful comparisons within the specific product category they're looking at. 1080p tests allow consumers to see meaningful differences between CPUs. 4K tests don't, because the results are mostly the same and they hide what differences there are.

Check out this line at 11:44



"Then we have the 4K data, which in my opinion is completely worthless for CPU testing, but it's often quite heavily requested, so here we are. When benchmarking the GPU, the CPU doesn't really matter in almost all games because you're GPU limited."

If you need a more in-depth explanation, here are two more videos where they cover this topic.




I absolutely understand that this shows CPU performance, in a meaningless way. It's completely irrelevant. These game reviews at 1080p with a top-of-the-line CPU and top-of-the-line GPU are next to worthless. I mean, take this thread, for example. How many people in this thread are shopping for a top CPU and top GPU to play games at 1080p? Raise your hands. Nobody? Color me surprised. In however many years' time, as I stated earlier, we'll be using different benchmarks from different games, and there will be pretty much nothing to gain from going back and looking at older reviews. Those CPUs will be so old at that point that nobody will care. You'll just care about the next generation of CPUs, and any real data comes from productivity benchmarks: meaningful benchmarks that show video encoding, exporting, etc. That stuff really works out a CPU, at any resolution. 1080p is laughable for game benchmarking on rich-people hardware.

All these reviews need to say, as far as games on high-end CPUs/GPUs go, is: they're all about the same, save your money, nobody should be buying these. It's just marketing to run these tests at 1080p, and I've been a PC enthusiast for a good 35 years now.
 

SoloCamo

Member
All these reviews need to say, as far as games on high-end CPUs/GPUs go, is: they're all about the same, save your money, nobody should be buying these. It's just marketing to run these tests at 1080p, and I've been a PC enthusiast for a good 35 years now.

The future is now, gramps ;) People aren't stuck at just 60Hz anymore, and there are plenty of people, including myself, who want the longest life possible out of their system. If I'd listened to the kind of advice you're implying, I would have ended up swapping platforms multiple times in the same window where I was able to keep my 4790K (2014-2020). If you've really been PC gaming for 35 years you should understand this... I guess I'm a young gun, having PC gamed for only 25+ years, but this kind of testing was pretty obvious even in the early 2000s.
 
Last edited:

Poppyseed

Member
The future is now, gramps ;) People aren't stuck at just 60Hz anymore, and there are plenty of people, including myself, who want the longest life possible out of their system. If I'd listened to the kind of advice you're implying, I would have ended up swapping platforms multiple times in the same window where I was able to keep my 4790K (2014-2020). If you've really been PC gaming for 35 years you should understand this... I guess I'm a young gun, having PC gamed for only 25+ years, but this kind of testing was pretty obvious even in the early 2000s.
Hah, gramps. That's funny. :messenger_grinning_sweat:

Damnit. Trying not to get too old, and now you made me feel old. I like high-refresh stuff, too. I mean, that's why I have a 4090, too! But 1080p ain't never, ever happenin' again.
 
But…but… who on earth is buying a CPU like this and a 4090 GPU just to play 1080p? At 4K I’d guess all these CPUs perform similarly.
I imagine that's less true than it once was because of DLSS since you're technically rendering the game at a lower internal resolution and then upscaling it to 4K.

I believe ray tracing is also quite heavy on the CPU, which is probably worth considering if you have a 4090.
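
For a rough sense of how much lower that internal resolution is, here's a quick sketch using the per-axis scale factors commonly cited for the DLSS presets (approximate, and individual games can override them, so treat the numbers as illustrative):

```python
# Commonly cited per-axis render scales for DLSS presets (approximate;
# individual games/drivers can use different values).
DLSS_SCALES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Estimate the internal resolution DLSS renders at before upscaling."""
    scale = DLSS_SCALES[preset]
    return round(out_w * scale), round(out_h * scale)

for preset in DLSS_SCALES:
    print(preset, internal_resolution(3840, 2160, preset))

# Quality comes out around 2560x1440 and Performance around 1920x1080, i.e.
# "4K with DLSS Performance" is internally closer to a 1080p workload, which
# pushes the bottleneck back toward the CPU.
```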

If you have a newer GPU then your frametimes will be taking a hard beating in modern games even if you're only targeting 60fps.

Just grab a 5700x3d if you can.

Depending on the region you're in and your budget, I think the 5800X3D might be a better bet? In my region the price difference between the 5700X3D and the 5800X3D is virtually nothing, so it seems worth it to spend a tiny bit extra. I have a 3900X myself and I'm finding it's bottlenecking my 3090 somewhat, even at 4K, and I'm getting a few more stutters than I'd like.

Planning to upgrade next month.
 

Poppyseed

Member
I imagine that's less true than it once was because of DLSS since you're technically rendering the game at a lower internal resolution and then upscaling it to 4K.

I believe ray tracing is also quite heavy on the CPU, which is probably worth considering if you have a 4090.



Depending on the region you're in and your budget, I think the 5800X3D might be a better bet? In my region the price difference between the 5700X3D and the 5800X3D is virtually nothing, so it seems worth it to spend a tiny bit extra. I have a 3900X myself and I'm finding it's bottlenecking my 3090 somewhat, even at 4K, and I'm getting a few more stutters than I'd like.

Planning to upgrade next month.
To be fair, PC gaming is a bit of a quagmire these days. Shader compilation stutters and the like are so commonplace. I miss the days of smooth frame rates, whenever those days were...
 

SoloCamo

Member
Hah, gramps. That's funny. :messenger_grinning_sweat:

Damnit. Trying not to get too old, and now you made me feel old. I like high-refresh stuff, too. I mean, that's why I have a 4090, too! But 1080p ain't never, ever happenin' again.

All good, I'm in the same old boat. I'm not dropping from 4K; going to a lower res, regardless of high refresh, is a major downgrade for me. When I see a 1080p screen, even with my not-amazing vision, the large, obvious pixels bother me to no end.
 

poodaddy

Member
How many times is it necessary to have videos showing gaming on high-end hardware at low resolutions as a determiner of a CPU’s worth when just about nobody games like that? What is the point?
Nearly 60% of users still use 1080p according to the most recent Steam hardware survey, making it by far the most common resolution (nothing else really comes close), so I don't think we can claim "nobody games like that."
 
  • Thoughtful
Reactions: amc

Poppyseed

Member
Nearly 60% of users still use 1080p according to the most recent Steam hardware survey, making it by far the most common resolution (nothing else really comes close), so I don't think we can claim "nobody games like that."
I think you didn't read my post: I said "high-end hardware at low resolutions." I also said, "just about nobody games like that."
 
Last edited:
  • Like
Reactions: amc

Rentahamster

Rodent Whores
I mean, take this thread, for example. How many people in this thread are shopping for a top CPU and top GPU to play games at 1080p? Raise your hands. Nobody? Color me surprised.

Most of us with high-end gear are not gaming at 1080p. However, the 1080p data is a lot more useful to us than the 4K data. That's why reviewers tend to favor 1080p benchmarks in this specific circumstance.
 

poodaddy

Member
I think you didn't read my post: I said "high-end hardware at low resolutions." I also said, "just about nobody games like that."
I read your post; I simply disagree. As silly as you and I might think it is, and believe me I think it's silly, there are plenty of people out there with high-end GPUs and CPUs and 64 unnecessary-ass GB of DDR5 on a 1080p, non-HDR VA or IPS monitor. I was in the Army a while back and my wife is still in the Air Force and in college, and I can tell you for sure that quite a lot of young people just buy "the best CPUs and GPUs" (read: whatever the Best Buy guy recommends) without any attention paid to the monitor at all. I don't really get it, but it's definitely a thing. Also, "just about nobody"... my apologies, but that definitely cannot be applied to 60% of a demographic. I could go with it if you were at least talking about a minority, but applying "just about nobody" to the majority just seems silly. I get what you're saying though, just my two cents.
 
  • Thoughtful
Reactions: amc

Fafalada

Fafracer forever
But here I am with a 13700K and a 4090 wondering if this CPU is going to do much for me at 4K in terms of CPU bottlenecking and… well, I don’t know.
It'll do - relatively speaking - more than CPUs that perform worse at 1080p, and less than CPUs that perform better.
If your position is that 'all modern CPUs are good enough for 60fps so who cares' - well - Dragon's Dogma says hold my beer.

And these outlets do run productivity benchmarks (HWUnboxed in particular). If you haven't noticed that they do, then you may be among the same group of people you just complained about for not paying attention.
 

Poppyseed

Member
I read your post; I simply disagree. As silly as you and I might think it is, and believe me I think it's silly, there are plenty of people out there with high-end GPUs and CPUs and 64 unnecessary-ass GB of DDR5 on a 1080p, non-HDR VA or IPS monitor. I was in the Army a while back and my wife is still in the Air Force and in college, and I can tell you for sure that quite a lot of young people just buy "the best CPUs and GPUs" (read: whatever the Best Buy guy recommends) without any attention paid to the monitor at all. I don't really get it, but it's definitely a thing. Also, "just about nobody"... my apologies, but that definitely cannot be applied to 60% of a demographic. I could go with it if you were at least talking about a minority, but applying "just about nobody" to the majority just seems silly. I get what you're saying though, just my two cents.
If you look at the 60% running 1080p, they're statistically not using high-end rigs of any sort. Look at the GPU charts. Most people are using dogsh*t for a GPU.

 
  • Like
Reactions: amc

sachos

Member
But…but… who on earth is buying a CPU like this and a 4090 GPU just to play 1080p? At 4K I’d guess all these CPUs perform similarly.
They do tests like this to measure a CPU's maximum theoretical framerate output, minimizing the GPU's influence on the result as much as possible (I would even prefer they tested 720p lowest). If a CPU can't do 120/240 fps in a specific game at the lowest graphics settings, then chances are it won't ever be able to do it, even with a next-gen GPU.
This is what TechPowerUp has to say about their 720p CPU benchmarks:
"This low resolution serves to highlight theoretical CPU performance, because games are extremely CPU-limited at this resolution. Of course, nobody buys a PC with an RTX 4090 to game at 720p, but the results are of academic value because a CPU that can't do 144 frames per second at 720p will never reach that mark at higher resolutions. So, these numbers could interest high-refresh-rate gaming PC builders with fast monitors. Our 720p tests hence serve as synthetic tests in that they are not real world (720p isn't a real-world PC-gaming resolution anymore) even though the game tests themselves are not synthetic (they're real games, not 3D benchmarks)."

Real-world example of the usefulness of this test: say you have a 4090 gaming at 4K and you get 60 FPS in game X. The 5090 comes out and is 50% faster! Huge gains! You buy it and realize you only get a 2 FPS increase in your game. Why is that? You're CPU-limited. How would you have known that before buying the 5090? You would have tested the game at 720p and seen that you were only getting 2 FPS extra with your 4090, i.e. your CPU tops out around 62 FPS in that game.
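
Here's a tiny sketch of that rule of thumb (and the TechPowerUp quote above), with made-up numbers in the spirit of the example: the CPU-bound 720p result acts as a ceiling that no GPU upgrade can raise.

```python
# The CPU-bound (720p/low) framerate is an upper bound on what you'll see
# at any higher resolution, no matter how fast the GPU gets.
# Numbers are hypothetical, for illustration only.

def cpu_can_hit_target(fps_at_720p: float, target_fps: float) -> bool:
    """If the CPU can't hit the target when fully unleashed, nothing else will help."""
    return fps_at_720p >= target_fps

print(cpu_can_hit_target(fps_at_720p=62, target_fps=60))   # True: fine for a 60 Hz target
print(cpu_can_hit_target(fps_at_720p=62, target_fps=144))  # False: a 5090 won't fix this
```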
 
Last edited:

brenobnfm

Member
It's really not; the difference compared to the 7800X3D is negligible, which indicates that a different selection of games could have displayed a different result, and considering it consumes more power, the 7800X3D is still clearly the top gaming CPU.
 