
Intel 14th Gen rumored to clock to at least 6.2 GHz. Double-digit multi-core performance gains vs. 13th Gen.

IFireflyl

Gold Member
Where am I wrong?

You were wrong here:

They are already super efficient.

I don't know what data you are using to come to the conclusion that they are "super efficient". I mean, you're not even saying, "relatively efficient", but rather you're making it out as if they have managed to give us amazingly high performance with an impressively low power draw, and that isn't the case.


You were also wrong here:

It's only you that wanted moar power so they ran at stupid voltages.

I don't know why you think I was demanding these changes. That was a stupid thing to say.

Literally every CPU and GPU from the last 7+ years ran at very high voltages. I've undervolted every CPU/GPU I've had because I'm on an SFF PC. My current rig is a 5-litre SFF PC with a 12600K and a 3070, and both Intel and Nvidia (especially) are super power-efficient if you don't need to squeeze out the last 5-10% of performance.

The fact that you're saying you have undervolted every CPU/GPU is proof that I am correct. You're doing this because the CPUs/GPUs are not efficient by default. There is no other reason to undervolt your CPUs/GPUs.
 
The fact that you're saying you have undervolted every CPU/GPU is proof that I am correct. You're doing this because the CPUs/GPUs are not efficient by default. There is no other reason to undervolt your CPUs/GPUs.
The fact that I can undervolt and get 50% less power draw at 5-10% less performance proves my point exactly. It's not on paper or in my head - it's real. Hardware has become more energy-efficient with every generation, because efficiency = performance. It's just a matter of how you look at it. You can have the same performance next gen at 50% less power, or double the performance at the same power draw. We see it every generation.
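For what it's worth, the perf-per-watt arithmetic behind that claim is easy to sanity-check. This is a rough sketch with hypothetical numbers (240 W stock, not measurements of any specific CPU), just to show how "-10% performance for -50% power" works out as an efficiency ratio:

```python
# Illustrative perf-per-watt arithmetic for the undervolting claim above.
# All numbers are hypothetical examples, not measurements of a real CPU.

def perf_per_watt(performance: float, power_watts: float) -> float:
    """Efficiency metric: units of performance delivered per watt drawn."""
    return performance / power_watts

# Stock config: normalize performance to 100 at a hypothetical 240 W draw.
stock = perf_per_watt(100, 240)

# Undervolted: ~10% less performance at ~50% of the power draw.
undervolted = perf_per_watt(90, 120)

print(f"stock:       {stock:.3f} perf/W")        # 0.417 perf/W
print(f"undervolted: {undervolted:.3f} perf/W")  # 0.750 perf/W
print(f"efficiency gain: {undervolted / stock:.2f}x")  # 1.80x
```

So under those assumed numbers, the undervolted chip delivers 1.8x the performance per watt - which is why both sides here can point at the same fact and call it either "efficient hardware" or "inefficient defaults".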

NVIDIA/Intel/AMD all don't care that much about power draw because of competition: they squeeze out the last bit of performance at ridiculous power draw. And they sell you a cooling system too - it's beneficial for everyone in the business.
 
That's why I bought a 12700K on sale... as a stop-gap till I get a 14700K.

And once the 14700K is out, I'll hear about the 16700K and not buy the 14700K.

And I'll end up keeping the 12700K for like 10 years.

 
That's why I bought a 12700K on sale... as a stop-gap till I get a 14700K.

And once the 14700K is out, I'll hear about the 16700K and not buy the 14700K.

And I'll end up keeping the 12700K for like 10 years.


That's basically my upgrade loop. Always waiting for the perfect moment... which is always just a gen away.
 

V2Tommy

Member
That's why I bought a 12700K on sale... as a stop-gap till I get a 14700K.

And once the 14700K is out, I'll hear about the 16700K and not buy the 14700K.

And I'll end up keeping the 12700K for like 10 years.

Well, moving past a 14700K on your current Z690/Z790 board means getting a new board, so waiting for the 14700K as the "final option" on your current platform is a good call. It's not quite the same as eternally delaying a purchase, because you already have two-thirds of the system in place, and "maxing it out" is a good feeling. In fact, a brand-new 14900K on a cheap, years-old Z690 DDR4 platform would be a huge Intel win.
 

IFireflyl

Gold Member
The fact that I can undervolt and get 50% less power draw at 5-10% less performance proves my point exactly.

You're not listening to me. The mere fact that you can undervolt your CPU to get that much of a difference in power draw with that little performance loss is exactly the point I am making. That means that, by default, Intel CPUs are wildly inefficient. A 10% hit to performance to save 50% of its power draw is huge. Maybe your issue is that you don't understand what the word "inefficient" means.

not achieving maximum productivity; wasting or failing to make the best use of time or resources.

Tell me how drawing that much excess power for that little performance difference is efficient.

It's not on paper or in my head - it's real. Hardware has become more energy-efficient with every generation, because efficiency = performance. It's just a matter of how you look at it. You can have the same performance next gen at 50% less power, or double the performance at the same power draw. We see it every generation.

Improvements don't make the CPUs efficient. The default setting for Intel CPUs is, by your own admission, inefficient. Intel hasn't once dropped the power draw. At most, the power draw stays the same while delivering more efficient performance than the previous generation. But more efficient performance doesn't mean the result is efficient. If I take a test and get a 50%, and then I retake the test and get a 60%, that still means I suck.

NVIDIA/Intel/AMD all don't care that much about power draw because of competition: they squeeze out the last bit of performance at ridiculous power draw. And they sell you a cooling system too - it's beneficial for everyone in the business.

Again, all you're doing is making my arguments for me, but saying the words as if they back up your claim. This is my point. Just because something is beneficial to [company name] doesn't mean it is beneficial to the consumer.
 