
Intel responds to i7-7700K high temp. issue — tells owners they shouldn't overclock

knitoe

Member
I wouldn't buy a 4-core CPU right now as I expect any CPU to last 3-4 years. But continue trying to justify your 7700k by all means...

I have a 5930K @ 4.5GHz. In 2-3 years, I am looking at a possible Intel 10-12 core, not an 8-core that's weak today and will be in the future, but whatever...
 
I have a 5930K @ 4.5GHz. In 2-3 years, I am looking at a possible Intel 10-12 core, not an 8-core that's weak today and will be in the future, but whatever...

I've got a 5820K @ 4.5GHz. The Ryzen 8-cores aren't 'weak'; that's your poor judgement if you think they are.
 
Why are people buying 7700k's when the Ryzen 1700 exists, can be had for $300, and has had a vast array of BIOS updates, fixes, and new microcode that have incrementally improved performance even further since launch?

In fact, why are people buying 4-core CPUs in 2017 full stop? :/ At least get 6-cores people.

That's exactly the dilemma I'm in. I currently have a 3570k, which has lasted me around 4 years, and I want to upgrade by the end of the year. But how should my upgrade go? I would like my next CPU to be a future-proofing upgrade that lasts me another 4 years, so I want something beyond a quad-core, which seems quite hard to achieve. My primary usage is gaming, which means if I go Ryzen I would need to gamble 100% on games scaling better with more cores in the future, because the single-core IPC performance seems quite lackluster. And while Intel is still king here, the issue with the 7700k makes me worry, while at the same time the cheapest Intel six-core I can buy here right now is 400€, and no way in hell am I paying 400€ for a CPU alone when I still have to replace my board and RAM along with it :/

Never mind that I would need to factor in another 50-60€ for a new heatsink: while my current Cooler Master Hyper TX3 Evo still does the job, I want a bit more headroom.
 

knitoe

Member
I've got a 5820K @ 4.5GHz. The Ryzen 8-cores aren't 'weak'; that's your poor judgement if you think they are.

From the reviews, it's weaker than my CPU that came out almost 3 years ago. Keep up your performance/price ratio crusade. It's off topic, so there's no point for me to continue and keep clogging this thread.
 

low-G

Member
Yes, but most people don't own a card on the level of a 1080, 1080 Ti, or Titan X and game at 1080p at the same time, which is what exposes these performance gaps. I think you do, so you're in the minority where the differences may be noticeable. This is important.

For people with cards below those and/or who game at 1440p/4K, the CPU is less of a bottleneck. I'd challenge anyone to tell the difference between 90fps and 98fps.

I can tell when playing VR, that's for sure. (Since 90fps is the minimum, and I notice the artifacts when I drop below it.)
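For scale, the frame-time arithmetic behind those two numbers (just the math implied above, not a new measurement):

1000 ms / 90 fps ≈ 11.1 ms per frame
1000 ms / 98 fps ≈ 10.2 ms per frame

On a flat screen that's a gap of under a millisecond per frame, which is why it's so hard to feel. VR is the exception: the headset expects every frame inside that ~11.1 ms budget, and missing it is what produces the artifacts mentioned above.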
 
That's exactly the dilemma I'm in. I currently have a 3570k, which has lasted me around 4 years, and I want to upgrade by the end of the year. But how should my upgrade go? I would like my next CPU to be a future-proofing upgrade that lasts me another 4 years, so I want something beyond a quad-core, which seems quite hard to achieve. My primary usage is gaming, which means if I go Ryzen I would need to gamble 100% on games scaling better with more cores in the future, because the single-core IPC performance seems quite lackluster. And while Intel is still king here, the issue with the 7700k makes me worry, while at the same time the cheapest Intel six-core I can buy here right now is 400€, and no way in hell am I paying 400€ for a CPU alone when I still have to replace my board and RAM along with it :/
Eh, if you play any games now that tend to be multi-threaded (or can be forced to be multi-threaded, like UE3 games), then jumping to a 1600(X) is a good move. I had a 3570K myself and Mass Effect: Andromeda brought it to its knees, with one area in particular maxing out the CPU just by standing there. Got a 1600X and the only time the CPU maxes out is when the game starts and I load a save.

Also, because the 1600X has 6 cores, I managed to double XCOM 2's performance with a GTX 970 by making 4 cores handle shaders, which got me to 1080p 60FPS from 1080p 30FPS.
 
From the reviews, it's weaker than my CPU that came out almost 3 years ago. Keep up your performance/price ratio crusade. It's off topic, so there's no point for me to continue and keep clogging this thread.

Weaker? In what world? Knitoe & Son's?

Ryzen's IPC is around Broadwell's, so ahead of your Haswell-based chip.
 
It's that simple now? Amazing.

Based on that picture I still don't know if it's easy, and I'm terrified of trying it myself. I'd like to, because I can't overclock my 7700k at all without getting into unreasonable heat territory. I'm already at 65C average when playing Battlefield 1. This is with an AIO cooler after about an hour, mind you (when they reach their max temps for a given workload). The voltage increase I'd need for even just 4.9GHz would get into "too hot" territory for me.

Delidding would lower my temps to "nice and cool" when gaming, and I'd feel a lot better about trying to overclock. Though the only game I'm remotely CPU limited in is Breath of the Wild via CEMU.
 

Mareg

Member
I don't remember at what point it changed, but I swear back in the day, even CPUs that were "unlocked" were not advertised as such by companies.

Pretty sure someone at the legal department of Intel is having the day of his life. "Told you guys we should never have removed that fine print." Since when did they stop telling consumers overclocking was a risk? I'll grant it has become safer and easier. But still, Intel should be aware there are risks. And now the risk is to their reputation, which is all the more damaging.
 

rtcn63

Member
I have no idea. I'm terrified of trying it myself... but I kinda want to, because I can't overclock my 7700k at all without getting into unreasonable heat territory. I'm already at 65C average when playing Battlefield 1. This is with an AIO cooler after about an hour, mind you (when they reach their max temps for a given workload).

Delidding would lower my temps to "nice and cool" when gaming, and I'd feel a lot better about trying to overclock. Though the only game I'm remotely CPU limited in is Breath of the Wild via CEMU.

There are services where I believe you can send your CPU in or buy one pre-delidded (google).

And yeah, I do get that the cost is annoyingly above and beyond what you *should* be paying for an Intel CPU (and should make you reconsider the competition). That's coke and hooker money.
 
There are services where I believe you can send your CPU in or buy one pre-delidded (google).

And yeah, I do get that the cost is above and beyond what you *should* be paying for an Intel CPU. That's coke and hooker money.

$50 to send it to SiliconLottery. Which I would do in a heartbeat if it didn't leave me without a CPU for a week (ship to them, delid, ship back). Can't have that, unfortunately.
 

MaDKaT

Member
Recently replaced my 6600k with a 7700k before this news :( I can certainly vouch for the chip running hotter than my 6600k at a similar clock (4.6 currently) by about 10C. A delidding kit is on its way and I do hope it will do the trick, as I'm not really in the mood to replace my mobo and go a different route yet.
 

mephixto

Banned
·feist· said:

Hmm, if you read the Intel warranty that is linked in that reply, there is only a warning by Intel that they are not responsible for the normal operation of the CPU if you overclock or downclock, but it doesn't void the warranty.

WARNING: Altering clock frequency and/or voltage may: (i) reduce system stability and useful life of the system and processor; (ii) cause the processor and other system components to fail; (iii) cause reductions in system performance; (iv) cause additional heat or other damage; and (v) affect system data integrity. Intel has not tested, and does not warranty, the operation of the processor beyond its specifications. Intel assumes no responsibility that the processor, including if used with altered clock frequencies and/or voltages, will be fit for any particular purpose.

what voids the warranty:

• any costs associated with the repair or replacement of the Product including labor, installation or other costs incurred by you, and in particular, any costs relating to the removal or replacement of any Product that is soldered or otherwise permanently affixed to any printed circuit board; OR
• damage to the Product due to external causes, including accident, problems with electrical power, abnormal electrical, mechanical or environmental conditions, usage not in accordance with product instructions, misuse, neglect, alteration, repair, improper installation, or improper testing; OR
• any Product which has been modified or operated outside of Intel's publicly available specifications or where the original identification markings (trademark or serial number) has been removed, altered or obliterated from the Product
 

knitoe

Member
Weaker? In what world? Knitoe & Son's?

Ryzen's IPC is around Broadwell's, so ahead of your Haswell-based chip.

In the world where I posted gaming screenshots and anyone can google many other reviews that clearly show they have slower fps? We must be living in different worlds...
 
Eh, if you play any games now that tend to be multi-threaded (or can be forced to be multi-threaded, like UE3 games), then jumping to a 1600(X) is a good move. I had a 3570K myself and Mass Effect: Andromeda brought it to its knees, with one area in particular maxing out the CPU just by standing there. Got a 1600X and the only time the CPU maxes out is when the game starts and I load a save.

Also, because the 1600X has 6 cores, I managed to double XCOM 2's performance with a GTX 970 by making 4 cores handle shaders, which got me to 1080p 60FPS from 1080p 30FPS.

I have the same issue in Battlefield 1, where on many maps my CPU is just pegged at 99% in multiplayer, taking my GTX 1070 down with it :/ That's the only game where I notice my 3570k is near its end for me, but I'm sure there will be more of that in the future. I'll see where Ryzen is at by the end of the year; for now I'm riding it out as I wait for further price drops.

Interesting that manual core-management can make such a difference though. Out of curiosity, how would I go about this along with forcing UE3 games to use more threads?
 
In the world where I posted gaming screenshots and anyone can google many other reviews that clearly show they have slower fps? We must be living in different worlds...

And I could post about 10 screenies where the 5930K gets demolished by an 1800X, but I won't because your mind is made up. So enjoy yourself.
 
I have the same issue in Battlefield 1, where on many maps my CPU is just pegged at 99% in multiplayer, taking my GTX 1070 down with it :/ That's the only game where I notice my 3570k is near its end for me, but I'm sure there will be more of that in the future. I'll see where Ryzen is at by the end of the year; for now I'm riding it out as I wait for further price drops.

Interesting that manual core-management can make such a difference though. Out of curiosity, how would I go about this along with forcing UE3 games to use more threads?
Just follow this guide (copied off Steam):
This guide will not feature technical explanations as to why the settings work. It is written for the everyday gamer. It is important to note that the UE3 engine was designed with consoles in mind and needs to be optimized for PC use. YOU MUST INCLUDE ALL CHANGES. Do not attempt unless you have a decent knowledge of what you are doing (and how to undo it). This will not affect the ability to play online and does not fall outside the guidelines of VAC.

And yes, you may have seen some of these settings and tried some of them, but they MUST all be done together for this to work properly, because many settings affect the timing of other settings since this is a streaming-based engine. TRY IT this way 1st, plz. Give it a chance.

This WILL NOT fix choppy Bink video playback, as that is plugin-based and not engine-based. That is a different animal altogether (it's very poorly coded for multi-core CPUs).

THIS FIX WORKS FOR ALL UE3 BASED GAMES.(well tested)

I need to take a moment to explain that this will fix the kind of stutter that you get even on mid to high-end hardware, commonly called load/frame hitching. This is where you're getting great frames per second (as reported by Fraps or Afterburner) but it "looks" choppy. Poor performance (i.e. running a game at settings you know your HW can't run well) will not see a serious decrease in hitching, though it will still help a fair bit. You can really tell the difference if you run with vsync ON and compare BEFORE FIX/AFTER FIX smoothness. It is very, very obvious. Anyhow....

ALL GAMES MUST HAVE COMPLETED FIRST RUN. (1 startup then exit)

1st: Go to \ users \ yournamehere \ documents \ my games \ gamename \ ...... \ config.
(Note: EA games sometimes use "EA Games" instead of My Games)
Make a Backup of the "engine" ini file just in case you need it.

2nd: Open the "engine" ini file in Notepad (or your text editor of choice). Find and change the following lines (just use Ctrl-F), taking care to match the settings to your system where noted. Also, some of these settings you may not have to change IF they already match what is given.

Here they are: (REMEMBER, READ CAREFULLY, MATCH EXACTLY)
"*"=setting
"bUseTextureStreaming=True" (these are default for most but not all UE3 games)
"bUseBackgroundLevelStreaming=True"
"MipFadeInSpeed0=0" (these even out mipmap loads and draw time)
"MipFadeOutSpeed0=0"
"MipFadeInSpeed1=0"
"MipFadeOutSpeed1=0"
"PhysXGpuHeapSize=64" (these balance physx calls even on cpu based physx titles)
"PhysXMeshCacheSize=16"
"bSmoothFrameRate=TRUE" (this really does need to be on, ignore what you've read)
"MinSmoothedFrameRate=30" (keep these right here. setting higher/lower does no good)
"MaxSmoothedFrameRate=400"
"bInitializeShadersOnDemand=True" (reduces overall shader batch call size)
"DisableATITextureFilterOptimizationChecks=False" (driver based opt is MUCH faster)
"UseMinimalNVIDIADriverShaderOptimization=False" (same here)
"PoolSize=256" or (vidmem/poolsize exmpl: 512/128, 1024/256, etc, DO NOT exceed 768)
"bAllowMultiThreadedShaderCompile=True" (should already be on by default)
"ThreadedShaderCompileThreshold=4" (formulate like this: # of cpu-cores (not threads) -2)
"OnlyStreamInTextures=True" (reduces overall texture batch call size)

Check for multiple instances of these settings within this file, as some games have them listed twice and you must change both (e.g. XCOM: Enemy Unknown).

Now save the file, then right-click it, choose Properties and then set it to "read-only".
Launch your game, choose reset to defaults under that game's graphics settings, then set them up how you want. These settings get saved to your profile file for that game and use the engine ini as a base. They are not saved to the ini file, therefore it won't matter that it is read-only. However, there are a few rare exceptions (SEE NOTE BELOW).

THAT'S IT.

All other settings in the file should be left at their defaults.
(Yes, even "SizeOfPermanentObjectPool". FYI, this can hurt performance if set differently from the default, because it can actually increase frame call time if set incorrectly.)

All changes assume that you have Steam cloud sync on. If not, changes you make may be overwritten if you turn on cloud sync after the fact and may need to be re-done.

(SPECIAL NOTE) Some UE-based games like Batman: AA or AC need to be configured how you want them 1st, THEN do the fix minus the reset-defaults part. You will also need to modify the "userengine" ini file in the same folder as the one above to match any relevant settings.

I think ThreadedShaderCompileThreshold has the most impact on CPU heavy games like XCOM, just due to the sheer amount of effects the game has, but if you go Ryzen, you can experiment with different core counts to see what works best for each game. I suspect something like Mass Effect 2 wouldn't need more than 2 cores for shaders for best performance, while the Arkham games might need 4.
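To make those formulas concrete, here is a minimal sketch of the key lines for a hypothetical setup: a 6-core CPU like the 1600X (so ThreadedShaderCompileThreshold = 6 - 2 = 4) and a 2GB card (the guide's vidmem/poolsize examples scale 4:1, so 2048/512). Keep each line in whatever section of the engine ini it already sits in:

"bSmoothFrameRate=TRUE"
"MinSmoothedFrameRate=30"
"MaxSmoothedFrameRate=400"
"PoolSize=512" (2048MB of video memory / 4, capped at 768 for bigger cards)
"bAllowMultiThreadedShaderCompile=True"
"ThreadedShaderCompileThreshold=4" (6 physical cores - 2)

On an 8-core 1700/1700X/1800X the same formula gives a threshold of 6, which is where the per-game experimentation mentioned above comes in.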
 
Seems, unlike you, I have no problems being proven wrong. Go ahead, post them recent AAA game scores.

http://www.game-debate.com/cpu/inde...ryzen-r7-1800x-vs-core-i7-5930k-6-core-3-5ghz

In terms of overall gaming performance, the AMD Ryzen R7 1800X is massively better than the Intel Core i7-5930K 6-Core 3.5GHz when it comes to running the latest games. This also means it will be less likely to bottleneck more powerful GPUs, allowing them to achieve more of their gaming performance potential.

Not very smart of you to take that stance...
 

knitoe

Member

I am sorry, but WTF did you link? It's like one of those dumb generic CPU comparison chart sites.

This is what I want to see. A real review.
http://www.pcgamer.com/gaming-performance-of-ryzen-7-vs-core-i7-with-geforce-gtx-1080-ti/

Overclocked R7 1700 vs. i7-5930K with the fastest graphics card on the planet.

With an overclocked Ryzen 7 1700 as the AMD contender, pitting it against my regular GPU testbed is a nearly perfect fight. My Intel testbed is running an i7-5930K overclocked to 4.2GHz—I could push it to 4.5GHz, thanks to the liquid cooling, but 4.2GHz with a decent air cooler (like the one in the AMD Ryzen system) seems a bit more of a fair fight.

[PC Gamer benchmark chart]
 

Waikis

Member
I am sorry, but WTF did you link? It's like one of those dumb generic CPU comparison chart sites.

This is what I want to see. A real review.
http://www.pcgamer.com/gaming-performance-of-ryzen-7-vs-core-i7-with-geforce-gtx-1080-ti/
Just for shits and giggles I ran 5820k vs 6700k:
In terms of overall gaming performance, the Intel Core i7-5820K 6-Core 3.3GHz is massively better than the Intel Core i7-6700K 4-Core 4.0GHz when it comes to running the latest games. This also means it will be less likely to bottleneck more powerful GPUs, allowing them to achieve more of their gaming performance potential.

Rofl they seem to have a very simple text template for the comparisons.
 

low-G

Member
What a turn this thread has made. I mean everything I heard about Ryzen had it lagging a bit behind for gaming, so I was surprised to hear all the praise early in the thread (actually assumed they knew what they were talking about)...
 

rtcn63

Member
What a turn this thread has made. I mean everything I heard about Ryzen had it lagging a bit behind for gaming, so I was surprised to hear all the praise early in the thread (actually assumed they knew what they were talking about)...

The R5 CPUs are pretty damned legit, especially if you do production work etc. Right now it's like: R5 = mid-range, i7 = high-end (for gaming).
 

Lonely1

Unconfirmed Member

Hey, AMD Ryzens are good CPUs, especially for the price. Right now I would recommend them over Intel's HEDT. But there's no reason to upgrade to them from Intel HEDT either, and we have to tell the truth: Ryzen IPC seems to be between Haswell and Broadwell, but the difference between those Intel processors is already small, between 3% and 8%. And Haswell OCs better than Broadwell and much better than Ryzen.
 
What a turn this thread has made. I mean everything I heard about Ryzen had it lagging a bit behind for gaming, so I was surprised to hear all the praise early in the thread (actually assumed they knew what they were talking about)...

There is lots to praise about Ryzen. The R7s are a bit behind their Intel counterparts in gaming, but the longer you intend to own that CPU, the more interesting the Ryzens get (more threads and more CPU cores are gonna help in the future, though we don't know to what extent). The R5s are actually pretty much on par with their i5 counterparts in gaming, sometimes even in front, plus the longevity factor is again on AMD's side.
And for pretty much everything besides gaming, Ryzens are a no-brainer. No reason at all to get anything from the i5 or i7 line, unless you wanna go down the $1000+ route.
 

Ostinatto

Member
Jesus, I'm gonna buy a new PC this week, and I was gonna buy an i7 7700K. Good timing.

Should I buy a regular i7 7700 then?
 

Drey1082

Member
The 6700k has temp spike issues, too. Annoying, but seems harmless... so far... after having my 6700k since launch.

Yep, when I was building my system with a 6700k with a 212 Evo, I was getting 80C to 90C temps on CPU benchmarks with an average overclock. I've noticed that it seemed to be spikes with various benchmark software (the Time Spy 3DMark CPU test was typically killer). Standard gaming and CPU-intensive activities never really get me over 70-75C on average. I think these newer Intel CPUs just run hotter. 90+ though is insane.
 
I stress-tested mine on air before de-lidding + Liquid Ultra and after, at 4.5 and 4.8GHz - seemed to do quite well. I actually ran at 5GHz (post de-lid) just to see if it would stay up and played ~an hour of Watch Dogs 2 with no issues. Temps were high but it was a $30 Intel cooler.

I've since put it on water but I've got a few things left to do before I fire that machine up.

Bullshit response though.

"Buy a $350 unlocked CPU!...but don't do anything to it."
 

TSM

Member
Haha, this thread is hilarious. The i7-7700k seems like it might have issues with temperature spikes, and the resident AMD enthusiasts decided this was a good time to knock on the door and offer everybody some literature. AMD's Ryzen is in no way a good substitute for a 7700k for gaming, and I have no idea why they would try to present it as one.

On topic: Wouldn't the spike in temperature indicate that the processor is processing a heavy load of some sort? If that's the case, then logic would indicate that eliminating these spikes would slow down at least some of the work the processor does. I'd imagine any good solution would have to wait for the next iteration of Intel's processors.
 
Haha, this thread is hilarious. The i7-7700k seems like it might have issues with temperature spikes, and the resident AMD enthusiasts decided this was a good time to knock on the door and offer everybody some literature. AMD's Ryzen is in no way a good substitute for a 7700k for gaming, and I have no idea why they would try to present it as one.

On topic: Wouldn't the spike in temperature indicate that the processor is processing a heavy load of some sort? If that's the case, then logic would indicate that eliminating these spikes would slow down at least some of the work the processor does. I'd imagine any good solution would have to wait for the next iteration of Intel's processors.

Why not?
 

Costia

Member
Looking at the benchmarks a few posts above, the 7700K performs better than the 1800X.
Looking at Newegg, it is also cheaper.

Jesus, I'm gonna buy a new PC this week, and I was gonna buy an i7 7700K. Good timing.
Should I buy a regular i7 7700 then?
I would say it depends on the price difference.
I got a 4960K because it was ~$10-20 more expensive than the non-K version.
I don't usually OC my CPU until it's time to replace it (so I won't care if it dies).
 

rtcn63

Member
spyshagg said:
That i5 quote pisses me off.

Firstly, launch date benchmarks.

Secondly, those i5's are shitting themselves with 99% CPU usage as of today; Ryzen 7s are at ~40% and Ryzen 5s at ~50%.

Search YouTube channels such as duderandom and TestingGames. Look for the CPU usage. I always say you shouldn't judge Ryzen's worth by today's games.

So you want people to buy a computer part based on potential as-of-yet-unknown future performance? Ryzen (and its hardware compatibility) is still heavily in the process of being updated across the board IIRC, so its performance may be changing week to week.

Also Gamersnexus actually went back and revisited the 1700/1700x after some updates a month later, but I didn't post it because people were talking about the 1800x (actually, they may have re-benched the 1800x as well but I missed it, apologies): http://www.gamersnexus.net/hwreviews/2865-ryzen-revisit-ram-overclock-windows-update-efi-updates
 

ZOONAMI

Junior Member
That i5 quote pisses me off.

Firstly, launch date benchmarks.

Secondly, those i5's are shitting themselves with 99% CPU usage as of today; Ryzen 7s are at ~40% and Ryzen 5s at ~50%.

Search YouTube channels such as duderandom and TestingGames. Look for the CPU usage. I always say you shouldn't judge Ryzen's worth by today's games.

Also, doesn't a 1600x do better than the R7s generally for single core?

I thought I saw some 1600x benches with the Ryzen pretty much in a dead heat with Intel's best chips.

The 1600x is an excellent gaming CPU. It's constantly sold out at my Microcenter.
 

Costia

Member
That i5 quote pisses me off.
Firstly, launch date benchmarks.
Secondly, those i5's are shitting themselves with 99% CPU usage as of today; Ryzen 7s are at ~40% and Ryzen 5s at ~50%.
i5's are not shitting themselves; they are properly utilizing all of the available HW to give you actual performance. And if your assertion of 40-50% utilization is correct, it means that 50-60% of the Ryzen CPU is just wasted HW for gaming purposes.
Search YouTube channels such as duderandom and TestingGames. Look for the CPU usage. I always say you shouldn't judge Ryzen's worth by today's games.
This is the main problem with AMD: their HW is always going to be good for the future, but performs worse under current conditions.
Same with their GCN GPU architecture: it was good for DX12/Vulkan when most games were DX11/9/OGL.
The problem with this approach is that by the time games advance to the new APIs/programming styles that utilize that extra 50-60%, Intel/Nvidia will already have a new generation of HW that does it as well.
 

spyshagg

Should not be allowed to breed
So you want people to buy a computer part based on potential as-of-yet-unknown future performance? Ryzen (and its hardware compatibility) is still heavily in the process of being updated across the board IIRC, so its performance may be changing week to week.

Also Gamersnexus actually went back and revisited the 1700/1700x after some updates a month later, but I didn't post it because people were talking about the 1800x (actually, they may have re-benched the 1800x as well but I missed it): http://www.gamersnexus.net/hwreviews/2865-ryzen-revisit-ram-overclock-windows-update-efi-updates

i5's are not shitting themselves; they are properly utilizing all of the available HW to give you actual performance. And if your assertion of 40-50% utilization is correct, it means that 50-60% of the Ryzen CPU is just wasted HW for gaming purposes.

This is the main problem with AMD: their HW is always going to be good for the future, but performs worse under current conditions.
Same with their GCN GPU architecture: it was good for DX12/Vulkan when most games were DX11/9/OGL.
The problem with this approach is that by the time games advance to the new APIs/programming styles that utilize that extra 50-60%, Intel/Nvidia will already have a new generation of HW that does it as well.

It can be read any way you'd choose. The way I meant it to be read is: my money will never go to a CPU that is pegged at 99% usage in gaming while there are others within 20% of the performance at only 40% usage.

If you buy a CPU every 6 months, then go for the current best scorer. If you buy one every two years like me, no i5's, and the only i7 that would be worth it is the 6900k, but never for 1200 euro.

Ryzen changed the market picture for everyone that buys long term.

ps: i5's are shitting themselves, HARD.
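To put those utilization percentages side by side (overall CPU usage is averaged across all logical threads, so the same number means different things on different chips):

i5 (4 threads): 4 x 0.99 ≈ 4.0 threads' worth of work, no headroom left
R7 (16 threads): 16 x 0.40 ≈ 6.4 threads' worth of work, ~9.6 threads' worth idle

A wider chip can report a much lower percentage while doing comparable or even more total work; the open question in this argument is whether games will ever use that headroom.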
 
[GTA V 1080p benchmark chart]

[The Witcher 3 1080p benchmark chart]

You can easily see a 22+ fps difference.

And, by the time I upgrade my CPU, I start over fresh, so lasting 2 gens means nothing for me.
 
So I'm a noob at CPU overclocking and I have a 6700k alongside my 1080 FTW (soon to be a 1080 Ti).

If I'm looking at this correctly, the only thing I'd hope for by significantly overclocking my CPU (and potentially introducing heat, lifespan and stability issues) is 3-5 fps?

Is there anything I'm missing here that makes it something worth doing outside of the satisfaction of saying you did it? Are there other games that benefit meaningfully more? Or...nah?
 