Well, I am out then if the other reviews confirm this. If Intel can't be bothered to get temps under control, I am not going to bother with their CPUs.
I will take the potential FPS hit and go with Ryzen, or just wait for Ryzen 2. Intel can take their substandard thermal paste (instead of proper soldered TIM) and shove it.
10c hotter than the 7700K?
LOL. Intel are going to tell us not to overclock again, aren't they?
Time to redesign the age-old PC case into a mini fridge.
If there is one thing Intel needs to improve on with their newer chips, it has to be temps: they want to add more cores to compete, but at the same time they have to downclock them a lot, and they still run piping hot.
Is a 2.3% downclock on a new hexacore, compared to the fastest quadcore on the market, really a lot?
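For rough context, a figure like that can be sanity-checked with back-of-the-envelope arithmetic. The clock values below (4.5 GHz for the quadcore, 4.4 GHz for the hexacore) are assumptions for illustration only, not numbers from this thread:

```python
# Hypothetical clocks chosen only to illustrate the percentage math;
# the real parts' clocks may differ.
quad_ghz = 4.5   # assumed fastest quadcore boost clock
hexa_ghz = 4.4   # assumed hexacore all-core boost clock

downclock_pct = (quad_ghz - hexa_ghz) / quad_ghz * 100
print(f"{downclock_pct:.1f}% downclock")  # prints "2.2% downclock"
```

A ~100 MHz deficit at these clocks works out to roughly 2%, which is in the ballpark of the quoted figure.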
I would wait for release-day reviews before judging temps (and performance as well, of course).
Yet it's touching 10C hotter than the 7-series version, right?
It has 50% more cores and runs at essentially the same clocks on the same process. What is so surprising here?
My 4770K will do just fine for the time being.
I want the chip not to run at the same temperature as my oven.

I mean, the base clock alone is higher than most overclocked Ryzen CPUs, I believe, and you have the extra cores as well, so what do you really want?
7700K temperatures were not really something that should have happened. Now we are running 10 degrees higher than that. You don't see a problem here?
What is the context of those temperatures anyway?
Don't know; I assume that is full load at stock speeds? Those temps are just bad, because they leave pretty much no room to safely overclock whatsoever. The only option would be to delid the CPU (which is really only recommended for the biggest enthusiasts), and no one wants their PC to catch fire.
But as has been said in here, we won't know for sure until reviewers tell us what is up with those temps. All I know is I would never get a CPU that runs that hot; not worth the risk to me.
What temperature do you think is dangerous for the CPU?
I think going to higher-bandwidth DDR4 memory should have a positive impact on 4K gaming.

Don't upgrade your i5 2500K or i5 3570K CPUs for gaming at 4K; benchmarks show no gains at all.
Would like to see 1440p benchmarks.
But really, cutting-edge CPUs and no gains at cutting-edge gaming standards. Meh.
It's not a bad idea. You can buy a 1600 now and then pop a Ryzen 2 into your motherboard in 2019.
Don't some games still benefit? I quickly tried Crysis 3 and see the same framerate drops at the starting area in 1080p and in 4K (below 60fps), leading me to believe that my 4.5GHz 3570K is holding me back.
Yeah, your i5 is definitely holding you back there. According to DigitalFoundry, even an i5 7600K can struggle with Crysis 3; it's a game that wants more than 4c/4t.
Wait what, seriously? I mean, it's one review, but that's disheartening to hear. Guess the wait continues.
With a 6700K you should be well above 60fps at all times in Overwatch, even when streaming. Unless you're using some high quality encoding settings, I don't think your CPU is the bottleneck.
I don't really get it either. I mean, I'm all for lower temps to have more room for overclocking, but if you do care about that, the reaction shouldn't be to go for something slower.
So the new 15W mobile CPU is almost i7-2600K performance!
https://www.computerbase.de/2017-09/core-i7-8550u-i5-8250u-test-kaby-lake-refresh-meilenstein/2/
This i7-8550U is already the 4C8T part: https://ark.intel.com/products/122589/Intel-Core-i7-8550U-Processor-8M-Cache-up-to-4_00-GHz

And if these are the gains at 15 watts, what are we looking at with mobile quadcores?
And if the 8-core desktop CPUs are landing Q2 or Q3 2018, does that mean we will see laptop quadcores next year too?
What's wrong with 7700K temperatures?
I mean, they are on the higher side of things, but this is a top-end CPU; it is pretty much always like this with them, which is why I personally prefer mid-range HEDT "ticks," as they are usually rather cool. With the 8700K there will be the option of the 8700, which is only 100MHz slower in general but is rated at a 65W TDP.
Also, I know of an old Gulftown CPU which has been running at 100C at full load without much issue for more than seven years now. So it's really a question of which temperature a CPU can live with. Then there are AIOs, which are pretty much a standard thing on top K models, as people rarely buy these to run at stock clocks.
Another review leak/early review, this time of both 8600K and 8700K, these results are looking pretty good.
http://lab501.ro/procesoare-chipset...-i5-8600k-coffee-lake-aorus-z370-ultra-gaming
Less than a week to go to know for sure how it performs.
I have seen reports of quite a bit higher temps on the 7700K, especially OC'd. Now add 10 degrees to that and you will be running at 100 degrees on a 6-core with terrible thermal compound. I just don't trust that to last or not overheat my case. Mind you, I do run a standalone Corsair water cooler for the CPU, but I still don't want to see those temps.

The suggestion below about grabbing a 1600 or 1600X this year and upgrading to Ryzen 2 on a discounted motherboard toward the end of 2018 or in 2019 kind of makes sense. Plus I mainly game at 3440 x 1440, so I am more GPU-limited anyway (waiting for next year to upgrade my 980 Ti).
Meh, not worth the hassle for someone already at a 4790K @ 4.8.
Try harder Intel.
8600K hitting a 5.1GHz overclock, not bad. Hot, but good speed. Wish this reviewer had overclocked the 8700K and the Ryzen CPUs as well.
So you're upset that the hardware you bought a few years ago is still very competent?
Yes, we should be much, much farther along in terms of performance in the CPU area.
Every area in the IT industry is soaring (SSDs, networking, GPUs, etc.), but in CPUs we have at best a measly 10% increase.
I want to upgrade my aging DDR2 server to a current PC, but I don't want to do a meaningless upgrade either.
You are saying that from the viewpoint of a professional engineer, I presume.
You forgot to quantify the other areas in terms of percentages; I am fascinated.