Aren't they sponsored by Intel? lol
I think the title is more because Linus did a video recently criticizing a LOT of stuff about this release, despite having Intel as a sponsor.
"None of this is really relevant for the gaming side, as no CPU in the past 5 years has been taxed by any game in real-world use.
https://m.hardocp.com/article/2017/05/26/definitive_amd_ryzen_7_realworld_gaming_guide/7
Still rocking the old 2600k for flawless VR. Put your money into a GPU unless you need a workstation."

Is this a joke? There are plenty of CPUs made in the past 5 years that will completely bottleneck you in some games.
AMD Ryzen is really, really competitive. Finally some damn competition in the CPU space.
Really, really weird to see AMD more efficient, wattage-wise per core, than Intel.
"When is Coffee Lake coming out?"

Somewhere between August 2017 and February 2018. Have fun second-guessing Intel D:
Damn it! Why you always beating me 😝.
Edit - Woof!
Intel FX represents
"But for the 140W Skylake-X parts, we recorded nearly 150W power consumption. Intel announced that the socket is suitable up to 165W,

The gaming story is unfortunately not quite as rosy. We had last-minute BIOS updates to a number of our boards because some of the gaming tests were super underperforming on the new Skylake-X parts. We are told that these early BIOSes are having power issues to do with turboing, as well as Intel's Speed Shift technology when the GPU is active.

While these newer BIOSes have improved things, there are still some remaining performance issues to be resolved. Our GTX 1080 seems to be hit the hardest out of our four GPUs, as well as Civilization 6, the second Rise of the Tomb Raider test, and Rocket League on all GPUs. As a result, we only posted a minor selection of results, most of which show good parity at 4K. The good news is that most of the issues seem to happen at 1080p, when the CPU is more at fault. The bad news is that when the CPU is pushed into a corner, the current BIOS situation is handicapping Skylake-SP in gaming."
"None of this is really relevant for the gaming side, as no CPU in the past 5 years has been taxed by any game in real-world use.
https://m.hardocp.com/article/2017/05/26/definitive_amd_ryzen_7_realworld_gaming_guide/7
Still rocking the old 2600k for flawless VR. Put your money into a GPU unless you need a workstation."

BF1 64-player is a CPU destroyer. That and Ashes of the Singularity.
"Dude, I have an i5-750, under load it's like 180 watts! These puppies look mad efficient to me."

What are you talking about? The i5-750 has a 95W TDP; wherever you're seeing the 180W value, I'd guess that's full system load, not just the CPU alone.
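The gap between a 95 W CPU TDP and ~180 W at the wall is plausible once the rest of the system and PSU losses are counted. A rough sketch, where every component figure is an illustrative assumption rather than a measurement:

```python
# Illustrative breakdown of ~180 W wall draw vs. a 95 W CPU TDP.
# All numbers below are assumptions for the sake of the estimate.
cpu = 95               # CPU package power under load (its rated TDP, W)
gpu_idle = 30          # GPU mostly idle during a CPU-bound load (assumed, W)
rest = 25              # motherboard, RAM, drives, fans (assumed, W)
psu_efficiency = 0.85  # typical PSU efficiency of that era (assumed)

# Power measured at the wall includes PSU conversion losses.
wall_power = (cpu + gpu_idle + rest) / psu_efficiency
print(f"estimated wall draw: {wall_power:.0f} W")  # ~176 W, close to the quoted 180 W
```

So a wall-meter reading near 180 W is entirely consistent with the CPU itself staying within its 95 W rating.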
"Probably not something you would want for games, unless you can get them to 3.9/4.0 GHz on all cores."

Do you mean these HEDT chips or Threadripper?
Anandtech said:
"Pushing all of the Core i9-7900X's cores with Prime95 or LuxRender propels power consumption to incredible heights. You do get 48 percent more rendering performance in LuxRender, but at the expense of 58 percent higher power use. This approach has the elegance of a sledgehammer."
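Sanity-checking the quoted figures: +48% performance for +58% power is a net perf-per-watt loss, which is easy to verify:

```python
# Relative figures quoted from the review.
perf_gain = 1.48   # 48 percent more rendering performance
power_gain = 1.58  # 58 percent higher power use

# Efficiency relative to the baseline configuration.
perf_per_watt = perf_gain / power_gain
print(f"relative perf/watt: {perf_per_watt:.3f}")  # ~0.937, i.e. ~6% worse efficiency
```

In other words, the all-core push buys throughput at a measurable efficiency cost, which is exactly the "sledgehammer" the review describes.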
"Do you mean these HEDT chips or Threadripper?"

Threadripper. Sorry if that wasn't clear.
"For these HEDT CPUs, personally, I'm really interested in seeing how Turbo Boost 3.0 interacts with overclocking, and what options the boards have for adjusting that. It would be pretty neat for an 'all-purpose desktop' if you could get, say, a 7820X to run at >=4.6 GHz if only 1 or 2 cores are loaded, and progressively less down to say 4.0 GHz when all cores are used (and a lot less than that if it's AVX code). That should keep power consumption under control while giving you near-top-of-the-line performance in both stuff that doesn't scale well and the average moderately well-scaling application. It would be a shame if the Turbo doesn't have the level of configuration options to make something like this feasible."

Yes, I've been saying for a while that CPUs should have that level of control.
Will these CPUs create a price drop on the mid-range lineup?
"Oh man, I'm debating between the i7-7800X or the Ryzen 1600X. I'm leaning towards the Ryzen at this point. Anyone in this thread have one that can give me a first-hand experience with it?"

While I can't give better than second-hand knowledge of Ryzen atm: unless you need quad-channel memory or the PCIe lanes, I really can't recommend the 7800X at $150 more than the 1600X, plus the additional cost of the X299 platform.
"I mean, given the current landscape, if you can wait it's the better thing to do; otherwise the real HEDT (Skylake-X) seems like a great platform."

What's 'real' supposed to mean here?
Do you mean these HEDT chips or Threadripper?
For these HEDT CPUs, personally, I'm really interested in seeing how Turbo Boost 3.0 interacts with overclocking, and what options the boards have for adjusting that.
It would be pretty neat for an "all-purpose desktop" if you could get say a 7820X to run at >=4.6 GHz if only 1 or 2 cores are loaded, and progressively less down to say 4.0 GHz when all cores are used. (And a lot less than that if it's AVX code)
That should keep power consumption under control while giving you near top of the line performance in both stuff that doesn't scale well and the average moderately well-scaling application.
It would be a shame if the Turbo doesn't have the level of configuration options to make something like this feasible.
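The per-active-core scheme described above is essentially a lookup table from loaded-core count to clock. A minimal sketch, using the hypothetical 4.6 GHz (1-2 cores) to 4.0 GHz (all cores) range from the post; the intermediate bins and the AVX offset value are invented for illustration:

```python
# Hypothetical per-active-core turbo table for an 8-core part.
# Frequencies step down from 4.6 GHz (1-2 cores) to 4.0 GHz (all 8),
# with an extra clock offset applied when AVX code is running.
def turbo_ghz(active_cores: int, avx: bool = False, avx_offset: float = 0.5) -> float:
    table = {1: 4.6, 2: 4.6, 3: 4.5, 4: 4.4, 5: 4.3, 6: 4.2, 7: 4.1, 8: 4.0}
    freq = table[active_cores]
    # AVX-heavy code draws more power per cycle, so drop the clock further.
    return freq - avx_offset if avx else freq

print(turbo_ghz(2))            # 4.6 — lightly threaded load
print(turbo_ghz(8))            # 4.0 — all cores loaded
print(turbo_ghz(8, avx=True))  # 3.5 — all cores, AVX workload
```

Whether the X299 boards expose enough knobs to configure each of those bins (rather than a single all-core multiplier) is exactly the open question in the post.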
Oof...that's a lot of power. I'm still happy with my 5930K but I'm always itching for something faster. I'm more worried about heat than I am about price. Even the coolest chips struggle in our heat and humidity.
Is it safe to upgrade my 2600k yet?
3570k stays winning apparently
I'll keep this until games don't run. Until then, performance only increases in upcoming CPUs!
I'm guessing I should pass on this, as my primary purpose is gaming only. Mind you, I've been running an i5-3570K for the past 5 years. What should be my next CPU upgrade for gaming purposes? Coffee Lake?
My 550 watt Gold PSU is cowering behind my leg.
"Okay, I was planning on building a new PC (video gaming, video editing, audio recording & editing) with something akin to these parts: https://ca.pcpartpicker.com/list/jhBB6X. Now I'm curious: with all the talk of Ryzen/Ryzen+/Threadripper, is it better to go with an AMD CPU, i.e. get the 1700/1700X, or wait for Threadripper? Because Intel's latest offerings (7700K, i9) seem... lackluster in terms of performance per dollar in comparison to Ryzen."

It's very unlikely to make much sense to get a Threadripper (or any of the i9s, for that matter) for a PC which has gaming as one of its primary tasks.
Argh, my 5820K is starting to feel a little old, even though it's still an amazing CPU.
"My 3930k gets worse by the week. Back in 2012 it did 4.9 GHz at 1.47 V; now it's 4.4 GHz at 1.47 V. Don't know how much longer she can hold on, and I really would just rather upgrade to a new rig than stay with this aging platform for another ~2 years. I've abused the CPU for years."

Uhh, 1.47 V is pretty high; no wonder your CPU is getting worse.
Intel's mainstream is out of the question since I need 40+ PCIe lanes, and Ryzen (and TR, I will assume) hits a frequency OC wall around 4 GHz (probably due to its LPP process), making it a bite-the-bullet situation for badly optimized/threaded games until the next-gen chips come. And for i9, I'd have to get a minimum of the 7900X for the PCIe lanes, which, performance problems aside, looks like it will be a furnace and suck power if it's clocked high.
Fun.
The i7 6700k is the GOAT CPU still for the price. I see no reason at all to buy any of these processors if you are simply using it for gaming.
"How can any CPU be the GOAT for the price when it's only been out for less than 2 years and the 2600k exists?"

The 6700k smokes the 2600k in gaming. I consider $275 for the 6700k an insane deal, and it's very clearly the processor to buy right now.