> Finally being able to stream 4K content from Amazon and Netflix is nice, though I don't think there are any other CPUs out there that allow you to do that.

How many people have 4K monitors or TVs?
I'm looking to possibly do a new build with new memory/board/CPU (pulling the rest of the parts from my current build).
Had both the new i5 and i7 in my cart... went to look at reviews... changed my mind.
> How many people have 4K monitors or TVs?

Are you seriously trying to suggest that 4K on TVs is rare? It is so common and affordable that almost every new 40"+ TV is 4K.
These are all 4-core processors.
For gaming today (or in the near future), is there any reason to go with 6-, 8-, or 10-core CPUs?
Is there any need for me to upgrade from a 5820K OC'd to 4GHz?
Be aware that Anandtech's gaming benchmarks are absolutely worthless.
So few sites seem to know how to actually run a CPU-limited game benchmark now, rather than selecting games where the CPU doesn't matter, or having their test be GPU-limited.
CPU tests in games should be performed with the fastest GPU possible (1080 or Titan XP) at a low enough resolution that GPU load never comes anywhere close to 100%.
720p would be more appropriate than 1080p in many recent games.
If the test is looking at average framerates and not minimums, the results won't tell you anything useful.
Frametime graphs/percentiles are an even better metric.
A 2500K doesn't cut it any more if you're trying to play new games and your GPU is a GTX 960 or faster.
And new i5s (6600K) are losing to old i7s (2600K) now that many games are starting to make use of more than four threads.
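To make the minimums/percentiles point concrete, here's a minimal sketch of how I'd crunch a frame-time log (assuming a plain list of per-frame times in milliseconds, e.g. exported from FRAPS or PresentMon; the input format and helper names here are just for illustration):

```python
# Minimal sketch: summarize a frame-time log the way a CPU benchmark should,
# with averages AND worst-case metrics. Input is a list of per-frame times
# in milliseconds.

def summarize(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = n / (sum(frame_times_ms) / 1000.0)

    # 1% low: average FPS over the slowest 1% of frames -- these spikes
    # are what you feel as stutter, even when the overall average is fine.
    slowest = sorted(frame_times_ms, reverse=True)[: max(1, n // 100)]
    low_1pct_fps = 1000.0 / (sum(slowest) / len(slowest))

    # 99th-percentile frame time: 99% of frames finished faster than this.
    p99_ms = sorted(frame_times_ms)[min(n - 1, int(n * 0.99))]
    return avg_fps, low_1pct_fps, p99_ms

# Mostly 60 fps (16.7 ms) with occasional 20 fps (50 ms) hitches.
times = [16.7] * 980 + [50.0] * 20
avg, low, p99 = summarize(times)
print(f"avg: {avg:.0f} fps | 1% low: {low:.0f} fps | 99th pct: {p99:.1f} ms")
```

Note how the average still looks close to 60fps while the 1% low and 99th-percentile numbers expose the stutter. That's exactly what an averages-only benchmark hides.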
Those gaming benchmarks are weird, seems like every single one of them is bottlenecked by the GPU. Why would you test CPUs on ultra settings?
> How many people have 4K monitors or TVs?

Almost all TVs sold now are 4K...
But what about minimum FPS and frame times?
> Be aware that Anandtech's gaming benchmarks are absolutely worthless. [...]

Most everything I've played is just fine on an overclocked 3570k.
> I'd much rather have professional sites run these benchmarks at settings and resolutions that actually make sense and reflect real world usage. [...]

Congratulations, you also don't understand how CPU benchmarking in games should be performed.
If the average FPS is that close, I doubt that the minimum FPS would have a significant disparity. But yes, it'd be nice to see more detailed benchmarks. I'm not holding my breath on any major improvements.
The whole article is a shit show. Anandtech has fallen since Anand left. I stopped reading on page 1 when he stated that Mac and Linux didn't support the speed shifting... 2 minutes of googling and I found a patch from an Intel employee for Linux back in 2014 that supported it (Broadwell was in dev at the time, same tech), and macOS has also supported it for some time.
If you can't do your research, you really shouldn't be writing reviews, IMO.
I'd much rather have professional sites run these benchmarks at settings and resolutions that actually make sense and reflect real world usage. Who the fuck games at 720p? What I, and I think most people, are looking for in these benchmarks is how it affects performance in the real world. No one is buying a $400 i7 so that they can game at 720P.
> I still don't see any reason to upgrade my 3770k.

It's probably about a 40% performance improvement, and potentially a decent OC with the latest chip (it should OC better than Skylake). I'm on the same CPU and it's getting to the point where it's almost worth it.
> I stopped reading on page 1 when he stated that Mac and Linux didn't support the speed shifting... [...]

You must be confusing Speed Shift with something else. It was introduced with Skylake (2015), so I doubt there was a Linux patch for it in 2014.
Anyone?
> https://patchwork.kernel.org/patch/5246361/

Assuming this is the same thing - and it certainly looks like it - I stand corrected.
You might be right about Broadwell not supporting it, but Speed Shift (aka hardware P-states, in non-marketing speak) has definitely been in the kernel since 2014. That's in git, of course; the first stable release may have been a few months out from that.
Also consider that Intel usually has new features land in Linux months before shipping hardware. They like to have a lead in the kernel, since many distributions use older kernels for stability.
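If anyone wants to check their own machine, here's a quick sketch for Linux. The "hwp" CPUID flag in /proc/cpuinfo and the intel_pstate status file are what I'd look at; the sysfs path assumes a reasonably recent kernel, so treat a missing file as "can't tell", not "unsupported":

```python
# Quick check for Speed Shift (hardware P-states, a.k.a. HWP) on Linux.
# The CPU advertises support via the "hwp" flag in /proc/cpuinfo; on newer
# kernels the intel_pstate driver also reports its status under sysfs.

def cpu_has_hwp():
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return "hwp" in line.split()
    return False

def intel_pstate_status():
    try:
        # "active" means the driver manages P-states itself, using HWP
        # when the hardware supports it.
        with open("/sys/devices/system/cpu/intel_pstate/status") as f:
            return f.read().strip()
    except FileNotFoundError:
        return "intel_pstate not present (or kernel too old to expose status)"

print("CPU advertises HWP:", cpu_has_hwp())
print("intel_pstate status:", intel_pstate_status())
```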
Are these CPUs capable of being OC'd in a similar way to the i5 2500K? Mine is OC'd to 4.6GHz and I play at 1080p. Wondering if I should upgrade; I get 60fps in every game I play with my current setup: 16GB of RAM and GTX 980s in SLI.
> Are you seriously trying to suggest that 4K on TVs is rare? [...]

Then why don't I (and not only me) have it, if it's so affordable? I guess there are some shitty 4K TVs available, but I'm not talking about those, you know. The same goes for 4K monitors; the actually decent ones start at 27" and above.

- Shitty 4K monitor for poor people
- Decent 43" 4K monitor for the not-so-poor
- Decent 27" 4K monitor for the not-so-poor

Now, I can't buy any of the decent ones, because I'd need to work for about half a year to afford one, and that's without eating or drinking anything for that half a year. So yeah, very affordable indeed.
It's crazy that my 2012 i7 machine is still viable.
I have a three-year-old i5 4570k and I've been wondering whether I should replace it. I definitely get some stutter in open-world games, and I'm pretty sure the CPU is the reason for that. But it also feels like the improvements in Intel CPUs over the past few years have been so minor that I shouldn't even bother.
Hopefully that new AMD CPU lives up to the hype.
I think it's finally time to retire the 2500k.
The minimum framerates are killing me.
These benches seem severely GPU limited.
Same here. I keep saying maybe next year will be the one. It never is.
Maybe next year though.
Why are you coming into a thread about top-of-the-line tech and getting mad at people who are discussing the advancements of said tech? If you can't afford it, then I'm sorry. But you shouldn't get all twisted up just because a CPU is capable of streaming 4K from services besides YouTube.

> Why are you coming into a thread about top-of-the-line tech and getting mad at people who are discussing the advancements of said tech? [...]

I'm not mad, it's just... wrong to say such things when the decent stuff is not really all that affordable for everyone. Now, I have a high-end rig and I can afford high-end stuff, but I'm not going to buy shitty hardware, and the decent hardware is currently too expensive. And if I'm going to buy shitty hardware, then I might as well sell my high-end rig and use a potato PC instead.

And again, I'm not the only one who can't afford this decent 4K stuff, and when it is affordable, everyone will have a decent 4K TV and 4K monitor... just not right now. 4K is only getting there, just like Blu-ray once was, and now Blu-ray players are common and actually affordable for everyone to have at home.

Oh, and the decent stuff is just the tip of the iceberg, and really not all that expensive compared to 150K RUB monitors or TVs. So yeah, the decent stuff is not even the best there is that you can buy.

Nowadays you can get a $500 55" 4K TV. A cheap one, sure, but still. The average Joe can buy one without losing an arm.

> Nowadays you can get a $500 55" 4K TV. [...]

Does this average Joe not have better things to do with his life? I mean, I guess he could take out a loan and then work his ass off 24/7 to pay it off for god knows how long... But hey, at least he'd have a 55" 4K TV in his house. Cool, right? Well, no. $500 is a lot of money that could be spent on something else, a lot more important. I'm not telling people what to do, but this is not something the average Joe can buy. I'm sorry, it's not.