So now would be a good time to upgrade from my Phenom II?
I just moved up from a 965 BE to a 4690k. I'm really liking the improvement.
Intel: "We can sit on our asses, cause really who is gonna catch up with us?"

It's been suggested earlier in the thread that Intel spends $10 billion annually on research, and that the laws of physics are actually causing issues here.
Open source chip design will really start to take off, too.
In this case it's the laws of physics that are the problem.
Your 2600K or 5820K might last a very long time =/
Maybe then AMD can catch up.
I am clueless about PC gaming, but all the 2500k/2600k replies puzzle me. Were the next generations just minor % improvements or something?
Well, with the sales of computers on the decrease, except for Apple, this is probably a smart decision for Intel to make to compete with ARM. I think at some point they will go back; they still have enterprise processors to make besides consumer processors.

It'll be a while before they are fully competitive with ARM in terms of efficiency. The market wants compact, efficient processors for its mobile lives, but even Intel's Atom series and "efficient" Core M series are still far bigger energy hogs than their ARM cousins.
My 4790K will last me years, lol
This is depressing if true. Can't they go for more cores, or combine several CPUs on the same motherboard, or something like that? I never thought I'd see the day when computer tech stopped evolving toward better performance. How do we go forward now? Better programming? Faster memory?
Gaming PCs are crazy environmentally unfriendly as it is. It shouldn't come as a surprise if governments around the world eventually limit the maximum power consumption of CPUs and GPUs. As resources dwindle, they have already started imposing similar restrictions on all sorts of other devices.
Glad I just purchased my 5820K last year then.
Wondering if future advances in PC games will rely on GPGPU instead of the CPU.
Everybody read the above, as it's the most important sentence that will be posted in this thread.
Until the next Einstein comes along to either resolve the Heat Problem or dramatically decrease the price of modern computer components (which will happen naturally... over decades' worth of time), there really is nothing Intel or anyone else can do to significantly push processor speeds beyond today's top-end hardware. Well, not without things literally erupting in flame.
By the way, if any of you do have a solution to micro- and nanoscopic heat management beyond "make it bigger" (that's not a solution), then you're the richest man walking the Earth and you should go patent that shit right now. And no, an ice cube on the CPU will not help.
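To put rough numbers on the heat problem: here's a minimal sketch, assuming the textbook dynamic-power approximation P = C x V^2 x f and made-up capacitance/voltage/frequency figures (nothing below is measured from a real chip), of why pushing clocks higher makes power and heat climb much faster than the clock itself.

```python
# Back-of-envelope sketch: dynamic CPU power scales roughly as P = C * V^2 * f,
# and pushing frequency usually means raising voltage too, so heat output climbs
# much faster than clock speed. The capacitance and voltage/frequency pairs
# below are invented illustrative numbers, not measurements of any real chip.

def dynamic_power(capacitance_farads, voltage_volts, frequency_hz):
    """Classic switching-power approximation: P = C * V^2 * f."""
    return capacitance_farads * voltage_volts ** 2 * frequency_hz

C = 1e-9  # effective switched capacitance (illustrative only)

for freq_ghz, volts in [(3.5, 1.10), (4.5, 1.30), (5.5, 1.50)]:
    watts = dynamic_power(C, volts, freq_ghz * 1e9)
    print(f"{freq_ghz:.1f} GHz @ {volts:.2f} V -> ~{watts:.1f} W (relative scale)")
```

In this toy example a ~57% clock bump roughly triples the switching power, which is the whole problem in one line of arithmetic.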
My 2500K at 4.6GHz will never have to be replaced, will it?

I just upgraded from a 3570k to a 5820k and saw massive gains. I think the 2500k is adequate but outdated. My workflow has improved greatly with the new CPU and a lot of the slowdown I was getting in modern games has cleared up without changing my GPU.
Well... do we really need more processing power for playing games?

Not necessarily, but there are many other tasks that benefit greatly from increased CPU power. I spend a lot of time producing video content, and any amount of CPU you can throw at it has a beneficial effect.
(This is going to sound kind of bitter, but it's just my $0.02.)
I get why you guys are happy that you will be able to keep your CPUs for a long time, but this has far-reaching effects beyond gaming. Unless there is a material/approach switch (be it the higher clocks of graphene, the smaller feature sizes of carbon nanotubes, or something even more exotic like optical/quantum computing), there may be a lot of stalls in the current progression of technology in many fields. Computing affects many other industries, and the hunt for true artificial intelligence in particular will suffer (even if temporarily) from a delay. On the other hand, there is a much needed switch off of silicon coming, and the only real thing that may satiate the hunger for more powerful CPUs before such a switch is pushing parallelism toward the limit described by Amdahl's law (and even then we are already close to that limit). So in many ways this is bad, but it also has the potential for massive breakthroughs down the road.
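For what it's worth, here's a minimal sketch of the Amdahl's law ceiling mentioned above, assuming an arbitrary 90% parallel fraction (the numbers are purely illustrative, not taken from any real workload): even with a huge core count, the serial 10% caps the speedup at 10x.

```python
# Minimal sketch of the Amdahl's law limit: speedup = 1 / ((1 - p) + p / n),
# where p is the fraction of work that parallelizes and n is the core count.
# The 90% parallel fraction below is an arbitrary illustrative figure.

def amdahl_speedup(parallel_fraction, n_cores):
    """Speedup predicted by Amdahl's law for a given parallel fraction."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cores)

p = 0.90  # assume 90% of the workload can run in parallel
for cores in (2, 4, 8, 64, 1024):
    print(f"{cores:5d} cores -> {amdahl_speedup(p, cores):.2f}x speedup")
# With p = 0.90 the speedup can never exceed 10x, no matter how many cores.
```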
However, I do think Intel is calling it quits rather early, as there have been successful 1nm transistor demonstrations, although the yields on those will be beyond bad. So here's hoping for a materials/approach switch in the near future, since at the moment this is rather bad for the progression of technology as a whole. While you can always spin up more low-power machines, that doesn't replace a single power-hungry machine that outperforms ten of those low-powered ones (and which you can then buy ten of), and the more computers you add to these supercomputing clusters, the more the network, coordination, and messaging overhead, plus the limits of sheer parallel scalability, come into play.
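And a toy illustration of that last point (the overhead model and constants here are my own invention, purely to show the shape of the curve): as you add nodes to a cluster, coordination and messaging costs eventually eat the gains, which is why ten weak machines don't simply replace one strong one.

```python
# Toy model: splitting one job across n small machines adds coordination and
# network overhead that grows with n, so effective speedup flattens and then
# falls. The overhead constant is invented purely to illustrate the trend.

def effective_speedup(n_nodes, per_node_overhead=0.02):
    """Ideal 1/n share of the work plus a sync cost that grows with node count."""
    compute_share = 1.0 / n_nodes            # each node's slice of the work
    sync_cost = per_node_overhead * n_nodes  # messaging/coordination grows with n
    return 1.0 / (compute_share + sync_cost)

for n in (1, 2, 10, 50, 100):
    print(f"{n:4d} nodes -> {effective_speedup(n):.2f}x effective speedup")
```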
You need to work on your bitter, bro.
Would it help if I said true AI was impossible?
I have one. It's not enough for some games that have terrible multi-threading performance unless you win the jackpot in the overclocking-potential lottery or don't care about 60fps.