Ignorant reply?
Gaming PCs are crazy environmentally unfriendly as it is. It shouldn't come as a surprise if governments around the world eventually limit the maximum power consumption of CPUs and GPUs. As our resources are dwindling, they have already started to impose similar restrictions on all sorts of devices.
Still using my 3-year-plus 3770K, which hasn't been overclocked...
Doesn't that kind of depend where you get your electricity from?
We get electricity from a renewable source, so I don't really see how a gaming PC for me would be environmentally unfriendly.
Next Gen canceled.
The CPUs you see in gaming rigs are CPUs you'd find in any given desktop depending on the year you buy it. Or console. The difference between gaming PCs and everything else is the GPUs, mostly, which already have a wide range of power efficiencies to choose from and don't always run at max tilt.
Nice to know at least one of the components of my thingie will age gracefully.
Even if you played on a gaming PC full tilt 12 hours a day 365 days a week it'd still use less energy than a refrigerator in the same time period.
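A quick back-of-the-envelope calculator for that comparison. The wattages below are assumptions for illustration, not measurements; plug in your own hardware's numbers and see how it shakes out.

```python
# Back-of-the-envelope energy comparison; all figures are assumptions,
# not measurements -- swap in the numbers for your own hardware.

PC_LOAD_WATTS = 350        # assumed average draw of a gaming PC under load
HOURS_PER_DAY = 12
DAYS_PER_YEAR = 365

FRIDGE_KWH_PER_YEAR = 500  # assumed annual consumption of a typical fridge

pc_kwh_per_year = PC_LOAD_WATTS * HOURS_PER_DAY * DAYS_PER_YEAR / 1000

print(f"Gaming PC:    {pc_kwh_per_year:.0f} kWh/year")
print(f"Refrigerator: {FRIDGE_KWH_PER_YEAR:.0f} kWh/year")
```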
Seems like I'll never get rid of the 5820K I'm buying.
Those are some long-ass weeks man.
I hope this means AMD can come up now and provide more affordable, but competitive alternatives.
ayy lmao
Graphene?
Thanks to HSA, PCs & consoles will go back to having specialized co-processors, because now they can work together better.
Why have a beastly CPU running hot & using a lot of energy when you can have a specialized processor for the task that can do the job more efficiently?
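A toy sketch of that idea: route each kind of task to whichever execution unit is assumed to handle it most efficiently. The device names and the "energy cost" table here are made up for illustration; this is not a real HSA runtime or API.

```python
# Toy illustration of heterogeneous dispatch: send each task to whichever
# execution unit is assumed to handle it most efficiently. The device names
# and the energy-cost table are hypothetical, not a real HSA runtime.

ENERGY_COST = {  # assumed joules per unit of work, per (task, device)
    ("matrix_multiply", "gpu"): 1.0,
    ("matrix_multiply", "cpu"): 6.0,
    ("branchy_logic",   "cpu"): 1.0,
    ("branchy_logic",   "gpu"): 8.0,
    ("video_decode",    "fixed_function"): 0.2,
    ("video_decode",    "cpu"): 5.0,
}

def pick_device(task: str) -> str:
    """Return the device with the lowest assumed energy cost for this task."""
    candidates = {dev: cost for (t, dev), cost in ENERGY_COST.items() if t == task}
    return min(candidates, key=candidates.get)

for task in ("matrix_multiply", "branchy_logic", "video_decode"):
    print(task, "->", pick_device(task))
```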
article said:
Intel has said that new technologies in chip manufacturing will favour better energy consumption over faster execution times, effectively calling an end to Moore's Law, which successfully predicted the doubling of density in integrated circuits, and therefore speed, every two years.

Moore's Law is about transistor density and cost; it does not govern how those transistors are used in processor design (performance vs. power).
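For reference, the density-doubling reading of Moore's Law is just an exponential with a two-year period. A small sketch; the 2015 baseline density is a rough assumption for illustration.

```python
# Moore's Law as a density-doubling rule: density roughly doubles every
# two years. The 2015 baseline value is an assumed, illustrative figure.

BASE_YEAR = 2015
BASE_DENSITY = 1.0e8   # assumed transistors per mm^2 at the base year
DOUBLING_PERIOD = 2.0  # years

def projected_density(year: float) -> float:
    """Density predicted by a naive doubling-every-two-years rule."""
    return BASE_DENSITY * 2 ** ((year - BASE_YEAR) / DOUBLING_PERIOD)

for y in (2015, 2017, 2019, 2021):
    print(y, f"{projected_density(y):.2e} transistors/mm^2")
```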
CPU bro-fist. 4690K owner here, in for the long haul.
I find it hard to believe that. If AMD had comparable performance, Intel would find a way to improve perf.
My 2500K at 4.6GHz will never have to be replaced, will it?
I still have no idea if I should upgrade my CPU.
I now have a 980 Ti as my GPU, but my CPU is a little i5 2320 (3.0GHz). I don't know if this is going to be a bottleneck.
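One rough way to check is to compare CPU and GPU utilization while the game is running: if the CPU is pegged while the GPU sits well below 100%, the CPU is likely the limiter. A minimal sketch, assuming an NVIDIA card with nvidia-smi on the PATH and the third-party psutil package installed.

```python
# Rough CPU-vs-GPU bottleneck check: sample both utilizations while a game
# runs. Assumes an NVIDIA GPU (nvidia-smi available) and psutil installed.

import subprocess
import psutil

def gpu_utilization_percent() -> float:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return float(out.strip().splitlines()[0])

cpu = psutil.cpu_percent(interval=5)   # average over a 5-second window
gpu = gpu_utilization_percent()
print(f"CPU: {cpu:.0f}%  GPU: {gpu:.0f}%")
if cpu > 90 and gpu < 80:
    print("Looks CPU-bound.")
```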
They could probably easily increase the performance, but at the cost of efficiency/power consumption. And that's not what they are aiming for.
Well, with sales of computers on the decline (except for Apple), this is probably a smart decision for Intel to make to compete with ARM. I think at some point they will go back; they still have enterprise processors to make besides consumer processors.

It's funny seeing Intel scramble to compete with ARM, but at the same time it's coming at the cost of desktop power.
the early years of semi-quantum-based chip production
My 3770k at 4.2GHz will last me until I quit PC gaming, I guess. Awesome. Upgraded from an i7-920, which might still be useful for today's games too lol.
Gemüsepizza;194388992 said:
Doubt it. Modern CPUs/GPUs are not inefficient. You just have to think about where you want to run your code and how you write it. Development of big, specialized and exotic processors isn't something we will see anytime soon again in the console world; the costs are just too high. Nobody wants to spend countless billions on chip development. Complexity and requirements are just too high today. The current model of using off-the-shelf components is extremely successful. I don't see them changing this in the near future.
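A small example of the "how you write it" point above: the same sum written so the hardware's optimized vector kernels get used versus a plain interpreter-bound loop. It assumes the third-party numpy package is installed, and the exact timings will vary by machine.

```python
# Illustration of "how you write it": the same computation written so
# optimized vector kernels do the work (NumPy) vs. a plain Python loop.
# Requires the third-party numpy package; timings vary by machine.

import time
import numpy as np

data = np.random.rand(5_000_000)

t0 = time.perf_counter()
total_loop = 0.0
for x in data:                      # scalar, interpreter-bound path
    total_loop += x
t1 = time.perf_counter()

total_vec = float(np.sum(data))     # vectorized path
t2 = time.perf_counter()

print(f"loop: {t1 - t0:.2f}s  vectorized: {t2 - t1:.4f}s")
print(f"results agree: {abs(total_loop - total_vec) / total_vec < 1e-6}")
```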