Would starting at 28nm potentially limit their ability to shrink the die in the future?
Does this make my Final Fantasy games prettier?
A stock 4 core Jaguar APU has a TDP of 15W with the GPU included. That's literally passive cooling territory.
Is it harder to shrink 28 -> 24 compared to 32 -> 24? Or what do you mean? Because starting out at 28nm seems like an advantage to me...
I'll admit, bad example. But what I was saying still stands: you won't notice the difference (without measuring/benchmarking) on this CPU with a 400MHz overclock.
I imagine it gets exponentially more difficult the smaller the die gets. And I would think that starting with a smaller die would be advantageous in terms of heat and power but harder to get good yields.
So we are most likely looking at an additional 5fps - 10fps, in performance terms.
That's a pretty strange way of imagining the results, because that can't be how the performance is used. People want to run at 30 or 60; 35fps would be horrible. What would this mean in real-world application? More sparks? More NPCs?
What is that in GDDR5 RAM?
What's up with all these better-than-expected specs? The GPU with its custom parts was also a positive surprise, not to mention the whole RAM thing. I almost would think that it was some kind of tactic from Sony to leak out some specs and then go beyond them in the real thing.
28nm (2011) -> 20nm (2013) -> 14nm (2016) -> 10nm (2020) -> 7nm (2025) -> some new style of computing
That's the current plan for silicon transistors, and there will be stuff like FinFETs and stacking that will ease the need to shrink down further.
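To put rough numbers on why each step in that roadmap matters: in an idealized linear shrink, die area scales with the square of the node ratio. Real shrinks rarely achieve the full theoretical factor, so treat this sketch as an upper bound, not a prediction:

```python
# Idealized die-area scaling between process nodes.
# Real-world shrinks fall short of these figures because not all
# structures (e.g. analog, I/O) scale with the logic transistors.

def ideal_area_scale(old_nm: float, new_nm: float) -> float:
    """Fraction of the original die area after an ideal linear shrink."""
    return (new_nm / old_nm) ** 2

for old, new in [(28, 20), (20, 14), (14, 10)]:
    print(f"{old}nm -> {new}nm: ~{ideal_area_scale(old, new):.0%} of original area")
```

So even the single 28nm -> 20nm step would roughly halve the die area in the ideal case, which is what makes mid-generation cost reductions plausible.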
Yeah, I can't wait to see the second wave of games, really, given all these updates. SSM or ND for Christmas 2014 to pretty much steal the show. If Sony isn't screwing up on reliability and price, it's going to be awesome.
Thinking that Killzone, Knack and Infamous have been developed on a very underpowered devkit...
The highest clock speed I've seen mentioned for Jaguar cores is 1.85GHz. Can't see any reason to believe this rumour other than wishful thinking.
So would you say a die shrink within two years of launch (and accompanying price drop) wouldn't be out of the question?
Are they going to reach 7nm? :O That's insane.
Less slowdown from pretty explosions, you mean? "What this means is that, for example, in a game that's locked at 30 FPS it would reduce the amount of framerate drops below 30 FPS caused by insufficient CPU performance."
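The "locked at 30 FPS" point is easy to put in numbers. A sketch with hypothetical figures (the 1.6 GHz and 2.0 GHz clocks are the rumoured values, not confirmed specs, and it assumes a purely CPU-bound frame, which is rarely true in practice):

```python
# Frame-time budget arithmetic for a 30 FPS cap (hypothetical numbers).

budget_ms = 1000 / 30      # ~33.3 ms per frame at 30 FPS

base_clock_ghz = 1.6       # assumed original clock
boosted_clock_ghz = 2.0    # assumed boosted clock (rumoured figure)

heavy_frame_ms = 36.0      # a frame that misses the budget at the base clock
# CPU-bound work shrinks roughly in proportion to the clock increase:
boosted_frame_ms = heavy_frame_ms * base_clock_ghz / boosted_clock_ghz

print(f"budget: {budget_ms:.1f} ms, before: {heavy_frame_ms:.1f} ms, "
      f"after: {boosted_frame_ms:.1f} ms")
# 36.0 * 1.6 / 2.0 = 28.8 ms -- the same frame now fits the 30 FPS budget,
# so the drop below 30 FPS never happens.
```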
How much do Ghz matter in a computer's performance?
In modern computing it actually doesn't matter much. The number of processors is more important than the clock speed.
Never heard of Amdahl's law, I take it? "The number of processors is more important than the clock speed."
If the core voltage remains constant, it would result in a 25% power usage increase, which isn't much on a Jaguar core. "Wouldn't overclocking the CPU cause overheating?"
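The 25% figure falls out of the standard dynamic-power model for CMOS logic, P ≈ C·V²·f: at constant voltage, power scales linearly with frequency. A sketch (the 1.6 GHz -> 2.0 GHz clocks are the rumoured figures, and the 1.1 V example is an arbitrary illustration):

```python
# Dynamic power of CMOS logic: P = C * V^2 * f.
# In the ratio between two operating points the capacitance term cancels,
# so at constant voltage power scales linearly with clock frequency.

def power_ratio(f_old_ghz: float, f_new_ghz: float,
                v_old: float = 1.0, v_new: float = 1.0) -> float:
    """Relative dynamic power of the new operating point vs. the old one."""
    return (v_new ** 2 * f_new_ghz) / (v_old ** 2 * f_old_ghz)

print(f"1.6 -> 2.0 GHz, same voltage: +{power_ratio(1.6, 2.0) - 1:.0%} power")
# If a voltage bump were also needed, the penalty grows quadratically:
print(f"1.6 -> 2.0 GHz, 1.0 -> 1.1 V: +{power_ratio(1.6, 2.0, 1.0, 1.1) - 1:.0%} power")
```

This is why "overclocking without touching the voltage" is cheap thermally, while chasing clocks that require more voltage gets expensive fast.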
Yeah, I posted this in one of the numerous ps4 CPU threads a couple days ago.
I have never personally heard of ps4daily, which is why I shied away from a thread.
More power = better, but man, I worry what all these "upgrades" do to the price...
Luckily, upping the clock speed generally costs nothing.
I heard that graphene is the new holy grail in computing chips.
Especially on a Jaguar. There will be no reason to even have to adjust cooling.
Forgive my ignorance, but wouldn't overclocking the CPU cause overheating?
Scales it almost linearly.
No. The reason we are getting more parallel is that we can't get higher clock speeds. "In modern computing it actually doesn't matter much. The number of processors is more important than the clock speed."
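The Amdahl's law point can be made concrete: the speedup from adding cores is capped by the fraction of the work that is inherently serial, whereas a clock bump speeds up everything. A sketch (the 20% serial fraction is an arbitrary illustration, not a measured figure for any game):

```python
# Amdahl's law: speedup(N) = 1 / (s + (1 - s) / N),
# where s is the fraction of the work that cannot be parallelised.

def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

s = 0.2  # assume 20% of the frame's CPU work is inherently serial
for n in (1, 2, 4, 8):
    print(f"{n} cores: {amdahl_speedup(s, n):.2f}x")
# With s = 0.2 the speedup can never exceed 1/s = 5x no matter how many
# cores you add, while a 25% clock increase is a flat 1.25x on both the
# serial and parallel parts -- which is why clock speed still matters.
```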
If you are lucky. Jokes aside, getting linear or close-to-linear scaling just by increasing the frequency is hard.
This wouldn't really be an overclock. It'll be manufactured at this clock speed. It just means they could get it a bit faster than originally planned. Jaguars run extremely cool anyway. Adding the extra 4GB of GDDR5 would have added way more heat than upping the clock speed on the CPU.