Stop the Presses!
if they increased the CPU clock speed wouldn't they have to increase the GPU clock speed too?
Nope. But it would be nice if they did.
If you see that as bias, then that is what you want to see. I am simply stating the actual possibilities: what is possible/viable and what is not. It is easy to see how the Durango CPU could have twice the FLOPS of the PS4 CPU; all that would be needed is to implement 256-bit AVX units instead of the standard 128-bit units. What is not easy or viable is suddenly overclocking your CPU this late in the production calendar, all for a ~24 GFLOPS increase.
How so? You stated that increasing the clock of the CPU would mandate a GPU clock increase too. That doesn't seem to match the clock disparity in actual retail APUs. The GPU can still run at 800MHz, and the CPU clock can be raised (or lowered) independently.
You try to keep the GPU and CPU in sync. So if the 800MHz GCN 12 CU Durango GPU is correct, the CPU should be 1.6GHz, because it's a clean 2x multiple of the GPU clock speed.
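Under the rumored numbers, the "clean multiple" claim is just integer-ratio arithmetic. A quick sketch (every figure here comes from the rumors, not confirmed specs):

```python
# Rumored clocks only -- nothing here is confirmed hardware.
gpu_clock_mhz = 800    # rumored Durango GPU clock
cpu_clock_mhz = 1600   # CPU clock that keeps a clean multiple

# 1600 / 800 is an exact multiple, i.e. the clocks stay "in sync"
print(cpu_clock_mhz / gpu_clock_mhz)   # 2.0

# A 2.0 GHz CPU against the same 800 MHz GPU breaks the clean multiple:
print(2000 / gpu_clock_mhz)            # 2.5
```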
The bolded part is telling. Those FLOPS from the VMX are less useful than the FLOPS from a clock speed increase, unless you are going to tell me you can easily code everything for 256-bit VMX units. The feasibility of a clock speed change is a function of power/cooling and maybe yields; it is not some insurmountable technical issue (they are not changing the SoC). I'm not claiming they changed the clock speed; I don't believe any rumor until the leak comes from two or more credible sources. You are downplaying one rumor and cheerleading another.
And I acknowledged that point by replying to your post. I mean, you should look at my other post, which explains why it's not a good idea: there are better ways to get more meaningful performance than just an extra ~24 GFLOPS on top of the original ~104 GFLOPS.
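For what it's worth, the "~24 extra" figure falls out of simple arithmetic if you assume 8 Jaguar cores with 128-bit SIMD doing 8 single-precision FLOPs per cycle per core; every number here is an assumption taken from the rumors, not a confirmed spec:

```python
cores = 8              # rumored core count
flops_per_cycle = 8    # assumed: 128-bit SIMD, 4 SP lanes x (mul + add)

base   = cores * 1.6 * flops_per_cycle   # GFLOPS at the rumored 1.6 GHz
bumped = cores * 2.0 * flops_per_cycle   # GFLOPS after a 2.0 GHz bump

print(base, bumped, round(bumped - base, 1))   # 102.4 128.0 25.6
```

which is roughly the "extra 24 on top of 104" being argued about.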
Maybe they changed the clock speed of the CPU/GPU to go along with the change from 192GB/s GDDR5 RAM to 176GB/s GDDR5 RAM.
Did the clock rate change when they made this change?
We have no proper details of the Durango CPU, nor its FLOPS rating; even the leaks on VGLeaks do not call them Jaguar cores. If you still want to imply that I am cheerleading or whatever, then go right ahead. I won't argue with you on that.
But additional custom silicon is more feasible than a bump to clock speed?
We also need to stop with the "2x FLOPS"; it is misleading. The VMX is not a CPU; it is a small unit good at specific math operations, like SSE or MMX.
A 256-bit AVX unit is not custom silicon; it is part of the x86 architecture standard. And I never said VMX, as that is a PPC architecture feature set.
This may be true, but it is not standard for AMD CPUs yet; therefore it would be custom.
It is not custom to Jaguar, which the APU is based on. VMX, AVX, dumb acronyms.
I was not aware that Jaguar was 256-bit already. If it's not, then it is custom silicon, because whatever AMD core has it now (if any) would need to be adapted to Jaguar.
Do the math.
More money and hits for lensoftruth.com then. I predict it'll be like this gen: one version will have (worse) framerate drops and/or lower resolution and/or missing effects. And in several cases people will notice, and it will affect sales.
It can run 256-bit AVX instructions through two passes of the 128-bit FP units. The FP datapath isn't 256-bit.
Yeah, but even so, to get double the peak SIMD FLOPS it would have to double the width of the 128-bit FPU units AND the datapaths.
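The point can be sketched as a peak-FLOPS formula: peak throughput is bounded by the physical datapath width, so decoding 256-bit AVX into two 128-bit passes leaves peak FLOPS unchanged. The core count, clock, and per-cycle throughput below are assumptions from the rumors, not confirmed specs:

```python
def peak_gflops(cores, ghz, datapath_bits):
    """Peak single-precision GFLOPS, assuming each core retires one
    multiply and one add per 32-bit lane per cycle. The ISA can expose
    256-bit AVX, but peak is set by the physical datapath width."""
    lanes = datapath_bits // 32
    return cores * ghz * lanes * 2

print(peak_gflops(8, 1.6, 128))   # 102.4 -- 128-bit units, AVX double-pumped
print(peak_gflops(8, 1.6, 256))   # 204.8 -- units AND datapaths doubled
```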
I'd say they were always aiming for 8GB, but it required clearing a whole bunch of hurdles first, even at the dev kit level. Remember, Sony said their decision on the RAM was based on developer feedback, and Crytek were publicly giving that feedback up to two years ago.
Dumb acronyms indeed. lol if you say so it must be true.
Wait, so what's currently the latest rumor: that the latest Durango specs make it better than PS4?
Still interesting if true; it would be the opposite of the current gen, with MS at the top this time, but hopefully without the issues of getting the CPU fully optimized, as there were with Cell.
Nope. They're just saying the Durango CPU might be ~25% more powerful than the PS4's CPU. Durango's GPU and RAM are still quite a bit weaker.
Nope. The only rubbish thing here is your assertion: the FLOP rating of the GPU suggests the clock speed of the CPU? :lol
I like how you discount that. It's no more rubbish than the other assertions you replied to. If you increase the clock rate of the GPU to match an increased CPU clock (which, as we have established, may not be necessary), then the FLOP rating of the GPU changes too. It may rest on a faulty assumption, but it's pretty simple math.
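That "simple math" for a rumored 12-CU GCN part would look like this; the CU count and clocks are rumor, while the 64 shaders per CU and 2 FLOPs per shader per clock are standard GCN figures:

```python
cus = 12                 # rumored Durango CU count
shaders_per_cu = 64      # standard for GCN
flops_per_shader = 2     # one fused multiply-add per cycle

def gpu_gflops(clock_mhz):
    return cus * shaders_per_cu * flops_per_shader * clock_mhz / 1000

print(gpu_gflops(800))    # 1228.8 at the rumored 800 MHz
print(gpu_gflops(1000))   # 1536.0 if the GPU clock rose with a 2.0 GHz CPU
```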
I'm guessing Sony didn't give a GHz rating for the CPU because if Microsoft saw that Sony were running their version of Jaguar at 2GHz while Microsoft were only running their version of the same CPU at 1.6GHz, then Microsoft would simply bump up their clock rates to match Sony.
When two competitors are essentially using the same hardware, every little advantage counts.
I guess the tech talk is over. Go refuel at B3D...
There are other ways, but MS seems to be throwing silicon at the problem: ESRAM, extra units in the CPU. This is going to be a massive chip.
Maybe you should read through what we were saying before jumping in. I didn't even suggest that, just making a reply, but if you think a snarky comment is better then sure, go right ahead.
You literally typed:
"The fact that we already have the FLOP rating of the PS4 GPU suggests the clock speed of the CPU. So this is quite a rubbish rumor I think."
So yes, you did literally suggest that, and you were the first to start rubbishing. Sometimes it's just better to contain oneself before getting defensive.
So the 8 Jaguar cores are not as weak as people are trying to make them out to be?
I bet they won't up the clocks. That would mean more heat, and that is something they definitely don't want.
MS might not be able to. Depending on how much space the extra CUs take up vs Durango's ESRAM, Sony's chip might be smaller and easier to cool, giving them more headroom to increase clock speeds.
And so would the PS4 be. The fact that we already have the FLOP rating of the PS4 GPU suggests the clock speed of the CPU. So if you increase the clock speed of the CPU to 2.0GHz, then you would increase the clock speed of the GPU to 1.0GHz too, and that hasn't happened, going by the official specs published by Sony. So this is quite a rubbish rumor, I think.
Maybe, but why would Sony take the risk?
What are the benefits of a UMA architecture (the one in the PS4) versus the one in the next Xbox for AA and AF effects?
I read through your post and the whole discussion.