
VGLeaks - Orbis GPU Detailed - compute, queues and pipelines

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
If you see that as bias then that is what you want to see. I am simply stating the actual possibilities: what is possible/viable and what is not. It is easy to see how the Durango CPU could have twice as many FLOPS as the PS4 CPU; all that would be needed is to implement 256-bit AVX units instead of the standard 128-bit units. What is not easy or viable is suddenly overclocking your CPU this late in the production calendar, all for a ~24 GFLOPS increase.

The bolded part is telling. Those FLOPS from the VMX are less useful than the FLOPS from a clock speed increase, unless you are going to tell me you can easily just code everything for 256-bit VMX units. The feasibility of a clock speed change is a function of power/cooling and maybe yields. It is not some insurmountable technical issue (they are not changing the SoC). I'm not claiming they changed the clock speed; I don't believe any rumor until the leak comes from two or more credible sources. You are downplaying one rumor and cheerleading another.
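For reference, here is a back-of-the-envelope check of the figures being argued over, written as a small C sketch. It assumes 8 Jaguar-class cores with a 128-bit FPU doing 8 single-precision FLOPs per cycle per core (16 per cycle if the units were hypothetically widened to 256-bit); those assumptions come from the rumors in this thread, not from confirmed specs.

#include <stdio.h>

/* Peak single-precision GFLOPS = cores * FLOPs per cycle per core * clock (GHz).
 * Assumed: 128-bit FPU -> 4 MULs + 4 ADDs = 8 FLOPs/cycle per core;
 * a hypothetical 256-bit FPU would double that to 16. */
static double peak_gflops(int cores, int flops_per_cycle, double ghz)
{
    return cores * flops_per_cycle * ghz;
}

int main(void)
{
    printf("128-bit @ 1.6 GHz: %.1f GFLOPS\n", peak_gflops(8, 8, 1.6));  /* 102.4 */
    printf("128-bit @ 2.0 GHz: %.1f GFLOPS\n", peak_gflops(8, 8, 2.0));  /* 128.0 */
    printf("256-bit @ 1.6 GHz: %.1f GFLOPS\n", peak_gflops(8, 16, 1.6)); /* 204.8 */
    return 0;
}

On those assumptions the clock bump buys roughly 25 GFLOPS (the "~24" in this thread), while doubling the SIMD width buys roughly 100, which is where the "2x flops" claim comes from.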
 

scently

Member
How so? You stated that increasing the clock of the CPU would mandate a GPU clock increase too. That doesn't seem to match the clock disparity in actual retail APUs. The GPU can still run at 800MHz, and the CPU clock can be raised (or lowered) independently.

And I acknowledged that point by replying to your post. I mean, you should look at my other post, which explains why it's not a good idea, as there are other, better ways to get more meaningful performance than just an extra 24 GFLOPS on top of the original 104 GFLOPS.
 

Piggus

Member
You try to keep the GPU and CPU in sync. So if the 800MHz GCN 12 CU Durango GPU is correct, the CPU should be 1.6GHz, because it's a clean 2x multiple of the GPU clock speed.

Eh? Typically that applies to aspects of the memory, not the GPU clock speed. On a PC there's nothing wrong with overclocking the CPU but not the GPU.
 

scently

Member
The bolded part is telling. Those FLOPS from the VMX are less useful than the FLOPS from a clock speed increase, unless you are going to tell me you can easily just code everything for 256-bit VMX units. The feasibility of a clock speed change is a function of power/cooling and maybe yields. It is not some insurmountable technical issue (they are not changing the SoC). I'm not claiming they changed the clock speed; I don't believe any rumor until the leak comes from two or more credible sources. You are downplaying one rumor and cheerleading another.

We have no proper details of the Durango CPU, nor its FLOPS rating; even the leaks on VGLeaks do not call them Jaguar cores. If you still want to imply that I am cheering or whatever, then go right ahead. I won't argue with you on that.
 

onQ123

Member
If you see that as bias then that is what you want to see. I am simply stating the actual possibilities: what is possible/viable and what is not. It is easy to see how the Durango CPU could have twice as many FLOPS as the PS4 CPU; all that would be needed is to implement 256-bit AVX units instead of the standard 128-bit units. What is not easy or viable is suddenly overclocking your CPU this late in the production calendar, all for a ~24 GFLOPS increase.

Maybe they changed the clock speed of the CPU/GPU to go along with the change from 192GB/s GDDR5 RAM to 176GB/s GDDR5 RAM.


Did the clock rate change when they made this change?
 

CLEEK

Member
And I acknowledged that point by replying to your post. I mean, you should look at my other post, which explains why it's not a good idea, as there are other, better ways to get more meaningful performance than just an extra 24 GFLOPS on top of the original 104 GFLOPS.

OK, I understand. I just thought you were saying that it wasn't technically possible.

*puts on Obi Wan voice*

It all depends on your point of view. Rather than thinking of it as a clock increase, 2GHz might always have been their goal. The 1.6GHz speeds came from dev kit specs and old rumours. Like with the GDDR increase, they might always have planned for this, but had to be cautious and wait to see whether it could realistically be manufactured and still hit their launch date and yields.

I would assume that with a 2013 launch date in mind, they would have to be producing final silicon right now (albeit with tiny yields). They might just now be seeing that 2GHz is a viable option, rather than frantically upping the clock at the last minute to eke out more performance.

Maybe they changed the clock speed of the CPU/GPU to go along with the change from 192GB/s GDDR5 RAM to 176GB/s GDDR5 RAM.

From what I've read, the move to 8GB would require 4Gb 1.35V GDDR5 modules, which run at the lower bandwidth. So in doubling the RAM size, by the very nature of the components needed to achieve it, they took the slight drop in bandwidth. It's unrelated to the clock speed of the CPU or GPU.
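For what it's worth, here is the capacity arithmetic behind that, as a quick C sketch. It assumes the rumored 256-bit bus built from sixteen 32-bit GDDR5 chips; the 2Gbit figure for the earlier 4GB configuration is an assumption for illustration.

#include <stdio.h>

int main(void)
{
    /* Assumed layout: 16 chips x 32-bit interface each = 256-bit bus. */
    const int chips = 16;

    printf("16 x 2 Gbit chips = %d GB\n", chips * 2 / 8);  /* 4 GB */
    printf("16 x 4 Gbit chips = %d GB\n", chips * 4 / 8);  /* 8 GB */
    return 0;
}

So hitting 8GB without widening the bus means moving to the denser 4Gbit parts, which, per the rumor, are only rated for the lower data rate.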
 

scently

Member
Maybe they changed the clock speed of the CPU/GPU to go along with the change from 192GB/s GDDR5 RAM to 176GB/s GDDR5 RAM.


Did the clock rate change when they made this change?

The decrease in rumored bandwidth was a result of a change in the effective speed of the GDDR5 from 6GHz to 5.5GHz.
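That lines up with the bandwidth math, assuming the rumored 256-bit bus; the "GHz" figures quoted for GDDR5 are effective per-pin data rates in Gbit/s. A minimal C sketch:

#include <stdio.h>

/* Peak bandwidth (GB/s) = bus width (bits) * per-pin data rate (Gbit/s) / 8 */
static double gddr5_bw_gbs(int bus_bits, double gbps_per_pin)
{
    return bus_bits * gbps_per_pin / 8.0;
}

int main(void)
{
    printf("256-bit @ 6.0 Gbps: %.0f GB/s\n", gddr5_bw_gbs(256, 6.0)); /* 192 */
    printf("256-bit @ 5.5 Gbps: %.0f GB/s\n", gddr5_bw_gbs(256, 5.5)); /* 176 */
    return 0;
}

The 192 to 176 GB/s drop falls straight out of the 6.0 to 5.5 Gbit/s change; no CPU or GPU clock is involved.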
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
We have no proper details of the Durango CPU, nor its FLOPS rating; even the leaks on VGLeaks do not call them Jaguar cores. If you still want to imply that I am cheering or whatever, then go right ahead. I won't argue with you on that.

But additional custom silicon is more feasible than a bump to clock speed?

We also need to stop with the "2x flops" line; it is misleading. The VMX is not a CPU, it is a small unit that is good at specific math operations, like SSE or MMX.
 

scently

Member
But additional custom silicon is more feasible than a bump to clock speed?

We also need to stop with the "2x flops" line; it is misleading. The VMX is not a CPU, it is a small unit that is good at specific math operations, like SSE or MMX.

A 256-bit AVX unit is not custom silicon, it is a standard part of the x86 architecture, and I never said VMX, as that is a PPC architecture feature set.
 

KidBeta

Junior Member
A 256-bit AVX unit is not custom silicon, it is a standard part of the x86 architecture, and I never said VMX, as that is a PPC architecture feature set.

This may be true, but it is not standard for AMD CPUs yet, therefore it would be custom.
 

scently

Member
This may be true, but it is not standard for AMD CPUs yet, therefore it would be custom.

It would be a custom CPU, not particularly custom silicon. Anyway, the CPU might be the same, I'm just making conversation. If MS wanted 256-bit AVX units for whatever reason, then AMD would put them there, unless you believe that AMD is not planning to introduce 256-bit AVX units even in their future products.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
A 256-bit AVX unit is not custom silicon, it is a standard part of the x86 architecture, and I never said VMX, as that is a PPC architecture feature set.

It is not custom to Jaguar, which the APU is based on. VMX, AVX, dumb acronyms.
 

KidBeta

Junior Member
It is not custom to Jaguar, which the APU is based on. VMX, AVX, dumb acronyms.

I was not aware that Jaguar was 256-bit already. If it's not, then it's custom silicon, because whatever AMD has it in now (if anything) needs to be adapted to Jaguar.
 

McHuj

Member
I was not aware that Jaguar was 256-bit already. If it's not, then it's custom silicon, because whatever AMD has it in now (if anything) needs to be adapted to Jaguar.

It can run 256-bit AVX instructions through two passes of the 128-bit FP units. The FP datapath isn't 256-bit.
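To illustrate the distinction with generic x86 code (nothing console-specific): the 256-bit instructions exist in the ISA either way; on a core with 128-bit FP hardware they are simply executed in two halves, so the instruction count drops but the peak FLOP rate does not automatically double. A minimal sketch using AVX intrinsics (compile with -mavx):

#include <immintrin.h>
#include <stdio.h>

int main(void)
{
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    float c[8];

    /* SSE: one 128-bit add covers 4 floats. */
    _mm_storeu_ps(c, _mm_add_ps(_mm_loadu_ps(a), _mm_loadu_ps(b)));

    /* AVX: one 256-bit add covers 8 floats. On a 128-bit FPU this is
     * cracked into two 128-bit operations internally, so it is not
     * inherently twice as fast. */
    _mm256_storeu_ps(c, _mm256_add_ps(_mm256_loadu_ps(a), _mm256_loadu_ps(b)));

    for (int i = 0; i < 8; i++)
        printf("%.0f ", c[i]);   /* prints 9 eight times */
    printf("\n");
    return 0;
}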
 

KidBeta

Junior Member
It can run 256-bit AVX instructions through two passes of the 128-bit FP units. The FP datapath isn't 256-bit.

Yeah, but even so, to get double the peak SIMD FLOPS it would have to double the width of the 128-bit FPU units AND the data paths.
 

McHuj

Member
Yeah, but even so, to get double the peak SIMD FLOPS it would have to double the width of the 128-bit FPU units AND the data paths.

Yes, that's correct. And my guess is the cache in the current design couldn't support that increased data rate either.

IMO, doubling the number of FLOPS in a CPU would be a significant undertaking. Not saying it's impossible, and MS has the money to do it.
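To put rough numbers on the data-path point, here is a small C sketch of what a streaming multiply-add kernel (y[i] = a*x[i] + y[i]) would demand at full 256-bit rate; it makes no claims about Jaguar's actual cache or register-file widths.

#include <stdio.h>

int main(void)
{
    /* One 256-bit iteration: 8 muls + 8 adds = 16 FLOPs, fed by two
     * 32-byte loads (x[i], y[i]) and one 32-byte store (y[i]). */
    const int vec_bytes = 32;
    const int flops = 16;
    const int load_bytes = 2 * vec_bytes;
    const int store_bytes = vec_bytes;

    printf("Per iteration: %d FLOPs, %d B loaded, %d B stored\n",
           flops, load_bytes, store_bytes);
    return 0;
}

Halving the vector width to 128-bit halves all three numbers, which is why widening the FPU without also widening the load/store paths and caches would not deliver the full 2x in practice.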
 

jimbobb

Neo Member
I'd say they were always aiming for 8GB, but it required clearing a whole bunch of hurdles before being possible, even at the dev kit level. Remember, Sony said their decision with the RAM was based on developer feedback. And Crytek were publicly giving that feedback up to two years ago.

Ya, I'm sure they were always gunning for 8GB given the historical ~16x console RAM increase. I like to think (again, I don't know a goddamn thing) that both Sony and Microsoft made two different (but very similar) prototypes, and after all of the rumors are heard and the corporate espionage is done, they will both release a console that will blow us away.
I've heard the rumors about Killzone running on less than 8GB of GDDR5 and I'd like to believe it, but I don't. I know it will be tits by release, though!
 

sangreal

Member
I honestly cannot tell if this is sarcastic or not.

[image: Efcsinjl.jpg]
 

Fredrik

Member
Nope. It's just that the Durango CPU might be ~25% more powerful than the PS4's CPU. Durango's GPU and RAM are still quite a bit weaker.
Still interesting if true; it would be the opposite of the current gen, with MS on top this time, but hopefully without the issues of getting the CPU fully optimized that there were with Cell.
 

artist

Banned
Stop the Presses!

If they increased the CPU clock speed, wouldn't they have to increase the GPU clock speed too?

You try to keep the GPU and CPU in sync. So if the 800MHz GCN 12 CU Durango GPU is correct, the CPU should be 1.6GHz, because it's a clean 2x multiple of the GPU clock speed.
Nope;


And so would the PS4 be. The fact that we already have the FLOP rating of the PS4 GPU suggests the clock speed of the CPU. So if you are increasing the clock speed of the CPU to 2.0GHz then you will increase the clock speed of the GPU to 1.0GHz too, and it hasn't changed, going by the official specs published by Sony. So this is quite a rubbish rumor, I think.
The only rubbish thing here is your assertion: the FLOP rating of the GPU suggests the clock speed of the CPU? :lol
 

sangreal

Member
Nope;




The only rubbish thing here is your assertion: the FLOP rating of the GPU suggests the clock speed of the CPU? :lol

It's no more rubbish than the other assertions you replied to. If you increase the clock rate of the GPU to match an increased CPU clock (which we have established may not be necessary, so your post doesn't add anything here either), then it would change the FLOP rating of the GPU. While it may be based on a faulty assumption, it's pretty simple math.
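For anyone following along, the "simple math" in question, as a short C sketch. It uses the CU counts from the VGLeaks rumors (18 CUs for Orbis, 12 for Durango) and the standard GCN figure of 64 lanes per CU doing a multiply-add (2 FLOPs) per cycle.

#include <stdio.h>

/* Peak GFLOPS = CUs * 64 lanes * 2 FLOPs (multiply-add) * clock (GHz) */
static double gcn_gflops(int cus, double ghz)
{
    return cus * 64 * 2 * ghz;
}

int main(void)
{
    printf("18 CU @ 0.8 GHz: %.1f GFLOPS\n", gcn_gflops(18, 0.8)); /* 1843.2 */
    printf("18 CU @ 1.0 GHz: %.1f GFLOPS\n", gcn_gflops(18, 1.0)); /* 2304.0 */
    printf("12 CU @ 0.8 GHz: %.1f GFLOPS\n", gcn_gflops(12, 0.8)); /* 1228.8 */
    return 0;
}

If the GPU clock had moved, the published ~1.84 TFLOPS figure would have moved with it; that is the whole extent of the inference.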
 

artist

Banned
It's no more rubbish than the other assertions you replied to. If you increase the clock rate of the GPU to match an increased CPU clock (which we have established may not be necessary), then it would change the FLOP rating of the GPU. While it may be based on a faulty assumption, it's pretty simple math.
I like how you discount that.
 

mrklaw

MrArseFace
And I acknowledged that point by replying to your post. I mean, you should look at my other post, which explains why it's not a good idea, as there are other, better ways to get more meaningful performance than just an extra 24 GFLOPS on top of the original 104 GFLOPS.


There are other ways, but MS seems to be throwing silicon at the problem: ESRAM, extra units in the CPU. This is going to be a massive chip.
 
I'm guessing Sony didn't give a GHz rating on the CPU because if Microsoft see that Sony are running their version of the Jaguar @2GHz whilst Microsoft are only running their version of the same CPU @1.6GHz, then Microsoft would simply bump up their clock rates to match Sony.

When two competitors are essentially using the same hardware, every little advantage counts.
 

mrklaw

MrArseFace
I'm guessing Sony didn't give a GHz rating on the CPU because if Microsoft see that Sony are running their version of the Jaguar @2GHz whilst Microsoft are only running their version of the same CPU @1.6GHz, then Microsoft would simply bump up their clock rates to match Sony.

When two competitors are essentially using the same hardware, every little advantage counts.

MS might not be able to. Depending on how much space the extra CUs take up vs. Durango's ESRAM, Sony's chip might be smaller and easier to cool, giving them more headroom to increase clock speeds.
 

scently

Member
Nope;




The only rubbish thing here is your assertion: the FLOP rating of the GPU suggests the clock speed of the CPU? :lol

Maybe you should read through what we were saying before jumping in. I didn't even suggest that, I was just making a reply, but if you think a snarky comment is better then sure, go right ahead.
 

scently

Member
There are other ways, but MS seems to be throwing silicon at the problem. Esram, extra units in the CPU. This is going to be a massive chip

I wasn't talking about MS. I was saying that increasing the clock speed for such a small gain, when it is easier to activate a disabled core/CU, does not seem like the right thing to do.
 

Razgreez

Member
Maybe you should read through what we were saying before jumping in. I didn't even suggest that, I was just making a reply, but if you think a snarky comment is better then sure, go right ahead.

You literally typed

The fact that we already have the FLOP rating of the PS4 GPU suggests the clock speed of the CPU. So this is quite a rubbish rumor, I think.

So yes, you did literally suggest that, and you were the first to begin rubbishing. Sometimes it's just better to contain oneself before getting defensive.
 

scently

Member
You literally typed



So yes you did literally suggest that and were the first to begin rubbishing. Sometimes it's just better to contain oneself before going defensive

Yes, I said that, and that was a mistake of mine, but it was based on a suggestion that was made in the thread. Do read through it and you will see what I mean.

Look, I will never claim to know everything, and if I make a mistake I am happy to be corrected, but doing it through snarky or condescending comments is just wrong. But I think I will take your advice and just keep to myself or post elsewhere.
 
I bet they won't up the clocks. That would mean more heat, and that is something they definitely don't want.

Jaguar cores use very little power; I doubt a 25% increase in their clocks would raise the TDP much. And if the clocks do get increased, it's probably something that's been planned for a long time; it's not something that would be done on a whim.
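As a very rough sanity check of that, dynamic power scales roughly with C*V^2*f, so at a fixed voltage a clock bump scales power about linearly. A minimal C sketch; the ~10W baseline for the 8-core CPU block is purely an illustrative guess, not a leaked figure.

#include <stdio.h>

int main(void)
{
    /* Dynamic power ~ C * V^2 * f: at constant voltage, power scales
     * roughly linearly with clock. Baseline wattage is an assumption. */
    const double base_watts = 10.0;
    const double base_ghz = 1.6;
    const double new_ghz = 2.0;

    printf("Estimated CPU power at %.1f GHz: ~%.1f W\n",
           new_ghz, base_watts * (new_ghz / base_ghz)); /* ~12.5 W */
    return 0;
}

If a voltage bump were needed to reach the higher clock, the V^2 term would make it worse than linear, but the CPU block would still likely be small next to the GPU's share of the power budget.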
 
And so would the PS4 be. The fact that we already have the FLOP rating of the PS4 GPU suggests the clock speed of the CPU. So if you are increasing the clock speed of the CPU to 2.0GHz then you will increase the clock speed of the GPU to 1.0GHz too, and it hasn't changed, going by the official specs published by Sony. So this is quite a rubbish rumor, I think.

Why do you have to increase the GPU clock speed if you're increasing the CPU clock speed? They're both different values already. I don't see why they're not independent of each other.
 

Omeyocan

Member
What are the benefits of having a UMA architecture (the one in the PS4) versus the one in the next Xbox for AA and AF effects?
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
What are the benefits of having a UMA architecture (the one in the PS4) versus the one in the next Xbox for AA and AF effects?

Both the 360 and Durango are UMA; maybe that isn't what you are asking, though.
 

artist

Banned
Maybe you should read through what we were saying before jumping in. I didn't even suggest that, I was just making a reply, but if you think a snarky comment is better then sure, go right ahead.
I read through your post and the whole discussion.

You were entirely wrong when you said that the CPU clock was known because we have a FLOP rating for the GPU. I also responded to the post you were replying to, which was also based on a wrong assumption.
 