
VGLeaks - Orbis GPU Detailed - compute, queues and pipelines

Cerny was specifically talking about the gpu.

If it were running at 2GHz then the system would have at least 2 TFLOPS, and that would have been said at the conference instead of "nearly".

Actually, doing the math according to the equation above (8 x 2.0 x 8), you get 128 GFLOPS. So the clock increase only adds about 26 GFLOPS, which would STILL put the total just under 2 TFLOPS of performance. I thought it'd be more than this too.
 

Biggzy

Member
OK thanks, and what's the formula for getting these numbers? I just made a random guess.

CPU: 8 (single precision ops: 4 ADD + 4 MUL) * 1.6 GHz (clock) * 8 (cores) = 102.4 Gigaflops.

This one, mate. You just change the clock from 1.6 to 2.
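In code, that formula is just multiplication. A quick sketch (the 8 FLOPs/cycle figure assumes Jaguar's 4 ADDs + 4 MULs per core per cycle, per the quote above; the 2.0GHz figure is only the rumor being discussed):

```python
def peak_gflops(flops_per_cycle, clock_ghz, cores):
    # theoretical peak = per-core FLOPs/cycle * clock (GHz) * core count
    return flops_per_cycle * clock_ghz * cores

print(peak_gflops(8, 1.6, 8))  # 102.4 GFLOPS at 1.6GHz
print(peak_gflops(8, 2.0, 8))  # 128.0 GFLOPS at the rumored 2.0GHz
```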
 

onQ123

Member
Stop the Presses!

If they increased the CPU clock speed, wouldn't they have to increase the GPU clock speed too?
 

Lord Error

Insane For Sony
Why would I need to do that? It's an off-shoot of a mobile line of CPUs; it's not even comparable to your usual 2013 desktop CPU, let alone the high-end ones (which one would describe as 'beasts').
It's a weaker core, but there's an uncommonly large number of them. The idea is that they'd finally be used fully, as opposed to using just one or two cores of something like an i5 or i7, as the majority of games seem to do now. What's the FLOP rate of an i5 or i7 anyway?
 
Ooooooh! So that's one of the reasons the GPU number is always so much higher!! It's not apples to apples... what do double precision and single precision actually mean?

Let me dust off the computer science part of my brain.

Single and double precision are two terms used to talk about the "bit size" of information.

Single is 32 bits of data.
Read from left to right, it looks something like this.

SEEEEEEEEDDDDDDDDDDDDDDDDDDDDDDD

That is 32 bits of data.

S= Sign bit... tells you if the number is - or +
E= Exponent
D= Fraction (the mantissa)

Double precision is basically the same thing but with 64-bits of info:
1 S
11 E's
52 D's
This allows higher "precision," meaning a more accurate result during calculations.

So in the end you have Single and Double. Single provides less precision but takes up less memory, while Double is more precise and takes up more. Single is often used for GPU tasks... graphics and such, because things won't suddenly break when approximated.

Up until recently (maybe...a decade ago?) CPU's have exclusively done Double precision while GPU's were only capable of single precision calculations. Now that we have GPGPU's and things like CUDA, we can now do things like double precision on GPU's. They used to have to be done on CPU's... that's why this gen you see more focus on a beefier GPU instead of CPU.

Anyway, I feel like I went way too in depth, but I hope this helped.

Also, anything capable of double can do single.
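The layout above is easy to check with a short sketch (a hypothetical helper using Python's struct module; big-endian packing so the sign bit comes first):

```python
import struct

def float_bits(x, fmt=">f"):
    """Bit string of a float packed as IEEE 754 (">f" = 32-bit, ">d" = 64-bit)."""
    return "".join(f"{b:08b}" for b in struct.pack(fmt, x))

bits = float_bits(-2.5)
# 1 sign bit, 8 exponent bits, 23 fraction bits = 32 bits total
print(bits[0], bits[1:9], bits[9:])   # 1 10000000 01000000000000000000000
print(len(float_bits(-2.5, ">d")))    # 64 bits for double precision
```

The leading 1 is the sign bit (the number is negative), just as the S/E/D diagram describes.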
 

Durante

Member
Up until recently (maybe...a decade ago?) CPU's have exclusively done Double precision while GPU's were only capable of single precision calculations. Now that we have GPGPU's and things like CUDA, we can now do things like physics (double precision) on GPU's. They used to have to be done on CPU's... that's why this gen you see more focus on a beefier GPU instead of CPU.
Physics (as in game physics, which is what I think we're talking about here) are mostly single precision as well. The reason you see more of it on GPUs doesn't really have to do with double or single, and more with the increased programmability and flexibility of GPUs in general.
 
Physics (as in game physics, which is what I think we're talking about here) are mostly single precision as well. The reason you see more of it on GPUs doesn't really have to do with double or single, and more with the increased programmability and flexibility of GPUs in general.

Ahh, ok. Thanks.

Well, of all the detail I had, I only got one minor detail wrong =P
 

Durante

Member
It's a weaker core, but there's an uncommonly large number of them. The idea is that they'd finally be used fully, as opposed to using just one or two cores of something like an i5 or i7, as the majority of games seem to do now. What's the FLOP rate of an i5 or i7 anyway?
Flop rates aren't everything, otherwise no one would need a CPU. But to answer your question, a desktop quad-core i5 does about 220 GFlops. More importantly, it absolutely and decisively kills Jaguar (and, to be fair, pretty much every other processor microarchitecture out there) in things like ILP, single-threaded performance or branch prediction.
 

Biggzy

Member
Flop rates aren't everything, otherwise no one would need a CPU. But to answer your question, a desktop quad-core i5 does about 220 GFlops. More importantly, it absolutely and decisively kills Jaguar (and, to be fair, pretty much every other processor microarchitecture out there) in things like ILP, single-threaded performance or branch prediction.

Intel have just all round smashed it with their i5 and i7 processors; they are absolute beasts.
 
Not sure I'd like to play a game of poker with Sony, underclocking the cpu for the dev kits, and keeping the large bump in ram secret. Those cards up the sleeve!
 
Ooooooh! So that's one of the reasons the GPU number is always so much higher!! It's not apples to apples... what do double precision and single precision actually mean?

Single precision has lower accuracy than double precision.

Single precision gives about 7 significant decimal digits.
Double precision gives about 15-16 significant decimal digits.

It has been a while so I'm not sure about the exact figures.
But what it means is that by using double precision for your calculations you get a more precise answer.
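The difference is easy to see in a quick sketch (round-tripping a value through 32-bit storage with Python's struct module; the digit counts are approximate):

```python
import struct

def to_float32(x):
    # store a Python float (64-bit) in 32 bits, then read it back
    return struct.unpack("f", struct.pack("f", x))[0]

third32 = to_float32(1 / 3)  # single precision
third64 = 1 / 3              # double precision

print(third32)  # matches 1/3 to roughly 7 significant digits
print(third64)  # matches 1/3 to roughly 16 significant digits
```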
 
128 GFlops then.

Actually not much difference in real world performance terms.

25% is still a significant amount. Don't just look at the numbers; I guess since CPUs are measured in double precision the FLOPS read much lower but are more precise. It will definitely help the CPU performance.

I'm really not trying to rub this in any Xbox fans' faces, but I wanted to point this out. If the rumors are true that Durango's CPU has around ~200 GFLOPS from upgrading the VMX units (and it also didn't get a clock upgrade), this makes the CPU power difference less significant. You're looking at a ~25% increase over the PS4 CPU once you account for the fact that Durango is reserving two cores, or ~50 GFLOPS, for the OS. 25% is still significant (as I pointed out above), but compared to 100%, not so much... combine this with the fact that the PS4 has a much more powerful GPU enhanced for compute. This is all assuming the 2GHz clock upgrade is legit too.

Not sure I'd like to play a game of poker with Sony, underclocking the cpu for the dev kits, and keeping the large bump in ram secret. Those cards up the sleeve!

Yes, they play a dirty game indeed! LOL

Honestly though, if this was their strategy they would have wanted to announce their system after MS.
 

MarkV

Member
Flop rates aren't everything, otherwise no one would need a CPU. But to answer your question, a desktop quad-core i5 does about 220 GFlops. More importantly, it absolutely and decisively kills Jaguar (and, to be fair, pretty much every other processor microarchitecture out there) in things like ILP, single-threaded performance or branch prediction.

Where do you get those numbers? I'm confused now:
http://download.intel.com/support/processors/corei5/sb/core_i5-3500_d.pdf
Also, are we talking about DP or SP FLOPS comparison between PS4 CPU and i5 (102.4 vs xxx)?
 

scently

Member
25% is still a significant amount. Don't just look at the numbers; I guess since CPUs are measured in double precision the FLOPS read much lower but are more precise. It will definitely help the CPU performance.

I'm really not trying to rub this in any Xbox fans' faces, but I wanted to point this out. If the rumors are true that Durango's CPU has around ~200 GFLOPS from upgrading the VMX units (and it also didn't get a clock upgrade), this makes the CPU power difference less significant. You're looking at a ~25% increase over the PS4 CPU once you account for the fact that Durango is reserving two cores, or ~50 GFLOPS, for the OS. 25% is still significant (as I pointed out above), but compared to 100%, not so much... combine this with the fact that the PS4 has a much more powerful GPU enhanced for compute. This is all assuming the 2GHz clock upgrade is legit too.



Yes, they play a dirty game indeed! LOL

Honestly though, if this was their strategy they would have wanted to announce their system after MS.

You really should not believe these clock increase rumors. These consoles are designed with a cooling system in mind, which has been tested at a particular clock speed. If anything, clock speeds mostly get downgraded. But if you believe they can increase the clock speed, then the same can be said for Durango too, right? The rumor is that Durango has double the CPU FLOPS of the PS4. Their respective clock speeds weren't stated. Besides, Durango isn't officially announced yet, and the specs we know are around a year old.
 
You really should not believe these clock increase rumors. These consoles are designed with a cooling system in mind, which has been tested at a particular clock speed. If anything, clock speeds mostly get downgraded. But if you believe they can increase the clock speed, then the same can be said for Durango too, right? The rumor is that Durango has double the CPU FLOPS of the PS4. Their respective clock speeds weren't stated. Besides, Durango isn't officially announced yet, and the specs we know are around a year old.

Except someone on Beyond3D already suggested there was an increase to the clock numbers. Not saying it's fact, but it's worth looking at. There looks to be some sort of credence to it.

edit: Yes, Durango's clock speed could easily be upgraded, but it seems like they went a different (better) route and doubled the VMX units. Which also somewhat confirms the theory that they're reserving a good portion of the CPU processing power for the OS. MS's architecture and strategy all logically back each other up (it being more of an entertainment machine than just a gaming machine).

I don't remember anyone saying that it was "double the CPU FLOPS of the PS4"; the rumor was that it was double the FLOPS compared to a normal Jaguar CPU.

Exactly. Even more specifically, I believe the rumor was that what they did was double the VMX units, which was something they did with the 360 (probably when they realized that the PS3's CPU was going to be a processing beast?).
 

onQ123

Member
You really should not believe these clock increase rumors. These consoles are designed with a cooling system in mind, which has been tested at a particular clock speed. If anything, clock speeds mostly get downgraded. But if you believe they can increase the clock speed, then the same can be said for Durango too, right? The rumor is that Durango has double the CPU FLOPS of the PS4. Their respective clock speeds weren't stated. Besides, Durango isn't officially announced yet, and the specs we know are around a year old.

I don't remember anyone saying that it was "double the CPU FLOPS of the PS4"; the rumor was that it was double the FLOPS compared to a normal Jaguar CPU.
 

i-Lo

Member
You really should not believe these clock increase rumors. These consoles are designed with a cooling system in mind, which has been tested at a particular clock speed. If anything, clock speeds mostly get downgraded. But if you believe they can increase the clock speed, then the same can be said for Durango too, right? The rumor is that Durango has double the CPU FLOPS of the PS4. Their respective clock speeds weren't stated. Besides, Durango isn't officially announced yet, and the specs we know are around a year old.

I love the bias in this post. Being sceptical of one rumour based on certainty of another.

Yes, PS4's CPU probably got downgraded to 999.999MHz.
 

CLEEK

Member
I was dubious of the increased clock of the CPU, as I was sure that Sony had explicitly stated 1.6GHz. But on their official spec sheet PDF, they say:

Single-chip custom processor
CPU : x86-64 AMD “Jaguar”, 8 cores
GPU : 1.84 TFLOPS, AMD next-generation Radeon™ based graphics engine

It could be telling that they didn't give either the clock or the performance of the CPU, but were happy to state the performance of the GPU.

So maybe they were still working out the maximum clock they could get decent yields at. Good news if true (unless it makes the cooling inefficient and the console melts after 10 minutes of use).
 

KidBeta

Junior Member
You really should not believe these clock increase rumors. These consoles are designed with a cooling system in mind, which has been tested at a particular clock speed. If anything, clock speeds mostly get downgraded. But if you believe they can increase the clock speed, then the same can be said for Durango too, right? The rumor is that Durango has double the CPU FLOPS of the PS4. Their respective clock speeds weren't stated. Besides, Durango isn't officially announced yet, and the specs we know are around a year old.

You try to keep the GPU and CPU in sync. So if the 800MHz GCN 12 CU Durango GPU is correct, the CPU should be 1.6GHz, because it's a clean multiple (exactly double) of the GPU clock speed.
 

KidBeta

Junior Member
Except someone on Beyond3D already suggested there was an increase to the clock numbers. Not saying it's fact, but it's worth looking at. There looks to be some sort of credence to it.

edit: Yes, Durango's clock speed could easily be upgraded, but it seems like they went a different (better) route and doubled the VMX units. Which also somewhat confirms the theory that they're reserving a good portion of the CPU processing power for the OS. MS's architecture and strategy all logically back each other up (it being more of an entertainment machine than just a gaming machine).



Exactly. Even more specifically, I believe the rumor was that what they did was double the VMX units, which was something they did with the 360 (probably when they realized that the PS3's CPU was going to be a processing beast?).

VMX does not exist in x86, they would be looking at AVX units.
 
So the 8 Jaguar cores are not as weak as people are trying to make them out to be?

This is what I tried to figure out by posting that Jaguar spec thread the other day. There was a lot of info in there that I didn't understand, and it seemed not many other people did either. It seems Jaguar offers an 18% performance increase over Bobcat though, among many other things like having a 128-bit wide FPU datapath vs 64-bit, and being able to handle a lot more EX and AGU scheduler entries (whatever that means! still don't have those answers). Here's the link to the thread btw. There's a large number of PDF files that go into much more detail as well.

http://www.neogaf.com/forum/showthread.php?t=515548&highlight=

VMX does not exist in x86, they would be looking at AVX units.

K, sorry, that was probably a simple mix-up.
 

CLEEK

Member
You try to keep the GPU and CPU in sync. So if the 800MHz GCN 12 CU Durango GPU is correct, the CPU should be 1.6GHz, because it's a clean multiple (exactly double) of the GPU clock speed.

I thought this was a requirement in an APU, but looking at the AMD site, it doesn't seem to matter if the CPU and GPU clocks are not clean multiples of each other.

For example, the current top-end A10 has a GPU clocked at 800MHz and a CPU clocked in the range of 3.8GHz to 4.2GHz (a multiple of 4.75 to 5.25).

http://www.amd.com/US/PRODUCTS/DESKTOP/APU/MAINSTREAM/Pages/mainstream.aspx#7
 

onQ123

Member
The funny thing about a lot of the info that was leaked to people about the new hardware is that they see the PR talk comparing the hardware to other hardware, then assume that the other console is using normal off-the-shelf hardware.
 

scently

Member
You try to keep the GPU and CPU in sync. So if the 800MHz GCN 12 CU Durango GPU is correct, the CPU should be 1.6GHz, because it's a clean multiple (exactly double) of the GPU clock speed.

And so would the PS4 be, too. The fact that we already have the FLOP rating of the PS4 GPU suggests the clock speed of the CPU. So if you were increasing the clock speed of the CPU to 2.0GHz, then you would increase the clock speed of the GPU to 1.0GHz too, and it hasn't been, going by the official specs published by Sony. So this is quite a rubbish rumor, I think.
 

RoboPlato

I'd be in the dick
I was dubious of the increased clock of the CPU, as I was sure that Sony had explicitly stated 1.6GHz. But on their official spec sheet PDF, they say:



It could be telling that they didn't give either the clock or the performance of the CPU, but were happy to state the performance of the GPU.

So maybe they were still working out the maximum clock they could get decent yields at. Good news if true (unless it makes the cooling inefficient and the console melts after 10 minutes of use).

I'm pretty sure the clock speed is going to be more than 1.6GHz, and I have thought that for a while. AMD and Sony were both willing to throw out numbers for the GPU and RAM but not the CPU. I don't expect it to be too big of an increase, but I don't think 2GHz is out of the question. I thought I heard someone over at B3D saying 1.8GHz, which sounds reasonable to me.

Both consoles are keeping audio rendering off of the CPU so that will also clear up a good amount of processing power.
 

KidBeta

Junior Member
And so would the PS4 be, too. The fact that we already have the FLOP rating of the PS4 GPU suggests the clock speed of the CPU. So if you were increasing the clock speed of the CPU to 2.0GHz, then you would increase the clock speed of the GPU to 1.0GHz too, and it hasn't been, going by the official specs published by Sony. So this is quite a rubbish rumor, I think.

That's my opinion too. Both of the rumours for Orbis/Durango saying they have a GPU/CPU combo running at 800MHz/1.6GHz is pretty telling IMO; the yield/performance sweet spot must be there.
 

scently

Member
I love the bias in this post. Being sceptical of one rumour based on certainty of another.

Yes, PS4's CPU probably got dowgraded to 999.999MHz.

If you see that as bias then that is what you want to see. I am simply commenting on the actual possibilities: what is possible/viable and what is not. It is easy to see how the Durango CPU could have twice as many FLOPS as the PS4 CPU; all that would be needed is to implement 256-bit AVX units as opposed to the standard 128-bit units. What is not easy or viable is suddenly overclocking your CPU so late in the production calendar, all for a ~26 GFLOPS increase.
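That doubling falls straight out of the peak-FLOPS arithmetic. A sketch (assuming one SIMD ADD plus one SIMD MUL per cycle over 32-bit single-precision lanes, which is how the 128-bit Jaguar figure in this thread is derived):

```python
def peak_gflops(cores, clock_ghz, simd_bits):
    lanes = simd_bits // 32               # 32-bit single-precision lanes
    return cores * clock_ghz * lanes * 2  # 2 = one ADD + one MUL per cycle

print(peak_gflops(8, 1.6, 128))  # 102.4 GFLOPS with standard 128-bit units
print(peak_gflops(8, 1.6, 256))  # 204.8 GFLOPS with 256-bit units (the ~200 rumor)
```

Note the clock stays at 1.6GHz in both cases; only the SIMD width changes.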
 

RoboPlato

I'd be in the dick
And so would the PS4 be, too. The fact that we already have the FLOP rating of the PS4 GPU suggests the clock speed of the CPU. So if you were increasing the clock speed of the CPU to 2.0GHz, then you would increase the clock speed of the GPU to 1.0GHz too, and it hasn't been, going by the official specs published by Sony. So this is quite a rubbish rumor, I think.

This is interesting. I had no idea that this was something to consider, I just assumed that the CPU and GPU were completely separate. Ignore my last post then (outside of the audio rendering part). I love learning new things like this, part of why I enjoy tech threads a lot.
 

Nachtmaer

Member
I'm pretty sure the clock speed is going to be more than 1.6 GHz and I have thought that for a while. AMD and Sony were both willing to throw out numbers for GPU and RAM but not CPU. I don't expect it to be too big of an increase but I don't think that 2Ghz is out of the question. I thought I heard someone over at B3D was saying 1.8GHz which sounds reasonable to me.

That's what I've been thinking as well. I have a feeling the 1.6GHz figure came from the fact that Kabini uses that as its frequency, according to earlier info. If Kabini is able to keep everything under 18W (including the GPU) even when the CPU can boost up to 2.4GHz, I'm sure a frequency bump this small won't break the bank on power draw.
 

scently

Member
That's my opinion too. Both of the rumours for Orbis/Durango saying they have a GPU/CPU combo running at 800MHz/1.6GHz is pretty telling IMO; the yield/performance sweet spot must be there.

Indeed. It would be easier and better to enable a disabled core/CU than to overclock your CPU, which would mess up your cooling design/system integrity, not to mention the higher thermal/power needs, all to get ~26 more CPU GFLOPS just to one-up MS. lol

BTW, for those thinking I have some sort of agenda here: I actually am getting both consoles at launch; I am just discussing the technical merits/demerits of doing these things. It really is not a black-and-white world. I don't have to hate Sony and like MS; it actually is possible to like both products, strange as it might seem.
 

CLEEK

Member
And so would the PS4 be, too. The fact that we already have the FLOP rating of the PS4 GPU suggests the clock speed of the CPU. So if you were increasing the clock speed of the CPU to 2.0GHz, then you would increase the clock speed of the GPU to 1.0GHz too, and it hasn't been, going by the official specs published by Sony. So this is quite a rubbish rumor, I think.

See my post a few above yours. The current AMD APUs have CPU and GPUs clocks well out of sync with each other.

http://www.neogaf.com/forum/showpost.php?p=48378164&postcount=438
 

CLEEK

Member
Yeah, I can see that. But you should look at my post above. It explains why I think it's a very unlikely thing to do.

How so? You stated that increasing the clock of the CPU would mandate a GPU clock increase too. That doesn't seem to match the clock disparity in actual retail APUs. The GPU can still run at 800MHz, and the CPU clock can be raised (or lowered) independently.
 