
VGLeaks - Orbis GPU Detailed - compute, queues and pipelines

Lord Error

Insane For Sony
Well, the RAM thing is the only thing that's been "wrong" so far, and that was a last-minute change. VGLeaks obviously have a lot of detailed docs for PS4 (a 95-page PDF, IIRC?). Why they chose to react to rumours, I don't know.

I personally would just post everything...
The 14+4 CU thing is now assumed to be wrong as well, I think. This new GPU leak seems to supersede the old leak that outlined the 14+4 split, as there's now no indication of it; instead, the new information is the 8 ACE setup.
 

Sev

Neo Member
Pretty obviously.

Because I have a social life.

It is a good quick photomontage. The concern is that you all look more at the penis than at the GDDR5.

Apologies if I offended anyone by tweaking the work of Michelangelo.
 

artist

Banned
The 14+4 CU thing is now assumed to be wrong as well, I think. This new GPU leak seems to supersede the old leak that outlined the 14+4 split, as there's now no indication of it; instead, the new information is the 8 ACE setup.
I'm not sure what kind of docs VGLeaks have, but it seems like there is some sort of interpretation involved. It also seems like English is not their first language(?).
 

thuway

Member
Can you expand on it because it doesn't seem to be universally accepted as having any inherent or specific meaning?

This is a rib at the folks who earlier got hold of a PDF from the Durango Summit of 2012. In the PDF, there is a chart which says the original Xbox 360 GPU was 60% efficient and the Durango GPU is near 100% efficient.

For a long time, people were under the impression that the PS4 would have a vastly inefficient GPU, but this throws some of that caution to the wind. That is all.
 

tfur

Member
Because I have a social life.

It is a good quick photomontage. The concern is that you all look more at the penis than at the GDDR5.

Apologies if I offended anyone by tweaking the work of Michelangelo.

Careful playing with the hearts and minds of cut vs. uncut GAF.
 

i-Lo

Member
Given I am so totally lost, I am going to ask a few questions.

Firstly, an open-ended one:

What does it all mean for the PS4 GPU when compared to a traditional HD7850?

Closed questions:

Does this mean that with little effort developers can get to its theoretical 1.84TF?

Is the GPU customization better or worse than previously assumed?
 

thuway

Member
Given I am so totally lost, I am going to ask a few questions.

Firstly, an open-ended one:

What does it all mean for the PS4 GPU when compared to a traditional HD7850?

Closed questions:

Does this mean that with little effort developers can get to its theoretical 1.84TF?

Is the GPU customization better or worse than previously assumed?

All this means is that they are giving developers more options on the GPU at one time. If they want to make an area that is heavy on CPU work, they can do that and run code in parallel without the GPU and CPU running into each other. It's "smart" design, essentially.
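As a very rough sketch of that idea in publicly available terms (CUDA is used here purely as a stand-in; Sony's actual API is not public), the CPU queues GPU work asynchronously and keeps doing its own work instead of waiting:

```cpp
// Hypothetical illustration only -- CUDA as a stand-in for a console
// API we can't see. Kernel launches are asynchronous, so the CPU can
// keep working while the GPU chews through the queued compute.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void gpuCompute(float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = out[i] * 0.5f + 1.0f;  // placeholder GPU work
}

int main() {
    const int n = 1 << 20;
    float *d_buf;
    cudaMalloc(&d_buf, n * sizeof(float));
    cudaMemset(d_buf, 0, n * sizeof(float));

    // Queue GPU work; this call returns immediately.
    gpuCompute<<<(n + 255) / 256, 256>>>(d_buf, n);

    // Meanwhile the CPU does its own work in parallel.
    double acc = 0.0;
    for (int i = 0; i < 1000000; ++i) acc += i * 0.001;

    cudaDeviceSynchronize();  // meet up only when both are done
    printf("CPU result: %f, GPU work done\n", acc);
    cudaFree(d_buf);
    return 0;
}
```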
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
Given I am so totally lost, I am going to ask a few questions.

Firstly, an open-ended one:

What does it all mean for the PS4 GPU when compared to a traditional HD7850?

Closed questions:

Does this mean that with little effort developers can get to its theoretical 1.84TF?

Is the GPU customization better or worse than previously assumed?


IMO it looks like it is more flexible for GPGPU work (non-rendering). More compute rings, so it can kick off processing in parallel and not risk leaving CUs idle.
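To picture what eight independent queues buy you, here is a hedged sketch using CUDA streams as the nearest public analogy to the leaked 8-ACE setup (again, the PS4's real interfaces are not public): work submitted on separate queues can be scheduled concurrently instead of serializing behind one queue.

```cpp
// Sketch only: CUDA streams standing in for the 8 ACEs in the leak.
// Each queue accepts work independently, so the hardware scheduler
// can fill otherwise-idle CUs instead of serializing everything.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void busyKernel(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] = data[i] * 2.0f + 1.0f;  // arbitrary work
}

int main() {
    const int kQueues = 8;   // mirrors the 8 ACEs described in the leak
    const int n = 1 << 20;
    cudaStream_t streams[kQueues];
    float *buffers[kQueues];

    for (int q = 0; q < kQueues; ++q) {
        cudaStreamCreate(&streams[q]);
        cudaMalloc(&buffers[q], n * sizeof(float));
        cudaMemsetAsync(buffers[q], 0, n * sizeof(float), streams[q]);
        // Each launch goes to its own queue and may run concurrently.
        busyKernel<<<(n + 255) / 256, 256, 0, streams[q]>>>(buffers[q], n);
    }
    cudaDeviceSynchronize();  // wait for all queues to drain

    for (int q = 0; q < kQueues; ++q) {
        cudaFree(buffers[q]);
        cudaStreamDestroy(streams[q]);
    }
    printf("Dispatched work on %d independent queues\n", kQueues);
    return 0;
}
```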
 

USC-fan

Banned
This is a rib at the folks who earlier got hold of a PDF from the Durango Summit of 2012. In the PDF, there is a chart which says the original Xbox 360 GPU was 60% efficient and the Durango GPU is near 100% efficient.

For a long time, people were under the impression that the PS4 would have a vastly inefficient GPU, but this throws some of that caution to the wind. That is all.

Hook a brother up!
 
This is a rib at the folks who earlier got hold of a PDF from the Durango Summit of 2012. In the PDF, there is a chart which says the original Xbox 360 GPU was 60% efficient and the Durango GPU is near 100% efficient.

For a long time, people were under the impression that the PS4 would have a vastly inefficient GPU, but this throws some of that caution to the wind. That is all.

This is very good news. :)

Also, you didn't answer my PM again... :'(

Deanos [Deano Calver, Ninja Theory dev [ex?]] just commented "GCN 2.0" on B3D when he saw the VGLeaks article. :)
http://forum.beyond3d.com/showpost.php?p=1713696&postcount=630

Isn't Deano Calver "DeanoC"?

Edit 2

Yep, that's not him. That deanos posts here too.
 

artist

Banned
What does it all mean for the PS4 GPU when compared to a traditional HD7850?
As it is, the Orbis GPU is already faster than the 7850. With this info, the gap increases when it comes to compute-related tasks.

Does this mean that with little effort developers can get to its theoretical 1.84TF?
Most likely.

Is the GPU customization better or worse than previously assumed?
It is obviously a more customized design. IMO it is for the better.
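For anyone wondering where the 1.84TF figure comes from, it falls straight out of the leaked specs (18 CUs at 800MHz, with GCN's 64 lanes per CU and 2 FLOPs per lane per cycle from a fused multiply-add). Back-of-the-envelope:

```cpp
// Back-of-the-envelope peak-FLOPS arithmetic from the leaked figures.
#include <cstdio>

int main() {
    const int    cus          = 18;   // leaked CU count
    const int    lanesPerCU   = 64;   // GCN: four SIMD-16 units per CU
    const int    flopsPerLane = 2;    // fused multiply-add = 2 FLOPs/cycle
    const double clockGHz     = 0.8;  // 800 MHz

    double tflops = cus * lanesPerCU * flopsPerLane * clockGHz / 1000.0;
    printf("Theoretical peak: %.2f TFLOPS\n", tflops);  // prints 1.84
    return 0;
}
```

That number is a hardware ceiling, of course; whether developers get near it with "little effort" is exactly the efficiency question being discussed.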
 

Perkel

Banned
GCN is AMD's current GPU architecture, and 2.0 is obviously its successor. Since we have no idea what exactly this entails, throwing around the "GCN 2.0" moniker in console tech discussions is pretty meaningless IMHO.

Well, we can agree that this change is rather big for overall performance, and I don't think AMD will throw it in the trash bin.

BTW, did you notice that in that graph the CUs are divided? Those pipelines are essentially divided, with each part going to a different pool of CUs. Or am I reading something wrong?
 

nib95

Banned
So GCN 2.0 and 8GB GDDR5? Sony, you beauties! Lol, people on this forum said both of these things were basically definitely not going to happen! In a few short months the PS4 went from being half-decent hardware-wise to actually being quite potent. I know GCN 2.0 isn't the biggest leap forward, but I'll take anything on the GPU at the mo, since AMD GPU tech right now isn't the most formidable, especially against some of Nvidia's top offerings.
 

Biggzy

Member
Well, we can agree that this change is rather big for overall performance, and I don't think AMD will throw it in the trash bin.

BTW, did you notice that in that graph the CUs are divided? Those pipelines are essentially divided, with each part going to a different pool of CUs. Or am I reading something wrong?

Isn't this akin to what ATI did with the Xenos and unified shaders?
 

Perkel

Banned
So GCN 2.0 and 8GB GDDR5? Sony, you beauties! Lol, people on this forum said both of these things were basically definitely not going to happen! In a few short months the PS4 went from being half-decent hardware-wise to actually being quite potent. I know GCN 2.0 isn't the biggest leap forward, but I'll take anything on the GPU at the mo, since AMD GPU tech right now isn't the most formidable, especially against some of Nvidia's top offerings.

We do not know if it is GCN 2.0; it may just be a custom GPU for the PS4 (I doubt it).

Isn't this akin to what ATI did with the Xenos and unified shaders?

If you mean that they put in a custom design, then yes (probably becoming their new standard in the future).
 

artist

Banned
BTW, did you notice that in that graph the CUs are divided? Those pipelines are essentially divided, with each part going to a different pool of CUs. Or am I reading something wrong?
It's still labelled as a "unified" array. AMD likes to divide their CUs into two blocks in their diagrams:

[image: AMD Tahiti block diagram]
 

ekim

Member
It doesn't matter if it's called GCN2 or WTF4 - that architecture is damn impressive. I doubt any dev will be able to use all the potential power in the PS4 within the next 1-2 years.
 

RoboPlato

I'd be in the dick
All of these efficiency optimizations combined with decent raw power should make the PS4 an excellent console for devs to work with. Looking forward to seeing games running on final hardware. Should be a big jump from what we saw last week.
 

Perkel

Banned
GCN + custom improvements or GCN 2.0, either one will do nicely.

GCN+/2.0 is not some magical lamp that will transform the system. That diagram only shows that the GPU will be more efficient at doing compute.


On a side note, if they've gone this far, I wonder what else they changed.

It's still labelled as a "unified" array. AMD likes to divide their CUs into two blocks in their diagrams:

Forgot about it. Thanks.
 

nib95

Banned
GCN+/2.0 is not some magical lamp that will transform the system. That diagram only shows that the GPU will be more efficient at doing compute.


On a side note, if they've gone this far, I wonder what else they changed.

Granted. Did you read my earlier post? The 7xxx line of AMD GPUs isn't actually that strong in the grand scheme of things, especially compared to other PC GPU alternatives, so the higher the spec of the current architecture we can get, the better. Whilst the improvement might not be dramatic (see my post above), any tangible improvements we get are great in my book. I'd have preferred 8xxx cards in these things, but obviously that can't or won't happen.
 

Biggzy

Member
If you mean that they put in a custom design, then yes (probably becoming their new standard in the future).

This is exactly what I meant, and unified shaders were in their PC GPUs within a year, although that was the way forward for GPUs in general.
 

MadOdorMachine

No additional functions
Considering what they did for GameCube and Xbox 360, I think PS4 will have a very capable GPU. It's obvious Sony is putting a lot of care in this and aiming high. I expect good things.
 

artist

Banned
Considering what they did for GameCube and Xbox 360, I think PS4 will have a very capable GPU. It's obvious Sony is putting a lot of care in this and aiming high. I expect good things.
I get this feeling that Sony is very happy to have gone with AMD for the GPU design and will probably stick with them in the future (hopefully).

Yes, I admit I was wrong. Now I would like VGLeaks to explain themselves. It's ironic that they release something that confirms they were wrong about something else.
Yeah;
I'm not sure what kind of docs VGLeaks have, but it seems like there is some sort of interpretation involved. It also seems like English is not their first language(?).
 

i-Lo

Member
All this means is that they are giving developers more options on the GPU at one time. If they want to make an area that is heavy on CPU work, they can do that and run code in parallel without the GPU and CPU running into each other. It's "smart" design, essentially.

IMO it looks like it is more flexible for GPGPU work (non-rendering). More compute rings, so it can kick off processing in parallel and not risk leaving CUs idle.

As it is, the Orbis GPU is already faster than the 7850. With this info, the gap increases when it comes to compute-related tasks.


Most likely.


It is obviously a more customized design. IMO it is for the better.

Thanks guys. I learn more from you all every day.

It sounds like there's no downtime for the CUs, and as such, all of them are being used all the time, either for graphics or general-purpose stuff.

Also, unlike last time, the task switch can happen on any one or more CUs and is dependent upon the workload. This would also mean that there is no hardware divide like the earlier rumour suggested, at 14 (fixed function) + 4 (GPGPU as well as GPU-related tasks); rather, much like the UMA RAM pool, the ratio between CUs performing GPU vs GPGPU tasks is flexible and dynamic(?). Am I close to being correct? Also, is this what you call "context switching"?
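If that reading is right, a toy way to picture it (invented numbers and names; nothing public documents how the real scheduler behaves) is a single CU pool whose graphics/compute split moves with each frame's demand rather than being fixed in hardware:

```cpp
// Conceptual sketch only (invented names, not real hardware behavior):
// with a unified CU pool, the graphics/compute split can shift per
// frame with the workload, instead of being fixed at 14 + 4.
#include <cstdio>

void scheduleFrame(int frame, int graphicsJobs, int computeJobs) {
    const int kCUs = 18;
    int total = graphicsJobs + computeJobs;
    // Split the pool in proportion to this frame's demand.
    int gfxCUs = kCUs * graphicsJobs / total;
    int cmpCUs = kCUs - gfxCUs;
    printf("frame %d: %2d CUs on graphics, %2d CUs on compute\n",
           frame, gfxCUs, cmpCUs);
}

int main() {
    scheduleFrame(1, 9, 1);  // graphics-heavy scene -> 16 + 2
    scheduleFrame(2, 1, 1);  // physics-heavy scene  ->  9 + 9
    return 0;
}
```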
 

benny_a

extra source of jiggaflops
It doesn't matter if it's called GCN2 or WTF4 - that architecture is damn impressive. I doubt any dev will be able to use all the potential power in the PS4 within the next 1-2 years.
I would like to understand this in real world gaming terms.

Here is my understanding based on reading the OP of the rumor thread for this:

They said their 2 ACE setup can already saturate the x16 PCI-E bus with bandwidth.

Now, because the PS4 has an APU instead of the traditional separate CPU and GPU of a PC, you'd need to change the way this bandwidth is utilized to take advantage of the PS4's 176GB/s for more than just classical GPU tasks. Right?

If that isn't the case, why is this GDDR5 such a big deal compared to a hypothetical 8GB DDR3 with 128MB ESRAM setup?

Based on this benchmark about PCI-E bus performance in games: http://www.hardocp.com/article/2012/07/18/pci_express_20_vs_30_gpu_gaming_performance_review/
there is AT MOST a 10% benefit to the increased bandwidth with the higher PCI-E spec.

Will this change from the assumed 2 ACE to 8 ACE make a bigger impact than PCI-E 2.0 to PCI-E 3.0?

Thanks in advance!
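For reference on the raw numbers in that comparison (standard per-direction PCI-E x16 figures plus the leaked 256-bit/5.5Gbps GDDR5 setup; the arithmetic is mine, not from the leak):

```cpp
// Rough bandwidth comparison behind the question above. The GDDR5
// figure follows from a 256-bit bus at 5.5 Gbps per pin; the PCI-E
// figures are the standard per-direction x16 numbers.
#include <cstdio>

int main() {
    double gddr5 = 256.0 / 8.0 * 5.5;  // 32 bytes/transfer * 5.5 GT/s = 176 GB/s
    double pcie2 = 16 * 0.5;           // PCIe 2.0 x16: 16 lanes * 0.5 GB/s = 8 GB/s
    double pcie3 = 16 * 0.985;         // PCIe 3.0 x16: ~0.985 GB/s/lane = ~15.8 GB/s

    printf("GDDR5 unified bus: %.0f GB/s\n", gddr5);  // 176
    printf("PCIe 2.0 x16:      %.1f GB/s\n", pcie2);  // 8.0
    printf("PCIe 3.0 x16:      %.1f GB/s\n", pcie3);  // 15.8
    return 0;
}
```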
 

artist

Banned
Also, unlike last time, the task switch can happen on any one or more CUs and is dependent upon the workload. This would also mean that there is no hardware divide like the earlier rumour suggested, at 14 (fixed function) + 4 (GPGPU as well as GPU-related tasks); rather, much like the UMA RAM pool, the ratio between CUs performing GPU vs GPGPU tasks is flexible and dynamic(?). Am I close to being correct?
That is what I always felt as the right approach, had lengthy discussion with JohnnySasaki86 on this.

Also, is this what you call "context switching"?
In a way.

Quick, someone start a GCN2.0 meme.
I'm afraid I already did. Sorry :(
 
Alright, good GPU? Check. Lots of RAM with good bandwidth? Check. The only thing left to worry about is the CPU and the GDDR5 latency problem. If Sony solved these issues, then they have quite the capable system. I see the era of the PS2 coming back. :)
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
I would like to understand this in real world gaming terms.

Here is my understanding based on reading the OP of the rumor thread for this:

They said their 2 ACE setup can already saturate the x16 PCI-E bus with bandwidth.

Now, because the PS4 has an APU instead of the traditional separate CPU and GPU of a PC, you'd need to change the way this bandwidth is utilized to take advantage of the PS4's 176GB/s for more than just GPU tasks. Right?

I don't follow what you are saying. The APU (CPU+GPU) is on one bus using one memory pool (8GB GDDR5). There is no PCIe, which in a PC is used to transfer data from main memory (DDR3) to video memory (GDDR5).
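As a hedged illustration of that difference, CUDA's managed memory is about the closest public analogy to a single shared pool (the PS4's actual memory model is not public): one allocation is touched by both CPU and GPU, with no explicit copy across a PCIe bus.

```cpp
// Illustration of the point above, using CUDA managed memory as an
// analogy for a unified pool -- not the PS4's real memory model. On a
// discrete PC GPU, data would be staged across PCIe with cudaMemcpy;
// here, CPU and GPU simply touch the same allocation.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void doubleAll(int *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= 2;
}

int main() {
    const int n = 256;
    int *data;
    // One allocation visible to both CPU and GPU -- no explicit copy.
    cudaMallocManaged(&data, n * sizeof(int));

    for (int i = 0; i < n; ++i) data[i] = i;  // CPU writes
    doubleAll<<<1, 256>>>(data, n);           // GPU reads/writes same pool
    cudaDeviceSynchronize();
    printf("data[10] = %d\n", data[10]);      // CPU reads result: 20

    cudaFree(data);
    return 0;
}
```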
 

Boss Man

Member
Alright, good GPU? Check. Lots of RAM with good bandwidth? Check. The only thing left to worry about is the CPU and the GDDR5 latency problem. If Sony solved these issues, then they have quite the capable system. I see the era of the PS2 coming back. :)
It definitely seems like a capable and wisely designed system.

That's not what really made the PS2 successful though.
 

i-Lo

Member
That is what I always felt as the right approach, had lengthy discussion with JohnnySasaki86 on this.


In a way.


I'm afraid I already did. Sorry :(

Thanks. My view is that of a pure layman, and I want to understand to the best of my ability so I can stop babbling ignorant stuff.
 

THE:MILKMAN

Member
The 14+4 CU thing is now assumed to be wrong as well, I think. This new GPU leak seems to supersede the old leak that outlined the 14+4 split, as there's now no indication of it; instead, the new information is the 8 ACE setup.

It would be nice if they explained the 14+4 thing, I agree. I think these guys are Spanish and may have mixed up an example use case in the docs.

I don't want to make excuses for them, but I think they may not understand the tech docs they have and are relying a bit on places like GAF/B3D to discuss/mention things before doing their articles. I may be wrong, but the 8 ACE stuff was only mentioned on B3D yesterday(?), and now there's an article on VGLeaks...
 