
"GPU compute is the new SPU" - Will this be a big advantage for PS4 long-term?

There's a significant amount of custom hardware in the PS4's design specifically to aid compute.

Originally people thought that this would be at the expense of rendering, but it turns out this is not the case. Mark Cerny stated that there are idle shader ops regardless of how far rendering is pushed, and that these idle resources can be used for compute purposes - taking a substantial load off the CPU.

This is similar to what developers experienced last-gen with the Cell's SPUs, except in many cases it was the opposite -- they used CPU resources to aid rendering. I can see Sony's first-party studios really diving into GPU compute to get the most out of the PS4, analyzing areas where there's idle performance that could be used in various ways, just like they had to do with the Cell.
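To make the idea concrete, here is a minimal sketch of the kind of work being described: a batch of audio-occlusion ray-casts that would traditionally run on the CPU, expressed as a GPU compute kernel. The PS4 would do this through its own AMD/GNM compute path rather than CUDA; CUDA is used here only because it is a compact, runnable way to show the pattern, and every name and number in it is illustrative.

```cuda
// Hypothetical sketch: audio-occlusion ray-casts offloaded from the CPU onto GPU compute.
// The real console path would be a GCN compute shader dispatched through Sony's own APIs;
// CUDA is only a self-contained stand-in for "GPU compute" here.
#include <cstdio>
#include <cuda_runtime.h>

struct Ray    { float ox, oy, oz, dx, dy, dz; };  // listener->emitter ray (unit direction)
struct Sphere { float cx, cy, cz, r; };           // crude occluder proxy

// One thread per ray: test the ray against every occluder sphere.
__global__ void occlusionRaycast(const Ray* rays, int numRays,
                                 const Sphere* occluders, int numOccluders,
                                 int* blocked)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numRays) return;

    Ray r = rays[i];
    int hit = 0;
    for (int s = 0; s < numOccluders; ++s) {
        float lx = occluders[s].cx - r.ox;
        float ly = occluders[s].cy - r.oy;
        float lz = occluders[s].cz - r.oz;
        float t  = lx * r.dx + ly * r.dy + lz * r.dz;      // distance to closest approach
        float d2 = lx * lx + ly * ly + lz * lz - t * t;    // squared miss distance
        if (t > 0.0f && d2 < occluders[s].r * occluders[s].r) { hit = 1; break; }
    }
    blocked[i] = hit;  // the audio mixer reads this back and attenuates blocked paths
}

int main()
{
    const int numRays = 4096, numOccluders = 256;
    Ray*    rays;      cudaMallocManaged(&rays,      numRays      * sizeof(Ray));
    Sphere* occluders; cudaMallocManaged(&occluders, numOccluders * sizeof(Sphere));
    int*    blocked;   cudaMallocManaged(&blocked,   numRays      * sizeof(int));

    for (int i = 0; i < numRays; ++i)      rays[i]      = { 0.f, 0.f, 0.f, 0.f, 0.f, 1.f };
    for (int s = 0; s < numOccluders; ++s) occluders[s] = { 0.f, 0.f, 10.0f + s, 1.0f };

    occlusionRaycast<<<(numRays + 255) / 256, 256>>>(rays, numRays, occluders, numOccluders, blocked);
    cudaDeviceSynchronize();
    printf("first ray blocked: %d\n", blocked[0]);
    return 0;
}
```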

We haven't heard much about the Xbox One in terms of its GPU design, but it doesn't look like any special architecture was added in order to augment GPU compute.

While this advantage may not be seen initially, it could be a large distinguishing factor in a few years, particularly when exclusive Sony studios take advantage of it.

Sadly, I'm not sure how much third parties will really delve into this aspect of the design since they have to juggle multiple platforms.
 

Chaostar

Member
Yeah, I can't see third parties doing much with it but I'm certain Naughty Dog, Guerrilla Games, Santa Monica, Sucker Punch etc will work wonders down the road.

Waits patiently for first XXXX dev has 'maxed out' PS4 article
 
Mark Cerny:

“If you look at how the GPU and its various sub-components are utilised throughout the frame, there are many portions throughout the frame – for example during the rendering of opaque shadowmaps – that the bulk of the GPU is unused. And so if you’re doing compute for collision detection, physics or ray-casting for audio during those times you’re not really affecting the graphics. You’re utilising portions of the GPU that at that instant are otherwise under-utilised. And if you look through the frame you can see that depending on what phase it is, what portion is really available to use for compute,” he explained.


They both have it. It's in newer GPUs in general.

I realize the Xbox One has a limited form of it, but the PS4's design goes a step further with additional ACEs specifically to augment compute.
 

RoboPlato

I'd be in the dick
A lot of middleware programs seem to be adding significant compute support and I'm sure that Sony's first party will do some awesome stuff with it.
 
The extra CUs in the PS4 GPU being used similarly to the SPUs in the Cell is certainly going to give it legs long term - more so than the X1. The X1 may also be capable of GPGPU, but it's just 12 CUs; devs will be able to do more with the PS4 with a little more effort (unlike the great effort for marginal differences last gen).
 

pixlexic

Banned
Mark Cerny:

“If you look at how the GPU and its various sub-components are utilised throughout the frame, there are many portions throughout the frame – for example during the rendering of opaque shadowmaps – that the bulk of the GPU is unused. And so if you’re doing compute for collision detection, physics or ray-casting for audio during those times you’re not really affecting the graphics. You’re utilising portions of the GPU that at that instant are otherwise under-utilised. And if you look through the frame you can see that depending on what phase it is, what portion is really available to use for compute,” he explained.




I realize the Xbox One has a limited form of it, but the PS4's design goes a step further with additional ACEs specifically to augment compute.

Well, the good thing is that unlike the PS3's SPUs, all the new GPUs have compute units, so by default compute will become standard pretty quickly.
 

DieH@rd

Banned
They both have it. It's in newer GPUs in general.

Yes, they both have it, but the PS4 gives devs much higher control over it [they can hunt for appropriate cycles during general rendering work and task CUs to do compute during them], and the GPU hardware was modified to support it.
 

c0de

Member
Yes, they both have it, but the PS4 gives devs much higher control over it [they can hunt for appropriate cycles during general rendering work and task CUs to do compute during them], and the GPU hardware was modified to support it.

Did MS, the authors of DirectCompute (TressFX), say anything about what you can do with the given hardware?
 

Chaostar

Member
A lot of middleware programs seem to be adding significant compute support and I'm sure that Sony's first party will do some awesome stuff with it.

Now this, this is interesting. I never thought about it before, but since the chips in the PS4 and XB1 are (apparently) similar to chips that will eventually go into phones and tablets, it would make sense for middleware to take advantage of these strengths. This could mean that third parties can implement the console-specific advantages without much effort on their part. Does anybody have more info on this? I'm just speculating.
 
Since no one really took advantage of the SPUs outside of Sony first party, if GPU compute is the next SPU it does not look good for Sony lol
 

Spongebob

Banned
Now this, this is interesting. I never thought about it before, but since the chips in the PS4 and XB1 are (apparently) similar to chips that will eventually go into phones and tablets, it would make sense for middleware to take advantage of these strengths. This could mean that third parties can implement the console-specific advantages without much effort on their part. Does anybody have more info on this? I'm just speculating.
What?
 

Qassim

Member
SPUs were really only a thing on Sony's platform in the games industry; GPU compute is something the industry has been talking about for a while on PC, and it can be done on most platforms.

I don't really see what is stopping it from being a widely adopted practice at the moment.
 

Rolf NB

Member
SPUs are way more flexible than GPUs. But I guess it's nice to have silicon that can be used for multiple purposes, even if it's not as efficient at certain tasks, as long as it can do them all.
 

netBuff

Member
Well, the good thing is that unlike the PS3's SPUs, all the new GPUs have compute units, so by default compute will become standard pretty quickly.

The PS4 has some specific enhancements that make it "supercharged" for compute: the L2 cache has a new "volatile" tag bit to optimize utilization in conjunction with compute, and each compute pipeline can queue 64 entries compared to 2 on regular Radeon graphics cards.
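As a loose analogy for what many independent compute queues buy you, here is a small CUDA sketch: several streams each holding their own backlog of small jobs, so the scheduler can interleave them with whatever else is in flight. The PS4's ACEs and queue counts are hardware features with no direct CUDA equivalent, so treat this purely as an illustration of the queuing idea; all names and sizes below are made up.

```cuda
// Loose analogue of "many compute queues": independent CUDA streams, each holding
// its own backlog of small jobs that the GPU can pick up whenever units are free.
#include <cuda_runtime.h>

__global__ void smallJob(float* data, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] = data[i] * 0.5f + 1.0f;  // stand-in for physics/audio/AI work
}

int main()
{
    const int numQueues = 8;   // illustrative only, not the console's actual queue count
    const int n = 1 << 16;
    cudaStream_t queues[numQueues];
    float* buffers[numQueues];

    for (int q = 0; q < numQueues; ++q) {
        cudaStreamCreate(&queues[q]);
        cudaMalloc(&buffers[q], n * sizeof(float));
        cudaMemsetAsync(buffers[q], 0, n * sizeof(float), queues[q]);
    }

    // Each queue gets several independent dispatches; nothing here serialises against
    // the others, so the scheduler is free to slot them into idle GPU time.
    for (int q = 0; q < numQueues; ++q)
        for (int job = 0; job < 4; ++job)
            smallJob<<<(n + 255) / 256, 256, 0, queues[q]>>>(buffers[q], n);

    cudaDeviceSynchronize();
    for (int q = 0; q < numQueues; ++q) { cudaFree(buffers[q]); cudaStreamDestroy(queues[q]); }
    return 0;
}
```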

Did MS, the authors of DirectCompute (TressFX), say anything about what you can do with the given hardware?

Why are you emphasising the fact that Microsoft added a Compute API to DirectX this much? From the specs we know, the PS4 has a graphics chip that is better suited to general purpose computing.
 

RoboPlato

I'd be in the dick
Now this, this is interesting. I never thought about it before, but since the chips in the PS4 and XB1 are (apparently) similar to chips that will eventually go into phones and tablets, it would make sense for middleware to take advantage of these strengths. This could mean that third parties can implement the console-specific advantages without much effort on their part. Does anybody have more info on this? I'm just speculating.

I don't have much info unfortunately but I know Havok, PhysX, and several other major middleware programs have been adding substantial compute support. Havok in particular has been working closely with Sony to add PS4 optimized features.
 

Spongebob

Banned
All the consoles are capable of GPU compute, so it's not specialized knowledge that will go to waste because you can only use it on a specific system.



Their CPUs are, but I'm not sure what Chaostar is getting at.
Jaguar cores are already in some tablets. Not sure what he means by eventually...
 

c0de

Member
SPUs are way more flexible than GPUs. But I guess it's nice to have silicon that can be used for multiple purposes, even if it's not as efficient at certain tasks, as long as it can do them all.

People still need to learn where the limits of GPGPU are. But perhaps it can help for certain tasks.
 

MoneyHats

Banned
Did MS, the authors of DirectCompute (TressFX), say anything about what you can do with the given hardware?


I don't think there are any specifics out there; people are just assuming Xbox One doesn't have additional compute enhancements such as extra queues. To my understanding this is not something Sony created but what will go into the future generation of GCN, so it may well mean that both architectures are designed that way.

We already saw what happened with the 3GB-for-OS fiasco: everyone assumed Sony only used 1GB and that all 7GB were available for games even though there weren't specifics out there. When you assume you only make an ass of yourself, so I try not to. I would love to hear more about the GPU in the Xbone; hopefully we'll eventually get something.
 

c0de

Member
Why are you emphasising the fact that Microsoft added a Compute API to DirectX this much? From the specs we know, the PS4 has a graphics chip that is better suited to general purpose computing.

Better than what? Please go into more detail on what GPGPU can and can't do.
 

pixlexic

Banned
The PS4 has some specific enhancements that make it "supercharged" for compute: the L2 cache has a new "volatile" tag bit to optimize utilization in conjunction with compute, and each compute pipeline can queue 64 entries compared to 2 on regular Radeon graphics cards.



Why are you emphasising the fact that Microsoft added a Compute API to DirectX this much? From the specs we know, the PS4 has a graphics chip that is better suited to general purpose computing.

Yeah, but the question is "WILL IT BE USED LIKE SPUs?"

I was trying to say that it will be used a lot more than the SPUs because it will be a standardized feature. We should see a lot more out of it than we did out of the SPU setup, across all PS4 games.
 

Chaostar

Member

Could have worded that a lot better (hungover).

What I was basically trying to say is that if the middleware companies spend time taking advantage of the PS4 architecture then 3rd parties won't have to.

I don't have much info unfortunately but I know Havok, PhysX, and several other major middleware programs have been adding substantial compute support. Havok in particular has been working closely with Sony to add PS4 optimized features.

OK thanks anyway, I'll look into it. I remember seeing that 'blue pill' physics demo at the PS4 reveal, was that Havok?

Edit: Nevermind, it was Havok, I just googled it, relevant video here...
http://www.youtube.com/watch?v=bYDBP947ecw
 

DieH@rd

Banned
Did MS, the authors of DirectCompute (TressFX), say anything about what you can do with the given hardware?

MS created the DX API for developers to send DirectCompute code to the GPU. Everything else is done by AMD and Nvidia, who create different pipelines for such work.

The Xbone has a standard Radeon 7000-series GPU; the PS4 has an enhanced one, with more ALUs that allow a "fine grain" approach to DirectCompute. Not only can developers [and middleware] better control how their work is done when an entire CU is used for DC work, but Sony is aiming to give developers API tools to hunt for empty cycles while the CUs are doing regular game rendering work. By finding those "holes" and filling them with DC tasks, the PS4 GPU will achieve much higher efficiency. Sony motivates developers to use these techniques by providing a little more ALU than is needed. Because this won't be easy [and won't show up soon in most games], devs currently view this as the "PS3 SPU" of the PS4: it will be hard to master initially, but it will give great results.

[PS: TressFX is a DirectCompute middleware package created by AMD]
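A rough sketch of that "hunt for holes" idea, under the assumption that you can approximate it with priorities between queues: a high-priority stream stands in for rendering and a low-priority stream carries background compute that tends to run only when the GPU has spare capacity. This is a CUDA analogue for illustration, not Sony's API, and every name in it is invented.

```cuda
// Conceptual sketch of filling rendering "holes" with compute, using stream priorities
// as a stand-in for the PS4's fine-grained compute scheduling. Illustrative names only.
#include <cuda_runtime.h>

__global__ void graphicsLikePass(float* fb, int n)            // stands in for render work
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) fb[i] = sqrtf(fabsf(fb[i])) + 0.1f;
}

__global__ void backgroundPhysicsStep(float* bodies, int n)   // opportunistic compute job
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) bodies[i] += 0.016f;                            // dummy fixed-timestep update
}

int main()
{
    int leastPri, greatestPri;
    cudaDeviceGetStreamPriorityRange(&leastPri, &greatestPri);

    cudaStream_t renderQueue, computeQueue;
    cudaStreamCreateWithPriority(&renderQueue,  cudaStreamNonBlocking, greatestPri);
    cudaStreamCreateWithPriority(&computeQueue, cudaStreamNonBlocking, leastPri);

    const int n = 1 << 20;
    float *fb = nullptr, *bodies = nullptr;
    cudaMalloc(&fb, n * sizeof(float));
    cudaMalloc(&bodies, n * sizeof(float));
    cudaMemset(fb, 0, n * sizeof(float));
    cudaMemset(bodies, 0, n * sizeof(float));

    // One "frame": the render-like pass owns the GPU; the physics step is queued at low
    // priority so it soaks up whatever capacity the render pass leaves idle.
    graphicsLikePass<<<(n + 255) / 256, 256, 0, renderQueue>>>(fb, n);
    backgroundPhysicsStep<<<(n + 255) / 256, 256, 0, computeQueue>>>(bodies, n);

    cudaDeviceSynchronize();
    cudaFree(fb); cudaFree(bodies);
    cudaStreamDestroy(renderQueue); cudaStreamDestroy(computeQueue);
    return 0;
}
```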
 

c0de

Member
Yeah, but the question is "WILL IT BE USED LIKE SPUs?"

I was trying to say that it will be used a lot more than the SPUs because it will be a standardized feature. We should see a lot more out of it than we did out of the SPU setup, across all PS4 games.

We should appreciate the tech of the SPUs given the time the Cell was created. No one knew back then that GPUs would evolve like this. And yes, SPUs are way more capable than GPGPU in terms of the variety of tasks they can handle.
 

c0de

Member
MS created the DX API for developers to send DirectCompute code to the GPU. Everything else is done by AMD and Nvidia, who create different pipelines for such work.

The Xbone has a standard Radeon 7000-series GPU; the PS4 has an enhanced one, with more ALUs that allow a "fine grain" approach to DirectCompute. Not only can developers [and middleware] better control how their work is done when an entire CU is used for DC work, but Sony is aiming to give developers API tools to hunt for empty cycles while the CUs are doing regular game rendering work. By finding those "holes" and filling them with DC tasks, the PS4 GPU will achieve much higher efficiency. Sony motivates developers to use these techniques by providing a little more ALU than is needed. Because this won't be easy [and won't show up soon in most games], devs currently view this as the "PS3 SPU" of the PS4: it will be hard to master initially, but it will give great results.

[PS: TressFX is a DirectCompute middleware package created by AMD]

So much text without answering my question.
 

Spongebob

Banned
I don't think there are any specifics out there; people are just assuming Xbox One doesn't have additional compute enhancements such as extra queues. To my understanding this is not something Sony created but what will go into the future generation of GCN, so it may well mean that both architectures are designed that way.

We already saw what happened with the 3GB-for-OS fiasco: everyone assumed Sony only used 1GB and that all 7GB were available for games even though there weren't specifics out there. When you assume you only make an ass of yourself, so I try not to. I would love to hear more about the GPU in the Xbone; hopefully we'll eventually get something.
If MS also made these enhancements wouldn't they also be parading them? Especially since Sony has been mentioning them quite often..
 

MoneyHats

Banned
Compute is the future, so the simple answer to the OP... no, this will not be similar to the SPUs, because all future games on PC, Xbox One, and PS4 will be pushing towards compute, not just a single platform (PS3 with SPUs), which means it will get adopted much quicker and used across the board.

Cerny adding additional ACEs with 64 queues was basically to prepare for the future, whereas the SPUs were unique hardware not used anywhere else, so that's a major difference.
 
Since no one really took advantage of the SPUs outside of Sony first party, if GPU compute is the next SPU it does not look good for Sony lol

God knows technology discussions on GAF are usually awful cesspools of ignorance, but please, just stop. How do you even define utilization as a metric? 100% of available SPU time running user code? That's not really hard - throw on some low-priority tasks that fill all idle time with background rendering and game-assist work (audio occlusion traces, for instance). Full utilization while also minimizing latency, e.g. avoiding deferred post effects? Pulling off a render effect you can do on 360 without the SPUs but that requires 5ms across all six SPUs on PS3 (http://www.slideshare.net/nonchaotic/spu-assisted-rendering)? On and on.
 

MoneyHats

Banned
If MS also made these enhancements wouldn't they also be parading them? Especially since Sony has been mentioning them quite often..


There are several enhancements in the Xbox One that MS could be parading, but guess what? They're not, for some reason (i.e. Tiled Resources, exclusive to DX 11.2-compliant GPUs). I don't think their focus has ever been to talk detailed hardware. Just look at how unique Xenos was, yet all you heard J. Allard talk about in interviews, when questioned about the 360 being inferior to the PS3, was that the 360 was all about games and the Xbox Live experience. It wasn't till quite a bit later that we started to find out that the 360 was no push-over.
 

DieH@rd

Banned
So much text without answering my question.

OK, here: MS will NEVER discuss the hardware in the PS4, and the way things have started, they will also skip detailed talk about the Xbone GPU in the next few years because it's obviously slower and less advanced.

But hey, they will damage their throats talking about magical Move Engines, embedded ESRAM, sound hardware, Kinect sensors and similar stuff.
 

dr_rus

Member
There's a significant amount of custom hardware in the PS4's design specifically to aid compute.

Originally people thought that this would be at the expense of rendering, but it turns out this is not the case. Mark Cerny stated that there are idle shader ops regardless of how far rendering is pushed, and that these idle resources can be used for compute purposes - taking a substantial load off the CPU.
It is. If you do a compute shader on a SIMD, this SIMD can't do another shader - pixel, vertex or geometry - at the same time. What Cerny said is that the h/w is balanced in such a way that doing a compute of some kind seems like a logical choice. What he didn't explain, though, is that pixel, vertex and geometry shaders are a form of compute too. So in essence we're just talking about a higher compute ratio in general, not necessarily GPU compute. And this is a trend that is apparent to anyone who's been following GPU evolution for the last ten years. Sony hasn't done anything new here; each new GPU architecture from AMD and NV is more and more compute-oriented.
 

RoboPlato

I'd be in the dick
Wait, there are Jaguar cores in tablets right now that can do asynchronous compute? I wasn't aware of that, thanks. Guess I need to upgrade from my 'old' Nexus 7.

Asynchronous compute isn't exactly the same as GPU compute. Asynchronous compute is splitting the same task between the CPU and GPU so that the different calculations can be done most efficiently. GPU compute is using GPU resources for tasks that are typically handled by the CPU but that modern GPUs are good at.
 

c0de

Member
OK, here: MS will NEVER discuss the hardware in the PS4, and the way things have started, they will also skip detailed talk about the Xbone GPU in the next few years because it's obviously slower and less advanced.

But hey, they will damage their throats talking about magical Move Engines, embedded ESRAM, sound hardware, Kinect sensors and similar stuff.

So we don't know. Thanks for your opinion.
 

netBuff

Member
There are several enhancements in the Xbox One that MS could be parading, but guess what? They're not, for some reason (i.e. Tiled Resources, exclusive to DX 11.2-compliant GPUs). I don't think their focus has ever been to talk detailed hardware. Just look at how unique Xenos was, yet all you heard J. Allard talk about in interviews, when questioned about the 360 being inferior to the PS3, was that the 360 was all about games and the Xbox Live experience. It wasn't till quite a bit later that we started to find out that the 360 was no push-over.

Considering the Xbox has a version of the same GPU the PS4 has, this is not in any way a unique advantage. Tiled Resources are not an API feature found solely in DirectX.
 

Chaostar

Member
Why would CPU cores be doing GPU compute?

I'm not gonna pretend to know exactly how it all works but from what I gather asynchronous compute means that the CPU and GPU can work on tasks that are normally mutually exclusive. Please see the link in my previous post's edit for more info.

edit: nvm guess I misinterpreted it.
 

Spongebob

Banned
I'm not gonna pretend to know exactly how it all works but from what I gather asynchronous compute means that the CPU and GPU can work on tasks that are normally mutually exclusive. Please see the link in my previous post's edit for more info.
You're correct, I already edited my post.
 

MoneyHats

Banned
OK, here: MS will NEVER discuss the hardware in the PS4, and the way things have started, they will also skip detailed talk about the Xbone GPU in the next few years because it's obviously slower and less advanced.

But hey, they will damage their throats talking about magical Move Engines, embedded ESRAM, sound hardware, Kinect sensors and similar stuff.



Not necessarily the reason. Just look at 360 vs PS3: MS could've been parading the advantages of Xenos over RSX, but they did nothing of the sort. In every interview, when asked about the 360's capabilities, they dodged to talk about the games and the Xbox Live experience and refused to engage in technical talk; Sony, on the other hand, seemed more comfortable talking tech. So I don't see a change this time around, just more of the same, where later on we'll find out just what's under the hood of the Xbox One once Anandtech or other sites take it apart, and even more architectural detail once they interview AMD post-NDA.
 

netBuff

Member
Not necessarily the reason. Just look at 360 vs PS3: MS could've been parading the advantages of Xenos over RSX, but they did nothing of the sort. In every interview, when asked about the 360's capabilities, they dodged to talk about the games and the Xbox Live experience and refused to engage in technical talk; Sony, on the other hand, seemed more comfortable talking tech. So I don't see a change this time around, just more of the same, where later on we'll find out just what's under the hood of the Xbox One once Anandtech or other sites take it apart, and even more architectural detail once they interview AMD post-NDA.

It's pretty obviously the reason, as we've seen quite a few leaked documents.

Your claim that Microsoft wasn't publishing tech comparisons between PS3 and 360 is also false: Major Nelson posted quite a few graphs outlining the 360's superiority at the time in his blog.
 
Not necessarily the reason. Just look at 360 vs PS3: MS could've been parading the advantages of Xenos over RSX, but they did nothing of the sort. In every interview, when asked about the 360's capabilities, they dodged to talk about the games and the Xbox Live experience and refused to engage in technical talk; Sony, on the other hand, seemed more comfortable talking tech. So I don't see a change this time around, just more of the same, where later on we'll find out just what's under the hood of the Xbox One once Anandtech or other sites take it apart, and even more architectural detail once they interview AMD post-NDA.

Actually, they did a whole lot of this. Their unified shader architecture was a big talking point.
 

MoneyHats

Banned
Considering the Xbox has a version of the same GPU the PS4 has, this is not in any way a unique advantage. Tiled Resources are not an API feature found solely in DirectX.


I didn't say the PS4's GPU would not pass compliance for DX 11.2. What I'm saying is that Tiled Resources is definitely an advantage over older architectures that won't be 11.2-compliant, and they could be parading that when talking about the Xbone's capabilities in general - not everything must be PS4 vs Xbone - yet they don't talk about any of that.

On the DX 11.2 note, in addition to certain games being developed for that API (making those games easier to port to the Xbone), there is also middleware that works in conjunction with the API. For example, if you wanted to use Graphine on Xbox One to help take advantage of Tiled Resources, you would be able to because it was designed to work with DX 11.2; for the PS4, on the other hand, devs will have to work with LibGcm and do all this work manually.
 