
EuroGamer: More details on the BALANCE of XB1

ekim

Member
I'm having trouble imagining why you would PM someone a link to the information from Sony that the extra 4 CU's are less significant than originally suspected

Why can't you just post a link like a normal person?

I'm not trying to be aggressive/offensive. I just can't think of any logical reason why you couldn't post a link short of it being from a banned site

Because it is neither a link nor public information.
 

Bossofman

Neo Member
Even if you stop at 14 CU's for Graphical tasks & only use the other 4 CU's for compute that's still 400GFLOPS of extra compute power that you are trying to ignore.

that 400GFLOPS of compute could make every game on the PS4 better than the same game on the Xbox One.

True, but I don't think you can compare 400Gflops of GPGPU 'Power' to 400Gflops of REAL CPU power; the CPU is much more efficient at doing its own things.
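(For context on where the ~400GFLOPS number comes from: it's just back-of-envelope GCN arithmetic, assuming the commonly reported 800MHz PS4 GPU clock and 64 ALUs per CU.)

```python
# Rough peak-FLOPS estimate for 4 GCN compute units.
# Assumes an 800 MHz clock and 64 ALUs per CU; an FMA counts as 2 FLOPs.
cus = 4
alus_per_cu = 64
flops_per_cycle = 2
clock_ghz = 0.8

gflops = cus * alus_per_cu * flops_per_cycle * clock_ghz
print(gflops)  # 409.6 -> the ~400 GFLOPS figure quoted above
```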
 
Even if you stop at 14 CU's for Graphical tasks & only use the other 4 CU's for compute that's still 400GFLOPS of extra compute power that you are trying to ignore.

that 400GFLOPS of compute could make every game on the PS4 better than the same game on the Xbox One.

I agree with you, and, no, I'm definitely not ignoring them. Please read my posts carefully. I'm saying Sony saw this and made what they felt was the best possible recommendation for how devs should use those remaining resources, not that they were useless because of this fact. They said that there is a significant drop-off in the benefit of the extra ALU resources after 14 CUs for graphics rendering specifically, not that there is no benefit at all, just that the benefit is not as great after that mark if you intend on using them for graphics rendering, and then they recommend using the remaining four CUs for compute.

I myself have said many times that I expect the PS4's compute advantage to be the biggest advantage that the console holds over the Xbox One time and time again, and it will make games better. People have to pay attention to what I say more often :p Those resources won't go to waste, but clearly sony felt that, for the sake of balance (shit I hate using that word now lol), they would recommend a 14 and 4 split between Graphics and Compute, but that doesn't mean you can't do 16 + 2 or 17 + 1 etc etc. It just sounds like they're saying if your benefit of extra CU for graphics is so low, why waste it on that, when you can use it on awesome compute stuff? That's how I see it anyway.

I'm having trouble imagining why you would PM someone a link to the information from Sony that the extra 4 CU's are less significant than originally suspected

Why can't you just post a link like a normal person?

I'm not trying to be aggressive/offensive. I just can't think of any logical reason why you couldn't post a link short of it being from a banned site

Only saying it now so I don't have to repeat it. I'm not posting a link, because it isn't a link. It's just information that I think proves what I'm saying is true beyond doubt, but it also carries the risk of getting someone in trouble or violating their trust, which is why I can't just say everything, and why immediately after I said it, I said I would have no issue sharing the information with a mod if it made people more comfortable that I'm being transparent and honest. I've known about it for a while, but it became much more interesting once MS publicly confirmed their testing for 14 CUs and said how the CPU was a limiting factor to their games. It's also a fair assumption the architects for both systems knew or were learning quite a bit about what was going on in the other project through various channels, so I don't find it completely insignificant that MS mentioned what Sony stated to devs about their GPU and their recommended best usage of it.

edit:: Anytime I get info from a site, I post it 100% of the time. In fact, it's a habit I have that some find annoying at times :p
 

KidBeta

Junior Member
I agree with you, and, no, I'm definitely not ignoring them. Please read my posts carefully. I'm saying Sony saw this and made what they felt was the best possible recommendation for how devs should use those remaining resources, not that they were useless because of this fact. They said that there is a significant drop-off in the benefit of the extra ALU resources after 14 CUs for graphics rendering specifically, not that there is no benefit at all, just that the benefit is not as great after that mark if you intend on using them for graphics rendering, and then they recommend using the remaining four CUs for compute.

I myself have said many times that I expect the PS4's compute advantage to be the biggest advantage that the console holds over the Xbox One time and time again, and it will make games better. People have to pay attention to what I say more often :p Those resources won't go to waste, but clearly sony felt that, for the sake of balance (shit I hate using that word now lol), they would recommend a 14 and 4 split between Graphics and Compute, but that doesn't mean you can't do 16 + 2 or 17 + 1 etc etc. It just sounds like they're saying if your benefit of extra CU for graphics is so low, why waste it on that, when you can use it on awesome compute stuff? That's how I see it anyway.

You're aware compute can be used for graphics, right?
 
It has zilch to do with the CPU.
Thanks for confirming. It seemed an odd comment to throw in there... :/
If we're talking about 'balance' this is actually a function of two things, the hardware and the software. (Talking about hardware being balanced independent of a specific piece or pieces of software is nonsense really)

Your hardware will present resources in a certain ratio. A certain amount of instruction throughput to a certain amount of memory throughput.

An algorithm will have a ratio of demands on instruction throughput and memory throughput.

If these ratios align, the hardware is well 'balanced' for the software or vice versa.

If they're not well aligned, you could have demand on one resource holding back the potential of the other resource to reach its peak on this algorithm.

The whole '14 CUs' thing is a Sony suggestion that a 'typical game', today, its graphics-only pipe will align well to a ratio of 14 CUs to 32 ROPs to 176GB/s of bandwidth etc. etc. Thus the heavy encouragement to mix in GPGPU tasks because you'll probably be able to do that without impacting your render pipeline much or at all.

But this is not something prescriptive. A game may well scale linearly against a higher number of CUs vs these other resources. It's up to the game's pipe entirely.

And it's not that render perf won't improve at all with more CUs in this 'typical case' (whatever that is). If you want to look at DF's benchmarks of some current games aimed at isolating the impact of greater CU performance, you still got an average 25% gain or whatever out of 50% more ALU.

Throwing a higher ratio of ALU than software today might typically align with isn't done for kicks, it's a future proofing thing that has some benefit today, and more tomorrow if software shapes itself against higher ratios of ALU to other resources. And the beauty of a console is that software does tend to shape itself against the hardware.
Thanks, your posts are always really straightforward to follow. I can totally get that part of it is future proofing for if devs do want to do more with GPU compute. But it's also pretty clear that a dev can use those resources for whatever they want and Cerny has already stated that there is no evangelising - that it will be used for whatever a specific game intends.

So I'm curious where exactly this idea that there's massive diminishing returns beyond 14 CUs, that makes it pointless to use them for graphics, i.e. the bolded that's being suggested now by some posters keeps coming from... and why exactly Sony would suggest this at a developer conference.

And why exactly this would specifically impact the PS4... as opposed to desktop GPUs which show performance gains with more compute units.
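To make the quoted 'ratio of demands' point a bit more concrete, here's a minimal roofline-style sketch. The 1.84 TFLOPS and 176GB/s figures are the PS4 numbers mentioned in this thread; the per-kernel FLOPs-per-byte values are made up purely for illustration.

```python
# Toy 'balance' check: does a kernel's compute-to-bandwidth demand exceed
# what the hardware can supply? Hardware figures from the thread; the kernel
# numbers passed in below are illustrative only.
def limiting_resource(kernel_flops_per_byte, peak_gflops=1843.0, peak_gbps=176.0):
    hw_flops_per_byte = peak_gflops / peak_gbps  # ~10.5 FLOPs per byte moved
    if kernel_flops_per_byte >= hw_flops_per_byte:
        return "ALU-bound: extra CUs would keep helping"
    return "bandwidth/other-bound: extra CUs sit partly idle"

print(limiting_resource(4.0))   # low arithmetic intensity -> bandwidth-bound
print(limiting_resource(20.0))  # high arithmetic intensity -> ALU-bound
```

If the ratios align, the hardware is 'balanced' for that workload; if they don't, one resource caps the other, which is the whole argument for mixing in compute work with a different demand profile.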
 

onQ123

Member
True, but I don't think you can compare 400Gflops of GPGPU 'Power' to 400Gflops of REAL CPU power; the CPU is much more efficient at doing its own things.

Who said anything about comparing it to CPU flops? The point is it would still be 400GFLOPS left over to do computing tasks.
 
Some people like Senjutsu miss the point in such a glaring way; sometimes you gotta double read the post to take it all in.

First and foremost, Xbox One has 12 CUs. So even if PS4 only used 14 CUs for graphics, there would be an advantage already. Second, Sony gave an example and people ran with it, but developers can split the CU usage any way they want. Third, and this is probably the one that might feel like a kick in the teeth, if a game uses 4 CUs for GPGPU (physics for example) that's going to impact the visuals and the interactivity of the game.

So even if we limit it to physics and such, the situation doesn't get any better for the Xbox One... a 14+4 split in a multiplat game translates to what on the Xbox One? Do they simply eliminate the more demanding GPGPU functionality, or downgrade the graphics further?

A complete dead end.
 

KidBeta

Junior Member
Thanks for confirming. It seemed an odd comment to throw in there... :/
Thanks, your posts are always really straightforward to follow. I can totally get that part of it is future proofing for if devs do want to do more with GPU compute. But it's also pretty clear that a dev can use those resources for whatever they want and Cerny has already stated that there is no evangelising - that it will be used for whatever a specific game intends.

So I'm curious where exactly this idea that there's massive diminishing returns beyond 14 CUs, that makes it pointless to use them for graphics, i.e. the bolded that's being suggested now by some posters keeps coming from... and why exactly Sony would suggest this at a developer conference.

And why exactly this would specifically impact the PS4... as opposed to desktop GPUs.

The only thing I can think of is that the FF GFX hardware might be a limiting factor but doing GPGPU compute preprocessing or new algorithms / effects get around that.
 

Bossofman

Neo Member
Who said anything about comparing it to CPU flops? The point is it would still be 400GFLOPS left over to do computing tasks.

Because the way you're saying it, why even have a CPU if part of a giant GPU can just act as one? It doesn't work that way; the GPGPU can only do certain things a CPU can, and if the things needed are among the things it can't do well or at all, then that power might go to waste.
 

omonimo

Banned
Stop it Leadbetter, your defence force for the Xbone is starting to be embarrassing now. I'd just like to recall that KI, a beat 'em up, is 720p to reach 60 fps in multiplayer, jeez, with the magic power of the Cloud; KZ multiplayer is 60 fps at 1080p with the 'mortal' PSN Plus. 200 GB, lol, can't believe this bullshit continues to be quoted in this article, it's pure MS propaganda.
 
Oh right, now SenjutsuSage (of all people) has some "secret insider information" about Sony hardware that he cannot post publicly for "reasons" and which somehow describes a bottleneck which isn't in the specs and which magically reduces the performance beyond 14 CUs, while AMD has plans to release cards with up to 44 compute units.
 

RoboPlato

I'd be in the dick
It's suspiciously interesting that the MS PR specifically mentioned the GPU overclock was better than running with 14 CUs...

No it isn't. That's the max their chip has and they were looking into opening up the two deactivated CUs.

Gemüsepizza;83311589 said:
Oh right, now SenjutsuSage (of all people) has some "secret insider information" about Sony hardware that he can not post publicly for "reasons" and which somehow describes a bottleneck which isn't in the specs which magically reduces the performance beyond 14 CUs, while AMD will release cards with up to 44 compute units.

Yep. I'm real interested to see what his info is from, especially since the place he gets most of his info from seems to be a Microsoft developer.

Why are we talking about 14+4 CUs again? I thought that was debunked or at least clarified.

Completely debunked by Cerny himself in an interview.
 

statham

Member
Stop it Leadbetter, your defence force for the Xbone is starting to be embarrassing now. I'd just like to recall that KI, a beat 'em up, is 720p to reach 60 fps in multiplayer, jeez, with the magic power of the Cloud; KZ multiplayer is 60 fps at 1080p with the 'mortal' PSN Plus. 200 GB, lol, can't believe this bullshit continues to be quoted in this article, it's pure MS propaganda.
needed to be quoted for lololol
 

panda-zebra

Banned
the 14+4 was only ever a rumour I think. The only thing I've ever heard Sony say publicly was Mark Cerny saying that they have 'a little more ALU' in them than you would normally have.

Yes, that's from the face-to-face with Cerny by DF: http://www.eurogamer.net/articles/digitalfoundry-face-to-face-with-mark-cerny

Digital Foundry said:
Going back to GPU compute for a moment, I wouldn't call it a rumour - it was more than that. There was a recommendation - a suggestion? - for 14 cores [GPU compute units] allocated to visuals and four to GPU compute...

Mark Cerny said:
That comes from a leak and is not any form of formal evangelisation. The point is the hardware is intentionally not 100 per cent round. It has a little bit more ALU in it than it would if you were thinking strictly about graphics. As a result of that you have an opportunity, you could say an incentivisation, to use that ALU for GPGPU.
 

PSGames

Junior Member
Thanks for confirming. It seemed an odd comment to throw in there... :/
Thanks, your posts are always really straightforward to follow. I can totally get that part of it is future proofing for if devs do want to do more with GPU compute. But it's also pretty clear that a dev can use those resources for whatever they want and Cerny has already stated that there is no evangelising - that it will be used for whatever a specific game intends.

So I'm curious where exactly this idea that there's massive diminishing returns beyond 14 CUs, that makes it pointless to use them for graphics, i.e. the bolded that's being suggested now by some posters keeps coming from... and why exactly Sony would suggest this at a developer conference.

And why exactly this would specifically impact the PS4... as opposed to desktop GPUs.

Well if what Gofreak said is true then it does impact desktop GPUs. He stated benchmarks of comparable GPUs with 50% more CUs only received a 25% gain in performance. Sony is saying it might be better to use those extra CUs for compute work because you will gain more performance that way than by just using them on graphics.

Argumentation based on info that only he knows, and he won't share a simple link.

Here you go. When the PS4 specs were first leaked : http://www.vgleaks.com/world-exclusive-orbis-unveiled-2/

That had to have come from somewhere.
 

onQ123

Member
I agree with you, and, no, I'm definitely not ignoring them. Please read my posts carefully. I'm saying Sony saw this and made what they felt was the best possible recommendation for how devs should use those remaining resources, not that they were useless because of this fact. They said that there is a significant drop-off in the benefit of the extra ALU resources after 14 CUs for graphics rendering specifically, not that there is no benefit at all, just that the benefit is not as great after that mark if you intend on using them for graphics rendering, and then they recommend using the remaining four CUs for compute.

I myself have said many times that I expect the PS4's compute advantage to be the biggest advantage that the console holds over the Xbox One time and time again, and it will make games better. People have to pay attention to what I say more often :p Those resources won't go to waste, but clearly sony felt that, for the sake of balance (shit I hate using that word now lol), they would recommend a 14 and 4 split between Graphics and Compute, but that doesn't mean you can't do 16 + 2 or 17 + 1 etc etc. It just sounds like they're saying if your benefit of extra CU for graphics is so low, why waste it on that, when you can use it on awesome compute stuff? That's how I see it anyway.

Basically what Sony was doing was making a case for using GPGPU Compute.

14 CU's used for fixed function graphics vs 18 CU's for fixed function graphics will get you the same game that will just look a little better, but using 14 CU's for fixed function graphics & the other 4 for compute can change the game by a lot by giving you better physics & other simulations.
 
He's claiming to have insider information.

Problem is, even if he has insider info, does he understand it, or does the person that told him understand it?
It's hard to take some info at face value with so much spinning and lack of understanding going on.

EDIT: it also matters how old the info is. Cerny already explained the 14 + 4 as an example of something you can do if you want.
 

Guymelef

Member

Even the press release from February:
http://www.scei.co.jp/corporate/release/130221a_e.html
The Graphics Processing Unit (GPU) has been enhanced in a number of ways, principally to allow for easier use of the GPU for general purpose computing (GPGPU) such as physics simulation. The GPU contains a unified array of 18 compute units, which collectively generate 1.84 Teraflops of processing power that can freely be applied to graphics, simulation tasks, or some mixture of the two.
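(As a sanity check, that 1.84 Teraflops figure is the same back-of-envelope arithmetic applied to the full 18-CU array, again assuming the commonly reported 800MHz clock.)

```python
# Peak FP32 throughput of the full 18-CU array (assumes 800 MHz, 64 ALUs/CU, FMA = 2 FLOPs)
tflops = 18 * 64 * 2 * 0.8 / 1000
print(tflops)  # 1.8432 -> the 1.84 TFLOPS in the press release
```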
 

RoboPlato

I'd be in the dick
Well if what Gofreak said is true then it does impact desktop GPUs. He stated benchmarks of comparable GPUs with 50% more CUs only received a 25% gain in performance. Sony is saying it might be better to use those extra CUs for compute work because you will gain more performance that way than by just using them on graphics.

Framerate performance has never scaled linearly with CUs and Flops because they are not the only GPU factors determining framerate.
 
Some people like Senjutsu miss the point in such a glaring way; sometimes you gotta double read the post to take it all in.

First and foremost, Xbox One has 12 CUs. So even if PS4 only used 14 CUs for graphics, there would be an advantage already. Second, Sony gave an example and people ran with it, but developers can split the CU usage any way they want. Third, and this is probably the one that might feel like a kick in the teeth, if a game uses 4 CUs for GPGPU (physics for example) that's going to impact the visuals and the interactivity of the game.

So even if we limit it to physics and such, the situation doesn't get any better for the Xbox One... a 14+4 split in a multiplat game translates to what on the Xbox One? Do they simply eliminate the more demanding GPGPU functionality, or downgrade the graphics further?

A complete dead end.


Here's an interesting question... how many CUs does Sony use for redundancy? We now know that XBO has 14 CUs with 2 disabled for redundancy. Does PS4 have all 18 CUs active? Maybe 20 CUs with 18 active? Hopefully not 18 total with 4 disabled for redundancy while PR runs with calling it 18, "hardware balanced at 14 CUs".
 
PMing you.

Why pm and not just post the link? What is this source that you only have access to?

He's claiming to have insider information.

Anyone who has that kind of information wouldn't go into such detail in the manner that he did, but even so, anyone could claim to have second- and third-hand info, spread FUD, and use the "I'll PM you the details" line, which just ruins these kinds of threads. Let's not turn into Beyond3D.

This needs the bish test.
 

gofreak

GAF's Bob Woodward
True, but I don't think you can compare 400Gflops of GPGPU 'Power' to 400Gflops of REAL CPU power; the CPU is much more efficient at doing its own things.

It has different memory advantages and weaknesses. It depends in other words. If you needed high bandwidth with fairly regular memory access the GPU might best a CPU of comparable fp performance.

Of course it's a bit moot anyway since CPUs with huge compute power aren't exactly ten a penny. Hence the rise of GPGPU.

Thanks for confirming. It seemed an odd comment to throw in there... :/
Thanks, your posts are always really straightforward to follow. I can totally get that part of it is future proofing for if devs do want to do more with GPU compute. But it's also pretty clear that a dev can use those resources for whatever they want and Cerny has already stated that there is no evangelising - that it will be used for whatever a specific game intends.

True. But they're just saying: have a look to see if there's a point where you could be using CUs better for other things.

They're asking devs to check if they're under-using ALU, holding the fairly attractive suggestion that quite a lot of power could be spare without hugely affecting render performance. But, again, it's very much your mileage may vary.

So I'm curious where exactly this idea that there's massive diminishing returns beyond 14 CUs, that makes it pointless to use them for graphics, i.e. the bolded that's being suggested now by some posters keeps coming from... and why exactly Sony would suggest this at a developer conference.

And why exactly this would specifically impact the PS4... as opposed to desktop GPUs which show performance gains with more compute units.

It wouldn't be pointless... like those PC benches show, you can still get a gain (variable depending on the game). In some the gain may even be linear. But if on average the gain isn't linear, and your software is being held back by some other point in the pipeline beyond a certain level of shading performance, it is simply wise to consider throwing in other work that may have a different ratio of resource demands, to get alu utilisation back up, and get some nice features or work done into the bargain.

It's not a specific impact on PS4. This is true of any GPU.

And for sure it's in Sony's interests to push GPGPU. It could be a strong competitive advantage for them, it would be hard if not impossible to bring parity on another console if a game is chucking a decent amount of GPGPU around.
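A rough way to see how '50% more ALU' can come out as roughly a 25% average gain, as in the DF benchmarks mentioned earlier: only the shader-limited portion of the frame speeds up. The 60/40 split below is purely illustrative, not a measured figure.

```python
# Amdahl-style estimate: adding ALU only accelerates the shader-limited part
# of the frame. The 60/40 split is made up for illustration.
shader_fraction = 0.6   # portion of frame time limited by ALU throughput
other_fraction = 0.4    # ROPs, bandwidth, setup, CPU, etc.
alu_speedup = 1.5       # 50% more CUs

new_frame_time = other_fraction + shader_fraction / alu_speedup  # 0.4 + 0.4 = 0.8
print(1 / new_frame_time)  # 1.25 -> ~25% faster overall
```

The more of the frame that is genuinely shader-limited, the closer the gain gets to linear, which is also why the 'typical case' caveat in the posts above matters.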
 

goonergaz

Member
No it isn't. That's the max their chip has and they were looking into opening up the two deactivated CUs.

So a 'happy coincidence' then that the small overclock > Sony's setup

Just odd they would mention Sony's thoughts in the same PR, they could have just left Sony out
 

Bundy

Banned
Why are we talking about 14+4 CUs again? I thought that was debunked or at least clarified.

=

Going back to GPU compute for a moment, I wouldn't call it a rumour - it was more than that. There was a recommendation - a suggestion? - for 14 cores [GPU compute units] allocated to visuals and four to GPU compute...

That comes from a leak and is not any form of formal evangelisation. The point is the hardware is intentionally not 100 per cent round. It has a little bit more ALU in it than it would if you were thinking strictly about graphics. As a result of that you have an opportunity, you could say an incentivisation, to use that ALU for GPGPU.

I seem to recall you might have talked about a toolchain where code could be compiled either for CPU or GPU. Is that right or have I got that completely wrong?

Such a toolchain does exist. It's AMD's HSA [Heterogeneous System Architecture]. That's very exciting but our current strategies are about exposing the low-level aspects of the GPU to a higher-level language. We think that's where the greatest benefit is in year one.

Is there dedicated audio processing hardware within the PlayStation 4? What can it do?

There's dedicated audio hardware. The principal thing that it does is that it compresses and decompresses audio streams, various formats. So some of that is for the games - you'll have many, many audio streams in MP3 or another format and the hardware will take care of that for you. Or, on the system side for example, audio chat - the compression and decompression of that.
 
unearthed the Sony Devcon CU optimisation graph. I think Sony must be the blue one
[graph images]
 
No it isn't. That's the max their chip has and they were looking into opening up the two deactivated CUs.



Yep. I'm real interested to see what his info is from, especially since the place he gets most of his info from seems to be a Microsoft developer.



Completely debunked by Cerny himself in an interview.

That's a huge misconception. He said nothing different from what I said, only he danced around or left out the information communicated to devs. He actually didn't lie, he just omitted or didn't fully clarify.

In fact, Cerny greatly hinted that it was true.

http://www.eurogamer.net/articles/digitalfoundry-face-to-face-with-mark-cerny

"The point is the hardware is intentionally not 100 per cent round. It has a little bit more ALU in it than it would if you were thinking strictly about graphics."

When he says a little bit more ALU than if you were thinking strictly about graphics, and when he says the hardware is intentionally not 100 percent round, he means that it doesn't scale in returns for graphics per block of ALU like you might think.

Another Reiko? This is getting sad.

I'm not spouting bs about a dual apu or dual GPU or 4 to 5+ teraflops worth of performance. :)
 

KidBeta

Junior Member
That's a huge misconception. He said nothing different from what I said, only he danced around or left out the information communicated to devs. He actually didn't lie, he just omitted or didn't fully clarify.

In fact, Cerny greatly hinted that it was true.

http://www.eurogamer.net/articles/digitalfoundry-face-to-face-with-mark-cerny



When he says a little bit more ALU than if you were thinking strictly about graphics, and when he says the hardware is intentionally not 100 percent round, he means that it doesn't scale in return per block of ALU like you might think.

This isn't how the real world works, it isn't so cut and dry but I could believe it for specific scenarios.
 

badb0y

Member
That's a huge misconception. He said nothing different from what I said, only he danced around or left out the information communicated to devs. He actually didn't lie, he just omitted or didn't fully clarify.

In fact, Cerny greatly hinted that it was true.

http://www.eurogamer.net/articles/digitalfoundry-face-to-face-with-mark-cerny



When he says a little bit more ALU than if you were thinking strictly about graphics, and when he says the hardware is intentionally not 100 percent round, he means that it doesn't scale in returns for graphics per block of ALU like you might think.

Yes, it does....

Do we live on a planet where graphics cards don't exist? The scaling is not linear but we are not just talking about extra CUs here, we have faster memory on PS4 and we have more ROPs.
PS4 unbalanced confirmed :)

That's really interesting though. And perhaps would that mean that cross-gen multiplatform titles might not show much difference between Xbox and PS4? Assuming they are relatively straightforward up-ports.

Unless there is a hard cap on FPS, anything that the Xbox One can run will run faster on the PS4. It's the nature of the architecture in these new consoles, inherited from being based on PCs.

How do you think an HD 7970 runs faster than an HD 7870? Do you think developers go in and code it that way?
 
Only saying it now so I don't have to repeat it. I'm not posting a link, because it isn't a link. It's just information that I think proves what I'm saying is true beyond doubt, but it also carries the risk of getting someone in trouble or violating their trust, which is why I can't just say everything, and why immediately after I said it, I said I would have no issue sharing the information with a mod if it made people more comfortable that I'm being transparent and honest. I've known about it for a while, but it became much more interesting once MS publicly confirmed their testing for 14 CUs and said how the CPU was a limiting factor to their games. It's also a fair assumption the architects for both systems knew or were learning quite a bit about what was going on in the other project through various channels, so I don't find it completely insignificant that MS mentioned what Sony stated to devs about their GPU and their recommended best usage of it.

edit:: Anytime I get info from a site, I post it 100% of the time. In fact, it's a habit I have that some find annoying at times :p

Has this new info source been verified by bish yet?
 
This isn't how the real world works, it isn't so cut and dry but I could believe it for specific scenarios.

Just keep in mind, this doesn't somehow make the PS4 any less awesome than it already is. It's how Sony designed the system, and it's smart to push GPGPU.

Has this new info source been verified by bish yet?

No, actually, but I'll go message him the info currently so you guys know that I did.
 