
EuroGamer: More details on the BALANCE of XB1

onQ123

Member
I'm very sure that's not the case according to all leaks and information I got. Why should they use 14 CUs when they know the system will ship with 12 CUs?

Why would they toy with the idea of using all 14 CUs if they had no chance of the console running with 14 CUs?


Chances are 12 CUs was the safe bet in case the yields were not so good & 14 CUs was the dream number if the yields were good enough to take the hit.
 
Pretty clear where this is heading and why it was posted.

MS says they targeted 12 CUs plus an overclock because it outperforms 14 CUs. They spin the old story about the 14+4 CU split on PS4 to show the Xbone is more powerful since it's more "balanced". Then we have new "info" that came straight from "Sony" that after 14 CUs it's diminishing returns, so it's a "waste" to use them for gfx. Of course the source is unknown and most likely at least six months old.

Now we have the Xbone can outperform the PS4 GPU because it has the perfect number of CUs and it's overclocked. What about the 4 CUs for compute? Well, the Xbone has 15 special processors that offload CPU tasks, plus the CPU is now overclocked!

LOL

So easy to see right through this stuff. Complete PR bullshit. Back in real life the PS4 has a massive advantage in ALUs [50%+], ROPs [100%+], TMUs [50%+], usable RAM per frame [100%+] and more advanced GPU compute support.

I would be shocked if a single game ran better on xbone than PS4.
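The percentage advantages quoted above can be sanity-checked with a rough sketch. The spec numbers used here are the ones circulating in this thread (rumoured, not officially confirmed):

```python
# Rough check of the quoted hardware advantages, using the commonly
# reported (rumoured) spec numbers for each GPU.
ps4   = {"ALU": 1152, "ROP": 32, "TMU": 72}
xbone = {"ALU": 768,  "ROP": 16, "TMU": 48}

for part in ps4:
    advantage = (ps4[part] - xbone[part]) / xbone[part] * 100
    print(f"{part}: PS4 ahead by {advantage:.0f}%")
# ALU: 50%, ROP: 100%, TMU: 50%
```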

yeah. i don't see how 14 CU < 12 CU w/ OC. it's obvious why they chose 12 CU... because they need to. It has nothing to do with diminishing returns and everything to do with the fact that they need better yields on their chip. Going 14 CU means no CU on the chip can be out of the required spec... that means they'll take a big hit in yields on their chip. So they have no choice but to go 12 CU to keep yields up and then maximize the frequency as much as possible to get as much performance as they can since the PS4 is more powerful.

I think it's all utter nonsense. 14 CU > 12 CU plain and simple. More is better, ask the AT&T LTE kids on the commercials. I'm not buying the beyond 14 CU on PS4 nonsense either. I think it's all BS spin.
 

ethomaz

Banned
I think it's 2 ACEs with 8 queues each, I remember Kidbeta saying that.
I really don't know... PS4 has 8 ACEs with 8 queues each (64 total).

My point is, if a dev wants to use the free SPs for compute in parallel with graphics, they need an ACE + queue to manage that... so if the dev is already using all the available ACE queues, a new compute task will sit in line waiting for an ACE queue slot to free up even if there are free SPs on the GPU.

PS4 can manage more compute tasks in parallel than Xbone... so even if a dev chooses to use compute tasks in their game they can hit that limit... the limit on PS4 is much harder to hit.

To use GAF's words: "PS4 is more BALANCED for compute tasks than Xbone" :D

Edit - Fixed the PS4 queues numbers.
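The queueing argument above can be sketched as a toy model. This is purely illustrative, not real hardware behaviour, and the Xbone figure is one of the rumoured configurations from this thread:

```python
# Toy model of the argument above: in-flight async compute jobs are
# capped by ACE count x queues per ACE, no matter how many shader
# cores happen to be sitting idle.
def max_inflight_compute(aces: int, queues_per_ace: int) -> int:
    return aces * queues_per_ace

ps4_cap   = max_inflight_compute(8, 8)  # 64 job slots
xbone_cap = max_inflight_compute(2, 8)  # 16 job slots (rumoured config)
print(ps4_cap, xbone_cap)
```

The point being argued is simply that the cap on the left is four times the cap on the right, so the queue limit bites much later on PS4.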
 

ekim

Member
Why would they toy with the idea of using all 14 CUs if they had no chance of the console running with 14 CUs?


Chances are 12 CUs was the safe bet in case the yields were not so good & 14 CUs was the dream number if the yields were good enough to take the hit.

But you don't develop and show off games made for 14 CUs for the small possibility of a 1.5 TFlop machine.
 

panda-zebra

Member
The 14 and 4 thing is simply a recommendation by Sony to developers on what they see as an optimal balance between graphics and compute.

So your source says this is actually a Sony recommendation and not just simply an example using those figures?

I can see how they might have arrived at the numbers based off experience with a number of their own projects or tests, but to flat out state "we recommend you do this exact 14/4 split" when it's going to vary from project to project seems a little odd. But if that's what they said or that's what you say your source said they said, then I suppose that's that.
 

gofreak

GAF's Bob Woodward
Do AMD and nvidia do this kind of profiling of current game engines? If so, I wonder why they continue to make GPUs with these kinds of ratios - often with even more CUs than PS4 but similar ROPs and bandwidth. Shouldn't they be seeing similar drop-offs in performance and consider adjusting their cards? Or does the varying nature of PC screen resolutions bring that equation back into balance?

A performance increase is still a performance increase even if it isn't always proportional on every game. Vendors will chase whatever gains they can economically get.

A certain amount is future proofing...what's typical today in terms of alu intensity and what might be typical in two or three years can be very different.

A certain amount might be hoping devs will tune games to their resource ratios for better perf on their cards.

A certain amount is having one eye on gpgpu markets.

A certain - maybe quite large - amount of it is that it's easier to improve alu resources vs other things like memory access/bandwidth (an old story, not just on gpu).
 

USC-fan

Banned
This stuff is hilarious.

5 billion transistors!

Power of the cloud, your console is 4x more powerful!

Wait no, specs don't matter!

Wait no, they matter, we're doing all these last minute tweaks for 6% increase here and there. That's gonna make a huge difference! Forget the two previous PR lines!

Diminishing returns! Objectively better HW is not really better. Let me show you this by randomly adding BW numbers!

We got balance, baby!


What a trainwreck.
It is pretty funny when you list it like that.

Reminds me so much of the PS3 launch. Just trainwreck after trainwreck....

yeah. i don't see how 14 CU < 12 CU w/ OC. it's obvious why they chose 12 CU... because they need to. It has nothing to do with diminishing returns and everything to do with the fact that they need better yields on their chip. Going 14 CU means no CU on the chip can be out of the required spec... that means they'll take a big hit in yields on their chip. So they have no choice but to go 12 CU to keep yields up and then maximize the frequency as much as possible to get as much performance as they can since the PS4 is more powerful.

I think it's all utter nonsense. 14 CU > 12 CU plain and simple. More is better, ask the AT&T LTE kids on the commercials. I'm not buying the beyond 14 CU on PS4 nonsense either. I think it's all BS spin.
Pretty easy to see right through this stuff.
 

ekim

Member
I really don't know... PS4 have 8 ACEs with 64 queues each.

My point is, if a dev wants to use the free SPs for compute in parallel with graphics, they need an ACE + queue to manage that... so if the dev is already using all the available ACE queues, a new compute task will sit in line waiting for an ACE queue slot to free up even if there are free SPs on the GPU.

PS4 can manage more compute tasks in parallel than Xbone... so even if a dev chooses to use compute tasks in their game they can hit that limit... the limit on PS4 is much harder to hit.

PS4 has 8 ACEs with 8 queues each not 64 if I'm not entirely mistaken. 64 queues per ACE sounds plain wrong.
 
Why would they toy with the idea of using all 14 CUs if they had no chance of the console running with 14 CUs?


Chances are 12 CUs was the safe bet in case the yields were not so good & 14 CUs was the dream number if the yields were good enough to take the hit.

they want to close the gap between the PS4 and XB1. they can't make drastic changes, so they looked into running the chips with all 14 CU's on the chip active instead of the default 12 active + 2 deactivated (for yields). I'm pretty sure they didn't like what they saw since they'd have a shit load of faulty chips and would have to cherry pick the few good ones and would end up with limited XB1 supplies. They had to go 12 CU + 2 dead or else it would cost too much.
 

ethomaz

Banned
PS4 has 8 ACEs with 8 queues each not 64 if I'm not entirely mistaken. 64 queues per ACE sounds plain wrong.
You are right... my mistake... thanks.

8 ACEs with 64 queue pipelines total.

Xbone? 2 x 8 (16 total) or 2 x 4 (8 total)?
 

RoboPlato

I'd be in the dick
So your source says this is actually a Sony recommendation and not just simply an example using those figures?

I can see how they might have arrived at the numbers based off experience with a number of their own projects or tests, but to flat out state "we recommend you do this exact 14/4 split" when it's going to vary from project to project seems a little odd. But if that's what they said or that's what you say your source said they said, then I suppose that's that.

From my interpretation that's just the point where rendering efficiency starts to pull back a bit so it would be a more efficient use of the full GPU potential to dedicate the other CUs to compute.
 

gofreak

GAF's Bob Woodward
So your source says this is actually a Sony recommendation and not just simply an example using those figures?

I can see how they might have arrived at the numbers based off experience with a number of their own projects or tests, but to flat out state "we recommend you do this exact 14/4 split" when it's going to vary from project to project seems a little odd. But if that's what they said or that's what you say your source said they said, then I suppose that's that.

Cerny was clear that it was not a 'formal evangelisation'. I would take it more as a suggestion to profile your game because there might be more resources left on the table after rendering than you think, with an illustrative example based on some 'typical' load. No more, no less IMO.
 

nib95

Banned
PS4 has 8 ACEs with 8 queues each not 64 if I'm not entirely mistaken. 64 queues per ACE sounds plain wrong.

PS4: 1.84TF GPU (18 CUs)
PS4: 1152 Shaders
PS4: 72 Texture units
PS4: 32 ROPs
PS4: 8 ACEs / 64 queues
8GB GDDR5 @ 176GB/s

Versus

Xbone: 1.31TF GPU (12 CUs)
Xbone: 768 Shaders
Xbone: 48 Texture units
Xbone: 16 ROPs
Xbone: 2 ACEs / 16 queues
8GB DDR3 @ 68GB/s + 32MB ESRAM @ 109GB/s

From my understanding.
 

Tripolygon

Banned
So your source says this is actually a Sony recommendation and not just simply an example using those figures?

I can see how they might have arrived at the numbers based off experience with a number of their own projects or tests, but to flat out state "we recommend you do this exact 14/4 split" when it's going to vary from project to project seems a little odd. But if that's what they said or that's what you say your source said they said, then I suppose that's that.
Sony press release
The Graphics Processing Unit (GPU) has been enhanced in a number of ways, principally to allow for easier use of the GPU for general purpose computing (GPGPU) such as physics simulation. The GPU contains a unified array of 18 compute units, which collectively generate 1.84 Teraflops of processing power that can freely be applied to graphics, simulation tasks, or some mixture of the two.

Sony Team Lead SCEE R&D https://www.youtube.com/watch?v=JHJL9MgkiKE

If you can withstand the awful audio recording, you'll hear him talk about the PS4 GPU to developers.
 

RoboPlato

I'd be in the dick
PS4: 1.84TF GPU (18 CUs)
PS4: 1152 Shaders
PS4: 72 Texture units
PS4: 32 ROPs
PS4: 8 ACEs / 64 queues
8GB GDDR5 @ 176GB/s

Versus

Xbone: 1.31TF GPU (12 CUs)
Xbone: 768 Shaders
Xbone: 48 Texture units
Xbone: 16 ROPs
Xbone: 2 ACEs / 16 queues
8GB DDR3 @ 68GB/s + 32MB ESRAM @ 109GB/s

From my understanding.

Nib, I gotta ask. Do you have these numbers memorized or do you have them ready to copy and paste at a moment's notice? I've seen you drop them out of nowhere so many times and I can't even remember all of them despite seeing them so much.
 

artist

Banned
Do AMD and nvidia do this kind of profiling of current game engines? If so, I wonder why they continue to make GPUs with these kinds of ratios - often with even more CUs than PS4 but similar ROPs and bandwidth. Shouldn't they be seeing similar drop-offs in performance and consider adjusting their cards? Or does the varying nature of PC screen resolutions bring that equation back into balance?
Yes, they do and it's exactly how they plan for future design/archs.
 

nib95

Banned
Nib, I gotta ask. Do you have these numbers memorized or do you have them ready to copy and paste at a moment's notice? I've seen you drop them out of nowhere so many times and I can't even remember all of them despite seeing them so much.

Copy and paste. Though I do know most off by heart now. Lol.
 

onQ123

Member
But you don't develop and show off games made for 14 CUs for the small possibility of a 1.5 TFlop machine.

You do when you have a 'lower the resolution' fallback plan.


By the way, it wasn't 1.5 TFLOPS, it would have been 1.43 TFLOPS, so the alternative was upping the clock on the 12 CUs, coming to 1.31 TFLOPS, which wasn't much of a performance loss.
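The flop arithmetic in the post above works out as a one-liner. This sketch assumes GCN-style CUs (64 ALUs each, 2 FLOPs per clock via FMA), an 800 MHz base clock, and the reported 853 MHz upclock:

```python
# TFLOPS = CUs x ALUs-per-CU x FLOPs-per-ALU-per-clock x clock (Hz) / 1e12
# Assumes GCN-style CUs: 64 ALUs each, 2 FLOPs (FMA) per clock.
def tflops(cus: int, mhz: float) -> float:
    return cus * 64 * 2 * mhz * 1e6 / 1e12

print(tflops(14, 800))  # 1.4336 -> the ~1.43 TFLOPS 14 CU option
print(tflops(12, 853))  # 1.310208 -> the shipped 12 CU + upclock
```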
 

ekim

Member
You do when you have a 'lower the resolution' fallback plan.


By the way, it wasn't 1.5 TFLOPS, it would have been 1.43 TFLOPS, so the alternative was upping the clock on the 12 CUs, coming to 1.31 TFLOPS, which wasn't much of a performance loss.

Yeah I calculated with the upclock and 14 CUs. My fault :)
 
You know, if MS had any integrity, they would do what Nintendo did: state that they don't think all that power is necessary to make great video games in 2013. Or just don't talk about specs at all.

They know ppl will not pay $60 for an inferior multiplat game, nor will they risk $500 up front for an inferior product knowing this beforehand.

So it's in MS' interest to muddy the waters and pretend like the PS4 is not more balanced or more powerful.
 

astraycat

Member
Well there is the rumor that the devkits took about a 15% hit with one of the updates.


so do the math: 14 CUs (896 ALUs) to 12 CUs (768 ALUs) = close to a 15% drop.

Sure, but that's probably due to some poorly thought out driver/kernel changes. Those kinds of things happen.

Going from 14 CUs to 12 CUs is a very hefty change, and would have required huge amounts of QA before going out to developers. They're not about to go through that sort of effort just for an experiment. It would likely require developer code changes as well, since there is probably code that depends on the number of CUs (stuff like setting aside scratch memory, etc.).
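For reference, the "close to a 15% drop" arithmetic quoted above checks out:

```python
# 14 CUs = 896 ALUs, 12 CUs = 768 ALUs, at the same clock.
drop = (896 - 768) / 896
print(f"{drop:.1%}")  # 14.3% -> "close to a 15% drop"
```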
 
I am very much looking forward to FM5, but on a technical level it's not that impressive. With non-dynamic baked lighting, it is a last-gen game running at 1080p and a high frame rate...

Like me, you fell for "good art".

It has static lighting, not baked. It features sub-surface scattering for the paint and image-based lighting for reflections, which are both real time. FM5 is technically impressive on a number of levels.
 

onQ123

Member
Sure, but that's probably due to some poorly thought out driver/kernel changes. Those kinds of things happen.

Going from 14 CUs to 12 CUs is a very hefty change, and would have required huge amounts of QA before going out to developers. They're not about to go through that sort of effort just for an experiment. It would likely require developer code changes as well, since there is probably code that depends on the number of CUs (stuff like setting aside scratch memory, etc.).

I'm not saying that it happened but I'm questioning if it might have happened.
 

EagleEyes

Member
Not really, just so much misinformation floating around these days. Some people seem hell-bent on promoting false numbers, so it's good to have accurate figures to hand.
Come on now Nibs, you have been a console warrior since your days on IGN. Do you really, honestly consider yourself unbiased these days? Honest question.
 
I think the Xbox One has 2 ACEs with 4 queues.

2 ACEs with 16 queues.

Now that that's settled, back to reading the spin. I almost feel like taking senjutsu off ignore so I can get some good laughs. Almost.

Guess I don't need to.

SenjutsuSage
Banned
(Today, 05:07 PM)
 

nib95

Banned
Come on now Nibs, you have been a console warrior since your days on IGN. Do you really, honestly consider yourself unbiased these days? Honest question.

My IGN days were over a decade ago.... Plus I was only a teenager back then.

Everyone has biases and preferences. This is only natural. Mine even temporarily switched to the 360 at one point when I started buying many of my multiplatform games on the system. But I would say irrespective of personal preference I've always kept it real and honest. Using facts, evidence and/or grounded logic to form opinions and conclusions instead of letting my preferences dictate them.

That is the key differentiator in what makes such discussions appreciable or worth any merit, and why despite any personal preferences I may have, I have not made the same mistakes or been lambasted in the same way as some others have on this particular forum. It is also the reason I'm still here posting today.

Have your preferences, that's fine, but try to refrain from letting them push you into forming arguments built on intellectual dishonesty.
 

prwxv3

Member
Welp, senjutsusage got banned. Was his info right but he interpreted it completely wrong, or was he just complete bullshit?
 

nib95

Banned
Welp, senjutsusage got banned. Was his info right but he interpreted it completely wrong, or was he just complete bullshit?

He's been misinformed or outright wrong on so many countless things I really have no idea any more.

His source was probably bullshit.

Far as I can remember, the only source he actually claimed to have was a personal friend who is supposedly a Microsoft first party developer. If true, his friend likely never gave much if anything away though because most of SenjutsuSage's viewpoints and theories were based off information trawled from other forums such as B3D and so on.
 
I'm pretty sure Senju got banned for making mountains out of molehills

His usual spiel

Just went too far with it this time

Or maybe he was really silly enough to use a bad source

Bish was involved and considering he is/was a developer

I'm sure his BS radar is fully in sync
 

Skeff

Member
What was he trying to claim anyway?

A lot of things, but the most likely falsehood was his claim that Sony briefed third-party developers that beyond 14 CUs for graphics they didn't really do much. Yes, he claimed SONY said that to devs.
 