
EuroGamer: More details on the BALANCE of XB1

Oct 30, 2011
5,446
1,726
755
Thanks for the summary. This thread has really gone places.

This is really nothing we didn't already know. Mark Cerny stated that GPUs are not fully efficient, and that there are gaps in performance with idle cycles. To fill these gaps, devs are encouraged to use the GPU for compute, similarly to how devs would try to maximize cycles on CELL for various purposes, since that hardware was quite flexible.

Also, I hate that there's this perception that compute somehow won't impact graphics because it isn't direct rendering. IMHO, the biggest advancements we will see in the coming gen will come from making environments feel more alive and tangible rather than static, and this is where GPGPU comes into play heavily.

And at the end of the day, the performance gap in the PS4's favour is still 40+%, with that figure perhaps being larger since Sony has optimized and "balanced" the hardware specifically to extract more out of GPGPU.

Every detail we've seen regarding the XB1 vs. PS4 indicates that PS4 is far better "balanced" for gaming.
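For anyone who wants to sanity-check the 40+% figure, here's a quick back-of-envelope sketch using the commonly cited (and at the time unconfirmed) CU counts and clocks, so treat the inputs as assumptions rather than gospel:

```python
# Peak shader throughput from the commonly cited specs (assumed, not
# officially confirmed): GCN does 2 FLOPs per ALU per clock, 64 ALUs per CU.
def gflops(cus, clock_mhz, alus_per_cu=64, flops_per_alu=2):
    """Theoretical peak single-precision GFLOPS for a GCN-style GPU."""
    return cus * alus_per_cu * flops_per_alu * clock_mhz / 1000.0

ps4 = gflops(18, 800)  # 18 CUs @ 800 MHz
xb1 = gflops(12, 853)  # 12 CUs @ 853 MHz (post upclock)

print(f"PS4 {ps4:.1f} vs XB1 {xb1:.1f} GFLOPS, "
      f"gap {(ps4 / xb1 - 1) * 100:.0f}%")  # gap comes out around 41%
```

Raw ALU throughput isn't the whole story, of course; it ignores bandwidth, fill rates and the eSRAM entirely.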
 

nib95

Banned
Feb 26, 2007
34,618
2
0
Done.



I've taken MS's high-end value for their real-world mem bw.
Take away the memory bandwidth number used for the Xbox One (not really fair given that a huge bulk of that bandwidth is tied to a paltry 32 MB) and on this graph at least, the Xbox One is definitely more comparable to the 7770. Not sure the bump in prim rate is going to have much of an impact on performance based on this either.
 

USC-fan

Banned
Oct 9, 2005
7,115
1
1,235
Done.



I've taken MS's high-end value for their real-world mem bw.
Pretty awesome chart. Great work.

Makes it pretty clear once again that the more balanced GPU points to the PS4.

Saw this posted:

"AMD gained a clean sweep in Wii U with R700 shaders, PlayStation 4 with Sea Islands shaders, and Xbox One with Southern Islands shaders, along with Jaguar x86-64 CPU cores for the PlayStation 4 APU and Xbox One SoC. That means most of the AAA cross-platform titles will have the same starting line and a unified feature set on graphics capabilities. [Author's note: except for features enabled by the eSRAM on Xbox One, which some devs state is "a pain to use".]"

Thought Sea Islands was just rebranded Southern Islands. It also labels the PS4 an APU and the Xbone an SoC; wonder if this has any merit.

http://semiaccurate.com/2013/09/20/amd-livestream-gpu14/
 

Chobel

Member
Mar 26, 2013
15,673
3
515
"Just like our friends we're based on the Sea Islands family"


I didn't pay attention to this before, but Sea Islands is the HD 8000 series of GPUs,

so that kills the Bonaire HD 7790 speculation.
Actually, the HD 7790 is one of the Sea Islands family GPUs.

EDIT: link http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-HD-7790-Review-Sea-Islands-and-Bonaire-Make-Appearance
As it turns out, today marks the release of the Radeon HD 7790, a completely new piece of silicon under the Sea Islands designation
 

Cidd

Member
Jan 14, 2013
4,356
1
500
Done.



I've taken MS's high-end value for their real-world mem bw.
Just looking at this chart, I take it that the PS4's performance will be close to or a little better than the 7850's, and the Xbone's a little better than the 7770's?
 

nib95

Banned
Feb 26, 2007
34,618
2
0
Artist, can you do one with the Xbox One's memory (main DDR3 ram) at 68 GB/s as well? Just curious to see how it looks in comparison.
 

artist

Banned
May 7, 2006
16,629
0
0
You should probably do a graph that treats the ALUs as continuous data rather than discrete if you want to model the relationship between the two.

EDIT: although the new graph probably makes the old one redundant anyway.
Like this?



edit: Fixed scale.
 

shinra-bansho

Member
Nov 13, 2011
16,595
0
0
Yeah.

Although I think the new graph you've made does a better job and is more useful, since it shows a lot of other metrics too and factors in clock rate for compute performance.
Just looking at the 7850 and 7750 for instance on your new graph, where all the other metrics are also increased proportionally, game performance essentially scales perfectly (with compute). Cf. comparing the 7750 and the 7790, where the increase in compute performance of (eyeballing) around 120% only nets a 60% performance benefit, presumably because the other metrics like pixel fill rate and memory bandwidth haven't kept up. Looking at the 7850 vs the 7970 seems to be a similar case, etc.

Very interesting.
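To put rough numbers on that 7750 vs 7790 comparison (GFLOPS are the stock AMD specs; the relative game performance is eyeballed off the chart, so it's purely illustrative):

```python
# GFLOPS from the standard AMD spec sheets (7750: 512 ALUs @ 800 MHz,
# 7790: 896 ALUs @ 1 GHz); rel_perf is eyeballed from the chart.
cards = {
    "HD 7750": {"gflops": 819.2,  "rel_perf": 1.0},   # baseline
    "HD 7790": {"gflops": 1792.0, "rel_perf": 1.6},   # eyeballed estimate
}

compute_gain = cards["HD 7790"]["gflops"] / cards["HD 7750"]["gflops"] - 1
perf_gain = cards["HD 7790"]["rel_perf"] / cards["HD 7750"]["rel_perf"] - 1

# Roughly +120% compute for only ~+60% game performance, because fill
# rate and memory bandwidth didn't scale with the shader count.
print(f"Compute: +{compute_gain:.0%}, game perf: +{perf_gain:.0%}")
```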
 

LittleJohnny

Banned
Oct 17, 2006
1,099
0
0
Take away the memory bandwidth number used for the Xbox One (not really fair given that a huge bulk of that bandwidth is tied to a paltry 32 MB) and on this graph at least, the Xbox One is definitely more comparable to the 7770. Not sure the bump in prim rate is going to have much of an impact on performance based on this either.
Artist, can you do one with the Xbox One's memory (main DDR3 ram) at 68 GB/s as well? Just curious to see how it looks in comparison.
You're something else, nib. For some reason I think you would never ask for a graph showing only 14 CUs for the PS4... you know, just for curiosity... ;)
 

artist

Banned
May 7, 2006
16,629
0
0
Artist, can you do one with the Xbox One's memory (main DDR3 ram) at 68 GB/s as well? Just curious to see how it looks in comparison.
Here.

Why is the mem bandwidth for the XB1 so high there? Is it based on the fake numbers Microsoft gave?
It's the real-world usage* numbers that Microsoft gave: 140-150 GB/s. I just gave them the benefit of the doubt here because the truth is somewhere in between.

See this is what happens when sage gets banned. You guys start fapping together over charts.
If you aren't going to contribute anything valuable to this thread, it's best not to throw insults at those who are.
 

viveks86

Member
Sep 12, 2013
15,859
0
485
Everything I've said here holds true for the Xbone too. It is just far more limited in its GPGPU capabilities, due to only having 2 ACEs, 8 queues and 12 CUs to process compute tasks on.

I see people mentioning 8 queues (2 ACEs, 4 queues per ACE) as well as 16 queues (2 ACEs, 8 queues per ACE). Which is the right one? Is this still up in the air?
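For what it's worth, the competing figures just come from different assumed queues-per-ACE counts; neither is a confirmed XB1 spec. A trivial sketch of the arithmetic:

```python
# The competing totals are just different guesses at queues per ACE;
# only the PS4 figure (8 ACEs x 8 queues, per Cerny) is confirmed.
configs = {
    "XB1 @ 4 queues/ACE (guess)": (2, 4),
    "XB1 @ 8 queues/ACE (guess)": (2, 8),
    "PS4 (per Cerny)":            (8, 8),
}
for name, (aces, queues_per_ace) in configs.items():
    total = aces * queues_per_ace
    print(f"{name}: {aces} ACEs x {queues_per_ace} queues = {total}")
```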
 

Pug

Member
Jun 8, 2004
2,172
67
1,495
Cardiff
See this is what happens when sage gets banned. You guys start fapping together over charts.
I like charts, I just can't wait for the pixel and spark counters when the games are eventually released. This thread has more laughs than intended, that's for sure; comedy gold.
 

AgentP

Member
Jun 5, 2011
7,072
1,039
775
They are not fake numbers (I assume you are referring to 68 + 150 real-world, or 204 + 68 peak theoretical)

those are real
They are real for when the system is performing at its best and devs have all the data where it needs to be. I suspect the real aggregate bandwidth will be marginally over 100 GB/s, hence the 12 CUs.

The PS4 can achieve near its peak with little juggling or careful attention; it is one pool. So it will probably see 150 GB/s with ease.
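One toy way to see why the "real world" XB1 number is so slippery: the blended bandwidth depends almost entirely on what fraction of traffic actually lands in the 32 MB of eSRAM. Everything below is my assumption, not a Microsoft figure:

```python
# Toy model (my assumptions, not Microsoft's): blend the two pools by
# the fraction of memory traffic that actually fits in eSRAM.
DDR3_PEAK = 68.0     # GB/s, main DDR3 pool
ESRAM_PEAK = 204.0   # GB/s, post-clock-bump theoretical peak

def aggregate_bw(esram_fraction, esram_efficiency=0.7):
    """Blended bandwidth if `esram_fraction` of traffic hits eSRAM.
    The 0.7 sustainable-vs-peak efficiency factor is a guess."""
    return ((1 - esram_fraction) * DDR3_PEAK
            + esram_fraction * ESRAM_PEAK * esram_efficiency)

for frac in (0.25, 0.5, 0.75):
    print(f"{frac:.0%} eSRAM traffic -> {aggregate_bw(frac):.0f} GB/s")
```

Around a 50/50 split this lands just over 100 GB/s; you only approach Microsoft's 140-150 GB/s claim if the bulk of traffic stays inside the eSRAM.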
 

artist

Banned
May 7, 2006
16,629
0
0
Just looking at the 7850 and 7750 for instance on your new graph, where all the other metrics are also increased proportionally, game performance essentially scales perfectly (with compute). Cf. comparing the 7750 and the 7790, where the increase in compute performance of (eyeballing) around 120% only nets a 60% performance benefit, presumably because the other metrics like pixel fill rate and memory bandwidth haven't kept up. Looking at the 7850 vs the 7970 seems to be a similar case, etc.

Very interesting.
Yup.

In terms of scaling linearly, the weighting for how different metrics translate into performance across games looks like:
1. Compute
2. Texel fill
3. Pixel fill / mem bandwidth
4. Prim rate
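If you wanted to turn that ordering into an actual score, it might look something like this; the weights are illustrative guesses on my part, not fitted values, and the HD 7850 numbers are its stock specs:

```python
# Hypothetical weighting sketch: score a GPU as a weighted sum of
# per-metric ratios against a baseline card. The weights are guesses
# that mirror the ordering above (compute counts most).
WEIGHTS = {"compute": 0.45, "texel_fill": 0.25,
           "pixel_fill": 0.125, "mem_bw": 0.125, "prim_rate": 0.05}

def relative_score(gpu, baseline):
    """Weighted sum of metric ratios; 1.0 means 'same as baseline'."""
    return sum(w * gpu[m] / baseline[m] for m, w in WEIGHTS.items())

# Stock HD 7850: ~1761 GFLOPS, 55 GT/s texel fill, 27.5 GP/s pixel
# fill, 153.6 GB/s bandwidth, ~1.72 Gprims/s.
hd7850 = {"compute": 1761, "texel_fill": 55.0, "pixel_fill": 27.5,
          "mem_bw": 153.6, "prim_rate": 1.72}

print(relative_score(hd7850, hd7850))  # ~1.0 by construction
```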
 

nib95

Banned
Feb 26, 2007
34,618
2
0
Here.


It's the real-world usage* numbers that Microsoft gave: 140-150 GB/s. I just gave them the benefit of the doubt here because the truth is somewhere in between.


If you aren't going to contribute anything valuable to this thread, it's best not to throw insults at those who are.
That was real-world usage only for the eSRAM, not for the DDR3, which has a max bandwidth of 68 GB/s. Really, this one is more accurate with respect to the bulk of the graphics RAM.




Yup.

In terms of scaling linearly, the weighting for how different metrics translate into performance across games looks like:
1. Compute
2. Texel fill
3. Pixel fill / mem bandwidth
4. Prim rate
Interestingly, compute and texel fill are the two (along with memory bandwidth) that the PS4's GPU has over the 7850.


They are not fake numbers (I assume you are referring to 68 + 150 real-world, or 204 + 68 peak theoretical)

those are real
You're something else, nib. For some reason I think you would never ask for a graph showing only 14 CUs for the PS4... you know, just for curiosity... ;)
Not at all the same. It is disingenuous to combine the Xbox One's DDR3 RAM and eSRAM with respect to main-memory graphics performance. Whilst they can be used in tandem (hence adding the bandwidths together), the fact remains that the eSRAM is only good for a meagre 32 MB (as others alluded to early in the thread, more like a controlled scratchpad) whilst the DDR3 is the bulk of the memory.

Surely you can acknowledge how that greatly differs from simply artificially and randomly cutting 4 CUs off the PS4 for the sake of it?
 

Nyampoo

Member
Dec 25, 2006
28
0
0
Tokyo, Japan
Done.



I've taken MS's high-end value for their real-world mem bw.
Excellent chart.
At first glance, the PS4's mem bandwidth looked a little too high for its compute resources, then I realized that the CPU would also consume some portion of the bandwidth, which makes the PS4 even more "balanced" in terms of GPU.
 

Smash

Banned
Mar 5, 2007
324
0
0
They are not fake numbers (Assume you are referring to 68+ 150 real time or 204+ 68 peak theoretical)

those are real
Considering the numbers are unprecedented for eSRAM and it's supposed to be a Microsoft discovery, I automatically assume they're lying. It's not the first time they've inflated numbers using crappy math.
 

viveks86

Member
Sep 12, 2013
15,859
0
485
That was real-world usage only for the eSRAM, not for the DDR3, which has a max bandwidth of 68 GB/s. Really, this one is more accurate with respect to the bulk of the graphics RAM.



the eSRAM is only good for a meagre 32 MB (as others alluded to early in the thread, more like a controlled scratchpad) whilst the DDR3 is the bulk of the memory.
But Nib, isn't this graph completely discounting the eSRAM? The point of the eSRAM is to compensate for the low DDR3 bandwidth, right? I think artist's original figure of 140-150 seems fair for the moment until real tests are done.
 

nib95

Banned
Feb 26, 2007
34,618
2
0
But Nib, isn't this graph completely discounting the eSRAM? The point of the eSRAM is to compensate for the low DDR3 bandwidth, right? I think artist's original figure of 140-150 seems fair for the moment until real tests are done.
I agree. Neither graph would be completely accurate (for the reasons I stated above): the eSRAM one due to the small RAM quantity and its implications, the DDR3 one because it omits the eSRAM and its advantages.

Just wanted both up there so I could gauge how the DDR3's bandwidth compares to that of traditional PC GPUs of similar architecture.
 

SPE

Member
Mar 30, 2007
6,609
1
1,020
I see people mentioning 8 queues (2 ACEs, 4 queues per ACE) as well as 16 queues (2 ACEs, 8 queues per ACE). Which is the right one? Is this still up in the air?
Good question. I don't think this has been officially confirmed by MS (and it's very likely they won't confirm any spec that is lower than the PS4's).

I was using the most common figure I've seen listed. I've also seen it mentioned that the Xbone has 2 ACEs and just 2 compute queues, which is what the similarly specced AMD PC cards have.

The recent Hotchips 25 event confirmed the Xbone had 2 ACEs, but didn't confirm queue counts.
 

Josh378

Member
Jun 4, 2013
877
0
370
My IGN days were over a decade ago.... Plus I was only a teenager back then.

Everyone has biases and preferences. This is only natural. Mine even temporarily switched to the 360 at one point when I started buying many of my multiplatform games on the system. But I would say irrespective of personal preference I've always kept it real and honest. Using facts, evidence and/or grounded logic to form opinions and conclusions instead of letting my preferences dictate them.

That is the key differentiator in what makes such discussions appreciable or worth any merit, and why despite any personal preferences I may have, I have not made the same mistakes or been lambasted in the same way as some others have on this particular forum. It is also the reason I'm still here posting today.

Have your preferences, that's fine, but try to refrain from letting them dictate you forming arguments based on intellectual dishonesty.

Haha, I remember those days as well on IGN..I was a teenager as well posting side-by-side with Nibs. I still think back to those days and shudder with disbelief on how I acted back then...good(fun) times tho.


But on topic: the problem is that people are saying "on paper...". I'd agree if this were some type of foreign console design like the PS3 was (which takes a while to even understand), but we are talking about a system designed to be so developer-friendly, with such a familiar architecture, that these "on paper" specs will 90% match the system's capability from the first year on. MS, please stop spinning specs; it is what it is. Just focus on Kinect and how it will revolutionize gaming.

Look at what Nintendo did with the Wii. Yes, it was by far the weakest system vs the 360 and PS3, and people made jokes about its specs. But in the end, not only did it sell the most, it also focused on motion control rather than a PR spec war it knew it would lose against the other next-gen systems, and it succeeded in pushing that technology to the masses.

Right now, MS needs to go into hush-hush mode on the spec wars (Sony isn't even paying attention to them; they know they've won this by default, and it will show when the games are released), start working with their internal studios, and figure out what needs to happen to show why Kinect makes the difference. I don't know why this is so hard for MS to market. It's common sense that this should be the focus, since their system's specs are far below the PS4's.
 

onQ123

Report me for starting troll threads. Ban warning.
May 1, 2010
15,830
4,072
1,275
Good question. I don't think this has been officially confirmed by MS (and it's very likely they won't confirm any spec that is lower than the PS4's).

I was using the most common figure I've seen listed. I've also seen it mentioned that the Xbone has 2 ACEs and just 2 compute queues, which is what the similarly specced AMD PC cards have.

The recent Hotchips 25 event confirmed the Xbone had 2 ACEs, but didn't confirm queue counts.
I wonder where people got the 16-queue number from too, because it seems like they just looked at the fact that the PS4 has 8 queues per pipeline, applied it to the Xbox One, and came up with 16.

I guess we have to wait for the EuroGamer article about Xbox One GPGPU.
 

ethomaz

Member
Mar 19, 2013
31,199
15,564
1,025
38
Brazil
I wonder where people got the 16-queue number from too, because it seems like they just looked at the fact that the PS4 has 8 queues per pipeline, applied it to the Xbox One, and came up with 16.

I guess we have to wait for the EuroGamer article about Xbox One GPGPU.
Well, GCN 1.0 has 2 x 2 (2 ACEs with 2 queues each).
 

TheLegendary

Member
Oct 12, 2007
15,008
0
0
My IGN days were over a decade ago.... Plus I was only a teenager back then.

Everyone has biases and preferences. This is only natural. Mine even temporarily switched to the 360 at one point when I started buying many of my multiplatform games on the system. But I would say irrespective of personal preference I've always kept it real and honest. Using facts, evidence and/or grounded logic to form opinions and conclusions instead of letting my preferences dictate them.

That is the key differentiator in what makes such discussions appreciable or worth any merit, and why despite any personal preferences I may have, I have not made the same mistakes or been lambasted in the same way as some others have on this particular forum. It is also the reason I'm still here posting today.

Have your preferences, that's fine, but try to refrain from letting them dictate you forming arguments based on intellectual dishonesty.
I remember you from IGN and you're still the same as you were there; you just hide it better because you have to here. It's a shame, but it is what it is.
 

nib95

Banned
Feb 26, 2007
34,618
2
0
I remember you from IGN and you're still the same as you were there; you just hide it better because you have to here. It's a shame, but it is what it is.
Hide what better? Preferences? I have never hidden them. On the contrary, I've acknowledged that it's only natural to have them. Like I said earlier, preferences are perfectly acceptable and normal, so long as you don't let them cloud your judgement or evoke intellectual dishonesty in your arguments.

I think part of the issue is certain people have aligned themselves so rigidly to a particular side that they are unable to comprehend anything that puts their favoured product in any form of negative light, to offer balanced, evidenced or logic based argument, or to criticise anything related to their side.

I mean, this whole persecution complex and attack thing has really gotten out of hand; even you yourself have, a few times now, dropped by a thread to regurgitate this exact same attack instead of actually offering anything additive to the discussion at hand. It really is somewhat sad, but at the same time amusing.
 

malboroking

Banned
Oct 13, 2009
1,571
0
0
Seattle
Considering the numbers are unprecedented for ESRAM and it's supposed to be a Microsoft discovery I automatically assume they're lying. It's not the first time they've inflated numbers using crappy math,
I would take anything Klocker has to say with a huge grain of salt.
 

Biker19

Banned
Apr 1, 2013
6,623
0
0
I don't see how that's the case. All I'm saying is that the PS4 has a significant advantage and that people shouldn't ignore or rationalize it.

It's fine if you want a console for other reasons, but you're just not doing yourself (or others) a favor if you keep ignoring or rationalizing these differences. You might end up disappointed.
At the very least, you will be incredibly annoying when partaking in important internet arguments, like this one.
I think it's a little disingenuous to say the X1 is any less of a gaming machine than the 360 was if we're basing it off specs.
This is an absurd statement.

The device wouldn't have nearly the power it does if it wasn't designed as a game machine first and foremost.

Maybe you're thinking of an Apple TV. Or a tablet. IDK.
See what EventHorizon's saying in his post from another thread here?

That's what I'm talking about.
 

KidBeta

Junior Member
Aug 22, 2012
1,501
0
0
I wonder where people got the 16-queue number from too, because it seems like they just looked at the fact that the PS4 has 8 queues per pipeline, applied it to the Xbox One, and came up with 16.

I guess we have to wait for the EuroGamer article about Xbox One GPGPU.
I don't see the XB1 matching the PS4 on GPGPU; a lot of what Sony changed seems to be about improving GPGPU, especially when working with the CPU.

I don't know, but I believe his information is true.
It's a presentation from last year; I really doubt it's changed.

In regards to SenjutsuSage's quote, whilst I don't doubt it's real, it doesn't bring anything of substance. The main problem is that it doesn't describe _WHY_ it happens or which parts are being maxed out / can't be used to increase power, and this seems wrong to me.
 

klee123

Member
Dec 18, 2005
6,664
0
1,135
Why do these people do this to themselves? Why is it so important for them to prove the PS4 isn't more powerful than the Xbone that they would go to such lengths as making up inside information, even though the evidence is right in front of them plain as day? For God's sake, devs have even confirmed it.
They refuse to believe that they will get the short end of the stick when it comes to multiplat games.

Had it been the other way around, they'd be using it as ammo for console-war arguments about how multiplats are ALWAYS "better on Xbox".
 

spisho

Neo Member
Aug 17, 2012
235
0
0
The secret sauce was, I think, first referred to by VGLeaks, when they interpreted the PS4 GPU wrongly, thinking of it as a 14 CU unit with a separate 4 CU module where something was "secret" or "hidden". I don't have the link handy but can probably dig it up later.

Aside from Aegies, Leadbetter also tried to apply the secret sauce to the Move Engines:

http://www.eurogamer.net/articles/df-hardware-spec-analysis-durango-vs-orbis
Leadbetter suggesting that move engines are the equivalent of two or three fixed function SPUs based on the performance of compression/decompression, wow. That's quite the stretch. Since he doesn't provide a benchmark I'm guessing he's basing it off SPU graphs from actual games, in which case I'm fairly certain it wasn't two or three dedicated SPUs but a percentage of two or three SPUs across a frame.

It has static lighting, not baked. It features sub-surface scattering for the paint and image based lighting for reflections which are both real time. FM5 is technically impressive on a number of levels.
I haven't seen a detailed technical breakdown of FM5, but from what you've posted it sounds similar to FM4, which featured IBL and pre-baked lightmaps. It's probably also forward-rendered, since it doesn't feature any night racing.
 

artist

Banned
May 7, 2006
16,629
0
0
Leadbetter suggesting that move engines are the equivalent of two or three fixed function SPUs based on the performance of compression/decompression, wow. That's quite the stretch. Since he doesn't provide a benchmark I'm guessing he's basing it off SPU graphs from actual games, in which case I'm fairly certain it wasn't two or three dedicated SPUs but a percentage of two or three SPUs across a frame.
This is exactly why people question his credibility. Any hardware journalist using the term "secret sauce" in a serious tone is all but laughable.