
EuroGamer: More details on the BALANCE of XB1

artist

Banned
What first spawned the 'secret sauce' anyway? What was it rumored to be? I always thought it was supposed to be cloud computing (lol).
The secret sauce was, I think, first referred to by VGLeaks, when they interpreted the PS4 GPU wrongly, thinking of it as a 14 CU unit with a separate 4 CU module where something was "secret" or "hidden". I don't have the link handy but can probably dig it up later.

Aside from Aegies, Leadbetter also tried to use the secret sauce for the Move Engines:
Indeed, many of the functions assigned to the Move Engines - tiling/untiling of textures, lossless decompression, texture decompression - were often hived off to the SPUs on the PlayStation 3. VGLeaks' quoted bandwidth specs - around 25GB/s, a plausible figure - is also ballpark with SPU performance carrying out the same task. So in essence, you could view elements of this "secret sauce" hardware as something like two or three fixed function SPUs available alongside the eight-core AMD processor.
http://www.eurogamer.net/articles/df-hardware-spec-analysis-durango-vs-orbis
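(Side note on what "tiling/untiling of textures" actually involves: below is a toy Morton/Z-order swizzle in Python, purely illustrative - the actual Durango/GCN tiling formats aren't public, and the function name and layout are made up for the example.)

def morton_offset(x, y, bits=16):
    # Interleave the bits of x and y so neighbouring texels land close together in memory
    offset = 0
    for i in range(bits):
        offset |= ((x >> i) & 1) << (2 * i)        # even bit positions come from x
        offset |= ((y >> i) & 1) << (2 * i + 1)    # odd bit positions come from y
    return offset

# The 2x2 block at (0,0)-(1,1) maps to the contiguous offsets [0, 1, 2, 3]
print([morton_offset(x, y) for y in (0, 1) for x in (0, 1)])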
 
This confirms that every Xbox One actually has 14 CUs, because as we know per Albert Penello there is no hardware difference between a consumer Xbox One and a developer Xbox One.

In the future we will see Microsoft's own version of "unlocking the last SPU" debate with this. ;-)
Well technically every PS4 has 20 CUs in it, with 2 reserved for redundancy.
 

Averon

Member
Ekim didn't deserve it IMO.

Ekim vouched for SS and alluded to having "sources". That was a one-two combo he wasn't going to walk away from. He knew the rules, so him getting tagged by Bish--who's been pretty active and adamant about people alluding to sources they don't actually have--was totally justified.
 

beast786

Member
The secret sauce was, I think, first referred to by VGLeaks, when they interpreted the PS4 GPU wrongly, thinking of it as a 14 CU unit with a separate 4 CU module where something was "secret" or "hidden". I don't have the link handy but can probably dig it up later.

Aside from Aegies, Leadbetter also tried to use the secret sauce for the Move Engines:

http://www.eurogamer.net/articles/df-hardware-spec-analysis-durango-vs-orbis


Shocking, knowing who Leadbetter's sources have been. It's amazing to see the same PR phrases - "secret sauce", "cloud", "balance", "bandwidth max", etc. - and how each phrase then turns up in Leadbetter's PR articles.
 

ethomaz

Banned
Well technically every PS4 has 20 CUs in it, with 2 reserved for redundancy.
Is that confirmed?

I mean... redundancy like that is only used when you don't have a mature manufacturing process for your chip... I can see that for the Xbone because of the die size, eSRAM and additional co-processors (DMAs, audio, etc.), but for a simpler chip like the PS4's I can't see them doing that.

Nobody will waste silicon creating redundancy without having some production issue to fix.
 

shaolinx1

Banned
That was a great article. Apparently RAM situation looking better for X1 than PS4.

But the best part I have to say was the comment made by my buddy "YouSmellLikeDogBuns"

"Edge Online does article with "Anonymous Sources" - "Gospel truth! Can't question it! Thank you based Edge!"

Eurogamer does properly researched article full of actual journalism and reaches out to Microsoft to get comment, like an actual journalists do - "BAAAAAW LEADBETTERS BIASED! STOP DOING JOURNALISM. PANDER TO ME AND TELL ME HOW MUCH PS4 IS BETTER. I NEED MY PURCHASE VALIDATED"

This is why you get a shi**y games media, you ignorant cretins."
 

Guymelef

Member
Shocking, knowing who Leadbetter's sources have been. It's amazing to see the same PR phrases - "secret sauce", "cloud", "balance", "bandwidth max", etc. - and how each phrase then turns up in Leadbetter's PR articles.

"You want free clicks? Follow this rules Richy..."
 
Who are you guys going to argue with now?

Didn't take long for one to appear.

That was a great article. Apparently RAM situation looking better for X1 than PS4.

But the best part I have to say was the comment made by my buddy "YouSmellLikeDogBuns"

"Edge Online does article with "Anonymous Sources" - "Gospel truth! Can't question it! Thank you based Edge!"

Eurogamer does properly researched article full of actual journalism and reaches out to Microsoft to get comment, like an actual journalists do - "BAAAAAW LEADBETTERS BIASED! STOP DOING JOURNALISM. PANDER TO ME AND TELL ME HOW MUCH PS4 IS BETTER. I NEED MY PURCHASE VALIDATED"

This is why you get a shi**y games media, you ignorant cretins."
 
"Edge Online does article with "Anonymous Sources" - "Gospel truth! Can't question it! Thank you based Edge!"

Eurogamer does properly researched article full of actual journalism and reaches out to Microsoft to get comment, like an actual journalists do - "BAAAAAW LEADBETTERS BIASED! STOP DOING JOURNALISM. PANDER TO ME AND TELL ME HOW MUCH PS4 IS BETTER. I NEED MY PURCHASE VALIDATED"

lol, shouldn't that be for the people paying more for an inferior product?
 

prwxv3

Member
That was a great article. Apparently RAM situation looking better for X1 than PS4.

But the best part I have to say was the comment made by my buddy "YouSmellLikeDogBuns"

"Edge Online does article with "Anonymous Sources" - "Gospel truth! Can't question it! Thank you based Edge!"

Eurogamer does properly researched article full of actual journalism and reaches out to Microsoft to get comment, like an actual journalists do - "BAAAAAW LEADBETTERS BIASED! STOP DOING JOURNALISM. PANDER TO ME AND TELL ME HOW MUCH PS4 IS BETTER. I NEED MY PURCHASE VALIDATED"

This is why you get a shi**y games media, you ignorant cretins."

What a meltdown.
 
That was a great article. Apparently RAM situation looking better for X1 than PS4.

But the best part I have to say was the comment made by my buddy "YouSmellLikeDogBuns"

"Edge Online does article with "Anonymous Sources" - "Gospel truth! Can't question it! Thank you based Edge!"

Eurogamer does properly researched article full of actual journalism and reaches out to Microsoft to get comment, like an actual journalists do - "BAAAAAW LEADBETTERS BIASED! STOP DOING JOURNALISM. PANDER TO ME AND TELL ME HOW MUCH PS4 IS BETTER. I NEED MY PURCHASE VALIDATED"

This is why you get a shi**y games media, you ignorant cretins."

What are you blathering on about? Edge has been ahead of the curve on the next-gen info. Eurogamer has done a fair share of reporting. Leadbetter has taken a few knocks due to math issues and being quick to report some information that has been proven incorrect. An example of this would be the 7 minute DVR limitation on the PS4 that didn't exist.
 

nib95

Banned
That was a great article. Apparently RAM situation looking better for X1 than PS4.

But the best part I have to say was the comment made by my buddy "YouSmellLikeDogBuns"

"Edge Online does article with "Anonymous Sources" - "Gospel truth! Can't question it! Thank you based Edge!"

Eurogamer does properly researched article full of actual journalism and reaches out to Microsoft to get comment, like an actual journalists do - "BAAAAAW LEADBETTERS BIASED! STOP DOING JOURNALISM. PANDER TO ME AND TELL ME HOW MUCH PS4 IS BETTER. I NEED MY PURCHASE VALIDATED"

This is why you get a shi**y games media, you ignorant cretins."

Genuinely laughing out loud. Who'd have guessed a new casualty would come so soon?
 
That was a great article. Apparently RAM situation looking better for X1 than PS4.

But the best part I have to say was the comment made by my buddy "YouSmellLikeDogBuns"

"Edge Online does article with "Anonymous Sources" - "Gospel truth! Can't question it! Thank you based Edge!"

Eurogamer does properly researched article full of actual journalism and reaches out to Microsoft to get comment, like an actual journalists do - "BAAAAAW LEADBETTERS BIASED! STOP DOING JOURNALISM. PANDER TO ME AND TELL ME HOW MUCH PS4 IS BETTER. I NEED MY PURCHASE VALIDATED"

This is why you get a shi**y games media, you ignorant cretins.


Your "friend" sure writes funny comedy. His satire of what a uber-flaming fanboy would say after reading legitimate criticisms is spot-on.
 
That was a great article. Apparently RAM situation looking better for X1 than PS4.

But the best part I have to say was the comment made by my buddy "YouSmellLikeDogBuns"

"Edge Online does article with "Anonymous Sources" - "Gospel truth! Can't question it! Thank you based Edge!"

Eurogamer does properly researched article full of actual journalism and reaches out to Microsoft to get comment, like an actual journalists do - "BAAAAAW LEADBETTERS BIASED! STOP DOING JOURNALISM. PANDER TO ME AND TELL ME HOW MUCH PS4 IS BETTER. I NEED MY PURCHASE VALIDATED"

This is why you get a shi**y games media, you ignorant cretins."

The problem is that Leadbetter chooses to be an active participant in Microsoft's attempt to re-write the basic concepts of GPU power and gaming performance. He chose to pass along everything they had to say unchallenged. That is not journalism. That is evangelism.

Before the introduction of the PS4 and XB1, we never saw debates about whether a 7770 was equal to a 7850. Yet here we have Microsoft attempting to start these types of debates, and Leadbetter is a willing participant. It's ridiculous to watch.
 

viveks86

Member
Hadn't checked this thread in a few hours! RIP! Looks like we can finally get back on topic!

 

CLEEK

Member
Just to clarify, nobody is suggesting that there's a 14 CU GPU, and then some 4CU GPU off by itself somewhere.

The PS4 has one GPU with 18 full Compute Units. The 14 and 4 thing is simply a recommendation by Sony to developers on what they see as an optimal balance between graphics and compute. It was never suggested, at least not by me, that devs are forced to use it this way. They're free to use all 18 on graphics, 16 on graphics with 2 on compute, 5 on graphics with 13 on compute. It's all up to devs. Sony just gave a recommendation based on what they observed taking place after 14 CUs for graphics specific tasks. The extra resources are there, there's a point of diminishing returns for graphics specific operations, so they suggest using all the extra ALU resources for compute, rather than using them for graphics and possibly not getting the full bang for your buck. That's how I see it, and probably the last I'll say on the subject.

Don't want to kick a man when he's down, but either you or I have a misunderstanding of how GPGPU works on AMD GPUs. And I think it's you!

My understanding is this:

Graphical rendering and GPGPU work together. It's not one or the other. Devs don't partition off CUs for compute or rendering.

In an AMD GPU there are a number of Asynchronous Compute Engines (ACE) that manage compute tasks. The GPU grinds away doing its normal graphical rendering, but by the nature of GPUs, they have loads of free cycles each frame due to the latencies in the rendering pipeline. The ACEs then use the free cycles to get the GPU to do GPGPU tasks. To have a constant feed of compute tasks, they are split up into small packets - Sony refer to this as 'fine grain' - which are queued up. These requests run out-of-order, again, managed by the ACEs.

The PS4 has 8 ACEs and a total of 64 compute queues. Each ACE works through the tasks in its queues. The more CUs the GPU has, the more spare cycles there are that can be used for GPGPU, and the more compute requests it can process per frame.

So there is no split, and as far as I'm aware, no direct control over how many CUs are used for GPGPU. The ACEs manage this, not the developers. So there is no magical 14 CUs for rendering and 4 for GPGPU, as that isn't how GCN tech works with compute. Devs do not reserve anything; they just load up the compute queues and let the ACEs run the show. They will tune how much compute can be done, and when in the frame it occurs, based on how many free cycles they get per frame from their rendering pipeline.

As per Cerny:

"If you look at the portion of the GPU available to compute throughout the frame, it varies dramatically from instant to instant. For example, something like opaque shadow map rendering doesn't even use a pixel shader, it’s entirely done by vertex shaders and the rasterization hardware -- so graphics aren't using most of the 1.8 teraflops of ALU available in the CUs. Times like that during the game frame are an opportunity to say, 'Okay, all that compute you wanted to do, turn it up to 11 now.'"

Sounds great -- but how do you handle doing that? "There are some very simple controls where on the graphics side, from the graphics command buffer, you can crank up or down the compute," Cerny said. "The question becomes, looking at each phase of rendering and the load it places on the various GPU units, what amount and style of compute can be run efficiently during that phase?"

Everything I've said here holds true for the Xbone too. It's just far more limited in its GPGPU capabilities, due to only having 2 ACEs, 8 queues (or 2, or 4, or 16 - no one knows for sure) and 12 CUs to process compute tasks on.
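A toy Python sketch of that queueing idea (not real GPU code, and every number in it is invented): the ACEs just keep slotting small compute packets into whatever ALU time the renderer leaves idle each frame.

frame_cycles = 1000                          # pretend cycle budget per frame
render_busy_fraction = 0.7                   # fraction of the frame the CUs spend rendering
compute_queue = [12, 30, 8, 25, 40, 15, 20]  # small "fine grain" compute packets, costed in cycles

spare = int(frame_cycles * (1 - render_busy_fraction))
completed = 0
while compute_queue and spare >= compute_queue[0]:
    spare -= compute_queue.pop(0)            # packet absorbed by otherwise-idle cycles
    completed += 1

print(f"{completed} compute packets finished this frame, {spare} spare cycles left over")

More CUs (or a less ALU-bound renderer) means more spare cycles and more packets retired per frame - which is all the "extra CUs help compute" argument amounts to.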
 

Teletraan1

Banned
Who are these "sources" that keep getting people banned? They're either very good at convincing people they're legitimate, or the people referring to them are too easily led astray? Maybe confirmation bias or something?

Maybe they are being Manti Te'o'd or Chris Anderson'd by fake insiders; the internet is a fucked up place. But from the contents of their posts I feel like they are probably just shills taking it to the extreme.
 

Skeff

Member
That was a great article. Apparently RAM situation looking better for X1 than PS4.

But the best part I have to say was the comment made by my buddy "YouSmellLikeDogBuns"

"Edge Online does article with "Anonymous Sources" - "Gospel truth! Can't question it! Thank you based Edge!"

Eurogamer does properly researched article full of actual journalism and reaches out to Microsoft to get comment, like an actual journalists do - "BAAAAAW LEADBETTERS BIASED! STOP DOING JOURNALISM. PANDER TO ME AND TELL ME HOW MUCH PS4 IS BETTER. I NEED MY PURCHASE VALIDATED"

This is why you get a shi**y games media, you ignorant cretins."

I like that the biggest irony in all of this is that the article this thread is based around only happened because of NeoGAF - the discussion that was had with Albert Penello and the request for more details from his Technical Fellow. The Albert Penello miracle equations led to this article. Good job GAF; shame there were no tough follow-up questions asked.
 

JaggedSac

Member
Everything I've said here holds true for the Xbone too. It's just far more limited in its GPGPU capabilities, due to only having 2 ACEs, 8 queues and 12 CUs to process compute tasks on.

You left out the Bone's cloud capabilities which offset the lack of CUs.






Let's get this party started up again.
 

CLEEK

Member
You left out the Bone's cloud capabilities which offset the lack of CUs.

Let's get this party started up again.

:)

If you have a series of time-critical compute tasks that need to be processed per frame, offloading them over the internet to Azure servers is not feasible. The round trip has far, far too much latency in it.

Cloud processing will not be used for low latency tasks, like graphical rendering or physics. It has its uses, but not for GPGPU tasks.
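The arithmetic behind that is straightforward; the round-trip figure below is an assumption for illustration only, since real latencies vary wildly by connection.

assumed_round_trip_ms = 60.0                 # optimistic consumer internet round trip (assumed)
for fps in (30, 60):
    frame_budget_ms = 1000.0 / fps
    verdict = "fits" if assumed_round_trip_ms < frame_budget_ms else "blows the frame budget"
    print(f"{fps} fps -> {frame_budget_ms:.1f} ms per frame; a {assumed_round_trip_ms:.0f} ms round trip {verdict}")

At 60 fps you have about 16.7 ms per frame, at 30 fps about 33.3 ms, so even an optimistic round trip can't return results inside the frame that requested them.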
 

artist

Banned
Trying to get the discussion back on topic: I went around mapping out the GCN family GPUs and their resources, to see if there were any balanced (or unbalanced) GPUs;

[Image: GCN GPU spec/performance comparison chart]


From the above you can see that prim rate doesn't really affect overall performance that much - see how the 7790 has more triangle output than the 7850; same goes for the 7870 over the 7950/7970.

Second, an excess of pixel fill (ROPs) isn't of tangible benefit either - the 7870 has 25% more pixel fill than the 7950, yet that doesn't translate well in games because it's lacking in other areas - compute/texel fill.

These are the two areas that Microsoft decided to strengthen by going with the upclock, and the two areas they gave away were texel fill and compute...

How again is this a more balanced design? Oh yeah, those numbers they ran on current titles, ones which will never be made public. ¬_¬

*Performance source
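For anyone wondering where numbers like these come from, they're just the standard theoretical-rate formulas applied to the public reference specs (the clocks/unit counts below are the reference figures, so treat the output as approximate):

def rates(shaders, tmus, rops, clock_ghz):
    return {
        "gflops": 2 * shaders * clock_ghz,     # FMA counts as 2 ops per ALU per clock
        "gtexels_per_s": tmus * clock_ghz,     # texel fill rate
        "gpixels_per_s": rops * clock_ghz,     # pixel fill rate
    }

print("7870:", rates(1280, 80, 32, 1.000))     # 32.0 Gpixel/s
print("7950:", rates(1792, 112, 32, 0.800))    # 25.6 Gpixel/s - the ~25% pixel fill gap mentioned above,
                                               # while the 7950 still leads on compute and texel rate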
 
And that's the end of that chapter.


Anyway, I don't think anyone would have taken issue with the idea that:
  • For any GPU...
  • depending upon the workload...
  • the benefit of more ALU resources may or may not scale linearly beyond a certain point...
  • and the PS4 is no different in this regard...
  • and an example use case based on a "typical game" may suggest that it may be more worthwhile to use the equivalent of 4 of the CUs for GPGPU compute, since there'll be diminishing returns from the non-linear increase in performance, but this again isn't a rule and will vary from case to case (see the toy model after these lists)...

Instead, the claim seemingly made was that:
  • Specifically for the PS4 GPU, which is somehow different....
  • regardless of the workload...
  • the benefit of more ALU resources always ceases to be linear at the point of 14 CUs as a rule...
  • and the PS4 is special in this regard...
  • where after 14 CUs there's some hugely significant drop off in return on resources for graphics, and 14 is always the magic number.
....
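A toy model of the first list's point (all numbers invented, nothing to do with Sony's actual profiling data): if a frame has some fixed, non-ALU-bound cost plus an ALU-bound part that scales with CU count, the "knee" in the returns curve depends entirely on that split, not on any magic CU number.

def frame_time_ms(cus, alu_bound_ms_at_one_cu=100.0, fixed_ms=10.0):
    # fixed_ms stands in for the work that doesn't scale with ALUs (rasterisation, bandwidth, setup...)
    return fixed_ms + alu_bound_ms_at_one_cu / cus

for cus in (12, 14, 18):
    print(f"{cus} CUs: {frame_time_ms(cus):.2f} ms")

Change fixed_ms or the ALU-bound share and the point of diminishing returns moves with it - which is exactly why a universal "14 CUs and no further" rule makes no sense.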
 

nib95

Banned
Trying to get the discussion back on topic: I went around mapping out the GCN family GPUs and their resources, to see if there were any balanced (or unbalanced) GPUs;

[Image: GCN GPU spec/performance comparison chart]


From the above you can see that prim rate doesn't really affect overall performance that much - see how the 7790 has more triangle output than the 7850; same goes for the 7870 over the 7950/7970.

Second, an excess of pixel fill (ROPs) isn't of tangible benefit either - the 7870 has 25% more pixel fill than the 7950, yet that doesn't translate well in games because it's lacking in other areas - compute/texel fill.

These are the two areas that Microsoft decided to strengthen by going with the upclock, and the two areas they gave away were texel fill and compute...

How again is this a more balanced design? Oh yeah, those numbers they ran on current titles, ones which will never be made public. ¬_¬

*Performance source

Damn dude, nice work. Pretty intriguing results.
 

CLEEK

Member
Final point for anyone who thinks the Xbone is more balanced: just because the company that makes something tells you this doesn't mean it's in any way true.

[Image: Fox News "Fair & Balanced" logo]
 

onQ123

Member
"Just like our friends we're based on the Sea Islands family"


I didn't pay attention to this before, but Sea Islands is the HD 8000 family of GPUs,

so that kills the Bonaire HD 7790 speculation.
 

USC-fan

Banned
Trying to get the discussion back on topic: I went around mapping out the GCN family GPUs and their resources, to see if there were any balanced (or unbalanced) GPUs;

[Image: GCN GPU spec/performance comparison chart]


From the above you can see that prim rate doesn't really affect overall performance that much - see how the 7790 has more triangle output than the 7850; same goes for the 7870 over the 7950/7970.

Second, an excess of pixel fill (ROPs) isn't of tangible benefit either - the 7870 has 25% more pixel fill than the 7950, yet that doesn't translate well in games because it's lacking in other areas - compute/texel fill.

These are the two areas that Microsoft decided to strengthen by going with the upclock, and the two areas they gave away were texel fill and compute...

How again is this a more balanced design? Oh yeah, those numbers they ran on current titles, ones which will never be made public. ¬_¬

*Performance source

Can you add ps4 and xbone to that chart?
 
Interesting development. I guess he couldn't prove his claims after all.


Good. He was becoming intolerable. There are plenty here excited for the Xbone, but this dude was little by little melting down and becoming more like a long-winded GameFAQs poster with every post. If it isn't a perma, let's hope he is put on time out till after the consoles are released.


Edit: Sorry, only just saw the warning.
 

Coiote

Member
So many corpses littering the ground on spec threads. Someone needs to do a proper bodycount when next gen finally launches.
 
Pretty clear where this is heading and why it was posted.

MS says they targeted 12 CUs plus an overclock because it outperforms 14 CUs. They spin the old story about the 14CU+4CU split for PS4 to show the Xbone is more powerful since it's more balanced, because we now have "info" that came straight from "Sony" that after 14 CUs there are diminishing returns, so it's a "waste" to use them for gfx. Of course the source is unknown and most likely at least 6 months old.

Now we have "the Xbone can outperform the PS4 GPU because it has the perfect number of CUs and it's overclocked". What about the 4 CUs for compute? Well, the Xbone has 15 special processors that offload CPU tasks, plus the CPU is now overclocked!

LOL

So easy to see right through this stuff. Complete PR bullshit. Back in real life the PS4 has a massive advantage in ALUs [50%+], ROPs [100%+], TMUs [50%+], usable RAM per frame [100%+] and more advanced GPU compute support.

I would be shocked if a single game ran better on xbone than PS4.
Missed this before, but putting on my tinfoil hat for a moment, that really does seem to have been the whole point of this. How Machiavellian.

But doesn't the argument essentially backfire... in that if games are going to start using 400 GFLOPS of the GPU for non-graphical tasks, then where exactly are they going to get those on the XB1?
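For reference, a quick sanity check of those raw-unit percentages and of where the ~400 GFLOPS figure comes from, using the widely reported unit counts and clocks (approximate; raw unit counts ignore the Xbone's slightly higher clock, which narrows the FLOPS gap to roughly 40%):

ps4 = {"alus": 18 * 64, "rops": 32, "tmus": 72, "clock_ghz": 0.800}   # widely reported PS4 GPU figures
xb1 = {"alus": 12 * 64, "rops": 16, "tmus": 48, "clock_ghz": 0.853}   # widely reported XB1 GPU figures

for unit in ("alus", "rops", "tmus"):
    print(f"{unit}: PS4 has {100 * (ps4[unit] / xb1[unit] - 1):.0f}% more")

# 4 CUs' worth of compute on PS4: 2 ops x (4 x 64) ALUs x 0.8 GHz
print("4 CUs =", 2 * 4 * 64 * ps4["clock_ghz"], "GFLOPS")              # ~410 GFLOPS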
It is linear, at least where it's relevant.

[Image: performance vs. ALU count scaling graph]


This scaling-beyond-14-CUs nonsense is getting laughable.
You should probably do a graph that treats the ALUs as continuous data rather than discrete if you want to model the relationship between the two?

EDIT: although the new graph probably makes the old one redundant anyway.
 

Cidd

Member
The PS4 is clearly the result of a game developer being given the driving seat to design and customize a gaming console. It's such a straightforward design that I find it odd that some think it's unbalanced somehow.
 