
EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

EGM1966

Member
I thought the Steam Controller thread might finally put this one to bed... apparently not.

Given how often "power" has been declared to not be a major factor in console sales I have to say it's interesting how a thread about "power" is proving so huge and divisive.
 
I thought the Steam Controller thread might finally put this one to bed... apparently not.

Given how often "power" has been declared to not be a major factor in console sales I have to say it's interesting how a thread about "power" is proving so huge and divisive.


What does the Steam controller have to do with the power difference between the PS4 and Xbone?
 
Actually, the way I see it, it looks like this.

X1, peak 274GB/s average 200GB/s
PS4, peak 176GB/s average 130GB/s

Now the only figure that can possibly be in dispute is the PS4 average figure, as Sony haven't been forthcoming with any test results. But MS state that



If you take the middle of that it's 75%, hence 130GB/s. Now I have heard guys going on about "the PS4 gets 172GB/s", which has its source in a website quoting a director of an indie games company. Now, unless Sony have somehow silently solved the intractable average-vs-peak bandwidth problem that has existed since the year dot, I would suggest that's a misquote or misstatement and he was actually referring to the peak bandwidth of 176GB/s.

Maybe it's 140, or if they actually managed 85% they could get 150, but even that is a long way south of 200.
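For what it's worth, the efficiency arithmetic is easy to sanity-check. A quick sketch in Python; the 70-80% band is the quoted MS external-interface figure, everything else is plain multiplication, not a measured result:

```python
# A sketch of the average-vs-peak arithmetic above. The efficiency band
# is the 70-80% external-interface figure attributed to MS.
PS4_PEAK = 176  # GB/s, GDDR5 theoretical peak

def average_bandwidth(peak_gbs, efficiency):
    """Estimate sustained bandwidth as a fraction of theoretical peak."""
    return peak_gbs * efficiency

for eff in (0.70, 0.75, 0.80, 0.85):
    print(f"{eff:.0%} efficiency -> {average_bandwidth(PS4_PEAK, eff):.0f} GB/s")
```

75% lands at ~132GB/s, which is where the rounded 130GB/s figure comes from, and 85% gives the 150GB/s mentioned below.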

This article (http://archive.arstechnica.com/paedi...latency-1.html) goes into depth on why you never achieve peak for any great length of time.

With regard to the Beyond3D bullshit... hmm, to be honest I think they hold their posters to pretty high standards. They certainly get moody when you try to make a direct comparison between the X1 and the PS4!

This BS has been debunked so many times that the arguments needn't be addressed anymore.

You go on believing whatever you want; it bears no significance in reality.
 
You have absolutely no proof to back those figures up. Your post is full of conjecture and speculation.

Also, curious post history. Heh.

Surely the only number that isn't proven is the 130GB/s average PS4 bandwidth figure, and I say as much in my post. But I also go on to say how I arrive at that number and what assumptions I made to get there. I reference MS saying you tend to run at 70-80 percent efficiency on an external interface, and the linked article explains why you don't get peak all the time.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
Actually, the way I see it, it looks like this.

X1, peak 274GB/s average 200GB/s
PS4, peak 176GB/s average 130GB/s

That's because you are cherry-picking info that confirms your bias. Just because MS "measured" 200GB/s in one game does not imply it averages 200GB/s, or that the figure holds up for more than a few operations or in more than one engine.


Now the only figure that can possibly be in dispute is the PS4 average figure, as Sony haven't been forthcoming with any test results. But MS state that



If you take the middle of that it's 75%, hence 130GB/s. Now I have heard guys going on about "the PS4 gets 172GB/s", which has its source in a website quoting a director of an indie games company. Now, unless Sony have somehow silently solved the intractable average-vs-peak bandwidth problem that has existed since the year dot, I would suggest that's a misquote or misstatement and he was actually referring to the peak bandwidth of 176GB/s.

Maybe it's 140, or if they actually managed 85% they could get 150, but even that is a long way south of 200.

The exact number doesn't really matter; what matters is that all of the data is accessible at that rate. It's enough for 18 CUs according to AMD's own Southern Islands line of cards. How much do 12 CUs need, 200GB/s? No, more like 100GB/s.
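The CU-scaling claim can be sketched with naive linear scaling. This is an assumption for illustration only; real bandwidth demand depends on the workload, not just CU count:

```python
# Naive bandwidth-per-CU scaling, as a rough sanity check of the
# "12 CUs need ~100GB/s, not 200GB/s" claim. Assumes demand scales
# linearly with CU count, which is a simplification.
PS4_BW, PS4_CUS = 176, 18  # GB/s and compute units
XB1_CUS = 12

per_cu = PS4_BW / PS4_CUS       # GB/s per compute unit
xb1_demand = per_cu * XB1_CUS   # scaled estimate for 12 CUs
print(f"~{xb1_demand:.0f} GB/s")
```

That lands around 117GB/s, in the same ballpark as the ~100GB/s figure above.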


This article (http://archive.arstechnica.com/paedi...latency-1.html) goes into depth on why you never achieve peak for any great length of time.

So the GDDR5 in the PS4 doesn't achieve peak, but you have no issue believing Rangers that the eSRAM's 32MB will somehow quadruple the system's aggregate bandwidth?

With regard to the Beyond3D bullshit... hmm, to be honest I think they hold their posters to pretty high standards. They certainly get moody when you try to make a direct comparison between the X1 and the PS4!

Then you are a fool, one of them (my bet is Rangers), or new to B3D. There is no vetting of posters; the only thing they do is try to keep things on topic, but that doesn't keep idiots from posting. So many people there have pretended to be insiders here and got banned, but happily post there without consequence. Hell, eastman posts here, but never tried the "insider" shtick that he does there. Most of the B3D faithful are non-technical MS fans: eastman, Tap In, Rangers, RudeCurve, Blackjedi, AlphaWolf, etc. There are about five members total whose opinion is worth a damn.
 

RoboPlato

I'd be in the dick
Surely the only number that isn't proven is the 130GB/s average PS4 bandwidth figure, and I say as much in my post. But I also go on to say how I arrive at that number and what assumptions I made to get there. I reference MS saying you tend to run at 70-80 percent efficiency on an external interface, and the linked article explains why you don't get peak all the time.
An indie dev was getting 172GB/s out of PS4 months ago. It's going to be operating near its theoretical peak for most devs for the entire generation.
 

EGM1966

Member
What does the Steam controller have to do with the power difference between the PS4 and Xbone?

Nothing - but it felt like everyone had powered into that thread and I figured two sprawling threads couldn't last and the Steam Controller would become the new "everyone post their opinion" thread.

But man this thread just won't die. I've tried to keep up but the amount of figures people are batting back and forth, and how often they differ, is just insane.
 

twobear

sputum-flecked apoplexy
I wonder, given that we know the Xbone is bandwidth-starved, whether we'll often see developers using 720p for multiplatform Xbone versions vs. 1080p on PS4. Even though the (theoretical) performance gap is not 2x, they might decide it's just not worth the effort on the Xbone version. It seems plausible to me that we could see the Xbone version at 720p60 with good AA and the PS4 version at 1080p with perhaps slightly lower IQ and a slightly shakier framerate.
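On the raw pixel counts behind that resolution gap (simple arithmetic, nothing console-specific):

```python
# Pixel counts for the two common target resolutions. 1080p pushes
# 2.25x the pixels of 720p, so even a sub-2x compute gap can drop one
# version down a resolution tier.
p720 = 1280 * 720     # 921,600 pixels
p1080 = 1920 * 1080   # 2,073,600 pixels
print(p1080 / p720)   # 2.25
```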
 

foxbeldin

Member
According to OldSchoolNerd's maths (and Microsoft), the x360 has 256 GB/s eDRAM + 22.4 GB/s GDDR3, adding up to 280GB/s! Bandwidth monster!
Cancel nextgen

Oh good God! That article has created a never ending nightmare.

And people extract only the numbers that serve their story, like the 200GB/s one, forgetting it's a theoretical peak, while MS themselves tell us they peaked at only 133GB/s, in a really exceptional scenario.
 

Doc Evils

Member
If these numbers were true it would be like saying the PS4 has 256x more memory than the Xbone (32MB vs 8192MB). Stop using a tiny cache as the be-all, end-all.

Hasn't there been confirmation that it's more of a scratch pad than an actual cache?
 

twobear

sputum-flecked apoplexy
According to OldSchoolNerd's maths (and Microsoft), the x360 has 256 GB/s eDRAM

Heh, it's only just occurred to me that the 360 had more eDRAM bandwidth than the Xbone.

Where's SenjutsuSage to argue that this was canny design on MS's behalf and not just a tremendous engineering fuckup?
 

CoG

Member
Hasn't there been confirmation that it's more of a scratch pad than an actual cache?

Scratch pad is a better description of it. In that context, the PS4 has up to an 8GB scratch pad with 176GB/s to it from all memory, and the Xbone has a 32MB scratch pad with 68GB/s access to it from main memory.
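One way to make the scratch-pad comparison concrete: how long each pool takes to fill once, end to end, at its stated bandwidth. Back-of-envelope only, using the figures quoted above:

```python
# Time to fill each memory pool once at the bandwidths quoted above.
def fill_time_ms(size_gb, bw_gbs):
    return size_gb / bw_gbs * 1000.0

esram_ms = fill_time_ms(32 / 1024, 68)  # 32MB over the 68GB/s DDR3 path
gddr5_ms = fill_time_ms(8, 176)         # the full 8GB of GDDR5
print(f"X1 eSRAM: {esram_ms:.2f} ms, PS4 pool: {gddr5_ms:.1f} ms")
```

The eSRAM refills in well under a millisecond, which is why it has to be juggled constantly, while the PS4's whole pool is addressable at one flat rate.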
 
That's because you are cherry-picking info that confirms your bias. Just because MS "measured" 200GB/s in one game does not imply it averages 200GB/s, or that the figure holds up for more than a few operations or in more than one engine.

Where did they say one game? They really wouldn't have been very thorough if they only used one game! It's an average utilisation figure. Over an entire second (i.e. producing 30 or 60 frames) it averaged 200GB/s. That's not a few operations, that's an entire second's worth of data! If one engine can do it, I suggest others could too.

It doesn't really matter the exact number, the fact that all the data is accessible by the amount is what matters. It has enough for 18CUs according to AMDs own released line of Southern Island cards. How much is needed by 12CUs, 200GB/s? No, more like 100GB/s.

Yes, but clearly something is going on. MS have measured the GPU/CPU combined getting through 200GB/s. I think the most the CPU is allowed is 20GB/s, so the GPU is getting through a minimum of 180GB/s. It's doing something with that data!
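Put per-frame, the 200GB/s average claim looks like this. Straight division; the 200GB/s is MS's quoted figure, not something verified here:

```python
# Per-frame data traffic implied by a 200GB/s sustained average.
AVG_BW = 200  # GB/s, the figure quoted above

for fps in (30, 60):
    print(f"{fps} fps -> {AVG_BW / fps:.2f} GB of traffic per frame")
```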

So the GDDR5 in the PS4 doesn't achieve peak, but you have no issue believing Rangers that the eSRAM's 32MB will somehow quadruple the systems aggregate bandwidth?
No system RAM ever achieves peak continually. The DDR3 in the X1 is 68GB/s, but they only get 50-55GB/s average out of it. The ESRAM achieves peak more readily because it has a dedicated read bus and a dedicated write bus, but even the write bus has bubbles in it and can only reach 7/8ths of its max.

As I have said before, the size of the ESRAM is not relevant, as it is big enough. This is evidenced by the fact that MS have measured it running at 150GB/s, running real code. They are not doing that for fun; it's being extremely heavily utilised for a reason. That reason being: it's there, it's fast, and it's big enough.
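The dual-bus description above implies a simple combined-peak calculation. A sketch assuming 102.4GB/s per direction (the commonly quoted pre-upclock figure) and the write bus filling only 7/8 of its cycles:

```python
# Combined eSRAM peak from a full-rate read bus plus a write bus that
# only fills 7/8 of its cycles, per the description above. The 102.4
# per-direction figure is an assumption (the usual pre-upclock number).
BUS_PEAK = 102.4  # GB/s per direction

combined = BUS_PEAK + BUS_PEAK * 7 / 8
print(f"{combined:.0f} GB/s")  # 192 GB/s
```

That reproduces the oft-quoted 192GB/s combined theoretical peak, of which the measured 150GB/s would be about 78%.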


Then you are a fool, one of them (my bet is Rangers), or new to B3D. There is no vetting of posters; the only thing they do is try to keep things on topic, but that doesn't keep idiots from posting. So many people there have pretended to be insiders here and got banned, but happily post there without consequence. Hell, eastman posts here, but never tried the "insider" shtick that he does there. Most of the B3D faithful are non-technical MS fans: eastman, Tap In, Rangers, RudeCurve, Blackjedi, AlphaWolf, etc. There are about five members total whose opinion is worth a damn.

Thanks for the personal attack; not really necessary in a technical discussion.

But back to the topic. A few posts up I quoted a few of the guys there to support what I am saying: Shifty Geezer, Gubbi, Silent_Buddha. Do you respect any of them?
 

beast786

Member
 
Oh good God! That article has created a never ending nightmare.

You can't blame MS for doing it though, can you? Those guys have spent years designing and building something they are proud of and probably a bit too deeply attached to, then the internet says it's crap... when they know it isn't. They want to get their side of the story out, which is fair enough I think.

To my mind it can be used to challenge the notion that the PS4 is going to be all over the X1... something I am trying to do here! For my sins...

But anyway, all this has brought on quite a considerable thirst and I'm off for a few pints of Guinness. Laters.
 

Chobel

Member
Where did they say one game? They really wouldn't have been very thorough if they only used one game! It's an average utilisation figure. Over an entire second (i.e. producing 30 or 60 frames) it averaged 200GB/s. That's not a few operations, that's an entire second's worth of data! If one engine can do it, I suggest others could too.

Bullshit!!! There's no way in hell that one game will use 200GB in a second.
 

omonimo

Banned
Where did they say one game? They really wouldn't have been very thorough if they only used one game! It's an average utilisation figure. Over an entire second (i.e. producing 30 or 60 frames) it averaged 200GB/s. That's not a few operations, that's an entire second's worth of data! If one engine can do it, I suggest others could too.



Yes, but clearly something is going on. MS have measured the GPU/CPU combined getting through 200GB/s. I think the most the CPU is allowed is 20GB/s, so the GPU is getting through a minimum of 180GB/s. It's doing something with that data!


No system RAM ever achieves peak continually. The DDR3 in the X1 is 68GB/s, but they only get 50-55GB/s average out of it. The ESRAM achieves peak more readily because it has a dedicated read bus and a dedicated write bus, but even the write bus has bubbles in it and can only reach 7/8ths of its max.

As I have said before, the size of the ESRAM is not relevant, as it is big enough. This is evidenced by the fact that MS have measured it running at 150GB/s, running real code. They are not doing that for fun; it's being extremely heavily utilised for a reason. That reason being: it's there, it's fast, and it's big enough.




Thanks for the personal attack; not really necessary in a technical discussion.

But back to the topic. A few posts up I quoted a few of the guys there to support what I am saying: Shifty Geezer, Gubbi, Silent_Buddha. Do you respect any of them?

Jesus Christ, I can't believe someone here is trying to pass off the idea that there's something concrete about this MS PR bullshit. There is not a single concrete proof of this, come on. They talk of a general optimization; meh, they should either be more specific or stop it. It sounds like the cloud all over again. Stop believing everything without proof, we are not children anymore. "They have measured", Christ. Oh my God, I can't stop shaking my head.
 
B3D is an excellent source for laughs these days. Glad to see juniors linking such excellent comedy here. Saves me the work of actually sifting through B3D's bullshit to get to the fun stuff :p

On that note, we need to pack shit up. No amount of developer commentary citing personal experiences developing on both platforms will ever quell the uprising of "junior"s spouting nonsense and spreading FUD.

Sorry, juniors, I'll take a dev's word over yours all day, every day - even if it's news I don't want to hear.

What type of news don't you want to hear?
 

vcc

Member
Yes, that's because there are two interfaces to the ESRAM, allowing reads and writes to occur simultaneously. It's all in the DF article. I am not making it up!

According to the DF article, the ESRAM is restricted to either reading or writing on each bus at a time, which makes 102GB/s its peak transfer rate. Also according to the article, by playing instruction tetris, which lets them squeeze in some extra writes or reads while certain other instructions happen, they can achieve more than that. They hit 133GB/s while doing alpha transparency blending, an operation that generally inflates bandwidth numbers. Then someone took the absurd step of projecting this very limited trick out to the general case, assuming it could be done every 8/9 clock ticks, to get 198GB/s.

This was before the upclock, so 102GB/s becomes 109GB/s after the upclock and 198GB/s became 208GB/s. But that last number is very, very funny math, and MS has been repeating it. It's the equivalent of claiming a stock Chrysler Caravan can do 0-60 in 2.46 seconds because they dropped one from a crane and clocked it. It has almost no bearing on its real-world performance.
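The upclock part of that math is at least mechanical. Assuming the reported 800MHz to 853MHz GPU bump, bandwidth scales linearly with clock:

```python
# eSRAM bandwidth scales linearly with the GPU clock, so the upclock
# just multiplies every figure by 853/800.
OLD_MHZ, NEW_MHZ = 800, 853

def upclocked(bw_gbs):
    return bw_gbs * NEW_MHZ / OLD_MHZ

print(f"{upclocked(102):.0f} GB/s")  # the oft-quoted ~109 GB/s
```

The contested part is not this scaling but whether the tetris-trick peak being scaled means anything in practice.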
 

omonimo

Banned
According to the DF article, the ESRAM is restricted to either reading or writing on each bus at a time, which makes 102GB/s its peak transfer rate. Also according to the article, by playing instruction tetris, which lets them squeeze in some extra writes or reads while certain other instructions happen, they can achieve more than that. They hit 133GB/s while doing alpha transparency blending, an operation that generally inflates bandwidth numbers. Then someone took the absurd step of projecting this very limited trick out to the general case, assuming it could be done every 8/9 clock ticks, to get 198GB/s.

This was before the upclock, so 102GB/s becomes 109GB/s after the upclock and 198GB/s became 208GB/s. But that last number is very, very funny math, and MS has been repeating it. It's the equivalent of claiming a stock Chrysler Caravan can do 0-60 in 2.46 seconds because they dropped one from a crane and clocked it. It has almost no bearing on its real-world performance.

I couldn't have said it better. Thank you very much.
 

Perkel

Banned
You can't blame MS for doing it though, can you? Those guys have spent years designing and building something they are proud of and probably a bit too deeply attached to, then the internet says it's crap... when they know it isn't. They want to get their side of the story out, which is fair enough I think.

To my mind it can be used to challenge the notion that the PS4 is going to be all over the X1... something I am trying to do here! For my sins...

But anyway, all this has brought on quite a considerable thirst and I'm off for a few pints of Guinness. Laters.


Killer Instinct - 720p
Ryse - 900p
Battlefield 3 - 720p
Forza 5 - 1080p/60

Killzone SF - 1080p/60 in multi
Driveclub - 1080p, targeting 60
Second Son - 1080p
Knack - 1080p
Deep Down - 1080p (???), targeting 60fps


This is the real difference between those two consoles. The PS4 outputs much higher resolutions and better-looking games at the same time.

No amount of special sauce, dGPU, bandwidth write+read combinations, low latency and pure grade-A bullshit can change that.

And no, you can't add up write and read. The theoretical max bandwidth for the Xbone is ~160GB/s. So 270GB/s is just pure bullshit.

You quote Shifty Geezer without even reading what he wrote. He wrote that if MS got 150GB/s then it is possible, especially since it is within the theoretical maximum range. Thing is, we don't trust MS on those numbers, and an artificial test may look very different from average game usage of memory.
 

Pjsprojects

Member
I'm not a tech head and don't understand why everyone goes on about the Xbone/PS4 needing devs to learn the systems to get more out of them. From what I gather they are just PCs built to a budget, so seeing as devs already do PC versions of games that offer better graphics, surely they already know what's what with these new consoles?

With the PS3 I understand it was not a standard design and needed a more skilled dev to get the best from it, and at the time the 360 came out PC gaming was not as advanced as it is now, already being HD.

Also, if these two consoles have around 4-5GB of memory to use, am I right in thinking that includes the graphics memory? I ask this because a PC has main RAM plus a further few gigs on the video card.
 

CoG

Member
I was hoping this was just a delusional fan, but there's so much willfully disingenuous BS here that this man HAS to be getting paid for it. There's no way this is an honest misinterpretation of figures at this point.

It's hard not to be wary when someone pops up out of the blue whose entire post history is defending the Xbox One. Keep in mind that there's probably a backlog of rabid fanboys on these threads in read-only mode chomping at the bit for approval and once they get in they let out their pent up stream of fanboy angst. Up to Bish and crew to weed out the shills from the plain ol' idiots.
 
You cant blame MS for doing it though can you? Those guys have spent years designing and building something they are proud of and probably a bit too deeply attatched to, then the internet says its crap...when they know it isn't. They want to get their side of the story out. Which is fair enough I think.

To my mind it can be used to challenge the notion that the ps4 is going to be all over the x1 ... something I am trying to do here! For my sins....

But anyway all this has brought on quite a considerable thirst and I'm off for a few pints of Guiness. Laters.

Judging by your arguments, I would say you're already a few pints in!

MS was caught with their pants down. Plain and simple. They underestimated a damaged competitor, made some poor functionality and technology decisions, and now will pay the price.

The X1 won't fail, but I would wager it will fall short of what the x360 achieved.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
Where did they say one game? They really wouldn't have been very thorough if they only used one game! It's an average utilisation figure. Over an entire second (i.e. producing 30 or 60 frames) it averaged 200GB/s. That's not a few operations, that's an entire second's worth of data! If one engine can do it, I suggest others could too.

Where did you get this idea, B3D posters? MS never said this. Since an MS engineer gave the "measured" quote, I would assume he was profiling an MS game, like Forza 5. He would not be privy to other games, and certainly not everything in development.


But back to the topic. A few posts up I have quoted a few of the guys there to support what I am saying. Shifty Geezer, Gubbi, Silent_Buddha. Do you repect any of them?

Shifty is a nice fellow, but no developer, and the rest you picked are loyal MS fans. The Buddha fellow is practically an MS employee with his constant spin and defense.
 
Judging by your arguments, I would say you're already a few pints in!

MS was caught with their pants down. Plain and simple. They underestimated a damaged competitor, made some poor functionality and technology decisions, and now will pay the price.

The X1 won't fail, but I would wager it will fall short of what the x360 achieved.

$100 price gap alone will guarantee that much. The power gap is really only relevant to the core and hardcore. Casual fans aren't going to be doing side by side comparisons for anything except a price tag.
 
Amazing how they keep showing up. This might be one of the best threads ever for moderation purposes.

Seeing "we" in such a blatant FUD post... delusional fanboy or terribly transparent shill? YOU DECIDE!
 

HORRORSHØW

Member
They don't die, they multiply. Whenever one gets bished, another steps up to take the mantle, desperately trying to dance the same tired dance.
 
Amazing how they keep showing up. This might be one of the best threads ever for moderation purposes.

Seeing "we" in such a blatant FUD post... delusional fanboy or terribly transparent shill? YOU DECIDE!
Surprised his posts don't start with "hey guys" and end with "thanks Microsoft!"

With a few "I'm getting both" claims thrown in for good measure.

Either way, I thought we were due for another serving of hot, delicious FUD.
 

vcc

Member
$100 price gap alone will guarantee that much. The power gap is really only relevant to the core and hardcore. Casual fans aren't going to be doing side by side comparisons for anything except a price tag.

For the casual audience it's the same idea as horsepower in cars or resolution on DSLRs. It influences their buying decisions even if they don't understand what it means. All they care about is X > Y, and if the price is close they buy X. If X is also $100 cheaper then it becomes a very easy sell.

Nuanced details like the utility of Kinect or the relative quality of the XBL network are harder to convey and market.
 