
Analysis: Observer: System Redux Is Simply Better on PS5 Than Xbox Series X

01011001

Member
Dec 4, 2018
4,661
7,312
510
Ah, recall the hate I got for raising this issue when talking about devs on PS5 and SX, and now I see it has become a conversation piece.

For those who shoot down these kinds of comments with no facts or reasoning behind them: please remember this in future, when it suits you to return to it.

well it became obvious once it was announced that DMC5's ray tracing would come in a later patch (they ended up shipping it at launch anyway, but they obviously had issues)

also it was never hard to believe, since even the Xbox One dev kits and tools were supposedly pretty shit for the longest time
 

AllBizness

I cry about Microsoft. A lot.
Mar 22, 2020
178
306
375
CPU Bottleneck anyone?
More like a RAM bottleneck. Xbox has an additional 16 CUs over the PS5 but can't use all that great hardware due to a lack of efficient RAM. It needs 20 GB or better to have a noticeable improvement over the PS5, imo. This is the reason the PS5 is performing just as well or slightly better with fewer CUs: they're clocked higher, so each is more performant than Xbox's CUs, even though Xbox has more of them. The One X had more RAM than the Pro, and it showed a noticeable improvement in games. That's not the case here: the Series X has the same amount of RAM as the PS5, and they gimped it with two speeds, so on paper it looks like more bandwidth than the PS5 when in actuality it averages about the same.
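For what it's worth, the "averages about the same" claim can be sanity-checked with a naive capacity-weighted average of the two XSX pools (a rough sketch only; real effective bandwidth depends on how much traffic actually lands in each pool):

```python
# Naive capacity-weighted average of the XSX's split memory pools
# vs the PS5's unified pool. Figures are the publicly stated specs.
XSX_POOLS = [(10, 560), (6, 336)]  # (capacity GB, bandwidth GB/s)
PS5_BW = 448                       # GB/s, unified 16 GB

total_gb = sum(gb for gb, _ in XSX_POOLS)
weighted_bw = sum(gb * bw for gb, bw in XSX_POOLS) / total_gb
print(f"XSX weighted average: {weighted_bw:.0f} GB/s vs PS5 {PS5_BW} GB/s")
# -> 476 GB/s vs 448 GB/s, only ~6% apart under this weighting
```

So under this simplistic weighting the two consoles do land in the same ballpark, though a game that keeps its hot data in the 10 GB pool would see closer to the full 560 GB/s.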
 

DynamiteCop!

Member Series S
Mar 3, 2018
5,070
12,567
840
More like a RAM bottleneck. Xbox has an additional 16 CUs over the PS5 but can't use all that great hardware due to a lack of efficient RAM. It needs 20 GB or better to have a noticeable improvement over the PS5, imo. This is the reason the PS5 is performing just as well or slightly better with fewer CUs: they're clocked higher, so each is more performant than Xbox's CUs, even though Xbox has more of them. The One X had more RAM than the Pro, and it showed a noticeable improvement in games. That's not the case here: the Series X has the same amount of RAM as the PS5, and they gimped it with two speeds, so on paper it looks like more bandwidth than the PS5 when in actuality it averages about the same.
No... Just no...
 

NoMoChokeSJ

Banned
Jun 3, 2014
1,575
1,403
580
DMC runs identically on both - you’re talking about differences which are imperceptible during intended use.

The same for ACV and COD:CW.

It’ll be the same for this.

No one will notice any difference even when running side by side. Trying to call it a “win” for either console is just warring.

And throwing bloober under the bus for console war point scoring is infantile.
You were the one saying The Medium is the one game that might not come off as looking worse, since it isn't multiplatform. Which implies that the XSX looks or performs worse in every released multiplatform game. This is factually incorrect.

I agree console warring is silly, but I would argue that suggesting an indie game is some sort of a technical bar for console performance is also infantile.

It's hard to be critical of people console warring while you are throwing grenades yourself.
 
  • Like
Reactions: DarkMage619

BizarroPete

Member
Mar 12, 2010
71
44
745
www.youtube.com
The article says there's no ray tracing; will it be added with this patch?
Based on that tweet it looks like the patch is to improve performance, and they're working on an update to correct problems with ray tracing. I don't know if that means ray tracing isn't enabled, or that it isn't working as planned, but it's probably the latter.
 

01011001

Member
Dec 4, 2018
4,661
7,312
510
Go and watch Mark Cerny's video. TFLOPs isn't the be all and end all of measuring GPU performance.

it is if you compare exactly the same architecture.

and Cerny has no motive to make his system look better than the competitor's of course... none whatsoever
 
Last edited:

01011001

Member
Dec 4, 2018
4,661
7,312
510
It isn't

Hence why Xbox isn't the default performance leader right now

show me an example of a PC benchmark where a lower-TFLOPs card outperforms one of the same architecture with higher TFLOPs

you can clock up all you want. PC tests actually show that a higher-clocked card, reaching the same TFLOP figure as a lower-clocked, wider card, performs worse
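For anyone following along, the headline TFLOPs numbers both sides keep citing fall out of one simple formula (64 shader ALUs per RDNA 2 CU, 2 FLOPs per clock via fused multiply-add); a quick sketch:

```python
# TFLOPs = CUs x 64 ALUs x 2 FLOPs (FMA counts as two) x clock (GHz) / 1000
def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000

ps5 = tflops(36, 2.230)  # ~10.28 TF
xsx = tflops(52, 1.825)  # ~12.15 TF
print(f"PS5 {ps5:.2f} TF, XSX {xsx:.2f} TF")
```

This is why the figure only measures the vector ALUs: every other unit in the GPU drops out of the formula entirely.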
 
Last edited:
  • Like
Reactions: Aladin

Coolwhhip

Member
Aug 26, 2019
3,072
9,792
605
 

DynamiteCop!

Member Series S
Mar 3, 2018
5,070
12,567
840
It isn't

Hence why Xbox isn't the default performance leader right now
show me an example of a PC benchmark where a lower-TFLOPs card outperforms one of the same architecture with higher TFLOPs

you can clock up all you want. PC tests actually show that a higher-clocked card, reaching the same TFLOP figure as a lower-clocked, wider card, performs worse
This is true and it's already been tested; there's a performance disparity. Even with matched TFLOPs, the lower-clocked, higher-CU card comes out a few percentage points ahead of the higher-clocked, lower-CU card.

Frequency is not a replacement for physical hardware.
 
Last edited:
  • Like
Reactions: kuncol02
Jan 16, 2020
3,495
12,066
765
show me an example of a PC benchmark where a lower-TFLOPs card outperforms one of the same architecture with higher TFLOPs

Who gives a shit about PC benchmarks?

Fact is, the PS5 with its 10.28 TFLOPs can outperform the XSX with its 12.1 TFLOPs

Plus

Comparing against the same lineup of desktop GPUs isn't valid, because unlike those, the PS5 and SX GPUs are customised differently.
 
Last edited:

DynamiteCop!

Member Series S
Mar 3, 2018
5,070
12,567
840
Who gives a shit about PC benchmarks?

Fact is, the PS5 with its 10.28 TFLOPs can outperform the XSX with its 12.1 TFLOPs

Plus

Comparing against the same lineup of desktop GPUs isn't valid, because unlike those, the PS5 and SX GPUs are customised differently.
It's all x86 PC hardware, and no it can't. Everything is limited by software, on even footing in the software stack there's no scenario where the PlayStation 5 could actually out-render the Series X.
 
Jan 16, 2020
3,495
12,066
765
Says the guy who has absolutely no idea how RDNA 2 actually performs or scales, we won't know until tomorrow. Who do you think you're talking to lol?

So you admit it's a flawed comparison then? Nice

RDNA2 GPUs are clocked much higher than RDNA1 GPUs. Gee I wonder why?

I believe I'm talking to Member Series S. Care to make any more predictions? Lmao
 
Last edited:
  • Like
Reactions: Md Ray

Md Ray

Member
Nov 12, 2016
1,996
6,631
585
India
show me an example of a PC benchmark where a lower-TFLOPs card outperforms one of the same architecture with higher TFLOPs

you can clock up all you want. PC tests actually show that a higher-clocked card, reaching the same TFLOP figure as a lower-clocked, wider card, performs worse
I had a GTX 980 and a 970 a while ago, and on the 970, setting the frequency as high as it would go (higher than the 980's) resulted in better performance than the 980 more often than not, even though the 970's memory setup was gimped.
there's no scenario where the PlayStation 5 could actually out-render the Series X.
There are. In scenarios where the game/scene is heavy on pixel fillrate, rasterization, or cache bandwidth, the PS5 could out-render the SX. Again, TF isn't the be-all and end-all of measuring GPU perf.
it is if you compare exactly the same architecture.

and Cerny has no motive to make his system look better than the competitor's of course... none whatsoever
You're ignoring other parts of the GPU's perf.



At Hot Chips, in their GPU Evolution slide, MS mentions two more metrics besides TFLOPS and bandwidth: rasterization rate and pixel fillrate. Those are just as relevant as the TFLOPS and bandwidth metrics. The SX GPU has the advantage in the first two, but the PS5 GPU has a 22% advantage in the other two, plus 22% higher cache bandwidth and whatnot.

We're even seeing this in the perf results. They've been generally on par. PS5 pulls ahead in some, XSX pulls ahead elsewhere.
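The split Md Ray describes can be put in numbers: fixed-function units (rasterizer, ROPs, caches, command processor) scale with clock alone, while ALU and texture throughput scale with CU count times clock. A rough sketch using the public clock and CU figures:

```python
# Which metrics favour which console, from the public specs.
PS5_CLK, XSX_CLK = 2.230, 1.825  # GHz
PS5_CUS, XSX_CUS = 36, 52

# Clock-bound throughput (rasterization, pixel fill, cache bandwidth): PS5 edge
clock_adv = PS5_CLK / XSX_CLK - 1
# CUs-x-clock-bound throughput (TFLOPs, texel rate): XSX edge
compute_adv = (XSX_CUS * XSX_CLK) / (PS5_CUS * PS5_CLK) - 1

print(f"PS5 clock-bound advantage: {clock_adv:.0%}")      # ~22%
print(f"XSX compute-bound advantage: {compute_adv:.0%}")  # ~18%
```

Which workload dominates a given scene decides which console pulls ahead, consistent with the mixed results the thread is arguing about.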
 
Last edited:

DynamiteCop!

Member Series S
Mar 3, 2018
5,070
12,567
840
There are. In scenarios where the game/scene is heavy on pixel fillrate, rasterization, or cache bandwidth, the PS5 could out-render the SX. Again, TF isn't the be-all and end-all of measuring GPU perf.
Yes, but pixel fill rates mean very little without the texel rates being accounted for. If you look at the PlayStation 5 and Series X, they bear a near-identical breakdown to the one between the PlayStation 4 Pro and the Xbox One X.

PS5 and Series X



PS4 Pro and One X

 

Md Ray

Member
Nov 12, 2016
1,996
6,631
585
India
On the same architecture it is. He was talking about comparing GCN flops to RDNA flops.
No.

Mark Cerny said:
Here's two possible configurations for a GPU roughly on the level of the PlayStation 4 Pro. This is a thought experiment; don't take these configurations too seriously.

If you just calculate teraflops you get the same number, but actually the performance is noticeably different, because teraflops is defined as the computational capability of the vector ALU.

That's just one part of the GPU; there are a lot of other units, and those other units all run faster when the GPU frequency is higher. At 33% higher frequency, rasterization goes 33% faster, processing the command buffer goes that much faster, the L2 and other caches have that much higher bandwidth, and so on.
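Cerny's thought experiment can be made concrete with two made-up configurations (illustrative numbers only, not real parts):

```python
# Two hypothetical GPUs with identical TFLOPs but different shapes.
def tf(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000  # 64 ALUs/CU, 2 FLOPs per clock (FMA)

narrow = (36, 1.00)  # fewer CUs at a 33% higher clock
wide = (48, 0.75)    # more CUs at a lower clock
assert tf(*narrow) == tf(*wide)  # both come out at 4.608 TF

# But the units that run at GPU clock (rasterizer, command processor,
# caches) are 33% faster on the narrow config:
raster_ratio = narrow[1] / wide[1]
print(f"same {tf(*narrow):.3f} TF, clock-bound units ratio {raster_ratio:.2f}x")
```

Same teraflop number, different real-world performance, which is exactly the point of the quote.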
 

Md Ray

Member
Nov 12, 2016
1,996
6,631
585
India
Yes, but pixel fill rates mean very little without the texel rates being accounted for. If you look at the PlayStation 5 and Series X they bear the near identical breakdown that the PlayStation 4 Pro did to the Xbox One X.

PS5 and Series X



PS4 Pro and One X

No. Texel rate difference between PS5 and XSX is 18%.

Not even close to the 43% difference there is between Pro and One X.

Also, Pro did not have advantages in rasterization rate, cache bandwidth like the PS5 does.
 

Three

Member
Oct 26, 2014
5,358
2,699
610
This is true and it's already been tested; there's a performance disparity. Even with matched TFLOPs, the lower-clocked, higher-CU card comes out a few percentage points ahead of the higher-clocked, lower-CU card.

Frequency is not a replacement for physical hardware.
This matters very little for fixed hardware. Speed is a replacement for more hands on a job. You absolutely can get results with high clocks that match low-clock, high-CU hardware, especially if the game was not created/optimised for the low-clock, high-CU hardware, which can leave CUs idling more often.
 

Aladin

Member
Jul 26, 2020
370
443
280
Ps5 spent its budget on adaptive triggers and faster loading times and LEDs. Xbox spent it on a bigger faster apu.
By what was being reported, the XSX still had a higher cost of manufacturing.
😎✌ pretty confident that 8% advantage in DMC5 will turn into 15-20% in an optimised game.
 

DynamiteCop!

Member Series S
Mar 3, 2018
5,070
12,567
840
No. Texel rate difference between PS5 and XSX is 18%.

Not even close to the 43% difference there is between Pro and One X.

Also, Pro did not have advantages in rasterization rate, cache bandwidth like the PS5 does.
What I'm saying, though, is that they both lean into the same things; while it may not be to the same degree, what they perform better at is the same. And again, Microsoft has a stark memory bandwidth lead.
 

Aladin

Member
Jul 26, 2020
370
443
280
No. Texel rate difference between PS5 and XSX is 18%.

Not even close to the 43% difference there is between Pro and One X.

Also, Pro did not have advantages in rasterization rate, cache bandwidth like the PS5 does.
Increased rasterisation from a higher clock frequency hits a bottleneck pretty quickly. Where do you store the rasterised vector parameters after each cycle?
 
  • LOL
Reactions: luca_29_bg

dvdvideo

Report me for SonyGAF. SonyBots, fanboys > 04/27
Sep 15, 2005
1,577
1,440
1,680
This thread still exists? Why don't you guys wait for a few games that actually matter to compare.
 

Md Ray

Member
Nov 12, 2016
1,996
6,631
585
India
What I'm saying, though, is that they both lean into the same things; while it may not be to the same degree, what they perform better at is the same. And again, Microsoft has a stark memory bandwidth lead.
Increased rasterisation from a higher clock frequency hits a bottleneck pretty quickly. Where do you store the rasterised vector parameters after each cycle?
The point is, those advantages on PS5 will manifest results like this:

I'm not saying PS5 will outperform SX in every game, that'd be delusional.

The results will flip when/if workloads/games rely on SX's advantages, like memory bandwidth and floating-point.

Also, thinking that every game will perform better on SX without acknowledging PS5's advantages would be delusional too.
 

Aladin

Member
Jul 26, 2020
370
443
280
When an AMD RX 570 card is overclocked:
1. The FPS improvement is less than the TFLOPs increase.
2. Stability decreases; the 1% lows become terrible.
Pretty simple. Registers come coupled with compute units; to increase stability they need more cache. There is no replacement for displacement.
 
Last edited:
Jan 29, 2019
5,147
5,360
495
I'm sure this has been mentioned before, but flops mean shit all if you don't have good management and developers to use those teraflops. Whatever the problem is on XSX, the programmers couldn't overcome it.
That's fine, he is an Xbox fan; giving excuses is all they do. I think the Xbox dashboard offers an excuse of the day when you start the machine, ready to spread on forums.
 
  • Like
Reactions: sol_bad

longdi

Ni hao ma, fellow kids?
Jun 7, 2004
7,973
5,084
1,865
Ah, recall the hate I got for raising this issue when talking about devs on PS5 and SX, and now I see it has become a conversation piece.

For those who shoot down these kinds of comments with no facts or reasoning behind them: please remember this in future, when it suits you to return to it.

You also got the news that SX's weaker tools are holding back its potential? Damn son, I hope Andrew sorts it out asap
 

Tschumi

Banned
Jul 4, 2020
2,559
2,966
590
Japan
What resolution is PS5 running with RT?
I know you've probably had a million quotes for this, but I'll just briefly add to them:

I don't think anyone who isn't an Xbox fan actually cares about what resolution the game runs at with RT. I remember a few DF videos in which John L got really jazzed by how good Control looked on his CRT screen even at super low resolutions. I'm happy playing at 1920x1080~ maybe I'll bump up to 1440 in a few more years~ if it looks better, and runs better, at sub-4K, I think that pretty neatly communicates that 4K resolution is, at this stage, a pointless red herring~

And anyway, subsequent comments seem to say that XSX resolution isn't any better than PS5 to boot~ so voila~
 
Last edited:
Jan 29, 2019
5,147
5,360
495
Series X, with 52 CUs and features not yet fully used in software, is not?
This is not how it works. Obviously there could be "tools" problems, but AMD GPUs with 64 CUs have existed before, and x86 CPUs with 8 cores have existed for a long time now; programmers know parallelism by now.

I mean, sure, the Xbox dev kits could make things more complicated than they need to be, but it can't be that bad; this is not alien tech with unforeseeable quirks like the Cell was.
 
  • Like
Reactions: DynamiteCop!

Tripolygon

Member
May 6, 2012
4,301
5,823
1,070
NYC
Yes, but pixel fill rates mean very little without the texel rates being accounted for. If you look at the PlayStation 5 and Series X, they bear a near-identical breakdown to the one between the PlayStation 4 Pro and the Xbox One X.
This is false. The difference between the PS4 Pro and Xbox One X was bigger than the X1X merely having 40% more theoretical TF, or the Pro having a 40% higher pixel fill rate. The X1X outclassed the Pro in literally everything but pixel fill rate. That is not true of the PS5 and XSX.

CPU
Pro - 2.1GHz
XOX - 2.3GHz 9% difference for XOX

PS5 - 8 core 16 threads 3.5GHz
XSX - 8 core 16 threads 3.6GHz 2.6% difference

GPU Teraflop
Pro - 2304 x .900 x 2 = 4.2TF
XOX - 2560 x 1.172 x 2 = 6TF 40% difference for XOX

PS5 = 10.28TF
XSX = 12TF 15% difference

RAM/Bandwidth
Pro 8GB at 217.6 GB/s
XOX 12GB at 326.4 GB/s 40% difference for XOX

PS5 = 16GB @ 448GB/s
XSX = 10GB @ 560GB/s (25% higher than PS5) and 6GB @ 336GB/s (25% lower than PS5)


Triangle rasterization
Pro 4 X .900 = 3.6 Billion triangles/s
XOX 4 X 1.172 = 4.7 Billion triangle/s 26% for XOX

PS5 - 4 x 2.23 = 8.92 BT/s 22% difference for PS5
XSX - 4 x 1.825 = 7.3 BT/s

Culling rate
Pro 8 X .900 = 7.2 BT/s
XOX 8 X 1.172 = 9.2BT/s 24% difference for XOX

PS5 - 8 x 2.23 GHz = 17.84 BT/s 22% difference for PS5
XSX - 8 x 1.825 GHz = 14.6 BT/s

Pixel fill rate
Pro 64 x .900 = 58 GPixel/s 41% difference for Pro
XOX 32 X 1.172 = 38 GPixel/s

PS5 - 64 x 2.23 = 142.72 GPixel/s 22% difference for PS5
XSX - 64 x 1.825 = 116.8 GPixel/s

Texture fill rate
Pro 144 X .900 = 130 GTexel/s
XOX 160 X 1.172 = 188 GTexel/s 36% difference for XOX

PS5 - 4 x 36 x 2.23 = 321.12 GTexel/s
XSX - 4 x 52 x 1.825 = 379.6 GTexel/s 16% difference for XSX


Literally everything was faster on the XOX than on the PS4 Pro except pixel fill rate.

Almost everything is faster on the PS5 except texture fill rate, and the XSX has 15% more TF.

Bonus ray triangle intersection rate

PS5 - 4 x 36 x 2.23 = 321.12 Billion RTI/s
XSX - 4 x 52 x 1.825 = 379.6 Billion RTI/s 16% difference for XSX
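The per-unit rates in the breakdown above can be reproduced from the clocks and the unit counts the post assumes (4 raster units, 64 ROPs, 4 texture units per CU); a quick sketch:

```python
# Rebuilding the headline rates from clock and unit counts
# (unit counts as assumed in the post above).
def rates(cus: int, ghz: float, rops: int = 64, raster_units: int = 4) -> dict:
    return {
        "tflops": cus * 64 * 2 * ghz / 1000,  # 64 ALUs/CU, 2 FLOPs per clock
        "raster_bt_s": raster_units * ghz,    # billion triangles/s
        "pixel_gpx_s": rops * ghz,            # GPixel/s
        "texel_gtx_s": 4 * cus * ghz,         # GTexel/s, 4 texture units per CU
    }

ps5 = rates(36, 2.230)  # 10.28 TF, 8.92 BT/s, 142.72 GPix/s, 321.12 GTex/s
xsx = rates(52, 1.825)  # 12.15 TF, 7.30 BT/s, 116.80 GPix/s, 379.60 GTex/s
```

Note how every clock-only rate favours the PS5 and every CUs-times-clock rate favours the XSX, which is the whole pattern the post is describing.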
 
Last edited: