
[Digital Foundry] Watch Dogs Legion: PlayStation 5 vs Xbox Series X|S - Graphics, Performance, Ray Tracing!

Man, that 18 seconds is a far cry from how I understood Cerny's presentation... :messenger_sad_relieved:

(barely any difference, from what I see, both locked to 30fps???):


I love these forums, the amount of pettiness in here always gives me a good chuckle :messenger_tears_of_joy: Someone will reply to this and say "Man, that equal performance on the 'Most Powerful console in the world' is a far cry from what Xbox has been touting this whole time" :messenger_tears_of_joy: I can almost feel the tension from all the anger held inside by the fanboys.
 
Last edited:
Once again....launch hardware on launch titles....there will most likely be a huge patch shortly

don’t know why people are getting worked up over results that are going to change
 

Gudji

Member
This doesn't make it less stupid.

Doesn't change the fact that the performance shown in early games stayed throughout the generation, be it cross-gen or not.

Yes. I can see faster graphics memory bandwidth, sustained system performance and 30% more compute units along with a full RDNA 2.0 feature set. All of this on top of 2 more TFLOPS.
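Just to put rough numbers on that compute claim (napkin math from the publicly quoted CU counts and clocks, so treat it as approximate; the PS5 figure assumes its maximum clock):

```python
# Peak FP32 throughput = CUs x 64 ALUs x 2 ops per clock (FMA) x clock in GHz.
# CU counts and clocks are the publicly quoted specs; PS5's 2.23 GHz is a cap, not a constant.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

xsx = tflops(52, 1.825)  # ~12.15 TF
ps5 = tflops(36, 2.23)   # ~10.28 TF at the maximum clock
print(f"XSX {xsx:.2f} TF vs PS5 {ps5:.2f} TF ({xsx / ps5 - 1:.0%} more compute on paper)")
```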

Hopefully you've seen these too:

 

ethomaz

Banned
Mmm... not sure if I follow. RDNA1 doesn't have RT acceleration hardware; you can dedicate the shader cores to computing intersections and so on, but the performance is horrible. NVIDIA enabled raytracing on the 1000 series and I believe not even the 1080 Ti could produce acceptable results. On RDNA2, RT is part of the CUs. The TMUs are part of the CUs as well.

AMD GCN, RDNA, RDNA2 are all modular.
You can choose which version of each module you want to have... or which version of the CUs you want... you can combine CU version 1.5 with ROPs version 1.3, for example (RDNA is supposed to start at version 2.0).

What I don't know is whether RT is really a separate module or not.
AMD is not really clear about whether it is part of the TMU module or separate.

IMO if RT is its own module, then it makes it easy for AMD to add/remove RT for each GPU they launch in the future.
 
Last edited:

Md Ray

Member
By your logic an RTX 2080 would be faster than a 2080 Ti due to its faster clocks. This is obviously not the case. I also remember the Xbox One's higher clocks vs the PS4 and everybody knows how that turned out....LOL
Despite its 7% higher clocks the XB1 was still at a huge disadvantage. 16 ROPs on XB1 vs 32 on PS4, fewer ACEs, a night-and-day difference in bandwidth, etc. There was just no advantage at all on the GPU side for XB1.

PS5 and XSX GPU differences are nowhere near that. Both have the same 64 ROPs and the same number of ACEs. Computational power is higher on XSX, but elsewhere the PS5 GPU has other advantages.

And the games' performance shows that.

This wasn't the case between PS4 & XB1.
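If anyone wants the napkin math on why that last-gen gap was a different animal: peak pixel fillrate is just ROPs x clock, and these are the publicly quoted specs, so take it as approximate.

```python
# Peak pixel fillrate in Gpixels/s = ROPs x clock in GHz. Publicly quoted specs.
def gpix(rops: int, clock_ghz: float) -> float:
    return rops * clock_ghz

xb1, ps4 = gpix(16, 0.853), gpix(32, 0.800)  # ~13.6 vs ~25.6 Gpix/s
xsx, ps5 = gpix(64, 1.825), gpix(64, 2.230)  # ~116.8 vs ~142.7 Gpix/s
print(f"last gen: PS4 +{ps4 / xb1 - 1:.0%} over XB1; this gen: PS5 +{ps5 / xsx - 1:.0%} over XSX")
```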
 

cebri.one

Member
AMD GCN, RDNA, RDNA2 are all modular.
You can choose which version of each module you want to have... or which version of the CUs you want... you can combine CU version 1.5 with ROPs version 1.3, for example (RDNA is supposed to start at version 2.0).

What I don't know is whether RT is really a separate module or not.
AMD is not really clear about whether it is part of the TMU module or separate.

IMO if RT is its own module, then it makes it easy for AMD to add/remove RT for each GPU they launch in the future.

I believe the RT module is a somewhat customized TMU but I get what you are saying.

I guess we'll never know until we get a closer look at PS5's chips. XB does seem to have the same CU config:

[image: XSX CU configuration diagram]



BTW, the 30% IPC increase is totally wrong. It's a 30% increase in CU performance at the same power level, but IPC, I think, is substantially lower.
 
Last edited:

Lysandros

Member
Despite its 7% higher clocks the XB1 was still at a huge disadvantage. 16 ROPs on XB1 vs 32 on PS4, fewer ACEs, a night-and-day difference in bandwidth, etc. There was just no advantage at all on the GPU side for XB1.

PS5 and XSX GPU differences are nowhere near that. Both have the same 64 ROPs and the same number of ACEs. Computational power is higher on XSX, but elsewhere the PS5 GPU has other advantages.

And the games' performance shows that.

This wasn't the case between PS4 & XB1.
Just as a side note, we do not know the number of ACEs in the PS5's GPU yet.
 

yurinka

Member
Once again....launch hardware on launch titles....there will most likely be a huge patch shortly

don’t know why people are getting worked up over results that are going to change
Things that happen only on one console in a particular game, like tearing in some spot, a missing option in the menus, VRR support at the console level, a 5fps dip that only happens the first time you run a stage, the implementation of some effect, etc., may be fixed with patches in the next few weeks.

In fact, some of those patches are already being released. But they will make the versions even more similar than they already are. Don't expect a 40% boost in a game's performance from a bugfix.

The same goes for new SDK/GDK/OS/devkit updates: they may fix some bugs and maybe help optimize some stuff a bit, but don't expect a big change.
 
Last edited:

Md Ray

Member
Yes. I can see faster graphics memory bandwidth, sustained system performance and 30% more compute units along with a full RDNA 2.0 feature set. All of this on top of 2 more TFLOPS.
Yet the PS5 is outperforming XSX consistently.

What about PS5's 22% higher pixel fillrate, and rasterization rate? Higher cache bandwidth? Cache scrubbers that help improve GPU perf?

Faster mem b/w only for the 10GB segment. The rest is slower than PS5. Full RDNA 2.0 feature set is a purposefully misleading marketing phrase. TF isn't the be-all and end-all, absolute indicator of GPU's perf.
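To spell out the split being referred to (published figures; the blended number is just a naive capacity-weighted average, so treat it as illustrative only):

```python
# Published specs: XSX splits 16 GB into a 10 GB pool at 560 GB/s and a 6 GB pool
# at 336 GB/s; PS5 has a single 16 GB pool at a flat 448 GB/s.
xsx_pools = [(10, 560), (6, 336)]
ps5_bw = 448

# Naive capacity-weighted average; real behaviour depends on which pool the data sits in.
xsx_blended = sum(gb * bw for gb, bw in xsx_pools) / sum(gb for gb, _ in xsx_pools)
print(f"XSX: 560/336 GB/s split, ~{xsx_blended:.0f} GB/s blended; PS5: {ps5_bw} GB/s flat")
```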

Devs have also been saying for months that these two are very close in perf. This is proving to be true.
 
Last edited:

Md Ray

Member
Just as a side note, we do not know the number of ACEs in the PS5's GPU yet.
Didn't they go with 4 ACEs for the PS4 Pro? I'm assuming they retained the same number of ACEs for PS5.

I've seen the specs of the Big Navi 80 CU GPU and it too seems to have 4 ACEs.
 

geordiemp

Member
I believe the RT module is a somewhat customized TMU but I get what you are saying.

I guess we'll never know until we get a closer look at PS5's chips. XB does seem to have the same CU config:

[image: XSX CU configuration diagram]



BTW, the 30% IPC increase is totally wrong. It's a 30% increase in performance at the same power level, but IPC, I think, is substantially lower.

Also note that in RDNA1 and XSX the prim unit and RB are per shader array, while on RDNA2 prim and raster are spread across a shader engine / 2 shader arrays. I believe this arrangement comes from the 4-games-per-server requirement....and it is not the same even if we ignore Infinity Cache.


 
Last edited:

MrFunSocks

Banned
Series S showing how it isn't going to have a problem being a 1080p box.

Seems parity was the name of the day during development and they've done a good job. I'm assuming AF will get patched since it's the same setting but clearly isn't working properly. The RT puddles could be due to fewer CUs but are also most likely just a bug.
 

ZywyPL

Banned
Basically choose whichever bugs you like the most.

I feel like Bugsnax is the "defining" title of the upcoming generation. Like, holy shit, issue after issue, bug after bug. Where are all the guys who complained about gaming on PC now, I ask? I think anyone who won't get his hands on the new consoles this year is actually doing himself a favor; next year the consoles and the games will be able to be labeled as "ready".
 
Wait, there are fanboys here trying to get a win with this game?
We have one bit of missing AF on Xbox, some missing ray tracing in the window reflections of a puddle, and lower PS5 resolution on the robot spider, and there's a win to be found?
Wow.
 

NullZ3r0

Banned
Yet the PS5 is outperforming XSX consistently.
In multiplat launch games. Circular logic. We have no idea which platform was the lead platform. Code optimization matters and I don't believe this game is optimized for any platform.

What about PS5's 22% higher pixel fillrate, and rasterization rate? Higher cache bandwidth? Cache scrubbers that help improve GPU perf?

You mean "up to" 22% higher pixel fillrate. PS5 uses boost clocks so none of your metrics reflect sustained or average performance. No amount of tweaking can make one piece of hardware perform better than another piece of superior hardware. An overclocked RTX 2060 will never run optimized code better than a stock RTX 2080.

Faster mem b/w only for the 10GB segment. The rest is slower than PS5. Full RDNA 2.0 feature set is a purposefully misleading marketing phrase. TF isn't the be-all and end-all, absolute indicator of GPU's perf.

You don't know what you're arguing here. If you wanted to argue that the PS5 has more graphics memory available to it than the Series X, then you have a point. But if this was somehow a counter to my argument that the Series X has faster graphics memory than the PS5, then you're wrong.

Devs have also been saying for months that these two are very close in perf. This is proving to be true.
The PS4 Pro and Xbox One X were "close" in performance as well. But optimized code always ran better on the faster console.

The fact that devs are scrambling to release patches to some of these games proves my point. The code is buggy and unoptimized which is to be expected of launch games that are released on 5 different platforms.
 

huraga

Banned
2000MHz 20 CUs will have better performance than 1000MHz 40 CUs.
That is true for any GPU.
The only exceptions are when there are bandwidth differences between the cases.

I have no idea where you found that claim lol

BTW Series X and PS5 are different architectures.

Guy, again, you are talking about this without knowledge. I've been a computer engineer for 16 years and I'm not a GPU expert, but I know a little bit about CPUs/GPUs, and I know that for a GPU more CUs always beats more MHz because graphics calculations are highly parallelizable. This is why the biggest improvements in GPUs usually come from adding more CUs.

You are talking about and judging the design work of many engineers from your sofa when you have never designed a GPU in your life.

I'm not sure if you are 15 years old or something like that.
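FWIW, the two hypothetical GPUs in that quote come out identical on paper for shader math; the real argument is about everything that scales with clock versus everything that needs more parallelism to stay busy. A rough sketch, assuming both hypothetical parts have the same number of ROPs/rasterizers (these are not real products):

```python
# Hypothetical 20 CU @ 2000 MHz vs 40 CU @ 1000 MHz: same peak ALU throughput.
def tflops(cus: int, clock_mhz: float) -> float:
    return cus * 64 * 2 * clock_mhz / 1e6

narrow_fast = tflops(20, 2000)  # 5.12 TF
wide_slow = tflops(40, 1000)    # 5.12 TF, identical on paper

# Anything tied to clock rather than CU count (rasterizers, ROPs, caches, command
# processing) runs twice as fast on the narrow part, while the wide part needs
# roughly twice as many wavefronts in flight to reach the same peak.
print(narrow_fast, wide_slow)
```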
 

Edgelord79

Gold Member
So basically no discernible difference for people gaming on Xbox Series X or PS5. Unless they are zooming in to 200%...
 
Last edited:

sendit

Member
I agree that Sony has proven to be more efficient with what they have than Xbox so far. It's impressive.

This will continue throughout the generation. Minimal difference at best. The differentiating factor will be which 1st party studio can impress the mass market the most.

It was definitely amusing seeing/reading the comments from the beginning of this year until real-world comparisons started to show up. A lot of salty individuals. Additionally, Sony effectively has a $399 console that can go toe to toe with a $499 console.
 
Last edited:

Iamborghini

Member
It is nice to see Sony worked on their AF implementation in the SDK.
From what I remember a lot of games had AF issues on PS4 because the SDK made it a bit harder to use.
Seems like with the PS5 SDK it is fine, because devs are already using high AF settings.

I think the PS5 version of NBA 2K20 has a very soft anisotropic filter (x4 or x8) instead of the x16 on the Xbox Series X.
 

NullZ3r0

Banned
Wait, there are fanboys here trying to get a win with this game?
We have one bit of missing AF on Xbox, some missing ray tracing in the window reflections of a puddle, and lower PS5 resolution on the robot spider, and there's a win to be found?
Wow.
The desperation is strong and shows that all the walking back after the 13 TFLOP rumors crashed and burned was just fluff. A few months down the line it will be glorious to look back at these threads.
 

sircaw

Banned
After that whole SSD campaign about double the speed, they can only get 8 seconds over the XSX, and in some games it's even slower. Not really much to celebrate there.


Edit: doesn't take much, just some logic, to upset the Sony cult lol


You might not think it's a big deal for some reason, but after a year of bullcrap from Microsoft fanatics, parity + a better controller + better loading times is a massive win.
 

Godfavor

Member
The PS5's 22% uplift in pixel fillrate over SX likely makes up for the lack of 2 TF. 👌🏼

That's a smart GPU design, IMO. Do more with less.
This is not the case; pixel fillrate and rasterization performance are bottlenecked first by memory bandwidth.
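Rough sanity check on that, assuming a plain 4-bytes-per-pixel colour write with no compression (a simplification, since colour compression and caches reduce the real traffic):

```python
# Does raw bandwidth cover the peak fillrate? Assumes uncompressed RGBA8 writes only.
bytes_per_pixel = 4
ps5_fill_gpix = 64 * 2.23     # ~142.7 Gpixels/s peak (64 ROPs at max clock)
ps5_bw_gbs = 448              # GB/s of GDDR6 bandwidth

needed_gbs = ps5_fill_gpix * bytes_per_pixel  # ~571 GB/s just for colour writes
print(f"~{needed_gbs:.0f} GB/s needed vs {ps5_bw_gbs} GB/s available -> peak fill is bandwidth-limited")
```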
 
Last edited:

Md Ray

Member
In multiplat launch games. Circular logic. We have no idea which platform was the lead platform. Code optimization matters and I don't believe this game is optimized for any platform.



You mean "up to" 22% higher pixel fillrate. PS5 uses boost clocks so none of your metrics reflect sustained or average performance. No amount of tweaking can make one piece of hardware perform better than another piece of superior hardware. An overclocked RTX 2060 will never run optimized code better than a stock RTX 2080.



You don't know what you're arguing here. If you wanted to argue that the PS5 has more graphics memory available to it than the Series X, then you have a point. But if this was somehow a counter to my argument that the Series X has faster graphics memory than the PS5, then you're wrong.


The PS4 Pro and Xbox One X were "close" in performance as well. But optimized code always ran better on the faster console.

The fact that devs are scrambling to release patches to some of these games proves my point. The code is buggy and unoptimized which is to be expected of launch games that are released on 5 different platforms.
Parts of the PS5 GPU are faster than the XSX GPU. And more than one game's performance shows this. End of story.

Both will trade blows going forward.

"Launch games" and "code optimizations" are just a bunch of excuses.
 

Concern

Member
You might not think it's a big deal for some reason, but after a year of bullcrap from Microsoft fanatics, parity + a better controller + better loading times is a massive win.


I've called for parity from the beginning. Celebrating issues with dev kits is just as stupid as the people already suggesting MS will be paying devs to sabotage PS ports.

The controller is subjective. I wouldn't use it for a multiplayer game. But I'd definitely rock it on single player.

Loading, considering the specs, is not a win when you have literally double the speed and can barely edge out an 8-second advantage, sometimes less, and sometimes even slower.
 

sircaw

Banned
The desperation is strong and shows that all the walking back after the 13 TFLOP rumors crashed and burned was just fluff. A few months down the line it will be glorious to look back at these threads.

You really sound butt hurt over all of this.

Denying a set of results because they did not go your way is ridiculous.

If Xbox ever wins a comparison in the future, I think everyone should just turn around and say the same thing to you.

Sad and disappointing. You have no honor, little grasshopper.
 

Godfavor

Member
Then what then?

I always thought the high clocks were helping them out quite a bit, but if they don't mean anything then I don't know what else it could be.

It is way easier to fill 36 CUs with work. More mature tools, too, since they are using an upgraded version of the PS4 ones.
On the other hand: driver problems, split memory, and more CUs are harder to keep utilized.
 

geordiemp

Member
Mmm... not sure if I follow. RDNA1 doesn't have RT acceleration hardware; you can dedicate the shader cores to computing intersections and so on, but the performance is horrible. NVIDIA enabled raytracing on the 1000 series and I believe not even the 1080 Ti could produce acceptable results. On RDNA2, RT is part of the CUs. The TMUs are part of the CUs as well.





I do agree that 4 additional CUs per SA is odd and can be detrimental. In the same way, RDNA2 cards as well as the PS5 clock very high and the XSX doesn't, which could be another hint that it is not running in the most optimal config.

In addition, there are two shader arrays with all the CUs active, but the number of rasterizers, RBs and L1 cache is the same. So there could be bottlenecking there; the GPU may have issues distributing loads efficiently as well. We'll see. I guess XSX first party will be optimized to overcome the issues and it will be mostly equal on multiplats.

The bottlenecks also happen in the parameter cache and LDS, the intermediate data stores in the shader array..... Here is some info from Cerny:


[image: slide from Cerny's Road to PS5 presentation]
 
Last edited:

sircaw

Banned
I've called for parity from the beginning. Celebrating issues with dev kits is just as stupid as the people already suggesting MS will be paying devs to sabotage PS ports.

The controller is subjective. I wouldn't use it for a multiplayer game. But I'd definitely rock it on single player.

Loading, considering the specs, is not a win when you have literally double the speed and can barely edge out an 8-second advantage, sometimes less, and sometimes even slower.


But it's still better, and by the reviews the controller is a massive difference. It's a tangible difference the general public will feel, versus not feeling anything as with the Xbox One, especially in a game like this.

If you asked 100 random people whether they would prefer their car with or without power steering, or with or without electric windows, which do you think they would choose?

I am not sure why people are not taking this into consideration; it's a massive difference in how the game plays and feels. If it was a pure FPS I could see your point, but this is not, it's got a gazillion things to do in it besides just shooting.
 
Last edited:
It is way easier to fill 36 CUs with work. More mature tools, too, since they are using an upgraded version of the PS4 ones.
On the other hand: driver problems, split memory, and more CUs are harder to keep utilized.

Well, the higher clock speed does help keep the CUs fed. Not to mention there's probably more cache per CU, which helps as well. Then there are the cache scrubbers, which help out the cache system. Seems like all of that leads to a GPU that is very efficient and easy to work with.
 

Godfavor

Member
Well, the higher clock speed does help keep the CUs fed. Not to mention there's probably more cache per CU, which helps as well. Then there are the cache scrubbers, which help out the cache system. Seems like all of that leads to a GPU that is very efficient and easy to work with.
Cache scrubbers are used for I/O though.
Not sure how much cache the PS5 has.
 
Last edited:

Concern

Member
But it's still better, and by the reviews the controller is a massive difference. It's a tangible difference the general public will feel, versus not feeling anything as with the Xbox One, especially in a game like this.

If you asked 100 random people whether they would prefer their car with or without power steering, or with or without electric windows, which do you think they would choose?

I am not sure why people are not taking this into consideration; it's a massive difference in how the game plays and feels. If it was a pure FPS I could see your point, but this is not, it's got a gazillion things to do in it besides just shooting.


I've played Astrobot with it. I liked it way more than I expected. I just still don't see myself using it for multiplayer games, at least not competitive ones like, say, CoD for example.

But for DS, GoW, etc, I'd use all its features.
 
Cache scrubbers are used for I/O though.
Not sure how much cache the PS5 has.

I thought the cache scrubbers helped the GPU out.


If the ideas are sufficiently specific to what we're trying to accomplish, like the GPU cache scrubbers I was talking about, then they end up being just for us.

Flushing all of the GPU caches whenever the SSD is read is an unattractive option; it could really hurt the GPU performance.
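The way I read those quotes, the scrubbers let the GPU evict only the cache lines that map to the memory being overwritten by an SSD read, instead of dumping every cache. A purely illustrative sketch of the difference (not how the actual hardware is documented to work; all names here are made up):

```python
# Purely illustrative: targeted scrubbing vs a full flush when new data lands from the SSD.
# Caches are modelled as plain dicts of address -> cached line; names are hypothetical.
def full_flush(gpu_caches: list[dict]) -> None:
    for cache in gpu_caches:
        cache.clear()  # every subsequent access misses, even on untouched data

def scrub(gpu_caches: list[dict], overwritten_addresses: set[int]) -> None:
    for cache in gpu_caches:
        for addr in overwritten_addresses:
            cache.pop(addr, None)  # evict only lines backing the overwritten range
```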
 
Last edited:

sircaw

Banned
I've played Astrobot with it. I liked it way more than I expected. I just still don't see myself using it for multiplayer games, at least not competitive ones like, say, CoD for example.

But for DS, GoW, etc, I'd use all its features.

Yep, and that's why my point stands. It's not just a gimmick, it's a whole better way of experiencing the game.

Microsoft will probably emulate it in a few years' time, and then people will be all over it like hot glue.

This is an evolution in gaming terms; it might only be 10-15% more, but it's still more.

People really need to start factoring it in; it's important to the gamer's enjoyment.
 
Last edited:

sendit

Member
The desperation is strong and shows that all the walking back after the 13 TFLOP rumors crashed and burned was just fluff. A few months down the line it will be glorious to look back at these threads.

Agreed. In a few months, the Xbox will surely pull ahead.


 

DForce

NaughtyDog Defense Force
In multiplat launch games. Circular logic. We have no idea which platform was the lead platform. Code optimization matters and I don't believe this game is optimized for any platform.



You mean "up to" 22% higher pixel fillrate. PS5 uses boost clocks so none of your metrics reflect sustained or average performance. No amount of tweaking can make one piece of hardware perform better than another piece of superior hardware. An overclocked RTX 2060 will never run optimized code better than a stock RTX 2080.



You don't know what you're arguing here. If you wanted to argue that the PS5 has more graphics memory available to it than the Series X, then you have a point. But if this was somehow a counter to my argument that the Series X has faster graphics memory than the PS5, then you're wrong.


The PS4 Pro and Xbox One X were "close" in performance as well. But optimized code always ran better on the faster console.

The fact that devs are scrambling to release patches to some of these games proves my point. The code is buggy and unoptimized which is to be expected of launch games that are released on 5 different platforms.


The PS4 Pro and Xbox One X were close in performance? lol

You want people to take you seriously and you post stuff like this.

PS4 Pro had 8GB of GDDR5 RAM vs Xbox One X's 12GB. The GPU difference was about 40%. This means the Xbox One X had the advantage of higher-resolution textures and a higher rendering resolution. You're also suggesting that just because the clocks are not "sustained", the metrics are a non-factor.
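For reference, the mid-gen gap being talked about, napkin math from the published specs (so approximate):

```python
# Mid-gen refresh compute from published specs: CUs x 64 ALUs x 2 ops x clock (GHz).
ps4_pro = 36 * 64 * 2 * 0.911 / 1000  # ~4.2 TF
one_x = 40 * 64 * 2 * 1.172 / 1000    # ~6.0 TF
print(f"One X has {one_x / ps4_pro - 1:.0%} more compute than PS4 Pro")  # ~43%
```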
 