
[Digital Foundry] Watch Dogs Legion: PlayStation 5 vs Xbox Series X|S - Graphics, Performance, Ray Tracing!

Bankai

Member
I'm thinking that since the Xbox toolset has to accommodate more platforms than the PS5 toolset, it will be more complicated to use, or it will have constraints built in that the simpler PS5 toolset would not. I mean, the Xbox GDK would have to allow development on the original Xbox One design, the X1X, the XSX, the XSS, and PC, with all its variations. That's a lot of bases to cover. In contrast, the PS5 GDK would only need to facilitate development on the PS5 and PS4. So it would be a considerably simpler process.

Would this be a factor? I'm not a tech guy, so I could be off, just checking my thought process.

Thanks a lot GAF, I just found out I can never-ever read that word again without laughing.
 

Humdinger

Member
Don't be a tool.

To clarify, I'm not saying it's all about the tools. I assume design features have something to do with it, too. I just don't hear much about this aspect, apart from vague references to the tools not being "mature" enough (grow up, tools!).

Presumably, it could impact progress going forward, including progress on improving those tools.

What I mean is, some people are saying Xbox needs time to improve its GDK toolset. As others point out, Sony will be doing the same. Perhaps there is more room for improvement on the Xbox side, I do not know. But it might also be true that progress will be slower with the Xbox tools, since the GDK has to cover all these different platforms, not just one or two as on the PS5 side.

Anyway, just looking to see if my thinking is straight or crooked. As I said, not a tech guy, so I could be completely off.
 

ZywyPL

Banned
I'm thinking that since the Xbox toolset has to accommodate more platforms than the PS5 toolset, it will be more complicated to use, or it will have constraints built in that the simpler PS5 toolset would not.

Keep in mind that PS5 games also have to work on PS4, Pro, and various PC configurations, and sometimes even Stadia, so it applies to both.
 
The PS5 loads the game 8 seconds faster, and this is a cross-generation game that is not optimized to take full advantage of the SSDs and the architecture of these next-generation systems. And 8 seconds is 8 seconds of waiting for the game, so there is no parity; another win for PS5 :messenger_winking: :messenger_beaming:
You didn't get the joke. And it's a good one so you are missing out.
 

Caio

Member
Thanks a lot GAF, I just found out I can never-ever read that word again without laughing.

If someone had told me on July 22nd that I would laugh to tears every time people say or write "Halo Infinite" and, lately, "ToolSet", I would never have believed it, never ever! Now, thanks to GAF, I can't stop laughing, and this is very healthy :)

Thank you guys, you are great :D even Riky ahahahahahah
 

assurdum

Banned
By your logic an RTX 2080 would be faster than a 2080 Ti due to its faster clocks. This is obviously not the case. I also remember the Xbox One's higher clocks vs the PS4, and everybody knows how that turned out....LOL
Jesus Christ. You are comparing GCN and Nvidia architectures, which are not built to push performance through higher frequency. Lol what? Are you happy to spread such ignorance just to push your narrative? Inform yourself better, for fuck's sake. RDNA 2 is all about performance per watt. The overclocking frequency on the new Navi is incredibly high, and yet to some people it means nothing for performance. Meh
 
Last edited:

Tripolygon

Banned
Makes sense...
Near identical. I know some of you are so blinded by fanboyism that you can't understand a simple statement, but I said near identical, meaning they could be a couple of pixels or fps above or below: not identical, but close enough that you can call them identical. Like COD and Assassin's Creed, where PS5 actually holds a slight lead in resolution and framerate over XSX, and DMCSE, where XSX holds a lead in framerate. There is no parity between them, but they are close enough to each other to call both near identical. Does everything need to be explained to you?
 
Last edited:

TBiddy

Member
Near identical. I know some of you are so blinded by fanboyism that you can't understand a simple statement, but I said near identical, meaning they could be a couple of pixels or fps above or below: not identical, but close enough that you can call them identical. Like COD and Assassin's Creed, where PS5 actually holds a slight lead in resolution and framerate over XSX, and DMCSE, where XSX holds a lead in framerate. There is no parity between them, but they are close enough to each other to call both near identical. Does everything need to be explained to you?

That's my point. They are identical, thus there is parity between the two. They have minor differences in terms of performance, but all things considered there is parity... unless I'm misunderstanding the word parity.
 
Last edited:

Tripolygon

Banned
That's my point. They are identical, thus there is parity between the two. They have minor differences in terms of performance, but all things considered there is parity... unless I'm misunderstanding the word parity.
They are not identical, there are slight differences between the two, but from a gameplay perspective they are minute, inconsequential differences. I guess, sure, you can say they are on par with each other.
 
There are reasons you don't just extend the shader array; do you know what they could be? Go look it up.


And I'm guessing adding a shader array is more costly than adding CUs to the existing shader arrays.
 
Last edited:

ethomaz

Banned
Plenty of 5700 cards are sold at 2100MHz. It is not a crazy overclock at all.

Now all you have to do is go and do that benchmark test you have put forward and show us the results.
The onus is on you to show me some benchmarks, done in an open and unbiased way, which go against the results DF got.

It's simple. I'm not trying to say a set of results is wrong and shouldn't be relied on. You are. That's why it's up to you to provide the proof of your claim.
What?

Show me an RX 5700 card being sold at 2100MHz. With air you will reach around 1900MHz... you need a liquid cooler to go over 2000MHz.

The issue lies in performance scaling and the hard clock limit.

RX 5700 XT: 1780MHz
RX 5700 XT OC: 2100MHz (+18% clock)

“With the “EvenMorePower5700XT” registry modification installed, we found 2.1 GHz to be the limit of both our Asus and PowerColor reference cards. We tried a few other tweak profiles but the same 2.1 GHz hard limit on our hardware was observed.”

“For what was about a 7% boost in performance on average you’re looking at an almost 40% increase in power consumption as the GPU power draw increased from just 186 watts right up to a Vega-like 258 watts (!).”


An 18% clock increase for 7% better performance.
RDNA performance doesn't scale well after 1900MHz... you will see basically no gain in performance from 2000MHz to 2100MHz, for example.

But hey DF tried to score a point, no?

That is exactly the opposite of RDNA 2, where performance scales linearly with clock until around 2500MHz.

Edit - Another example.

RX 5700 XT ROG STRIX: 1880MHz vs 1950MHz (+4%) vs 2100MHz (+12%)

Performance gain: 2% and 7%.
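To put those figures side by side, here is a quick back-of-the-envelope sketch (Python, purely illustrative) using only the numbers quoted in this post; "scaling efficiency" below is simply perf gain divided by clock gain, where 1.0 would mean performance rises one-for-one with clock:

```python
# Rough scaling check using the RX 5700 XT figures quoted above (not new measurements).
# "Scaling efficiency" = perf gain / clock gain; 1.0 would be perfect clock scaling.

def pct_gain(base, new):
    return (new - base) / base * 100

cases = [
    # (base MHz, OC MHz, reported perf gain %)
    (1780, 2100, 7),   # reference 5700 XT pushed to its 2.1 GHz hard limit
    (1880, 1950, 2),   # ROG STRIX, mild overclock
    (1880, 2100, 7),   # ROG STRIX, maximum overclock
]

for base, oc, perf in cases:
    clock = pct_gain(base, oc)
    print(f"{base} -> {oc} MHz: clock +{clock:.0f}%, perf +{perf}%, "
          f"scaling efficiency {perf / clock:.2f}")

# Power figures from the same article: 186 W -> 258 W for that ~7% gain.
print(f"power: +{pct_gain(186, 258):.0f}%")
```

Efficiencies well below 1.0 are exactly the point being made here about RDNA past ~1900MHz; the counter-claim about RDNA 2 is that it stays close to 1.0 up to roughly 2500MHz.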

 
Last edited:

TBiddy

Member
They are not identical, there are slight differences between the two, but from a gameplay perspective they are minute, inconsequential differences. I guess, sure, you can say they are on par with each other.

Could be me not being a native English speaker, I guess. No worries. I think we agree, more or less.
 

ethomaz

Banned
If the theory that a given GPU with a higher clock speed would outperform a given GPU with more CUs were true, it would hold up across all GPUs. It would be true of GCN cards, Nvidia cards, and RDNA 1 cards. To say that this isn't true of RDNA 1 cards, but it is true of RDNA 2 cards, just doesn't make sense.
The rule would be the same across both architectures.

I think this has run its course. You aren't going to change my mind, and I'm not going to change yours.
We will have to leave it up to the consoles to show which was more accurate.
It is true only as long as you don't reach the limit of performance/clock scaling in the architecture.

2100MHz is beyond the clock range where you have solid performance/clock scaling for RDNA.
 

Pasedo

Member
It's an interesting scenario when two people are given the same objective but one is given $100 to do it and the other is given near-limitless access to funds. Usually it's the one with the most restrictions who will innovate more and deliver a better end result.
 

GHG

Gold Member
I was responding to someone mentioning loading time. Quick resume is a perfectly reasonable thing to discuss in that context. No need to get your jimmies rustled here.

Ok let's not bother doing any technical analysis or comparisons between load times because of quick resume then.

Thank god for these trump cards.
 

DrBillOA

Banned
Side thought: NeoGAF seems intent on painting this game as anti-Black Lives Matter, and for once they are right??!
 

Humdinger

Member
Keep in mind that PS5 games also have to work on PS4, Pro, and various PC configurations, and sometimes even Stadia, so it applies to both.

Well, with multiplats, because of MS's Play Anywhere prioritization, I assume they'll develop PC versions out of the Xbox GDK, since it's already mandated or baked in for that kit.

I did forget about the Pro, but I'm still seeing it as the Xbox kit having to cover five bases, with significantly more spread and variety, and the PS kit only having to cover three, with a more tightly focused cluster. The Pro and PS4 are closer in similarity than the X1X and the original Xbox One are, plus on the Xbox side you've got the XSS and XSX differences, plus the variability introduced by PC.

I'm thinking all the Play Anywhere stuff and the additional bases to cover make getting the Xbox kit "right" (i.e., tapping the full performance of the XSX) a trickier, more complicated process than getting the PS5 kit right (i.e., tapping the PS5's capabilities).

Again, I'm just suggesting this as one potential factor in the mix, not as the whole story. There are a lot of design decisions that could be playing into it as well.
 
I'm much more likely to buy a PS5 (this time mainly due to GoW) than an Xbox (never owned one), but a performance comparison with locked framerates is something rather special, let alone using DF to judge things.

And... 18 seconds "instant loading" is... pettiness? Eh...
Pointless? For sure. This game loads faster on PS5 vs Series X; is that something? Yeah, sure... but it's not really that big of a difference. I just find the picking and choosing, the compartmentalizing of things to suit your narrative to spite another group of people, very fascinating behavior.
 

TBiddy

Member
Ok let's not bother doing any technical analysis or comparisons between load times because of quick resume then.

Thank god for these trump cards.

Excellent strawman. But of course, you have tons of experience creating those, so it's not surprising.
 

BigLee74

Member
Quick resume is a valid feature to bring up with regards to loading times, no question.

However, until it's bug-free and games don't just drop off your quick resume list, it's not something that can be relied upon.

So enjoy your 8-seconds-quicker loading time on PS5. I'll just scratch my balls a few more times to pass the time!
 
Don't be a tool.

To clarify, I'm not saying it's all about the tools. I assume design features have something to do with it, too. I just don't hear much about this aspect, apart from vague references to the tools not being "mature" enough (grow up, tools!).

Presumably, it could impact progress going forward, including progress on improving those tools.

What I mean is, some people are saying Xbox needs time to improve its GDK toolset. As others point out, Sony will be doing the same. Perhaps there is more room for improvement on the Xbox side, I do not know. But it might also be true that progress will be slower with the Xbox tools, since the GDK has to cover all these different platforms, not just one or two as on the PS5 side.

Anyway, just looking to see if my thinking is straight or crooked. As I said, not a tech guy, so I could be completely off.
It seems fair.

Also, it's not like MS released a completely new IDE/APIs; they are changing how the interaction between the platforms happens (I thought they had done this around the Windows 8 days, with the attempt at having PC/phone/tablet/Xbox as targets). The platform is pretty similar to PCs with its x86 CPU, and the GPU is in the same general family as the latest GPUs released by AMD on both PC and PS5.

The multiple targets could be a hindrance, but it's not like Sony has only one target; they still have the PS4/Pro as well as the 5.
Excellent strawman. But of course, you have tons of experience creating those, so it's not surprising.
The person you responded to was clearly being sarcastic, but he may have a point... what's the difference if the game loads in 16 seconds on initial load on the X against 13 on the PS5, if subsequent loads on the X take 6 seconds or so?

We can still compare the load times to discuss amongst ourselves, but the experience people have of it may not benefit the "winner".

I say this as a huge PS fan.
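Taking the hypothetical 16 s / 13 s / 6 s numbers above at face value (they are illustrations from this thread, not measurements), here is a tiny sketch of what a player would actually wait for across a run of play sessions, assuming Quick Resume works every time and ignoring the drop-off bug mentioned earlier:

```python
# Hypothetical figures from the discussion above, not benchmarks.
SESSIONS = 10        # play sessions between full cold boots (assumption)
PS5_LOAD = 13        # seconds per session (cold load every time)
XSX_COLD = 16        # seconds, first session only
XSX_RESUME = 6       # seconds, Quick Resume sessions (assumed to always work)

ps5_total = PS5_LOAD * SESSIONS
xsx_total = XSX_COLD + XSX_RESUME * (SESSIONS - 1)

print(f"PS5 wait over {SESSIONS} sessions: {ps5_total}s ({ps5_total / SESSIONS:.1f}s average)")
print(f"XSX wait over {SESSIONS} sessions: {xsx_total}s ({xsx_total / SESSIONS:.1f}s average)")
```

Which is the point: the cold-load "winner" and the everyday experience don't have to line up.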
 

llien

Member
Pointless? For sure. This game loads faster on PS5 vs Series X; is that something? Yeah, sure... but it's not really that big of a difference. I just find the picking and choosing, the compartmentalizing of things to suit your narrative to spite another group of people, very fascinating behavior.

You might be seeing narratives that do not exist.
Perhaps focusing on what is actually being said, instead of something that is possibly implied, would be the better approach.
 
You might be seeing narratives that do not exist.
Perhaps focusing on what is actually being said, instead of something that is possibly implied, would be the better approach.
Narrative? What narrative are you talking about? You took that comment way too seriously, dude. You've been here for a while, right? There's a shit ton that is implied.
 

ethomaz

Banned
You might be seeing narratives that do not exist.
Perhaps focusing on what is actually being said, instead of something that is possibly implied, would be the better approach.
There is no narrative at all.

Multiplatform games run better on PS5... that is a fact, not a narrative.
 

Methos#1975

Member
Yes indeed, got to love "console wars" where adults seriously argue about 3 frame differences and 5 sec load time variances as if these are hills worth dying on. Makes me ashamed to even be a gamer.
 

TBiddy

Member
The person you responded to was clearly being sarcastic, but he may have a point... what's the difference if the game loads in 16 seconds on initial load on the X against 13 on the PS5, if subsequent loads on the X take 6 seconds or so?

We can still compare the load times to discuss amongst ourselves, but the experience people have of it may not benefit the "winner".

I say this as a huge PS fan.

Of course we can compare loading times. We can compare anything. But as you say - it won't make any difference in the long run.

I could maybe understand it, if we were talking 10 seconds vs. 2 minutes or something like that.
 

thelastword

Banned
The fuck? No, it's not the end of story just because you say so. The PS5 has an inferior GPU. There's no debating that. We aren't discussing religious beliefs where things are open to interpretation. We're talking physics and math. The PS5's GPU is inferior. It doesn't make it a bad GPU, but it's not as powerful.

Code optimizations matter and time will prove you to be delusional.
It has been proven that a 36 CU GPU clocked 400MHz higher, with other efficiencies like caches, the geometry engine, and a faster interconnect between CPU, memory, and I/O, will be more performant than a regular 56 CU GPU clocked much lower.....

Consoles are not PCs; the custom work is what you want to look at, and you also need to look at how many resources are being utilized from each part.... It's the same way with the CPU. Folks said the Series X CPU is faster; well, if you are only looking at the paper specs, all you see is the 100MHz faster clock without HT, but you also need to be aware that the Series X CPU is already using that extra 100MHz and even more just for the OS, its more resource-hogging API, etc.... So when you see the PS5 performing better in the most CPU-intensive 1080p 120Hz modes, you should not be surprised....


Just looking at specs without analyzing the inner details will always send you off on a tangent.... You need to be realistic and look at how the GPUs would really function in real time to get a better picture of performance..... In the end, better performance, better effects, and better resolution win. If the PS5 has pole position in any two of these, it wins the faceoff, because better effects and higher resolution can always be toned down to achieve better performance..... Yet from what we have seen, PS5 wins in all three categories in the majority of faceoffs so far, and 2/3 in the closer one... which, funnily enough, was already 3/3 better on PS5 before Ubisoft de-patched it on PS5.....
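For reference, the paper spec this whole argument orbits is just CUs times clock: the standard peak-FP32 figure is CUs x 64 shaders per CU x 2 ops per clock x clock speed. Here is a minimal sketch with the publicly stated console figures; it only shows where the headline TFLOPS numbers come from and says nothing about the cache, geometry-engine, or I/O differences the post above argues decide real-world results.

```python
# Peak theoretical FP32 throughput = CUs * 64 shaders/CU * 2 ops/clock * clock (GHz).
# Paper numbers only; says nothing about sustained real-world performance.

def peak_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(f"PS5 (36 CU @ up to 2.23 GHz, variable): {peak_tflops(36, 2.23):.2f} TFLOPS")
print(f"Series X (52 CU @ 1.825 GHz fixed):     {peak_tflops(52, 1.825):.2f} TFLOPS")
```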
 