
[Digital Foundry] Watch Dogs Legion: PlayStation 5 vs Xbox Series X|S - Graphics, Performance, Ray Tracing!

But it's still better, and going by the reviews, the controller makes a massive difference. It's a tangible difference the general public will feel, versus not feeling it with the Xbox one, especially in a game like this.

If you asked 100 random people whether they would prefer their car with power steering or without, or with electric windows or without, which one do you think they would choose?

I am not sure why people are not taking this into consideration; it's a massive difference in how the game plays and feels. If it was a pure FPS I could see your point, but this is not; it's got a gazillion things to do besides just shooting.

Because it's a gimmick and a lot of people are turning it off, especially in multiplayer games.
 

Concern

Member
Yep, and that's why my point stands: it's not just a gimmick, it's a whole better way of experiencing the game.

Microsoft will probably emulate it in a few years' time, and then people will be all over it like hot glue.

This is an evolution in gaming terms; it might only be 10-15% more, but it's still more.

People really need to start factoring it in; it's important to gamers' enjoyment.


Idk if MS will emulate it if it's not in from day one. I expect most devs would probably just ignore it unless it's easy to develop for.
 

Md Ray

Member
Not sure if this has anything to do with rasterization. It is part of the IO.
No. It's part of the GPU. When specific data in the GPU caches is no longer needed, the cache scrubbers evict that data in a fine-grained manner instead of flushing the whole cache, which could hurt GPU perf; in other words, it could tank the frame-rate.

Hence the addition of cache scrubbers, so that the frame-rate doesn't tank when stale data is removed from the caches.
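To make the idea concrete, here's a minimal software sketch of scrubbing vs. flushing (illustrative only; the real scrubbers are fixed-function hardware inside the PS5 GPU's caches, not code like this):

```python
# Hypothetical sketch of the cache-scrubber idea: evict only stale lines
# instead of flushing the entire cache.

class Cache:
    def __init__(self):
        self.lines = {}  # address -> cached data

    def flush(self):
        # Coarse approach: discard everything, including lines still in
        # use, forcing the GPU to stall while it re-fetches hot data.
        self.lines.clear()

    def scrub(self, stale_addresses):
        # Fine-grained approach: evict only the lines whose backing
        # memory was overwritten (e.g. by the SSD streaming new assets).
        for addr in stale_addresses:
            self.lines.pop(addr, None)

cache = Cache()
cache.lines = {0x100: "old texture", 0x200: "live geometry"}
cache.scrub([0x100])           # only the stale texture line is evicted
assert 0x200 in cache.lines    # hot data stays resident, no refetch stall
```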
 

sircaw

Banned
Because it's a gimmick and a lot of people are turning it off, especially in multiplayer games.

It's not a gimmick. Sony designed it with many game titles in mind; just because you can turn its feature set off does not make it a needless function.

Whether you want a 30 fps mode with all the eye candy or 60 fps with smooth performance, these can also be changed or turned off in a lot of games. Are those gimmicks? No, they are not.

Sony put millions of dollars into their controller research to give what they believe is a better next-gen experience. Just because you don't like it does not mean millions of casual players out there won't.

The controller is a win-win; no amount of trying to rubbish it will change that.
 

Godfavor

Member
No. It's part of the GPU. When specific data in the GPU caches is no longer needed, the cache scrubbers evict that data in a fine-grained manner instead of flushing the whole cache, which could hurt GPU perf; in other words, it could tank the frame-rate.

Hence the addition of cache scrubbers, so that the frame-rate doesn't tank when stale data is removed from the caches.
This is to keep the GPU fed from the IO. I don't believe it has anything to do with fillrate.
 

Warablo

Member
Maybe one day I will be able to play the game and have it actually save. It's astonishing that they haven't released a patch to fix the Series X version. It's literally unplayable because the game won't save.
 

ethomaz

Banned
Digital Foundry have already debunked that in real-world testing.
Never lol

I don't know where you guys get these false stories from.

What DF tried was to simulate a PS5 using an RDNA card whose performance doesn't scale past 2000 MHz, and so it shows no gain in performance.

But hey, RDNA 2 does scale performance up to 2500 MHz or even more.

I even necroed a thread to say I was right about them running those misleading benchmarks lol
 

Godfavor

Member
Nowhere did I mention fillrate in that post. Cache scrubbers help GPU perf. It is a part of the GPU, not a part of IO like you keep saying.

The whole argument is that the PS5 has the fillrate advantage despite having less bandwidth.
 
Nowhere did I mention fillrate in that post. Cache scrubbers help GPU perf. It is a part of the GPU, not a part of IO like you keep saying.

I think there's some confusion here. I was talking about the GPU performance in general and not just rasterization. I'm just trying to better understand why the PS5 GPU appears to be an efficient design.
 
Link? There is no L2-like "infinite" cache in the PS5.
Everything is compressed in RDNA GPUs, from L1 and L2 to GDDR6 (which wasn't the case with GCN).

In previous architectures, AMD introduced delta color compression to reduce bandwidth and save power. The RDNA architecture includes enhanced compression algorithms that will save additional bandwidth. Additionally, the texture mapping units can write compressed color data to the L2 cache and other portions of the memory hierarchy, whereas in earlier architectures, compressed data could only be written back to memory.

To maximize memory bandwidth, Navi also employs lossless color compression between L1, L2, and the local GDDR6 memory.
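For a rough sense of how delta color compression works, here's an illustrative sketch (my own simplification, not AMD's actual DCC algorithm): neighboring pixels in a tile are usually similar, so storing one base color plus small per-pixel deltas needs fewer bits, and the scheme stays lossless by falling back to raw storage when the deltas don't fit.

```python
# Illustrative delta-color-compression sketch (not AMD's real DCC):
# one base value per tile plus small per-pixel deltas.

def compress_tile(pixels):
    base = pixels[0]
    deltas = [p - base for p in pixels]
    # Lossless only if every delta fits the reduced bit budget;
    # otherwise the tile is stored uncompressed.
    if all(-128 <= d <= 127 for d in deltas):
        return ("compressed", base, deltas)
    return ("raw", pixels)

def decompress_tile(tile):
    if tile[0] == "compressed":
        _, base, deltas = tile
        return [base + d for d in deltas]
    return tile[1]

tile = [1000, 1003, 998, 1001]  # neighboring pixels are similar
assert decompress_tile(compress_tile(tile)) == tile  # round-trips losslessly
```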
 

Md Ray

Member
Road to PS5 shows that it is part of the IO though. Watch it again.

And the whole argument is that the PS5 has the fillrate advantage despite having less bandwidth.
Again, it's part of the GPU. They exist in the GPU's caches. It's you who needs to watch it again.
 
DF didn't debunk anything. They concluded by saying they'd need a GPU with more than 40 CUs and more games need to be tested for an accurate picture.
No, they didn't. They ran real-world tests with one GPU with a higher frequency and lower CU count vs. another with a lower frequency and higher CU count. The GPU with the higher CU count performed better.
It is what it is. Your theory isn't matched in real-world applications.
 

sendit

Member
You mean "up to" 22% higher pixel fillrate. PS5 uses boost clocks so none of your metrics reflect sustained or average performance.

You're sounding really desperate right now. It's kind of sad. To put this in perspective, even PC parts use boost clocks, and their advertised theoretical teraflop performance is calculated from those boost clocks.

RTX 3090
Base: 1400 MHz
Boost: 1700 MHz

Radeon 6800 XT
Base: 1825 MHz
Boost: 2250 MHz
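For what it's worth, the advertised teraflop figures really are derived from the boost clock. A quick illustrative calculation (FP32 TFLOPs = 2 ops per cycle x shader count x clock) using the 6800 XT's 4608 shaders:

```python
# Illustrative arithmetic: advertised FP32 TFLOPs are computed from the
# boost clock, not the base clock.

def tflops(shaders, clock_ghz):
    return 2 * shaders * clock_ghz / 1000.0  # 2 FMA ops/cycle; GFLOPs -> TFLOPs

# Radeon 6800 XT: 72 CUs x 64 shaders = 4608 shaders
print(tflops(4608, 1.825))  # at base clock:  ~16.8 TFLOPs
print(tflops(4608, 2.25))   # at boost clock: ~20.7 TFLOPs (the advertised figure)
```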
 

Md Ray

Member
No, they didn't. They ran real-world tests with one GPU with a higher frequency and lower CU count vs. another with a lower frequency and higher CU count. The GPU with the higher CU count performed better.
It is what it is. Your theory isn't matched in real-world applications.
Nope. Watch that vid again and then come back. Rich tested just one game. He then concluded by saying he'd need a GPU with more than 40 CUs and "more testing points" to fully conclude that a higher CU count is better.

Higher frequency is better and the games on PS5 vs SX clearly show PS5 outperforming SX. It is what it is.
 

jroc74

Phone reception is more important to me than human rights
Additionally, Sony effectively has a $399 console that can go toe to toe with a $499 console.
Yeah, this is getting overlooked and dismissed. And it's designed so that it can be built on the exact same production line. Remove the drive, change the cover, and you have the DE.

Amazing job by Cerny, Sony.
 

Godfavor

Member
Road to PS5 showed it in the SSD talk. My bad.
No mention that it helps with rasterization, though.

And yeah, the PS5 has 142.72 Gpix/s fillrate vs SX's 116 Gpix/s.

These are the theoretical max.
L2 compression could help improve rasterization in both consoles though, so it is still limited by memory bandwidth.
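For reference, those fillrate figures follow directly from ROPs x clock (both GPUs have 64 ROPs), which is why they are theoretical peaks only:

```python
# Where the quoted peak-fillrate numbers come from:
# pixel fillrate = ROPs x clock (GHz). Theoretical maxima only.

def gpix_per_s(rops, clock_ghz):
    return rops * clock_ghz

print(gpix_per_s(64, 2.230))  # PS5: ~142.7 Gpix/s (at max boost clock)
print(gpix_per_s(64, 1.825))  # XSX: ~116.8 Gpix/s
```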
 
It's not a gimmick. Sony designed it with many game titles in mind; just because you can turn its feature set off does not make it a needless function.

Whether you want a 30 fps mode with all the eye candy or 60 fps with smooth performance, these can also be changed or turned off in a lot of games. Are those gimmicks? No, they are not.

Sony put millions of dollars into their controller research to give what they believe is a better next-gen experience. Just because you don't like it does not mean millions of casual players out there won't.

The controller is a win-win; no amount of trying to rubbish it will change that.

I actually like the controller; I had a chance to try it a couple of days ago. You asked why people are not taking it more into consideration, and I gave you an answer. It's a nice-to-have addition, but ultimately it's a gimmick. 30 vs 60 fps is a much, much bigger improvement.
 
Never lol

I don't know where you guys get these false stories from.

What DF tried was to simulate a PS5 using an RDNA card whose performance doesn't scale past 2000 MHz, and so it shows no gain in performance.

But hey, RDNA 2 does scale performance up to 2500 MHz or even more.

I even necroed a thread to say I was right about them running those misleading benchmarks lol
It had nothing to do with overclocking a GPU.
They wanted to see if you got more performance with a higher clock and fewer CUs than with lower clocks and more CUs.
They had two GPUs with identical TFLOPs: one with a higher frequency and fewer CUs, and one with lower clocks and more CUs.
Same architecture, and neither was overclocked.
The result was that the GPU with the lower frequency and more CUs had better performance. It was open and shut.
No need for you to bring RDNA 2 into it, as that had nothing to do with it.
The results are the results.
But maybe DF were shilling for MS here. I forgot about that.
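To make that setup concrete, here's the arithmetic with hypothetical numbers (not DF's exact test configs): two GPUs can land on identical paper TFLOPs while trading CU count against clock speed.

```python
# Hypothetical configs (not DF's actual test parts) showing how two GPUs
# can match on paper TFLOPs while trading width for frequency.

def tflops(cus, clock_ghz):
    return 2 * cus * 64 * clock_ghz / 1000.0  # 64 shaders per CU

narrow_fast = tflops(36, 2.0)  # fewer CUs, higher clock -> 9.216 TFLOPs
wide_slow = tflops(40, 1.8)    # more CUs, lower clock   -> 9.216 TFLOPs
assert abs(narrow_fast - wide_slow) < 1e-9  # identical paper specs: any
# benchmark gap between the two is down to width vs. frequency alone.
```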
 

Md Ray

Member
No mention that it helps with rasterization, though.
I didn't say it helps with rasterization though. You seem confused.
These are the theoretical max.
L2 compression could help improve rasterization in both consoles though, so it is still limited by memory bandwidth.
You keep saying "limited by memory bandwidth", but so far nearly every game is performing better on PS5. Shouldn't it be limited by bandwidth, since it has 20% less bandwidth than the SX?
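For context on where that 20% comes from (theoretical peaks; both consoles use 14 Gbps GDDR6, and the SX figure is for its 10 GB fast pool):

```python
# Peak theoretical bandwidth = bus width (bits) / 8 x data rate (Gbps).

def bandwidth_gbs(bus_bits, gbps):
    return bus_bits / 8 * gbps

ps5 = bandwidth_gbs(256, 14)    # 448 GB/s (whole 16 GB pool)
xsx = bandwidth_gbs(320, 14)    # 560 GB/s (10 GB fast pool only)
print((ps5 - xsx) / xsx * 100)  # -> -20.0%
```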
 

Godfavor

Member
I didn't say it helps with rasterization though. You seem confused.

You keep saying "limited by memory bandwidth", but so far nearly every game is performing better on PS5. Shouldn't it be limited by bandwidth, since it has 20% less bandwidth than the SX?

Forget it. I was replying to your first post about fillrate/rasterization, both of which are affected by memory bandwidth.

I have explained a few posts back what I believe the bottleneck is.
 
Nope. Watch that vid again and then come back. Rich tested just one game. He then concluded by saying he'd need a GPU with more than 40 CUs and "more testing points" to fully conclude that a higher CU count is better.

Higher frequency is better and the games on PS5 vs SX clearly show PS5 outperforming SX. It is what it is.
Lol. So the results showed that lower frequency and higher CU count had better performance, and your take is that the opposite is true.
You are plain wrong. The facts are the facts.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
At 30 fps it's impossible to know how much extra performance is going to waste.
 

DForce

NaughtyDog Defense Force
No, they didn't. They ran real-world tests with one GPU with a higher frequency and lower CU count vs. another with a lower frequency and higher CU count. The GPU with the higher CU count performed better.
It is what it is. Your theory isn't matched in real-world applications.

"So I'd say that we'd need more testing points and an RDNA card with more than 40CUs to get much more in the way of meaningful data to be honest"

Let's not forget that Richard recently said that the PlayStation 5 is punching above its weight, which suggests it's performing higher than he anticipated.

You can quote Richard and leave out an important part.
 

RoadHazard

Gold Member
Identical except for each seemingly having its own minor visual bugs. And of those, the worse AF on XSX is probably more noticeable.

Anyway, PS5 doing well again. Clearly both could go higher than 30 fps, since they never drop a frame, but we don't know how much higher each could go. A 60 fps mode without RT definitely seems possible.
 

Shmunter

Member
Why can't DF get their videos to run in 4K on YT? Yeah, it's supposed to be a YT issue, but how come VGT and NXG can get their 4K videos uploaded just fine? I feel like something else is going on with the DF channel, because it's been 5 days now and Dirt 5 is still showing 1080p. That's now Dirt 5, WD, and COD comparison videos that have failed to encode in 4K. Meanwhile, the sponsored videos for COD and Godfall are both up in 4K after 2 and 4 days respectively.

If it was just YT to blame, then those sponsored videos should have suffered delays as well, but they haven't. It's quite strange. I find these comparison videos to be useless when they're sub-4K.
Yeah, I was going to say, and the compression looks god-awful too.
 

Leyasu

Banned
Absolutely unbelievable that all hell has broken loose over some launch titles... lol

Do people on either side think that these games represent the best of what is possible on these consoles?
 

sendit

Member
Absolutely unbelievable that all hell has broken loose over some launch titles... lol

Do people on either side think that these games represent the best of what is possible on these consoles?

No one is thinking that. However, the graphical difference between the two consoles in multi-platform titles will be minimal at best. Before these comparison videos started showing up, the narrative was that the XSX would have a clear advantage in multi-platform titles.
 

Leyasu

Banned
No one is thinking that. However, the graphical difference between the two consoles in multi-platform titles will be minimal at best. Before these comparison videos started showing up, the narrative was that the XSX would have a clear advantage in multi-platform titles.
Both will have advantages and disadvantages.

The next-gen Battlefield in 2021 will be interesting.
 

sendit

Member
Both will have advantages and disadvantages.

The next-gen Battlefield in 2021 will be interesting.

They're too close to have any serious graphical advantages. You're setting yourself up for disappointment if you think a multi-platform game such as Battlefield 2021 will change that.
 
For a second I thought this was about those "gaming outlets" using VGChartz to hype up PS5 vs. Xbox One. The fact that it isn't a prominent thread looking through two pages of the forum shows me GAF has improved a lot.
 

phil_t98

#SonyToo
No.
As Alex pointed out from the config files, they have the same value. Something isn't working properly in the Xbox version.
It's like people miss what they don't want to see. He definitely said that, and that the puddles missing on PS5 were likely a bug too.

The funny thing is that people are talking about these first-gen games and whether they're utilising the full power and maxing out the consoles, when nobody knows how much power is being used. They could be using only 50% of what's available. We know games will get massively better visuals by this time next year, when the devs have a better handle on how to use the systems.
 

Humdinger

Member
Just watched the video. So, no differences. He sounded a little mystified as to why the XSX wasn't performing better than the PS5.

Even though it's a draw, I think PS5 is the winner, since it is exceeding expectations ("punching above its weight," as they say). Or, another way of seeing it is XSX is failing to live up to expectations. That 18% TF on-paper advantage is seemingly being nullified by ... something.
 

ReBurn

Gold Member
Just watched the video. So, no differences. He sounded a little mystified as to why the XSX wasn't performing better than the PS5.

Even though it's a draw, I think PS5 is the winner, since it is exceeding expectations ("punching above its weight," as they say). Or, another way of seeing it is XSX is failing to live up to expectations. That 18% TF on-paper advantage is seemingly being nullified by ... something.
It's being nullified by lazy devs

/s
 

Romulus

Member
This thread made me realize that both Sony and MS will have mid-generation refreshes. Neither will allow the other to have a massive advantage for more than a year. Not to mention both the Pro and X1X sold well.
 
Let's not forget that Richard recently said that the PlayStation 5 is punching above its weight, which suggests it's performing higher than he anticipated.

You can quote Richard and leave out an important part.
RDNA has nothing to do with the test. That's just him saying it would be good to get more info on different cards.
That in no way changes the fact that real-world tests showed the opposite of what some here are claiming.

And as for Richard's comments: if you are agreeing with him, you are literally saying the PS5 shouldn't be able to run ACV as well as it does. I think the PS5 can easily run that, so I don't think the PS5 is doing things here that it technically shouldn't be able to.
 

DForce

NaughtyDog Defense Force
RDNA has nothing to do with the test. That's just him saying it would be good to get more info on different cards.
That in no way changes the fact that real-world tests showed the opposite of what some here are claiming.

And as for Richard's comments: if you are agreeing with him, you are literally saying the PS5 shouldn't be able to run ACV as well as it does. I think the PS5 can easily run that, so I don't think the PS5 is doing things here that it technically shouldn't be able to.

It does.

You're ignoring key factors because it doesn't support your case at all.

If there were no benefits, then Mark Cerny wouldn't have gone for such high clocks.
 

JackMcGunns

Member
missing puddles in reflections

Missing items in reflections, whether puddles or anything else, indicate that fewer objects were considered for the ray-tracing pipeline. Since ray tracing can be tuned more or less aggressively, this would be considered a less aggressive approach.
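A hypothetical sketch of that tuning knob (illustrative only; Ubisoft's actual heuristics aren't public): the engine decides which objects even get added to the ray-tracing scene, so a less aggressive setup simply leaves small items like puddles out of reflections.

```python
# Hypothetical illustration (not Ubisoft's actual code): only objects
# passing a relevance cutoff are added to the ray-tracing scene, so a
# higher cutoff means fewer objects show up in reflections.

def build_rt_scene(objects, min_screen_area):
    return [o["name"] for o in objects if o["screen_area"] >= min_screen_area]

objects = [
    {"name": "building", "screen_area": 0.30},
    {"name": "car",      "screen_area": 0.05},
    {"name": "puddle",   "screen_area": 0.01},
]
print(build_rt_scene(objects, 0.02))   # less aggressive RT: puddle culled
print(build_rt_scene(objects, 0.005))  # more inclusive: puddle reflected too
```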
 