
[Digital Foundry] Watch Dogs Legion: PlayStation 5 vs Xbox Series X|S - Graphics, Performance, Ray Tracing!

NullZ3r0

Banned
Parts of the PS5 GPU are faster than the XSX GPU. And more than one game's performance shows this. End of story.

Both will trade blows going forward.

"Launch games" and "code optimizations" are just a bunch of excuses.
The fuck? No, it's not the end of the story just because you say so. The PS5 has an inferior GPU. There's no debating that. We aren't discussing religious beliefs where things are open to interpretation. We're talking physics and math. The PS5's GPU is inferior. That doesn't make it a bad GPU, but it's not as powerful.

Code optimizations matter and time will prove you to be delusional.
 

Rentahamster

Rodent Whores
Why can't DF get their videos to run in 4K on YT? Yeah, it's supposed to be a YT issue, but how come VGT and NXG can get their 4K videos uploaded just fine? I feel like something else is going on with the DF channel, because it's been 5 days now, and Dirt 5 is still showing 1080p. That's now Dirt 5, WD, and COD comparison videos that have failed to encode in 4K. Meanwhile, the sponsored videos for COD and Godfall are both up in 4K after 2 and 4 days respectively.

If YT were solely to blame, then those sponsored videos should have suffered delays as well, but they haven't. It's quite strange. I find these comparison videos to be useless when they're sub-4K.

You have to be a premium member to have access to the 4K high quality video.

 
It does.

You're ignoring key factors because it doesn't support your case at all.

If there were no benefits, then Mark Cerny wouldn't have gone for such high clocks.
Stop with the Cerny worship.
It means as much as if an Xbox fanboy said "well, lower clocks and more CUs is better, otherwise MS wouldn't have done it".
Sony had to go with higher clocks because they stuck with a 36CU GPU.
He was trying to sell what he has.
I mean, Cerny said an 825GB SSD size was the logical thing to do as well, so 825 > 1000.

And again, RDNA has nothing to do with it. If the rule was that higher clocks are more performant than more CUs, the results DF got would have followed the rule.
 

Shmunter

Member
Missing items in reflections, whether puddles or anything else, indicate that fewer objects were considered for the ray tracing pipeline. Since ray tracing can be tuned more or less aggressively, this would be considered a less aggressive approach.
Alex Battlestar Galactica claims the ray tracing config files are identical between the consoles, so results should be 1:1.

However, I'm not sure that's an accurate baseline, as games receive patches, meaning it's a moving target.
 

DForce

NaughtyDog Defense Force
Stop with the Cerny worship.
It means as much as if an Xbox fanboy said "well, lower clocks and more CUs is better, otherwise MS wouldn't have done it".
Sony had to go with higher clocks because they stuck with a 36CU GPU.
He was trying to sell what he has.
I mean, Cerny said an 825GB SSD size was the logical thing to do as well, so 825 > 1000.

And again, RDNA has nothing to do with it. If the rule was that higher clocks are more performant than more CUs, the results DF got would have followed the rule.

Still doesn't change the fact that they didn't have to overclock it if it has a negative impact.

They also had to go with 825GB instead of 1TB.

Stick to Twitter. Maybe the Xbox fanboys on there will believe the BS you're posting. lol
 
Still doesn't change the fact that they didn't have to overclock it if it has a negative impact.

They also had to go with 825GB instead of 1TB.

Stick to Twitter. Maybe the Xbox fanboys on there will believe the BS you're posting. lol
Why would I go on Twitter?
All I have done is point out that Digital Foundry has tested the exact thing you claim to be true, and the results go against your claim.
So now I am a fanboy who should go to Twitter?
You are out there claiming that everything Cerny says is gospel, claiming that after three or four launch games it is now fact that the PS5 is more powerful than the XSX. Meanwhile, I am saying that it's best to wait things out before jumping to conclusions, as there is quite a journey to go.
I must be the soy boy version of a console warrior, because I'm hardly warring very well at all.
 
Last edited:

DForce

NaughtyDog Defense Force
Why would I go on Twitter?
All I have done is point out that Digital Foundry has tested the exact thing you claim to be true, and the results go against your claim.
So now I am a fanboy who should go to Twitter?
You are out there claiming that everything Cerny says is gospel, claiming that after three or four launch games it is now fact that the PS5 is more powerful than the XSX. Meanwhile, I am saying that it's best to wait things out before jumping to conclusions, as there is quite a journey to go.
I must be the soy boy version of a console warrior, because I'm hardly warring very well at all.

You left out their conclusion and tried to twist what was reported.

Want to point out where I said PS5 is more powerful?

Lets see it.
 
You left out their conclusion and tried to twist what was reported.

Want to point out where I said PS5 is more powerful?

Lets see it.
The conclusion where he stated it would be good to get more testing done doesn't negate the results they actually got. If he did more testing, there is no guarantee the results wouldn't be the same.
And in his conclusion he doesn't say that the results they got are incorrect. The numbers are the numbers. They are the only facts we have to go on. Unless you can supply other results that show a different outcome, this is the default.
Why you would want to put that together and call me a fanboy, I don't know.
You obviously don't like the results and so are trying to diminish them by throwing in disclaimers and muddying the waters.
The cards weren't overclocked, and Richard's conclusion never cast doubt on the results they got.

I do apologise to you; it wasn't you claiming the PS5 was stronger, it was some other posters.
 
Last edited:

DForce

NaughtyDog Defense Force
The conclusion where he stated it would be good to get more testing done doesn't negate the results they actually got. If he did more testing, there is no guarantee the results wouldn't be the same.
And in his conclusion he doesn't say that the results they got are incorrect. The numbers are the numbers. They are the only facts we have to go on. Unless you can supply other results that show a different outcome, this is the default.
Why you would want to put that together and call me a fanboy, I don't know.
You obviously don't like the results and so are trying to diminish them by throwing in disclaimers and muddying the waters.
The cards weren't overclocked, and Richard's conclusion never cast doubt on the results they got.

I do apologise to you; it wasn't you claiming the PS5 was stronger, it was some other posters.
No, you're twisting his words.

He said he needs RDNA cards for more meaningful data.


Do you know what the word meaningful means? lol

The numbers are without an RDNA 2 card, which is totally different. I don't mind the results; it looks like there's a reason why you left the full quote out.
 
No, you're twisting his words.

He said he needs RDNA cards for more meaningful data.


Do you know what the word meaningful means? lol

The numbers are without an RDNA 2 card, which is totally different. I don't mind the results; it looks like there's a reason why you left the full quote out.
The exact quote doesn't change the results.
If he said something like "however we think these results are a bit misleading as the way the RAM was set up would give an advantage to the lower CU GPU, and so we need to do further testing to see if this isn't the case" then I would agree with you that you can't take anything from their testing.

But that's not the case.
If it was a factual rule that a higher frequency gives more performance than a larger CU setup, then the results would have followed that rule.

Now show me some benchmarks that show me that the results DF got are wrong.

I will leave the ball in your hands to prove your assertion.
 

ethomaz

Banned
No they didn't. They ran real world tests having one GPU with higher frequency and lower CU count vs another with lower frequency and higher CU count. The GPU with the higher CU count performed better.
It is what it is. Your theory isn't matched in real world applications.
That test by DF doesn't exist... you are probably confusing what they did.


It had nothing to do with overclocking a GPU.
They wanted to see if you got more performance with a higher clock on lower CUs than with lower clocks and more CU.
They had two GPUs with identical tflops, one with higher frequency and lower CUs, and one with lower clocks and higher CUs.
Same architecture, and none were overclocked.
The results were that the GPU with lower frequency and higher CUs had better performance. It was open and shut.
No need for you to bring RDNA 2 into it, as that had nothing to do with it.
The results are the results.
But maybe DF were shilling for MS here. I forgot about that.
That was not the test they did lol
They overclocked the RX 5700 to match PS5.

I made a summary for you: https://www.neogaf.com/threads/digital-foundry-ps5-uncovered.1534691/page-19#post-261183531

Their test setup:

RX 5700 @ 2100MHz
RX 5700 XT @ 1900MHz

Very misleading.

If they wanted to make any point, they should have used clocks where RDNA scales, like:

RX 5700 @ 1890MHz
RX 5700 XT @ 1700MHz

At these clocks you will get similar TFs and the 5700 will outperform the 5700 XT.
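For reference, the TF figures implied by both clock/CU pairings can be checked with simple arithmetic (FP32 throughput = CUs × 64 shaders × 2 ops per clock × clock in GHz). This is just a sketch of the math behind the argument, not anything DF published:

```python
# Back-of-the-envelope FP32 throughput for the two test setups.
# Formula: CUs * 64 shaders per CU * 2 ops per clock (FMA) * clock in GHz
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000  # convert GFLOPS to TFLOPS

# DF's reported setup (RX 5700 = 36 CUs, RX 5700 XT = 40 CUs):
print(f"RX 5700    @ 2100MHz: {tflops(36, 2.1):.2f} TF")   # ~9.68 TF
print(f"RX 5700 XT @ 1900MHz: {tflops(40, 1.9):.2f} TF")   # ~9.73 TF

# The alternative clocks proposed above:
print(f"RX 5700    @ 1890MHz: {tflops(36, 1.89):.2f} TF")  # ~8.71 TF
print(f"RX 5700 XT @ 1700MHz: {tflops(40, 1.7):.2f} TF")   # ~8.70 TF
```

Either pairing puts the two cards within roughly 0.5% of each other in raw TF, so both are nominally "equal teraflops" comparisons; the dispute is about which clock range RDNA scales well in.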


Stop with the Cerny worship.
It means as much if a Xbox fanboy says "well lower clocks and more CUs is better otherwise MS wouldn't have done it".
Sony had to go with higher clocks because they stuck with a 36CU GPU.
He was trying to sell what he has.
I mean Cerny said a 825gb SSD size was the logical thing to do as well, so 825>1000.

And again, RDNA has nothing to do with it. If the rule was that higher clocks is more performent than more CUs, the results DF got would have followed the rule.
Well, look at the results... he basically nailed it.

And no, he didn't try to sell it... what he tried to explain was the goal of decreasing the bottlenecks on the PS5, to let the hardware reach more of its potential performance.

He achieved that.

BTW, about the SSD: 825GB was chosen because it matches the 12 channels he wanted, to make it cheaper... he explained he could reach the same speeds with more expensive, faster memory modules, but he chose to reach it with cheaper ones... so 12 channels were used, which made the capacity options 825GB or 1650GB... they obviously chose the cheaper 825GB.
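The arithmetic behind that capacity is easy to verify. A minimal sketch (the 12-channel controller is from Cerny's talk; the 64 GiB of NAND per channel is an assumption for illustration):

```python
# 12-channel flash controller with 64 GiB of NAND per channel (assumed die config).
GIB = 2**30                      # bytes per GiB (binary)
channels = 12
bytes_total = channels * 64 * GIB
print(bytes_total / 1e9)         # ~824.6 decimal GB, marketed as 825 GB
# Doubling the NAND per channel gives the other capacity option, ~1650 GB.
print(2 * bytes_total / 1e9)
```

So the "odd" 825GB number falls straight out of a 12-channel design, versus the 8- or 16-channel layouts that yield round power-of-two capacities like 1TB.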
 
Last edited:

Azurro

Banned
Why would I go on twitter?
All I have done is pointed out that Digital Foundry has tested the exact thing you claim to be true, and the results go against your claim.
So now I am a fanboy who should go to twitter?
You are out there claiming that everything Cerny says is gospel, claiming that after three or four launch games it is now fact that PS5 is more powerful than XSX. Meanwhile I am saying that its best to wait things out before jumping to conclusions, as there is quite a journey to go.
I must be the soy boy version of a console warrior, because I'm hardly waring very well at all.

You are misquoting what Černý mentioned due to console warring. He never said 825 GB was better than 1000 GB; he mentioned that, due to cost constraints, that was the highest storage they could put in the system, but that looking at data on the behaviour of the average user, they hope it is enough. He went with 36 CUs since they have a cost limit and decided he'd rather go narrower and faster than wider but slower.

The GPU isn't necessarily inferior either, the fact it has higher clocks makes it better at rasterization and fillrate. It also has a lot of other features that the Xbox SX doesn't have, such as the Geometry Engine and the cache scrubbers. You also forget that this is a system and the slowest component determines your average performance.

You don't have enough understanding of this so you are getting angry.
 

Lysandros

Member
Didn't they go with 4 ACEs for the PS4 Pro? I'm assuming they retained the same number of ACEs for the PS5.

I've seen the specs of the Big Navi 80 CU GPU and it too seems to have 4 ACEs.
I didn't know that they went with 4 ACEs for the PS4 Pro; what's the source for that?
 

Truespeed

Member
You will see the Series X hardware flex when MS studios' projects, purpose-built from the ground up for the X, start dropping. The only thing holding the hardware back is third parties lacking the incentive to take the time to optimize for the X specifically, rather than taking a one-size-fits-all approach and hoping the X can just brute-force through the sometimes clumsy concessions made to accommodate as many platforms as possible.

So the only company capable of harnessing the XSX's true power is Microsoft by creating XSX exclusive games that can't be compared against? That's like playing Jeopardy by yourself and declaring you won.
 
You are misquoting what Černý mentioned due to console warring. He never said 825 GB was better than 1000 GB; he mentioned that, due to cost constraints, that was the highest storage they could put in the system, but that looking at data on the behaviour of the average user, they hope it is enough. He went with 36 CUs since they have a cost limit and decided he'd rather go narrower and faster than wider but slower.

The GPU isn't necessarily inferior either, the fact it has higher clocks makes it better at rasterization and fillrate. It also has a lot of other features that the Xbox SX doesn't have, such as the Geometry Engine and the cache scrubbers. You also forget that this is a system and the slowest component determines your average performance.

You don't have enough understanding of this so you are getting angry.
Why would I be angry? We are just talking about games consoles on a forum. There's nothing personal about it. It's just a hobby.
We can disagree on things; that's fine.
I didn't say anything about Cerny for console warring.
I was making the point that you can't follow anyone, Cerny included, but also people like Phil Spencer, as if they are demigods only speaking the full facts. They are both trying to sell their systems, and they will word things in a way that makes their system seem better.

Again, don't fall for the sales pitches.
The Geometry Engine is an AMD function that goes back to GCN GPUs, as do Primitive Shaders.

"Turning to graphics, the Polaris architecture enhances the geometry engines and tremendously improves both performance and energy efficiency of the rasterization stage. Polaris-based GPUs have 1-4 geometry engines, depending on overall performance targets (e.g. the Radeon™ RX 460 GPU has two, while the Radeon™ RX 480 GPU has four). The screen space is partitioned to load balance between the geometry engines, which can each rasterize a triangle per clock. The Polaris geometry engines use a new filtering algorithm to more efficiently discard primitives. As figure 5 illustrates, it is common that small or very thin triangles do not intersect any pixels on the screen and therefore cannot influence the rendered scene. The new geometry engines will detect such triangles and automatically discard them prior to rasterization, which saves energy by reducing wasted work and freeing up the geometry engines to rasterize triangles which will impact the scene. The new filtering algorithm can improve performance by up to 3.5X (fig. 6), and the benefits are more pronounced in scenes with many polygons."

The Geometry Engine was also found in RDNA 1 cards.

" The 5700 integrates 40 RDNA compute units, a multi-level cache comprising an L2, L1, and L0, a geometry engine, 64 pixel units, and 4 asynchronous compute engines. One of the main design goals that AMD doubled down on was higher frequencies at lower power across all those components."
 

Truespeed

Member
So, Sony was able to improve the RDNA 2 IPC by 18% with customizations? Or Microsoft didn't take full advantage of the design by limiting the clock to 1.8GHz.
I really don't know; I just think it's too early to call who is best. But yes, I believe that Sony can match the XSX, though I'd rather wait a bit more.

There's a reason why they're clocked at 1.8GHz, and that's to hit their yield target of 52 working CUs.
 

Lysandros

Member
Let's not forget that Richard recently said that the PlayStation 5 is punching above its weight, which suggests it's performing better than he anticipated.

You can quote Richard and leave out an important part.
But Richard doesn't seem to have a single clue as to why the PS5 is performing 'exactly like it should', since his comprehension is frozen at CU count and 'partial' RAM bandwidth. That seems to be the case for the rest of the DF crew. They are pushing this narrative hard in nearly every comparison video. Relevant/essential GPU metrics such as rasterization, fill rate and cache bandwidth are within the realm of the mysterious/speculative in their world, and specific hardware customizations like cache scrubbers simply don't exist.
 

DForce

NaughtyDog Defense Force
The exact quote doesn't change the results.
If he said something like "however we think these results are a bit misleading as the way the RAM was set up would give an advantage to the lower CU GPU, and so we need to do further testing to see if this isn't the case" then I would agree with you that you can't take anything from their testing.

But that's not the case.
If it was a factual rule that a higher frequency gives more performance than a larger CU setup, then the results would have followed that rule.

Now show me some benchmarks that show me that the results DF got are wrong.

I will leave the ball in your hands to prove your assertion.

You're trying too hard and it's not working.

They said it's not meaningful. If those results were 100% concrete, then they wouldn't bother doing any more testing.

I'm not saying DF's results are wrong; they clearly said they need more meaningful data.

Look up that word and maybe it will help you understand what he was talking about.
 

DForce

NaughtyDog Defense Force
But Richard doesn't seem to have a single clue as to why the PS5 is performing 'exactly like it should', since his comprehension is frozen at CU count and 'partial' RAM bandwidth. That seems to be the case for the rest of the DF crew. They are pushing this narrative hard in nearly every comparison video. Relevant/essential GPU metrics such as rasterization, fill rate and cache bandwidth are within the realm of the mysterious/speculative in their world, and specific hardware customizations like cache scrubbers simply don't exist.

Yes, they can only run tests based on what they have. This is why they were surprised at the results when the PlayStation 5 comparisons were released.

This is also custom hardware, which an RDNA 2 GPU cannot fully imitate.

We can only conclude that Cerny is right based on the comparisons we have seen so far.
 

ReBurn

Gold Member
So the only company capable of harnessing the XSX's true power is Microsoft by creating XSX exclusive games that can't be compared against? That's like playing Jeopardy by yourself and declaring you won.
It's not possible to know whether the issues with a game are due to hardware capabilities or developer capabilities. Comparing games on different platforms tells us that the games are different but it doesn't tell us why. In a lot of ways it does come down to how first party games perform to know the truth.

On paper the PS3 was clearly the more powerful system and Sony first party consistently unlocked that power. Naughty Dog showed us what the PS3 was truly capable of, but third party developers rarely produced anything of the same quality as first party studios. Consistently, for the first few years anyway, Xbox 360 outperformed it on multiplats. A lot of it does come down to cross-platform engines and ability to optimize them for the hardware, but all we can do is count grass.

When we finally get some first party games from Microsoft, the full story will be told. If those suffer from the same frame rate and resolution drops that we see from the multiplats currently being compared, we'll know what's happening. Until then we just have to go with what we see: that the XSX is doing a bit worse at multiplats, and that for now the PS5 is probably the best way to play most of them if you don't have a capable PC.
 
That test by DF doesn't exist... you are probably confusing what they did.



That was not the test they did lol
They overclocked the RX 5700 to match PS5.

I made a summary for you: https://www.neogaf.com/threads/digital-foundry-ps5-uncovered.1534691/page-19#post-261183531

Their test setup:

RX 5700 @ 2100MHz
RX 5700 XT @ 1900MHz

Very misleading.

If they wanted to make any point, they should have used clocks where RDNA scales, like:

RX 5700 @ 1890MHz
RX 5700 XT @ 1700MHz

At these clocks you will get similar TFs and the 5700 will outperform the 5700 XT.



Well, look at the results... he basically nailed it.

And no, he didn't try to sell it... what he tried to explain was the goal of decreasing the bottlenecks on the PS5, to let the hardware reach more of its potential performance.

He achieved that.

BTW, about the SSD: 825GB was chosen because it matches the 12 channels he wanted, to make it cheaper... he explained he could reach the same speeds with more expensive, faster memory modules, but he chose to reach it with cheaper ones... so 12 channels were used, which made the capacity options 825GB or 1650GB... they obviously chose the cheaper 825GB.
Plenty of 5700 cards are sold at 2100MHz speeds. It is not a crazy overclock at all.

Now all you have to do is go and do that benchmark test you have put forward and show us the results.
The onus is on you to show me some benchmarks, done in an open and non-biased way, which go against the results DF got.

It's simple. I'm not trying to say a set of results is wrong and shouldn't be relied on. You are. That's why it's up to you to provide the proof of your claim.
 
Last edited:

Truespeed

Member
It's not possible to know whether the issues with a game are due to hardware capabilities or developer capabilities. Comparing games on different platforms tells us that the games are different but it doesn't tell us why. In a lot of ways it does come down to how first party games perform to know the truth.

On paper the PS3 was clearly the more powerful system and Sony first party consistently unlocked that power. Naughty Dog showed us what the PS3 was truly capable of, but third party developers rarely produced anything of the same quality as first party studios. Consistently, for the first few years anyway, Xbox 360 outperformed it on multiplats. A lot of it does come down to cross-platform engines and ability to optimize them for the hardware, but all we can do is count grass.

When we finally get some first party games from Microsoft, the full story will be told. If those suffer from the same frame rate and resolution drops that we see from the multiplats currently being compared, we'll know what's happening. Until then we just have to go with what we see: that the XSX is doing a bit worse at multiplats, and that for now the PS5 is probably the best way to play most of them if you don't have a capable PC.

Was the PS3 really a more powerful system? From a teraflop comparison, both were in the same ballpark. The difference is that the PS3 was a radical departure in architecture, with 1 PPE and 8 SPEs, and it took years for developers to wrap their heads around the idea that what they did in the previous generation wasn't going to work this time around; job scheduling systems that farmed work out to the SPEs were the new paradigm. ND delivered magic on the PS3, but realistically, had ND developed for the 360, they probably could have maxed it out as well; that's just how they roll. That's not the case with the last generation or this generation. Both architectures are known quantities, and what you applied previously is easily transferable. I also don't expect anything from MS to outclass anything ND or the ICE team will do.
 
Last edited:
No, you're twisting his words.

He said he needs RDNA cards for more meaningful data.


Do you know what the word meaningful means? lol

The numbers are without an RDNA 2 card, which is totally different. I don't mind the results; it looks like there's a reason why you left the full quote out.
If the theory that a GPU with a higher clock speed would outperform a GPU with more CUs were true, it would hold up across all GPUs. It would be true of GCN cards, Nvidia cards, RDNA 1 cards. To say that this isn't true of RDNA 1 cards, but is true of RDNA 2 cards, just doesn't make sense.
The rule would be the same across both architectures.

I think this has run its course. You aren't going to change my mind, and I'm not going to change yours.
We will have to leave it up to the consoles to show which was more accurate.
 

Azurro

Banned
Why would I be angry? We are just talking about games consoles on a forum. There's nothing personal about it. It's just a hobby.
We can disagree on things; that's fine.
I didn't say anything about Cerny for console warring.
I was making the point that you can't follow anyone, Cerny included, but also people like Phil Spencer, as if they are demigods only speaking the full facts. They are both trying to sell their systems, and they will word things in a way that makes their system seem better.

Phil Spencer is a salesman; Mark Černý is an engineer, just a very well-spoken one. Spencer throws bullshit marketing to the wind, while Černý speaks about technical topics in a very eloquent way. You can't equate the two; their skill sets are very different.

Again, don't fall for the sales pitches.
The Geometry Engine is an AMD function that goes back to GCN GPU's, as do primitive Shaders.

"...

The Geometry Engine is not the same one present in GCN; it doesn't have the same functions as the one created by Cerny's team for the PS5.

Not everything is a "gotcha", that talk where the PS5 was introduced was meant for the GDC, it's new functionality. If you don't want to believe it because...I don't know why honestly, then I don't know what to tell you other than that you should outgrow console warring.
 
Phil Spencer is a salesman; Mark Černý is an engineer, just a very well-spoken one. Spencer throws bullshit marketing to the wind, while Černý speaks about technical topics in a very eloquent way. You can't equate the two; their skill sets are very different.



The Geometry Engine is not the same one present in GCN; it doesn't have the same functions as the one created by Cerny's team for the PS5.

Not everything is a "gotcha", that talk where the PS5 was introduced was meant for the GDC, it's new functionality. If you don't want to believe it because...I don't know why honestly, then I don't know what to tell you other than that you should outgrow console warring.

Firstly, Spencer isn't a salesman. He has a bachelor's degree, has spent over 20 years at MS, and started as a games developer. He might be a good talker, but he isn't without a great deal of knowledge of this industry.
But just like Cerny, he is trying to sell his system.

Sorry to disappoint you, but the PS5's GE is based on AMD's GE. That includes Primitive Shaders, which you would have heard about in the Road to PS5 talk. It's "custom" in the same way the PS5 is "custom" RDNA 2.
AMD has moved away from the GE and Primitive Shaders found on the RDNA 1 GPUs to the superior Mesh Shaders in RDNA 2.
 

Md Ray

Member
The fuck? No, it's not the end of the story just because you say so. The PS5 has an inferior GPU. There's no debating that. We aren't discussing religious beliefs where things are open to interpretation. We're talking physics and math. The PS5's GPU is inferior. That doesn't make it a bad GPU, but it's not as powerful.

Code optimizations matter and time will prove you to be delusional.
Lol.

HotChips slide from MS:


Notice they use rasterization rate and pixel fillrate metrics, as well as compute and memory bandwidth, to show their GPU evolution. While the SX has the advantage in the first two, in comparison, the PS5's rasterization rate and pixel fillrate are 8.92 Gtri/s and 142.72 Gpix/s, respectively.

Gtri/s relates to the primitive units and Gpix/s relates to the ROPs. These units in the PS5's GPU are 22% faster. And this isn't some open-to-interpretation religious belief or something, lol. The XSX GPU is inferior to the PS5 at those.

Parts of the PS5 GPU are indeed faster than the XSX GPU. Shocking, I know. It's maths and physics and facts.
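Those figures follow directly from clock speed, since both GPUs have the same front-end unit counts (4 triangles per clock from the primitive units, 64 pixels per clock from the ROPs, per the public spec breakdowns). A quick sanity check of the numbers in this post:

```python
# Fixed-function throughput scales linearly with clock:
# rasterization = primitives per clock * clock; fillrate = ROPs * clock.
def gtri_s(prims_per_clk: int, clock_ghz: float) -> float:
    return prims_per_clk * clock_ghz

def gpix_s(rops: int, clock_ghz: float) -> float:
    return rops * clock_ghz

ps5, xsx = 2.23, 1.825  # GPU clocks in GHz
print(gtri_s(4, ps5), gpix_s(64, ps5))   # 8.92 Gtri/s, 142.72 Gpix/s
print(gtri_s(4, xsx), gpix_s(64, xsx))   # 7.3 Gtri/s, 116.8 Gpix/s
print(f"{ps5 / xsx - 1:.0%}")            # 22% PS5 advantage
```

With identical unit counts, the 22% figure is simply the clock-speed ratio of the two GPUs.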
 
Last edited:

Azurro

Banned
Firstly, Spencer isn't a salesman. He has a bachelor's degree, has spent over 20 years at MS, and started as a games developer. He might be a good talker, but he isn't without a great deal of knowledge of this industry.
But just like Cerny, he is trying to sell his system.

Spencer is a salesman; his CV is unnecessary when all you need to do is look at his activities promoting Xbox. MS is chock-full of dishonest marketing that bends the truth, and he is front and center promoting these narratives, as well as engaging with and promoting dishonest fanboys on Twitter. How you are skeptical of a technical PS5 talk like the one below and yet swallow everything Spencer mentions is a bit beyond me.

Sorry to disappoint you, but the PS5's GE is based on AMD's GE. That includes Primitive Shaders, which you would have heard about in the Road to PS5 talk. It's "custom" in the same way the PS5 is "custom" RDNA 2.
AMD has moved away from the GE and Primitive Shaders found on the RDNA 1 GPUs to the superior Mesh Shaders in RDNA 2.

What do you mean, "custom" RDNA 2? That is a moniker for a set of functionalities that chip manufacturers can pick and choose from; not all of them are relevant for consoles. Sony worked with AMD to develop RDNA 2 and has created features that may also show up in future iterations of RDNA 2, just like the increased number of ACE units did in Polaris.

The Geometry Engine is not the same one from RDNA 1; I don't understand why this is a talking point. This isn't a "gotcha", it's new functionality; if you don't want to believe it, that's your business.

I mean, tell me your rationale, do you think Sony just ordered an off the shelf semi RDNA1 chip, bolted on some RDNA2 functions and overclocked it? That'd be...very ignorant on your part. Next thing you'll tell me is that it's a 9 TF part because it throttles or something similar.

But you know, let's move on from conspiracy theories, if the PS5 is so behind, why is the performance better in most cases? Tell me why the machine designed by marketing liar Černý performs so well?
 
Last edited:

Md Ray

Member
I do apologise to you; it wasn't you claiming the PS5 was stronger, it was some other posters.
You might wanna look at this.
HotChips slide from MS:


Notice they use rasterization rate and pixel fillrate metrics, as well as compute and memory bandwidth, to show their GPU evolution. While the SX has the advantage in the first two, in comparison, the PS5's rasterization rate and pixel fillrate are 8.92 Gtri/s and 142.72 Gpix/s, respectively.

Gtri/s relates to the primitive units and Gpix/s relates to the ROPs. These units in the PS5's GPU are 22% faster. And this isn't some open-to-interpretation religious belief or something, lol. The XSX GPU is inferior to the PS5 at those.

Parts of the PS5 GPU are indeed faster than the XSX GPU. Shocking, I know. It's maths and physics and facts.
 
Last edited:

J_Gamer.exe

Member
PS5... eats higher spec sheets for breakfast. 🙃

But yeah, parity.

Parity is a win for Sony, given the power bragging.

There's a lot of confusion in these videos about why the Series X isn't doing better, but it's probably the PS5 that's doing way better than its raw TF count suggests, with incremental gains in areas like better caching (probably unified), faster caching, fewer cache misses thanks to the cache scrubbers, faster rasterisation, much faster I/O, CU utilisation, etc.

All these little gains in efficiency and lower latency increase performance, which is why, as Cerny said, it's dangerous to rely on teraflops as an absolute indicator of performance.
 
Last edited:
Spencer is a salesman, his CV is unnecessary when all you need to do is look at his activities promoting Xbox. MS is chock-full of dishonest marketing that bend the truth and he is front and center promoting these narratives as well as dishonest fanboys on twitter.



What do you mean "custom" RDNA2? That is a moniker for a set of functionalities that chip manufacturers can pick and choose from, not all of them are relevant for consoles. Sony worked with AMD to develop RDNA2 and has created features that may also show up in future iterations of RDNA2, just like the increased number of ACE units did in Polaris.

The Geometry Engine is not the same one from RDNA1, I don't understand why this is a talking point. This isn't a "gotcha", it's new functionality, if you don't want to believe it that's your business.

I mean, tell me your rationale, do you think Sony just ordered an off the shelf semi RDNA1 chip, bolted on some RDNA2 functions and overclocked it? That'd be...very ignorant on your part. Next thing you'll tell me is that it's a 9 TF part because it throttles or something similar.

But you know, let's move on from conspiracy theories, if the PS5 is so behind, why is the performance better in most cases? Tell me why the machine designed by marketing liar Černý performs so well?
I see that since you couldn't argue with what I explained to you, you decided to throw around terms like "conspiracy theory" and the whole 9 TF thing that gets people banned.

Here are the facts for you.
The PS5 uses the AMD Geometry Engine and Primitive Shaders. It was all in the Road to PS5 talk.
Primitive Shaders and the GE are older tech that AMD originally installed in some GCN GPUs but couldn't get working properly. They got it working and used it in their RDNA1 cards.
As I said, the GE and Primitive Shaders are old tech that AMD replaced with the superior Mesh Shaders in their RDNA 2 GPUs.
That's not a conspiracy theory, it is what it is.
So to jump into a thread and go on about how the PS5 is putting out better performance than the XSX because the XSX doesn't have a Geometry Engine was quite funny, when literally the Geometry Engine and the Primitive Shaders it uses are outdated tech.
That's why I told you to be careful about just parroting some term you heard, because chances are it isn't as big a deal as you were sold on.
We have plenty of time to see how this gen unfolds. Both consoles will speak for themselves. I would also just say, don't claim victory for one side over the other just yet.
Launches are always full of unoptimized and rushed software.
 

MastaKiiLA

Member
You have to be a premium member to have access to the 4K high quality video.

I thought that was only for downloading the videos that they encoded. I thought they offered free 4k videos on YT, but you had to tolerate the YT compression. I'll have to go back and check, but I'm pretty sure I've been seeing all their content on YT in 4k, up until this point. If you have to be a member just to access the 4k comparisons on YT, then I'll unsubscribe. However, that wasn't my impression.
 
You might wanna look at this.
The PS5 has a higher raster number, as well as a higher pixel fillrate.
I know. I have known since they were revealed.
But these two alone don't determine a GPU's strength.
There are other things, such as texture fillrate and compute, as well.
Then you have other capabilities which may or may not provide better graphics and performance.

I come to conclusions on how I think things might play out based on the specs. Real life then plays out.
For instance, the PS5 has a faster SSD.
I think that should give the PS5 faster load times and faster respawn times in games.
Now, at launch we have seen a number of examples where the XSX actually has faster load times than the PS5. I look at that and say, well, it's launch, and it's not surprising that some devs haven't got to grips with the hardware and might not be optimising for the SSD properly. So rather than say "see, the XSX SSD is superior to the PS5 SSD", I stick by my original belief and will wait a while before making a judgement.

Same goes for the power of the consoles.
The XSX has a more powerful GPU, a faster CPU, more memory bandwidth, more advanced RDNA 2 features like VRS, Mesh Shaders and SFS, and other extras such as ML.

Just like I did with the PS5 SSD, I'm not concerned with some rushed launch titles. I think the XSX will perform a little better, so I will hold my fire and wait till things settle.

It's not a fanboy thing, it's not a troll thing; it's reading the specs and making a judgement on what I think will happen.
And if down the track the XSX loads games quicker than the PS5, or the PS5 still has better performance, then I will say "huh, looks like it didn't pan out like I thought".
 

TBiddy

Member
Quick resume saving the day when the "performance crown" falls flat on its face.

Yes, let's negate one of the advantages the XSX has. Makes perfect sense.

Why would I be upset about parity? Parity is best for all gamers. You just play where you like.

I don't know why you would be upset. You seem upset.

There is no parity here [...] PS5 and XSX are dropping to near identical resolution at similar taxing spots.

Makes sense...
 
Last edited:

GHG

Member
Yes, let's negate one of the advantages the XSX has. Makes perfect sense.

Don't know why you're bringing it up in a thread about the performance of a specific game. Looks like "VRR" and "quick resume" are the "friends" and "controller" of this gen.

The cycle begins anew.
 

TBiddy

Member
Don't know why you're bringing it up in a thread about the performance of a specific game. Looks like "VRR" and "quick resume" are the "friends" and "controller" of this gen.

The cycle begins anew.

I was responding to someone mentioning loading time. Quick resume is a perfectly reasonable thing to discuss in that context. No need to get your jimmies rustled here.
 

llien

Member
I love these forums, the amount of pettiness in here always gives me a good chuckle :messenger_tears_of_joy: Someone will reply to this and say "Man, that equal performance on the 'Most Powerful console in the world' is a far cry from what Xbox has been touting this whole time" :messenger_tears_of_joy: I can almost feel the tension from all the anger held inside by the fanboys.

I'm much more likely to buy a PS5 (this time mainly due to GoW) than an Xbox (never owned one), but a performance comparison with locked framerates is something rather special, let alone using DF to judge things.

And... 18 seconds "instant loading" is... pettiness? Eh...
 

geordiemp

Member
Guy, again, you are talking about it without knowledge. I'm a computer engineer from 16 years ago, and while I'm no expert in GPUs, I know a little bit about CPUs/GPUs, and I know that for a GPU more CUs is always better than more MHz, because graphics calculations are highly parallelizable. This is why the biggest improvements in GPUs usually come from adding more CUs.

You are talking about and judging the design of many engineers from your sofa when you have never designed a GPU in your life.

I'm not sure if you are 15 years old or something like that.

And I have a semiconductor physics background and work in the field. You're wrong.

No parts just add CUs to an existing shader array to get more performance, except the XSX.

All shader arrays are 10 CUs or less except Navi 14, a low-end part. The 6800's die has 8 shader arrays of 10 CUs; they don't just add CUs, lol.

Go look at how PC parts scale and come back: adding CUs also adds shader arrays, primitive units, RBs, and considerations for L1 and indeed LDS, all of which can have bottlenecks, hence the scaling.

If the XSX had added shader arrays I would agree; a 6-shader-array, 60 CU part would be >>> PS5. There is a reason you don't just extend a shader array. Do you know what it could be? Go look it up.

Once you have figured all that out, we can talk.
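To illustrate the CUs-per-shader-array point with numbers, here's a quick sketch using publicly reported configurations (active CU and shader array counts; treat the figures as approximate, since disabled CUs vary by salvage bin):

```python
# Active CUs per shader array for several RDNA parts.
# Figures are the commonly reported configurations, not official AMD breakdowns.
parts = {
    "PS5":        {"active_cus": 36, "shader_arrays": 4},
    "XSX":        {"active_cus": 52, "shader_arrays": 4},
    "RX 6800":    {"active_cus": 60, "shader_arrays": 8},
    "RX 6900 XT": {"active_cus": 80, "shader_arrays": 8},
}

for name, p in parts.items():
    density = p["active_cus"] / p["shader_arrays"]
    print(f"{name:10s}: {density:4.1f} active CUs per shader array")
```

On these numbers the XSX packs noticeably more CUs into each shader array (~13) than the PC parts (7.5–10) or the PS5 (9), which is the density argument being made above.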
 
Last edited:

Leyasu

Banned
They're too close to have any serious graphical advantages. You're setting yourself up for disappointment if you think a multi-platform game such as Battlefield 2021 will change that.
??

No, that’s not what I meant. Not everyone has the same war mentality.

I was implying that we will finally start to see games that will be made for these consoles.

I don’t give a fuck about the console war.
 

Gudji

Member
The fuck? No, it's not the end of the story just because you say so. The PS5 has an inferior GPU. There's no debating that. We aren't discussing religious beliefs where things are open to interpretation. We're talking physics and math. The PS5's GPU is inferior. It doesn't make it a bad GPU, but it's not as powerful.

Code optimizations matter and time will prove you to be delusional.

The PS5 has a faster-clocked GPU with fewer compute units, but due to those clocks almost everything else is better than on the XSX. Time to do the math again.
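For reference, here's the paper math, assuming the commonly cited configurations (36 CUs at up to 2.23 GHz for the PS5, 52 CUs at a fixed 1.825 GHz for the XSX, 64 FP32 lanes per CU):

```python
# Peak FP32 compute for each console, on paper.
# 64 FP32 lanes per CU x 2 ops/clock (FMA) = 128 FLOPs per CU per clock.
def tflops(cus, clock_ghz, flops_per_cu_per_clock=128):
    return cus * flops_per_cu_per_clock * clock_ghz / 1000

ps5 = tflops(36, 2.23)   # PS5 at its boost clock
xsx = tflops(52, 1.825)  # XSX fixed clock

print(f"PS5 compute: {ps5:.2f} TF")  # ~10.28 TF
print(f"XSX compute: {xsx:.2f} TF")  # ~12.15 TF

# Fixed-function hardware (primitive units, ROPs, caches) runs at the GPU
# clock regardless of CU count, which is where the higher PS5 clock pays off.
print(f"Clock ratio (PS5/XSX): {2.23 / 1.825:.2f}")  # ~1.22
```

So the XSX wins on peak compute (more CUs), while anything that scales with clock rather than CU count favors the PS5 by roughly 22%.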
 
Last edited:
Not going to say anything on PS5 v XSX as brave warriors on either side are doing their duty there but I will say that (after 10-15 hours or so of play) sometimes this game looks astonishingly good and other times like utter crap.

A real mixed bag.
 

Redtemplar

Neo Member
I'm reading a lot of people talking about parity, forgetting that faster loading times are also part of the performance of a game on a given system.
SO THERE IS NO PARITY, IT IS A WIN FOR PS5 :messenger_sunglasses:
 
I'm reading a lot of people talking about parity, forgetting that faster loading times are also part of the performance of a game on a given system.
SO THERE IS NO PARITY, IT IS A WIN FOR PS5 :messenger_sunglasses:
Yes, but it's not loading twice as fast, so anything over half the loading time of the XSX is a loss for PS5, hahaha.
 

Redtemplar

Neo Member
Yes, but it's not loading twice as fast, so anything over half the loading time of the XSX is a loss for PS5, hahaha.

The PS5 loads the game 8 seconds faster, and it is a cross-generation game that is not optimized to take full advantage of the SSDs and the architecture of these next-generation systems. And 8 seconds is 8 seconds waiting for the game, so there is no parity. Another win for PS5 :messenger_winking: :messenger_beaming:
 

Humdinger

Member
I'm thinking that since the Xbox toolset has to accommodate more platforms than the PS5 toolset, it will be more complicated to use, or it will have constraints built in that the simpler PS5 toolset would not. I mean, the Xbox GDK has to allow development on the original Xbox One, the X1X, the XSX, the XSS, and PC, with all its variations. That's a lot of bases to cover. In contrast, the PS5 SDK only needs to facilitate development on the PS5 and PS4, so it would be a considerably simpler process.

Would this be a factor in what's holding the Xbox back? I'm not a tech guy, so I could be off. I'm just checking my thought process.
 
Last edited: