
Matt weighs in on PS5 I/O, PS5 vs XSX and what it means for PC.

ToadMan

Member
I already did:


That doesn’t say everyone who worked on it was hired in 2019... The way you said “we know”, I assumed you had HR records or something, not some Twitter-interpretation crap.

Having said that, PS5 is reputedly easy to work with and the SSD is an order-of-magnitude step change in performance. So perhaps it is possible that just a year or so of work by a small team can deliver jaw-dropping results on PS5.

Perhaps the difficulty of porting to XSX to get around its limitations is why it won’t be released until the end of next year.
 

ToadMan

Member
The ability to run sustained at 100% is different from running at 100% all the time.
Stop attempting to fudge matters; you know it all too well.

There is no practical application of a processor that would require every transistor to flip every clock. Ever.
 

Exodia

Banned
That doesn’t say everyone who worked on it was hired in 2019... The way you said “we know”, I assumed you had HR records or something, not some Twitter-interpretation crap.

Having said that, PS5 is reputedly easy to work with and the SSD is an order-of-magnitude step change in performance. So perhaps it is possible that just a year or so of work by a small team can deliver jaw-dropping results on PS5.

Perhaps the difficulty of porting to XSX to get around its limitations is why it won’t be released until the end of next year.

Brian already detailed the members of the Nanite team and those who worked on developing it. Everyone on his team, which was created in 2019, was hired in 2019/2020.
You either accept the truth or live in your fantasy land.

 

ToadMan

Member
Seems unlikely, considering the engine runs on almost every platform known to man. But it's a fun narrative, sure.

UE5 has only been shown running on PS5 to date... it probably runs in some lower-quality form on other platforms, though.


Brian already detailed the members of the Nanite team and those who worked on developing it. Everyone on this team was hired in 2019/2020.
You either accept the truth or live in your fantasy land.



It’s not my fantasy - I’m not really interested in when they were hired. I’m just interested in why you say you “know” something with such force, when your evidence for the claim is just a tweet. Like I said - you don’t know the composition of the team; you don’t have HR records.

If that counts as proof for you - it seems the rest of what you say is also based on flimsy “evidence”.
 

Exodia

Banned
It’s not my fantasy - I’m not really interested in when they were hired. I’m just interested in why you say you “know” something with such force, when your evidence for the claim is just a tweet. Like I said - you don’t know the composition of the team; you don’t have HR records.

If that counts as proof for you - it seems the rest of what you say is also based on flimsy “evidence”.

The team was created in 2019, period. Prior to that, he worked solo on it. Everyone who was added to the team - WHICH HE LISTED - was hired in 2019/2020.
This same group of people also said they took part in the development of Nanite.
But of course they are all fucking liars because it goes against your fairy tale narrative.
 

geordiemp

Member
Brian already detailed the members of the Nanite team and those who worked on developing it. Everyone on his team, which was created in 2019, was hired in 2019/2020.
You either accept the truth or live in your fantasy land.



Brian was working on lots of stuff in 2018, and his work on this subject has been years in the making. Just because he was joined by others in 2019 to do that demo does not mean Epic was not talking to Sony earlier (as Tim Sweeney SAID in the GameSpot interview: four years on graphics and I/O storage).

Sony working with/having discussions with Epic for four years and starting programming on the demo are not mutually exclusive.

You’re drawing the conclusion that collaboration and technical discussions on fast streaming only started with the demo, to fit your narrative that Sony just got lucky.

Tim Sweeney outright said around four years of I/O streaming discussions. Is Tim Sweeney making it up?
 

ToadMan

Member
I assume it's running on PC, XSX, XB1, PS4 and Switch as well right now. It's just not ready for release. Not on those platforms and not on the PS5.

It has nothing to do with speculated limitations.

The comment I responded to implies that the current state of UE5, and what was shown in the PS5 demo, was the result of just one year of work by a small team.

That’s all it took to stun the video game world with a PS5-based demo. Unusual for Epic - they’ve normally shown PC tech demos first, with consoles coming later. This time they led with PS5.

They did that because that’s where it looks best right now. That implies the other platforms don’t look so good... and that’s what they’ll be trying to optimise for over the next year or so.
 

MoreJRPG

Suffers from extreme PDS
That doesn’t say everyone who worked on it was hired in 2019... The way you said “we know”, I assumed you had HR records or something, not some Twitter-interpretation crap.

Having said that, PS5 is reputedly easy to work with and the SSD is an order-of-magnitude step change in performance. So perhaps it is possible that just a year or so of work by a small team can deliver jaw-dropping results on PS5.

Perhaps the difficulty of porting to XSX to get around its limitations is why it won’t be released until the end of next year.

Where did this ridiculous narrative start that the XSX isn’t easy to work with? It’s the same architecture that’s been around the community for years. The reason PS5 is so easy to dev on is that the 360 was the gold standard, and Sony had no choice but to dramatically improve their system after the PS3 debacle.
 

geordiemp

Member
Where did this ridiculous narrative start that the XSX isn’t easy to work with? It’s the same architecture that’s been around the community for years. The reason PS5 is so easy to dev on is that the 360 was the gold standard, and Sony had no choice but to dramatically improve their system after the PS3 debacle.

We have had devs and engine makers calling PS5 a dream with a god-tier I/O system. Can’t be bothered to link; go look for yourself.
 

ToadMan

Member
The team was created in 2019, period. Prior to that, he worked solo on it. Everyone who was added to the team - WHICH HE LISTED - was hired in 2019/2020.
This same group of people also said they took part in the development.
But of course they are all fucking liars because it goes against your fairy tale narrative.

I don’t have a narrative. I’m just calling into question what you consider evidence. The content of one tweet is evidently enough for you.

"The whole problem with the world is that fools and fanatics are always so sure of themselves, but wiser people so full of doubts".

It explains a lot about the other things you’ve claimed to “know”.
 

Exodia

Banned
That doesn’t say everyone who worked on it was hired in 2019... The way you said “we know”, I assumed you had HR records or something, not some Twitter-interpretation crap.

I don’t have a narrative. I’m just calling into question what you consider evidence. The content of one tweet is evidently enough for you.

"The whole problem with the world is that fools and fanatics are always so sure of themselves, but wiser people so full of doubts".

It explains a lot about the other things you’ve claimed to “know”.

Brian, who runs the Nanite project, literally listed the members of his team to give them their due credit. A team he literally said was created in the "last year". A team that was "super influential" to Nanite. A team that did "direct Nanite development".
All the listed team members were hired in 2019/2020, based on their LinkedIn profiles. This is also backed up by their tweets saying they worked on Nanite.

This is 100% fact, supported by actual resume records on LinkedIn.

https://www.linkedin.com/in/runestubbe/?originalSubdomain=dk
https://www.linkedin.com/in/gwihlidal/?originalSubdomain=ca
https://www.linkedin.com/in/ola-olsson-89929997/?originalSubdomain=se
https://www.linkedin.com/in/bernhard-kerbl-283b8679/?originalSubdomain=at
https://www.linkedin.com/in/andrew-lauritzen/?originalSubdomain=ca

But I forgot that here only nonsense is allowed to spread, and absolute fairy tales are encouraged, pushed and believed instead. Any statement backed up by actual facts will be discouraged and disbelieved.
Brian was clearly lying. He also clearly omitted team members who worked on Nanite because he had personal strife with and contempt for them.
Clearly there are 10 other omitted, unknown members. And all the members he listed LIED about their resumes on LinkedIn.

Why should we listen to them? We should listen to ToadMan instead.
 
Okay, getting things back to some sanity... we know the front end and asynchronous compute capability of RDNA2 is much better than RDNA's, which itself made major strides over GCN in that respect, so I'm wondering if it will be much easier to saturate the CUs even in less demanding workloads.

Basically, if you have tasks demanding only, say, 6 TF of overall GPU output, is it possible to spread that workload over as much of the GPU as possible? For example, PS5's 10.275 TF works out to around 285.4 GF per CU across its 36 CUs. So normally you'd expect new CUs to be added to the workload only when the previous ones were fully occupied; on that approach, a 6 TF workload would equate to about 21 CUs on PS5.

On XSX, 12.147 TF works out to about 233.5 GF per CU across its 52 CUs, so that same 6 TF workload would normally occupy about 26 CUs. So for the same 6 TF workload, if you're just filling up new CUs as the previous ones become occupied, the XSX has more physical L1 and L2 cache to bring to bear on that workload, but PS5 can work through it faster due to the higher GPU clock.

What I'm wondering is if things are improved enough that the same 6 TF workload can be spread across many more CUs, because that would bode pretty well not just for XSX, obviously, but also for PS5: the GPUs would get more of their hardware utilized even for workloads that aren't hitting their TF maximum. I think the frontend and async capabilities of these RDNA2 GPUs are being underestimated; there's a reason AMD is pushing for larger GPUs now and feels confident it can compete with Nvidia going forward. A push for much larger GPUs wouldn't be favored if frontend and async capabilities hadn't improved substantially in the meantime.
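To make the arithmetic above concrete, here's a back-of-the-envelope sketch (the 36/52 CU counts are the publicly known figures, and the "fill one CU at a time" model is this post's simplification, not how a real GPU scheduler behaves):

```python
# Naive CU-filling model: GFLOPS per CU, and how many CUs a fixed
# workload would occupy if CUs were filled one at a time.
def cus_for_workload(total_tf, cu_count, workload_tf):
    gf_per_cu = total_tf * 1000.0 / cu_count          # GFLOPS per CU
    return gf_per_cu, workload_tf * 1000.0 / gf_per_cu

for name, tf, cus in [("PS5", 10.275, 36), ("XSX", 12.147, 52)]:
    gf_per_cu, busy = cus_for_workload(tf, cus, 6.0)
    print(f"{name}: {gf_per_cu:.1f} GF/CU -> ~{busy:.1f} CUs for a 6 TF workload")
# PS5: 285.4 GF/CU -> ~21.0 CUs for a 6 TF workload
# XSX: 233.6 GF/CU -> ~25.7 CUs for a 6 TF workload
```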

If AMD didn't upgrade RDNA2 clock speeds a lot, then I wouldn't be surprised if the opposite is true. That clock is really at the end of the power curve for current RDNA cards and would lower yields significantly. We will see more when RDNA2 cards finally hit the market. In addition, if the final PS5 is anything like the devkit, then it will be much more expensive to manufacture than the Xbox.

No, I was only talking about the SSD I/O costs, particularly the NAND modules. For reasons I'll explain later, I think Sony's is less expensive on that front, but I'm not sure if the entire I/O setup is cheaper on their end or MS's, due to all of the fixed-function hardware Sony's put into theirs (which is something else I should touch on sometime).

In terms of actual APU costs I'm expecting they're probably at parity, and I do agree with the PS5 GPU clock being at the end of the power curve even on RDNA2 (I actually think it's past the sweetspot, TBH), which could affect pricing due to yields.
 

ToadMan

Member
Where did this ridiculous narrative start that the XSX isn’t easy to work with? It’s the same architecture that’s been around the community for years. The reason PS5 is so easy to dev on is that the 360 was the gold standard, and Sony had no choice but to dramatically improve their system after the PS3 debacle.

Are you talking about hardware or software?
 

ToadMan

Member
Brian, who runs the Nanite project, literally listed the members of his team to give them their due credit. A team he literally said was created in the "last year". A team that was "super influential" to Nanite. A team that did "direct Nanite development".
All the listed team members were hired in 2019/2020, based on their LinkedIn profiles. This is also backed up by their tweets saying they worked on Nanite.

This is 100% fact, supported by actual resume records on LinkedIn.

https://www.linkedin.com/in/runestubbe/?originalSubdomain=dk
https://www.linkedin.com/in/gwihlidal/?originalSubdomain=ca
https://www.linkedin.com/in/ola-olsson-89929997/?originalSubdomain=se
https://www.linkedin.com/in/bernhard-kerbl-283b8679/?originalSubdomain=at
https://www.linkedin.com/in/andrew-lauritzen/?originalSubdomain=ca

But I forgot that here only nonsense is allowed to spread, and absolute fairy tales are encouraged, pushed and believed instead. Any statement backed up by actual facts will be discouraged and disbelieved.
Brian was clearly lying. He also clearly omitted team members who worked on Nanite because he had personal strife with and contempt for them.
Clearly there are 10 other omitted, unknown members. And all the members he listed LIED about their resumes on LinkedIn.

Why should we listen to them? We should listen to ToadMan instead.

I don’t have a horse in this race. I don’t care if they worked there two weeks or twenty years.

I haven’t said anywhere that I disagree - I just asked where you got your evidence from, and you showed it.

LinkedIn profiles are more like it, though - you could’ve produced those from the get-go instead of a tweet. I hope you look for corroboration on everything you post in future.
 

Exodia

Banned
LinkedIn profiles are more like it, though - you could’ve produced those from the get-go instead of a tweet. I hope you look for corroboration on everything you post in future.

I did - it was at the end of the post I referenced in response to your question.

 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
The issue is people like you are purposefully misinterpreting what many are saying on here. Nobody thinks SSDs are useless or will not have a good impact on games. People are skeptical that SSDs are the be-all or will replace GPU, CPU, RAM, etc... all of which the XSX has the advantage in.

When did even ONE person say this?
 

Paracelsus

Member
When did even ONE person say this?

Follow the most avid posters in the argument. It's the elusive narrative being built: that the custom SSD system no longer needs any serious hardware, and thus PS5 will have better-looking games that run at a higher res and better framerate than XBSX.
The XBSX hardware advantage is meager, the PS5 pushes too much data way too fast for XBSX to compensate, etc.

If the argument were "no loading times, no LOD, bigger worlds with no transitions", people wouldn't be puking rainbows.
 

kuncol02

Banned
When did even ONE person say this?
You clearly never went into the next-gen speculation thread. Especially after the UE5 presentation.

No i was only talking about the SSD I/O costs, particularly the NAND modules.
AFAIR there was speculation that they could get away with slower (and cheaper) modules thanks to using 12 channels.
 

Naddy

Banned
That’s all it took to stun the video game world with a PS5-based demo. Unusual for Epic - they’ve normally shown PC tech demos first, with consoles coming later. This time they led with PS5.

Factually wrong. Epic usually shows a tech demo when a new PlayStation console launches; they did the same with PS4, look here:



I think they have a marketing contract.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
Follow the most avid posters in the argument. It's the elusive narrative being built: that the custom SSD system no longer needs any serious hardware, and thus PS5 will have better-looking games that run at a higher res and better framerate than XBSX.
The XBSX hardware advantage is meager, the PS5 pushes too much data way too fast for XBSX to compensate, etc.

If the argument were "no loading times, no LOD, bigger worlds with no transitions", people wouldn't be puking rainbows.

Sorry dude, I just haven't seen people say all that. Why would the PS5 SSD give a game a better framerate and better resolution? Are you sure people have been saying this?
 

Naddy

Banned
We have had devs and engine makers calling PS5 a dream with a god-tier I/O system. Can’t be bothered to link; go look for yourself.

It was Epic who said that, and well, it seems they have a marketing contract; they did the same back then with PS4.
Also, no one ever said that XSX is hard to develop for.

What we know is that some devs had to throttle the CPU to make sure the GPU runs at a sustained clock:

Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core.

We even have Cerny saying that, in order to make full use of the PS5, developers will actually have to change their engines and build them around the PS5:

Mark Cerny sees a time where developers will begin to optimise their game engines in a different way - to achieve optimal performance for the given power level.

Source: https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive

So, according to Cerny, devs now have to change their engines. Their engines basically need to be built around power consumption in order to achieve optimal performance on PS5. If they do not do this, they will not achieve optimal performance - according to Cerny. Keep in mind that this only applies to PS5.
It does not apply to XSX: on XSX you do not need to throttle the CPU to ensure a sustained GPU clock; both CPU and GPU can be sustained at 100% in all environments:

Once again, Microsoft stresses the point that frequencies are consistent on all machines, in all environments. There are no boost clocks with Xbox Series X.
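To illustrate what the PS5 scheme being described amounts to, here's a toy model of a fixed SoC power budget shared between CPU and GPU, where pulling the CPU back frees budget for the GPU. The wattage figure and the allocation rule are made up purely for illustration; Sony has not published the actual budget or algorithm:

```python
# Toy model of a shared SoC power budget (SmartShift-style).
# All numbers are illustrative; the real budget/algorithm is not public.
TOTAL_BUDGET_W = 200.0  # hypothetical total SoC power budget

def allocate(cpu_demand_w, gpu_demand_w):
    """Give the CPU what it asks for (capped), shift the remainder to the GPU."""
    cpu_w = min(cpu_demand_w, TOTAL_BUDGET_W)
    gpu_w = min(gpu_demand_w, TOTAL_BUDGET_W - cpu_w)
    return cpu_w, gpu_w

# A CPU-heavy frame leaves the GPU short of its demand (clocks would dip):
print(allocate(90.0, 140.0))   # (90.0, 110.0)
# "Throttling back the CPU" keeps the GPU fully fed:
print(allocate(60.0, 140.0))   # (60.0, 140.0)
```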
 

soulbait

Member
Once again, everything turns into a debate and I don't get it. The way everyone gets so passionate about their console of choice, you would think they have some sort of stake in the console selling well.

I am an Xbox-first gamer and I will go with the XSX first this generation, like I did with the current one (got a PS4 Pro when it launched). I just prefer the overall experience and especially the controller. I will eventually get a PS5 and enjoy the exclusive games there, usually at a discount due to getting it later.

Even though I am Xbox first, I am excited about the SSD in PS5. I am excited to see if it makes any difference, and to see the games that take advantage of it. It is still a question mark how much of a difference it will make, but it is wicked fast and has potential. I remember when the original Xbox came out: along with the built-in broadband support, what set it apart was the internal HDD. Some games made interesting use of this, allowing for custom soundtracks. NHL 2K with the same songs I could hear at the arena while I played as the Hurricanes was great! The custom radio station in True Crime was cool. But then a game like Blinx came around and did something that couldn't be done on the others: its rewind feature. It was cool (maybe not the best game, but I liked the gimmick). I think the SSD in PS5 will have a bigger impact than the gimmicks I referred to on the OG Xbox, and I am looking forward to seeing what they do.

XSX has the power advantage, and that is going to be great too. More visual fidelity and hopefully better AI. I'm looking forward to seeing what all the first-party studios can do with this beast of a console, as well as the assumed better performance in third-party games. XSX's SSD may not be as fast as PS5's, but it is still a game changer for all consoles to have an SSD. I believe everyone can get on board with shorter load times.

TLDR:
As a gamer I am excited about the upcoming possibilities from both consoles. I prefer Xbox, but I am still looking forward to seeing how PS5 takes advantage of its significantly faster SSD. Stop fighting to determine which is best; we all have our own metrics for what we feel is most important. If speed and graphics power were all that mattered, then Nintendo would not still be the force that it is.
 

Elog

Member
This sounds reasonable. They just fail to also say lower framerate and lower resolution.
There are advantages and drawbacks to each console.

XSX should have a frame rate advantage at any given resolution over PS5. Texture resolution (and the number of textures used in any given scene) should be a PS5 advantage, though, since that is not currently GPU-limited but I/O-limited.
 

geordiemp

Member
It was Epic who said that, and well, it seems they have a marketing contract; they did the same back then with PS4.
Also, no one ever said that XSX is hard to develop for.

What we know is that some devs had to throttle the CPU to make sure the GPU runs at a sustained clock:



We even have Cerny saying that, in order to make full use of the PS5, developers will actually have to change their engines and build them around the PS5:



Source: https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive

So, according to Cerny, devs now have to change their engines. Their engines basically need to be built around power consumption in order to achieve optimal performance on PS5. If they do not do this, they will not achieve optimal performance - according to Cerny. Keep in mind that this only applies to PS5.
It does not apply to XSX: on XSX you do not need to throttle the CPU to ensure a sustained GPU clock; both CPU and GPU can be sustained at 100% in all environments:

What a load of drivel; hardware control of power has nothing to do with developers and game software. You are funny.

Also, for the rest, just NO.


 

NullZ3r0

Banned
It’s also tied to clock speed. It’s just people trying really hard to widen the GPU gap, because 2 TFLOPs were a really big deal until people realized it was just a 15-18% difference.

It gets talked about plenty, but no matter how hard you keep looking for it, it isn’t there.

There’s no massive difference and if there were, you would be hearing it from devs already. We aren’t.
It's still a 2 TFLOP difference, though. And we are hearing it from developers; you just choose to ignore those comments in favor of Sony-sponsored devs.

Proof will be in the pudding in a month's time.
 

Thirty7ven

Banned
It's still a 2 TFLOP difference, though. And we are hearing it from developers; you just choose to ignore those comments in favor of Sony-sponsored devs.

Proof will be in the pudding in a month's time.

It’s a 15-18% difference, and what we are hearing from developers is that differences will be modest and the gap is much smaller than PS4 vs Xbox One.

I wonder how some of you survived this generation with the Xbox One. You speak as if the difference is humongous now; I can only imagine how the difference between the PS4 and the Xbox One looked to you.
 

Nikana

Go Go Neo Rangers!
It’s a 15-18% difference, and what we are hearing from developers is that differences will be modest and the gap is much smaller than PS4 vs Xbox One.

I wonder how some of you survived this generation with the Xbox One. You speak as if the difference is humongous now; I can only imagine how the difference between the PS4 and the Xbox One looked to you.

Tons of people said the One X couldn't hit 4K with a 1.8 TF difference, and it did many times. Comparing percentages isn't always fair.
 

Thirty7ven

Banned
Tons of people said the One X couldn't hit 4K with a 1.8 TF difference, and it did many times. Comparing percentages isn't always fair.

I don’t care what some fanboys said. I care what devs say.

The XBX doesn’t over-perform compared to the PS4 Pro. It performs according to its GPU and memory differential. It’s not magic.
 

Elog

Member
It's not as if the UE5 demo had many unique assets. It had very few but very complex assets.

You are correct that at any given time each piece of hardware has a texture budget that can be filled with (number of textures) * (average texture size in bytes). The point is that with the I/O capabilities of the PS5, it should have a larger texture budget than the XSX.

Like I said, the XSX should have a frame rate advantage at any given resolution.

Strengths and weaknesses.

Disclaimer: Based on available reliable information.
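To make that formula concrete, a trivial sketch; the counts and sizes below are made-up placeholders, not figures for either console:

```python
# Texture budget = (number of textures) * (average texture size in bytes).
def texture_budget_bytes(num_textures, avg_texture_bytes):
    return num_textures * avg_texture_bytes

# e.g. 500 unique textures averaging 8 MiB each:
budget = texture_budget_bytes(500, 8 * 2**20)
print(f"{budget / 2**30:.1f} GiB")   # ~3.9 GiB
```

Faster I/O raises how many textures you can cycle through that budget per unit time, which is the advantage being claimed here.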
 

Nikana

Go Go Neo Rangers!
It is more accurate than taking an absolute delta irrespective of baseline.
You will notice much more of a graphical jump from 4TF to 6TF than you will from 10TF to 12TF.
That's what people refer to as diminishing returns.

Right, but that was the case with the One X. The 6 TF was said by many not to be a huge jump - the GPU wouldn't make that much of a difference because of (insert reason here: RAM, TF, etc.).

It's premature to say it won't make a big difference.
 

Thirty7ven

Banned
Right, but that was the case with the One X. The 6 TF was said by many not to be a huge jump - the GPU wouldn't make that much of a difference because of (insert reason here: RAM, TF, etc.).

It's premature to say it won't make a big difference.

There wasn’t much difference between the PS4 Pro and the XBX, and the difference that there is, is explained by the GPU and RAM increase.

Why are you trying to artificially stretch the difference?
 

LordOfChaos

Member
Not much new, but AMD had a corporate presentation today






RDNA 2 is definitely made to scale to higher clock speeds better than before
 

Nikana

Go Go Neo Rangers!
There wasn’t much difference between the PS4 Pro and the XBX, and the difference that there is, is explained by the GPU and RAM increase.

Why are you trying to artificially stretch the difference?

1440p vs 4K with high-res textures isn't a big difference? I'm not stretching anything. As someone who owns both machines and has played a few games on both - Doom, Destiny 2, RB6, CoD MW - there's a clear difference in image quality. It will obviously differ from game to game depending on how the developer utilizes the power.

Saying there won't be a huge difference is, again, premature. The same was said about the One X: RAM speed won't make a huge difference, 6 TF vs 4.2 TF isn't a huge difference. Both of which ended up being false.
 

DForce

NaughtyDog Defense Force
It's still a 2 TFLOP difference, though. And we are hearing it from developers; you just choose to ignore those comments in favor of Sony-sponsored devs.

Proof will be in the pudding in a month's time.

An 18% difference, or 2 TF, is not a lot when we're comparing a 10 TF GPU to a 12 TF GPU.

RTX 2070 Super 9TF
RTX 2080 Super 11.1TF




The PS4 and Xbox One GPUs are about 40% apart. This is not the case with the XSX and PS5.

Anyone who has been gaming on PC knows that closing a gap like that to reach a target frame rate only requires a small reduction in resolution.

You can watch reviews comparing those two GPUs, and they'll tell you that the difference is not big; it's small.

XSX should have better RT due to more CUs, but we still don't know how much better performance the PS5 will get from its higher clock.
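The percentages quoted here are easy to verify from the TF figures (assuming the 1.84 vs 1.31 TF launch specs for the PS4/XBO comparison):

```python
# Relative gap between two GPUs, as a percentage of the smaller one.
def gap_pct(low_tf, high_tf):
    return (high_tf - low_tf) / low_tf * 100.0

print(f"XSX over PS5: {gap_pct(10.275, 12.147):.0f}%")  # ~18%
print(f"PS4 over XBO: {gap_pct(1.31, 1.84):.0f}%")      # ~40%
```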
 

FranXico

Member
The 6 TF was said by many to not be a huge jump
And it wasn't huge. It was noticeable, but not as dramatic as the ~500 GF difference between the X1 and the PS4, which, in turn, became less drastic over time as developers got used to both machines.

The one thing I'm expecting to really stand out graphically between the two next-gen machines is ray tracing (better on XSX of course). The extent of its impact will depend on the game though.
 

Nikana

Go Go Neo Rangers!
And it wasn't huge. It was noticeable, but not as dramatic as the ~500 GF difference between the X1 and the PS4, which, in turn, became less drastic over time as developers got used to both machines.

The one thing I'm expecting to really stand out graphically between the two next-gen machines is ray tracing (better on XSX of course). The extent of its impact will depend on the game though.

Honestly, I would say they are on par. Maybe it's because I went from 1080p to a 4K OLED this gen when I got a Pro and a One X, but the differences in image clarity were similar in terms of leaps for me. It's very clear which has the advantage in Xbone vs PS4 and in One X vs Pro.
 

II_JumPeR_I

Member
It really gets ridiculous what kind of expectations people have because of that SSD...
There is more to a system than just the SSD...
There is a reason why Cerny never mentioned ray tracing or said nonsense like "TF don't matter"...

Getting tired of this SSD jerkoff contest on almost every gaming forum.

I guess I should buy this SSD revision (when it's out) for my PC so that I can get a better system than the XSX /s
 

Thirty7ven

Banned
Honestly, I would say they are on par. Maybe it's because I went from 1080p to a 4K OLED this gen when I got a Pro and a One X, but the differences in image clarity were similar in terms of leaps for me. It's very clear which has the advantage in Xbone vs PS4 and in One X vs Pro.

Point being the difference isn’t bigger than it is on paper.
 
You clearly never went into the next-gen speculation thread. Especially after the UE5 presentation.


AFAIR there was speculation that they could get away with slower (and cheaper) modules thanks to using 12 channels.

They could, and likely did. However, I have reason to speculate MS is probably using similar smaller/slower modules as well; it comes down to pricing.

It's relatively easy to "mux" multiple NAND chip modules in parallel on a channel. The main hit you could potentially take is price, depending on the number of chips, but it would still be cheaper than a smaller number of larger, faster NAND modules going by market averages, and we know NAND costs have been a factor for both companies for a long while now.
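A rough sketch of the channel arithmetic behind that speculation; the 12-channel figure is from Sony's Road to PS5 talk, while 8 channels is just a typical consumer-controller count used for contrast:

```python
# Per-channel NAND throughput needed to hit a raw bandwidth target.
def per_channel_mbs(target_gbs, channels):
    return target_gbs * 1000.0 / channels

print(f"5.5 GB/s over 12 channels: {per_channel_mbs(5.5, 12):.0f} MB/s each")  # ~458
print(f"5.5 GB/s over 8 channels:  {per_channel_mbs(5.5, 8):.0f} MB/s each")   # ~688
# More channels -> each NAND die can be slower (and cheaper) for the same total.
```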

An 18% difference, or 2 TF, is not a lot when we're comparing a 10 TF GPU to a 12 TF GPU.

RTX 2070 Super 9TF
RTX 2080 Super 11.1TF




The PS4 and Xbox One GPUs are about 40% apart. This is not the case with the XSX and PS5.

Anyone who has been gaming on PC knows that closing a gap like that to reach a target frame rate only requires a small reduction in resolution.

You can watch reviews comparing those two GPUs, and they'll tell you that the difference is not big; it's small.

XSX should have better RT due to more CUs, but we still don't know how much better performance the PS5 will get from its higher clock.


Higher clocks will only affect some RT intersection throughput, pixel fillrate, and the speed at which data moves through the GPU caches. All other GPU performance metrics are generally influenced by having more CUs, TMUs, physical cache, ROPs, etc.

One thing people should remember with PS4 Pro and the X is that games for those did not natively target that hardware; they were always bound to the base consoles and simply used as resolution upgrade boxes and little else. It's arguable that the full potential of both mid-gen refreshes was never realized due to this fact.

PS5 and XSX are not in that same position (inb4 someone screams "but Lockhart! Lockhart!!"), so we will see games, even at launch, targeting their specs as their baseline. This isn't to suggest some magical growth in assumed performance deltas, just to illustrate that you can't reach that conclusion by erroneously using the mid-gen consoles as a basis for the argument since, again, they were always restricted by the PS4 and XBO in terms of what they could do in the first place.
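One way to see the fillrate part of that: for a fixed ROP count, peak pixel fillrate scales directly with clock. A quick sketch, assuming the commonly reported (not officially confirmed at the time) 64 ROPs on both consoles:

```python
# Peak pixel fillrate = ROPs * clock.
def fillrate_gpixels(rops, clock_ghz):
    return rops * clock_ghz

print(f"PS5 (64 ROPs @ 2.23 GHz):  {fillrate_gpixels(64, 2.23):.1f} Gpixels/s")   # ~142.7
print(f"XSX (64 ROPs @ 1.825 GHz): {fillrate_gpixels(64, 1.825):.1f} Gpixels/s")  # ~116.8
```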

Not much new, but AMD had a corporate presentation today






RDNA 2 is definitely made to scale to higher clock speeds better than before


But it still depends on a host of other factors. For example, the process: the consoles are using enhanced DUV rather than EUV. It's safe to assume you'd get better performance on EUV than on enhanced DUV.

RDNA1 had a sweetspot of 1.7-1.8 GHz. Going by the PS5 cooling patent and the XSX's GPU frequency, I think we can reasonably guess that the sweetspot for RDNA2 on enhanced DUV is somewhere around 1.8-1.95 GHz, or maybe 2 GHz if we're pushing it. Higher than that, though, and we're outside the northern tip of that sweetspot, so you're seeing smaller frequency gains for the power load.

Going by Cerny's own claim of a 10% power reduction for a 2% frequency drop (and this is with their cooling system factored in), you have a 5:1 ratio of power to frequency outside of that sweetspot at the frequencies PS5's GPU runs at. I don't know if that's much better than RDNA1 once you start overclocking those GPUs, tbh (I understand the PS5 isn't overclocking, but Cerny did refer to it as a continuous boost mode).

So in terms of RDNA2's better scaling to higher clock speeds, there's still a limit, and I'd venture that's at the 2 GHz figure. After that you still run into worse-than-linear scaling of power load for frequency gains, and the cooling needs to be all the more capable to help sustain those clocks for prolonged periods. Even then, it's beneficial to implement something like Smartshift; I wouldn't be surprised if AMD's RDNA2 desktop GPUs implement it to push for insane performance when competing with Nvidia's absolute top-of-the-line cards.
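Spelling that ratio out, with a crude cubic power law (P ∝ f³, a rule-of-thumb assumption rather than a measured curve) for contrast:

```python
# Cerny's quoted trade-off: ~10% power saved for a 2% clock drop.
freq_drop = 0.02
claimed_power_drop = 0.10
print(round(claimed_power_drop / freq_drop, 1))  # 5.0 -> the 5:1 ratio

# Under a rough cubic law, a 2% clock drop saves only ~6% power (~3:1):
cubic_power_drop = 1 - (1 - freq_drop) ** 3
print(f"{cubic_power_drop:.3f}")  # ~0.059
```

That 5:1 being steeper than the rule-of-thumb ~3:1 is consistent with the claim that the clock sits past the sweetspot.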

soulbait You're right in that both systems are going to be awesome and have amazing tech. I'm trying to gather my thoughts together to express this in the best way possible, and I think when people put the flags down and just look at the systems for what they are, they will be very impressed with the similarities and differences both bring to the table.

However, as for why we get passionate in discussing these systems? Well, for some of us, we're just really into discussing console technology. I love learning about console architectures, from SNES to Mega Drive to Saturn, Jaguar, PS1, PS2, Dreamcast, GameCube, etc. Arcade and microcomputer architectures, too. And I also love reading up on console history, as much as possible.

So one thing I've noticed is the influence the media and, nowadays, online forum and social media discourse can have in putting certain narratives out there when it comes to these systems. If you look back to the past, you can see that even with gaming magazines of the day, not all reporting was honest and truthful. Companies like Nintendo literally ran smear editorials against rivals, and big companies like EA put out false performance claims about certain gaming platforms that fed into narratives that weren't necessarily true (both of these examples are WRT SEGA). There are also instances of gaming magazines intentionally giving bad scores to games on certain platforms simply because they didn't get free gifts (one European gaming magazine did this to a computer game maker back in the day for a review of their game on the Amiga, IIRC).

Nowadays there are game journalists and content creators with massive platforms who can use their influence either to discuss these consoles fairly or to stoke fanboy flames and console wars. Some unfortunately DO choose the latter, and the thing is, the discourse they cause spreads outwards to other parts of the gaming community. It damages genuine discussion with people who are generally in the middle. Worse still, a lot of the tactics used to create those types of divides around console brands are borrowed from what we currently see from the media in driving political divides, so you get a lot of fanboys/fangirls with extreme POVs who fall into echo chambers that reinforce their strong biases, to the point they don't even realize they are doing it anymore!

As someone who likes pretty much every gaming platform and brand in one way or another, and has a lot of appreciation for their contributions to the industry and hobby, it's sooo frustrating to see when people (either unknowingly or outright intentionally) fabricate FUD against a given platform or system simply to support their preferred brand. That stuff spreads outwards and can influence other people to behave similarly, and can outright ruin genuine discussion. So whatever I can do to stem that type of thing, I will gladly do, because I'm passionate about this stuff.

Now, I have a tendency to do that more so for one given side over the other, but it really depends on what I perceive to be the trend in terms of which system/platform/brand gets the short end of the stick more often in these kinds of discussions. Right now, IMHO, that unlucky one tends to be MS and Xbox, so I gravitate towards focusing more speculation on that, in hopes of clearing up misconceptions or even being enlightened on some things from time to time. But that isn't to say I don't focus speculation on Sony or PS to clear up what I think are misconceptions (either embellishments of them or FUD against them); it's just that on this forum, for example, there are already a lot of other people who do that, so there's less reason for me to be redundant.
 
It is more accurate than taking an absolute delta irrespective of baseline.
You will notice much more of a graphical jump from 4TF to 6TF than you will from 10TF to 12TF.
That's what people refer to as diminishing returns.
I don't understand how people don't understand relative performance. It doesn't matter if it's 2 TF or 20 TF; if the relative difference is 18%, or whatever, the power difference is the same. 30% will always be a larger performance gap than 18%, no matter how large the numbers. Math does not lie, and usually only a Sith - and math - deals in absolutes!

(I'm not much of a Star Wars fan, I've just always liked that quote LOL)
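Both points are actually compatible: the same absolute 2 TF delta is a different relative difference depending on the baseline, which is all the quote's diminishing-returns example says:

```python
# Same absolute delta, different relative gain.
for base_tf in (4.0, 10.0):
    print(f"{base_tf:.0f} -> {base_tf + 2:.0f} TF: +{2.0 / base_tf * 100:.0f}%")
# 4 -> 6 TF:  +50%
# 10 -> 12 TF: +20%
```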
 

Lethal01

Member
You gotta be shitting me. Pray tell how an SSD processes pixels on the screen?
It doesn't; it helps the GPU by:

Having more available VRAM, due to keeping fewer assets in RAM since you can stream in so much per frame (roughly quantified in the sketch after this list). That allows higher-quality or more diverse assets, and more memory-intensive graphical effects.

Allowing for new asset-streaming technology like Nanite.

Being able to make the world look more seamless for the same reason. For example, you may be able to create a cityscape with a bunch of openings showing an underground portion of the city, which you couldn't have done at that quality if you had to keep it all in RAM.

Being able to seamlessly transition into giant buildings that house their own levels at any time. Some may think this is already possible, but it usually has to be very carefully orchestrated, OR the buildings have to be low enough quality that you can keep them ready in RAM for whenever the player wants to go inside.

Fewer restrictions on dynamic events in the city, due to being able to pull any enemy, vehicle, NPC, animation, etc. into RAM almost instantly.

Better cutscenes: once again, you can now make cuts that happen anywhere in the world without having to switch to a video file.
You could have a phone conversation between two characters doing their own thing in two totally different locations that each fill the RAM to the brim.
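To put a rough number on "so much per frame": divide the drive's throughput by the frame rate. The 5.5 GB/s raw figure is Sony's spec; the ~9 GB/s "typical compressed" figure and 60 fps are assumptions for illustration:

```python
# Per-frame streaming budget at a given frame rate.
def mb_per_frame(rate_gbs, fps):
    return rate_gbs * 1000.0 / fps

for label, rate_gbs in [("raw 5.5 GB/s", 5.5), ("compressed ~9 GB/s", 9.0)]:
    print(f"{label}: ~{mb_per_frame(rate_gbs, 60):.0f} MB per frame at 60 fps")
# raw 5.5 GB/s: ~92 MB per frame
# compressed ~9 GB/s: ~150 MB per frame
```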

Here, I'm going to post this shit for the thousandth time.



Next time I'll just PM anyone who needs an explanation.
 