
Next-Gen PS5 & XSX |OT| Console tEch threaD


kyliethicc

Member
Really?

Is this a case of MS just spending a shit ton or that 3rd party vendors can actually make money on GP with new games?
Game publishers are paid cash by Sony and Microsoft when their games are put on PS Plus / PS Now / Xbox Live / Game Pass.

Game publishers are also paid cash by Sony and Microsoft for timed exclusivity deals.

But none of that is funding a game. The publisher funds the game and collects the revenue.
 

FrankWza

Member
Well, me neither, but I prefer to believe them until they're proven to be lying.
They said they could have made it on last-gen consoles but they would have had to make concessions.
Probably a good idea, because I remember reading the Series S version was really low res.

This is an interesting quote against the tools argument, I think:
“With each generation of consoles, they’re still closer and closer to the PC,” says Zięba. “The structure and everything, so with new consoles and Xbox Series X and S, it was mostly a no brainer in a way, you know, like the engine is prepared for it and architecture is like normal, you know how this stuff works there, not like the PS3 or something. So no, it was a really pleasant experience, mostly.”
 

PaintTinJr

Member
And then we have games that demonstrate extremely fast load times on the PS5. Spider-Man Remastered, Spider-Man: Miles Morales, Demon's Souls and Nioh are the best examples of that in my opinion. Then there's the question of how Ratchet will handle the I/O, since it's supposed to be a proof of how it works.

Going by some of the multiplats I would agree with you that the PS5's I/O is inferior, but then we have examples of when it can be incredibly fast.

I guess in order for the I/O to be of any use, the devs have to program for it. And that's something that's probably not done in multiplats, but it is with 1st party titles and exclusives.

I guess we have to wait for more games before being certain of the systems' I/O capabilities. Does that seem OK to you?
Going by Control UE's load time being marginally slower on the PS5, and being around slow PC SATA 6Gb/s SSD speed despite using Oodle compression, it is proof that this generation, just like the PS3 one, will be marred by parity clauses capping frame-rates and putting artificial wait states in games, IMHO.

The game is an Ultimate Edition and a native PS5 app, and looks a bit last-gen compared to Miles Morales IMO - which is open world - yet requires 10 secs more of loading at a minimum of 5.5GB/s, as if the PS5 had 60GB of GDDR6 to fill. Even if we said it could only manage 25% of that performance - and still with no compression used - we'd still have time to spare that is just waiting, because games aren't using 16GB on PS5, and certainly not a cross-gen port.

So IMHO, the I/O performance of these consoles can only truly be assessed by their own first-party efforts, because most third parties will be contractually bound to avoid showing the true state of the comparison, if all these games taking longer than 3-4 secs to load are going to be the norm.
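To put numbers on that claim, here's a quick back-of-envelope check. It's just a sketch using the figures from the post above plus the PS5's rated 5.5GB/s raw throughput; nothing here is measured:

```python
# Back-of-envelope check of the load-time claim above, using the post's numbers
# and the PS5's rated raw SSD throughput (illustrative, not measured).
rated_raw_gbps = 5.5        # PS5 rated uncompressed read speed, GB/s
extra_load_seconds = 10     # extra loading cited for Control UE

implied_data_gb = rated_raw_gbps * extra_load_seconds
print(f"Implied extra data at full speed: {implied_data_gb:.0f} GB")
print("...yet the PS5 only has 16 GB of GDDR6 in total.")
```

Even at a quarter of the rated speed, 10 extra seconds would still imply roughly 14GB moved, which is close to the console's entire RAM pool.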
 
Well, me neither, but I prefer to believe them until they're proven to be lying.

We do have a massive jump in CPU and I/O. Some next-gen games are simply not going to be possible on last-gen systems. Slow HDDs and those anemic Jaguar cores can't handle a lot compared to these SSDs and Zen 2 CPUs.
You're absolutely right, the jump this gen is massive. And I'm glad we got rid of those Jaguars.
In an ideal world devs would just develop for current gen and make a clean cut; they'd also only develop for the newest GPU. But since we have something like 150 million last-gen PlayStations and Xboxes against 10 million current-gen consoles, it just makes no sense (financially) not to consider cross-gen for the foreseeable future.
So we'll never know how Halo Infinite or Horizon FW would have looked built just for current gen.
 
Going by Control UE's load time being marginally slower on the PS5, and being around slow PC SATA 6Gb/s SSD speed despite using Oodle compression, it is proof that this generation, just like the PS3 one, will be marred by parity clauses capping frame-rates and putting artificial wait states in games, IMHO.

The game is an Ultimate Edition and a native PS5 app, and looks a bit last-gen compared to Miles Morales IMO - which is open world - yet requires 10 secs more of loading at a minimum of 5.5GB/s, as if the PS5 had 60GB of GDDR6 to fill. Even if we said it could only manage 25% of that performance - and still with no compression used - we'd still have time to spare that is just waiting, because games aren't using 16GB on PS5, and certainly not a cross-gen port.

So IMHO, the I/O performance of these consoles can only truly be assessed by their own first-party efforts, because most third parties will be contractually bound to avoid showing the true state of the comparison, if all these games taking longer than 3-4 secs to load are going to be the norm.

I don't know if such a parity clause exists, though. If a game can load in 2 seconds on the PS5 and in 10 seconds on the XSX (just an example), I don't see why it's a big deal for developers to do that. They already make decisions that cause visual differences between the two (Hitman 3). I don't see why they can't do the same where the I/O is concerned.

But you're right that 1st party will be the ones to truly show it off. Ratchet should be an interesting one to look at, since it's supposed to show off what the I/O can do.
 
So we'll never know how Halo Infinite or Horizon FW would have looked built just for current gen.

I guess we never will. As I said earlier, hopefully that doesn't happen with games like Hellblade 2 and God of War Valhalla. But you never know with those large install bases on previous-gen consoles.
 

devilNprada

Member
Game publishers are paid cash by Sony and Microsoft when their games are put on PS Plus / PS Now / Xbox Live / Game Pass.
Well yeah I knew this....
But none of that is funding a game. The publisher funds the game and collects the revenue.
This was the question... I did not think so, but he kind of made it sound like it did... unless maybe it's F2P microtransaction-based.
Edit: it's the future as far as keeping titles relevant as they grow the library.
This means to me... not that Game Pass won't succeed, but their games will get buried in a large library.
 

PaintTinJr

Member
I don't know if such a parity clause exists, though. If a game can load in 2 seconds on the PS5 and in 10 seconds on the XSX (just an example), I don't see why it's a big deal for developers to do that. They already make decisions that cause visual differences between the two (Hitman 3). I don't see why they can't do the same where the I/O is concerned.

But you're right that 1st party will be the ones to truly show it off. Ratchet should be an interesting one to look at, since it's supposed to show off what the I/O can do.
Well, the alternative to believing that parity clauses - which were confirmed and defended by Xbox officials to DF back in the 360/PS3 days, IIRC - are back in play, is that either Cerny lied about the technical specs, or that all these developers doing native PS5 apps are terrible at their jobs, to be so far adrift from 5.5GB/s, never mind the theoretical 22GB/s with compression.

In fact, with the PS5 SSD being multi-channel, the worst loading performance we should expect is a 4x increase over a SATA 6Gb/s SSD, and Control Ultimate Edition is at 1/4 or 1/2 of that speed, if a 4GB-8GB RAM usage for the game is to be believed and everything needs to be reloaded on every death/restart.
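For what it's worth, the same reasoning as a worked sketch - rated specs plus the 4GB-8GB working set assumed in the post; it ignores decompression, seeks and CPU setup, so treat it as a ceiling:

```python
# Expected load times from raw throughput alone (figures from the post and
# rated specs; ignores decompression, seek and CPU setup costs).
sata_ssd_gbps = 0.55        # typical SATA 6Gb/s SSD, ~550 MB/s
ps5_raw_gbps = 5.5          # PS5 rated raw read speed

for ram_gb in (4, 8):       # assumed RAM usage for Control UE, per the post
    t_sata = ram_gb / sata_ssd_gbps
    t_ps5 = ram_gb / ps5_raw_gbps
    print(f"{ram_gb} GB: SATA SSD ~{t_sata:.1f}s vs PS5 raw ~{t_ps5:.1f}s")
```

On raw throughput alone the gap is about 10x, so even the post's conservative 4x floor leaves the observed load times hard to explain by hardware.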
 

IntentionalPun

Ask me about my wife's perfect butthole
Yes, they existed before, but not at that scale (in consumer electronics). That design (where the SPUs can handle AI, physics and graphics) forced devs like Guerrilla to utilize it to the max.

GPUs don’t do AI today... and GPUs did physics and graphics before the Cell.

I’m not seeing anything actually explained here about why the Cell was some precursor to modern GPGPUs. GPUs don’t use stream processing units. The only modern hardware I can find info on that does something similar is the Tempest Engine in the PS5.

What is the similarity between the Cell and modern GPGPUs that people are claiming prepared devs for using them? (Even though they existed before the Cell.)
 

DaGwaphics

Member
Going by Control UE's load time being marginally slower on the PS5, and being around slow PC SATA 6Gb/s SSD speed despite using Oodle compression, it is proof that this generation, just like the PS3 one, will be marred by parity clauses capping frame-rates and putting artificial wait states in games, IMHO.

The game is an Ultimate Edition and a native PS5 app, and looks a bit last-gen compared to Miles Morales IMO - which is open world - yet requires 10 secs more of loading at a minimum of 5.5GB/s, as if the PS5 had 60GB of GDDR6 to fill. Even if we said it could only manage 25% of that performance - and still with no compression used - we'd still have time to spare that is just waiting, because games aren't using 16GB on PS5, and certainly not a cross-gen port.

So IMHO, the I/O performance of these consoles can only truly be assessed by their own first-party efforts, because most third parties will be contractually bound to avoid showing the true state of the comparison, if all these games taking longer than 3-4 secs to load are going to be the norm.

Or maybe it's not a big conspiracy and the game precompiles some shaders on load or does some CPU busy work. :messenger_tears_of_joy:
 

roops67

Member
Well, the alternative to believing that parity clauses - which were confirmed and defended by Xbox officials to DF back in the 360/PS3 days, IIRC - are back in play, is that either Cerny lied about the technical specs, or that all these developers doing native PS5 apps are terrible at their jobs, to be so far adrift from 5.5GB/s, never mind the theoretical 22GB/s with compression.

In fact, with the PS5 SSD being multi-channel, the worst loading performance we should expect is a 4x increase over a SATA 6Gb/s SSD, and Control Ultimate Edition is at 1/4 or 1/2 of that speed, if a 4GB-8GB RAM usage for the game is to be believed and everything needs to be reloaded on every death/restart.
I get what you're saying: just by raw uncompressed speed alone the PS5 should be loading twice as fast as the XSX, but something is amiss. Not sure what to make of those parity clauses; hey, they did go to the effort of Oodle-compressing it all, so if anything, yes, it should load a heck of a lot faster on PS5. It could be how the multiplat game is written, therefore not taking advantage of native features - being multiplat, most of the code is the same between the platforms and can't use the PS5 features without breaking it, and yep, they would have had to add artificial waits not to break the code... dunno, I'm just guessing aloud.
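To illustrate the kind of thing being guessed at here, a minimal, entirely hypothetical sketch of a shared "lowest common denominator" loader that only uses a generic I/O path and pads short loads for parity. The function names and the parity floor are invented for illustration; no game is known to actually do this:

```python
# Hypothetical sketch of a shared multiplatform loader. Nothing here reflects
# a real engine; it just illustrates the guess above.
import time

MIN_LOAD_SECONDS = 5.0      # invented "parity" floor, purely illustrative

def load_level_generic(path):
    # Portable path shared by all platforms; never touches PS5-native
    # decompression hardware or I/O priority features.
    with open(path, "rb") as f:
        return f.read()

def load_level(path):
    start = time.perf_counter()
    data = load_level_generic(path)
    elapsed = time.perf_counter() - start
    if elapsed < MIN_LOAD_SECONDS:
        # Artificial wait state: pad the load so every platform feels the same.
        time.sleep(MIN_LOAD_SECONDS - elapsed)
    return data
```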
 

Lunatic_Gamer

Gold Member

End Of An Era: Sony Just Announced They’re Stopping Production Of The PS5

Get ready to experience a roller coaster of emotions, Sony diehards. In a press conference early Thursday morning, the Japanese console maker ushered in the end of an era with the bittersweet announcement that they will stop production of the PlayStation 5.
“It was a wild ride, but all good things must come to an end,” said a visibly emotional Sony CEO Kenichiro Yoshida, pausing several times to collect himself while announcing the momentous decision to cease manufacturing the console. “We want to thank the designers and the game makers. But most of all we want to thank the fans for everything they did to make this the most incredible three months in our company’s history.”
Of course, it’s better to go out on top than for people to grow tired of you, and we wish Sony all the best on their farewell tour selling off their last backstock of consoles. And as much as we wish the PS5 could last forever, we prefer this to the sad spectacle of watching Microsoft trying to wring a few more months out of the doddering Xbox Series X.


Sorry I couldn’t resist. Had to post it. Too funny. 😂🤣😂
 

End Of An Era: Sony Just Announced They’re Stopping Production Of The PS5

Sorry I couldn’t resist. Had to post it. Too funny. 😂🤣😂

Get him, boys.

 

Imtjnotu

Member

End Of An Era: Sony Just Announced They’re Stopping Production Of The PS5

Sorry I couldn’t resist. Had to post it. Too funny. 😂🤣😂
Someone start a next-next-gen speculation thread now and get this PS6 ball rolling.
 

Hashi

Member
GPUs don’t do AI today... and GPUs did physics and graphics before the Cell.

I’m not seeing anything actually explained here about why the Cell was some precursor to modern GPGPUs. GPUs don’t use stream processing units. The only modern hardware I can find info on that does something similar is the Tempest Engine in the PS5.

What is the similarity between the Cell and modern GPGPUs that people are claiming prepared devs for using them? (Even though they existed before the Cell.)
Vector processing. GPUs generally are streaming units.
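If it helps, the shared idea is data-parallel "stream" processing: one small kernel applied uniformly across a long stream of elements, which is how both the Cell's SPUs and GPU shader cores are programmed. A rough sketch, with NumPy standing in for the vector hardware and invented vertex data:

```python
# Stream/vector processing in miniature: one kernel over a whole batch of data,
# the programming model shared by Cell SPUs and GPU shader cores.
# (NumPy is just a stand-in here; the data is made up.)
import numpy as np

positions = np.random.rand(1_000_000, 3).astype(np.float32)  # a "stream" of vertices
velocity = np.array([0.0, -9.8, 0.0], dtype=np.float32)
dt = 1.0 / 60.0

# Scalar style (a general-purpose core): touch one element at a time.
# for i in range(len(positions)):
#     positions[i] += velocity * dt

# Stream style (SPU- or GPU-like): the same kernel applied to the entire batch.
positions += velocity * dt
```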
 

jonnyp

Member
Quality mode has an input lag issue, and it's so choppy that even though it's 30fps, the choppiness makes it feel below 30fps. I play 30fps games with zero problems, but it's near unplayable in Quality mode compared to Spider-Man MM, which I played in Fidelity mode. Maybe a bad motion blur implementation? Not sure.

Yeah, I thought there was something wrong with my PS5 when I turned on RT mode in Control UE. It's unplayable in that mode, and I also have no problem playing at 30fps in other games.
 
Alex's narrative is always: "Nope, the PS5 doesn't have this, nor that, nor the other. I know practically nothing about the PS5 hardware because I couldn't care less, but believe me when I say everything inside it gives diminishing returns."
He was already console warring with "PS5 has no hardware RT" for months before being very surprised by Ratchet & Clank's and Spider-Man's pristine implementations of RT.

He just moved the goalposts: Xbox has to do something better (in hardware) than the PS5. But he is wrong again, as the PS5 is the console that has shown the most impressive display of polygons, not the XSX.
 

PaintTinJr

Member
Or maybe it's not a big conspiracy and the game precompiles some shaders on load or does some CPU busy work. :messenger_tears_of_joy:
Shaders on consoles are loaded in a precompiled state AFAIK, because they would otherwise represent an attack vector for hacking/reverse engineering to run homebrew/pirated software, so the chances of that are very slim IMHO. And with unified memory, some CPU busy task sounds unlikely, when it would be an accelerated async task if it were needed. But the PC version would have shown up these time-consuming workloads in the H2H, which I'm pretty sure it didn't.
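As a loose analogy for why precompiled shaders load faster, here Python's compile()/marshal stand in for shader source versus prebuilt shader binaries; this is obviously not how a console GPU driver works, just the same compile-ahead trade-off in miniature:

```python
# Loose analogy: compiling "source" at load time vs loading a prebuilt blob.
# Python bytecode stands in for shader binaries; the source is invented.
import marshal
import time

source = "\n".join(f"def shader_{i}(x): return x * {i}" for i in range(2000))

start = time.perf_counter()
code = compile(source, "<shaders>", "exec")   # "compile on load" path (PC-style)
compile_ms = (time.perf_counter() - start) * 1e3

blob = marshal.dumps(code)                    # what would ship precompiled on disc

start = time.perf_counter()
marshal.loads(blob)                           # "load precompiled" path (console-style)
load_ms = (time.perf_counter() - start) * 1e3

print(f"compile from source: {compile_ms:.1f} ms, load prebuilt blob: {load_ms:.1f} ms")
```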
 

assurdum

Banned
He was already console warring with "PS5 has no hardware RT" for months before being very surprised by Ratchet & Clank's and Spider-Man's pristine implementations of RT.

He just moved the goalposts: Xbox has to do something better (in hardware) than the PS5. But he is wrong again, as the PS5 is the console that has shown the most impressive display of polygons, not the XSX.
Now, I don't particularly care which offers the best one, but Alex clearly doesn't follow the PS5 stuff, and if he were intellectually honest he would just say it's not his area and move on.
 

PaintTinJr

Member
Can you provide some links regarding that? Would love to read up on it.
I'll try, but we are probably at least a decade too late for that article to still be available - given how it showed what's behind the curtain - and if my archive.org experiences with DF articles are anything to go by, it wouldn't be there in its original form now, either. I'm sure I can't be the only person who remembers that brief defence of the parity clause from back in the day.

The best I can offer is a reminder of the widely read article at the start of the PS4/XB1 gen - which I'm sure most people remember, hopefully including yourself - when Ubisoft said they developed both versions of AC to have equal frame-rate/resolution/fidelity, despite the major hardware differences between the two platforms; essentially confirming a parity clause was in play, without actually saying it was contractually obligated.
 

assurdum

Banned
I'll try, but we are probably at least a decade too late for that article to still be available - given how it showed what's behind the curtain - and if my archive.org experiences with DF articles are anything to go by, it wouldn't be there in its original form now, either. I'm sure I can't be the only person who remembers that brief defence of the parity clause from back in the day.

The best I can offer is a reminder of the widely read article at the start of the PS4/XB1 gen - which I'm sure most people remember, hopefully including yourself - when Ubisoft said they developed both versions of AC to have equal frame-rate/resolution/fidelity, despite the major hardware differences between the two platforms; essentially confirming a parity clause was in play, without actually saying it was contractually obligated.
I think it's more an Ubisoft choice. They prefer parity over a better version, probably because they are scared of selling less of the "inferior" version? Though curiously this attitude wasn't applied at all to the One X version - well, at least after the launch patch. Quite bizarre, especially considering their sales are majorly tied to the PlayStation brand.
 

Elog

Member
If the hardware is identical between RDNA1 and RDNA2, and primitive shaders and mesh shaders are the same thing hardware-wise, why has AMD not just enabled mesh shaders on RDNA1? They could have done that two years ago.
That is not what he is saying. He is saying that the primitive shader function is the same for both. What the mesh hardware block does is determine which pieces of the geometry the primitive shader should be applied to, and how much (i.e. culling and variable rate shading, respectively). RDNA1 does not have that sorting function.

If you look at a simplistic (please observe) outline of the hardware flow comparison for the shader function between RDNA1, RDNA2 and the Sony solution, it looks roughly as follows:

RDNA1: Geometry generated by the GE -> 100% of geometry data pushed to the shader arrays/CUs -> primitive shader hardware unit works on close to 100% of geometry data -> output
RDNA2: Geometry generated by the GE -> 100% of geometry data pushed to the shader arrays/CUs -> Mesh/VRS hardware block works through geometry to cull/prioritize -> primitive shader hardware unit works on just 10-20% of geometry data -> output
PS5: Geometry generated by the GE -> culling and prioritization of geometry happens at the GE level -> 10-20% of geometry data pushed to the shader arrays/CUs -> primitive shader hardware unit works on just 10-20% of geometry data -> output

On paper the PS5 solution is better BUT comes with a big drawback - it requires re-engineering of graphics engines to be used. The RDNA2 solution (and the similar Nvidia one) can be used with current graphics engines without significant recoding.

Why is the PS5 solution better on paper (please observe again - paper is one thing and reality is another; we do not know the real-world numbers yet)? Two main things: first, doing culling and prioritization at the GE level makes it easier for the programmer to control exactly what is happening via the APIs. Secondly, since less data is pushed to the arrays in the first place, your CUs are used much more efficiently throughout (this is what Matt H meant with his tweet above in this chain), including a significant increase in practical shader array and CU cache sizes (less data -> practical increase in cache size and efficiency).
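A toy model of those three flows, using the 10-20% figure from the outline above (a 15% survival rate stands in for it; the counts are illustrative only, nothing is benchmarked):

```python
# Toy model of the three geometry flows outlined above. VISIBLE stands in for
# the post's "10-20%" culling figure; all numbers are illustrative.
TOTAL_TRIS = 1_000_000
VISIBLE = 0.15   # fraction of geometry surviving culling

def rdna1(n):
    pushed = n                      # all geometry hits the shader arrays/CUs
    shaded = n                      # primitive shaders work on ~100% of it
    return pushed, shaded

def rdna2(n):
    pushed = n                      # still all pushed to the arrays...
    shaded = int(n * VISIBLE)       # ...but the Mesh/VRS block culls before shading
    return pushed, shaded

def ps5(n):
    pushed = int(n * VISIBLE)       # GE culls before dispatch, so the CUs
    shaded = pushed                 # (and their caches) only ever see survivors
    return pushed, shaded

for name, flow in (("RDNA1", rdna1), ("RDNA2", rdna2), ("PS5", ps5)):
    pushed, shaded = flow(TOTAL_TRIS)
    print(f"{name}: {pushed:,} tris pushed to CUs, {shaded:,} shaded")
```

The point the model makes is the second one above: only the PS5 column reduces the data pushed to the CUs at all, which is where the claimed cache and efficiency gains would come from.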
 