
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

Nowcry

Member
Using temporal injection and getting the same experience as native 4K, but with a constant 60 fps, will offer a better experience for the user.

I do not see a clear victory at all: 20 fps drops in some scenarios while maintaining 4K, versus 1800p with a stable 60 fps. It's going to be very close.
 

HoofHearted

Member
It's also possible the difference in performance is much smaller. It probably is.

PS5 @ 1800p could be hitting a higher FPS average than the XSX @ 4K if both were unlocked.

Without dynamic resolution, and with a locked framerate, you get these big pixel-output differences without really knowing what the real engine performance difference was. PS5 couldn't maintain 60FPS at 4K? Drop it to 1800p... that is a quick way to "solve that problem" lol
The more important (and interesting) question that should be asked is: what caused the developer to make a conscious choice to target a lower resolution and shadow quality in this particular game?

Up until now - all cross-gen games for XSX and PS5 have been (generally) equal in output (assuming latest updates/patches in play... ).

Per the article - the FPS drops are rare and inconsequential:

"Put simply, it's 60 frames per second... with just one exception in our hours of play. In the Mendoza mission set in Argentina, it is possible to see Xbox consoles run between 50 to 60fps around a field towards the outskirts of the level, while PlayStation 5 remains constant at 60fps. Hopefully IO will look at improving this for owners of the Microsoft machines, but everything else we played ran flawlessly - bar a very slight stutter in a cutscene at the beginning of the Miami stage from Hitman 2, where all consoles dip to near 40fps. At this point though, it feels more like a nitpick rather than anything that would have an impact on any specific purchasing recommendation."


The biggest difference is resolution and shadow quality:

"Meanwhile, there are additional tweaks to shadow quality too. Series S uses the equivalent to PC's low quality shadows (in common with the last-gen base console renditions of Hitman 3), while PlayStation 5 runs at medium (similar to One X and PS4 Pro) and Xbox Series X operates at high. The difference is fairly subtle across all three, but it's there nonetheless. In all other scenarios, all three next-gen consoles are the same and the overall presentation is first class. Yes, Series X's resolution advantage is there, and the Glacier engine thrives on precision, giving it a pristine edge. With that said, however, the lower resolution on PlayStation 5 is in no way a problem and still looks wonderful."
 

cragarmi

Member
The more important (and interesting) question that should be asked is: what caused the developer to make a conscious choice to target a lower resolution and shadow quality in this particular game?

Up until now - all cross-gen games for XSX and PS5 have been (generally) equal in output (assuming latest updates/patches in play... ).

Per the article - the FPS drops are rare and inconsequential:

"Put simply, it's 60 frames per second... with just one exception in our hours of play. In the Mendoza mission set in Argentina, it is possible to see Xbox consoles run between 50 to 60fps around a field towards the outskirts of the level, while PlayStation 5 remains constant at 60fps. Hopefully IO will look at improving this for owners of the Microsoft machines, but everything else we played ran flawlessly - bar a very slight stutter in a cutscene at the beginning of the Miami stage from Hitman 2, where all consoles dip to near 40fps. At this point though, it feels more like a nitpick rather than anything that would have an impact on any specific purchasing recommendation."


The biggest difference is resolution and shadow quality:

"Meanwhile, there are additional tweaks to shadow quality too. Series S uses the equivalent to PC's low quality shadows (in common with the last-gen base console renditions of Hitman 3), while PlayStation 5 runs at medium (similar to One X and PS4 Pro) and Xbox Series X operates at high. The difference is fairly subtle across all three, but it's there nonetheless. In all other scenarios, all three next-gen consoles are the same and the overall presentation is first class. Yes, Series X's resolution advantage is there, and the Glacier engine thrives on precision, giving it a pristine edge. With that said, however, the lower resolution on PlayStation 5 is in no way a problem and still looks wonderful."
To me this is just the consequence of an engine developed for last-gen machines favouring TFs over clock speeds. The Series X can just brute-force it with its TF advantage; the effect used for reflections is very expensive in terms of graphics computation, and if they used RT instead the machines would probably be a lot closer. Interesting nonetheless, but a cross-gen game is unlikely to show either machine in its best light.
 

Lysandros

Member
I remember RGT saying that Sony was sending out tutorials and code to first- and third-party studios so they could actively explore the GE and use it in different ways.

He also stated that a developer told him it's like Primitive Shaders on steroids. It's fair to say we won't see games take full advantage of it until around the second or third wave of games, which means 2023 onwards.

If his leaks are correct in that the GE also heavily influenced RDNA 3's geometry handling, then it should be something special.
I hope that Sony is doing something similar for its hardware-heavy SSD/IO complex, because the situation is blatantly embarrassing in third-party games right now.
 

Gudji

Member
Why is it so weird that Hitman 3 runs at 1800p on PS5? So far all the third-party games have performed the same on both consoles, so why now?
Sony hasn't updated the tools yet.
 

kyliethicc

Member
I guess the big question is why not make both 4K with DRS?
I wonder if they wanted to make sure the game is 1080p and 2160p on the Series S and X respectively. That could explain why they chose a static resolution that results in some frame drops instead of using dynamic res. Maybe their engine isn't built for DRS, idk.

Just like some games run at higher res on PS5, I don't think this game tells us much new info about what either console is truly capable of. I bet the PS5 could run the game at 4K just with some frame drops, etc.

The honest reality of all the console war shit talking is that most of us would never be able to tell any of these differences just by playing the games. Thankfully we have DF and others supplying us all with constant ammo.
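For the curious, the core of a dynamic-resolution controller is conceptually simple. Here is a rough, purely illustrative sketch in Python; the frame budget, bounds, and square-root heuristic are my own assumptions, nothing to do with IO Interactive's actual Glacier engine:

```python
# Illustrative dynamic-resolution-scaling controller. The frame budget,
# bounds and gain are made-up values, not any engine's real tuning.
TARGET_FRAME_MS = 16.6   # 60 fps budget
MIN_H, MAX_H = 1080, 2160

def next_render_height(current_h, last_frame_ms):
    """Pick the next frame's render height from the last frame's cost."""
    if last_frame_ms > TARGET_FRAME_MS:
        # Over budget: shrink. Pixel count scales with height squared,
        # so take the square root of the needed cost reduction.
        current_h = int(current_h * (TARGET_FRAME_MS / last_frame_ms) ** 0.5)
    elif last_frame_ms < TARGET_FRAME_MS * 0.85:
        # Comfortably under budget: creep back up toward native.
        current_h = int(current_h * 1.05)
    return max(MIN_H, min(MAX_H, current_h))
```

The square root shows up because pixel count grows with the square of the height at a fixed aspect ratio, so halving the frame cost only needs roughly a 30% cut in height.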
 

He's talking about the compiler architecture for the GPU, in which the more efficient the compiler's architecture is, the better and more efficiently the graphics code runs. I'm assuming Alex has no clue what he's talking about, and I don't say that lightly. Dealer proved his absolute lack of common sense by citing Tom Warren as a credible source of info; that's how low he went, and this post he made now further proves that point. Citing Battaglia as a credible source of info when he knows jack about compiler architectures is a new low. But then again, it's Dealer we're talking about, so we're basically talking about an absolute nobody who will stoop far too low to get his nonsensical point across. By the way, he latches onto the BS spread by people like Astal and Misterxmedia, and I'm sure that on its own says a lot about him.
 

Mahavastu

Member
Not really, because then they would have to whitelist the TV on the PS5 and the PS5 on the TV. I really don't think it works that way. If it really works (currently), it is just a custom standard nobody knows about, something we really don't want to see when there are standards that could work with any device.

But I really get it. HDMI 2.1 seems to be a mess.
I assume the Sony TV and the PS5 only support some subset of the HDMI 2.1 API/functionality needed to get VRR working, but don't support it 100%, so it may fail with other devices not made by Sony.
Such problems happen quite often when a new technology is introduced: devices from different manufacturers turn out not to be compatible with each other even when they should be. Maybe Sony just wants/needs more devices from other manufacturers to test against, to avoid frustration.
As you said, HDMI 2.1 seems to be a mess, and that doesn't make a manufacturer's life easy.
 
Someone's PS5 be coming in like this


Ice cleats should be standard issue PPE for a job like that. If that video happened in the UK and the video was shown to the HSE, that courier company would get a right bollocking.
As explained before, these comparisons between game consoles and PC are not really interesting, because it is impossible to make them fair: the PS5 (and XSX of course) run effects at quarter resolution... We already said that, and Alex only now seems to share that point...



Afaik, alpha transparency effects are more memory-bandwidth bound than GPU-compute bound. Given how expensive memory bandwidth is on consoles, it makes sense that lower-res alpha effects are used. On PC, GPUs get the full VRAM bandwidth all to themselves, whereas on console the UMA means the GPU has to share what paltry bandwidth is available with the CPU.

A hypothetical PS5-equivalent-spec PC would be able to run full-res alpha effects no problem, because there's simply more total and effective GPU memory bandwidth available, and far fewer contention issues.
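To put the contention point in concrete (if toy) terms, here's a back-of-the-envelope model in Python. The numbers and the flat penalty are illustrative assumptions, not measured figures for any real console:

```python
# Toy model of effective GPU bandwidth on a unified-memory console.
# All figures here are illustrative placeholders, not measured numbers.
def effective_gpu_bandwidth(total_gbps, cpu_gbps, contention_penalty=0.15):
    """GPU bandwidth left after CPU traffic, minus a flat contention tax.

    On a UMA design the CPU's traffic both consumes raw bandwidth and
    degrades GPU efficiency (arbitration, bank conflicts), modelled
    here as a single percentage penalty.
    """
    return (total_gbps - cpu_gbps) * (1.0 - contention_penalty)

# A discrete PC GPU keeps its VRAM to itself: no CPU share, no penalty.
```

The point of the model is that CPU traffic costs the GPU twice: once in raw bandwidth consumed, and again through the efficiency penalty on what remains.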
 

Mahavastu

Member
He's talking about the compiler architecture for the gpu, in which the more efficient the architecture of the compiler is, the better and more efficient the graphics code runs.
Alex talks about "build performance", which means "how long does it take to build/compile the game". It has nothing to do with the performance of the resulting code.
It is something that makes the "change -> compile -> test" cycle faster, which reduces the devs' idle time.
 

Shmunter

Member
He's talking about the compiler architecture for the GPU, in which the more efficient the compiler's architecture is, the better and more efficiently the graphics code runs. I'm assuming Alex has no clue what he's talking about, and I don't say that lightly. Dealer proved his absolute lack of common sense by citing Tom Warren as a credible source of info; that's how low he went, and this post he made now further proves that point. Citing Battaglia as a credible source of info when he knows jack about compiler architectures is a new low. But then again, it's Dealer we're talking about, so we're basically talking about an absolute nobody who will stoop far too low to get his nonsensical point across. By the way, he latches onto the BS spread by people like Astal and Misterxmedia, and I'm sure that on its own says a lot about him.
If that’s the case, a quick patch with recompiled code for past games should demonstrate a zero-effort upgrade. Waiting...
 

Bo_Hazem

Banned
Maybe it's the first next-gen game without DRS.

Xbox Series X at native 4K has a few dips. PS5 at native 4K would likely have more dips.

At its static resolution of 1800p, PS5 holds a constant 60fps.

Maybe with DRS the PS5 would hit 4K in some scenarios.

But of course, a clear Xbox Series X win.

[UPDATED]

Nope:

1800p vs 2160p = 44% (more total pixels)
60fps (actually more than 60fps, to hold a solid 60fps all the time) vs 41fps = a 46.3% gap

46.3 - 44 = more than a 2.3% advantage for PS5

Another usual win for PS5, but unknown how big exactly, because a solid 60fps is actually something around 60-80fps.
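Whatever one thinks of weighing a full-time resolution gap against an occasional framerate gap, the raw percentages quoted above do check out. A throwaway Python snippet:

```python
# Reproduce the pixel-count and framerate percentages quoted above.
ps5_pixels = 3200 * 1800    # 1800p at 16:9
xsx_pixels = 3840 * 2160    # native 4K

res_advantage = (xsx_pixels / ps5_pixels - 1) * 100  # XSX pushes more pixels
fps_advantage = (60 / 41 - 1) * 100                  # 60fps vs a 41fps dip

print(round(res_advantage, 1), round(fps_advantage, 1))  # 44.0 46.3
```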
 

Rea

Member
uqHsXtb.jpg

Seems like the Hitman devs never give PlayStation much love. This is Hitman 2 running on Pro vs X. As you can see, the performance delta is massive even at the same resolution. Not many other third-party games have differences this massive between the two platforms. So yeah, the Hitman game engine is not well optimised for PlayStation.
 
If that’s the case, a quick patch with recompiled code for past games should demonstrate a zero-effort upgrade. Waiting...
I mean, Alex has his sources (but what sources?). Maybe the pulled-out-of-the-ass kind of source. But what am I saying? The more knowledgeable Xbox “insiders” out there like Dealer & Colt said it’s the “tools”. So it must be the “tools”, right? Any BS they can conjure to make their plastic box of choice look disadvantaged. In this day and age, it’s simply unacceptable to ship a devkit with unfinished game development tools to devs during the final stretch leading up to the consoles’ launch. Some people do not realize how stupid that assertion sounds, how un-Microsoft it looks. I would never, in a million years, arrive at the assumption that Microsoft, a software development and cloud infrastructure powerhouse, and developer of one of the two most popular OSes on planet earth, would ship unfinished software to game devs. That would undoubtedly be detrimental to their reputation, apart from the fact that it sounds dumb as f@¢k. Microsoft themselves wouldn’t stoop so low as to ship devkits with unfinished software dev tools in them. But dumbasses like Colt & Dealer find it perfectly reasonable to assume something that dumb. *Tsk tsk tsk* What absolute “brainlets”. I had no idea some people were that dumb.
 

Bo_Hazem

Banned
Seems like the screenshots have rattled some of the Xbox mob. :lollipop_tears_of_joy: Anyway, it's obvious that PS5 is still ahead; VGTech and NXGamer should have a clearer head-to-head video coming as well. Overall it seems rushed, as one is more of a performance mode and the other a resolution mode.
 

LiquidRex

Member
Not sure if he meant it, but it seems the recycled FUD won't stop, even when it's actually nearly a tie here in this game, while PS5 has been leading by a comfortable advantage in most games so far.
What about CPU SMT? How much of an impact will that have, given that the XSX has the option of running with or without it, whereas the PS5 is SMT-locked as far as Hitman 3 is concerned?
 

Bo_Hazem

Banned
What about CPU SMT? How much of an impact will that have, given that the XSX has the option of running with or without it, whereas the PS5 is SMT-locked as far as Hitman 3 is concerned?

Yup, that's another thing. But if both aren't running at identical resolutions, it's hard to compare, because the PS5 being locked at 60fps in the same scene means it's actually pushing 60+ fps. So overall it's a tie, or a small advantage for PS5, even with the XSX having the option to boost to 3.8GHz with 8 threads.
 

Codeblew

Member
Yup, that's another thing. But if both aren't running at identical resolutions, it's hard to compare, because the PS5 being locked at 60fps in the same scene means it's actually pushing 60+ fps. So overall it's a tie, or a small advantage for PS5, even with the XSX having the option to boost to 3.8GHz with 8 threads.
But if the game uses more than 8 threads, SMT will win regardless. 1 or 2 threads are probably used by the OS, but the point stands: non-SMT would be more performant on Xbox if the game uses <=8 threads, while above 8 threads it's pretty much a wash without SMT. I would hope modern games use SMT to get max performance on both consoles.
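That intuition can be captured in a toy throughput model (Python). The SMT yield figure is an assumed ballpark, roughly the 20-30% uplift commonly cited for a second hardware thread, not a measured Zen 2 number:

```python
# Toy throughput model for the SMT trade-off described above.
# SMT_YIELD is an assumed ballpark (a second hardware thread typically
# adds ~20-30% throughput), not a measured Zen 2 figure.
SMT_YIELD = 0.25

def relative_throughput(clock_ghz, cores, smt, sw_threads):
    """Crude estimate: clock times usefully occupied core-equivalents."""
    if not smt or sw_threads <= cores:
        # One software thread per core: each gets a full core.
        return clock_ghz * min(sw_threads, cores)
    # Some cores host two threads; a pair yields ~1.25x one thread.
    paired = min(sw_threads - cores, cores)  # cores running two threads
    single = cores - paired                  # cores running one thread
    return clock_ghz * (single + paired * (1.0 + SMT_YIELD))
```

Plugging in the Series X clocks (3.8 GHz without SMT, 3.6 GHz with it), an 8-thread workload favours the non-SMT mode while a 16-thread workload favours the SMT mode, which matches the claim above.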
 

Bo_Hazem

Banned
But if the game uses more than 8 threads, SMT will win regardless. 1 or 2 threads are probably used by the OS, but the point stands: non-SMT would be more performant on Xbox if the game uses <=8 threads, while above 8 threads it's pretty much a wash without SMT. I would hope modern games use SMT to get max performance on both consoles.

And there's no decompression running on the PS5's CPU, versus the PC. It's really an interesting gen to watch and enjoy.
 

Riky

$MSFT
44% resolution advantage for XSX.
More than 46.3% performance advantage for PS5.

Just swallow it as it is.

You're the one gagging pretty badly.

The 44% resolution advantage holds for 100% of the game, and the higher shadow settings also apply to the entire game.

The "performance advantage" applies to less than 0.5% of the game, if that.

Also, VRR support totally negates that minuscule "performance advantage".
 

Dibils2k

Member
uqHsXtb.jpg

Seems like the Hitman devs never give PlayStation much love. This is Hitman 2 running on Pro vs X. As you can see, the performance delta is massive even at the same resolution. Not many other third-party games have differences this massive between the two platforms. So yeah, the Hitman game engine is not well optimised for PlayStation.
It's because many devs pushed for higher resolution on the One X versions; if the One X version were the same resolution as the PS4 Pro's, every game would have this kind of performance advantage on Xbox.
 

onesvenus

Member
He's talking about the compiler architecture for the GPU, in which the more efficient the compiler's architecture is, the better and more efficiently the graphics code runs. I'm assuming Alex has no clue what he's talking about, and I don't say that lightly. Dealer proved his absolute lack of common sense by citing Tom Warren as a credible source of info; that's how low he went, and this post he made now further proves that point. Citing Battaglia as a credible source of info when he knows jack about compiler architectures is a new low. But then again, it's Dealer we're talking about, so we're basically talking about an absolute nobody who will stoop far too low to get his nonsensical point across. By the way, he latches onto the BS spread by people like Astal and Misterxmedia, and I'm sure that on its own says a lot about him.
What's so wrong about Alex's quote? Better compilers do increase the performance of the same code.
Alex talks about "build performance", which means "how long does it take to build/compile the game". It has nothing to do with the performance of the resulting code.
It is something that makes the "change -> compile -> test" cycle faster, which reduces the devs' idle time.
Which in turn allows them to be more productive, and thus able to improve things faster than before. Why wouldn't that be an improvement?
[UPDATED]

Nope:

1800p vs 2160p = 44% (more total pixels)
60fps (actually more than 60fps, to hold a solid 60fps all the time) vs 41fps = a 46.3% gap

46.3 - 44 = more than a 2.3% advantage for PS5

Another usual win for PS5, but unknown how big exactly, because a solid 60fps is actually something around 60-80fps.
You are comparing something that's true for all of the play time (the resolution difference) with something that's only true for a small fraction of it (the framerate difference).
Treating those as absolute values is an obviously flawed comparison, as anyone with minimal math knowledge can see.
You should know better than to keep trolling.
I mean, Alex has his sources (but what sources?). Maybe the pulled-out-of-the-ass kind of source. But what am I saying? The more knowledgeable Xbox “insiders” out there like Dealer & Colt said it’s the “tools”. So it must be the “tools”, right? Any BS they can conjure to make their plastic box of choice look disadvantaged. In this day and age, it’s simply unacceptable to ship a devkit with unfinished game development tools to devs during the final stretch leading up to the consoles’ launch. Some people do not realize how stupid that assertion sounds, how un-Microsoft it looks. I would never, in a million years, arrive at the assumption that Microsoft, a software development and cloud infrastructure powerhouse, and developer of one of the two most popular OSes on planet earth, would ship unfinished software to game devs. That would undoubtedly be detrimental to their reputation, apart from the fact that it sounds dumb as f@¢k. Microsoft themselves wouldn’t stoop so low as to ship devkits with unfinished software dev tools in them. But dumbasses like Colt & Dealer find it perfectly reasonable to assume something that dumb. *Tsk tsk tsk* What absolute “brainlets”. I had no idea some people were that dumb.
You talk as if rushed software doesn't exist. Should I point you to the number of post-release patches they have published for Windows?
What's worse: shipping unfinished devkits, or not shipping them at all?
 

Akuji

Member
The whole tools talk is weird. As history has shown, the graphics these consoles can display improve within a gen. That goes for both.
It's not embarrassing to say tools will make a console perform better. It's embarrassing to say one console will improve radically more in that respect than the other, without giving anything that thought is based on. PS5 seems more like the console with more headroom to improve because, from what I know, there is more custom silicon in the PS5 that needs to be handled correctly, like the Geometry Engine.

Does Xbox have an equivalent? Hardware-based, of course.
 

Lunatic_Gamer

Gold Member

Nioh 3 Hint Dropped as Team Ninja Explore Haptic Feedback Possibilities


“We considered a lot of different ideas for how to support these features [in Nioh 2]”, he told DailyBits. “For a really intense action game, utilizing the haptic feedback too much may take away from the player’s experience with the game and could hurt their overall enjoyment of the title.”
“As a result, we implemented only a few of the most suitable ideas into the game. I would like to try and make a game that makes full use of the PS5’s haptic feedback feature in the future. It would turn into a game that couldn’t be experienced on the previous generation of hardware.”
“The plan for now is to have the Nioh team move on to work on new projects after the release of Nioh 2 Complete Edition. In order to ensure that future titles, including the possibility of Nioh 3, are titles that all of our fans can enjoy and look forward to, we are putting all of our effort into Nioh 2 CE.”

 
What's so wrong about Alex's quote? Better compilers do increase the performance of the same code.

Which in turn allows them to be more productive, thus being able to improve things faster than before. Why shouldn't that be an improvement?

You are comparing something that's true for all of the play time (the resolution difference) with something that's only true for a small fraction of it (the framerate difference).
Treating those as absolute values is an obviously flawed comparison, as anyone with minimal math knowledge can see.
You should know better than to keep trolling.

You talk as if rushed software doesn't exist. Should I point you to the number of post-release patches they have published for Windows?
What's worse, shipping unfinished devkits or not shipping them at all?
Code runs faster as long as you dedicate time to optimizing it. The question isn’t how good the compiler is; the question is how good the code is and how well written it is. You can have the world’s best compiler and the fastest CPU around, but that won’t do your code any good if it’s filled with flaws that can and will affect performance. It’s not the compiler that’s the problem, it’s the code and its quality. If the code is a pile of awful, messed-up spaghetti, then you can bet that the performance of the program and the maintainability of its code will suffer in the long term. The tools excuse is the oldest one in the book.
A lot of veteran dev studios have said that the tools are fine, and the most likely scenario is that devs have the tools but haven’t gotten around to implementing many of them yet. So the “it’s the tools” excuse isn’t really valid in the context that fanboys like Colt and Dealer, who have latched onto the nonsense of nobodies like Tom Warren, Astal and MisterXMedia among others, are trying to put it in. Who am I supposed to believe: actual devs with experience working on those devkits, or absolute nobodies with no experience in that field whatsoever?
Unfinished devkits and tools for a piece of hardware that Microsoft spent a good 3-4 years working on are unacceptable, given the time spent on it. Shipping the tools 90% finished is understandable; shipping them half finished at the point when the last devkits were sent to devs is not. Especially for a software giant like Microsoft, which many people consider the gold standard in terms of the software engineering prowess of its employees.
 

Bo_Hazem

Banned
You are comparing something that's true for all of the play time (the resolution difference) with something that's only true for a small fraction of it (the framerate difference).
Treating those as absolute values is an obviously flawed comparison, as anyone with minimal math knowledge can see.
You should know better than to keep trolling.

Sorry to tell you, but you're making yourself sound like the troll here. So FPS performance is fictional? It's OK to drop to 40-50fps now?

 

onesvenus

Member
Code runs faster as long as you dedicate time to optimizing it. The question isn’t how good the compiler is; the question is how good the code is and how well written it is. You can have the world’s best compiler and the fastest CPU around, but that won’t do your code any good if it’s filled with flaws that can and will affect performance. It’s not the compiler that’s the problem, it’s the code and its quality. If the code is a pile of awful, messed-up spaghetti, then you can bet that the performance of the program and the maintainability of its code will suffer in the long term.
I'm not arguing about the tools thing; I'm talking about Alex's quote about how the compiler improves the code.
Leaving the quality of the code itself aside: for the exact same code, different compiler versions can generate better-performing output. That's especially true of GPU code.
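A contrived illustration of the point, with the transformation written out by hand in Python (a real optimizing compiler does this kind of rewrite automatically on unchanged source, which is why a compiler upgrade alone can speed up the same code):

```python
# The same source can run faster purely because the compiler got
# smarter. Contrived illustration: loop-invariant hoisting, written
# out by hand here to show what an optimizing compiler does for you.
import math

def naive(values, angle):
    # As written: recomputes cos(angle) on every iteration.
    return [v * math.cos(angle) for v in values]

def hoisted(values, angle):
    # After hoisting: the invariant cos(angle) is computed once.
    # Source meaning is identical; only the work per iteration changes.
    c = math.cos(angle)
    return [v * c for v in values]
```

Same source-level meaning, different amount of work per iteration; shader compilers make exactly these kinds of decisions for GPU code, which is why a toolchain update can matter without any game code changing.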

Unfinished devkits and tools for a piece of hardware that Microsoft spent a good 3-4 years working on are unacceptable, given the time spent on it. Shipping the tools 90% finished is understandable; shipping them half finished at the point when the last devkits were sent to devs is not.
Do you know if they were 90%, 95% or 35% finished?
 