
Next-Gen PS5 & XSX |OT| Console tEch threaD


Panajev2001a

GAF's Pleasant Genius


Tests controller latency/input lag. Very interesting, and not shocking how the PR claims and the real results compare... check for yourself.

... and again, Sony did not give the lag, stiffness, etc... improvements for the DualSense a fancy name, and they did not shout about them too much either (as opposed to someone else ;))... and yet delivered the results. Interesting results about analog stick tracking accuracy too (drawing circles with the stick and measuring the path travelled as reported by the controller): the path is not very consistent on the Xbox controllers, even though it is mostly a non-issue for non-pro gamers.
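
If anyone wants to reproduce that circle test at home, here's a minimal sketch of how the consistency could be scored, assuming you've already logged (x, y) stick samples while sweeping the stick around its rim (the load_samples helper and the CSV names are placeholders, not part of any real tool):

```python
import math

def circularity_error(samples):
    """Score how consistently reported stick positions trace a circle.

    samples: list of (x, y) pairs in the [-1.0, 1.0] range, captured while
    the stick is swept along its outer rim. Returns (mean, max) deviation
    of the reported radius from the average radius; 0.0 = perfect circle.
    """
    radii = [math.hypot(x, y) for x, y in samples]
    mean_r = sum(radii) / len(radii)
    deviations = [abs(r - mean_r) for r in radii]
    return sum(deviations) / len(deviations), max(deviations)

# Hypothetical usage, one log per controller:
# print(circularity_error(load_samples("dualsense_circle.csv")))
# print(circularity_error(load_samples("xbox_circle.csv")))
```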

Just making a bit more of a point about this because every time Sony is not bombastic about something, the assumption by some seems to be that they must have put in low effort or just ignored it altogether... instead of thinking "well, companies are different, the one I like shouts about everything, others shout less, but they still deliver cool shit nonetheless".
 

Panajev2001a

GAF's Pleasant Genius
Ok I've finished Control UE on performance mode on PS5 with all the side missions + DLC's:

-The plot/lore is great, feels strange and interesting, 10/10.
-Story? Probably ok, the acting is good, like 7/10, but the story itself doesn't really click, nor do the characters make sense or make you feel anything towards them. As said, the overall lore/story is 10/10 but how it's presented is 7/10.
-Facial animations feel like they all came from the dental clinic before having a convo.
-Graphics are decent, 8/10.
-Gameplay is solid, 9/10, love the idea of the shifting gun but having only 2 guns and needing to stop to load another type is stupid and an immersion killer. A weapon wheel would've made much more sense.
-It gives me MGS1 vibes of massive indoor gameplay.
-Most of the enemies look mediocre, not impactful or fearsome. Only creepy one was that Dr. Hartman in the DLC, I think.
-Environment is very dynamic and destructible.

Overall it was a very unique experience, a solid 9/10 for the game. I think Remedy has so much potential to make even better games in the future.

You should try it on Quality too for the graphics score :D.
 

PaintTinJr

Member
One thing that having an off-the-shelf RDNA2 GPU enables is making multiplatform ports easier. I'm sure that was largely MS's rationale behind it. They seem to be focused on removing differences between Xbox and PC development. Their dream seems to be being able to program your game once and then run it on PC and Xbox without any optimization whatsoever.

It's a good strategy IMO, seeing as it's quite difficult to keep up with Sony in terms of sales numbers. If they can present Xbox/PC as a single market to third parties, most of them wouldn't leave an Xbox port on the table.

The thing is that UE5's primary focus is on PC and I don't think that will change in the near future. That means that all the big developments they do on the engine will need to work on PC. This will make Sony studios' engines shine brighter.

He is talking about last-gen engines.

Yeah, let's warp his words because they don't fit your narrative. Classic. He is saying what he is saying: the DF comparison is a good benchmark of how both GPUs perform on that engine. Does that mean it will translate to game performance? No. Does that mean one GPU is better than the other? No. Does that mean that one console's hardware is better than the other? Not at all.
To better compare both GPUs, and only GPUs and not whole systems, we would need more benchmarks like this, where the CPU is not being taxed at all. That's why he is wishing for more.

You are benchmarking how that engine performs on the GPU only.
I don't quite get the point you are making.

How is the XsX GPU representative of the off-the-shelf (shelves that are empty, I might add) AMD RDNA2 GPUs?

The cache? The CUs per shader array? The ability to do BVH and texturing per CU simultaneously? Game clock and boost clock? Because as far as I can tell, the only thing that tells me it is definitely RDNA2 is AMD's CEO and the PC graphics driver features it will support - otherwise, like others, I'd swear blind it wasn't. Which is neither here nor there, as it is custom RDNA2, but it massively impacts the point about multiplatform ports being easier compared to the PS5's custom RDNA2 GPU - which seems to be aligned much closer with off-the-shelf RDNA2 on all those technical aspects where the XsX GPU differs.

As for UE still being more PC focused, are you sure about that? Given the new licensing, isn't it more likely that their main money will come from store sales for engine assets/plugins, and from AAA blockbusters on PlayStation and Xbox that make significant money? Even Sony's $250m investment looks more like an indirect accounting for their de-badged - highly customized - use of UE in the vast majority of their top tier games.
 

Bo_Hazem

Banned
You should try it on Quality too for the graphics score :D.

Quality mode has an input lag issue, and it's so choppy that even though it's 30fps, the choppiness makes it feel below 30fps. I play 30fps games with zero problems, but it's near unplayable in Quality mode compared to Spiderman MM, which I played in Fidelity mode. Maybe a bad motion blur implementation? Not sure.
 

PaintTinJr

Member
He is saying XSX's approach of more but slower CUs can have an advantage on "unoptimised last generation game engines".
Yeah, we can't be sure how specific he's really being - but as a very smart guy, I'm guessing pretty specific - but if the "some", "unoptimised" and "last generation game engines" are totally specific, then that is a tiny, tiny sliver of the Venn diagram - and everything else in the diagram represents all the situations where the PS5 performs better.

It could be a pretty damning statement in that context IMHO.
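
For anyone who wants the raw numbers behind the wide-vs-fast framing, the peak shader throughput maths fits on a napkin (using the public figures of 36 CUs at up to 2.23 GHz vs 52 CUs at 1.825 GHz, 64 FP32 lanes per CU, 2 ops per clock for FMA). Peak numbers of course say nothing about how well a given engine keeps the wider GPU fed, which is the whole argument here:

```python
def peak_tflops(cus, clock_ghz, lanes_per_cu=64, ops_per_clock=2):
    """Peak FP32 throughput in TFLOPS for an RDNA2-style GPU."""
    return cus * lanes_per_cu * ops_per_clock * clock_ghz / 1000.0

print(f"PS5 (narrow/fast): {peak_tflops(36, 2.230):.2f} TFLOPS")  # ~10.28
print(f"XSX (wide/slower): {peak_tflops(52, 1.825):.2f} TFLOPS")  # ~12.15
```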
 

PaintTinJr

Member
Quality mode has an input lag issue, and it's so choppy that even though it's 30fps, the choppiness makes it feel below 30fps. I play 30fps games with zero problems, but it's near unplayable in Quality mode compared to Spiderman MM, which I played in Fidelity mode. Maybe a bad motion blur implementation? Not sure.
It is triple buffered with vsync, apparently, so latency is going to feel like 15-20fps - on a 30fps mode - if the frame pacing isn't brilliant.

Pretty sure the triple buffering is there to help with cross-platform performance parity - likely an old PS3/360-gen "parity or better" clause in play - more than the PS5 needing triple buffering, which would put Remedy in a pretty rubbish position if their game is getting held back on its main platform with no way out of the situation to fix things.
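
To put some rough numbers on the "feels worse than 30fps" complaint, here's a crude back-of-the-envelope sketch of worst-case input-to-display latency, assuming one extra frame of queueing per additional buffered frame and a fixed 33.3 ms frame time. It's an illustration of why the extra buffer hurts, not a measurement of Control:

```python
FRAME_MS = 1000.0 / 30.0  # 33.3 ms per frame in a locked 30fps mode

def worst_case_latency_ms(queued_frames, frame_ms=FRAME_MS):
    """Crude vsync'd input-to-photon estimate: one frame to render,
    plus one frame of waiting per completed frame already queued in
    the swap chain, plus one frame for scanout of the final image."""
    return frame_ms + queued_frames * frame_ms + frame_ms

print(f"double buffered (nothing queued): ~{worst_case_latency_ms(0):.0f} ms")  # ~67 ms
print(f"triple buffered (1 queued frame): ~{worst_case_latency_ms(1):.0f} ms")  # ~100 ms
```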
 

Lysandros

Member
It is triple buffered with vsync, apparently, so latency is going to feel like 15-20fps - on a 30fps mode - if the frame pacing isn't brilliant.

Pretty sure the triple buffering is there to help with cross-platform performance parity - likely an old PS3/360-gen "parity or better" clause in play - more than the PS5 needing triple buffering, which would put Remedy in a pretty rubbish position if their game is getting held back on its main platform with no way out of the situation to fix things.
Do you think that the game is still triple buffered in photo mode? How can it possibly tear on XSX in that case? Maybe it's just disabled in that mode?
 

Bo_Hazem

Banned
It is triple buffered with vsync, apparently, so latency is going to feel like 15-20fps - on a 30fps mode - if the frame pacing isn't brilliant.

Pretty sure the triple buffering is there to help with cross-platform performance parity - likely an old PS3/360-gen "parity or better" clause in play - more than the PS5 needing triple buffering, which would put Remedy in a pretty rubbish position if their game is getting held back on its main platform with no way out of the situation to fix things.

Yeah, I wouldn't expect much from a small studio going multi; they won't fully optimize for PS5. PS Studios are gonna do wonders.
 

PaintTinJr

Member
Do you think that the game is still triple buffered in photo mode? How it can possibly tear in XSX in that case, just disabled in that mode maybe?
For it to tear, it has to drop vsync and be either double buffered, or triple buffered but missing the render time by more than the tear size - i.e. the third buffer's partial render time is completely overrun and spills into the double-buffer render time that is incomplete by the tear size, multiplied by the frames that have the tear present. In the other thread the tear's alignment with the frame pacing graph suggested it was 3 or 4 frames - if the graph alignment is the same for tearing and pacing.

If I had to guess, I would say that it is still triple buffered, without vsync - and that's why the frame-rate changes are smoother, because it is averaging across more back buffer frames before the problem, and gets ahead when the problem passes. Otherwise I would expect frame-rate to tank for a noticeable blip in a double buffer (vsync off) situation on the XsX due to a lack of infinity cache or scrubbers.

edit:
The claim that it isn't a tear but frustum clipping seems very unlikely, because when the frustum clips, it leaves the stuff outside the near/far planes as the screen clear colour. So if that explanation is true, it raises more questions about why the framebuffer isn't displaying an entirely new frame on every frame, and it makes the whole analysis unreliable, because frame-rate is supposed to represent the result of a newly assembled frame at each buffer flip.
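
For what it's worth, whether a frame is torn, duplicated or genuinely new is something you can check directly in lossless captured footage instead of inferring it from the frame-time graph. A rough sketch of the idea, assuming consecutive frames are available as HxWx3 numpy arrays (the threshold is arbitrary and would need tuning for capture noise):

```python
import numpy as np

def find_tear_row(prev_frame, curr_frame, threshold=2.0):
    """Return the row where curr_frame stops/starts matching prev_frame,
    or None if the frame is either entirely new or an exact duplicate
    (i.e. no horizontal seam between old and new image data)."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    row_change = diff.mean(axis=(1, 2))   # average per-row difference
    changed = row_change > threshold      # True for rows that actually updated
    if changed.all() or not changed.any():
        return None                       # fully new frame, or fully duplicated
    # a classic tear shows up as one contiguous band of unchanged rows on one side of the seam
    return int(np.argmax(changed != changed[0]))
```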
 

Lysandros

Member
For it to tear, it has to drop vsync and be either double buffered, or triple buffered but missing the render time by more than the tear size - i.e. the third buffer's partial render time is completely overrun and spills into the double-buffer render time that is incomplete by the tear size, multiplied by the frames that have the tear present. In the other thread the tear's alignment with the frame pacing graph suggested it was 3 or 4 frames - if the graph alignment is the same for tearing and pacing.

If I had to guess, I would say that it is still triple buffered, without vsync - and that's why the frame-rate changes are smoother, because it is averaging across more back buffer frames before the problem, and gets ahead when the problem passes. Otherwise I would expect frame-rate to tank for a noticeable blip in a double buffer (vsync off) situation on the XsX due to a lack of infinity cache or scrubbers.
I see, thanks for the explanation. 👍
 

assurdum

Banned
Yeah, we can't be sure how specific he's really being - but as a very smart guy, I'm guessing pretty specific - but if the "some", "unoptimised" and "last generation game engines" are totally specific, then that is a tiny, tiny sliver of the Venn diagram - and everything else in the diagram represents all the situations where the PS5 performs better.

It could be a pretty damning statement in that context IMHO.
It would be interesting to go deeper into such chats with him, because I'm quite convinced that if you said something like this to Dictator (and not only him), he would respond that no, it takes more work to push more CUs well than fewer. A head-to-head between Matt and Dictator would be amazing, just to see where it would land.
 

onesvenus

Member
How is the XsX GPU representative of the off-the-shelf (shelves that are empty, I might add) AMD RDNA2 GPUs?
I was talking about the supported feature set. With AMD RDNA2 GPUs and the Xbox GPU supporting the whole set of features, it's going to be easier to port from one to the other than dropping a feature or implementing it in another way (i.e. PS5 VRS, if it exists).

As for UE still being more PC focused, are you sure about that? Given the new licensing, isn't it more likely that their main money will come from store sales for engine assets/plugins, and from AAA blockbusters on PlayStation and Xbox that make significant money? Even Sony's $250m investment looks more like an indirect accounting for their de-badged - highly customized - use of UE in the vast majority of their top tier games.
We'll have to wait and see, but how many third party games made in UE do not have a PC version? It doesn't make sense to me.
 

PaintTinJr

Member
I was talking about the supported feature set. With AMD RDNA2 GPUs and the Xbox GPU supporting the whole set of features, it's going to be easier to port from one to the other than dropping a feature or implementing it in another way (i.e. PS5 VRS, if it exists).


We'll have to wait and see, but how many third party games made in UE do not have a PC version? It doesn't make sense to me.
But that is assuming that DX features are the target for the games industry, when historically it has always been the platform-agnostic API - which gets new features immediately via extensions - that has determined the feature set. And just like the PS4, the PS5 will have a DirectX wrapper for less experienced/smaller studios that can't or won't optimise for their SDK, so there will be a high-level version of VRS in the wrapper that maps to the Geometry Engine.

AAA game development funding isn't typically based on guaranteed PC sales; that's bonus money. And non-AAA games typically use more options than just UE.
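
On the VRS point specifically: the core idea is simple enough that a high-level fallback is plausible even without the DX12U hardware path - shade once per block and reuse the result. Here's a toy sketch of 2x2 coarse shading purely to illustrate the trade-off (this is not how either console actually implements it):

```python
import numpy as np

def coarse_shade(width, height, shade, rate=2):
    """Toy 'variable rate shading': call shade() once per rate x rate block
    and broadcast the colour, instead of once per pixel."""
    out = np.zeros((height, width, 3), dtype=np.float32)
    for by in range(0, height, rate):
        for bx in range(0, width, rate):
            colour = shade(bx + rate / 2.0, by + rate / 2.0)  # one invocation per block
            out[by:by + rate, bx:bx + rate] = colour
    return out

# rate=2 cuts shade() invocations roughly 4x, at the cost of blockier shading detail
```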
 
I was talking about the supported feature set. With AMD RDNA2 GPUs and the Xbox GPU supporting the whole set of features, it's going to be easier to port from one to the other than dropping a feature or implementing it in another way (i.e. PS5 VRS, if it exists).

Microsoft owns DX so maybe they are the only ones to use the terms for that technology. It could very well be possible that Sony couldn't use them even if they wanted to so their version is called something different.

I know some people see RDNA2 as some ultimate weapon, but do we really know that the PS5 is extremely lacking because it doesn't have DX-termed features?
 

kyliethicc

Member
Ok I've finished Control UE on performance mode on PS5 with all the side missions + DLC's:

-The plot/lore is great, feels strange and interesting, 10/10.
-Story? Probably ok, the acting is good, like 7/10, but the story itself doesn't really click, nor do the characters make sense or make you feel anything towards them. As said, the overall lore/story is 10/10 but how it's presented is 7/10.
-Facial animations feel like they all came from a dental clinic before having a convo.
-Graphics are decent, 8/10.
-Gameplay is solid, 9/10, love the idea of the shifting gun but having only 2 guns and needing to stop to load another type is stupid and immersion killer. A weapon wheel would've made much more sense.
-It gives me MGS1 vibes of massive indoor gameplay.
-Most of the enemies look mediocre, not impactful or fearsome. Only creepy one was that Dr. Hartman in the DLC, I think.
-Environment is very dynamic and destructible.

Overall it was a very unique experience, a solid 9/10 for the game. I think Remedy has so much potential to make even better games in the future.

Note: Quality mode has input lag issues and feels choppy, probably bad motion blur implementation?
Don't forget to turn on photo mode and just stare at it! But don't actually... play the game.

That's the way it's meant to be experienced. Count those static frames!
 

ethomaz

Banned
For it to tear, it has to drop vsync and be either double buffered, or triple buffered but missing the render time by more than the tear size - i.e. the third buffer's partial render time is completely overrun and spills into the double-buffer render time that is incomplete by the tear size, multiplied by the frames that have the tear present. In the other thread the tear's alignment with the frame pacing graph suggested it was 3 or 4 frames - if the graph alignment is the same for tearing and pacing.

If I had to guess, I would say that it is still triple buffered, without vsync - and that's why the frame-rate changes are smoother, because it is averaging across more back buffer frames before the problem, and gets ahead when the problem passes. Otherwise I would expect frame-rate to tank for a noticeable blip in a double buffer (vsync off) situation on the XsX due to a lack of infinity cache or scrubbers.

edit:
The claim that it isn't a tear but frustum clipping seems very unlikely, because when the frustum clips, it leaves the stuff outside the near/far planes as the screen clear colour. So if that explanation is true, it raises more questions about why the framebuffer isn't displaying an entirely new frame on every frame, and it makes the whole analysis unreliable, because frame-rate is supposed to represent the result of a newly assembled frame at each buffer flip.
Interesting points.
I think the game is not using VSync in Photo Mode, and before people say Remedy said so... no, they didn't... they said it is triple buffered and VSynced in gameplay (and that shows).

Your edit just adds to the point that the DF tools are really lacking on framerate counting, because they take these repeated frames as unique. With Valhalla, Battaglia did the count manually, and now with this game he seems to have forgotten about it.

IMO I believe he did a rushed job... he had to be pointed to the tear and the difference in textures before going off to research and test and come up with an explanation... something he should have noticed before the video went online.
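
On the repeated-frames point, the manual approach basically boils down to only counting a frame when it actually differs from the previous captured one. A minimal sketch, assuming the capture is 60 Hz and the frames arrive as numpy arrays (the threshold is only there to ignore compression noise):

```python
import numpy as np

def effective_fps(frames, capture_hz=60, threshold=1.0):
    """Estimate real framerate from a fixed-rate capture by counting only
    frames that differ from the previous one, so duplicates don't inflate it."""
    unique = 1  # the first frame always counts
    for prev, curr in zip(frames, frames[1:]):
        delta = np.abs(curr.astype(np.int16) - prev.astype(np.int16)).mean()
        if delta > threshold:
            unique += 1
    return unique * capture_hz / len(frames)
```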
 

Bo_Hazem

Banned
Don't forget to turn on photo mode and just stare at it! But don't actually... play the game.

That's the way it's meant to be experienced. Count those static frames!



Seems like Xbox players won't finish Control, assuming they'll buy the next gen version because it's not on the "free" service. :lollipop_tears_of_joy:
 

kyliethicc

Member


Seems like Xbox players won't finish Control, assuming they'll buy the next gen version because it's not in the "free" service. :lollipop_tears_of_joy:
I can already see it. Ricky will wait til the UE edition comes to game pass in like 2023. And then he'll just boot the game up and put the controller down lol... and just fucking stare at a hallway in photo mode, pretending he can tell if a static image is being rendered at 50 or 60 FPS.
 

onesvenus

Member
when historically it has always been the platform-agnostic API - which gets new features immediately via extensions - that has determined the feature set
Like what, OpenGL and Vulkan? Is there some big company other than id that uses those in PC games?
My argument is basically that if you are targeting PC in addition to console versions, which most third parties are and I expect that number to increase, having the same supported features and using the same middleware in both markets makes a lot of sense. It's like having a free port to a whole new market. If, and that's a big if, Microsoft gets to a point where using DirectX allows you to develop a game for both markets without big changes, that's a huge win for them, IMHO. If we take off our fanboy glasses and look through the eyes of a publisher, being able to expand market reach (i.e. going from PC OR Xbox to PC AND Xbox) with little development cost (i.e. if porting DX games between PC and Xbox were easy) is a really good thing. I can see more third party games publishing their PC games to Xbox and I'm sure that's Microsoft's end goal with the DX12U feature set in Xbox.
Microsoft owns DX so maybe they are the only ones to use the terms for that technology. It could very well be possible that Sony couldn't use them even if they wanted to so their version is called something different.

I know some people see RDNA2 as some ultimate weapon, but do we really know that the PS5 is extremely lacking because it doesn't have DX-termed features?
I am not talking about nomenclature as much as implementation details. I don't care if VRS is called VRS in PS5 or not, it doesn't matter. What I was trying to say is that if using VRS is the same in PC and Xbox but completely different in PS5, due to architecture differences, that's a big win for Xbox.
 

PaintTinJr

Member
Like what, OpenGL and Vulkan? Is there some big company except for id that uses those in PC games?
My argument is basically that if you are targeting PC in addition to console versions, which most third parties are and I expect that number to increase, having the same supported features and using the same middleware in both markets makes a lot of sense. It's like having a free port to a whole new market. If, and that's a big if, Microsoft gets to a point where using DirectX allows you to develop a game for both markets without big changes, that's a huge win for them, IMHO. If we take off our fanboy glasses and look through the eyes of a publisher, being able to expand market reach (i.e. going from PC OR Xbox to PC AND Xbox) with little development cost (i.e. if porting DX games between PC and Xbox were easy) is a really good thing. I can see more third party games publishing their PC games to Xbox and I'm sure that's Microsoft's end goal with the DX12U feature set in Xbox.

I am not talking about nomenclature as much as implementation details. I don't care if VRS is called VRS in PS5 or not, it doesn't matter. What I was trying to say is that if using VRS is the same in PC and Xbox but completely different in PS5, due to architecture differences, that's a big win for Xbox.
Funny how you use the "glasses" comment while forgetting that Macs, Linux, Nintendo, and Nvidia Shield/ARM are all either PCs or consoles, and have no easy port option if DX is used.
 

ethomaz

Banned
I'm trying to decide whether to start posting on Beyond3D or not :D
I had a 2010 account that I basically gave up on for several years (I tried to log in and remembered the password).
I don't like having too many places and right now I only frequent GAF.
 
So he's definitely saying the PS5 will outperform the XSX in some games in the future.

I'm wondering what kind of games those will be?
In fact, I don't read his message like that. To me he is more saying that some non-optimized engines from last gen will benefit more from a "wide" GPU than from a deep one, and vice versa: some others will benefit more from deep than wide.
 

IntentionalPun

Ask me about my wife's perfect butthole
In fact, I don't read his message like that. To me he is more saying that some non-optimized engines from last gen will benefit more from a "wide" GPU than from a deep one, and vice versa: some others will benefit more from deep than wide.
Yeah I don't understand why people aren't seeing the "vice versa."

He made no statements about next-gen games... just that cross-gen games might work better on one or the other.

It's kind of a nothing comment though isn't it? Why wouldn't next-gen games also be similar? Some engines might benefit from wide over narrow and fast... and vice versa lol.
 

Lysandros

Member
I'm trying to decide whether to start posting on Beyond3D or not :D
I had a 2010 account that I basically gave up on for several years (I tried to log in and remembered the password).
I don't like having too many places and right now I only frequent GAF.
Do not do that ethomaz, there is still hope in life. The place is run by their God the Dictator, they will exterminate you.
 

onesvenus

Member
Funny how you use the "glasses" comment while forgetting that Macs, Linux, Nintendo, and Nvidia Shield/ARM are all either PCs or consoles, and have no easy port option if DX is used.
Are you telling me there's no work involved in porting a PC game to Mac, Linux, Nintendo Switch or Nvidia Shield? Really?
Because what I was saying was that if one of those platforms removed the porting entry barriers, as I think Microsoft is looking to do, it would make a lot of sense for publishers to port their games to that platform, be it Mac, Linux, Switch, Shield or, god forbid, Xbox.
 

DeepEnigma

Gold Member
... and again, Sony did not give its lag, stiffness, etc... improvements for the DualSense a fancy name, and they did not even shout about it too much (as opposed to someone else ;)), and yet delivered the results. Interesting results about analog stick tracking accuracy (drawing circles with the stick and measuring the path travelled as reported by the controller; not a very consistent path on the Xbox controllers).

Just noticing because every time Sony is not bombastic about something, the assumption by some seems to be that they must have put in low effort or just ignored it all... instead of thinking "well, companies are different, the one I like shouts about everything, others shout less but still deliver".
Velocity Architecture™

🤭
 