
Next-Gen PS5 & XSX |OT| Console tEch threaD


Riky

$MSFT
Saying again" as if youve ever proved me wrong what a classic lier, a missing light source has no effect on the resolution or quality of your raytraced shadows, you can have a 100 light sources but if your raytracing has lower quality or uses less samples or a poorer denoiser its not going to change a thing. Im not saying the series x has less raytracing capabilities im just saying the settings they used on series x cod cold war are less than ps5s due to tools or whatever god knows.
Heres the video you can go to 13:30 and nx gamer is explaining the differences in raytracing on both versions


And also, thanks for spotting the missing light source; it seems to be another problem for the Series X version of COD, up there with the fewer muzzle flashes. Good luck.


You're so stupid: you claim nobody said there was a missing light source and call me a liar. I prove you wrong yet again by showing you the post, and then at the end you go on to make the same claim 🤣🤣

Which was debunked by DF.
 

Mr Moose

Member
So it's possible that some effects were changed, but I'm surprised by the resolution increase; if the One S version was 1080p, they have enough room for that.
I like the stones in that part of the video more on the Series S version; those big-ass stones on the Pro/One X/PS5/Series X look a bit shit.

You're so stupid: you claim nobody said there was a missing light source and call me a liar. I prove you wrong yet again by showing you the post, and then at the end you go on to make the same claim 🤣🤣

Which was debunked by DF.

BirdGate.png


The Birdssss!
 

Riky

$MSFT
As I shared in the past, I sometimes get the effect when running the game on my XSX; it seems to disappear at other times, like a bug...

You can have a look at the VGTech video; you can see the muzzle light during this sequence, for example:


I've already told them this. I have the game, so I can actually see it myself, but thanks for finding proof.

Another load of FUD that can be put to bed.
 
You're so stupid: you claim nobody said there was a missing light source and call me a liar. I prove you wrong yet again by showing you the post, and then at the end you go on to make the same claim 🤣🤣

Which was debunked by DF.
I said a missing light source isn't the cause of poor ray tracing; that's what you were lying about.

And Digital Foundry didn't debunk anything. Even more lies. How do you debunk something you can see with your own eyes? Go watch the video at 13:30; you can clearly see the difference in the ray tracing. Saying Digital Foundry debunked that is lunacy.
 

Riky

$MSFT
I said a missing light source isn't the cause of poor ray tracing; that's what you were lying about.

And Digital Foundry didn't debunk anything. Even more lies. How do you debunk something you can see with your own eyes? Go watch the video at 13:30; you can clearly see the difference in the ray tracing. Saying Digital Foundry debunked that is lunacy.

I never mentioned anything about ray tracing 😅 why are you lying? I said someone tried to claim there were missing light sources, you said nobody ever said that, and I just showed you the post, proving you wrong again. DF debunked it by showing the light and that the scene is random on each playthrough, fact.
Now your muzzle flash claim has been debunked as well 🤣
 
I never mentioned anything about ray tracing 😅 why are you lying? I said someone tried to claim there were missing light sources, you said nobody ever said that, and I just showed you the post, proving you wrong again. DF debunked it by showing the light and that the scene is random on each playthrough, fact.
Now your muzzle flash claim has been debunked as well 🤣
You seem to pointlessly throw emojis around as if they make any point. All your answers have been laughing emojis; can you speak clearly and properly like a man? The Series X has poorer ray tracing, fewer muzzle flashes and a missing light source; emojis and the word "debunk" don't fix that. Can you show me how they debunked the poorer ray tracing and fewer muzzle flashes, other than with emojis?
 

Riky

$MSFT


To refresh memories a bit, from the video description:

"PS5 in Performance Mode uses a dynamic resolution with the highest native resolution found being 3840x2160 and the lowest native resolution found being approximately 2275x1280. PS5 in Performance Mode rarely renders at a native resolution of 3840x2160.

Xbox Series X in Performance Mode uses a dynamic resolution with the highest native resolution found being 3840x2160 and the lowest native resolution found being 1920x1080. Xbox Series X in Performance Mode rarely renders at a native resolution of 3840x2160.

The only resolution found on PS5 in Quality Mode was 3840x2160.

Xbox Series X in Quality Mode uses a dynamic resolution with the highest native resolution found being 3840x2160 and the lowest native resolution found being approximately 3328x1872. Drops in resolution below 3840x2160 on Xbox Series X in Quality Mode seem to be uncommon."
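
For a sense of scale, here are the pixel counts behind those figures, just multiplying out the resolutions quoted above (nothing assumed beyond the arithmetic):

```python
# Pixel counts for the dynamic-resolution bounds quoted above, relative to native 4K.
full_4k = 3840 * 2160
bounds = {
    "Native 4K ceiling (both)":   (3840, 2160),
    "PS5 Performance floor":      (2275, 1280),
    "Series X Performance floor": (1920, 1080),
    "Series X Quality floor":     (3328, 1872),
}
for name, (w, h) in bounds.items():
    pixels = w * h
    print(f"{name:28s} {pixels:>9,} px  ({pixels / full_4k:6.1%} of 4K)")
```

So the Performance Mode floors work out to roughly 35% of 4K on PS5 and 25% on Series X, while the Series X Quality floor is about 75% of 4K.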


You don't need to refresh memories; DF recently retested it with the Switch version, and after patching they are now the same.

 

Riky

$MSFT
You seem to pointlessly throw emojis around as if they make any point. All your answers have been laughing emojis; can you speak clearly and properly like a man? The Series X has poorer ray tracing, fewer muzzle flashes and a missing light source; emojis and the word "debunk" don't fix that. Can you show me how they debunked the poorer ray tracing and fewer muzzle flashes, other than with emojis?

Someone has just posted the video with the muzzle flashes above; please stop embarrassing yourself.
 

Riky

$MSFT
The muzzle light could be dynamic and only be displayed when the system has some headroom to spare. I am starting to get tired of the "bug" excuse every time the XSX displays some kind of deficiency compared to the PS5.

You really believe that? The framerate drops on both machines in that on-rails section; the muzzle flash makes no difference. Keep grasping at straws. When you play the game it can be there for the entire playthrough of that section. I've played it.
 
The muzzle light could be dynamic and only be displayed when the system has some headroom to spare. I am starting to get tired of the "bug" excuse every time the XSX displays some kind of deficiency compared to the PS5.

That's clearly not the case. During the complete helicopter sequence, you have the muzzle light in one run and not in the second. It has nothing to do with dynamic settings or headroom, as it is exactly the same sequence, so speaking about a deficiency in this case is dishonest when it clearly seems to be a bug.
 

Riky

$MSFT
As I shared in the past, I sometimes get the effect when running the game on my XSX; it seems to disappear at other times, like a bug...

You can have a look at the VGTech video; you can see the muzzle light during this sequence, for example:



What! Did you even watch that video? Because it clearly shows the Series X has missing muzzle flashes. It's you who's embarrassing yourself post after post.

Wrong again, please keep up.
 
What! Did you even watch that video? Because it clearly shows the Series X has missing muzzle flashes. It's you who's embarrassing yourself post after post.

You can clearly see the muzzle light in this video from VGTech, which is missing in the same sequence from NXG or DF. That's why I'm saying it's a bug, because it's the same thing in the other sequence... I have seen it when I played the game...

 
Wrong again, please keep up.
You see, what you are is just a thick denier; even with video evidence you still stand by your delusions. The video posted, in fact all the video comparisons online, and in fact everybody who has played COD on the Series consoles or seen the comparisons, clearly saw missing muzzle flashes. But for some reason you just deny it; you and evidence are like oil and water, you can't mix. So good luck, "Stevie Wonder!"

 

Riky

$MSFT
You see, what you are is just a thick denier; even with video evidence you still stand by your delusions. The video posted, in fact all the video comparisons online, and in fact everybody who has played COD on the Series consoles or seen the comparisons, clearly saw missing muzzle flashes. But for some reason you just deny it; you and evidence are like oil and water, you can't mix. So good luck, "Stevie Wonder!"


Which bit of the word "bug" don't you understand? Sometimes it is there and sometimes it isn't, as the video posted shows.
 

Panajev2001a

GAF's Pleasant Genius
Tbh that's already the Krisprolls emoji :messenger_beaming:

Also people are so desperate that they can't acknowledge a simple bug. Sad!
A bug is a bug, but after all the "12 > 10 is actually only underselling the gap" posturing and chest-beating for a year or so, finding ourselves discussing systems within spitting distance of each other is perhaps the only true non-cross-generation, next-level entertainment ;).
 
To be precise, every downclock has a negative impact, even if it leads to just a 1% performance loss.
You can obviously argue that this "worst case" of the Xbox Series X only having an 18% TF advantage is the usual case, but it's simply not guaranteed that the PS5 GPU is running at 2.23 GHz all the time.

It doesn't have to be. But if it's 99.9% of the time, that 0.1% becomes negligible.

Since we don't have clock numbers, we can't tell whether current games already dip a bit below the 2.23 GHz mark here and there, and if they don't, how many next-gen games will push the clocks down?

We can indeed look at the performance of current games on PS5 versus XSX versus PC and gain some confidence that, if any downclocking is occurring, it simply isn't meaningful to overall performance. In almost every game tested, the PS5 GPU outperforms a PC desktop GPU with similar TF numbers. Likewise, in most non-BC titles those 10 TF on PS5 are delivering higher performance than the 12 TF on XSX... so again, any downclocking on the PS5 GPU doesn't appear to be meaningful to real-world performance.
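
For anyone who wants to sanity-check those TF figures, the usual back-of-envelope formula is CUs × 64 shader ALUs × 2 ops per clock (FMA) × clock, with the commonly quoted 36 CU @ 2.23 GHz and 52 CU @ 1.825 GHz configurations:

```python
# Back-of-envelope: CUs * 64 shader ALUs * 2 ops per clock (FMA) * clock (GHz) / 1000 = TFLOPS.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

print(f"PS5 (36 CU @ 2.230 GHz): {tflops(36, 2.230):.2f} TF")  # ~10.28 TF
print(f"XSX (52 CU @ 1.825 GHz): {tflops(52, 1.825):.2f} TF")  # ~12.15 TF
```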

Mark Cerny stated that they expect the GPU to run at or "close" to that frequency most of the time, and that when downclocking occurs they expect it to be pretty "minor".
Another statement was that reducing power by 10% only takes a couple of percent off the clock speed.

However, all of that is of course not very precise and is based on "expectations", even if they are coming from Sony.
It's not like companies are right all the time or never inject a bit of (too) optimistic marketing.

Of course, but when the performance in real games consistently demonstrates the GPU is outperforming equivalent-TF desktop GPUs and even the 12 TF XSX GPU, there is strong reason to believe that Mark Cerny wasn't lying or spouting optimistic marketing (and historical precedent should tell you that Mark Cerny rarely does, for that matter).

There's also the notion that the Road to PS5 talk was originally intended for devs, so there's very little reason to mislead devs with a statement like that. They are the ones who will be working with the hardware at the end of the day, and they will very quickly and very easily be able to call stuff like that out as BS if it were.

Normally, technical experts like Andrew Goosen on the Xbox side and Mark Cerny on the PlayStation side speak plainly, concisely and factually. They aren't company PR people, so they aren't given to spouting marketing spiel. It seems odd, then, to want to dismiss their comments on the basis of "it could be marketing". These aren't the guys that do that.

If it was Spencer or Jim Ryan, then sure.
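
As an aside on the "reduce power by 10% only takes a couple of percent off the clock" line quoted above: that is consistent with how CMOS dynamic power scales. A rough sketch, where the linear voltage-to-frequency assumption is mine and not anything Sony has published:

```python
# Illustrative only: CMOS dynamic power ~ f * V^2, with V assumed to scale linearly
# with f (a simplification, not measured PS5 behaviour).
def relative_dynamic_power(clock_scale: float) -> float:
    voltage_scale = clock_scale
    return clock_scale * voltage_scale ** 2   # roughly clock_scale ** 3

for drop in (1, 2, 3, 5):
    scale = 1 - drop / 100
    saving = 1 - relative_dynamic_power(scale)
    print(f"{drop}% lower clock -> roughly {saving:.0%} lower dynamic power")
```

Under that simplification, a 3% clock reduction already buys roughly 9% dynamic power, which is in the same ballpark as the claim.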

Now, based on the claims and how it should fare, I wouldn't expect major downclocking to occur, but next-gen games, which also stress the CPU, could cross the threshold and consistently lower the clocks.

The CPU doesn't impact the PS5's variable clocks. The clock frequency isn't varied on the basis of power; i.e. it's not measuring consumed power and adjusting frequency to keep within a threshold. It's deterministically adjusting frequency on the basis of GPU workload and GPU hardware occupancy. So whatever the CPU workload, it won't impact GPU clocks.

It kinda seems like you might be conflating SmartShift with the GPU's variable clocks. SmartShift will raise the power ceiling for the CPU if the GPU is idle, but I don't think it works the same way in the other direction, because the GPU clocks are capped at the top end based not on overall APU power but on GPU stability limitations (as attested to by Cerny himself).

So I doubt the CPU running flat out would reduce the GPU power ceiling, as a) I would argue that's logically poor design, and b) it simply doesn't need to, because the cooling system capacity will be sized for peak CPU and GPU power consumption (which, under the variable GPU clock regime, is still way below that of a fixed-clock GPU).
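
To make that distinction concrete, here is a toy sketch of the difference between "react to measured power" and "deterministically derive the clock from a workload model". The thresholds and the formula are invented for illustration; this is not Sony's actual algorithm:

```python
# Toy sketch only (invented thresholds, not Sony's algorithm): the clock is a pure
# function of a modelled activity figure, so every console chooses the same clock for
# the same workload, independent of silicon quality or measured wall power.
MAX_GHZ = 2.23

def gpu_clock_ghz(modelled_activity: float) -> float:
    """modelled_activity: 0.0-1.0 estimate of how power-hungry the current work is."""
    if modelled_activity <= 0.90:                 # typical workloads: full clock
        return MAX_GHZ
    overshoot = modelled_activity - 0.90          # rare worst-case bursts
    return MAX_GHZ * (1.0 - 0.3 * overshoot)      # shave at most ~3% off the clock

for activity in (0.50, 0.85, 0.95, 1.00):
    print(f"activity {activity:.2f} -> {gpu_clock_ghz(activity):.3f} GHz")
```

The only point of the sketch is that the mapping from workload to clock is fixed and identical across consoles, which is what "deterministic" means here.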

On avg. the PS5 might run at 2.15 GHz in one game, 2.07 GHz in another.

That's not how it works. The GPU clock frequency adjustment is performed on the basis of GPU workload. So it will change rapidly and many times within the time span of rendering a single frame.

So for a 30fps game, you could see the clocks adjust up and down multiple times within that 33.3ms frame time.

So when Cerny says the GPU will be at max clocks most of the time, it's pretty clear he knows what he's talking about.
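
And the "many times within a frame" point is just arithmetic; the evaluation interval below is an assumed number chosen only to show the order of magnitude, not a documented spec:

```python
# Frame budget at 30 fps, and how often a governor could re-evaluate within it.
frame_time_ms = 1000 / 30          # ~33.3 ms per frame
eval_interval_us = 100             # assumed evaluation interval, purely illustrative
evals_per_frame = frame_time_ms * 1000 / eval_interval_us
print(f"{frame_time_ms:.1f} ms frame -> ~{evals_per_frame:.0f} chances to adjust the clock")
```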

They never said a missing light source! What a lie this is. The Series X simply had poorer ray tracing in COD than the PS5, and the missing alpha effects on guns were there for everyone to see.

The 36 CUs are not a hypothesis, they are real, so I don't get your point there. And the bit about variable clocks versus a 1.8 GHz fixed clock is even more confusing; what do you mean here?

I guess that English probably isn't your first language, so it's clear you misunderstood the statement I made.

What I'm saying is that the PS5's variable clock regime will perform better than a PS5 that was set up with fixed clocks.

i.e.

Let's say that Cerny went the same route as Xbox with the PS5 (i.e. a hypothetical PS5) and spec'd the machine as 36 CUs at a fixed 1.8 GHz clock speed. This would obviously perform significantly worse than the actual variable-clocked 2.23 GHz, 36 CU PS5 GPU.
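
To put rough numbers on that hypothetical, using the same back-of-envelope FLOPS formula as earlier (36 CUs assumed in both cases, as above):

```python
# Same formula as earlier: CUs * 64 * 2 * clock (GHz) / 1000 = TFLOPS.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

print(f"Hypothetical fixed-clock PS5 (36 CU @ 1.80 GHz): {tflops(36, 1.80):.2f} TF")  # ~8.29 TF
print(f"Actual PS5 (36 CU @ up to 2.23 GHz):             {tflops(36, 2.23):.2f} TF")  # ~10.28 TF
print(f"Compute uplift from the higher clock:            {(2.23 / 1.80 - 1):.0%}")    # ~24%
```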

Yes, variable clocks benefit the front end, ROPs and cache bandwidth too, but regardless, you cannot point to any of these features and conclusively say "this" or "that hardware feature running at higher clocks is the reason the PS5 provides more stable framerates across a range of different titles".

Every game is different, will have a different performance profile, and will be bottlenecked in different parts of the system at different points throughout the game. It's way more complex than just saying, higher PS5 clocks ==> higher average framerates in games with dynamic resolution.
 
I think the first time was today, they grow up so fast.
I am proud of you, Riky :messenger_heart:

I think it's good that Riky sees it as a bug. However, he didn't always say that in the past.

It's OK though, we all make mistakes and have to learn from them.

Oh my God, 'PS5 can't sustain its clocks/downclocks to 9.2 TF' is back from the grave after all this time, I can't believe it.

Oh jeez I better not see those 4TF claims again.
 
I guess that English probably isn't your first language, so it's clear you misunderstood the statement I made.

What I'm saying is that the PS5's variable clock regime will perform better than a PS5 that was set up with fixed clocks.

i.e.

Let's say that Cerny went the same route as Xbox with the PS5 (i.e. a hypothetical PS5) and spec'd the machine as 36 CUs at a fixed 1.8 GHz clock speed. This would obviously perform significantly worse than the actual variable-clocked 2.23 GHz, 36 CU PS5 GPU.

Yes, variable clocks benefit the front end, ROPs and cache bandwidth too, but regardless, you cannot point to any of these features and conclusively say "this" or "that hardware feature running at higher clocks is the reason the PS5 provides more stable framerates across a range of different titles".

Every game is different, will have a different performance profile, and will be bottlenecked in different parts of the system at different points throughout the game. It's way more complex than just saying, higher PS5 clocks ==> higher average framerates in games with dynamic resolution.
It's not the English language that's the problem, it's your English terms and nouns that are confusing. The 36 CUs on the PS5 aren't a hypothesis, they are physical. And I wasn't clear on why you stated 1.8 GHz, when obviously everybody knows 36 CUs at 1.8 GHz would perform worse than the Series X.

About the PS5 performing better in framerates vs the Series X: I said my guess would be the variable and efficient nature of the PS5's design. That is my guess, it's not universal truth; it could be a lot of things together, from tools to the hardware itself. The variable clocks are, to my guess, one of the reasons; otherwise there's no reason we shouldn't discuss it. Or let's all go to sleep and accept that it's just magic that the PS5 with 10 TF is outperforming 12 TF.
Which bit of the word "bug" don't you understand? Sometimes it is there and sometimes it isn't, as the video posted shows.
You didn't say it's a bug, you clearly said it was "debunked!", which is a clearly false statement. Whether it's a bug or a bottleneck in the Series X XDK/GDK APIs, nobody knows; the point is the Series X clearly has fewer muzzle flashes and improper ray tracing in COD. That's the bottom line.
 
See what exactly?🤣

Um the post that got you the temp? I always thought it was because of those COD screens.

Do you understand me?

Maybe I'm not making myself clear.

Edit: I'm honestly not seeing anything ban-worthy in your posts in the Cyberpunk threads.

It's pretty much old news, and whatever you did I'm sure you learned from it. Riky

🤷‍♂️
 
What I don't get:

On consoles they at least have a fixed environment and can find a workaround or fix issues like that; on PC there is no way to cover everything.
Let's start with some general notions:

1.- The consoles are not the highest end, that's high-end PCs, but since most of the profit of a game, or a huge part of it, comes from the consoles, most AAA games target the consoles, or at least have to run on the consoles.
2.- PC hardware evolves so fast that even if your target was a certain GPU, by the time you finish your game that GPU is in a lower segment.
3.- The level of optimization you can do on PC is not the same as on consoles, although the last point helps a lot.
4.- As happens many times in life, if something sounds too simple, it's probably not that simple.
5.- Remedy was not the only third party to complain; a couple of id Software engineers did the same a few days before we knew about the acquisition, then they deleted the tweets. It's not a conspiracy, and those are known for being one of the most capable studios we have in the technical department.
6.- Xbox marketing, not the devs, promised the same experience as the XSX, just at a lower resolution and very easy to port to.
7.- Making a game is hard as fuck; you don't want to face more problems, nor some random user telling you he could solve it in his options menu without a problem any day.

Regarding your comments, you have to remember the marketing promise was next-gen experiences, only at a lower resolution, and that is not always possible. Also remember you want a minimum level of quality where your assets can "shine".

A GPU-bound scenario means lowering visual effects.
Lowering effects goes against the "it's just lower resolution or lower-quality textures" line that some users repeat.
Take ray tracing, for example.
Cut it in which way? The number of rays? But that probably makes your image noisier, and again that goes against the promise.
And have you thought about the memory required for the BVH? That's probably one of the biggest problems, because the XSS has less memory and the BVH doesn't scale with resolution.
So what, use more LODs so your GPU suffers less? But you have less memory, so you need to be careful.
So what about having fewer LODs? Yeah, now your GPU could suffer.
But still, fewer pixels = less VRAM needed.
Yes, but not everything scales 1:1 with resolution, the BVH for example. Stop thinking like a PC gamer and more like a dev.
Just look at how much VRAM and how many fps a specific GPU gives you at different resolutions; it's not 1:1.
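
As a rough illustration of why "fewer pixels = less VRAM" only covers part of the budget: render targets scale with pixel count, but the BVH, geometry and most texture data largely don't. Every number below is invented for illustration, not measured from any game:

```python
# Invented numbers, purely to illustrate the split between resolution-dependent and
# resolution-independent memory. Real budgets vary wildly per engine and per game.
BYTES_PER_PIXEL = 40       # assumed total across G-buffer, depth and post-process targets
FIXED_BUDGET_GB = 6.0      # assumed BVH + geometry + textures + audio + code; does not shrink with resolution

def budget_gb(width, height):
    render_targets = width * height * BYTES_PER_PIXEL / 1024**3
    return render_targets, render_targets + FIXED_BUDGET_GB

for label, (w, h) in {"4K": (3840, 2160), "1440p": (2560, 1440), "1080p": (1920, 1080)}.items():
    scaled, total = budget_gb(w, h)
    print(f"{label:6s}: ~{total:.2f} GB total, only {scaled:.2f} GB of it scales with resolution")
```

Under those made-up numbers, dropping from 4K to 1080p frees only a couple of hundred megabytes, which is exactly the kind of problem being described for the XSS.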
A game which needs (fictional numbers) 10 GB of RAM for physics won't fit in the XSS, but when did we ever need that much RAM for the CPU?
I don't know what you were trying to say here, but remember that one of the weakest aspects of the current consoles (XSX, PS5) is the memory; now imagine the XSS five years from now.
These are devs who already release games on PC too, but in the past none of them ever complained that we have so many different PC configs: different RAM speed and size, disk space and speed, CPU, GPU, OS, etc.
All of them complain about the console, but I never heard them complaining about PC.
But they do, you just never listen. Just look at all the GDC and SIGGRAPH conferences; do you really think these people don't want to use all the new APIs and implement them in their engines? Of course they do, but they can't always.

And when they release a game, they just publish a range of GPUs they tested on, or even just think the game will run well on; as a PC gamer it's your problem to check how well that game will run. I mean, I'm not the only one who checks benchmarks for a game just to see how well it runs on my GPU, and in some cases the CPU.

Or do you really think those developers are spending budget optimizing for the GTX 700 series because it's the lowest common denominator? Making a game run on the XSS is not hard, but making a game that uses the XSX as it should and still runs decently on the XSS requires more time and budget, and sometimes it's not worth it, so you have to make sacrifices beyond resolution.
 