
Next-Gen PS5 & XSX |OT| Console tEch threaD


The Aya Neo and the GPD Win 3 already exist and are in people's hands. But yeah, Tencent has so much cash they might flood the market at very cheap prices just to get a foot in the door.

I still have my ROG Phone 2 Tencent edition. Got it for $500 from HK when the global edition was priced at $1200, and I now use it as my portable machine, like a Vita successor :D
 

ToadMan

Member
Very interesting. While my knowledge is extremely limited, going off this as well as many other comments, my takeaway is...

Xbox has more TFs. (Which is an utterly useless metric that means nothing)
Xbox may benefit from HW VRS.

Absolutely every other aspect of both the hardware and the software significantly favors the PS5.

My question then is this. Why hasn't the PS5 fared significantly better than the XSX in comparisons? Based on the info being posted here, there should be quite a gap in favor of the PS5. While a couple of the first comparisons might have supported that trend occurring, almost every single comparison since has shown the opposite, with the results not only favoring the XSX, but with a widening difference over time.

Are developers perhaps being paid money in secret alleyways somewhere to gimp the PS5 versions?

Tflops aren’t an “utterly useless metric that means nothing”. They are a single indicator of performance but not the only indicator.

You don’t find it surprising that the XSX with 2 extra TFLOPs is unable to outperform the PS5? At least not to the extent that the notional GPU delta would suggest. That was the argument being made before launch - TFLOPs was a measure of everything. It wasn’t and never has been, but that’s not to say it is a totally irrelevant number either.

I know it’s irritating - the world won’t resolve down to one easy number for you...

So that is quite a gap in favour of the PS5 - at least an 18% TFLOPs advantage has been overcome by other parts of the system.
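For anyone wondering where that ~18% figure comes from, here's the napkin math from the publicly quoted specs (a rough sketch of peak FP32 throughput only, nothing about real-world performance):

# Peak FP32 TFLOPs = CUs * 64 ALUs per CU * 2 ops per clock (FMA), divided by 1000 to go from GFLOPs to TFLOPs
def tflops(cus, ghz):
    return cus * 64 * 2 * ghz / 1000.0

ps5 = tflops(36, 2.23)    # ~10.28 TF at the PS5's maximum boost clock
xsx = tflops(52, 1.825)   # ~12.15 TF at the XSX's fixed clock
print(f"PS5 {ps5:.2f} TF vs XSX {xsx:.2f} TF -> XSX ahead by {(xsx / ps5 - 1) * 100:.1f}%")  # ~18.2%

That 18% only holds while the PS5 actually sits at 2.23 GHz, which is exactly what the variable-clock discussion further down is about.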

And they did that at the same price point with a better controller, a larger effective SSD capacity, and there’s even the option of the same hardware $100 cheaper, albeit without a Blu-ray drive.
 
Last edited:

Riky

$MSFT
On which games?

Framerate is usually a bit better on PS5; they're not on par. And there's also usually stuttering on Series X (Valhalla, Crash 4, CoD...) that you don't find on PS5.

Valhalla actually has a better framerate on Series X now that they've adjusted the resolution scaler.
[framerate comparison chart]


As for Call of Duty in 60 Hz mode, again the Xbox was slightly better.

[framerate comparison chart]


Facts matter.
 
Sure, but remember that after the last patch AC Valhalla runs at up to 60% higher resolution on PS5. Yep, higher than the Hitman 3 difference (44%). You didn't see DF writing about that; is that why you missed it? Well, VGTech didn't miss it.

PS5...and the lowest native resolution found being approximately 2432x1368.
XSX...the lowest native resolution found being 1920x1080.
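For reference, the 60% figure falls straight out of the pixel counts at those lowest measured bounds (just a quick sanity check):

ps5_low = 2432 * 1368   # ~3.33 million pixels, lowest PS5 resolution VGTech found
xsx_low = 1920 * 1080   # ~2.07 million pixels, lowest XSX resolution VGTech found
print(f"PS5 renders {(ps5_low / xsx_low - 1) * 100:.0f}% more pixels at the low end")  # ~60%

The Hitman 3 gap people keep quoting works the same way: reportedly native 2160p on XSX vs 1800p on PS5, and (3840*2160)/(3200*1800) is roughly a 44% pixel advantage the other way.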

COD has more effects on PS5. XSX completely lacks the muzzle light and most of the smoke coming out of the gun being fired. Those two effects are constantly on-screen and quite demanding (alpha effects).
 
Last edited:

Riky

$MSFT

The Hitman 3 difference is a constant 44% for the entirety of the game, with higher settings on top. As for Valhalla, the VG Tech breakdown says it actually drops to 1080p very rarely anyway, and that normally they are the same. Framerate matters though, right?

"In many scenes the PS5 and Xbox Series X render at very similar resolution in performance mode"

That's the quote.

The settings are the same on COD, as DF said; that claim was debunked. The PS5 version tanks in some areas where the scaler breaks.
 
DF conveniently missed the reduced smoke effect from the gun (when everyone else immediately saw it) and said the lack of muzzle light was a "bug" on XSX. You buy that?
 

Lysandros

Member
Don't you mean 18% at best? Not sure how the difference could possibly be higher than the theoretical maximum.



Hence why it's an 18% difference at best, not at worst.
I agree. We know for a fact that there isn't perfect/linear scaling with the number of processors/CUs; with each additional CU the efficiency per CU drops, even if it's only by a few percent. Besides, I have a hard time seeing how the PS5 wouldn't have higher CU saturation on average, due to a faster and more continuous data flow, with its individual CUs having significantly more L1 (40%) / L2 (14%) cache available at 22% higher bandwidth, plus the inclusion of cache scrubbers. Faster asynchronous shader and compute scheduling, due to the Command Processor/ACEs/HWS running at 22% higher frequency, shouldn't hurt either. The real difference in compute can only be lower than 18%.
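If anyone wants to sanity-check those percentages, they roughly fall out of the commonly cited die-analysis figures. Treat the cache numbers below as assumptions (128 KB of L1 per shader array on both chips, 4 MB vs 5 MB of L2), not official spec-sheet data:

ps5_cus, xsx_cus = 36, 52
ps5_ghz, xsx_ghz = 2.23, 1.825
l1_kb = 4 * 128                     # assumed: 4 shader arrays x 128 KB of L1 on each GPU
ps5_l2_kb, xsx_l2_kb = 4096, 5120   # assumed: 4 MB vs 5 MB of L2, per third-party die analyses

print(f"L1 per CU: {(l1_kb / ps5_cus) / (l1_kb / xsx_cus) * 100 - 100:.0f}% more on PS5")          # ~44%
print(f"L2 per CU: {(ps5_l2_kb / ps5_cus) / (xsx_l2_kb / xsx_cus) * 100 - 100:.0f}% more on PS5")  # ~16%
print(f"GPU clock (and with it on-chip cache bandwidth): {(ps5_ghz / xsx_ghz - 1) * 100:.0f}% higher on PS5")  # ~22%

Depending on which die figures you plug in you land a few points either side of the 40%/14% quoted above, but the direction is the same.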
 
Last edited:

Mr Moose

Member
The settings are the same on COD, as DF said; that claim was debunked. The PS5 version tanks in some areas where the scaler breaks.
Then they were lying; the shadows were different and there were missing transparency effects. Could've been a bug, but they weren't the same. Also missing birds lol #BirdGate


Missing transparency effects, easily noticed (timestamped).
 
Last edited:

DF conveniently missed the reduced smoke effect from the gun (when everyone else immediately saw it) and said the lack of muzzle light was a "bug" on XSX. You buy that?
Ask him how Hitman 3 has more graphical effects and settings on PS4 Pro than on Xbox One X and Series S. If Hitman 3 is a benchmark, then it means the PS4 Pro is more powerful than the Xbox One X and Series S, since they like blaming tools/bugs whenever Series X performs poorly but call it a benchmark when PS5 doesn't.

As most devs have said, including the recent Remedy dev, cross-gen games are held back by last-gen engines/last-gen code - it doesn't matter if you port it to a new console - and he explained the troubles they went through when porting Control; they are still not native games. Series X suffers in cross-gen because of XDK-to-GDK issues but excels in back compatibility, and PS5 suffers in some cross-gen games because of the way it handles PS4 BC code: its emulation isn't efficient, it doesn't improve much on last-gen code, and BC behaves the same as on PS4.

But games that are well optimised, with good engines and bigger teams, seem to function better on PS5: Call of Duty, AC Valhalla, NBA 2K21. The real benchmark will be actual next-gen games, and I can't wait, because there will be no excuses when they start coming out. There will be no airbags; you'll have to die like a real man with your underperforming console.
 
Last edited:

RaZoR No1

Member


Timestamped the relevant portion.

A must-listen for those who parrot that Series S won't impact next-gen development at all, or that you can scale down by just lowering resolution and other effects...

What I don't get:
GPU-bound scenario = lower visual effects like raytracing, resolution, LOD etc., which they already do on PCs.

CPU-bound scenario = won't / shouldn't happen, because the CPU is nearly on par with the XSX CPU.

The only bottleneck should be the RAM size and bandwidth.
But still, fewer pixels = less VRAM is needed.
A game which needs (fictional numbers) 10 GB of RAM for physics won't fit in the XSS, but when did we ever need that much RAM for the CPU?
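The "fewer pixels = less VRAM" part is easy to put rough numbers on, at least for the render targets themselves. Toy example with a made-up 20 bytes per pixel of combined G-buffer/depth/etc.; real budgets vary a lot per engine:

BYTES_PER_PIXEL = 20  # made-up figure for the combined render targets (G-buffer, depth, post buffers...)

for name, (w, h) in {"4K": (3840, 2160), "1440p": (2560, 1440), "1080p": (1920, 1080)}.items():
    mb = w * h * BYTES_PER_PIXEL / (1024 ** 2)
    print(f"{name}: ~{mb:.0f} MB of render targets")  # ~158 / ~70 / ~40 MB

Render targets and screen-sized buffers shrink nicely with resolution; CPU-side data like simulation state, animation and streaming pools mostly doesn't, which is the part the two sides here disagree about.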

These are devs which already release games on PC too, but in the past none of them ever complained that we have so many different PC configs: different RAM speed, size, disk space and speed, CPU, GPU, OS etc.
All of them complain about the console, but you never hear them complaining about PC.
On consoles they at least have a fixed environment and can find a workaround or fix issues like that; on PC there is no way to cover everything.
 
Same. To me it sounds like the devs just don't want to spend time dealing with another console. It also might depend on the engine. Something like Unreal Engine is almost endlessly scalable. Maybe Remedy's engine is shit.
 

jroc74

Phone reception is more important to me than human rights
In the past GHz was the metric for CPU speed, which also excluded a lot of important attributes that contribute to the final performance of the chips - one reason why AMD back then tried to market their CPU models with a number similar to the GHz speed of Intel CPUs, although their CPUs ran at much lower clocks.
This should be stickied somewhere, lol. I remember some of us mentioning some of this.

Everything got turned upside down when it went from Pentium to Core 2 Duo, Quad Core. Seeing the clock speeds get reset had some ppl scratching their heads.

And it's been mentioned numerous times: for the PS4 and XBO it wasn't just the TF number that made the difference.

Tflops aren’t an “utterly useless metric that means nothing”. They are a single indicator of performance but not the only indicator.

You don’t find it surprising that the XSX with 2 extra TFLOPs is unable to outperform the PS5? At least not to the extent that the notional GPU delta would suggest. That was the argument being made before launch - TFLOPs was a measure of everything. It wasn’t and never has been, but that’s not to say it is a totally irrelevant number either.

I know it’s irritating - the world won’t resolve down to one easy number for you...

So that is quite a gap in favour of the PS5 - at least an 18% TFLOPs advantage has been overcome by other parts of the system.

And they did that at the same price point with a better controller, a larger effective SSD capacity, and there’s even the option of the same hardware $100 cheaper, albeit without a Blu-ray drive.

This IMO is another thing that some ppl can't come to terms with.

There is a $399 PS5 out there that performs as well as, and at times better than, a console that's $100 more. And Sony seems to be handling the supply situation better, despite all the hurdles they have.

At some point some ppl just need to accept the situation and stop trying to rationalize it - figure it out.
 

Lysandros

Member
What I don't get:
GPU-bound scenario = lower visual effects like raytracing, resolution, LOD etc., which they already do on PCs.

CPU-bound scenario = won't / shouldn't happen, because the CPU is nearly on par with the XSX CPU.

The only bottleneck should be the RAM size and bandwidth.
But still, fewer pixels = less VRAM is needed.
A game which needs (fictional numbers) 10 GB of RAM for physics won't fit in the XSS, but when did we ever need that much RAM for the CPU?

These are devs which already release games on PC too, but in the past none of them ever complained that we have so many different PC configs: different RAM speed, size, disk space and speed, CPU, GPU, OS etc.
All of them complain about the console, but you never hear them complaining about PC.
On consoles they at least have a fixed environment and can find a workaround or fix issues like that; on PC there is no way to cover everything.
Polygon throughput can also be a problem, besides the RAM amount and bandwidth. XSS is a one-SE design with only half the number of prim units and rasterizers compared to PS5/XSX, and those blocks run at a significantly lower frequency, lowering geometry processing further.
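Rough peak numbers for that, assuming the usual one triangle per rasterizer per clock and the commonly quoted GPU clocks. The rasterizer counts just follow the "half of PS5/XSX" description above, so treat this as illustrative:

# Peak rasterized triangles per second ~= rasterizers * 1 triangle/clock * clock (illustrative assumption)
gpus = {
    "XSS": (2, 1.565),   # assumed: half the rasterizers of the big consoles
    "XSX": (4, 1.825),
    "PS5": (4, 2.23),
}
for name, (rasterizers, ghz) in gpus.items():
    print(f"{name}: ~{rasterizers * ghz:.1f} Gtri/s peak")  # XSS ~3.1, XSX ~7.3, PS5 ~8.9

On those illustrative numbers the Series S has somewhere between a third and a half of the peak geometry throughput of the big consoles, on top of its smaller memory pool.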
 

assurdum

Banned
Valhalla actually has a better framerate on Series X now that they've adjusted the resolution scaler.
[framerate comparison chart]


As for Call of Duty in 60 Hz mode, again the Xbox was slightly better.

[framerate comparison chart]


Facts matter.
It would be something else if Valhalla had a worse average fps when it can drop to 1080p on Series X... and dat FPS advantage in BLOPS, with that juicy 0.01% and 1 FPS of difference in some spots. But I guess dignity goes to fuck itself when we have to find any sort of multiplat advantage...
 
Last edited:

RaZoR No1

Member
Polygon throughput can also be a problem, besides the RAM amount and bandwidth. XSS is a one-SE design with only half the number of prim units and rasterizers compared to PS5/XSX, and those blocks run at a significantly lower frequency, lowering geometry processing further.
I didn't go into detail on these things, but all of these issues have always existed on PC, and there it works.

They could just advertise their games like: best played on XSX and PS5.

I mean, all of the XSS buyers should already know that they get all (maybe, if this doesn't turn into a 3DS / N3DS scenario) of the next-gen games, but not at the same resolution etc.

Again, like on PC:
With a GTX 1060 I can play all games but have to lower graphics settings; maybe in the future I will even have to go lower than 1080p, but I can play them. With an RTX 3090 I can still play the same games, just with more effects / eye candy.
Additionally, if ML is really a thing for the console, then there won't be a need for the XSS to render at 1080p, if AMD's DLSS-style tech can upres from 540p or 720p and still look awesome.
IMO devs are never satisfied with the tools they get, but just look at what they achieved on past consoles with much, much less compute power and completely different console architectures (the latest examples being The Last of Us and GTA V).

I think it was Turn 10 who even said that, thanks to the Xbox One X, they learned new things to improve the base Xbox One version, how to implement some features etc.
 
Last edited:

Neo Blaster

Member
God of War was never coming in 2021; you're a fucking idiot if you believed Sony. If anything it will be Horizon that's coming this year.
If Sony can solve the supply problem, then near the holidays would be the perfect time to release Horizon, but only if they show gameplay by June/July.
 

Lysandros

Member


To refresh memories a bit, from the description:

"PS5 in Performance Mode uses a dynamic resolution with the highest native resolution found being 3840x2160 and the lowest native resolution found being approximately 2275x1280. PS5 in Performance Mode rarely renders at a native resolution of 3840x2160.

Xbox Series X in Performance Mode uses a dynamic resolution with the highest native resolution found being 3840x2160 and the lowest native resolution found being 1920x1080. Xbox Series X in Performance Mode rarely renders at a native resolution of 3840x2160.

The only resolution found on PS5 in Quality Mode was 3840x2160.

Xbox Series X in Quality Mode uses a dynamic resolution with the highest native resolution found being 3840x2160 and the lowest native resolution found being approximately 3328x1872. Drops in resolution below 3840x2160 on Xbox Series X in Quality Mode seem to be uncommon."
 
Hasn't variable frequency or SmartShift actually been shown to increase performance, as stated by AMD? This could be the reason why PS5 is good in variable situations like dynamic res scaling, when there's more unexpected stuff on screen, and keeps solid framerates most of the time. Because let's face it, dynamic or variable effects are the future, i.e. dynamic res, VRS, mesh shading/procedural/adaptive tessellation, CBR, DLSS, VRR. I kind of have a hunch on that.
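Since dynamic resolution keeps coming up, here's a toy sketch of how a typical dynamic resolution scaler works - not any specific engine's implementation, just the general feedback-loop idea:

# Toy dynamic resolution scaler: nudge the render scale so GPU frame time tracks a target budget.
TARGET_MS = 16.6                  # 60 fps budget
MIN_SCALE, MAX_SCALE = 0.5, 1.0   # clamp between 50% and 100% of output resolution

def update_scale(scale, gpu_frame_ms):
    # Proportional controller: over budget -> drop resolution, under budget -> claw it back.
    error = (TARGET_MS - gpu_frame_ms) / TARGET_MS
    return max(MIN_SCALE, min(MAX_SCALE, scale + 0.25 * error))

scale = 1.0
for frame_ms in [15.0, 18.5, 21.0, 17.0, 15.5, 14.0]:   # made-up GPU timings
    scale = update_scale(scale, frame_ms)
    print(f"frame took {frame_ms:.1f} ms -> next frame renders at {int(3840 * scale)}x{int(2160 * scale)}")

This is the kind of loop the post above is arguing pairs well with hardware that can shuffle power and clocks around per frame; the reply below disagrees on how much that actually explains.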

Variable frequency on PS5 increases performance relative to a hypothetical 36 CU PS5 GPU with something like 1.8 GHz fixed clocks. There's no obvious direct connection between PS5's trend of more stable framerates and its variable frequency regime ("obvious" being the operative word).

Without knowing exactly what each individual game is doing under the hood, you cannot conclusively determine what is driving the more stable framerates on PS5. Every game is different and every engine will have different bottlenecks. So looking at one hardware feature, pointing to it and saying this is the reason for the overall better general performance of PS5 vs XSX is a mistake.
 
Last edited:

SafeOrAlone

Banned
Has anyone tried playing Steam games through the "Microsoft Edge" browser on Series S/X?

Apparently it's possible, and if the quality is good, I'd love to play games with my friends on Steam, despite having no PC.
 

Locuza

Member
Given the explanation for when this actually occurs, and for how long (i.e. millisecond inter-frame periodicity), we can realistically dismiss the variable frequency having any negative impact on PS5 performance vs XSX.

Given the performance in real games so far, I'd argue that's a pretty safe assumption.

But if it only happens with one frame out of 30, will anyone really notice it?

From Cerny's talk he made it seem like any downclocking would be extremely rare. Also, from what I understand the PS5 can shift clocks extremely quickly, down to the millisecond, due to how RDNA2 clocking works. It's not like the clocks are going to drop 30% for 5 minutes or anything like that.
To be precise, every downclock has a negative impact, even if it leads to just a 1% performance loss.
You can obviously argue that this "worst case" of only an 18% TF advantage for the Xbox Series X is the usual case, but it's simply not guaranteed that the PS5 GPU is running at 2.23 GHz all the time.
Since we don't have clock numbers, we can't tell if current games are already here and there a bit below the 2.23 GHz mark, and if not, how many next-gen games will push the clocks down.
Mark Cerny stated that they expect the GPU to run at or "close" to that frequency most of the time, and that when downclocking occurs they expect it to be pretty "minor".
Another statement was that reducing power by 10% only takes a couple of percent of lower clock speed.


However, all of that is of course not very precise and is based on "expectations", even if they are coming from Sony.
It's not like companies are right all the time or don't inject a bit of (too) optimistic marketing.

Now, based on the claims and how it should fare, I wouldn't expect major downclocking to occur, but next-gen games, which also stress the CPU, could pass the threshold and consistently lower the clocks.
On avg. the PS5 might run at 2.15 GHz in one game, 2.07 GHz in another.
Maybe the TF advantage will go from 18% to 20% - would that be a major difference?
Obviously not, but it ties back to my initial statement that 18% is the worst case: it can't be worse than that, but it can be better, without claiming that it could be much better.

Just to share my perspective on here. :)
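To put rough numbers on the two claims in that post - purely illustrative, assuming dynamic power scales roughly with frequency times voltage squared and voltage roughly tracks frequency (so power ~ f^3):

# 1) "Reduce power by 10% with only a couple percent of clock speed" (assuming power ~ f^3)
clock_factor = 0.90 ** (1 / 3)
print(f"10% less power -> clock at ~{clock_factor * 100:.1f}%, i.e. about {100 - clock_factor * 100:.1f}% lower")

# 2) What the paper TF gap looks like if the PS5 averaged a bit below its 2.23 GHz peak
def tflops(cus, ghz):
    return cus * 64 * 2 * ghz / 1000.0

xsx = tflops(52, 1.825)
for avg_ghz in (2.23, 2.20, 2.15, 2.07):
    print(f"PS5 averaging {avg_ghz:.2f} GHz -> XSX ahead by ~{(xsx / tflops(36, avg_ghz) - 1) * 100:.0f}%")

A ~2.2 GHz average lines up with the "18% to 20%" framing; the deeper 2.15/2.07 GHz hypotheticals would stretch the paper gap a bit further, but the direction of the argument stays the same.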
 
They went through this and used the example of the missing light source; it was there all along.
They never said a missing light source! What a lie this is; Series X simply had poorer raytracing in COD than PS5, and the missing alpha effects on guns were there for everyone to see.
Variable frequency on PS5 increases performance relative to a hypothetical 36 CU PS5 GPU with something like 1.8 GHz fixed clocks. There's no obvious direct connection between PS5's trend of more stable framerates and its variable frequency regime ("obvious" being the operative word).

Without knowing exactly what each individual game is doing under the hood, you cannot conclusively determine what is driving the more stable framerates on PS5. Every game is different and every engine will have different bottlenecks. So looking at one hardware feature, pointing to it and saying this is the reason for the overall better general performance of PS5 vs XSX is a mistake.
The 36 CUs are not a hypothesis, they are real, so I don't get your point there. And the bit about variable clocks versus a 1.8 GHz fixed clock is even more confusing - what do you mean here?

The GPU is 2.23 GHz and it downclocks variably per frame depending on the stress, and nobody knows how far back it downclocks - it could be something minor - so I don't get how you got 1.8, because if the PS5's GPU downclocked to 1.8 GHz it would have been performing well below Series X. And the available data we have is that PS5 performs better in most games, and in the cases I've just mentioned.

It isn't my own words - AMD themselves said SmartShift increases performance, and the variable clocks on PS5 seem to be an evolution of SmartShift, which could very well be why it keeps stable framerates. The ability to shift power each frame from CPU to GPU depending on the stress/workload keeps frame rates stable; it's the whole idea of variability and dynamism. When using dynamic res scaling you're constantly dropping and increasing resolution depending on the rendering budget in order to keep a steady frame rate, and the nature of variable rendering techniques goes hand in hand with how variable your silicon is. It's why most engineers praise PS5's design for efficiency: it's not just the variable clocks, it's the cache scrubbers, the I/O - the overall system was built for efficiency.

Series X seems to have a hard time holding frames due to its traditional power setup: static fixed clocks, no cache scrubbers, and even the I/O has no extra silicon besides the decompressor. It just doesn't seem like a system made for variable situations; Series X seems made for a certain resolution target, either full 4K or upscaled 4K, and that's why it has frame rate problems, because when a game goes over the power budget there's not much else you can do. It's a consequence of design - it just seems the PS5 is hardware-efficient in variable workloads and Series X is more static.
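For what it's worth, the SmartShift idea being described above is basically a single shared power budget that gets re-split between CPU and GPU based on who is busier. A toy sketch of the concept (nothing to do with Sony's or AMD's actual implementation):

# Toy shared-power-budget split: give each block a share proportional to how busy it was last frame.
TOTAL_WATTS = 200           # made-up SoC budget
CPU_MIN, GPU_MIN = 20, 60   # made-up floors so neither side ever starves

def split_budget(cpu_load, gpu_load):
    # loads are 0..1 utilisation estimates from the previous frame
    total = (cpu_load + gpu_load) or 1.0
    flexible = TOTAL_WATTS - CPU_MIN - GPU_MIN
    return (CPU_MIN + flexible * cpu_load / total,
            GPU_MIN + flexible * gpu_load / total)

for cpu, gpu in [(0.3, 0.9), (0.8, 0.8), (0.2, 1.0)]:   # made-up per-frame loads
    cpu_w, gpu_w = split_budget(cpu, gpu)
    print(f"CPU load {cpu:.1f}, GPU load {gpu:.1f} -> {cpu_w:.0f} W CPU / {gpu_w:.0f} W GPU")

Whether that mechanism actually explains the framerate differences in shipped games is exactly what's being argued back and forth in this thread.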
 
To be precise, every downclock has a negative impact, even if it leads to just a 1% performance loss.
You can obviously argue that this "worst case" of only an 18% TF advantage for the Xbox Series X is the usual case, but it's simply not guaranteed that the PS5 GPU is running at 2.23 GHz all the time.
Since we don't have clock numbers, we can't tell if current games are already here and there a bit below the 2.23 GHz mark, and if not, how many next-gen games will push the clocks down.
Mark Cerny stated that they expect the GPU to run at or "close" to that frequency most of the time, and that when downclocking occurs they expect it to be pretty "minor".
Another statement was that reducing power by 10% only takes a couple of percent of lower clock speed.


However, all of that is of course not very precise and is based on "expectations", even if they are coming from Sony.
It's not like companies are right all the time or don't inject a bit of (too) optimistic marketing.

Now, based on the claims and how it should fare, I wouldn't expect major downclocking to occur, but next-gen games, which also stress the CPU, could pass the threshold and consistently lower the clocks.
On avg. the PS5 might run at 2.15 GHz in one game, 2.07 GHz in another.
Maybe the TF advantage will go from 18% to 20% - would that be a major difference?
Obviously not, but it ties back to my initial statement that 18% is the worst case: it can't be worse than that, but it can be better, without claiming that it could be much better.

Just to share my perspective on here. :)

If the PS5 downclocked from 2.23 GHz with 36 CUs more often, then it would be performing worse than Series X, but that's not the case, is it? It's not like a game runs at 2.23 or 2.1 GHz constantly - that doesn't make any sense, since every game nowadays is locked 30/60 and dynamic/upscaled 4K, so the game only needs certain clocks to hit that frame rate target and resolution. Stressing the GPU at 2.23 GHz while looking at the sky or a menu doesn't make sense. That's the whole point of variable clocks: it's pointless wasting heat and power with fixed clocks.
 
Last edited:
To be precise, every downclock has a negative impact, even if it leads to just a 1% performance loss.
You can obviously argue that this "worst case" of only an 18% TF advantage for the Xbox Series X is the usual case, but it's simply not guaranteed that the PS5 GPU is running at 2.23 GHz all the time.
Since we don't have clock numbers, we can't tell if current games are already here and there a bit below the 2.23 GHz mark, and if not, how many next-gen games will push the clocks down.
Mark Cerny stated that they expect the GPU to run at or "close" to that frequency most of the time, and that when downclocking occurs they expect it to be pretty "minor".
Another statement was that reducing power by 10% only takes a couple of percent of lower clock speed.


However, all of that is of course not very precise and is based on "expectations", even if they are coming from Sony.
It's not like companies are right all the time or don't inject a bit of (too) optimistic marketing.

Now, based on the claims and how it should fare, I wouldn't expect major downclocking to occur, but next-gen games, which also stress the CPU, could pass the threshold and consistently lower the clocks.
On avg. the PS5 might run at 2.15 GHz in one game, 2.07 GHz in another.
Maybe the TF advantage will go from 18% to 20% - would that be a major difference?
Obviously not, but it ties back to my initial statement that 18% is the worst case: it can't be worse than that, but it can be better, without claiming that it could be much better.

Just to share my perspective on here. :)

We already have one very well-known insider (Matt on Ree, multiplat dev) who leaked that overall the PS5 has to be considered a 10 TF machine, implying the downclocks are indeed negligible when you look at the whole picture.

BTW, the CPU can actually either downclock or give some of its power budget to the GPU without loss of game performance when it's not being used (or not used enough).
 
Then they were lying; the shadows were different and there were missing transparency effects. Could've been a bug, but they weren't the same. Also missing birds lol #BirdGate


Missing transparency effects, easily noticed (timestamped).


As I shared in the past, I sometimes have the effect when running the game on my XSX; it seems that it disappears as a bug sometimes...

You can have a look at the VGTech video - you have the muzzle light during this sequence, for example:

 
Last edited:
Proving you wrong yet again, please stop embarrassing yourself.
Saying again" as if youve ever proved me wrong what a classic lier, a missing light source has no effect on the resolution or quality of your raytraced shadows, you can have a 100 light sources but if your raytracing has lower quality or uses less samples or a poorer denoiser its not going to change a thing. Im not saying the series x has less raytracing capabilities im just saying the settings they used on series x cod cold war are less than ps5s due to tools or whatever god knows.
Here's the video - you can go to 13:30, where NX Gamer explains the differences in raytracing between the two versions.


And also, thanks for spotting the missing light source - it seems to be another problem for the Series X version of COD, up there with the reduced muzzle flashes. Good luck.
 


Missing here.



The birds, Mason!


I know, that's what I just said: sometimes you have the effect in a sequence (like in the VGTech video I shared), and sometimes that's not the case (same sequence in the NX Gamer video). For me, that could simply be a bug, not a settings change, and we already know how buggy COD BOCW is... (I stopped playing it because of the fucking crashes).
 
Last edited: