
PS5 Pro Specs Leak Is Real, Releasing Holiday 2024 (Insider Gaming)

onQ123

Member
Have been warning people for a while now


Self-quoting?

Have you no shame? 😂
 
I went back and looked at the docs, and the 45% increase is in rendering performance, coming from the increased GPU performance.

I hope you are right though. Regardless, I am day one, and if people still love me I will get it a few days early.
I believe this 45% is like the 14 + 4 CUs Cerny thing on PS4. It's meant for developers, not for players. Here, 45% is coincidentally (or not) almost the perfect average between the bandwidth increase and the TFLOPS increase. It should be an honest estimate of the rendering improvement for PS5 games if you don't use AI, the new RT features or dual-issue optimization. So developers can already aim for a 45% improved native resolution by default (e.g. from 1080p to 1296p), then add new features on top like AI upscaling and/or new RT effects.
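To make that averaging claim concrete, here is a minimal sketch using the commonly cited leak figures (448 GB/s vs 576 GB/s bandwidth and ~10.3 vs 16.75 single-issue TFLOPS are assumptions from the leak, not confirmed specs):

```python
# Minimal sanity check of the "45% is roughly the average of the bandwidth and
# TFLOPS gains" idea. All figures are leak numbers, not confirmed specs.

ps5_bw, pro_bw = 448.0, 576.0       # GB/s
ps5_tf, pro_tf = 10.28, 16.75       # single-issue FP32 TFLOPS

bw_gain = pro_bw / ps5_bw - 1       # ~29%
tf_gain = pro_tf / ps5_tf - 1       # ~63%
avg_gain = (bw_gain + tf_gain) / 2  # ~46%, close to the quoted 45%

# A ~45% pixel-count increase is ~20% per axis, i.e. roughly the 1080p -> 1296p example.
new_height = 1080 * 1.45 ** 0.5     # ~1300
print(f"bw +{bw_gain:.0%}, tf +{tf_gain:.0%}, avg +{avg_gain:.0%}, ~{new_height:.0f}p")
```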
 

Evolved1

make sure the pudding isn't too soggy but that just ruins everything
No no, it's a 67 TF super monster that won't be eclipsed until the PS7

Oh, and it's going to be a super low price of $299 on Black Friday

It will also be bundled with a free 77" LG G3 OLED TV and have a $500 mail-in rebate
I'll have to pass then. It's impossible to move a TV that size by yourself. Oh well.
 

Dorfdad

Gold Member
Honestly, I understand the ray tracing buzzword and the benefits; I just don't believe we should be creating consoles with half-assed implementations of this stuff. Either do it all or don't; it's ruining gaming, squeezing frames from games to sell on looks. If you can't get a stable 40 FPS with ray tracing, don't include it. It's not that important.
 

bitbydeath

Gold Member
I went back and looked at the docs, and the 45% increase is in rendering performance, coming from the increased GPU performance.

I hope you are right though. Regardless, I am day one, and if people still love me I will get it a few days early.
Rendering performance is a very specific task, which could throw every estimate out. That said:

We know they're aiming for 4K/60FPS+ and 8K/30FPS. How they achieve it matters not.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Honestly, I understand the ray tracing buzzword and the benefits; I just don't believe we should be creating consoles with half-assed implementations of this stuff. Either do it all or don't; it's ruining gaming, squeezing frames from games to sell on looks. If you can't get a stable 40 FPS with ray tracing, don't include it. It's not that important.
Well, Avatar is the biggest game to use RT features so far and they are using everything from RTGI to reflections and shadows, and they just released a 40 fps patch which DF found to be very close to the 30 fps quality mode in pixels.

I think even this half-assed implementation that we see in the base PS5 is strong enough to do 3 RT features at a very decent framerate and image quality. PS5 Pro is set to be 2-4x faster, which could mean 90%-160% better performance. You will get way more than 40 fps with all the bells and whistles from the PS5 Pro, because AMD is finally getting their shit together and Sony, as always, is benefiting from it.

I think getting RT in the PS5 is extremely important because it lets devs get that head start before we migrate to fully ray traced games next gen. They are getting crucial experience this gen instead of next gen, and now we know exactly what's missing hardware-wise and what devs need to do on the software front to extract more out of these consoles. Optimize, optimize, optimize.

As for whether or not it's important, I think devs themselves have repeatedly said how much of a time saver realtime GI (RT or otherwise) can be when authoring assets and creating lighting bakes. It will save on dev time, save on disk space, and who knows, maybe even VRAM and GPU power by switching to realtime lighting, reflections and shadows. Screen-space reflections and shadows are still the biggest performance killers on PCs outside of RT effects. If devs can get dedicated hardware that can handle these features in a very optimal way, it should technically free up the rest of the GPU to push other visual effects.

This console is a step in the right direction. This is where devs want to go. And who knows, maybe in 3-4 years Sony and AMD might learn from their mistakes with the PS5 Pro and produce something that can do path tracing on the PS6.
 

S0ULZB0URNE

Member
Pro is actually more customized than standard PS5, it has (not seen before) RT and ML parts.

But PS5 was:

- 36 CU RDNA1 + RT from RDNA2
- 36 CU RDNA2 - VRS, SFS, MS and L3 cache

You can like whatever answer you want but fact is we know exactly what PS5 GPU represents - add 45% to that and you have raw raster power of PS5 Pro.
Which makes your off-the-shelf PC component comparisons even more silly.

You can think what you want.
If Sony releases the specs and they say 33.5 TFs... then anyone saying anything less is trolling.
 

OverHeat

« generous god »
Which makes your off-the-shelf PC component comparisons even more silly.

You can think what you want.
If Sony releases the specs and they say 33.5 TFs... then anyone saying anything less is trolling.
Still a long way from a base 4090's 82.58 FP32 TFLOPS.
 
Last edited:

Akuji

Member
what is "our" 4090?
If anyone thinks the PS5 Pro is designed to compete with PCs that are equipt with a 4090 and 7950x3d then i dont know what to say ...
 

OverHeat

« generous god »
what is "our" 4090?
If anyone thinks the PS5 Pro is designed to compete with PCs that are equipt with a 4090 and 7950x3d then i dont know what to say ...
Well, I bought a PS5 and will buy a PS5 Pro; these things add up fast, almost the price of a video card… 😜 for one generation of console.
 
Last edited:

Lysandros

Member
It is an RDNA3 number, so it is 16.75 real TFLOPS. That's in line with a 50% increase in perf.
10.3 TF to 16.75 TF is a ~62% increase, not quite in line with 45%. Double-rate FP32 is also a 'real' hardware feature and can bring an additional ~10-15% of compute throughput with decent use; it is not non-existent. I would imagine it will require some effort to use, though.
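A quick sketch of the arithmetic in this exchange; the 10.3 and 16.75 TF figures come from the leak, and the 10-15% dual-issue benefit is the poster's estimate, not a measured number:

```python
# Percent uplift from the leaked single-issue FP32 numbers.
base_tf = 10.3          # PS5 (36 CUs x 64 ALUs x 2 ops x ~2.23 GHz)
pro_tf_single = 16.75   # Pro, i.e. half of the 33.5 TF dual-issue headline

uplift = pro_tf_single / base_tf - 1
print(f"single-issue uplift: {uplift:.0%}")   # ~62-63%, vs the quoted 45% perf gain

# Dual-issue FP32 doubles the paper peak (33.5 TF), but only a fraction is
# typically realised; the ~10-15% estimate above would land around:
effective = pro_tf_single * 1.125             # midpoint of the 10-15% range
print(f"effective with dual issue: ~{effective:.1f} TF")
```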
 

Dorfdad

Gold Member
Well, Avatar is the biggest game to use RT features so far and they are using everything from RTGI to reflections and shadows, and they just released a 40 fps patch which DF found to be very close to the 30 fps quality mode in pixels.

I think even this half-assed implementation that we see in the base PS5 is strong enough to do 3 RT features at a very decent framerate and image quality. PS5 Pro is set to be 2-4x faster, which could mean 90%-160% better performance. You will get way more than 40 fps with all the bells and whistles from the PS5 Pro, because AMD is finally getting their shit together and Sony, as always, is benefiting from it.

I think getting RT in the PS5 is extremely important because it lets devs get that head start before we migrate to fully ray traced games next gen. They are getting crucial experience this gen instead of next gen, and now we know exactly what's missing hardware-wise and what devs need to do on the software front to extract more out of these consoles. Optimize, optimize, optimize.

As for whether or not it's important, I think devs themselves have repeatedly said how much of a time saver realtime GI (RT or otherwise) can be when authoring assets and creating lighting bakes. It will save on dev time, save on disk space, and who knows, maybe even VRAM and GPU power by switching to realtime lighting, reflections and shadows. Screen-space reflections and shadows are still the biggest performance killers on PCs outside of RT effects. If devs can get dedicated hardware that can handle these features in a very optimal way, it should technically free up the rest of the GPU to push other visual effects.

This console is a step in the right direction. This is where devs want to go. And who knows, maybe in 3-4 years Sony and AMD might learn from their mistakes with the PS5 Pro and produce something that can do path tracing on the PS6.
Hope so, but I'm worried they will just add more bells and whistles and it will look better, but we're going to be stuck with sub-60 gameplay at the expense of more shadows and reflections.
 

ABnormal

Member
  • To that point above, many have already seen that if you try to reverse engineer a GPU clock speed from that 33.5TFLOP figure, you get a clock lower than the base PS5. NEVER in the history of video game hardware have we seen any "revision" of a system releasing AFTER the original system (of the same generation) with a lower clock speed than the base. That makes absolutely no sense, has never happened, and would have negative implications to how even existing games would run from a backwards compatibility perspective. Yet, 33.5 TFLOPs it is :messenger_smirking:
Actually it makes sense for an iteration like the PS5 Pro: it's just a small jump made to push double the performance (or better, double the actual RESULT on screen) of base PS5 games. Either double frame rate or double resolution, or a compromise of both. Consoles have to stay within certain thresholds of heat and power consumption, and a bigger GPU like the PS5 Pro's, if pushed to the same clock speeds, would heat up too much for those standards (but we still have to see how much the reduction in node size will impact that aspect). This is a console which has to be priced only a little above a base PS5, with comparable dimensions, energy consumption and form factor.

The time for big steps is not now: that will come, as usual, with a new generation, and the generational leap needs to be felt as a significant leap. That would be hampered by pushing for a several-times-more-powerful Pro (even a very pricey one).

The point is that it is not made for a huge jump, just to enjoy PS5 games at full resolution AND 60 fps. That's all. And as said, it's not important how it does that: what matters is the actual result on screen. AI upscaling may have reached a point where an AI-upscaled image is so close to an actual native 4K image that it becomes economically stupid to brute-force it with raw processing power based on costly GPU parts. There are diminishing returns on the hardware side too, obviously.

And I am one who will surely buy it, even just to have a few great games to play on it, especially on PSVR2 (if it allows pushing for native 90 fps, high-resolution games, I'm all for it).
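For reference, a back-of-envelope version of the "reverse-engineered clock" point in the quote above, built on the leaked 60 CU / 33.5 TF figures (assumptions, not confirmed specs):

```python
def tflops(cus, clock_ghz, ops_per_clock=2):
    # RDNA-style peak FP32: CUs x 64 ALUs x ops-per-clock x clock (GHz) / 1000
    # ops_per_clock: 2 for single issue (FMA), 4 with RDNA3-style dual issue
    return cus * 64 * ops_per_clock * clock_ghz / 1000

print(f"base PS5:          {tflops(36, 2.23):.2f} TF")       # ~10.28 TF
pro_clock = 33.5 * 1000 / (60 * 64 * 4)                      # clock implied by 33.5 TF with dual issue
print(f"implied Pro clock: {pro_clock:.2f} GHz")             # ~2.18 GHz, below the PS5's 2.23 GHz
print(f"Pro single-issue:  {tflops(60, pro_clock):.2f} TF")  # ~16.75 TF
```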
 
Last edited:

lh032

I cry about Xbox and hate PlayStation.
Well, I bought a PS5 and will buy a PS5 Pro; these things add up fast, almost the price of a video card… 😜 for one generation of console.
Most people will upgrade their GPU/CPU or both after one generation. (Especially you ;), let's not pretend here.)

besides, you are comparing a whole machine to a PC part lol
 
Last edited:

Sanepar

Member
10.3 TF to 16.75 TF is a ~62% increase, not quite in line with 45%. Double-rate FP32 is also a 'real' hardware feature and can bring an additional ~10-15% of compute throughput with decent use; it is not non-existent. I would imagine it will require some effort to use, though.
Real perf doesn't scale in the same proportion.
 

JimRyanGOAT

Member
I want the Pro but there's not much going on this gen


Like besides Silent Hill and GTA6, I don't see anything worth buying on PS5 instead of my PC


Even COD games don't hit anymore, and that was always the safe buy :messenger_grinning_sweat:

Sorry for stating something that's been said 300 trillion times this gen lmfao

 

Fafalada

Fafracer forever
- 36 CU RDNA2 - VRS, SFS, MS and L3 cache
Just SF.
SFS is one of the algorithms that use SF.

Guys........ the PS5 Pro will perform like a console that's 15 TFs with added/better ray tracing and machine learning with PSSR.
The same docs that quote 45% speed improvement also quote game improvements on par with PS4 Pro. So that's probably a better measuring stick than TF/NoTF nomenclature.
 

ChiefDada

Gold Member
Pro is actually more customized than standard PS5, it has (not seen before) RT and ML parts.

But PS5 was:

- 36 CU RDNA1 + RT from RDNA2
- 36 CU RDNA2 - VRS, SFS, MS and L3 cache

You can like whatever answer you want but fact is we know exactly what PS5 GPU represents - add 45% to that and you have raw raster power of PS5 Pro.

I'd rather have PS5 Pro's 45% GPU upgrade with 2-4x RT power and PSSR than a 7900 XT thrown in with equivalent TF and feature set. I think most others will agree when they see the QUALITY of pixels on screen.
 
Yeah.......... I'm confused why people are lying to themselves on this. Guys........ the PS5 Pro will perform like a console that's 15 TFs with added/better ray tracing and machine learning with PSSR. Overall, it'll probably feel like a console that's 70% better than the Xbox Series X in games with ray tracing turned on (like Cyberpunk) or 30-40% better without ray tracing. But don't expect a next-gen level jump here.
 

Loxus

Member
It would be hilarious if the PS5 Pro leak document from MLiD and Tom Henderson was fake, we're all here arguing about 45% and 33.5 TF for nothing, and Sony ends up using AI/ML FSR4.

MLiD not responding to his video being taken down seems ever more fishy. Dude should have made a scene out of that.

Probably just me wanting more PS5 Pro info.
 

SlimySnake

Flashless at the Golden Globes
It would be hilarious if the PS5 Pro leak document from MLiD and Tom Henderson was fake, we're all here arguing about 45% and 33.5 TF for nothing, and Sony ends up using AI/ML FSR4.

MLiD not responding to his video being taken down seems ever more fishy. Dude should have made a scene out of that.

Probably just me wanting more PS5 Pro info.
DF confirmed from their own sources. The leak is real.
 
It would be hilarious if the PS5 Pro leak document from MLiD and Tom Henderson was fake, we're all here arguing about 45% and 33.5 TF for nothing, and Sony ends up using AI/ML FSR4.

MLiD not responding to his video being taken down seems ever more fishy. Dude should have made a scene out of that.

Probably just me wanting more PS5 Pro info.
Team real
 

onQ123

Member
Yeah.......... I'm confused why people are lying to themselves on this. Guys........ the PS5 Pro will perform like a console that's 15 TFs with added/better ray tracing and machine learning with PSSR. Overall, it'll probably feel like a console that's 70% better than the Xbox Series X in games with ray tracing turned on (like Cyberpunk) or 30-40% better without ray tracing. But don't expect a next-gen level jump here.
This!

But having such a big focus on RT & ML vs the normal rendering pipeline will cause a shift closer to a new generation than the PS4 Pro & Xbox One X were.


The consoles we have now can do some pretty cool stuff, but devs have to weigh the pros & cons of trying to push RT & ML over traditional rendering. PS5 Pro is like: OK, those are your results with the traditional pipeline; we'll give you a little more to push things in that direction, and now there is no excuse not to push RT & ML.

Example: a dev looks at something they could do on PS5/Xbox Series X, but they would have to use over 50% of the GPU compute to do it and it would lower the normal rendering by a lot, so they choose a higher resolution instead. With PS5 Pro, the dev will have more than that 50% of compute just sitting there to be used for extra stuff. If you need to drop the resolution, ML is there to clean it up.
 

Perrott

Gold Member
What do we think the "Ultra Boost Mode" mentioned by The Verge would entail?

It does seem pretty clear by now that when running titles optimized for it, the PS5 Pro will have 60 CUs clocked at 2.18 GHz. But going by how Sony has tackled backwards compatibility for unoptimized titles in the past, it's fair to expect the Pro to be limited to the 36 CUs of the standard PS5 when running in Ultra Boost Mode.

The thing is, the Pro's GPU clock speed in Ultra Boost Mode not only cannot be the same as in its 60 CU configuration, because that's actually a lower clock than the standard PS5's, but it also necessarily has to be greater than the base model's, since there wouldn't be any "Boost" if the speeds were slower than or equal to those of the base model.

So how high could a 36 CU Ultra Boost Mode be clocked within the console's power budget and the constraints of its graphics architecture? 2.5 GHz? 3 GHz?
 
Last edited:
What do we think the "Ultra Boost Mode" mentioned by The Verge would entail?

It does seem pretty clear by now that when running titles optimized for it, the PS5 Pro will have 60 CUs clocked at 2.18 GHz. But going by how Sony has tackled backwards compatibility for unoptimized titles in the past, it's fair to expect the Pro to be limited to the 36 CUs of the standard PS5 when running in Ultra Boost Mode.

The thing is, the Pro's GPU clock speed in Ultra Boost Mode not only cannot be the same as in its 60 CU configuration, because that's actually a lower clock than the standard PS5's, but it also necessarily has to be greater than the base model's, since there wouldn't be any "Boost" if the speeds were slower than or equal to those of the base model.

So how high could a 36 CU Ultra Boost Mode be clocked within the console's power budget and the constraints of its graphics architecture? 2.5 GHz? 3 GHz?
It's not just about how high they can clock the CUs. The PS5 Pro seems to have around 25% higher VRAM bandwidth, which matters a lot in bandwidth-heavy scenarios, as well as a 10% higher-clocked CPU. These will make a difference (depending on the scenario, though).
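To put rough numbers on that question, here is a speculative sketch of what a hypothetical 36 CU "Ultra Boost" configuration would deliver at the clocks being floated above; every figure here is derived from the leak, not official specs:

```python
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000   # single-issue FP32

base_ps5 = tflops(36, 2.23)                  # ~10.28 TF
for clock in (2.23, 2.5, 3.0):
    t = tflops(36, clock)
    print(f"36 CUs @ {clock:.2f} GHz -> {t:.2f} TF ({t / base_ps5 - 1:+.0%} vs base PS5)")

# Matching the full 60 CU @ ~2.18 GHz profile with only 36 CUs would need ~3.63 GHz,
# so any realistic boost mode is limited by clocks, bandwidth and power.
print(f"clock needed to match 60 CUs: {60 * 2.18 / 36:.2f} GHz")
```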
 

Lysandros

Member
Real perf doesn't scale on the same proportion.
I am well aware, not saying that it does. I only think that people are being way too fixated on this 45% figure without understanding the nuances and variability.

At the same time it's somewhat funny that the "XSX is 20% more powerful than PS5 because of teraflops" narrative doesn't quite apply when it comes to PS5/PRO anymore... We are finally there, it only took Sony to build another beefier machine for the general acceptance apparently, which is somewhat telling.
 
Last edited:

Bojji

Member
I am well aware, not saying that it does. I only think that people are being way too fixated on this 45% figure without understanding the nuances and variability.

At the same time it's somewhat funny that the "XSX is 20% more powerful than PS5 because of teraflops" narrative doesn't quite apply when it comes to PS5/PRO anymore... We are finally there, it only took Sony to build another beefier machine for the general acceptance apparently, which is somewhat telling.

We didn't know the raw power of the Xbox the way we know it for the Pro; with this information, teraflops numbers are irrelevant.

Before launch we only had raw specs for the consoles, and (almost) everyone predicted that a 12 TF RDNA2 GPU would be faster than a 10 TF RDNA2 GPU; teraflops numbers work best when comparing GPUs within the same architecture. But reality was different: PS5 has some hardware advantages over Xbox (that most people ignored before launch); it just loses in compute and memory bandwidth.

Obviously pro will get much bigger boost in games with RT and overall boost in image quality thanks to PSSR.

What is sad to me is that usage of PSSR is not guaranteed; developers are often lazy/incompetent/focused on other things, and many really good options are not implemented.
Sony probably isn't forcing anyone to use it, so it may be underutilized, like 120Hz refresh rate with LFC, 40FPS/Hz modes, etc.
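For context, the pre-launch "~20% more powerful" talking point came straight from the paper specs; a quick illustration using the public launch numbers:

```python
xsx_tf = 52 * 64 * 2 * 1.825 / 1000   # Series X: 52 CUs @ 1.825 GHz -> ~12.15 TF
ps5_tf = 36 * 64 * 2 * 2.23 / 1000    # PS5: 36 CUs @ up to 2.23 GHz (variable) -> ~10.28 TF
print(f"{xsx_tf:.2f} TF vs {ps5_tf:.2f} TF -> {xsx_tf / ps5_tf - 1:+.0%} on paper")
```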
 

SlimySnake

Flashless at the Golden Globes
What is sad to me is that usage of PSSR is not guaranteed; developers are often lazy/incompetent/focused on other things, and many really good options are not implemented.
Sony probably isn't forcing anyone to use it, so it may be underutilized, like 120Hz refresh rate with LFC, 40FPS/Hz modes, etc.
Yeah, RDR2 is the perfect example. A completely broken checkerboard implementation while X1X brute-forced native 4K. This is why I take all these secret sauce customizations with a grain of salt. No one utilized Cerny's IO and SSD for anything other than fast loading. The whole changing of design paradigms didn't happen. Insomniac did use it for portals and faster traversal, so I guess one dev out of hundreds used his secret sauce the way it was intended.

PS4 Pro's checkerboarding wasn't even used by Naughty Dog. They didn't even bother playing around with it for Uncharted 4, Lost Legacy and TLOU2. That's four years, and they basically said fuck it, I'm not even gonna try and learn to make it work with our engine. Other devs like Bend, Sucker Punch and Polyphony Digital settled for 1800p checkerboard, which isn't as clean as the 2160p CB we saw in GOW and Horizon ZD.

So if Sony wants everyone to use 4K PSSR performance, how do we know most devs will even target that?
 

Lysandros

Member
We don't have full details on how the PS5 software handles Mesh Shaders, however it should support them fully on a hardware level. Remedy developers said they didn't have to make any specific optimisations for the PS5 in regards to their Mesh Shader implementation which is interesting. They only mentioned the Meshlet sizes were different across the platforms.

Alan Wake isn't the only game that uses Mesh Shaders, by the way; so does Avatar FOP, and the performance difference on that one is negligible between the PS5 and Series X, even though theoretically the Series X is supposed to perform better (Mesh Shaders scale well with compute).
PS5 is running the game better than XSX in the new 40 FPS mode apparently (at 23:25 onwards, video below). And at release nearly all comparison videos besides Tom's showed the game being more stable on PS5. There are also other hardware factors besides compute that determine Mesh Shader performance, such as the Geometry Engine.

 
Last edited:
What do we think the "Ultra Boost Mode" mentioned by The Verge would entail?

It does seem pretty clear by now that when running titles optimized for it, the PS5 Pro will have 60 CUs clocked at 2.18 GHz. But going by how Sony has tackled backwards compatibility for unoptimized titles in the past, it's fair to expect the Pro to be limited to the 36 CUs of the standard PS5 when running in Ultra Boost Mode.

The thing is, the Pro's GPU clock speed in Ultra Boost Mode not only cannot be the same as in its 60 CU configuration, because that's actually a lower clock than the standard PS5's, but it also necessarily has to be greater than the base model's, since there wouldn't be any "Boost" if the speeds were slower than or equal to those of the base model.

So how high could a 36 CU Ultra Boost Mode be clocked within the console's power budget and the constraints of its graphics architecture? 2.5 GHz? 3 GHz?
Yes. They'll deactivate 24 CUs and will be able to overclock both the CPU and GPU well above the baseline clocks. That's my guess.
PS5 is running the game better than XSX in the new 40 FPS mode apparently (at 23:25 onwards, video below). And at release nearly all comparison videos besides Tom's showed the game being more stable on PS5. There are also other hardware factors besides compute that determine Mesh Shader performance, such as the Geometry Engine.


Yes, still those streaming stutters on XSX not seen on PS5 in Avatar, and not hidden by VRR. Same thing with the latest Lords of the Fallen update: still those annoying (unprofiled by those lazy devs!) traversal stutters on XSX and PC, but not on PS5? What's happening here? No narrative or plausible explanation from the DF crew? Still a mysterious phenomenon? Also, the Fallout 4 update looks to perform better on PS5! Haha.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Yes. They'll deactivate 24 CUs and will be able to overclock both the CPU and GPU well above the baseline clocks. That's my guess.

Yes, still those streaming stutters on XSX not seen on PS5 in Avatar, and not hidden by VRR. Same thing with the latest Lords of the Fallen update: still those annoying (unprofiled by those lazy devs!) traversal stutters on XSX and PC, but not on PS5? What's happening here? No narrative or plausible explanation from the DF crew? Still a mysterious phenomenon? Also, the Fallout 4 update looks to perform better on PS5! Haha.
Poor optimization, nothing more. The XSX has more than enough power to run games at a decent resolution and framerate. Those drops to 0 fps are just bugs. Same shit as Skyrim on the PS3.

The PS5 and XSX literally have the same RDNA2 GPU architecture and Zen 2 CPU. Both use GDDR. There is no ESRAM vs GDDR fiasco this gen. No Cell vs Xenon processor nonsense. No Nvidia vs ATI GPU delta. If it were DirectX issues then it should've affected the PC versions as well. This is just a very simple case of people not playing their own games before they are shipped.

XSX outperforms the PS5 in a lot of games. If these issues were a hardware thing then every single game would be affected. It's possible that the slower clocks and wide design are making the XSX perform roughly on par with the PS5 in some games, but those traversal stutters in Avatar and 0 fps bugs are not due to PS5's super secret Cerny IO sauce or higher clocks. They are just bugs. There is nothing for DF to investigate in those scenarios. That's for the devs to investigate.

Most of us on this board are millennials. We should all remember how tearing and texture filtering were a massive issue in virtually every single UE3, Ass Creed and Ubisoft game on the PS3. Nothing worked until around 2010, when they all mysteriously went away. All of a sudden, AC games, EA games and virtually every third-party game was virtually identical to the X360, which was supposedly the more powerful console. It turned out devs just got more familiar with the PS3 hardware, plus all the work Sony engineers put in to train devs on how to code the Cell. Hell, even ND were baffled by the tearing and shipped Uncharted 1 with severe screen tearing, only to be told by GG that the fix was rather simple, and it took them a week to add it into Uncharted 2 or something silly like that. But that was the Cell, and it was the early days of HD-era game development. XSX architecture is anything but exotic, so these bizarre traversal stutters and 0 fps issues are indeed lazy devs.
 

Lysandros

Member
Yes, still those streaming stutters on XSX not seen on PS5 in Avatar, and not hidden by VRR. Same thing with the latest Lords of the Fallen update: still those annoying (unprofiled by those lazy devs!) traversal stutters on XSX and PC, but not on PS5? What's happening here? No narrative or plausible explanation from the DF crew? Still a mysterious phenomenon? Also, the Fallout 4 update looks to perform better on PS5! Haha.
What is happening? Anything but the hardware of course. In all seriousness, a graphically intensive current gen game having RT GI and using mesh shaders running better on PS5 is quite the narrative breaker.
 

onQ123

Member
Poor optimization, nothing more. The XSX has more than enough power to run games at a decent resolution and framerate. Those drops to 0 fps are just bugs. Same shit as Skyrim on the PS3.

The PS5 and XSX literally have the same RDNA2 GPU architecture and Zen 2 CPU. Both use GDDR. There is no ESRAM vs GDDR fiasco this gen. No Cell vs Xenon processor nonsense. No Nvidia vs ATI GPU delta. If it were DirectX issues then it should've affected the PC versions as well. This is just a very simple case of people not playing their own games before they are shipped.

XSX outperforms the PS5 in a lot of games. If these issues were a hardware thing then every single game would be affected. It's possible that the slower clocks and wide design are making the XSX perform roughly on par with the PS5 in some games, but those traversal stutters in Avatar and 0 fps bugs are not due to PS5's super secret Cerny IO sauce or higher clocks. They are just bugs. There is nothing for DF to investigate in those scenarios. That's for the devs to investigate.

Most of us on this board are millennials. We should all remember how tearing and texture filtering were a massive issue in virtually every single UE3, Ass Creed and Ubisoft game on the PS3. Nothing worked until around 2010, when they all mysteriously went away. All of a sudden, AC games, EA games and virtually every third-party game was virtually identical to the X360, which was supposedly the more powerful console. It turned out devs just got more familiar with the PS3 hardware, plus all the work Sony engineers put in to train devs on how to code the Cell. Hell, even ND were baffled by the tearing and shipped Uncharted 1 with severe screen tearing, only to be told by GG that the fix was rather simple, and it took them a week to add it into Uncharted 2 or something silly like that. But that was the Cell, and it was the early days of HD-era game development. XSX architecture is anything but exotic, so these bizarre traversal stutters and 0 fps issues are indeed lazy devs.
What the hell is this post?
 

SlimySnake

Flashless at the Golden Globes
all games are not created the same.
So you agree it's software. Glad to hear it.

I also love how you conveniently ignored the fact that we were discussing massive stutters to 0 fps in Lords of the Fallen and huge stutters in Avatar's 30 and 40 fps modes that drop 10-20 fps in a second, almost randomly. It was literally in the video that was posted. That is what's being discussed, not the average performance of XSX vs PS5, where the slower clocks or wide design of the XSX HARDWARE might be keeping it from consistently outperforming the PS5.
 