Where are you guys getting the TMU, ROP, etc. numbers from? As far as I know, neither Sony nor MS nor AMD has provided full specs of the next-gen consoles' GPUs, and I don't think they ever will, but maybe I missed something?
The dev prefers an extra 8.9 GB/s vs 100 MB/s over an extra 4.7 GB/s vs 100 MB/s, despite the latter already being a whole 4800% difference vs an HDD. More is always better, but more ACTUAL bandwidth, memory and processing power, not more percentage points. The way you and some others paint it, a car with an extra 20 HP is the same as a car with an extra 200 HP, if not better, which is just plain wrong. That's what that "most profound moment" in MGS2 was talking about: people are being given content without context and are just making everything up themselves, while, as I've said a few times already, the first batch of 3rd party titles will show what's what.
20 or 200 more hp means shit by itself.
200 is ten times 20.
Cars with 100 hp and 120 hp have the same relative difference as cars with 1000 hp and 1200 hp. The linear difference will be very different, since those extra 200 hp add far more mph, and the base speed of the latter pair is also very different from the former.
What we are facing here is:
Two 1000+ hp cars need to run 50 meters. Both could do it in around 3 secs (just saying); the 1200 hp car is 20% faster, so it would take a bit less than 3 secs.
The 100 hp car would take 10 secs, the 120 hp car 8 secs.
The difference comes out to 0.X secs vs 2 secs, despite the linear difference (the actual mph gap) being ten times bigger on the 1000+ hp cars. Not only does it not result in a 10x actual difference, it gets even less relevant the further you go.
The 50-meter track is 4K, for instance.
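The arithmetic above can be sketched in a few lines (the 2.5 s figure for the 1200 hp car is my own placeholder for "a bit less than 3 secs"; none of these are measurements):

```python
# Times asserted in the analogy: same +20% power gap, fixed 50 m track.
times = {
    "100 hp": 10.0,   # seconds
    "120 hp": 8.0,
    "1000 hp": 3.0,
    "1200 hp": 2.5,   # placeholder for "a bit less than 3 secs"
}

# Absolute time saved by the same 20% relative advantage:
slow_gap = times["100 hp"] - times["120 hp"]     # at the slow end
fast_gap = times["1000 hp"] - times["1200 hp"]   # at the fast end

print(f"slow pair: {slow_gap:.1f} s saved")
print(f"fast pair: {fast_gap:.1f} s saved")
```

Same 20% on paper, but the absolute gain shrinks from 2 seconds to half a second as the baseline improves — which is the whole point about diminishing returns.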
Bonus example: a little ball falls from up high, you don't die. A 2x bigger ball falls, you die. Big difference, right?
Now, a car falls from up high, you die. A 2x bigger car falls... you fucking die.
Because of human limits, at some point it's all the same. In that example we were talking about human resilience; with consoles we're talking about human perception.
SeX could have a noticeable advantage IF devs use all the extra resources for fps (because at 60 you still don't hit big diminishing returns) and try to make the PS5 version run exactly like the SeX version, in a context where SeX is pushed to the limit. That would result in a 15% difference, which is not so little at 60 fps.
But at 4K it is, because resolution has seen far more improvement than fluidity, quadrupling every gen. If the same had happened to fps, we'd now be arguing over a 15% difference on top of 1000 fps, which is complete bollocks.
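The same proportions-vs-absolutes point for framerate, as a sketch (the baselines are illustrative, not console measurements):

```python
# A flat 15% performance edge applied at different hypothetical baselines.
for base_fps in (30, 60, 1000):
    extra = base_fps * 0.15
    print(f"{base_fps} fps baseline -> +{extra:.1f} fps from a 15% edge")
```

+9 fps on a 60 fps baseline is something you can feel; +150 fps on top of 1000 is meaningless to human perception, even though it's a far bigger absolute number.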
On the other hand, devs could use dynamic res to hit 4K when possible, further diminishing the difference, or they could leave the res the same and scale assets a bit, which, again, in a photoreal context doesn't mean shit.
Congratulations, you are discovering proportions and diminishing returns.