PCs are expected to have less stable performance than a console, because users can tweak as much or as little as they wish. A console generally has to last many years, and the hardware is what it is. The main way to optimize for it and squeeze everything out of it is to have a fixed system. Making things variable inevitably makes optimization more difficult. Sony is known to make things more difficult than necessary, though. Look at the PS3....
Sure, but with nVIDIA’s RSX not dropping the ball and developers having more time to get to grips with the CELL (nVIDIA had Sony by the balls once the Toshiba RS design was discarded), you would have heard a different song... kind of like towards the end of the generation, when not just the first-party devs were finding appreciation for what the SPEs brought to the table.
Anyway, PS3 CELL detour aside, no Cerny-architected system has been accused of being overly complicated and/or difficult to optimise for (Cerny’s time-to-triangle metric gets shorter, not longer, going from PS3 to PS4 to PS5... the PS Vita was also quite loved from the HW design and OS points of view... the track record is positive in terms of listening to developers, too).
Obviously, yes. But that was not the point. The point is that the XSX is designed so that the cooling is sufficient to run the system at a constant speed at all times. In other words, no part of the hardware is hampered from working at its maximum design capacity.
It was part of the point, as thermal constraints exist for the XSX as well; you made that point yourself in terms of SMT being turned off to allow the frequency to scale to 3.8 GHz, which we were discussing earlier... guess we agree to disagree there.
I actually suspect they went with a smaller GPU because GCN had a scaling issue. The most obvious example was the Vega 56 versus the Vega 64, which performed nearly the same despite the Vega 64 having 14% more CUs. Sony most likely figured that past 40 CUs scaling was poor, so they didn't want to risk spending die size, and thus cost, on a larger GPU. This is why Cerny was talking about the whole clock speed vs. CU count trade-off. Apparently RDNA has fixed that, though, so despite their good initial intentions, it didn't turn out right for them. If the XSX has issues scaling to 52 CUs, we'll find out soon enough.
This is an interesting angle.
The question is whether this is because of a need, since they are weak in comparison to the XSX specs, or if it really was intended like that from the beginning. The former is more logical, although the second one is not impossible.
I think the solution overall is elaborate, complex, and linked to enough other design choices to be intentional. Why build an entire design around a very high frequency target and fixed power consumption (linked to their choice of cooling and noise control) if you do not plan to use it? Or did they spend tons of money just in case, panicking that they were going to be weaker?
Sure, if they could lock it at 2.2 GHz and then let it boost to 2.4 GHz it would be even better, so there is a point in saying they are thermally limited... but hot water being hot seems a painfully obvious observation.
In terms of TFLOPS, it's over 17% even when the PS5 is boosting, so it's at least 17%. Add in the RAM bandwidth difference and the gap widens quite quickly.
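For reference, a quick back-of-the-envelope check of that figure, using the publicly announced CU counts and clocks (FP32 TFLOPS = CUs × 64 shaders per CU × 2 ops per clock × clock):

```python
# Back-of-the-envelope FP32 throughput from the announced specs.
def tflops(cus, clock_ghz, shaders_per_cu=64, ops_per_clock=2):
    """FP32 TFLOPS for an RDNA-style GPU at the given CU count and clock."""
    return cus * shaders_per_cu * ops_per_clock * clock_ghz / 1000.0

xsx = tflops(52, 1.825)  # Series X: 52 CUs at a fixed 1.825 GHz
ps5 = tflops(36, 2.23)   # PS5: 36 CUs at up to 2.23 GHz (variable clock)

print(f"XSX: {xsx:.2f} TF, PS5: {ps5:.2f} TF, gap: {(xsx / ps5 - 1) * 100:.1f}%")
# → XSX: 12.15 TF, PS5: 10.28 TF, gap: 18.2%
```

So with the PS5 at its maximum clock the gap is about 18%, and it grows whenever the PS5's variable clock dips below 2.23 GHz.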
The bandwidth difference, limited to the fast memory (I do not think accesses can be served from both the slow and fast buses at once, so a slow access will lock the bus until it is done), is there to feed the additional units; it is not the extra advantage you are looking for. RAM is an interesting subject if you take streaming into the equation, and how much space you can save with a faster streaming setup.
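To put that point in numbers, using the announced figures (XSX: 560 GB/s on the 10 GB fast pool, 336 GB/s on the 6 GB slow pool; PS5: 448 GB/s uniform across 16 GB), bandwidth per TFLOP comes out roughly even, which is consistent with the extra bandwidth mostly feeding the extra compute units:

```python
# Announced memory bandwidths (GB/s) and peak FP32 compute (TFLOPS).
xsx_fast_bw, xsx_slow_bw = 560.0, 336.0  # XSX: 10 GB fast pool, 6 GB slow pool
ps5_bw = 448.0                           # PS5: 16 GB uniform
xsx_tf, ps5_tf = 12.15, 10.28            # peak FP32 TFLOPS

print(f"XSX fast pool: {xsx_fast_bw / xsx_tf:.1f} GB/s per TFLOP")  # ~46.1
print(f"PS5:           {ps5_bw / ps5_tf:.1f} GB/s per TFLOP")       # ~43.6
```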
That still doesn't change the fact that they have to experimentally determine (i.e. a slow and tedious process) what the optimal way of doing things is.
The same as optimising your software for a console generally is, and it depends on how strong the variation in clock speed is. Not too different from finding out whether you want to go wider than 8 threads and need SMT or not. Some will assume the worst, others the best... and some something in the middle.
The system they came up with seems too refined to have been thrown together at the last minute, and given the good cooling, on which they reportedly spent a lot of money, I think any forced worst-case variation will be both relatively small and infrequent.
What is more likely to happen is that for AA and indie titles the console will be very, very quiet, and it should still be quiet enough when loaded with the best that people like ND can throw at it. We shall see. Brainiac vs. ~”Speed Demon”... fight!
Really? How on earth have you missed that? Go read the Bleeding Edge thread. You'll see 3 or 4 of the usual suspects at it. And yes, there's quite a few idiots on the other side of the fence as well, but I'm honestly surprised you haven't noticed anyone trashing Microsoft.
The same way I notice a tempest and may forget a rainfall... but if you do not want to see it... fine.
I did not say nobody was caught with their pants down trolling, but if not bowing down in shame and continuing to discuss the reality of the specs counts as trolling, then... huh?!
Then again, seeing Xbox fans trolling around and shouting "SonyGAF" as soon as anyone calls them out is nothing new either.