Allandor
After reading your reply, it feels like quoting it would be wrongly validating the strawman arguments you've made.
To claim someone using specific terms with technical meanings is using marketing buzzwords - without listing those supposed buzzwords - is disingenuous, because the terms were used by Cerny, and he's not marketing anything other than how smart and technically proficient he is. So please state "the buzzwords" you take issue with.
Your comment about the cache scrubbers and priority levels indicates you have no experience in data comms. If you don't understand that memory bandwidth contention lessened in one area benefits everything that shares/contends for that bandwidth - such as the IO complex (and by extension the SSD controller and SSD), the CPU and the GPU - then it seems that no matter what I write, you're going to wave it away.
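To make the contention point concrete, here is a toy proportional-sharing model of a bus. The 448 GB/s figure is the PS5's published GDDR6 bandwidth; the per-client demands are purely hypothetical numbers chosen to show that trimming one client's traffic (e.g. cache scrubbers removing redundant GPU re-fetches) improves what every other client gets.

```python
# Toy model of shared-bus bandwidth contention.
# 448 GB/s is the PS5's published GDDR6 bandwidth; all per-client
# demand figures below are hypothetical, for illustration only.
BUS_BANDWIDTH_GBS = 448.0

def served_bandwidth(demands_gbs):
    """Proportionally scale back every client when total demand exceeds the bus."""
    total = sum(demands_gbs.values())
    if total <= BUS_BANDWIDTH_GBS:
        return dict(demands_gbs)
    scale = BUS_BANDWIDTH_GBS / total
    return {name: d * scale for name, d in demands_gbs.items()}

# Without scrubbers: GPU must re-fetch whole flushed caches (hypothetical 400 GB/s).
before = served_bandwidth({"GPU": 400.0, "CPU": 40.0, "IO complex": 30.0})
# With scrubbers: targeted invalidation trims redundant GPU traffic to 360 GB/s.
after = served_bandwidth({"GPU": 360.0, "CPU": 40.0, "IO complex": 30.0})

for name in before:
    print(f"{name}: {before[name]:.1f} -> {after[name]:.1f} GB/s")
```

In the "before" case total demand (470) exceeds the bus, so the CPU and IO complex are throttled along with the GPU; in the "after" case everyone's demand is met in full, even though only the GPU's traffic was reduced.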
The latency of the IO complex's ESRAM is what the CPU's LLC and the GPU's LLC see - indirectly, through the GDDR6 latency - when they need data; not the latency hiding of the SSD flash controller or the connected SSD.
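The point can be put in average-memory-access-time terms: an LLC miss is served at GDDR6 latency, because by the time the CPU or GPU asks, the IO complex has already staged the data into GDDR6-reachable memory. A minimal sketch, with entirely hypothetical latency figures (none of these are published PS5 numbers):

```python
# Average memory access time (AMAT) as seen by the CPU/GPU last-level cache.
# All latency numbers below are hypothetical placeholders.
def amat(hit_time_ns, miss_rate, miss_penalty_ns):
    return hit_time_ns + miss_rate * miss_penalty_ns

gddr6_penalty_ns = 300.0    # hypothetical GDDR6 round trip on an LLC miss
ssd_penalty_ns = 100_000.0  # hypothetical flash read - never on the LLC miss path

# The miss penalty the caches pay is the GDDR6 one, not the SSD one.
print(amat(hit_time_ns=20.0, miss_rate=0.05, miss_penalty_ns=gddr6_penalty_ns))
```

The SSD's far larger latency only matters to the asynchronous staging done by the IO complex, which is exactly the latency hiding the original post describes.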
You've then made a further strawman about write speeds and wear and tear on the SSD NAND chips in the "use it sort of like RAM" scenario. The data that needs to use the SSD like an extension of RAM is static 3D geometry and texture data that is never written back to disk - data that Epic, in their UE5 Nanite material, said makes up about 90% of a typical game scene when they described Nanite data as effectively immutable in game. To anyone capable of game programming watching the Road to PS5, it was obvious which data needed the streaming capability.
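The read-only pattern can be sketched with a memory-mapped file: immutable assets are demand-faulted in from the SSD and simply dropped under memory pressure, never flushed back, so there is no extra NAND wear. The file name and layout here are hypothetical stand-ins, not any real engine format:

```python
# Sketch of "SSD as an extension of RAM" for immutable asset data:
# a read-only memory mapping means pages are read in on demand and
# discarded on eviction - nothing is ever written back to NAND.
import mmap
import os
import tempfile

# Stand-in for a packed, immutable geometry/texture file (hypothetical name).
path = os.path.join(tempfile.gettempdir(), "nanite_chunk.bin")
with open(path, "wb") as f:
    f.write(os.urandom(1 << 16))  # 64 KiB of fake mesh data

with open(path, "rb") as f:
    assets = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    # Touching a slice faults that page in from "disk"; an evicted page
    # is re-read on demand later, never written out.
    cluster = assets[4096:8192]
    print(len(cluster))
    assets.close()
os.remove(path)
```

`ACCESS_READ` is the key: the mapping cannot dirty pages at all, so the wear-from-writes objection simply does not apply to this class of data.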
As for the "PS5 processor kits", that testing doesn't reflect the PS5 scenario in any shape or form, where the PS5 offloads all IO and audio from the CPU. Nor does it match up with Cerny's concern that heavy CPU use could consume up to 35GB/s of memory bandwidth (IIRC), or with the CPU being expected to be under-utilised in wattage terms, since it is expected to redirect excess power to the GPU for increased rasterising. It is also a huge leap to suggest that slightly defective PS5 APU boards go straight to an AMD sell-off.
Sony could easily use the boards with lesser defects for other things - as DSP boards in their £10K 8K flagship TVs, to mention just one re-bin use - and what was left as waste with AMD is probably highly defective in comparison to the PS5 APU, too.
As for the GDDR6 latency on a PS5 system, how can you possibly suggest some defective kit has the same memory timings? The board may be defective because of the GDDR6, for all we know.