It goes both ways. XSX's higher TFLOPS figure doesn't negate PS5's SSD benefits.
On that note, I'm just going to make a more elaborate post, because apart from this cat and mouse game, I'm also genuinely curious.
Let's say we have a game set in a wide-open outdoor area like the one in the UE5 demo, running on both PS5 and XSX.
When the player turns around quickly, the game has to stream in a lot of data to prevent pop-in and keep the graphical fidelity consistent.
PS5's raw SSD throughput (5.5 GB/s vs 2.4 GB/s on XSX) is more than double, so it can easily keep up and even outdo XSX here.
So, logically speaking, to keep up with PS5, XSX has to either lower graphical fidelity (texture quality and so on), find a way to hide the extra data loading, or accept some pop-in.
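To put rough numbers on that, here's a back-of-the-envelope sketch. The SSD figures are the publicly stated raw throughputs (5.5 GB/s for PS5, 2.4 GB/s for XSX); the frame rate and the duration of a quick turn are purely illustrative assumptions on my part, not anything from the spec sheets:

```python
# Back-of-the-envelope streaming budget during a fast camera turn.
# SSD numbers are the publicly stated RAW throughputs (compression
# raises both in practice); FPS and turn duration are assumptions.

RAW_SSD_GBPS = {"PS5": 5.5, "XSX": 2.4}

FPS = 60              # assumed target frame rate
TURN_SECONDS = 0.25   # assumed duration of a quick 180-degree turn

for console, gbps in RAW_SSD_GBPS.items():
    per_frame_mb = gbps * 1000 / FPS          # MB streamable per frame
    per_turn_mb = gbps * 1000 * TURN_SECONDS  # MB streamable during the turn
    print(f"{console}: ~{per_frame_mb:.0f} MB/frame, ~{per_turn_mb:.0f} MB per quick turn")
```

Under those assumptions PS5 can pull in roughly 90 MB of fresh data per frame versus roughly 40 MB on XSX, which is the gap the XSX version would have to hide or absorb somehow.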
Then you'd have the difference in GPU: XSX could run everything at a higher resolution, like native 4K vs 1440p on PS5, as well as implement better ray tracing (which I assume will eat into a chunk of XSX's power advantage).
And yes, that means XSX performs better than PS5 overall, but PS5 will be able to do things that XSX just can't.
Now, my understanding of hardware is limited, so feel free to correct me on this, but from everything I've read and heard so far (from Cerny, to devs, to discussions on gaf), this seems to be what separates PS5 and XSX.
And this is what the results are going to look like as devs find new ways to develop their games and make full use of the switch to SSDs.
Like I said, feel free to educate me on this if I'm missing something; it would actually be appreciated.