
Matt weighs in on PS5 I/O, PS5 vs XSX and what it means for PC.

Handy Fake

Member
Shaking my head?
 
Who knows how it works... they didn't explain that part.

Maybe it maps addresses right over from the GDDR6 to the SSD, so it sees only a 100 GB pool and automatically serves what's needed on screen into the 16 GB of VRAM in time.

And maybe it works totally differently. We won't know until we get a deep dive.

You know the craziest speculation one can have here? That the 100 GB is a cluster of PCM, like Micron's 3D XPoint. However, I've doubted that just based on the pricing: even if they could get it as low as $1.50 per GB, that would still be $150 for that 100 GB slice alone. Granted, that was me taking the price per GB of an Intel Optane DC Persistent Memory module, halving it to factor out the possible profit margin, then factoring out about half of the remainder again, which is probably too generous.
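To put that back-of-the-envelope guess in code form (purely illustrative; the $6 per GB retail figure is just an assumed starting point, not a real quote):

```python
# Rough estimate described above; retail_per_gb is an assumed input, not a known price.
def speculative_xpoint_cost(retail_per_gb, capacity_gb=100):
    after_margin = retail_per_gb / 2   # knock off an assumed profit margin (half the cost)
    trimmed = after_margin / 2         # then factor out about half of the remainder
    return trimmed * capacity_gb

print(speculative_xpoint_cost(6.0))   # $6/GB retail -> $1.50/GB -> 150.0 for the 100 GB slice
```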

Though maybe it could be 100 GB of the other, consumer-style Optane memory, if someone wanted to entertain that idea. That version is a lot cheaper per GB, with still-better latency and speeds than NAND but lower performance than the DC variant. Outside of that, one of the only other ways of explaining what the Dirt 5 dev was saying is the idea of the GPU having direct streaming access to the 100 GB block of memory on the SSD, but I had come around to ruling out that option thanks to some discussion with a few other posters in the Velocity Architecture thread...

...that said, hell, who knows, maybe that's something they're doing after all? Suppose you only access streamed data once per frame, a few MB at a time, so that the overall stream rate over a second fits within the bandwidth of the storage. Provided the GPU textures are formatted properly, and the GPU has some customizations to facilitate streamed-data workloads off some initial instructions from the CPU, then if it's treating the 100 GB block as a virtual RAM disk I suppose you could stream the data from that block to the GPU directly in lieu of passing it to RAM first. At the end of the day it still has to go into the GPU caches anyway.

So maybe I should keep that idea in my back pocket in case something substantial turns up with it, though again, I won't be surprised if it's doing what that dev claims through some other means. Interesting to think about in any case.
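For what it's worth, here's a minimal sketch of that "virtual RAM disk" idea in plain Python, assuming the 100 GB block is exposed as one big memory-mappable file and the per-frame tile offsets are already known. The file name, tile size and upload step are all hypothetical; nothing here reflects how either console actually exposes storage.

```python
import mmap

TILE_SIZE = 64 * 1024  # hypothetical fixed-size texture tile (64 KB)

def stream_tiles(pool_path, tile_ids):
    """Yield only the tiles needed this frame from a large read-only pool,
    letting the OS page data in on demand instead of loading the whole file."""
    with open(pool_path, "rb") as f:
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as pool:
            for tid in tile_ids:
                offset = tid * TILE_SIZE
                yield pool[offset:offset + TILE_SIZE]

# Hypothetical usage: hand the handful of visible tiles straight to the GPU upload path.
# for tile in stream_tiles("texture_pool.bin", visible_tile_ids):
#     gpu_upload(tile)
```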
 
Dunno, you tell me how long.

We'll see after today.

Why would today be the determining factor? What is it about today that impacts anyone's current game development? I mean anyone's: PC, Xbox, PS, doesn't matter.

Today tells us exactly what about the technology available to develop games?
 
Didn't DF analyze this and find out it was running at 24 fps, 1600p? It wasn't even gameplay. It was a cinematic in-engine trailer.

Not to mention it literally cannot be UE5 unless you're from Epic. You can't license or touch UE5 until 2021. So this is clearly UE4.

This is untrue. Elements of UE5 are already available for UE4.x.
 

Psykodad

Banned
Why would today be the determining factor? What is it about today that impacts anyone's current game development? I mean anyone's: PC, Xbox, PS, doesn't matter.

Today tells us exactly what about the technology available to develop games?
"A" determining factor, not "the".
But PlayStation as a brand is so strong nowadays that they can easily set next-gen up the same way as current-gen back in 2013. It's about consumer trust/anticipation, not at all implying anything related to game development.

My initial comment was more tongue-in-cheek though.
 

THE:MILKMAN

Member
Actually, aren't the Codemasters dev's comments a good thing for all of us, no matter which platform you prefer? It is a multi-platform and cross-gen game taking full advantage of the new tech. If a lot of the third parties do this, we all eat.
 

Mobilemofo

Member
Like the SNES/Mega Drive days... SNES more powerful. Mega Drive faster... Or am I completely wrong on that?
The SNES had custom hardware (Mode 7, parallax scrolling, a custom sound chip) whereas the Mega Drive had 'off the shelf' parts, so to speak. The sound chip on the Mega Drive was both awful and, with a bit of work, very good. Weird console.
 

quest

Not Banned from OT
Actually, aren't the Codemasters dev's comments a good thing for all of us, no matter which platform you prefer? It is a multi-platform and cross-gen game taking full advantage of the new tech. If a lot of the third parties do this, we all eat.
Yes, it means the Series X I/O is not as crappy as many here thought. So the baseline is higher and with time can be pushed further, letting them do more with the PS5. He did well to avoid the RT question in the Major Nelson interview lol.
 
Didn't DF analyze this and find out it was running at 24 fps, 1600p? It wasn't even gameplay. It was a cinematic in-engine trailer.

Not to mention it literally cannot be UE5 unless you're from Epic. You can't license or touch UE5 until 2021. So this is clearly UE4.
Well, I mean, it's clearly years away, given that it wasn't even ready for a gameplay reveal :/ It's not by any means a finished product. We'll see how it looks and runs at the MS July event next month.
 

Leyasu

Banned
What answer do you want to have?
A lot of the time, long development cycles include development of the game engines, which some of Sony's studios already have. That time can already be cut, and that cut saves them what, 1-2 years?

If, let's say, HZD2 already started development 3 years ago, when the first reports about it came to the surface, and supposedly its development was shifted to PS5 2-3 years ago, it isn't far-fetched that they might have it ready next year. Possibly Q1 2021, going by recent rumors.

Sony has also made few announcements in recent years, and they have enough studios that have been working on the down-low.

When we get games like COD every year, each with 2-3 years of development time, what makes you say that Sony can't have PS5 titles ready by the time the PS5 releases? Especially if they aim for technical showcases rather than full-blown AAA titles.

Not to mention that UE5 is designed to drastically cut development time, which makes it seem reasonable to assume that Sony's first-party studios have developed similar techniques with similar results.
The show hasn't finished yet, but what did I say?

The wait for real next gen won't be over in 2020. It may not even be over in 2021 either.
 

Psykodad

Banned
The show hasn't finished yet, but what did I say?

The wait for real next gen won't be over in 2020. It may not even be over in 2021 either.
Sure dude, keep telling yourself that. Some games surpass current-gen easily, if only graphically.
 

Ogbert

Member
I think, following that presentation, we can safely say that what the PS5 and XSX mean for PC is that it will remain the platform that is light years ahead of both consoles.

That said, maybe Sony's amazing SSD will help me load up that potato game in under a second. Can't wait.
 

quest

Not Banned from OT
After the PS5 reveal it's fun looking back on these threads.
If only I was clever enough to meme the endless "wait until you see what Sony first party does with the SSD, it will make the PC and Series X look a generation behind" lol.
 
If people call the PS5's 10.3 weak because the Xbox Series X is 12, then the Xbox Series X, whose SSD is half the speed of the PS5's, must be weak. That 50% difference is a lot bigger than the difference between 10.3 and 12.

10.3 variable*

From the looks of it, a lot of those games were probably running at 9 TF.
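To put rough numbers on that comparison, using the publicly quoted peak figures (the PS5's GPU clock is variable, so its TFLOPS number is a ceiling):

```python
# Publicly quoted peak figures; PS5 GPU clock is variable, XSX is fixed.
ps5_tf, xsx_tf = 10.28, 12.15      # TFLOPS
ps5_ssd, xsx_ssd = 5.5, 2.4        # GB/s raw SSD throughput

print(f"XSX compute advantage over PS5: {xsx_tf / ps5_tf - 1:.0%}")    # ~18%
print(f"PS5 SSD advantage over XSX:     {ps5_ssd / xsx_ssd - 1:.0%}")  # ~129%
```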
 

rnlval

Member
Like most, you've misunderstood what Sony is using this technology for. It's not to save power - it is to optimise the use of power.

It's not about saving $1 a unit on cooling - Sony's cooling solution is likely more expensive than the XSX's.

The reason Sony are using this is to drive the clock speed that high. High clocks are only useful if one can make use of that clock tick to perform useful work - that needs power.

Assuming MS aren't using a similar system in the XSX, they're leaving performance on the table. TFLOPS is only one third of the calculation that predicts GPU performance; another third is power (the final third is "activity", the code that's running).

SmartShift is about providing power - if MS haven't done that, then their power cap will be the limit of their performance.

So far, we "think" we know they're making 200W available. 60W of that goes to the CPU, so MS are left with around 130W for their GPU (allowing 10W for the rest of the system - that's probably a bit light).

That's less than half the power of a 2080 Ti and about half that of a 5700 XT.
CU scaling affects more than ALUs, e.g. TMUs, TFUs, RT units, L0/L1 caches, LDS, wavefront queue storage, etc. MS hasn't revealed the XSX GPU's uncore sections.
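As a sanity check on the power budget in the quoted post (all three inputs are that poster's estimates, not confirmed specs):

```python
# Power budget from the quoted post's assumptions -- none of these are confirmed figures.
total_power = 200   # W assumed available to the whole console
cpu_power   = 60    # W assumed for the CPU
misc_power  = 10    # W assumed for storage, fans, I/O, etc. ("probably a bit light")

gpu_power = total_power - cpu_power - misc_power
print(f"Implied GPU budget: ~{gpu_power} W")   # ~130 W
```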
 

rnlval

Member
Didn't DF analyze this and find out it was running at 24 fps, 1600p? It wasn't even gameplay. It was a cinematic in-engine trailer.

Not to mention it literally cannot be UE5 unless you're from Epic. You can't license or touch UE5 until 2021. So this is clearly UE4.
The resolution is 3840 x 1600, not 2844 x 1600 (which would be 16:9). The 3840 value is the 4K width, giving a ~21:9, IMAX-like ratio.
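Just the arithmetic behind that claim:

```python
width, height = 3840, 1600
print(f"Aspect ratio: {width / height:.2f}:1 (~{width / height * 9:.1f}:9)")  # 2.40:1, ~21.6:9
print(f"16:9 width at the same height: {height * 16 / 9:.0f} px")             # ~2844 px
```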
 

rnlval

Member
Well, multiplats will tell the story...

Oh look, multiplats run the same on PS5 and XSX.

How can it be that the most powerful console can't show that mighty 2 TF advantage in simple multiplat games?
An RTX 2080 Super (48 CU equivalent) is stronger than a manually overclocked RTX 2070 (36 CU equivalent).
 

ToadMan

Member
An RTX 2080 Super (48 CU equivalent) is stronger than a manually overclocked RTX 2070 (36 CU equivalent).

“Stronger” lol.

It's not gonna matter, man - there's no way 15% more TFLOPS on the GPU gets any practical difference in quality.

Peddling PC comparisons is a waste of time - just go buy one and be happy. They have next to zero relevance to console performance.
 

rnlval

Member
“Stronger” lol.

It's not gonna matter, man - there's no way 15% more TFLOPS on the GPU gets any practical difference in quality.

Peddling PC comparisons is a waste of time - just go buy one and be happy. They have next to zero relevance to console performance.
I'm glad you noticed it. LOL.

My two gaming PCs have MSI RTX 2080 Ti Gaming X Trio (68 CU) and ASUS RTX 2080 Dual EVO GPUs (46 CU).

The XSX's TFLOPS advantage is backed by higher memory bandwidth, which is important for render targets. From Gears 5's results, the XSX GPU lands at about RTX 2080 level, which is about 25% higher than the RX 5700 XT's results.

The raw TFLOPS difference between XSX and PS5 is roughly equal to an entire PS4.

The PS5 GPU's 10.28 TFLOPS gain will be diminished by the lack of a memory bandwidth increase relative to the RX 5700 XT's 448 GB/s, and by the extra memory bandwidth consumed by the CPU.

My RTX 2080 Super (48 CU, 496 GB/s bandwidth) vs manually overclocked RTX 2070 (36 CU, 448 GB/s bandwidth) example is relevant to the console debate.

Your PS5 GPU is like a Navi 10 with 36 CUs plus RDNA 2 improvements, so don't expect miracles from it, i.e. RDNA 2 at 52 CUs with ~560 GB/s ~= an RTX 2080 with 46 CUs and 448 GB/s. LOL. I expected more from AMD. I'm still waiting for AMD to deliver RTX 2080-level efficiency with 448 GB/s of memory bandwidth.
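Rough arithmetic behind those two claims, using the publicly quoted figures (the XSX bandwidth number is for its 10 GB fast pool; the PS5's 448 GB/s is shared with the CPU):

```python
# Publicly quoted figures; XSX bandwidth is for its 10 GB fast pool only.
xsx_tf, ps5_tf, ps4_tf = 12.15, 10.28, 1.84   # TFLOPS
xsx_bw, ps5_bw = 560, 448                     # GB/s

print(f"TFLOPS gap XSX - PS5: {xsx_tf - ps5_tf:.2f} (a PS4 is {ps4_tf})")               # ~1.87 vs 1.84
print(f"Bandwidth per TFLOPS  XSX: {xsx_bw / xsx_tf:.1f}  PS5: {ps5_bw / ps5_tf:.1f}")  # ~46 vs ~44 GB/s
```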
 