
Next-Gen PS5 & XSX |OT| Console tEch threaD


Sinthor

Gold Member
I can't see myself waiting, but I would try them out on PS5 if nothing interesting is out by launch. I'll preorder the deluxe edition. I won't make the same mistake I made with Days Gone, where the drama made me wait months before buying it; it turned out to be one of my best games ever, and I play lots of different games, standing at more than 180 games so far on PS4. Personally, I might replay Days Gone if the PS5 patch is good enough.

In the meantime I need to kill the wait for the PS5 with some quality time with quality games. Some others have built up some kind of stamina and endurance through all the waiting they've been through, but I personally can't. 90% of my games are preordered.:messenger_face_screaming:

Bingo. That means there's NO REASON to wait. Yes, games built for the next gen from the ground up will be more impressive, but having a patch available to make current games 'next gen' will be quite an upgrade. Good times ahead! I've got TLOU and GOT for when they come out. That and a few other things should keep me more than busy till November or whenever. Plus I'll have more things to play on day one!
 

Sinthor

Gold Member
Well I hope they add more than just resolution and framerate. Hopefully we get other enhancements, like better AA for example.

Eh....unfortunately I don't think that's too likely. Would take a LOT more effort from the developers and they HAVE been targeting current generation hardware. They would have needed to plan for next gen support for a lot longer in order to do that. So it's possible, but I really don't see it happening. I think it will be exactly as it has been for the PS4 Pro. But maybe we'll be surprised! I'd LOVE to be wrong about that and what we get is.....
 
Eh....unfortunately I don't think that's too likely. Would take a LOT more effort from the developers and they HAVE been targeting current generation hardware. They would have needed to plan for next gen support for a lot longer in order to do that. So it's possible, but I really don't see it happening. I think it will be exactly as it has been for the PS4 Pro. But maybe we'll be surprised! I'd LOVE to be wrong about that and what we get is.....


I guess the norm is that the resolution will improve and the framerate will be better. But I'm hoping that some developers do a bit more than that instead of just setting it on autopilot.
 

ZeroFool

Member
Do you know of an SSD solution with an I/O stack that can give you 9 GB/s of real bandwidth on PC? Because if we are going to explore possibilities, first we need to know what is available on PC beyond the GPU/CPU.
What if I have 64 GB of DDR4 available? If it were used to feed textures to the GPU, that should be orders of magnitude better than my RAID NVMe setup.

I don't have 64 GB available because I would rather save that money for the new consoles. But just out of curiosity: years ago I tried to get a virtual disk running, but it was buggy. Worked fine on my 386... 😏
 

HAL-01

Member
Like when you believe something new will replace the established metrics of power that have been universally used for decades! You’re getting it!
It's the opposite of that, but I'm sure you're misunderstanding on purpose. In this case:
"The term cognitive dissonance is used to describe the feelings of discomfort that result when your beliefs (CPU and teraflops are everything) run counter to new information that is presented to you (I/O will play a more important role this gen)."

It's similar to when flat-earthers perform an experiment that proves the earth's roundness, but rather than believe it, they claim it was flawed.

Btw "something new that replaces long established rules" is not some insane idea. It's called a paradigm shift, it is what advances in technology are built upon, and they happen all the time. Without them we'd still be measuring consoles by how many "bits" their processors have.
 

Corndog

Banned
They said it was run in 2019 as well right? We know it was an older slower devkit.

For fun I will say it was slow enough to match XSX speeds 😜😘
Are you just joking, or serious? I haven't seen anything about the demo running on an old devkit, let alone in 2019.
 

Corndog

Banned
Do you know of an SSD solution with an I/O stack that can give you 9 GB/s of real bandwidth on PC? Because if we are going to explore possibilities, first we need to know what is available on PC beyond the GPU/CPU.
Yeah. RAID a couple of high-end SSDs. One poster in here talked about 12 GB/s.
 
What if I have 64 GB of DDR4 available? If it were used to feed textures to the GPU, that should be orders of magnitude better than my RAID NVMe setup.

I don't have 64 GB available because I would rather save that money for the new consoles. But just out of curiosity: years ago I tried to get a virtual disk running, but it was buggy. Worked fine on my 386... 😏

Nobody is going to write a PC game that expects 64 GB of RAM for a long time. My brother's ThreadRipper build has 64 GB of DDR4, but imagine what a thin slice that would be in the Steam hardware survey. By the time that amount is a minimum requirement, the rest of his PC and the speed of his RAM would be a joke.
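
For a rough sense of scale on the options being discussed (ballpark spec figures only, not benchmarks of any real machine, sketched in Python):

# Approximate peak bandwidths, GB/s (rule-of-thumb/spec numbers only).
bandwidths_gbps = {
    "single PCIe 3.0 x4 NVMe, seq. read": "3.5",
    "4x NVMe RAID 0, seq. read (figure quoted in-thread)": "12",
    "PS5 SSD raw / typical compressed": "5.5 / ~9",
    "XSX SSD raw / typical compressed": "2.4 / ~4.8",
    "dual-channel DDR4-3200 RAM (theoretical)": "51.2",
}
for name, gbps in bandwidths_gbps.items():
    print(f"{name:52s} {gbps} GB/s")
# RAM wins on raw bandwidth by a mile, but the Steam-survey point above is
# the catch: you can't ship a game that assumes 64 GB of it is installed.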
 

Mod of War

Ω
Staff Member
Everybody take a break from the UE5 circular theatrics. There are many other threads in which to discuss all the hair-splitting going on, and this thread does not need to constantly be taken over by the same tiring arguments that seem to crop up just as everyone has moved on from them.
 

PaintTinJr

Member
That second paragraph reads as if you read my mind and broke it down for me. Yes, that's what I want to learn more about, but I'm not all that educated on the matter.
My logic behind the question was: if the game is already there (its size already occupied on the SSD), and the size is the same, why double-compress when the assets might already be compressed to begin with? Here even zlib might sound better, as it's also lossless AFAIK.
To me it sounds like ordering a meal and, instead of eating it all, leaving some leftovers. It's better to have something you can fully eat/finish (the final result on screen) so you don't pay extra money or waste the extra food (storage).
Many of the things in your post go technically deeper than my knowledge, which I like even if I haven't fully understood them yet, but I get the overall idea. Thanks a lot for your time, and glad to have you here among us.

From reading your comment, it feels like you are asking me: why did they do BCPack? What's the point when they could use zlib? And that is the crux of the matter, really.

Since the beginning of GPU block compression acceleration with S3_TC/DXT, the idea is to either accept lower texture quality to save on VRAM needed, RAM footprint, transfer IO (at the same texture dimensions), or to use the same resources as the RAW texture to get the job done, but use the compression saving to go up a texture dimension size - which depending on circumstances may result in better texturing overall in spite of introducing errors through the lossy compression.
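
Just to put rough numbers on that trade-off (a minimal sketch in Python; the per-block sizes are simply the standard BC1/DXT1 figures, nothing console-specific):

# BC1/DXT1 packs each 4x4 texel block into two RGB565 endpoint colours
# plus sixteen 2-bit indices = 8 bytes per block.
raw_block = 4 * 4 * 3              # 4x4 texels of 24-bit RGB = 48 bytes
bc1_block = 2 * 2 + 16 * 2 // 8    # 2 endpoints + 16 x 2-bit indices = 8 bytes
print(f"ratio {raw_block // bc1_block}:1")                       # 6:1

texels = 4096 * 4096
print(f"raw 4096^2: {texels * 3 / 2**20:.0f} MB")                # ~48 MB
print(f"BC1 4096^2: {texels // 16 * bc1_block / 2**20:.0f} MB")  # ~8 MB
# Even an 8192x8192 BC1 texture (~32 MB) is smaller than the 4096x4096
# RAW one, which is the "spend the saving on a bigger texture" option
# above, paid for with BC1's lossy per-block quantisation.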

Multi-texturing also provided different options, as you could blend two compressed textures in the storage of 1x RAW, etc., and it probably wasn't until shader-based multi-texturing - like two-tone speckled paint - and the ability to decompress zlib RAW textures on PS3, that lossy texture formats stopped being seen as an automatic win (IMHO).

Once zlib decompression was possible on the GPU or SPU, RAW textures became viable for more things again, and by default block-compressed textures don't compress well with zlib-based compression AFAIK, so using DXT with zlib offered limited gains for its inferior image quality.
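
A quick toy way to see that in practice (not real texture data: a repetitive buffer stands in for a RAW texture and random bytes stand in for already block-compressed output):

import os, random, zlib

random.seed(0)
# Stand-in for RAW texel data: heavy repetition, so zlib has a lot to find.
raw_like = bytes(random.choice(b"\x10\x20\x30\x40") for _ in range(1 << 16))
# Stand-in for DXT/BC output: high-entropy bytes, little left for zlib.
bc_like = os.urandom(1 << 16)

for name, data in (("raw-like", raw_like), ("bc-like", bc_like)):
    packed = zlib.compress(data, 9)
    print(f"{name}: {len(data)} -> {len(packed)} bytes")
# The structured buffer shrinks dramatically; the high-entropy one barely
# moves, which is why layering zlib on top of DXT buys relatively little.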

(IIRC) Prior to DirectX 10 there was quite a bit of research in the games industry on ways to use DXTn with shaders to improve the compression time for generating optimal textures, and also on techniques for moving to/from different colour spaces (using shaders) to retain detail where it mattered most and place the errors where they mattered least (a little bit like comparing a CCD sensor to a CMOS sensor for photography).

DX10 incorporated DXTn as BC (I'm not sure if they enhanced the compression beyond DXT), but this is where BCPack comes in, with a potential 2-3x storage saving over and above regular S3_TC/DXT/BC. BCPack presumably works by looking for block repetition, or similar-enough repetition, between texture blocks, so that BCPack can share index tables and/or blending values between multiple texture blocks.
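
Purely to illustrate that guess (this is not BCPack - the actual format isn't public - just a toy pass showing how storing repeated 8-byte BC blocks once and referencing them could save space):

import os, random

random.seed(1)
# Fake BC1 texture: 65,536 blocks drawn from only 10,000 distinct
# patterns, i.e. lots of repeated flat/tiling blocks.
patterns = [os.urandom(8) for _ in range(10_000)]
blocks = [random.choice(patterns) for _ in range(65_536)]

plain = len(blocks) * 8                              # every block stored verbatim
deduped = len(set(blocks)) * 8 + len(blocks) * 4     # unique blocks + assumed 32-bit refs
print(f"{plain} -> {deduped} bytes")                 # ~512 KiB -> ~334 KiB here
# Real texture repetition (and sharing only index tables or endpoints
# rather than whole blocks) would change the numbers a lot; this only
# sketches the general "exploit repetition between blocks" idea.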

I suspect that with textures as big as they are in today's games, a gamer will see little perceptible difference between a BCPack texture, a BC one, or a RAW texture on a model in a current-gen title, and that, I believe, is why Xbox has backed this strategy.

ps. Thanks for the nice comment - I've been really enjoying my brief time here, and love the upbeat positive vibe you and others bring about the excitement of new games and tech in this thread.
 
How is that breaking NDA? It’s their product, not Sony’s. They’ve done it in the past with new console generations.
They know Sony's configuration, as is evident from them "working closely" to develop Unreal Engine 5. If they release that info in the negative, then they are breaking NDA. For instance, "PCs can't run this because they don't have XYZ": we can infer that the system that can run it has XYZ.
 

Corndog

Banned
That was me. It's great for sequential copying of large files, still poor for random read, and takes Minecraft (loaded up to the tits with mods) actual minutes to load despite being paired with a ThreadRipper. There's more to IO than just flash speed.
True. Is random read bad because of the potential of not being able to read from all chips at once?
 

Bo_Hazem

Banned
From reading your comment, it feels like you are asking me: why did they do BCPack? What's the point when they could use zlib? And that is the crux of the matter, really.

Since the beginning of GPU block compression acceleration with S3_TC/DXT, the idea is to either accept lower texture quality to save on VRAM needed, RAM footprint, transfer IO (at the same texture dimensions), or to use the same resources as the RAW texture to get the job done, but use the compression saving to go up a texture dimension size - which depending on circumstances may result in better texturing overall in spite of introducing errors through the lossy compression.

Multi-texturing also provided different options, as you could blend two compressed textures in the storage of 1x RAW, etc., and it probably wasn't until shader-based multi-texturing - like two-tone speckled paint - and the ability to decompress zlib RAW textures on PS3, that lossy texture formats stopped being seen as an automatic win (IMHO).

Once zlib decompression was possible on the GPU or SPU, RAW textures became viable for more things again, and by default block-compressed textures don't compress well with zlib-based compression AFAIK, so using DXT with zlib offered limited gains for its inferior image quality.

(IIRC) Prior to DirectX 10 there was quite a bit of research in the games industry on ways to use DXTn with shaders to improve the compression time for generating optimal textures, and also on techniques for moving to/from different colour spaces (using shaders) to retain detail where it mattered most and place the errors where they mattered least (a little bit like comparing a CCD sensor to a CMOS sensor for photography).

DX10 incorporated DXTn as BC (I'm not sure if they enhanced the compression beyond DXT), but this is where BCPack comes in, with a potential 2-3x storage saving over and above regular S3_TC/DXT/BC. BCPack presumably works by looking for block repetition, or similar-enough repetition, between texture blocks, so that BCPack can share index tables and/or blending values between multiple texture blocks.

I suspect that with textures as big as they are in today's games, a gamer will see little perceptible difference between a BCPack texture, a BC one, or a RAW texture on a model in a current-gen title, and that, I believe, is why Xbox has backed this strategy.

ps. Thanks for the nice comment - I've been really enjoying my brief time here, and love the upbeat positive vibe you and others bring about the excitement of new games and tech in this thread.

That has pretty much melted my brain, to be honest with you :messenger_face_screaming: , but I get the overall idea, I think:goog_unsure:. The way I understand it, it's like comparing a PNG (lossless) image to a JPG (lossy) image. Most people won't notice the difference, but the PNG tends to be heavier, and most people won't mind JPG 99% of the time. It's probably like 8-bit videography: you don't mind it until you get banding in the sky from shooting D-Log to extract extra detail that 8-bit sometimes can't keep up with.

Your technical background is much appreciated, and although comprehending it as a casual can be challenging, I hope you keep it that way instead of trying to oversimplify it. That way we can learn, and some of the names/codes intrigue us to search and read more to gain a better understanding.

Problem is, there are some around here trying so hard not to understand the simplest phrases recently.:lollipop_downcast_sweat:
 

Bo_Hazem

Banned
PaintTinJr & BadBreathOfTheWild, I would need your help here. Assuming that the leaked Phison (flash controller?) for the XSX is true, why DRAM-less?



Could they use their own DRAM, soldered onto the motherboard, shared between both the internal and external SSD? Or is it a strange cost-cutting stunt?

 

Entroyp

Member
PaintTinJr & BadBreathOfTheWild, I would need your help here. Assuming that the leaked Phison (flash controller?) for the XSX is true, why DRAM-less?



Could they use their own DRAM, soldered onto the motherboard, shared between both the internal and external SSD? Or is it a strange cost-cutting stunt?



Sounds like a cost-cutting measure more than anything. Most of the budget went to TFs, so sacrifices had to be made in RAM and storage. I don't know shit, of course.
 

Bo_Hazem

Banned
Sounds like a cost-cutting measure more than anything. Most of the budget went to TFs, so sacrifices had to be made in RAM and storage. I don't know shit, of course.

Interestingly, Linus said that the SSD map can be stored in RAM, so could that 3.5GB of slow RAM hold it to shave some dollars off the BOM? Strangely though, it doesn't sound like it saves much, or it could be tied to the Velocity Architecture thing? It's like a puzzle: I don't know that much about these deep tech topics, but I find them quite dense and interesting to discuss, and to work out how such decisions come about.
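
For some rough context on why that map matters (just a back-of-the-envelope sketch; a 4-byte entry per 4 KiB flash page is the usual rule of thumb for page-level mapping, and the host-RAM trick is simply NVMe's standard Host Memory Buffer feature, not anything confirmed about either console):

# Logical-to-physical (L2P) map size for an SSD, assuming the common
# rule of thumb of a 4-byte entry per 4 KiB flash page.
def l2p_map_gib(capacity_gb, entry_bytes=4, page_kib=4):
    pages = capacity_gb * 1e9 / (page_kib * 1024)
    return pages * entry_bytes / 2**30

for cap_gb in (825, 1000, 2000):
    print(f"{cap_gb} GB drive -> ~{l2p_map_gib(cap_gb):.2f} GiB of map")
# Roughly 1 GiB of map per 1 TB of flash: a DRAM-less controller either
# caches only a slice of it on-chip or borrows host RAM for it (NVMe's
# Host Memory Buffer), which is presumably the "SSD map in RAM" idea.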
 
True. Is random read bad because of the potential of not being able to read from all chips at once?

I think it's more down to APIs and the OS. Minecraft loading is single-threaded; for lots of small files, it's all the overhead and back-and-forth in the protocols when asking for a new file before the SSD can actually open the taps and start sending it.

This particular setup is 4x Samsung 970 Evo (or Pro, can’t remember) so it can read from four different controllers at once. You’d think this link in the chain is more than enough.
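
A tiny and very unscientific way to see that on any PC (this mostly times OS/filesystem overhead since the data ends up cached, which is exactly the point about the I/O stack rather than the flash):

import os, time

PATH, CHUNK, COUNT = "io_test.bin", 4096, 4096       # 16 MiB total
with open(PATH, "wb") as f:
    f.write(os.urandom(CHUNK * COUNT))

t0 = time.perf_counter()
with open(PATH, "rb") as f:            # one big sequential read
    f.read()
seq_ms = (time.perf_counter() - t0) * 1000

t0 = time.perf_counter()
for i in range(COUNT):                 # same bytes as many small requests,
    with open(PATH, "rb") as f:        # re-opening each time like per-file loads
        f.seek(i * CHUNK)
        f.read(CHUNK)
small_ms = (time.perf_counter() - t0) * 1000

print(f"sequential: {seq_ms:.1f} ms   chunked re-open: {small_ms:.1f} ms")
os.remove(PATH)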
 

ZeroFool

Member
Nobody is going to write a PC game that expects 64 GB of RAM for a long time. My brother's ThreadRipper build has 64 GB of DDR4, but imagine what a thin slice that would be in the Steam hardware survey. By the time that amount is a minimum requirement, the rest of his PC and the speed of his RAM would be a joke.
Yep, agreed. Look at what it is now on the Steam survey, for example.
 
PaintTinJr & BadBreathOfTheWild, I would need your help here. Assuming that the leaked Phison (flash controller?) for the XSX is true, why DRAM-less?



Could they use their own DRAM, soldered onto the motherboard, shared between both the internal and external SSD? Or is it a strange cost-cutting stunt?



Appreciate the shout out but it’s not something I know well at all other than a very basic understanding.
 

Bo_Hazem

Banned
Appreciate the shout out but it’s not something I know well at all other than a very basic understanding.

Fair enough:lollipop_winking: My brain gymnastics so far lead me to think they'll use the 3.5GB of slow RAM to replace it, and maybe something in their DirectStorage/Velocity Architecture will try to handle it efficiently. Overall, I'm still not amazed by the 16GB in either console; I would've liked 24GB just as a safety measure. Cutting it into 10+6 doesn't help; it doesn't sound like future-proofing for 7 years. Probably both are liking the idea of a Pro/(XXX: Xbox Xtreme X?) by 2023-2024.
 

Dee_Dee

Member
Fair enough:lollipop_winking: My brain gymnastics so far lead me to think they'll use the 3.5GB of slow RAM to replace it, and maybe something in their DirectStorage/Velocity Architecture will try to handle it efficiently. Overall, I'm still not amazed by the 16GB in either console; I would've liked 24GB just as a safety measure. Cutting it into 10+6 doesn't help; it doesn't sound like future-proofing for 7 years. Probably both are liking the idea of a Pro/(XXX: Xbox Xtreme X?) by 2023-2024.
Lol, I do wonder what MS would call the enhanced version of the Series X.
 

ZeroFool

Member
Are you just joking, or serious? I haven't seen anything about the demo running on an old devkit, let alone in 2019.
Not joking, but now I need to find it. It was either on here or one of the trusted Twitter threads. I thought it mentioned being filmed in late 2019 on one of the slower devkits.

Bo_Hazem, do you recall what I am trying to recall?

The spoiler however was a joke in my post.
 

Bo_Hazem

Banned
Not joking, but now I need to find it. It was either on here or one of the trusted Twitter threads. I thought it mentioned being filmed in late 2019 on one of the slower devkits.

Bo_Hazem, do you recall what I am trying to recall?

The spoiler however was a joke in my post.

Do you mean the Spider-Man demo? Yes, it was running on a slow, pre-devkit silver tower:

 

ZeroFool

Member
Ugh, now my brain is twisted up. I will hunt for what I read to make sure I heard it right.

In other cool news, UE 4.25 was showing off amazing volumetric clouds in the video posted today on their site and in the PC app. That alone will be crazy for next-gen open-world games.
 

SSfox

Member
So, do we think an announcement about June 3rd would happen tomorrow to build things up?

Hard to tell. In general, big things at that level are teased about four days to a week earlier. (We're talking about the first-ever next-gen show; the Deep Dive, while huge, was more of a tech thing.)

But maybe they'll tease it just a day earlier, which would be the (usual) Tuesday tease, and if so I think it would be a first-time-ever thing.
 

saintjules

Member
Hard to tell. In general, big things at that level are teased about four days to a week earlier. (We're talking about the first-ever next-gen show; the Deep Dive, while huge, was more of a tech thing.)

But maybe they'll tease it just a day earlier, which would be the (usual) Tuesday tease, and if so I think it would be a first-time-ever thing.

Yeah, I agree. With the PS5 logo generating all those likes and the DualSense being shown out of nowhere, they could drop an announcement and it would still catch fire regardless of the day and time.
 

Neo Blaster

Member
And we've been told all of the games you mentioned will have a day-zero PS5 patch; can't wait to see what they look like on there.
I played TLOU on PS3 in Brazilian Portuguese, wonderfully dubbed by a professional studio here. Then I got the remaster on PS4 to play it in English, at a higher difficulty and in 1080p60. Can't wait to do it again with TLOU2 on PS4/PS5.
 