
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

IntentionalPun

Ask me about my wife's perfect butthole
It's not about 5.5-22 gigabytes per second, more about 5.5-22 megabytes per millisecond which is more critical.
So you think somehow even though every frame of the demo allegedly had over 1 billion source triangles, that they managed to do that in under 10GB? That's just ignoring textures even... just the 3d model data was massive for that demo even taking into account compression.
 

Bo_Hazem

Banned
So you think somehow even though every frame of the demo allegedly had over 1 billion source triangles, that they managed to do that in under 10GB? That's just ignoring textures even... just the 3d model data was massive for that demo even taking into account compression.

The narrator said hundreds of billions of triangles in the demo; do you have an equation to estimate the size?

EDIT: Timestamped:

 
Last edited:
Sony has opted not to do it the same way for the PS4 Pro and now the PS5; seems to me they feel the performance boost they are getting with it is certainly worth it!

Is the XSX creating a virtual machine for BC? I'm just trying to figure out why it would consume more resources than what Sony is doing.
 
Last edited:

jamwest24

Member
The narrator said over a hundred billion polygons in the demo gameplay; do you have an equation to estimate the size?

EDIT: Timestamped:



I’m no developer, but aren’t polygons essentially math and therefore free? So the less texture work and normal mapping you have to do, the smaller the file size would be? The files would essentially be information plotting where each poly goes, rather than huge amounts of data. I would guess that since it’s a ton of the same rock color/texture, the file size is relatively low.
 

IntentionalPun

Ask me about my wife's perfect butthole
The narrator said over 100 billion polygons in the demo gameplay; do you have an equation to estimate the size?
Well, an integer to store a single "point" would be 4 bytes. So it really depends; usually people talk triangles (12 bytes each), but IIRC it's also a combination of triangles and then vertices (the points where triangles meet in space, which take 12 bytes each). So it's a mix.

If it was nothing but triangles being stored, 100 billion of them would be 1.2 TB if I'm doing the conversions right.

But then there's all the points where the triangles meet in space being stored.... of course there's also compression.

edit: Here's a convo where some rendering folks are talking about 2 million per GB.


I think I'm wrong about how storing a triangle works though. But I'm also way undershooting the math I think.

2 million per GB would mean 500 GB per billion polygons. Or 50 TB for 100 billion.
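To make the arithmetic in this post concrete, here's a quick sketch; the byte counts are the thread's own assumptions (12 bytes per triangle, the "2 million per GB" rule of thumb from the linked convo), not authoritative figures:

```python
TB = 1e12

tris = 100_000_000_000  # "100 billion polygons"

# Reading 1: 12 bytes per triangle (e.g. three 32-bit indices)
print(tris * 12 / TB)            # 1.2 TB

# Reading 2: three unshared 12-byte vertices per triangle
print(tris * 36 / TB)            # 3.6 TB

# Reading 3: the "2 million triangles per GB" rule of thumb
print(tris / 2_000_000 / 1000)   # 50.0 TB
```

The three readings disagree by more than an order of magnitude, which is exactly why the thread can't settle on a demo size.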
 
Last edited:

yewles1

Member
I’m no developer, but aren’t polygons essentially math and therefore free? So the less texture work and normal mapping you have to do, the less the file size would be? The files would essentially be information plotting where each poly goes and less about huge amounts of data. I would guess that since it’s a ton of the same rock color/texture, the file size is relatively low.
No, that's why GPUs have been making advancements in additional geometry processors that now take culling into consideration. Plus I believe a triangle can be as big as 18 bytes.

EDIT: Mistype, 114 bytes.
 
Last edited:

IntentionalPun

Ask me about my wife's perfect butthole
I’m no developer, but aren’t polygons essentially math and therefore free? So the less texture work and normal mapping you have to do, the less the file size would be? The files would essentially be information plotting where each poly goes and less about huge amounts of data. I would guess that since it’s a ton of the same rock color/texture, the file size is relatively low.
Math is not "free".

A single 32 bit integer takes 4 bytes. So 250,000 integers takes a megabyte to store.

If you store 1 billion of them you get to 4 GB. Since a single polygon requires multiple integers to store its points in space, when you start multiplying that by billions you start getting up into terabytes.

edit: You MIGHT be thinking of procedural generation, which is close to "free" disk space wise (but not RAM wise.) However when talking about pre-defined insanely detailed models, you have huge files. CGI movies come from terabytes of source data for their models for instance. Videogames use much lower detailed models and other tricks to make things look good.

But the UE5 demo was stressing the sheer size of the assets.. so I don't know how anyone could look at that demo and not think the source files were massive... likely way beyond what is reasonable for a game. UE tech is in fact capable of essentially rendering film quality CGI so it makes sense.
 
Last edited:

jamwest24

Member
No, that's why GPUs have been making advancements in additional geometry processors that now take culling into consideration. Plus I believe a triangle can be as big as 18 bytes.

But you don’t store all those triangles on the hard drive.. Each frame is only rendering a maximum of 20 million polys. You don’t literally store all that geometry for the entire level; it’s plotted out. And I’m sure plenty of those tiny triangles are duplicated. The reference files are all crazy poly numbers, but not the ones you see on screen in each frame. And that's why the creators of UE5 said it’s designed to make a real game, not just a tech demo. And there’s no way the dev kits hold far more data than the final console at those speeds.. Heck, when you load hundreds of games up in Dreams, the overall size of the game doesn’t increase like crazy.. That game is tiny because it’s just an engine running the math of each level, which is why each level loads so quickly even on a base PS4.
 
Last edited:

IntentionalPun

Ask me about my wife's perfect butthole
But you don’t store all those triangles on the hard drive.. Each frame is only rendering a maximum 20 million polys. You don’t literally store all that geometry in the entire level. It’s plotted out. And I’m sure there’s plenty of those tiny triangles that are duplicated. The reference files are all crazy poly numbers, but not the ones you see on screen in each frame. And it’s the same reason for why the creators of UE5 said it’s designed to make a real game, not just a tech demo. And there’s no way the dev kits hold far more data than the final console at those speeds.. Heck, when you load hundreds of games up in Dreams, the overall size of the game doesn’t increase like crazy.. That game is tiny because it’s just an engine running the math of each level, which is why each level loads so quickly even on a base PS4.

Definitely lots of data duplication. That 30 million polygon model was repeated all over the place for instance.

I'm certainly not suggesting the demo was terabytes.. but even a single 30 million polygon model is going to be several gigabytes.

None of this is meant to insult the PS5... but a tech demo literally designed to show off the insane I/O speed of a device isn't necessarily going to produce a realistic game... if it truly pushed the I/O hard with unique data for even 10 seconds total of that 5-minute demo, then that demo was at least 50GB.
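For scale, here's a rough sketch of what a single 30-million-polygon model could weigh on disk. The vertex layout (position + normal + one UV set, 32-bit indices) is a hypothetical minimal runtime one; editable source assets carry more data per vertex:

```python
tris = 30_000_000
verts = tris // 2                  # a closed mesh has roughly half as many vertices as triangles
bytes_per_vertex = 12 + 12 + 8     # float3 position + float3 normal + float2 UV
vertex_bytes = verts * bytes_per_vertex
index_bytes = tris * 3 * 4         # three 32-bit indices per triangle
total_gb = (vertex_bytes + index_bytes) / 1e9
print(round(total_gb, 2))          # ~0.84 GB uncompressed
```

So even a stripped-down layout is near a gigabyte uncompressed per model; source formats with extra attribute channels can be several times that, which is where the "several gigabytes" estimate comes from.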
 
Last edited:

Bo_Hazem

Banned
Great. The way I understand it, those triangles are essentially empty:

[image: untextured wireframe breakdown from the Spider-Man: Homecoming VFX article]


So they should be cheap in storage, then you put a skin on them:

[gif: textured model]


And those assets can be used over and over again and still produce unique surfaces/environments. So it's not simple to calculate; otherwise, how is a game as big and massive, with details as high as Ghost of Tsushima's, with all its duplicates and LODs, still only 35GB?
 
Last edited:

jamwest24

Member
Definitely lots of data duplication. That 30 million polygon model was repeated all over the place for instance.

I'm certainly not suggesting the demo was terabytes.. but even a single 30 million polygon model is going to be several gigabytes.

Yeah, I’m definitely curious to know how the data is working. I can’t remember who it was, but I believe one dev found many of the assets on Quixel and said that the total number in the demo was crazy low. It was just that they were creative in how they scaled and placed them. I’ve seen Tim Sweeney respond to many tweets, so I just might shoot him a question about it and see if he responds 😂
 

IntentionalPun

Ask me about my wife's perfect butthole
Yeah, I’m definitely curious to know how the data is working. I can’t remember who it was, but I believe one dev found many of the assets on Quixel and said that the total number in the demo was crazy low. It was just that they were creative in how they scaled and placed them. I’ve seen Tim Sweeney respond to many tweets, so I just might shoot him a question about it and see if he responds 😂
Oh for sure they didn't exactly have a TON of unique shit on screen.. I've seen what you are talking about, the walls IIRC are basically a few models copied over and over.

But it comes back to simple math. HZD famously had 500k polygon models.. if you are expecting games to suddenly have 30 million polygon models, then you are expecting models 60 times the size of even the most detailed game of last gen... so compression is twice as good? Great, still a model 30 times larger than last gen.

So I'll say it again.. the PS5 is capable of pushing an unrealistic amount of data around. That's NOT some insult.. it's just identifying what is, to me, a rather obvious bottleneck, and something that makes the UE5 demo hard for me to think of as realistic.

What it does is it frees the developer to basically do whatever the fuck they want I/O wise... right up until they start getting unrealistic with game size when it comes to detail. It's literally hard to imagine any game ever actually needing MORE I/O than the PS5 has.. even when we get to the day of games that are a TB in size.
 
Last edited:

Lethal01

Member
If that were true, XSX and PC would be able to run that exact demo easily. Guess what? They simply can't. And no one knows the demo size; I personally doubt it to be more than 10GB, if not much less.

I'm just gonna compile relevant facts from the recent video, since it's probably the biggest info dump we have gotten.
1 million triangles with a single UV channel is about as much as one 4k map
The statue has 33 million triangles.
8k texture for each map, 8 different sections, 3 texture maps for each section (my 8k maps are ~40MB)
So 24 maps, meaning that's around 1GB for just the statue textures (2GB if you believe it's uncompressed)
1.3GB for textures + the model.

They say again and again that this statue isn't special and the same amount of detail is in all the props in the room. So if we can say that there are even just 10 objects that's already 13GB for just the room. I'd assume there are more objects in that room, and far more on the areas outside the room and the flying section.

I can see the whole Demo going over 100GB easy.
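Spelling that estimate out (every input below is the post's own guess, including the ~40MB per 8K map and the assumed 10 props per room):

```python
mb_per_8k_map = 40                      # "my 8k maps are ~40MB"
sections = 8
maps_per_section = 3
maps = sections * maps_per_section      # 24 maps for the statue
texture_gb = maps * mb_per_8k_map / 1024
print(round(texture_gb, 2))             # ~0.94 GB of compressed textures

per_object_gb = 1.3                     # textures + model, per the post
room_objects = 10                       # assumed count of similarly detailed props
print(per_object_gb * room_objects)     # 13.0 GB for just the room
```

Whether that extrapolates to 100GB for the whole demo depends entirely on how much of the geometry and texture data is reused between props, which is the open question in the thread.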
 
Last edited:
I'm just gonna compile relevant facts from the recent video, since it's probably the biggest info dump we have gotten.
1 million triangles with a single UV channel is about as much as one 4k map
The statue has 33 million triangles.
8k texture for each map, 8 different sections, 3 texture maps for each section (my 8k maps are ~40MB)
So 24 maps, meaning that's around 1GB for just the statue (2GB if you believe it's uncompressed)

Damn.
 

Bo_Hazem

Banned
I'm just gonna compile relevant facts from the recent video, since it's probably the biggest info dump we have gotten.
1 million triangles with a single UV channel is about as much as one 4k map
The statue has 33 million triangles.
8k texture for each map, 8 different sections, 3 texture maps for each section (my 8k maps are ~40MB)
So 24 maps, meaning that's around 1GB for just the statue (2GB if you believe it's uncompressed)

They say again and again that this statue isn't special and the same amount of detail is in all the props in the room. So if we can say that there are even just 10 objects that's already 10GB for just the room. I'd assume there are more objects in that room, and far more on the areas outside the room and the flying section.

Ok, first DF reached out to EG and got info, and they said "uncompressed", probably among other reports. Also, the demo didn't use Oodle Texture compression, which was recently released and is compatible with PS5, and which can reach up to 96.58% size reduction with Kraken:


And here the whole 8K texture mod for the whole Skyrim game is only 5.5GB:





So I think people are exaggerating the size of the demo.
 
Last edited:

Lethal01

Member
I accidentally hit post way too early and have just been editing it since.
Ok, first DF reached EG and got info, and they said "uncompressed", probably among other reports. Then the demo didn't use oodle texture compression which was recently released and compatible with PS5 that can reach up to 96.58% of size reduction with Kraken:


And here the whole 8K texture mod for the whole Skyrim game is only 5.5GB:





So I think people are exaggerating the size of the demo.


Really hard to judge when I don't know how the 8k textures are being used in Skyrim. Is it an 8k map that contains the textures for 10 different trees? What I can say is that I just downloaded the pack for just the snow and it takes about 90MB per texture on disk. There are only 14 textures in all, with 7 for color and 7 for normals.

It would seem that the demo uses far more textures in far less space, with just a segment of the statue having more textures applied to it than entire biomes in Skyrim modded to be 8k.

Additionally, I think you need to check on how much space an Oodle Texture actually takes up on disk. The number you are checking may refer to how much space it occupies on the GPU.
 
Last edited:

Bo_Hazem

Banned
I accidentally hit post way too early and have just been editing it since.


Really hard to judge when I don't know how the 8k textures are being used in Skyrim. Is it an 8k map that contains the textures for 10 different trees? What I can say is that I just downloaded the pack for just the snow and it takes about 90MB per texture on disk. There are only 14 textures in all, with 7 for color and 7 for normals.

It would seem that the demo uses far more textures in far less space, with just a segment of the statue having more textures applied to it than entire biomes in Skyrim modded to be 8k.

Additionally, I think you need to check on how much space an Oodle Texture actually takes up on disk. The number you are checking may refer to how much space it occupies on the GPU.

I'm feeling sleepy here, sorry. But let's wait for official numbers on how big the demo is instead of guessing. If we don't get them now, we'll know when UE5 is available.
 

AeneaGames

Member
Is the XSX creating a virtual machine for BC? I'm just trying to figure out why it would consume more resources than what Sony is doing.

To me it sounds like it shouldn't cost too many resources, since my understanding is that it's just an extra abstraction layer around it. Maybe I'm missing something, since it would normally be the better way to do things...
 
Last edited:

AeneaGames

Member
Meh, that's stupid, you can edit your post but can't post it after the editing possibility expires. Sigh

What I wanted to add was that, come to think of it, it could very well be a light VM wrapper instead.

Because otherwise I can't explain why Sony is doing it so differently all because of performance...

Still one has to admit that the DirectX APIs have to add way more abstraction around the hardware to make sure DX works on many PCs...
 
Last edited:

AeneaGames

Member
The official box art for Dirt 5, the big badge meme continues
[images: Dirt 5 box art]
I just don't understand why they want that logo to be so big. It would've been better in the green bar up top on the left or right side instead. Or make the black bar underneath it a little shorter and add it there in the row of other Info. Or add it to the black bar.

Plus, I feel they are basically lying to the customers. When you buy this game it will not have the XSX version on the disc, just the X1 version, with perhaps optimisations for the One X; I cannot imagine it also containing the XSX version on the same disc. It will download it once you insert it into an XSX, so can they legally claim it's optimised for a system when it has zero code on the disc for that system?

PS when do I stop being a Neo and be able to keep editing my posts?
 
Last edited:

Bo_Hazem

Banned
They stated it is 2.5 times better with SFS

4.8 Compressed * 2.5 = 12

Bear in mind this whole thing was me pointing out that we don't really know what that 2.5 really means.

You do understand that the decompression block can only handle ~6GB/s? And that number, and even 2.4GB/s, is extremely generous compared to the real results in action shown by MS so far, as officials admitted it's around 4x faster loading than the Xbox One X HDD.

"With the Xbox Series X, out of the gate, we reduced our load-times by more than 4x without any code changes."

.

Stop those funny mental gymnastics, people.
 
Last edited:

Rea

Member
Can anyone answer my question? It's been bothering me and i can't seem to find the right answer.

Question: TLOU2 and GOT are easily PS5 launch titles. A lot of people are gonna buy a PS5 just to play these 2 games.

Why didn't Sony push these 2 games for PS5, since their launch windows are also very close to PS5's launch?

They would sell lots of PS5s with these games and nobody would complain. The visual quality is also top notch, and all they need is a little squeeze in pixels and FRAMERATES with fast loading times.
I understand this is good for consumers like us who have a PS4 and don't wanna upgrade to PS5 just for these 2 games, but hey, even if Sony pushed for PS5 launch games, I wouldn't complain. At least for me, and I believe many others, IMO.
 

Bo_Hazem

Banned
Can anyone answer my question? It's been bothering me and i can't seem to find the right answer.

Question: TLOU2 and GOT are easily PS5 launch titles. A lot of people are gonna buy a PS5 just to play these 2 games.

Why didn't Sony push these 2 games for PS5, since their launch windows are also very close to PS5's launch?

They would sell lots of PS5s with these games and nobody would complain. The visual quality is also top notch, and all they need is a little squeeze in pixels and FRAMERATES with fast loading times.
I understand this is good for consumers like us who have a PS4 and don't wanna upgrade to PS5 just for these 2 games, but hey, even if Sony pushed for PS5 launch games, I wouldn't complain. At least for me, and I believe many others, IMO.

Some games like TLOU2 got delayed, even GOT, but that shows Sony's commitment: it doesn't neglect its current customers but gives them quality game after quality game. That makes you have faith in what's coming to PS5, compared to being left empty-handed for months or even years if Sony had shifted focus to PS5 instead. That's why you don't see PS4 gamers complaining about PS5 exclusives, as Sony has been so generous with us this gen. It's funny that only xbox heads and fans are complaining about it and don't want a new gen, as they make it sound.
 
Last edited:

Rea

Member
You do understand that the decompression block can only handle ~6GB/s? And that number, and even 2.4GB/s, is extremely generous compared to the real results in action shown by MS so far, as officials admitted it's around 4x faster loading than the Xbox One X HDD.

"With the Xbox Series X, out of the gate, we reduced our load-times by more than 4x without any code changes."

.

Stop those funny mental gymnastics, people.
I can't believe people are saying this SFS can make the Xbox SSD's raw speed 2.5 times faster. It really doesn't make any sense, honestly. How can a software API break the laws of physics and the limits of the hardware's capabilities? It's like saying the PS5's cache scrubbers make the GPU so efficient that with an effective multiplier (let's say 1.33) it becomes 10 x 1.33 = 13.3 TFLOPS. But this is not the case.
 

Dolomite

Member
Actually no, Hellblade 2 (except that character model) looked worse than the Unreal 5 demo, the textures, the amount of geometry, lighting and overall scenery are just worse than the Unreal 5 demo, that demo was next-level stuff!

Now I know when you read this comment, you'll be saying "WTF?!" & mark me as a Sony licker but when you take a deep breath & think about it, it makes Hellblade 2 look feasible graphically and you should be happy about that.
😂😂😂 I'm wheezing
The volcano shot, the stone granularity on the troll hill, the sea kelp moving and reflecting into itself, proper RT reflected shine from the rain, hell, the fire looked better than the million-clone statue sanctuary from the UE tech demo. HB2 was so impressive Digital Foundry literally questioned how LOD of that level was possible (then UE5 arrived 😂) and questioned its validity. The character model from UE5 was damn near 2006's Kameo. The rocks and caves themselves? Stunning. Lumen? Impressive. It was a taste of next gen, but Senua's trailer was more impressive. 🤷🏾‍♂️🤷🏾‍♂️ 9 days till the show
 

Bo_Hazem

Banned
I can't believe people are saying this SFS can make the Xbox SSD's raw speed 2.5 times faster. It really doesn't make any sense, honestly. How can a software API break the laws of physics and the limits of the hardware's capabilities? It's like saying the PS5's cache scrubbers make the GPU so efficient that with an effective multiplier (let's say 1.33) it becomes 10 x 1.33 = 13.3 TFLOPS. But this is not the case.

Well, that's less laughable than saying that XSX has 13TF extra and 25TF total with RT.
 

FeiRR

Banned
I'm guessing they will keep it how it is but up the mAh. The current DS4 battery life is fine for me as I rarely play more than 2 hours at a time these days (though did yesterday on TLOU2!) and some tests have it as much as 8-9 hours. If DualSense is say ~10 hours on the battery life then most will be happy I would think.
I think DualSense will last for 5-6 hours, maybe. Haptic feedback means kinetic movement of rotors and that's a lot of energy, far more than LEDs or even BT radio. I also don't understand people who cry it's a low number. If you are hardcore enough to play more than 4-5 hours straight (I envy your free time), you should have at least two controllers and a charger, problem solved with a 15 second swap, say, between your matches or on pause.

Also, what to expect from Halo 6? Might be just Xbox One X visuals; there's some shadow issue (hand). The 2018 E3 looked better than the E3 2019 Halo teaser. Looks nice but ...
It looks like an average current gen game. There's nothing impressive about IQ in those shots. Models are quite low-poly, draw distance isn't impressive, aliasing on edges is very apparent. At one take (camera shift from the environment to MC holding a helmet), the DOF is really bad or simply faked. Unless they've totally changed the design (very unlikely), Halo 6 won't be much of a looker. I think they're aiming at full 4K@60 FPS or even some MP 120 FPS mode.

Can anyone answer my question? It's been bothering me and i can't seem to find the right answer.

Question: TLOU2 and GOT are easily PS5 launch titles. A lot of people are gonna buy a PS5 just to play these 2 games.

Why didn't Sony push these 2 games for PS5, since their launch windows are also very close to PS5's launch?

They would sell lots of PS5s with these games and nobody would complain. The visual quality is also top notch, and all they need is a little squeeze in pixels and FRAMERATES with fast loading times.
I understand this is good for consumers like us who have a PS4 and don't wanna upgrade to PS5 just for these 2 games, but hey, even if Sony pushed for PS5 launch games, I wouldn't complain. At least for me, and I believe many others, IMO.
There are over 100 million consumers out there who are potential buyers of those two games, making them profitable in turn. Console launch exclusives are a loss for the platform holder. That loss is necessary to build up the brand but must be painful for shareholders. Microsoft is minimizing that loss by making Halo forward compatible and promoting their service platform, while Sony is taking exactly the opposite approach. TLOU2 and GOT are backwards compatible with PS4, and you can see that in the way both engines work, how data is handled, how sound is designed, and even in the size of the GOT download.
 

Bo_Hazem

Banned
Not using LOD uses more RAM though.

Not like that can just be ignored completely. PS5 does not have more RAM than XSX, but does have more bandwidth to pull data in faster meaning more can be used for what is on screen but with a lot of far away objects in a scene they might go beyond that and still need to do LOD.

Seems like you forget that no LODs = a per-frame polygon budget. That UE5 demo had a 20 million polygon budget: it crunches any number of source polygons, up to billions, losslessly down to ~20 million polygons per frame. You'd maybe only need about 8.3 million per frame for native 4K, though, to have 1:1 polygon-to-pixel.
 

Lethal01

Member
You do understand that the decompression block can only handle ~6GB/s? And that number, and even 2.4GB/s, is extremely generous compared to the real results in action shown by MS so far, as officials admitted it's around 4x faster loading than the Xbox One X HDD.

"With the Xbox Series X, out of the gate, we reduced our load-times by more than 4x without any code changes."

.

Stop those funny mental gymnastics, people.

It's not about what the decompression block can handle; it's that by using SFS you only have to send 1/3 (on average) of the texture data.

So a system that is sending 4.8GB/s on average of compressed texture data using SFS would give more useful data than one that is giving 10GB/s of compressed data. The system with 10GB/s would be sending 6GB/s of texture data that would not be used. Where are the mental gymnastics? What happened to being sleepy?
 
Last edited:

Custom NVME SSD: The foundation of the Xbox Velocity Architecture is our custom, 1TB NVME SSD, delivering 2.4 GB/s of raw I/O throughput.

Sampler Feedback Streaming (SFS): This innovation results in approximately 2.5x the effective I/O throughput and memory usage above and beyond the raw hardware capabilities on average.

SSD raw I/O throughput * 2.5x the effective I/O throughput of the raw hardware capabilities = effective multiplier on available system memory and I/O bandwidth
2.4 * 2.5 = 6 GB/s

I don't know how people got the 12 or even 14 GB/s I/O throughput number, if the article gives you all you need to know to reach the 6 GB/s result that has been thrown around a lot of times in the past months.
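The disagreement running through this thread reduces to whether the 2.5x "effective" multiplier applies to the raw throughput or to the already-compressed throughput. Both readings side by side (the 2.4 and ~2x compression figures are Microsoft's published numbers; the stacking is what's disputed):

```python
raw = 2.4          # GB/s, raw NVMe throughput
compression = 2.0  # approximate gain from hardware decompression (2.4 -> 4.8 GB/s)
sfs = 2.5          # Microsoft's "effective" multiplier for Sampler Feedback Streaming

print(raw * sfs)                 # 6.0 GB/s  -> the reading in this post
print(raw * compression * sfs)   # 12.0 GB/s -> the reading argued later in the thread
```

Neither number is a measured bandwidth; both are "effective" figures that only hold for texture streaming workloads where SFS can skip unneeded tiles.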
 
If assets for games were not individually baked and saved with materials, textures, and normal data unique to every single model and level for every single game but instead called upon a console-wide library of modelites, basic mini textures, and micro detail normals accessible to all game engines then having several hundred GB of just asset data stored on your drive wouldn't be as big a deal.

Basically, if all engines were designed to handle data with a common library of assets, then each game installation would just be the blueprints for piecing the assets back together at runtime. Sort of like a real-time Quixel library. We then wouldn't have to worry so much about complexity of scenes or drive space.

But this won't happen, except for maybe UE5 only games, even then they aren't going to break down textures, materials, and normals down to the poly level in the way I'm imagining.

Another thing that comes to mind is if we are in micro polygon territory, do you still even need textures? Why not go with flat shading at that point?
 

SlimySnake

Flashless at the Golden Globes
Lol so much ssd talk. Especially on the day hellblade 2 was confirmed to be real-time. Xbox fans don't know when to take a W. Your game looks a gen apart. Who cares if your ssd is slower. It's not even like Sony is using this ssd as some kind of secret sauce. None of these games looked as good as hellblade 2 despite having 2.5x more i/o speeds.

I really hope ms blows us away come July 23rd with some truly next gen games and puts the pressure back on Sony. Sony was at its best during the ps3 era when they had something to prove. If hellblade 2 quality visuals are what we can expect from MS, that's great news for everyone.
 
Last edited:

Lethal01

Member


Time to give up, enjoy the moment until you face a reality check.
🤷‍♂️






SSD raw I/O throughput * 2.5x the effective I/O throughput of the raw hardware capabilities = effective multiplier on available system memory and I/O bandwidth
2.4 * 2.5 = 6 GB/s

I don't know how people got the 12 or even 14 GB/s I/O throughput number, if the article gives you all you need to know to reach the 6 GB/s result that has been thrown around a lot of times in the past months.


[screenshot: tweet from a Microsoft engine architect]


Enjoy the reality check.
This is a Microsoft engine architect saying you're wrong.

Now if you want to claim Microsoft is lying I can get with that. But please stop acting like I'm the one misunderstanding what they are claiming

For texture data:
It's the raw throughput (2.4) * effective modifier due to compression (2) * effective modifier due to SFS (2.5) = 12.
At least according to the guy working on it.

[screenshot: tweet]
 
Last edited:

dwish

Member
Well, an integer to store a single "point" would be 4 bytes. So it really depends; usually people talk triangles (12 bytes each), but IIRC it's also a combination of triangles and then vertices (the points where triangles meet in space, which take 12 bytes each). So it's a mix.

If it was nothing but triangles being stored, 100 billion of them would be 1.2 TB if I'm doing the conversions right.

But then there's all the points where the triangles meet in space being stored.... of course there's also compression.

edit: Here's a convo where some rendering folks are talking about 2 million per GB.


I think I'm wrong about how storing a triangle works though. But I'm also way undershooting the math I think.

2 million per GB would mean 500 GB per billion polygons. Or 50 TB for 100 billion.

Afaik you use 3x 32- or 64-bit floats to store the position (x,y,z) of each vertex. Then you also have an index buffer of 3x 32- or 64-bit integers that refers to which three vertices make up each triangle. Taking the statue as an example, you only store one of those on disk and in RAM and use matrix translations and rotations (2×4×4 32- or 64-bit floats) to position multiple statues in the world.
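A minimal sketch of that layout: one vertex buffer plus one index buffer stored once, with each placed copy costing only a 4x4 transform. The counts are made up for illustration; real engines pack and quantize differently:

```python
import struct

num_verts = 1000
num_tris = 2000

vertex_bytes = num_verts * 3 * 4      # x, y, z as 32-bit floats per vertex
index_bytes = num_tris * 3 * 4        # three 32-bit indices per triangle
mesh_bytes = vertex_bytes + index_bytes  # the mesh is stored once

# 500 statues scattered around the level: one matrix each, not 500 meshes
instances = 500
matrix_bytes = struct.calcsize("16f") # a 4x4 of 32-bit floats = 64 bytes
instance_bytes = instances * matrix_bytes

print(mesh_bytes, instance_bytes)     # 36000 bytes of geometry vs 32000 bytes of transforms
```

This is why duplicating the statue all over the cave in the UE5 demo is nearly free on disk: the per-instance cost is a transform, not another copy of the geometry.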
 

Lethal01

Member
Lol so much ssd talk. Especially on the day hellblade 2 was confirmed to be real-time. Xbox fans don't know when to take a W. Your game looks a gen apart. Who cares if your ssd is slower. It's not even like Sony is using this ssd as some kind of secret sauce. None of these games looked as good as hellblade 2 despite having 2.5x more i/o speeds.

I really hope ms blows us away come July 23rd with some truly next gen games and puts the pressure back on Sony. Sony was at its best during the ps3 era when they had something to prove. If hellblade 2 quality visuals are what we can expect from MS, that's great news for everyone.

This thread is for discussing and analyzing the consoles. No reason to stop analyzing them because Xbox "won" at something. The goal isn't to prove that one is "better"; it's to have fun talking about how good the improvements seem.
 

Rea

Member
[screenshot: tweet from a Microsoft engine architect]


Enjoy the reality check.
This is a Microsoft engine architect saying you're wrong.

Now if you want to claim Microsoft is lying I can get with that. But please stop acting like I'm the one misunderstanding what they are claiming

For texture data:
It's the raw throughput (2.4) * effective modifier due to compression (2) * effective modifier due to SFS (2.5) = 12.
At least according to the guy working on it.

[screenshot: tweet]
One question: what if the texture data I need to stream, the so-called useful or needed raw texture data, is more than 2.4GB/s?
 

Nickolaidas

Member
You can at least use your TV's optical out; the PS5's HDMI 2.1 should provide enough quality. But it will be lower quality due to the limits of optical audio.

You better use eARC for that.
I cannot connect a device to my home theatre via HDMI, only via optical.

My TV has an optical port, but it only produces stereo sound, not dolby or dts.

It is quite frustrating.
 

Lethal01

Member
One question: what if the texture data I need to stream, the so-called useful or needed raw texture data, is more than 2.4GB/s?

Of course in those cases it's better to have "raw power", but usually you will be getting 2.5x more useful data with SFS on average. I'm not claiming it's magic. In fact, the first time I brought it up I was pointing out how silly it would be to take 12GB/s at face value.

And once again, nobody is saying only Xbox can do SFS; I'm simply pointing out that the claim they are making is being made on top of the texture compression. I could easily use PS5 as an example instead.

something something 9*2.5 = 22.5GB/s average throughput, happy now?

Anyway, I could be wrong about all of this and totally misunderstanding everything, but nobody is actually giving me reasons to doubt the Microsoft engineer saying "Yes, you get 2.5x better performance in addition to the 2x raw throughput due to compression"
 
Last edited:

LED Guy?

Banned
😂😂😂 I'm wheezing
The volcano shot, the stone granularity on the troll hill, the sea kelp moving and reflecting into itself and proper RT reflected shine from the rain, hell the fire looked better than the Million clone statue sanctuary from the UE tech demo. HB2 was so impressive Digital foundry literally questioned how LOD of that level was possible ( then UE5 arrived😂) and questioned it's validity. The character model from UE5 was damn near 2006's Kameo. The rocks and caves themselves? Stunning. Lumen? Impressive. It was a taste of next gen but Senua's trailer was more impressive. 🤷🏾‍♂️🤷🏾‍♂️ 9 days till the show
Nope, DF questioned Hellblade 2's validity because they hadn't seen tech like that (infinite geometric detail with no visible LODs) until the Unreal 5 demo came out; now they know it is possible.

Unreal 5 demo looks way better.
 