
Epic sheds light on the data streaming requirements of the Unreal Engine 5 demo

Panajev2001a

GAF's Pleasant Genius
The entire point of the SSD is volume capability not possible with the limited capacity of dedicated RAM; you're going to swap data out as infrequently as possible.

“Great SSD as RAM”, “Virtual Texturing/SFS to stream only the data you need per frame or per scene”... surely not designed to swap data in and out frequently... said no one ever when thinking about streaming systems built around first-request latency and throughput. Still waiting for your jump into the XVA threads ;).
 

Dolomite

Member
You’re conflating amount of data transferred with speed to transfer that data. It wouldn’t matter if the amount was 50 megabytes. It’s about latency and throughput.

Maybe you don't remember, but the N64 had a memory subsystem that could transfer 500 MB/second. What was the biggest cartridge ever released on that system? 64 MB? It was about speed and latency, not the "amount" of data. This slide doesn't change that at all.

You do not know more about the UE5 demo than Epic.

You do not know more about computer and system engineering than the PlayStation team.

There is a serious need to try and disprove Epic's plainly stated facts about UE5 and PlayStation 5 for some reason. Do you really think they made the statements they did just a few weeks ago knowing that, if they were lies, it would be discovered when they talked about it a little bit more? These people are not idiots. But they would have to be to be doing what you, the OP, are ascribing to them.

So yeah. Pretty sure everything is in the same place as before this presentation.
The source is EPIC themselves. They never said you need an SSD as fast as the PS5's; fanboys took their praise of the SSD and assumed that this demo couldn't run on the XSX. EPIC THEMSELVES have just shared that the tech demo was GPU heavy, not as SSD hungry as GAF made it seem. The facts are facts.
Reads of >800 MB/s mean that the XSX (with a far more capable GPU) could possibly run this demo better. The SSD is more than capable for what the demo demands, and the GPU is more powerful than that of the machine the demo ran on. Facts are facts, from the horse's mouth.
 

Xplainin

Banned
I have spent hours on here trying to inject some reality into the outlandish claims and desires some here had about SSDs and the next gen.
Sony has stated the reason they went as hard as they did on their SSD: their goal was zero loading time. It had nothing to do with Nanite texture streaming, LOD pop-in, or open worlds; it had to do with loading times.
Now I'm not saying that Sony never wanted to increase streaming into the RAM to create better worlds; of course they did. Same as Microsoft.
But they went over and above those other reasons to get that instant loading. Hey, that's not a bad thing. Good on Sony for wanting to do that for their customers, and I as one of them look forward to not waiting minutes before I can play my damn game.

I have posted here many times, and it was just overlooked by the SSD crazies here, that the SSD does not stream into the GPU; it fills the RAM. The RAM then supplies the GPU, and it's the GPU that draws the image on screen, not the SSD.
So if we were to fall for those whole "bottleneck", "phat pipe" soundbites, then the PS5's RAM bandwidth would be a bottleneck, as would the GPU. The bandwidth feeding the GPU on the PS5 is only 448 GB/s, while the XSX's is 560 GB/s. If you fall for that SSD fanboy belief, then since the RAM can't feed the GPU on the PS5 as fast as the XSX's can, the PS5 must be bottlenecked by its RAM.
But of course it's not, just like the XSX isn't bottlenecked by its SSD speed.
We have people here who believe the SSD is going to stream into the RAM at 9 GB/s for long periods of time. Now think about it. A whole game is maybe 100 GB in size. If your SSD were streaming unique data into the RAM at 9 GB/s, your game would be exhausted in about 11 seconds. Or, if it genuinely needed 9 GB streamed every second, a 9-hour game would be roughly 290 TB of data.
That's just not how games work. It doesn't stream from an SSD like that; the same assets get re-read as you move through the world.
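A quick back-of-envelope sketch of that arithmetic (using the hypothetical 9 GB/s sustained rate and 100 GB install size discussed above, which is exactly the scenario being argued against):

```python
# Back-of-envelope check: what sustained 9 GB/s of *unique* streamed data
# would imply (hypothetical scenario; real streaming re-reads assets).

GAME_SIZE_GB = 100      # assumed install size
STREAM_RATE_GBPS = 9    # hypothetical sustained unique-data rate

# If every streamed byte were unique, the whole game would be consumed in:
seconds_until_exhausted = GAME_SIZE_GB / STREAM_RATE_GBPS  # ~11.1 s

# Conversely, 9 hours of play at that rate would need on disk:
hours = 9
required_tb = STREAM_RATE_GBPS * hours * 3600 / 1000       # ~291.6 TB

print(f"game exhausted in {seconds_until_exhausted:.1f} s")
print(f"{hours} h of unique streaming needs {required_tb:.0f} TB")
```

Neither number resembles a shipping game, which is the point: sustained streaming mostly re-reads the same data rather than consuming unique data at peak rate.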
And as I also said a few times, which again was ignored by the fanboys: if increased I/O made your system able to show more detailed textures and reduced the performance gap between the XSX's 12 TFLOP GPU and the PS5's 10.28 TFLOP GPU, then all MS needed to do to close the performance gap between the PS4 and the Xbox One was swap their 5200 RPM hard drive for a 7200 RPM one, and the Xbox One would have produced better graphics and performance than it otherwise would. Of course that isn't true, but that's effectively what you are saying by thinking the faster PS5 SSD is in any way going to give an advantage from an image or game-world point of view compared to the SSD in the XSX.

The PS5 is less powerful than the XSX. Its GPU is weaker, the CPU is weaker, and it has less bandwidth. But, as I have said a dozen times, there isn't enough of a gap to really show much of a difference in the image that's on screen. Nobody is really going to be able to pick one from the other.

The tribe of SSD is starting to get a bit embarrassing at this point.
 
No, not the first time you found something that seems to suit your angle and ran with it to rub it in people's faces (some people 2+ times more than others).
Welcome to reality.

shave tail louie
"Shave tail" was a term originally used in the 19th century among U.S. cavalry regiments. Newly assigned cavalry troopers were given horses with a shaved tail, to let other troopers know that the rider was dangerously inexperienced and should be given extra room to maneuver during training. "Louie" is a nickname for lieutenant, the lowest ranking, and least experienced, rank among U.S. Marine Corps officers.
In Avatar, Colonel Quaritch mentions that being on Pandora made him feel "like a shave tail Louie."
 

Mister Wolf

Gold Member
The source is EPIC themselves. They never said you need an SSD as fast as the PS5's; fanboys took their praise of the SSD and assumed that this demo couldn't run on the XSX. EPIC THEMSELVES have just shared that the tech demo was GPU heavy, not as SSD hungry as GAF made it seem. The facts are facts.
Reads of >800 MB/s mean that the XSX (with a far more capable GPU) could possibly run this demo better. The SSD is more than capable for what the demo demands, and the GPU is more powerful than that of the machine the demo ran on. Facts are facts, from the horse's mouth.

Big facts. We will never hear Sweeney say the demo would run better on the Series X.
 

Panajev2001a

GAF's Pleasant Genius
Welcome to reality.

shave tail louie
"Shave tail" was a term originally used in the 19th century among U.S. cavalry regiments. Newly assigned cavalry troopers were given horses with a shaved tail, to let other troopers know that the rider was dangerously inexperienced and should be given extra room to maneuver during training. "Louie" is a nickname for lieutenant, the lowest ranking, and least experienced, rank among U.S. Marine Corps officers.
In Avatar, Colonel Quaritch mentions that being on Pandora made him feel "like a shave tail Louie."

Lovely, nice history lesson and completely unrelated to the point but lovely. Is it a metaphor for this whole thread?
 

Elog

Member
It's going to be a shocker for some, and for others like me not so much, because I've been saying this for months now. I hate to break it to some of you, but that demo's data streaming could be handled by a 5-year-old SATA SSD.

[image: 8wl1rua.png]


768 MB is the in-view streaming requirement on the hardware to handle that demo. 768 MEGABYTES... COMPRESSED. And what was the cost of this on the rendering end?

Did you watch the video? This is not what was said. The video does not contain enough information to fully map out the hardware requirements as you do.

The key point is that the demo used 768 MB of compressed assets in VRAM (so not on the SSD...). That is the pool used to render the view from the camera. That means that in a worst-case scenario those 768 MB contain the assets for only one frame. In almost all practical cases those 768 MB of course represent much more than one frame. The point is that the bandwidth, and hence latency, requirements can on paper be crazy high to run at that visual quality.

That is the complete opposite of what you wrote. Not sure if you actually did view and comprehend what was being stated?
 
Big facts. We will never hear Sweeney say the demo would run better on the Series X.
They were grooming Sony for an investment, which they got, so it's not at all shocking how that whole situation played out. Then, shortly after they got the money, information about the technical data surrounding the engine and the PS5 was released.

Coincidence? Nope. It's a legal antitrust issue they had to resolve because they became financially beholden to an involved party; it's a conflict of interest.
 
[snip]


This confirms everything I've said. Not that these SSDs are useless, because they're 100% not; that data streaming would be impossible with mechanical drives. However, and this is a big however: that amount of visual data and asset streaming is already bottlenecking the renderer, bringing that GPU to its knees. There's very little cost to the CPU, as you will see below, but as noted about 100 different times on this website and scoffed at constantly by detractors, the GPU will always be the limiting factor.

[image: lNv2lKl.png]


I've maintained this since square one: Microsoft and Sony both went overkill on their SSDs. The rendering pipeline simply cannot keep up with the on-demand volume of data streaming these SSDs allow.

So what's the point here? You've got two systems with SSD's far more capable than their usefulness, but one came at a particularly high cost everywhere else in the system. I'll let you figure out which one that is and where.

[image: deadest.png]

Exactly. Someone gets it.
Rendering is still the main bottleneck on the Series X.
 

Panajev2001a

GAF's Pleasant Genius
Did you watch the video? This is not what was said. The video does not contain enough information to fully map out the hardware requirements as you do.

The key point is that the demo used 768 MB of compressed assets in VRAM (so not on the SSD...). That is the pool used to render the view from the camera. That means that in a worst-case scenario those 768 MB contain the assets for only one frame. In almost all practical cases those 768 MB of course represent much more than one frame. The point is that the bandwidth, and hence latency, requirements can on paper be crazy high to run at that visual quality.

That is the complete opposite of what you wrote. Not sure if you actually did view and comprehend what was being stated?

Not surprised the thread was rushed out to rub this in some console fans' faces... especially on this aspect.
 

Dolomite

Member
Big facts. We will never hear Sweeney say the demo would run better on the Series X.
I don't think we will ever hear Sweeney mention that tech demo in the same breath as MS 😂
He made it very clear that Nanite and Lumen will be great on Xbox, but he chose his words carefully.
 

Xplainin

Banned
Not only that, they also say that during the scene where she flies through the city they were streaming in 500K objects, and that they can easily stream 1 million.

I don't know too much about tech, but I'm pretty sure, going by this, that they weren't maxing out PS5 at all.
Or that the GPU was a bottleneck, if that's a better way of saying it.
With the move to efficiency by all devs, as well as Sony, MS, AMD and Nvidia, in that flying scene your GPU would not be asked to render the full detail of the buildings as you fly past them, and you won't even notice the detail. It's the exact scenario where things such as VRS and lower-quality textures and assets would be used.
 
You literally have not said anything to dispute his claims, just thrown insults. You're one of the ssd diehards... let's hear the facts that dispute his claim.
He's not even worth talking to; it's the same song and dance every time. He throws out some insulting kind of post, you respond with something logically sound, and then he hits you with an "LOL" emote.

So many of them do this exact same thing.
 
It's going to be a shocker for some, and for others like me not so much, because I've been saying this for months now. I hate to break it to some of you, but that demo's data streaming could be handled by a 5-year-old SATA SSD.

[image: 8wl1rua.png]


768 MB is the in-view streaming requirement on the hardware to handle that demo. 768 MEGABYTES... COMPRESSED. And what was the cost of this on the rendering end?

Well, this is the result...

[image: dQOnqne.png]


This confirms everything I've said. Not that these SSDs are useless, because they're 100% not; that data streaming would be impossible with mechanical drives. However, and this is a big however: that amount of visual data and asset streaming is already bottlenecking the renderer, bringing that GPU to its knees. There's very little cost to the CPU, as you will see below, but as noted about 100 different times on this website and scoffed at constantly by detractors, the GPU will always be the limiting factor.

[image: lNv2lKl.png]


I've maintained this since square one: Microsoft and Sony both went overkill on their SSDs. The rendering pipeline simply cannot keep up with the on-demand volume of data streaming these SSDs allow.

So what's the point here? You've got two systems with SSD's far more capable than their usefulness, but one came at a particularly high cost everywhere else in the system. I'll let you figure out which one that is and where.

[image: deadest.png]

If only those idiots who designed these consoles had thought to call you, they could have saved millions of dollars...
 

Hudo

Member
I am the first to shit on any of Microsoft's and Sony's overhyped tech parades (fast SSDs are nothing new). But I would be careful about saying they "overspecced", because Unreal has vastly different requirements from Microsoft's and Sony's proprietary engines built by their own devs. Unreal has to run on multiple platforms and has to serve general use cases, whereas more specialized proprietary engines might require a higher data rate, depending on what the devs want to do.
The only thing I see as overhyped right now is Epic's UE5. We have to wait and see whether the PS5 and the XSX will get games that use their respective specs to the fullest.
 
The pool is in RAM; it's updated per view from disk with just what's needed, nothing about "per second". It's data that's in view, with potentially per-frame updates to this pool.
The latency of the SSD helps more than the pure throughput here, but you didn't get those numbers so...
Also, 4.5 ms is well within the frame time (so not a bottleneck); I think Lumen is the bottleneck here, as it pushed it down to 30 FPS.

Why did you make this thread?


 

Panajev2001a

GAF's Pleasant Genius
You literally have not said anything to dispute his claims, just thrown insults. You're one of the ssd diehards... let's hear the facts that dispute his claim.

Do we need to disprove his hot take, built on a video he watched and a castle of his own claims rushed out on top of it to do some more console warring? (Let's drop the pretence of console fairness: taken as is, his argument would say that both consoles are obscenely overspecced, that both the PS5's I/O and the XSX's XVA are a waste of money and resources and a source of false hype, and yet I somehow do not see him fighting with the hordes of people hyped up in the XVA threads.) Nah, not how it works, mate.

So if I am the first to say something about you it is now your duty to prove me wrong beyond any reasonable doubt? Again, not how it works.
Several people, myself included, have already discussed his interpretation, the data he presented and the conclusion he drew: he identified the size of one of the video memory pools UE5 allocates and decided that it was not a transient pool meant to have data move in and out.
 

Xplainin

Banned
While I agree the PS5's SSD is a bit overkill, I will never shit on a company for pushing tech. I love to see that Sony created such a fast SSD. We want innovation, we want boundaries pushed.
I'm also not sure this tech push was solely for the PS5. It could well be that Sony has other uses for this tech outside of the PS5. Maybe it will be used in PC parts they are making, maybe in other areas of their consumer electronics divisions.
It could well be that the cost wasn't borne by the PlayStation division alone, so the full R&D cost might not have been fully allocated to PS.
 

Reallink

Member
Have any of you guys downloaded the 4K video of the demo and watched it carefully/critically? I'm a little concerned by the number of prolonged scenes where the Nanite system seems to fail at its cinema asset > pixel interpretation, resulting in some rather nasty-looking IQ. These engine artifacts are readily apparent in real time in motion and don't require still frames or slow-mo. Many instances wind up looking worse than traditional current-gen renderers like ND's or Decima.


 

tryDEATH

Member
This could be a huge oof for Sony, reminiscent of Cell, if this ends up being the case. They seem to have engineered the living sh*t out of that SSD for apparently nothing; it would actually be a shame, as I can't imagine how much hard work and resources went into creating it. Hopefully it isn't really this bad of a bottleneck on the GPU and the SSD can push those numbers higher.

But this is starting to look more and more like the 360 vs PS3. Sony came out with a special part, but they're just not going to utilize it to its fullest ability, while it probably cost a ton and takes up a nice chunk of the build cost. Maybe they're looking at PC parts, where I think people would go apeshit for this sort of storage solution, be it professional or personal.
 

ZywyPL

Banned
I'm just gonna quote myself from the other thread:

So just as I suspected, Nanite uses a laughably small amount of resources: 4.5 ms, so just a quarter to an eighth of the entire 30/60 FPS frame-time budget, and not even 1 GB of RAM. It's Lumen that completely tanks the performance to 30, and is most likely responsible for the resolution going down all the way to 1400p. Good deep dive into the technology, much better than the initial reveal. But still, give me games that actually look like late-UE3.5/early-UE4 tech demos, then I'll actually get excited, because until then it's all just show and no go.
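Those frame-budget fractions are easy to sanity-check (taking the 4.5 ms Nanite cost quoted above as given):

```python
# Fraction of the frame-time budget a fixed 4.5 ms Nanite cost consumes
# at 30 and 60 FPS (4.5 ms is the figure quoted above).

NANITE_MS = 4.5

for fps in (30, 60):
    budget_ms = 1000 / fps           # 33.3 ms at 30 FPS, 16.7 ms at 60 FPS
    share = NANITE_MS / budget_ms    # ~14% at 30 FPS, ~27% at 60 FPS
    print(f"{fps} FPS: {NANITE_MS} ms is {share:.0%} of a {budget_ms:.1f} ms frame")
```

So Nanite leaves the large majority of each frame free, which is consistent with the claim that Lumen, not geometry streaming, is what pushed the demo down to 30 FPS.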

So nothing really new; everyone with a tiny, tiny bit of knowledge and experience already saw it coming. It was just a bunch of folks desperately trying to spin the narrative once their 13 TF dream died.



Have any of you guys downloaded the 4K video of the demo and watched it carefully/critically? I'm a little concerned by a number of prolonged scenes where the Nanite system seems to fail at its cinema asset > pixel interpretation, resulting in some rather nasty-looking IQ. These engine artifacts are readily apparent in real time in motion and don't require still frames or slow-mo.


Typical CBR artifacts. If the demo ran natively at 1620-1800p it would be much less noticeable, but it's sitting at sub-QHD most of the time, so it is what it is. Hopefully they will optimize Lumen a bit better so it doesn't tank the performance/resolution so much.
 

Panajev2001a

GAF's Pleasant Genius
The pool is in RAM; it's updated per view from disk with just what's needed, nothing about "per second". It's data that's in view, with potentially per-frame updates to this pool.
The latency of the SSD helps more than the pure throughput here, but you didn't get those numbers so...
Also, 4.5 ms is well within the frame time (so not a bottleneck); I think Lumen is the bottleneck here, as it pushed it down to 30 FPS.

Why did you make this thread?



Not sure beyond trying to rub some people’s noses in it.

This could be a huge oof for Sony, reminiscent of Cell, if this ends up being the case. They seem to have engineered the living sh*t out of that SSD for apparently nothing; it would actually be a shame, as I can't imagine how much hard work and resources went into creating it. Hopefully it isn't really this bad of a bottleneck on the GPU and the SSD can push those numbers higher.

This is starting to look more and more like the 360 vs PS3. Sony came out with a special part, but they're just not going to utilize it to its fullest ability. Maybe they're looking at PC parts, where I think people would go apeshit for this sort of storage solution, be it professional or personal.

Lol, sure, you were quick to jump from the OP to here, skipping tons of stuff, and bring in a Cell comparison. I guess it will end again with the XSX in third place and its successor soundly beaten ;)? Is this the console warring climate you want?

Your link between Cell and the PS5's SSD is very, very thin (completely ignoring XVA, btw, which is good to bring into the picture when this narrative fails: devs and consumers are still impressed by the I/O throughput and latency, so we need to convince people that XVA is just as fast... it is nice to see the argument jumping between "PS5's I/O solution is wastefully overspecced" and "XVA is just as fast" :LOL:).
 

tryDEATH

Member
Not sure beyond trying to rub some people’s noses in it.



Lol, sure, you were quick to jump from the OP to here, skipping tons of stuff, and bring in a Cell comparison. I guess it will end again with the XSX in third place and its successor soundly beaten ;)? Is this the console warring climate you want?

Your link between Cell and the PS5's SSD is very, very thin (completely ignoring XVA, btw, which is good to bring into the picture when this narrative fails: devs and consumers are still impressed by the I/O throughput and latency, so we need to convince people that XVA is just as fast... it is nice to see the argument jumping between "PS5's I/O solution is wastefully overspecced" and "XVA is just as fast" :LOL:).

You're the one that brought up the console warring. I couldn't care less, and I even know the XSX still isn't going to outsell the PS5, just like the PS3 eventually overtook the X360. You've got me confused with some of the diehards you're used to warring with.

I am also not an SSD believer anyway, so whatever Sony or Xbox do in that regard, I don't care until I actually play it and see the SSD doing something that was never possible before. So don't put words in my mouth.

As for my Cell comparison, it is a legitimate argument: Cell was ahead of its time, just like Sony's SSD is. There isn't anything inherently bad about it, just that at this point in time it apparently will not be fully utilized.

Don't take every comment so personally.
 

martino

Member
768MB/s is already like 20x the throughput of what we see now in games. People are having a tough time grasping that this is an immense amount of data, and at the same time, given the specifications of these SSDs, they're largely underutilized.
Don't get me wrong, I'm skeptical the XSX SSD can reach 768 MB/s of compressed data (even if it seems 768 is not representing that here).
Look at random/mixed benchmarks of PCIe 4.0 SSDs: https://thinkcomputers.org/sabrent-rocket-1tb-pcie-4-0-m-2-solid-state-drive-review/6/
It's probable the 12-channel controller is helping a lot and will achieve over 1 GB/s.
I doubt that will be the case for the XSX's unless there is something special about it we don't know.
 

T-Cake

Member
This could be a huge oof for Sony, reminiscent of Cell, if this ends up being the case. They seem to have engineered the living sh*t out of that SSD for apparently nothing; it would actually be a shame, as I can't imagine how much hard work and resources went into creating it. Hopefully it isn't really this bad of a bottleneck on the GPU and the SSD can push those numbers higher.

You've got that completely backwards. The GPU is the bottleneck, not the SSD.
 

Snake29

RSI Employee of the Year
Man, the thing is, you really, really don't. :messenger_pensive: But no skin off my back in the end. Just frustrating to read sometimes.



Epic claimed they did it really fast. That's how a lot of tech demos work when you finally start showing things with your proof of concept. Once you get to that point of it working you complete a "quick and dirty" version of what you want to show. I'm sure the whole thing is nowhere near production ready and has quite a bit of work left to do. I'm sure we will see higher res games running on PS5 and Xbox and PC and whatever else down the road.

There are a number of reasons they might have picked that resolution/framerate btw. They compared it to Fortnite. Maybe they decided to see what they could accomplish using the same GPU requirements so only allowed a certain amount of GPU power to be used. Nanite might need a lot of optimization still. Because they felt like it. Any number of internal reasons that none of us are privy to and not even worth guessing at because they already stated how much power it needed in terms of GPU compute.

They also stated that this demo was created on an early version of the devkit without final PS tools. It's already kind of impressive what they've achieved with early hardware/tools in this demo.
 

farmerboy

Member
So you have all somehow managed to change the narrative from "2 times greater I/O throughput" to "it's 2 times over-engineered, and thus wasted resources that could have been used for the GPU, but hey (I'm-a-good-guy qualifier incoming), it's OK because you probably won't tell the difference anyway".

 

BigLee74

Member
As I've said all along, this PS5 SSD is going to boil down to loading games twice as fast as the XSX SSD does. But when it's 2 seconds compared to 4 seconds at the start of the game/level, who's going to notice or care?
 

ZehDon

Gold Member
Ask and you shall receive.


Sorry friend, I'm not picking up what you're putting down. The gentleman doesn't break down what the 700+ MB of compressed Nanite data is, only that it's Nanite's scene data, that it's a much larger data footprint than what they're used to working with, and he assured the viewer that Epic will be working to make it smaller as soon as possible.

You're throwing the 700 MB figure against the PS5's 5.5 GB/s SSD and equating the two values, saying it only needs 700 MB per second. The gentleman in the video didn't make that comparison, so I'm not going to either.
He only said this represented what the Nanite scene needed based on the current view. So, with that in mind, your interpretation doesn't make sense. For example, given that the view can change every frame, at 30 FPS the worst-case scenario would be the full pool loaded every 33.33 ms, which scales up to something like 23 GB of data streamed per second. We don't know how long it takes for the initial scene to load; they may be front-loading the initial data, meaning they only need to stream in a small amount of changed data per frame. 160 MB of altered data per frame, for example, would already tax the PS5 SSD's raw capability.
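A sketch of those two scenarios (the 768 MB pool size is from the presentation; the 160 MB per-frame delta is this post's hypothetical figure, not something stated in the video):

```python
# Worst case: the entire 768 MB streaming pool replaced every frame,
# vs. a partial update of 160 MB per frame (hypothetical delta).

POOL_MB = 768                 # compressed Nanite pool resident in VRAM
FPS = 30
DELTA_MB_PER_FRAME = 160      # hypothetical changed data per frame

full_refresh_gbs = POOL_MB * FPS / 1000         # 23.04 GB/s
partial_gbs = DELTA_MB_PER_FRAME * FPS / 1000   # 4.8 GB/s, near the 5.5 GB/s raw

print(f"full pool every frame : {full_refresh_gbs:.1f} GB/s")
print(f"160 MB delta per frame: {partial_gbs:.1f} GB/s")
```

The spread between those two results (roughly 5x) is exactly why the pool size alone doesn't tell you the required SSD speed.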

As you can see, this can be interpreted in several different ways. Frankly, no one here is qualified or experienced enough to explain it properly.

So, no: Sony have not made the error of building a 5.5 GB/s SSD when they only needed a 1 GB/s SSD. Microsoft didn't build a 2.4 GB/s SSD when they only needed a 1 GB/s SSD. Neither company has wasted hundreds of millions in R&D because they screwed up primary-school math.
 

geordiemp

Member
It's going to be a shocker for some, and for others like me not so much, because I've been saying this for months now. I hate to break it to some of you, but that demo's data streaming could be handled by a 5-year-old SATA SSD.

[image: 8wl1rua.png]


768 MB is the in-view streaming requirement on the hardware to handle that demo. 768 MEGABYTES... COMPRESSED. And what was the cost of this on the rendering end?

Well, this is the result...

[image: dQOnqne.png]


This confirms everything I've said. Not that these SSDs are useless, because they're 100% not; that data streaming would be impossible with mechanical drives. However, and this is a big however: that amount of visual data and asset streaming is already bottlenecking the renderer, bringing that GPU to its knees. There's very little cost to the CPU, as you will see below, but as noted about 100 different times on this website and scoffed at constantly by detractors, the GPU will always be the limiting factor.

[image: lNv2lKl.png]


I've maintained this since square one: Microsoft and Sony both went overkill on their SSDs. The rendering pipeline simply cannot keep up with the on-demand volume of data streaming these SSDs allow.

So what's the point here? You've got two systems with SSD's far more capable than their usefulness, but one came at a particularly high cost everywhere else in the system. I'll let you figure out which one that is and where.

[image: deadest.png]

Do you know what "streaming pool" means?

It's not the size of the data on disk or the total game size.

We don't know the time required to stream the 768 MB, so you do not know the speed required.

Funny.
 