You ignored all the information from both the Nanite and Lumen deep dives - straight from the developers - then made straw-man arguments that ignored the immense pixel rate of the PS5 that matches those technologies, and now you're trying the same with DirectStorage.
I think we are all pro-DirectStorage for games on this forum, because it is a step in the right direction - so trolling about it is really odd IMHO. How it measures up and improves things for Windows will be interesting.
To answer your trolling question: why would anyone have to do anything? Other than last gen, that's how things always end up with consoles. Last gen it was worse, because the PS4 never had the grace period of a bigger userbase with better specs - for any part of the system, at any time - and yet it sold well, had the games with the best production quality, and all gamers were happy.
Huh? I created both these videos, which have gone viral with over 300k views. Let's see who the troll is here, and a blatant liar whose only goal is to make up and spread bullshit. You did it all in that UE5 Nanite thread.
Notice how he calls everything in the PS5 demo complex but everything in the Valley demo simple, when they are exactly the same things - and sometimes the Valley demo is even more complex and abundant.
"PS5 demo" refers to the Lumen in the Land of Nanite PS5 demo from 2020.
First it was:
Lie #1: Valley used sunlight with animated cloud cover.
Truth: Valley uses the same volumetric cloud, sky and directional light system as the PS5 demo.
Lie #2: The PS5 demo stress-tested Lumen.
Truth: The Valley demo has far more light sources than the PS5 demo, and Lumen is orders of magnitude better in quality than it was last year, according to the creator of Lumen.
Lie #3: The Valley demo isn't about Nanite.
Truth: Nanite costs over 2x what it cost in the PS5 demo, because of ridiculous overdraw due to a lack of optimization.
Lie #4: Valley has a 4.5-second loading screen.
Truth: Valley unloads almost the entire level and loads a completely different level from scratch. Nowhere is that done in the PS5 demo.
Lie #5: Valley only rendered two characters that aren't Nanite geometry.
Truth: The PS5 demo rendered only one character, so the Valley demo is doing more - and the Ancient One IS Nanite geometry.
Lie #6: The large boss cinema-level asset was so high above the floor and other Nanite geometry that lighting interactions with any Nanite-mesh parts of the model were not possible (AFAIK).
Truth: This makes no sense; I don't even know what it's trying to say, other than that it's more made-up BS.
Lie #7: All the complex volumetric simulations, etc. are missing from this demo.
Truth: The Valley demo uses the same volumetric sky and clouds as the PS5 demo - it's a duplicate. The Valley demo uses far more volumetric fog and dust than the PS5 demo.
Lie #8: ...and with much better character animations.
Truth: The Valley demo uses the exact same locomotion system made for the PS5 demo, with the exact same animations, created by Epic's Caleb ("The locomotion animations came from the first UE5 Nanite demo that was unveiled last year.").
Lie #9: There's nothing to look at like... the complex simulation for the huge blue vortex at the end of the PS5 demo?
Truth: The helmet Niagara VFX sim is the same as the portal VFX sim. It was made by Epic's Ryan Burke.
Lie #10: There's nothing to look at like... the complex creature behaviours reacting to the spotlight, etc., etc.?
Truth: The exact same Niagara VFX has been released as a content example you can play around with like a toy. You can easily add it into the Valley demo, and it costs virtually NOTHING. Same with the birds. This is more BS made up to claim the Valley demo is inferior to the PS5 demo.
Lie #11: The other demo, for PS5, looks a little bit Uncharted, and a lot Last Guardian.
Truth: Epic already said they copied Tomb Raider right down to the environment, the artifacts, the character design and the markings on the rocks. But of course, nothing can ever come out of your mouth that credits anything unless it's attributed in some shape or form to Sony.
Lie #12: The editor is less performance-heavy than a fully cooked/compiled EXE.
Truth: Debunked by Daniel Wright, who ran the PS5 demo on his PC and said the editor is heavier.
Lie #13: The pixel rate on the PS5 is higher than on PC RTX GPUs, which is why the PS5 demo ran at 1440p instead of 1080p. Later, the same argument, but now Lumen and Nanite running on the Tempest audio engine is the reason it ran at 1440p last year.
Truth: Brian Karis (the creator of Nanite), Daniel Wright (the creator of Lumen) and several other Epic Games engineers and project managers on the demos have already come out and said the Valley demo is WAYYYYYY more performance-heavy than the PS5 demo, which is why it ran at 1080p vs 1440p.
Lie #15: Like this new world-area-grid solution in the pre-release, which Brian said wasn't in the 2020 PS5 demo but is in what he showed on his work-from-home PC. He also stated that the code is different in the 2020 version, because they didn't have the world grid partition system implemented - or whatever UE5 specifically calls it.
Truth: No, he didn't. He said the PS5 demo DOES NOT use the World Partition and Data Layers systems, because they hadn't been created when that demo was made. The demo hasn't been altered; it wasn't recreated to use the new system. It's still using the old UE4 style of separate levels. This is confirmed by an Epic Games engineer, and by looking at the different maps/levels shown when both Brian and Daniel open the file browser inside UE5.
Other lies are just as ridiculous and totally made up as the rest. I could keep going, but they are endless!
Lie #16: The editor runs on consoles.
Lie #17: Brian Karis had a 10k computer to run the PS5 demo.
Lie #18: Lumen and Nanite ran on the PS5's Tempest Engine 3D audio chip, because the Lumen in the Land of Nanite demo is called Reverb.
We'd have to see the other demo running exactly the same to back up those conclusions.
The original demo used all the UE5 systems in tandem and really stressed Lumen hard with camera speed and internal lighting. Today's presentation had a 4.5-second loading screen, focused on lighting Nanite geometry with sunlight and animated cloud cover, and was mostly rendering two characters that aren't Nanite geometry (the character/boss); and the large boss cinema-level asset was so high above the floor and other Nanite geometry that lighting interactions with any Nanite-mesh parts of the model were not possible (AFAIK).
The statement is exactly as it is to avoid any inference between PC min specs, XSX and PS5, but that seems to be exactly what you are doing. All the complex volumetric simulations, etc. that are missing from this demo were present - with much better character animations - in the PS5 demo, which ran at 1440p capped at 30fps. And there's text/video confirming they expected to optimise Lumen - the resource hog - to run at 60fps. All of that says this demo is lightweight to run on the PS5 - and probably XSX too - because of the in-GPU streaming-pool compression, and the compressed streaming in and out of the SSD, as mentioned in the slides of the UE5 demo Unrealfest videos.
There is nothing to look at in the 2nd demo. Where is all the stuff mentioned in the Unrealfest videos that the PS5 does, like Brian mentioning the complex simulation for the huge blue vortex at the end of the PS5 demo? Or the complex creature behaviours reacting to the spotlight, etc., etc.?
This recent demo is pretty vacant and dull, and nothing like something from an actual game (IMHO) - but playing mostly exclusives on platforms, I might be out of touch with a more varied quality bar. The other demo, for PS5, looks a little bit Uncharted, and a lot Last Guardian - two actual game scenarios I can already see being fabricated into existing IPs.
It wasn't lowered; 1080p30 was the target for the editor (without stuttering) for that demo on PC, XSX, PS5 and an old PC Maxwell GPU. No inference is made about top console capability, from what I recall, and if you continue to say it does, then feel free to bring the transcribed text from the video's audio that proves otherwise.
The part I bolded - you can't be real about that, are you? Do you even understand the purpose of devkits vs testing on retail units?
What data? We can only benchmark 4.5-second, high-latency loading on a PC; we can't benchmark how the XSS compares to a 1080, or how a 3090 measures up to an XSX or PS5. But we know that Nanite and Lumen are ROP-bound, because the Unrealfest video slides tell us the Nanite/Lumen (SDF) work is done in just a fragment shader - i.e. ROP-bound. So with everything else being equal (IO latency, decompression, cache scrubbing, async compute), the performance of the original PS5 demo will scale by ROP count, as will this lesser demo.
No one is sucking up to anyone, and it is on you to transcribe it verbatim to prove your so-called PR fact.
Have you actually watched the full video of the first demo? We are talking a 10,000km draw distance - and then flying through it - cinema assets being GI-lit while streaming into view constantly, at 20m triangles per frame, giving constant IQ from models with millions of source triangles and three layers of 8K textures per unique model (IIRC), with all instances uniquely scaled. How big did they need to make the world for you?
Pretty sure I just finished watching nearly three hours of interesting stuff about Nanite. It dovetailed with lots of things the technical folks here at GAF have discussed - like Brian's comparison of mipmaps for textures and the difficulty of LoDs not having an equally simple filtering method, or him clarifying why Nanite meshes aren't generated in real time, which causes the limitations on non-uniform scaling and animation, etc., etc. - with lots that can be discussed in this and other threads. And this was your opening post for trolling people - and you weren't even bringing quotes of what people should apologize for.
If I ever said the demo definitely won't run on a PC in any form - like this new world-area-grid solution in the pre-release, which Brian said wasn't in the 2020 PS5 demo but is in what he showed on his work-from-home PC, which he also said had the entire Lumen in the Land of Nanite dataset preloaded in memory for his editor showcase - and you can find a quote, as those are the words you are attributing to me, then by all means, take a well-earned apology, if you can.
So much of the Twitch Nanite stream really drilled into the way Nanite taxes a GPU's ROPs and scales linearly with resolution, by being all done in rasterization (the fragment shader) - according to Brian - which I also tried to discuss for weeks without people like VFXVeteran and you acknowledging it.
So if we take a mid-range game clock for an RTX 3090 and do a quick estimate of pixel rate (ROPs x clock = 112 x 1.7GHz), we get ~190 billion/s; doing the same for the highest next-gen console pixel rate, we get ~140 billion/s. So, assuming console optimizations like cache scrubbers and compressed streaming out to the SSD made no difference in the first UE5 demo solution, an RTX 3090 PC should be able to manage Nanite at 1.35x the native resolution - 3,456 x 1,895 @ 30fps - at the same fidelity settings, AFAIK.
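For what it's worth, the back-of-the-envelope estimate above can be sketched as code. Note that if the ~1.35x pixel-rate advantage is applied to total pixel count, each dimension only scales by its square root; the ROP counts and clock speeds below are the assumptions from this post, not verified specs:

```python
import math

def pixel_rate(rops, clock_ghz):
    """Peak pixel fill rate in gigapixels/s: ROPs x clock."""
    return rops * clock_ghz

def scaled_resolution(base_w, base_h, rate_ratio):
    """Scale a resolution so the total pixel count grows by rate_ratio."""
    s = math.sqrt(rate_ratio)  # per-dimension scale factor
    return round(base_w * s), round(base_h * s)

rtx3090 = pixel_rate(112, 1.7)    # ~190 Gpix/s (assumed mid-range game clock)
console = pixel_rate(64, 2.23)    # ~143 Gpix/s (assumed 64 ROPs at 2.23 GHz)
ratio = rtx3090 / console         # ~1.33x

# Scaling 2560x1440 by the pixel-rate ratio (area-wise, not per-axis)
print(scaled_resolution(2560, 1440, ratio))
```

The point of the sketch is only the scaling logic: a fill-rate-bound workload buys you extra pixels in proportion to the rate ratio, not extra width and height in that proportion.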
I might be mistaken, but his editor was showing 4096px in the top-right menu for something, so I'd presume that was for textures, as virtual shadow maps are 16K according to the video. So if the textures were 4 times the size (and storage) for the 2020 real-time gameplay shown, and in 2020 they didn't have the world grid system of the new UE5 pre-release demo - as stated in the Twitch Nanite video - there's every possibility the PS5 version was streaming data heavily throughout, even if the editor mode he showed was all running from RAM, as he stated. The 2020 demo slides in an Unrealfest video do state the PS5 was stream-compressing out to the PS5's SSD, as well as compressing geometry on the GPU in RAM.
In the context of an "Inside Nanite" Twitch video, it is the exact same demo with the same geometric "nanite" assets in the editor, so my belief is that both statements are true. Some things would be better on PS5 - say textures, where latency matters - and other things would be better with more pixel rate on a top-of-the-line PC GPU, because Nanite's limiting factor in key bottlenecked areas is pixel rate.
He also stated that the code is different in the 2020 version because they didn't have the world grid partition system implemented - or whatever UE5 specifically calls it.
We also don't know what type of WFH PC he has; it isn't beyond the realms of possibility that it's one of those £10k cards AMD did with an SSD on the side.
They are the same solution, from the info we had when RTX IO first surfaced, and will have the same latency as each other - not throughput, which is decompression rate. So not a 100x latency improvement, but a 20x latency improvement, like XVA's technology showed in its info reveal many months back, before the RTX IO board-slide info.
edit: On the 4096px issue, look at the complex physics vortex simulation (G-buffer?) texture at the end of the Land of Nanite scene in his editor. It looks to be as it is before being fully triggered, but the texture is a fraction of the portal size - maybe a quarter, or less than a quarter; hard to remember exactly without checking the 3hr stream.
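Going back to the RTX IO point: the latency-vs-throughput distinction can be made concrete with a toy model (all numbers here are hypothetical placeholders, not measured figures for RTX IO, XVA or DirectStorage). Total fetch time is a fixed request latency plus size divided by decompression throughput, so a 20x latency cut mostly helps many small reads, while one big streamed blob stays throughput-bound:

```python
def load_time_ms(size_mb, latency_ms, throughput_mb_s):
    """Total time to fetch an asset: fixed request latency plus transfer time."""
    return latency_ms + (size_mb / throughput_mb_s) * 1000.0

# Hypothetical numbers for illustration only.
small = load_time_ms(0.064, latency_ms=5.0, throughput_mb_s=5000)       # latency-dominated
small_fast = load_time_ms(0.064, latency_ms=0.25, throughput_mb_s=5000) # 20x lower latency
big = load_time_ms(2048, latency_ms=5.0, throughput_mb_s=5000)          # throughput-dominated

print(small, small_fast, big)
```

For the tiny 64KB request the 20x latency cut is nearly a 20x speedup, but for the 2GB blob it shaves a fraction of a percent - which is why latency and decompression rate have to be argued about separately.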
I was confused too, but having read the quote in context with the surrounding text, I believe he means the performance "requirements" in the editor are lower - but sadly the word "requirements" is missing, so the sentence by itself says something completely different. And if that word isn't intended, then in context the text reads like it is contradicting itself.
Because he specifically mentions the additional memory burden of the editor, as though the next sentence should, by comparison, say lower performance requirements. But I've no idea now. It looks like a rushed response by someone trying to stop their words wrongly setting fire to a forest.