
Unreal Engine 5 Deep Dive on May 26th

Yep. I feel like I am going crazy because I am blown away by these graphics. Gonna post these gifs here again to restore some sanity.

[gif: rYg9Qdz.gif]

[gif: 00m4Rx0.gif]

looks great

don’t see any “downgrade”

but at the same time, there's nothing more impressive than what was shown last year, so it's not quite as impactful... just more of the same
 

Loope

Member
just tell them ps5 is the second coming of christ. i am actually worried for their health now.
Well, it's easier, yes. And that's what is grating as shit over here: the constant need to say that a console is superior even to a PC, and if the PC is superior, then it's a 5000€~10000€ PC. There was a time when they would throw out 1500€ or 2000€; now they use big numbers to make it look like a fucking Fiat Punto is competing with a Ferrari.
 
Well, it's easier, yes. And that's what is grating as shit over here: the constant need to say that a console is superior even to a PC, and if the PC is superior, then it's a 5000€~10000€ PC. There was a time when they would throw out 1500€ or 2000€; now they use big numbers to make it look like a fucking Fiat Punto is competing with a Ferrari.
I had a feeling today would go down like this, but I was hopeful there would be a chance to just talk about the demo, the tech behind it, and what we can possibly expect. But no, the usual suspects pop in and have to make the thread all about themselves, and how PC/Xbox can't do this, can't do that, downgrades, etc. I can't stand the fanboys of that camp, because they are the most immature of all gamers. A bunch of Debbie Downers if you ask me.

I wonder how long it will take the devs to implement it into Fortnite (not that I would ever play that shit), but it would be interesting to see if there's a big performance jump on lower-end hardware while being able to enable higher settings. I'd love to see how it performs on phones too, as we have come a long way with mobile SoCs.
 

Jon Neu

Banned
Are some of you crazy? I read the posts before watching it and was prepared to be disappointed by the "downgrade", but instead I was just as amazed as I was at the first demo. It still looks unbelievable. Don't tell me you won't shit your pants if games really end up looking like this in a few years. I really hope I get to play games that look like this on these consoles.

In fact, it looks quite a bit better than the first demo.

The Ancient boss was impressive, more impressive than static statues for sure.
 

mortal

Gold Member
It's all worthless fluff if nobody utilizes it. In a game.

It doesn't matter if the effects are TOTALLY AWESOME if it can't be achieved with realistic performance targets.

It's all just cock tease until it's all in a game.
You do realize that Unreal Engine is also utilized by many animators and animation studios? It's not solely for game development.

Doesn't seem like you're the target audience for this sort of showcase, at least inferring from your posts ITT.
That's fine, but I don't understand your contention here when this is very clearly aimed at game developers, 3D artists and animators, or even enthusiasts interested in the technical and creative process of game design.
 
They use a proxy Nanite mesh, or you can fall back on the traditional LOD system for that.
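For anyone wondering what the traditional LOD fallback actually does: the engine swaps in coarser versions of a mesh as its projected screen size shrinks. Here's a minimal toy sketch in C++ of that selection logic (my own made-up types and thresholds, not Epic's API):

#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

// One LOD entry: a triangle budget and the minimum on-screen size
// (as a fraction of viewport height) at which it is used.
struct Lod { int triangles; float minScreenSize; };

int PickLod(const std::vector<Lod>& lods, float boundsRadius, float distance, float fovY)
{
    // Approximate on-screen height of the bounding sphere as a viewport fraction.
    float screenSize = boundsRadius / (distance * std::tan(fovY * 0.5f));
    for (std::size_t i = 0; i < lods.size(); ++i)
        if (screenSize >= lods[i].minScreenSize)
            return (int)i;                 // first (most detailed) LOD that qualifies
    return (int)lods.size() - 1;           // coarsest LOD as the last resort
}

int main()
{
    std::vector<Lod> lods = { {100000, 0.5f}, {20000, 0.1f}, {2000, 0.0f} };
    std::printf("LOD at 5m: %d, LOD at 100m: %d\n",
                PickLod(lods, 1.0f, 5.0f, 1.57f),
                PickLod(lods, 1.0f, 100.0f, 1.57f));
}

Nanite replaces exactly this kind of discrete popping between hand-made LODs with continuous cluster-level detail selection, which is why the fallback only matters where Nanite isn't supported.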

Also link to the Lumen doc page.

The documentation actually tells us quite a lot about the things that have been argued over.
They are even talking about storage size as the limiting factor now, rather than latency and data throughput.

General Advice on Data Size and a Look to the Future​

Nanite and Virtual Texturing systems, coupled with fast SSDs, have lessened concern over runtime budgets of geometry and textures. The biggest bottleneck now is how to deliver this data to the user.

Data size on disk is an important factor when considering how content is delivered — on physical media or downloaded over the internet — and compression technology can only do so much. Average end user's internet bandwidth, optical media sizes, and hard drive sizes have not scaled at the same rate as hard drive bandwidth and access latency, GPU compute power, and software technology like Nanite. Pushing that data to users is proving challenging.

Rendering highly detailed meshes efficiently is less of a concern with Nanite, but storage of its data on disk is now the key area that must be kept in check. Outside of compression, future releases of Unreal Engine should see tools to support more aggressive reuse of repeated detail, and tools to enable trimming data late in production to get package size in line, allowing art to safely overshoot their quality bar instead of undershoot it.

Looking to the future development of Nanite, many parallels can be drawn to how texture data is managed that has had decades more industry experience to build, such as:

  • Texture tiling
  • UV stacking and mirroring
  • Detail textures
  • Texture memory reports
  • Dropping mip levels for final packaged data
Similar strategies are being explored and developed for geometry in Unreal Engine 5.
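That last bullet, dropping mip levels for packaged data, is easy to put numbers on: each mip is a quarter the size of the one above it, so shipping without the top mip shrinks a texture's packaged size roughly 4x. A quick back-of-envelope in C++ (my own arithmetic, not engine tooling):

#include <cstdio>

// Total bytes in a square mip chain, optionally dropping the top N mips.
double ChainBytes(int topDim, int bytesPerTexel, int droppedMips)
{
    double total = 0;
    for (int dim = topDim >> droppedMips; dim >= 1; dim >>= 1)
        total += (double)dim * dim * bytesPerTexel;   // each mip is 1/4 the previous
    return total;
}

int main()
{
    double full    = ChainBytes(4096, 4, 0);  // 4K RGBA8 texture with full chain
    double trimmed = ChainBytes(4096, 4, 1);  // same texture shipped without the 4K mip
    std::printf("full: %.1f MB, top mip dropped: %.1f MB (%.1fx smaller)\n",
                full / 1e6, trimmed / 1e6, full / trimmed);
}

The interesting part of the doc is that Epic wants equivalent trimming tools for Nanite geometry, so artists can overshoot their quality bar during production and cut the data back at package time.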
 

CamHostage

Member
The project includes the source assets, uncompressed and all that. Quixel megascans are huge files. On top of that, there are the super high resolution base models. Games will be much more compressed than an unzipped demo.

Ratchet and Clank is probably hundreds of gigs in a project file, but packaged up for PS5 is like 30-40 after installation.

Yes, although part of the point of the demo is that it ships the super high resolution base models. We're looking at 100GB on screen here.

[image: community-1920x1080-3b9984ee9c6b.jpg]


And although it's full-scale, original Quixel megascan data, it's probably not a lot of different Quixel scans. They're taking a few huge megascans and twisting them, shrinking them, or zooming in on them, making pebbles out of boulders and building mountains out of one rock; they're getting the most out of each of the big megascans included in the sample. Which is what you'd do when making a game, so they're already optimizing a bit, like you would in a game build. The assets here are uncompressed, but it's also only a small chunk of world and two characters, and that's right now 100GB to "play" it.
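That reuse trick is also why the number of unique scans matters far more than the number of placements: the heavy geometry is stored once, and each extra placement costs roughly one transform. A toy sketch of the disk math (types and sizes are invented for illustration, not UE5's):

#include <cstddef>
#include <cstdio>
#include <vector>

struct MeshAsset { std::size_t bytesOnDisk; };        // one scanned rock, stored once
struct Instance  { float transform[16]; int mesh; };  // a 4x4 matrix per placement

int main()
{
    MeshAsset rock { 800u * 1024u * 1024u };          // say one raw scan is 800 MB
    std::vector<Instance> placements(10000);          // pebbles, boulders, cliffs...
    std::size_t instanceBytes = placements.size() * sizeof(Instance);
    std::printf("geometry: %zu MB stored once; 10000 placements add only %zu KB\n",
                rock.bytesOnDisk >> 20, instanceBytes >> 10);
}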

The megascans are purposely left as-is in the UE5 sample demo (so that developers can play around themselves with the assets that actually made this demo). One of the interesting experiments with the UE5 Early Access kit would be to export a viable, optimized version of the game. (Quixel megascans themselves are not new, so that part is probably easy to calculate for anyone who has used megascans in UE4 projects.) And Nanite mesh models save a ton with their approach, so that's a plus in savings there.

But I'm still balking at that file size for now...

I wonder how long it will take the devs to implement it into Fortnite (not that I would ever play that shit), but it would be interesting to see if there's a big performance jump on lower-end hardware while being able to enable higher settings. I'd love to see how it performs on phones too, as we have come a long way with mobile SoCs.

There's not a lot of talk about what UE5 can do beyond its next-gen advancements (Lumen and Nanite will only be on next-gen-spec hardware), so I'm expecting the Fortnite UE5 rollout to be as much a wisp for gamers as the Chaos physics engine integration was a while back. Fortnite will be Fortnite.

But I would be interested to know more or see some examples of what UE5 actually does for past-gen-spec platforms (if anything?). Lots of the improvements in UE5 are in the toolkit rather than the "game engine" (tech like MetaSounds, improved control rigs, and the new World Partition system, which would help in making any game or cross-platform game, though your end result would still have to include some compromises when producing a build below next-gen spec). I do think UE5 will be good news for lesser platforms as well, just not nearly as exciting (obviously) as on next-gen hardware.
 

geary

Member
Do you think 16GB of RAM will be enough from 2022 onward? Or should people aim for 32GB to match recommended specs?
 

elliot5

Member
Lumen supports Hardware Ray Tracing, and that works fine with Nanite.


Rendering​

The following rendering features are not currently supported:
  • Raytracing against the fully detailed Nanite mesh
    • Ray Tracing features are supported but rays intersect the coarse representation (called a proxy mesh) instead of the fully detailed Nanite mesh

[image: nanite-sme-proxymesh.webp — example coarse proxy mesh]

This is an example of a low-tri coarse proxy mesh. I'm not a graphics/lighting engineer or artist, so I don't know how likely it is that RT will be used if the visual quality is degraded by tracing rays against this kind of mesh.

I know Nanite works with Lumen, and Lumen can work with RT, but it seems Lumen + Nanite + RT all at once may have some limitations or unsupported features. Lumen itself doesn't require RT.
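To make the proxy-mesh point concrete: the ray test itself is the standard one, it's just run against far fewer, larger triangles, so the reported hit point and normal come from the simplified surface rather than the real micro-detail. A self-contained toy example (classic Möller–Trumbore intersection; my own code, not Lumen's):

#include <cmath>
#include <cstdio>

struct V3 { float x, y, z; };
static V3    sub(V3 a, V3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static V3    cross(V3 a, V3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
static float dot(V3 a, V3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Möller–Trumbore ray/triangle test: returns hit distance t, or -1 on a miss.
static float RayTri(V3 o, V3 d, V3 a, V3 b, V3 c)
{
    V3 e1 = sub(b, a), e2 = sub(c, a);
    V3 p = cross(d, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < 1e-8f) return -1;   // ray parallel to the triangle
    float inv = 1.0f / det;
    V3 t = sub(o, a);
    float u = dot(t, p) * inv;
    if (u < 0 || u > 1) return -1;
    V3 q = cross(t, e1);
    float v = dot(d, q) * inv;
    if (v < 0 || u + v > 1) return -1;
    float dist = dot(e2, q) * inv;
    return dist >= 0 ? dist : -1;
}

int main()
{
    // One big flat triangle standing in for thousands of Nanite triangles:
    // any ray landing on it reports this smooth surface, losing the fine detail.
    V3 a{-1, 0, 5}, b{1, 0, 5}, c{0, 2, 5};
    std::printf("hit t = %f\n", RayTri({0, 0.5f, 0}, {0, 0, 1}, a, b, c));
}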
 

Bo_Hazem

Banned
Also:

AMD said: Temporal Super Resolution: TSR is a new technique of upscaling a lower resolution frame to maximize performance and visual fidelity. AMD has worked closely with Epic to optimize TSR features on Radeon™ powered systems. A standard feature of UE5, TSR is enabled for all GPUs and provides state-of-the-art upscaling not just on PC, but on PlayStation 5, and Xbox Series X/S, too.

All that glory, according to the UE5 video, was running at 4K with a 1080p budget...
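To put "4K with a 1080p budget" into numbers (just the pixel arithmetic, not how TSR actually reconstructs the image):

#include <cstdio>

int main()
{
    long long internalPx = 1920LL * 1080;  // pixels the GPU actually shades each frame
    long long outputPx   = 3840LL * 2160;  // pixels that end up on screen
    std::printf("shaded: %lld px, output: %lld px -> %.1fx fewer pixels shaded\n",
                internalPx, outputPx, (double)outputPx / internalPx);
}

The missing three quarters of the output pixels are reconstructed from previous, jittered frames, which is the "temporal" part of Temporal Super Resolution.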

nVidia and DLSS 2.0...

[image: Hide the Pain Harold]
 

Rendering​

The following rendering features are not currently supported:
  • Raytracing against the fully detailed Nanite mesh
    • Ray Tracing features are supported but rays intersect the coarse representation (called a proxy mesh) instead of the fully detailed Nanite mesh

[image: nanite-sme-proxymesh.webp — example coarse proxy mesh]

This is an example of a low-tri coarse proxy mesh. I'm not a graphics/lighting engineer or artist, so I don't know how likely it is that RT will be used if the visual quality is degraded by tracing rays against this kind of mesh.

I know Nanite works with Lumen, and Lumen can work with RT, but it seems Lumen + Nanite + RT all at once may have some limitations or unsupported features. Lumen itself doesn't require RT.

From their documentation:
Lumen uses multiple ray tracing methods to solve Global Illumination and Reflections. Screen traces are done first, followed by a more reliable method.

Lumen uses Software Ray Tracing through Signed Distance Fields by default, but can achieve higher quality on supporting video cards when Hardware Ray Tracing is enabled.

And yes, UE5's Nanite and Lumen still have lots of issues and "unsupported" cases.
They are still working on the engine, and they will keep working on those issues for years to come.
Still, both features are great ways to accomplish their goal across all devices/platforms.
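Roughly, the trace order described there looks like this (my own C++ pseudocode with invented names, not Lumen's implementation):

#include <cstdio>

enum class TraceResult { Hit, Miss };

static TraceResult ScreenTrace()   { return TraceResult::Miss; } // ray left the screen
static TraceResult SdfTrace()      { return TraceResult::Hit;  } // software: signed distance fields
static TraceResult HardwareTrace() { return TraceResult::Hit;  } // optional: hardware ray tracing

static TraceResult LumenStyleTrace(bool hardwareRtEnabled)
{
    // 1. Cheapest first: march against the depth buffer (screen trace).
    if (ScreenTrace() == TraceResult::Hit)
        return TraceResult::Hit;
    // 2. Fall back to the more reliable world-space method:
    //    SDF tracing by default, hardware RT on supporting GPUs.
    return hardwareRtEnabled ? HardwareTrace() : SdfTrace();
}

int main()
{
    std::printf("%s\n", LumenStyleTrace(false) == TraceResult::Hit
                        ? "resolved via the SDF fallback path" : "miss");
}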
 

Tripolygon

Banned

Rendering​

The following rendering features are not currently supported:
  • Raytracing against the fully detailed Nanite mesh
    • Ray Tracing features are supported but rays intersect the coarse representation (called a proxy mesh) instead of the fully detailed Nanite mesh


This is an example of a low-tri coarse proxy mesh. I'm not a graphics/lighting engineer or artist, so I don't know how likely it is that RT will be used if the visual quality is degraded by tracing rays against this kind of mesh.

I know Nanite works with Lumen, and Lumen can work with RT, but it seems Lumen + Nanite + RT all at once may have some limitations or unsupported features. Lumen itself doesn't require RT.
That's how current RT works: rays are traced against lower-detail models at lower resolution.
 

OverHeat

« generous god »
Also:

AMD said: Temporal Super Resolution: TSR is a new technique of upscaling a lower resolution frame to maximize performance and visual fidelity. AMD has worked closely with Epic to optimize TSR features on Radeon™ powered systems. A standard feature of UE5, TSR is enabled for all GPUs and provides state-of-the-art upscaling not just on PC, but on PlayStation 5, and Xbox Series X/S, too.

All that glory, according to the UE5 video, was running at 4K with a 1080p budget...

nVidia and DLSS 2.0...

[image: Hide the Pain Harold]
hahahahaha
 

Corndog

Banned
"Valley of the Ancient is a separate download of around 100 GB. If you want to run the full demo, the minimum system requirements are an NVIDIA GTX 1080 or AMD RX Vega 64 graphics card or higher, with 8 GB of VRAM and 32 GB of system RAM. For 30 FPS, we recommend a 12-core CPU at 3.4 GHz, with an NVIDIA RTX 2080 or AMD Radeon 5700 XT graphics card or higher, and 64 GB of system RAM. We have successfully run the demo on PlayStation 5 and Xbox Series X consoles at full performance."



You think it's not because they had to make it work for a slower SSD and I/O?
No. So I guess some people need to eat some crow.
 

DenchDeckard

Moderated wildly
What's a bit upsetting is, I honestly think they were bullshitting with the PS5 demo from last year. There's absolutely a drop in the pure number of elements on screen, and I'm not savvy enough to really comment, so maybe I'm wrong, but I deffo feel like there is something fishy going on here. Why wouldn't they just release the flying demo if current hardware can actually run it the same?

unless I’m imagining the demo wrong...gonna go watch it.
 
I am curious now, though, how Nanite stores deformed geometry. For example, an explosion that creates a crater: will it just be a temporary change, or will it be possible to store the terrain deformation, and at what cost?
 

VFXVeteran

Banned
OMG, Nanite is so completely limited... I can't believe it doesn't support the following:

The following rendering features are not currently supported:
  • View-specific filtering of objects using:
    • Scene Capture with:
      • Hidden Components
      • Hidden Actors
      • Show Only Components
      • Show Only Actors
    • Minimum Screen Radius
    • Distance culling
    • Anything filtered by FPrimitiveSceneProxy::IsShown()
  • Forward Rendering
  • Stereo rendering for Virtual Reality
  • Split Screen
  • Multisampling Anti-Aliasing (MSAA)
  • Lighting Channels
  • Raytracing against the fully detailed Nanite mesh (only the proxy mesh, not the full detailed one).

I don't see this tech being used much at all this generation. It has so many limitations.
 

Bo_Hazem

Banned
What's a bit upsetting is, I honestly think they were bullshitting with the PS5 demo from last year. There's absolutely a drop in the pure number of elements on screen, and I'm not savvy enough to really comment, so maybe I'm wrong, but I deffo feel like there is something fishy going on here. Why wouldn't they just release the flying demo if current hardware can actually run it the same?

For example, that first statue in the first demo was 33M triangles (and around 500 of them were placed in that room, roughly 16 billion source triangles), while this massive giant is just 15M. The first demo was mostly an extreme stress test; this one is more "realistic," and the overall perception is similar.

Also that boss fight seems to have heavy FPS drops.

Even this one is still very hard to make with realistic game sizes.
 

elliot5

Member
I am curious now, though, how Nanite stores deformed geometry. For example, an explosion that creates a crater: will it just be a temporary change, or will it be possible to store the terrain deformation, and at what cost?
Not all meshes have to be Nanite-enabled meshes. From what I read, I don't think deforming or dynamic meshes (like skeletal meshes) can work with Nanite right now. That's why the elemental being had static meshes attached to a skeleton.
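That distinction is easy to show in code: a rigidly attached piece moves every vertex by one bone transform, so the dense Nanite geometry itself never deforms, while true skinning blends several bone transforms per vertex (the part Nanite reportedly can't do yet). A toy sketch of the rigid case (my own types, not UE5's):

#include <cstdio>

struct V3 { float x, y, z; };

// Rigid attachment: one 3x4 affine bone transform applied to the whole piece.
static V3 RigidAttach(V3 v, const float bone[12])
{
    return { bone[0]*v.x + bone[1]*v.y + bone[2]*v.z  + bone[3],
             bone[4]*v.x + bone[5]*v.y + bone[6]*v.z  + bone[7],
             bone[8]*v.x + bone[9]*v.y + bone[10]*v.z + bone[11] };
}

int main()
{
    float bone[12] = { 1,0,0, 5,   0,1,0, 0,   0,0,1, 0 }; // translate +5 in x
    V3 out = RigidAttach({1, 2, 3}, bone);
    std::printf("rigid piece vertex -> (%g, %g, %g)\n", out.x, out.y, out.z);
}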
 

GreyHand23

Member
OMG, Nanite is so completely limited... I can't believe it doesn't support the following:

The following rendering features are not currently supported:
  • View-specific filtering of objects using:
    • Scene Capture with:
      • Hidden Components
      • Hidden Actors
      • Show Only Components
      • Show Only Actors
    • Minimum Screen Radius
    • Distance culling
    • Anything filtered by FPrimitiveSceneProxy::IsShown()
  • Forward Rendering
  • Stereo rendering for Virtual Reality
  • Split Screen
  • Multisampling Anti-Aliasing (MSAA)
  • Lighting Channels
  • Raytracing against the fully detailed Nanite mesh (only the proxy mesh, not the full detailed one).

I don't see this tech being used much at all this generation. It has so many limitations.

Seems a bit overdramatic considering this is an Early Access release. Some of those features may be supported in the final release.
 