
News The Unreal 5 Demo on the PS5 Used Software Ray-Tracing Similar to ReShade’s Ray-Tracing Shader (Ray Traced GI)

Lunatic_Gamer

Member
Jan 15, 2018
258
1,671
450
Denver, CO
The Unreal Engine 5 demo that was recently shown off running on the PS5 started a slew of discussions across different forums and media. This includes the resolution and frame rates of the demo, the technologies used, and how it compares against contemporary PC hardware.

Today we are going to look at another such facet of the PS5 demo, namely the lighting. Epic has been touting its Lumen GI as a next-generation global illumination technology, and now we have a better idea of how it actually works. In an interview, Daniel Wright, the Technical Director of Graphics at Epic, said:

Lumen uses ray tracing to solve indirect lighting, but not triangle ray tracing. Lumen traces rays against a scene representation consisting of signed distance fields, voxels, and height fields. As a result, it requires no special ray tracing hardware.

Lumen uses a combination of different techniques to efficiently trace rays. Screen-space traces handle tiny details, mesh signed distance field traces handle medium-scale light transfer and voxel traces handle large scale light transfer.
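
For anyone wondering what "tracing rays against signed distance fields" means in practice, here is a minimal C++ sketch of sphere tracing, the usual way to march a ray through an SDF. This is illustrative only, not Epic's code; the SceneSDF function is a made-up placeholder standing in for Lumen's per-mesh distance fields (medium scale) or its coarser voxel/global fields (large scale).

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Hypothetical scene SDF: signed distance from point p to the nearest surface.
// Here it is just a unit sphere at the origin; a real engine would sample a
// per-mesh distance field volume or a coarser global voxel field instead.
float SceneSDF(const Vec3& p)
{
    return std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z) - 1.0f;
}

// Sphere tracing: repeatedly step along the ray by the distance to the nearest
// surface. That step can never overshoot, and once the distance gets small
// enough we treat it as a hit. No triangles and no RT hardware involved.
bool SphereTraceSDF(Vec3 origin, Vec3 dir, float maxDist, float& hitDist)
{
    float t = 0.0f;
    for (int i = 0; i < 64 && t < maxDist; ++i)
    {
        Vec3 p { origin.x + dir.x * t, origin.y + dir.y * t, origin.z + dir.z * t };
        float d = SceneSDF(p);
        if (d < 0.001f) { hitDist = t; return true; }  // close enough: surface hit
        t += d;                                         // advance by the safe distance
    }
    return false;  // ray left the search range without hitting anything
}
```

Lumen then layers its three trace types: screen-space traces cover the first short segment of each ray, a mesh-SDF trace like the one above covers the mid range, and a voxel trace covers the rest of the scene.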

 
Last edited:

Kenpachii

Member
Mar 23, 2018
4,472
4,742
620
Yea, already guessed this when those boxes got revealed. It's the only way consoles can do raytracing, as the real thing simply is too demanding.

I hope we see a better push on the PC front with this as well. However, I think they will just sit with real raytracing and call it a day. Sadly.
 
Last edited:
  • Thoughtful
Reactions: DoctaThompson

pawel86ck

Member
Jan 27, 2018
1,461
1,975
400
In ReShade, the RT lighting is constantly fluctuating when you are moving, and I haven't noticed anything like that in the UE5 tech demo.
 

mckmas8808

Member
May 24, 2005
43,473
6,967
1,835
Yea, already guessed this when those boxes got revealed. It's the only way consoles can do raytracing, as the real thing simply is too demanding.

I hope we see a better push on the PC front with this as well. However, I think they will just sit with real raytracing and call it a day. Sadly.
I'd be curious how many people will be able to tell the difference in real-time.

In ReShade, the RT lighting is constantly fluctuating when you are moving, and I haven't noticed anything like that in the UE5 tech demo.
It happens in the demo. You just have to slow the video down by 50% to notice it.
 

M1chl

Gold Member
Dec 25, 2019
2,411
3,370
620
Czech Republic
I'd be curious how many people will be able to tell the difference in real-time.
Well, to be honest, many people even say the old approach, like for example Metro: Exodus, looks better, because devs got too good at "faking" things... But they don't recognize how much it streamlines development when you no longer have to check whether everything is faked as well as it could be.
 
Nov 5, 2016
21,825
39,005
1,085
One Big Room, Full Of Bad Bitches
I know a bunch of us have been hoping that ReShade, or something like it, would be available on next-gen consoles. I would use it all the time for a little sharpening and color saturation tweaks. I know this isn't saying "ReShade on console" but being compared to how ReShade works is still kinda exciting for me. Can improve image quality for minimal performance hits.

As far as Lumen, I tend to agree with that one analysis that estimated the PS5 will get more out of the Nanite tech due to SSD bandwidth and the XsX will get more out of Lumen due to GPU capability. Seems reasonable that the hardware with faster storage bandwidth (PS5) will get more out of asset streaming scalability (Nanite), and the XsX with more GPU power will get more out of RT effects (Lumen).
 
Jan 29, 2019
2,746
2,181
375
What is impressive is that this demo is made with a software solution, not hardware. It hasn't been pointed out enough.
Those shaders are still running on hardware accelerated shading engines made for that kind of thing.
Well, to be honest, many people even say the old approach, like for example Metro: Exodus, looks better, because devs got too good at "faking" things... But they don't recognize how much it streamlines development when you no longer have to check whether everything is faked as well as it could be.
It will streamline development when developers and engine developers don't have to worry about baking in shadows, etc.

I think Control is the only truly convincing implementation of RT I have seen so far, and I'm fairly sure I would not use it if I had a 2080ti laying around, because of the performance hit.
 

M1chl

Gold Member
Dec 25, 2019
2,411
3,370
620
Czech Republic
Those shaders are still running on hardware accelerated shading engines made for that kind of thing.

It will streamline development when developers and engine developers don't have to worry about baking in shadows, etc.

I think Control is the only truly convincing implementation of RT I have seen so far, and I'm fairly sure I would not use it if I had a 2080ti laying around, because of the performance hit.
Ohh especially with Control DLSS 2.0 works like a charm : )
I'VE SEEN MOTHERFUCKING LIGHT

DLSS (60 FPS, VSYNC, >70% GPU utilisation) [screenshot]

NATIVE (approx. 40 FPS) [screenshot]

2080 Ti

Just HOW?
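
For context on those numbers (taking the captions above at face value), the rough frame-time arithmetic looks like this; the figures are approximate and assume nothing else changed between the two captures:

```cpp
#include <cstdio>

// Quick frame-time arithmetic for the comparison above; the FPS figures come
// from the screenshot captions, so treat the results as approximate.
int main()
{
    const double nativeFps = 40.0;  // "NATIVE (approx. 40 FPS)"
    const double dlssFps   = 60.0;  // "DLSS (60 FPS, VSYNC)"

    const double nativeMs = 1000.0 / nativeFps;  // 25.0 ms per frame
    const double dlssMs   = 1000.0 / dlssFps;    // ~16.7 ms per frame

    std::printf("native: %.1f ms/frame\n", nativeMs);
    std::printf("DLSS:   %.1f ms/frame\n", dlssMs);
    std::printf("freed:  %.1f ms/frame\n", nativeMs - dlssMs);  // ~8.3 ms of GPU time back
    return 0;
}
```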
 

pawel86ck

Member
Jan 27, 2018
1,461
1,975
400
I'd be curious how many people will be able to tell the difference in real-time.



It happens in the demo. You just have to slow the video down by 50% to notice it.
Maybe, but I haven't noticed any annoying artifacts while watching the UE5 tech demo at normal video speed.

What's interesting is that in the UE5 tech demo the character model is also lit by light sources not currently displayed on the screen. That's not the case with ReShade RT, and it looks to me like Lumen is much superior to ReShade.
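
That difference follows from how the two approaches gather light. A purely screen-space effect like ReShade's RT shader can only sample what is already on screen, so an off-screen emitter simply does not exist for it, whereas Lumen's distance-field and voxel traces work in world space. A rough, hypothetical C++ sketch of the screen-space limitation (none of this is ReShade's or Epic's actual code):

```cpp
struct Vec2  { float x, y; };
struct Color { float r, g, b; };

// Placeholder G-buffer lookups: a screen-space tracer only has the current
// frame's depth and color buffers to work with.
float SampleSceneDepth(Vec2 /*uv*/) { return 1.0f; }       // stub
Color SampleSceneColor(Vec2 /*uv*/) { return {0, 0, 0}; }  // stub

bool InsideScreen(Vec2 uv)
{
    return uv.x >= 0.0f && uv.x <= 1.0f && uv.y >= 0.0f && uv.y <= 1.0f;
}

// March a GI ray in screen space. The moment the ray leaves the viewport there
// is nothing left to sample, so the bounce contributes no light; this is why a
// character cannot be lit by a source that is not on screen. A world-space
// trace (distance fields / voxels) has no such cutoff.
bool TraceScreenSpace(Vec2 uv, Vec2 uvStep, float depth, float depthStep, Color& bounce)
{
    for (int i = 0; i < 32; ++i)
    {
        uv.x += uvStep.x; uv.y += uvStep.y;
        depth += depthStep;
        if (!InsideScreen(uv)) return false;  // off-screen: no information at all
        if (depth >= SampleSceneDepth(uv))    // ray passed behind visible geometry
        {
            bounce = SampleSceneColor(uv);    // gather light from the on-screen hit
            return true;
        }
    }
    return false;
}
```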
 

Kenpachii

Member
Mar 23, 2018
4,472
4,742
620
I'd be curious how many people will be able to tell the difference in real-time.



It happens in the demo. You just have to slow the video down by 50% to notice it.

Barely anybody, which is also the reason I said raytracing is useless if it takes a huge chunk of performance. Consoles don't even give two shits about something as simple as AF. Imagine raytracing.
 
Last edited:

Keihart

Member
Jun 23, 2013
3,022
1,057
665
IDK man, between the expensive-as-hell performance of ray tracing and the new dynamic GI system, the new GI system looks way better, and without the ugly noise from RT.
I think games are gonna be able to use a mix of this new dynamic GI with RT for things like reflections and whatnot. Seems way more viable than trying to RT everything.
 
  • Like
Reactions: mckmas8808

ethomaz

Gold Member
Mar 19, 2013
31,225
15,651
1,025
38
Brazil
Epic commented on that the day they showed the demo.
It's even in the DF article.
 
Last edited:

MasterCornholio

Gold Member
Mar 27, 2020
1,686
4,392
505
GI in the demo looked pretty nice to me. Full-blown ray tracing is a big resource hog, so I'm all for using GI to produce good lighting results. Using GI over full RT should allow developers to put the freed-up resources into other areas such as framerate or resolution.
 
Nov 5, 2016
21,825
39,005
1,085
One Big Room, Full Of Bad Bitches
GI in the demo looked pretty nice to me. Full-blown ray tracing is a big resource hog, so I'm all for using GI to produce good lighting results. Using GI over full RT should allow developers to put the freed-up resources into other areas such as framerate or resolution.
I could definitely see this being a godsend throughout this gen. I am not sure full-blown RT is worth the large resource expense for these consoles, especially if we can get a similar image with far less cost.
 
Last edited:
  • Like
Reactions: Investor9872

MasterCornholio

Gold Member
Mar 27, 2020
1,686
4,392
505
I could definitely see this being a godsend throughout this gen. I am not sure full-blown RT is worth the large resource expense for these consoles, especially if we can get a similar image with far less cost.
Don't get me wrong, that Minecraft path-tracing demo was impressive.

But Minecraft is a really old game with really basic geometry, and it was running at a variable 30-40 FPS at 1080p. Try bumping that up to next-gen resolutions and it becomes unplayable.

Now apply that same raytracing technique to more complicated next-gen titles.

I personally don't believe next gen is ready for full raytracing, which is why lighting techniques like global illumination can become extremely important.
 
  • Like
Reactions: mckmas8808

itsjustJEFF

Member
Apr 12, 2018
163
128
235
I feel like if it looks the same as, or close to, "real" raytracing at half the cost, or whatever, and because of that we get higher resolution or higher fps, I hope the devs go that route. I feel like we've fallen in love with raytracing and now anything less is unacceptable. PCs can barely do raytracing right now. I think the new cards will definitely be able to handle ray tracing, but at what cost, and how much implementation will there be? Will it be full raytracing: shadows, reflections, lighting etc.? Or will it just be a couple of things with less hit to performance?

I thought the lighting in the UE5 demo looked fantastic; it looked damn real to me. I also feel like we've fallen in love with 4K, but not just any 4K, it has to be native 4K. For years, PC players have been praising 1440p/60/120. I would like to see more devs go the route of better visuals, bigger worlds etc. at upscaled 4K/60 or better fps at least.
 
  • Praise the Sun
Reactions: mckmas8808

PaintTinJr

Member
Jan 30, 2020
294
769
275
If you have hardware-accelerated raytracing, why do you use software raytracing?
In an ideal situation the hardware would be completely versatile for general-purpose code, while providing performance equal to or better than fixed-path hardware. Committing algorithms to hardware isn't an ideal use of silicon if it doesn't offer any benefit, because the algorithm can't be improved upon without new hardware - hence the names 'hard'ware and 'soft'ware.

edit:
The software GI/ray-tracing in UE5 doesn't brute-force the way hardware-accelerated path tracing does, and does a better job performance-wise on static objects with tens of millions of polygons, therefore leaving the hardware RT free to be used, in addition, on selective (non-static/deformable/destructible) foreground geometry/models.
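
To make the "in addition" part concrete, here is a tiny hypothetical C++ sketch of how an engine could split the work: cheap software distance-field traces for the huge static background, with the limited hardware-RT budget reserved for the dynamic or deformable objects that baked distance fields represent poorly. This illustrates the idea only; it is not UE5's actual API.

```cpp
enum class Mobility { Static, Dynamic };

struct SceneObject
{
    Mobility mobility;
    bool     deformable;  // skinned / destructible meshes invalidate a precomputed SDF
};

enum class TraceMethod { SoftwareSDF, HardwareRT };

// Pick, per object, which representation GI rays should be traced against.
TraceMethod ChooseTraceMethod(const SceneObject& obj)
{
    if (obj.mobility == Mobility::Dynamic || obj.deformable)
        return TraceMethod::HardwareRT;   // triangle-accurate, rebuilt/refit every frame
    return TraceMethod::SoftwareSDF;      // precomputed distance field, very cheap to trace
}
```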
 
Last edited:

Nothing Unusual

Neo Member
Nov 29, 2019
14
12
90
Not gonna lie, while I'm sure the UE5 demo on PS5 has some impressive techno jargon going on, what I saw with my eyes didn't wow me at all. I would have easily mistaken it for any current PS4 title. That said, the old UE4 apartment demo or the Unity Book of the Dead demo looked far more amazing, so I'm having a hard time understanding why this particular demo was used to help hype a next-gen console. The only thing I see regarding UE5 is how it helps make photogrammetry much easier for them. Software ray-traced or not, I don't see how it looks any better than the lighting I saw in the Book of the Dead demo.
 

nikolino840

Member
Dec 30, 2018
2,532
2,201
515
36
Because it can run on everything from mobiles to super high-end PCs.
Epic said they will do platform optimization using hardware RT in the future.
Yeah, but months ago software ray tracing was blasphemy. I also remember a thread arguing about the "hardware-based" and "hardware-accelerated" wording.
As some users say, a mix is perfect... I don't know if it's possible...
 

Lady Bernkastel

Gold Member
Mar 8, 2018
2,313
4,186
760
Yeah, but months ago software ray tracing was blasphemy. I also remember a thread arguing about the "hardware-based" and "hardware-accelerated" wording.
As some users say, a mix is perfect... I don't know if it's possible...
It's this one
 

PaintTinJr

Member
Jan 30, 2020
294
769
275
Yeah, but months ago software ray tracing was blasphemy. I also remember a thread arguing about the "hardware-based" and "hardware-accelerated" wording.
As some users say, a mix is perfect... I don't know if it's possible...
Surely the criticism of software RT back then was that it was deemed inferior, without any 'win' to show in comparison to h/w RT? And technically speaking, if the level of models, 8K textures and level of light bouncing is only at that IQ in UE5 because of the hardware-accelerated IO unit in the PS5, then the GI/RT is indirectly h/w accelerated IMHO. The ability to take movie-level models and textures and real-time GI them for the entire background scene, with a software algorithm that is crunching the data down to micro-polygons, is a massive win, and a win I don't foresee HW RT bettering/matching without orders of magnitude more VRAM and RT cores.
 

Bigfroth

Member
Feb 21, 2013
227
138
500
That demo looked good; if doing lighting that way is less resource-intensive than "real" RT and looks that good, go for it. 4K at 60/120fps with full RT is a pipe dream lol.
 
  • Like
Reactions: mckmas8808

Investor9872

Member
Jan 4, 2020
387
421
275
I could definitely see this being a godsend throughout this gen. I am not sure full-blown RT is worth the large resource expense for these consoles, especially if we can get a similar image with far less cost.
I would take it a step further and say that I hope the next-generation systems will NOT fully use hardware ray tracing, since that will lower the overall performance of the games. I remember buying the PS4 Pro to play Monster Hunter. When you boot that game up, it asks you if you want to play it in performance mode or resolution mode. Now, I bought an 82" Samsung 4K TV and the PS4 Pro to play these games in 4K. But as hard as I tried, I just couldn't play the game in 4K resolution mode (it's actually less than that), because the game just didn't run well, and I ended up going back to performance mode a few minutes later. A lot of games this gen tried to push for 4K and sacrificed their performance. I truly believe that hardware RT on the next-gen consoles is akin to this gen's marketing of full 4K games while running like horse#%#$%. So the bottom line is I hope next-gen games stay away from RT unless they can run at 60 fps in 4K, 1440p or 1080p.
 

ethomaz

Gold Member
Mar 19, 2013
31,225
15,651
1,025
38
Brazil
Yeah, but months ago software ray tracing was blasphemy. I also remember a thread arguing about the "hardware-based" and "hardware-accelerated" wording.
As some users say, a mix is perfect... I don't know if it's possible...
If I'm not remembering wrong, they are only using software RT for indirect lighting.
 

yurinka

Member
Jan 19, 2007
10,144
795
1,340
Barcelona, Spain
www.capcom-town.es
Because it can run on everything from mobiles to super high-end PCs.
Epic said they will do platform optimization using hardware RT in the future.
Epic also said UE5 projects would be scalable to different hardware, but current-gen PCs (most of the PC market), consoles and mobile wouldn't support these new high-end features like Lumen.

UE5 would have tools to scale down the project to use traditional techniques on non next-gen hardware.

"Software ray-tracing" is the keyword.
This demo was to showcase the new and game-changing Nanite and Lumen features; they don't require hardware RT, and the demo didn't use it, to prove it can look great without hardware RT.

Obviously the consoles and UE5 will support hardware RT, but this was not the moment to showcase it; they may do that in some other future demo. This time it was to highlight Nanite and Lumen.
 
Last edited:
  • Like
Reactions: LED Guy?

DoctaThompson

has 30k subs on YT (will livestream reply ban)
Jan 5, 2020
1,472
1,790
515
Big Caulk County
Epic also said UE5 projects would be scalable to different hardware, but current-gen PCs, consoles and mobile wouldn't support these new high-end features like Lumen.

UE5 would have tools to scale down the project to use traditional techniques on non next-gen hardware
First time I'm reading this. Where did anyone say that current-gen PCs can't run Lumen? If anything, it would scale better on PC than on the next-gen consoles. Or is this FUD as per usual...?
 

yurinka

Member
Jan 19, 2007
10,144
795
1,340
Barcelona, Spain
www.capcom-town.es
First time I'm reading this. Where did anyone say that current-gen PCs can't run Lumen? If anything, it would scale better on PC than on the next-gen consoles. Or is this FUD as per usual...?
Read this thread from Tim Sweeney, it's pretty clear. As I said, Nanite and Lumen will be for next-gen consoles and PCs, and UE5 will have tools to scale it down to run the games on current-gen platforms (including PCs) with traditional rendering and lighting:

"The Nanite and Lumen tech powering it will be fully supported on both PS5 and Xbox Series X and will be awesome on both. And high end PCs.

And with features for scaling the content down to run on current generation platforms using traditional rendering and lighting techniques. Commodore 64 will not be supported."


Then he gets asked 'why PS5 and not a PC? Was it due to SSD perf? ' and replies 'Systems integration and whole-system performance. Bringing in data from high-bandwidth storage into video memory in its native format with hardware decompression is very efficient.'
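
Reading between the lines of that answer, the flow he describes looks roughly like the sketch below: the game fires off an asynchronous read, a hardware block decompresses the data in flight, and the result lands in video memory already in its GPU-native layout, so the CPU never has to touch or repack the bytes. Every name here is hypothetical; this is not Sony's or Epic's API, just an illustration of the shape of the pipeline.

```cpp
#include <cstdint>
#include <cstddef>

// One streaming request: read a compressed asset straight from the SSD and have
// the hardware decompressor write the unpacked, GPU-ready data into video memory.
struct StreamRequest
{
    uint64_t fileOffset;      // where the compressed asset sits on disk
    size_t   compressedSize;  // how many bytes to read
    void*    gpuDestination;  // video-memory address the decompressed data lands in
};

using StreamTicket = uint32_t;

// Stub: a real system would enqueue a DMA read, route it through the hardware
// decompressor, and complete asynchronously while the CPU does other work.
StreamTicket SubmitStreamRequest(const StreamRequest& request)
{
    (void)request;
    return 1u;
}

// Stub: the renderer polls this and only samples the new mip/mesh data once the
// copy has actually landed, so no frame ever blocks on the SSD.
bool IsStreamComplete(StreamTicket /*ticket*/)
{
    return true;
}
```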
 
Last edited:

scalman

Member
Feb 6, 2019
1,835
1,076
365
It doesn't matter how they do it, just that it fits the game. I really don't care what methods devs use to achieve what they achieve.
 

DoctaThompson

has 30k subs on YT (will livestream reply ban)
Jan 5, 2020
1,472
1,790
515
Big Caulk County
Read this thread from Tim Sweeney, it's pretty clear. As I said, Nanite and Lumen will be for next-gen consoles and PCs, and UE5 will have tools to scale it down to run the games on current-gen platforms (including PCs) with traditional rendering and lighting:

"The Nanite and Lumen tech powering it will be fully supported on both PS5 and Xbox Series X and will be awesome on both. And high end PCs.

And with features for scaling the content down to run on current generation platforms using traditional rendering and lighting techniques. Commodore 64 will not be supported."


Then he gets asked 'why PS5 and not a PC? Was it due to SSD perf? ' and replies 'Systems integration and whole-system performance. Bringing in data from high-bandwidth storage into video memory in its native format with hardware decompression is very efficient.'
Which has been debunked already. Even Tim said it will be for high-end PCs. The actual devs that worked on the engine debunked the "need" for a next-gen SSD. You made it seem like current PCs would suffer and require an RDNA2 or Ampere GPU or even newer, which isn't the case at all.

I think this demo will be pretty fun to mess with when it gets released on PC later on. I'd love to see it run in 4K with high-end raytracing enabled. I think all hardware will run UE5 games really well, and it will be interesting to see the differences with 20+ TF GPUs. 2020 and beyond are going to be amazing for all gamers across the board.
 
  • Like
Reactions: mckmas8808

ethomaz

Gold Member
Mar 19, 2013
31,225
15,651
1,025
38
Brazil
Epic also said UE5 projects would be scalable to different hardware, but current-gen PCs (most of the PC market), consoles and mobile wouldn't support these new high-end features like Lumen.

UE5 would have tools to scale down the project to use traditional techniques on non next-gen hardware.


This demo was to showcase the new and game-changing Nanite and Lumen features; they don't require hardware RT, and the demo didn't use it, to prove it can look great without hardware RT.

Obviously the consoles and UE5 will support hardware RT, but this was not the moment to showcase it; they may do that in some other future demo. This time it was to highlight Nanite and Lumen.
That is not true at all.

Both techs can scale down to mobile.
Nanite, for example, at 1080p with 1-2 triangles per pixel already requires MB/s of storage.

Mobile can run Nanite at 720p with 1 triangle per pixel or even lower.
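
A quick back-of-the-envelope on that scaling argument (my own numbers, just to illustrate why a triangles-per-pixel budget makes the geometry cost track resolution):

```cpp
#include <cstdio>

// With roughly one or two visible triangles per pixel, the triangle budget per
// frame tracks the pixel count, so dropping the resolution directly shrinks how
// much geometry has to be streamed and rasterized.
int main()
{
    const double pixels1080p = 1920.0 * 1080.0;  // ~2.07 million pixels
    const double pixels720p  = 1280.0 * 720.0;   // ~0.92 million pixels

    std::printf("1080p, 1-2 tri/pixel: %.1f to %.1f million visible triangles/frame\n",
                pixels1080p * 1.0 / 1e6, pixels1080p * 2.0 / 1e6);  // ~2.1 to ~4.1
    std::printf("720p,  1 tri/pixel:   %.1f million visible triangles/frame\n",
                pixels720p * 1.0 / 1e6);                            // ~0.9
    return 0;
}
```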
 
Last edited:

yurinka

Member
Jan 19, 2007
10,144
795
1,340
Barcelona, Spain
www.capcom-town.es
That is not true at all.

Both techs can scale down to mobile.
Which has been debunked already. Even Tim said it will be for high-end PCs. The actual devs that worked on the engine debunked the "need" for a next-gen SSD. You made it seem like current PCs would suffer and require an RDNA2 or Ampere GPU or even newer, which isn't the case at all.

I think this demo will be pretty fun to mess with when it gets released on PC later on. I'd love to see it run in 4K with high-end raytracing enabled. I think all hardware will run UE5 games really well, and it will be interesting to see the differences with 20+ TF GPUs. 2020 and beyond are going to be amazing for all gamers across the board.
Who debunked what? Tim and I say that Lumen and Nanite will be fully supported on next-gen consoles and PC, and that UE5 games will run on current-gen devices scaled down with tools to use traditional techniques.

Is Tim Sweeney lying in the Twitter thread I linked? Where is the official info debunking him and saying that Lumen and Nanite are going to be fully supported on mobile, Switch, Xbox One S and a similar PC?

https://pasteboard.co/J9l6hgZ.png

As I posted above he says: "And with features for scaling the content down to run on current generation platforms using traditional rendering and lighting techniques".

Current-gen platforms (this includes non-high-end PCs) will run these games using traditional rendering and lighting techniques instead of Lumen and Nanite. Which is normal, because they'll have to use LODs, lightmaps, normal maps, way smaller textures, way less detailed scene models, etc., and stream a whole room (like 30 seconds or 1 minute of gameplay) or show a loading screen, because they can't stream that fast and don't have enough horsepower.
 
Last edited:

ethomaz

Gold Member
Mar 19, 2013
31,225
15,651
1,025
38
Brazil
Who debunked what? Tim and I say that Lumen and Nanite will be fully supported on next-gen consoles and PC, and that UE5 games will run on current-gen devices scaled down with tools to use traditional techniques.

Is Tim Sweeney lying in the Twitter thread I linked? Where is the official info debunking him and saying that Lumen and Nanite are going to be fully supported on mobile, Switch, Xbox One S and a similar PC?

He even joked about the Commodore 64 not being supported.

 
Last edited: