
DF: Unreal Engine 5 Matrix City Sample PC Analysis: The Cost of Next-Gen Rendering

winjer

Gold Member
NXG does a good job of highlighting why PCs can't handle the demo - and why the PS5 handles it better, albeit with lower specs - in some places.



Scale the Matrix demo down and it will run at 60fps on PS5.


His results can't be right. His frame rate is way too low compared to mine.
I do have a slightly better PC than him, but not by much.
I have a 3700X and an RTX 2070S. He has a 2700X and an RTX 2070 non-Super.
But the 3700X is just 7% faster in games than a 2700X.

My 2070S is clocked at 1920MHz, resulting in 9.8 TFLOPs.
His 2070 is clocked at 2040MHz, resulting in 9.4 TFLOPs. This is a difference of 4.2%.

In the same spot, I get 38 fps. He gets 23 fps. This is a difference of 65% in performance, in a demo that is CPU bound.
Meanwhile, the two CPUs should have a difference of around 7% in performance.
Either he f***d up really badly with his packaged demo, or he has some performance issue with his PC, which would also explain why in so many of his analyses he gets lower results than the consoles.

Then he says that when he is driving, he drops to 10-14 fps.
On my PC, I get 30-31 fps, with lows of 26 fps when crashing.
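A quick back-of-the-envelope check of the numbers above (the CUDA core counts are the public specs for these two cards; clocks, frame rates, and the TSR percentage are taken from the post):

```python
def peak_tflops(cuda_cores: int, clock_mhz: float) -> float:
    """Peak FP32 throughput: cores x 2 FLOPs/cycle (FMA) x clock."""
    return cuda_cores * 2 * clock_mhz * 1e6 / 1e12

print(round(peak_tflops(2560, 1920), 1))  # 2070 Super -> 9.8
print(round(peak_tflops(2304, 2040), 1))  # 2070       -> 9.4

# the quoted fps gap: 38 fps vs 23 fps
print(round((38 - 23) / 23 * 100))        # -> 65 (%)

# TSR at 75% of 1440p gives the 1080p base render height
print(int(1440 * 0.75))                   # -> 1080
```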

Here is a screenshot, in the demo, settings at 3, resolution 1440p with TSR at 75%, meaning it's rendered at a base of 1080p.
LZQ5zBt.jpg


Here is his result, on the exact same spot. He is rendering at 1080p, settings at 3.
QkaIqWu.jpg
 
Last edited:

RoadHazard

Gold Member


Versus



It was better in virtually every way.

You guys are really clouded by your rose-tinted memory of the Elemental demo. It looks like shit by today's standards. Hell, plenty of non-UE games, InFamous Second Son for example, blew it out of the water.


Arkham Knight isn't even UE4, it's UE3 (with customizations of course).
 

8BiTw0LF

Banned
His results can't be right. His frame rate is way too low compared to mine.
I do have a slightly better PC than him, but not by much.
I have a 3700X and an RTX 2070S. He has a 2700X and an RTX 2070 non-Super.
But the 3700X is just 7% faster in games than a 2700X.

My 2070S is clocked at 1920MHz, resulting in 9.8 TFLOPs.
His 2070 is clocked at 2040MHz, resulting in 9.4 TFLOPs. This is a difference of 4.2%.

In the same spot, I get 38 fps. He gets 23 fps. This is a difference of 65% in performance, in a demo that is CPU bound.
Meanwhile, the two CPUs should have a difference of around 7% in performance.
Either he f***d up really badly with his packaged demo, or he has some performance issue with his PC, which would also explain why in so many of his analyses he gets lower results than the consoles.

Then he says that when he is driving, he drops to 10-14 fps.
On my PC, I get 30-31 fps, with lows of 26 fps when crashing.

Here is a screenshot, in the demo, settings at 3, resolution 1440p with TSR at 75%, meaning it's rendered at a base of 1080p.
LZQ5zBt.jpg


Here is his result, on the exact same spot. He is rendering at 1080p, settings at 3.
QkaIqWu.jpg
You can't expect to make a 1:1 comparison with a very unstable tech demo, one that isn't optimized for the countless different hardware setups people have. Also, it looks like you forgot to turn on Lumen?
 

winjer

Gold Member
You can't expect to make a 1:1 comparison with a very unstable tech demo, one that isn't optimized for the countless different hardware setups people have. Also, it looks like you forgot to turn on Lumen?

The demo isn't unstable; it never crashes on my PC.
We have very similar PCs, so that scene is very comparable.
Lumen is on, at setting 3, like his.

edF1Na2.jpg
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
His results can't be right. His frame rate is way too low compared to mine.
I do have a slightly better PC than him, but not by much.
I have a 3700X and an RTX 2070S. He has a 2700X and an RTX 2070 non-Super.
But the 3700X is just 7% faster in games than a 2700X.

My 2070S is clocked at 1920MHz, resulting in 9.8 TFLOPs.
His 2070 is clocked at 2040MHz, resulting in 9.4 TFLOPs. This is a difference of 4.2%.

In the same spot, I get 38 fps. He gets 23 fps. This is a difference of 65% in performance, in a demo that is CPU bound.
Meanwhile, the two CPUs should have a difference of around 7% in performance.
Either he f***d up really badly with his packaged demo, or he has some performance issue with his PC, which would also explain why in so many of his analyses he gets lower results than the consoles.

Then he says that when he is driving, he drops to 10-14 fps.
On my PC, I get 30-31 fps, with lows of 26 fps when crashing.

Here is a screenshot, in the demo, settings at 3, resolution 1440p with TSR at 75%, meaning it's rendered at a base of 1080p.
LZQ5zBt.jpg


Here is his result, on the exact same spot. He is rendering at 1080p, settings at 3.
QkaIqWu.jpg
The 2700X is a pretty shitty gaming CPU.
Also are you running the same version he compiled?
 

winjer

Gold Member
So now a comparison with Alex's results. It's not like-for-like, but it can put some things into perspective.
In this part of the test, he is running the demo at 720p, so his PC is completely CPU bound and his 3090 is almost idling.
He gets around 40-50 fps while running around this area. He is using a 10900K, which is around 10% faster than a 3700X.

DPPteqy.jpg


I set my resolution scale to render at 720p.
Then I went to the same place and got similar performance to Alex, around 40-50 fps.
I couldn't exactly match his direction and movement, but it's within the same ballpark of performance, in a very CPU-bound scenario.

tBcUb9A.jpg
 
Last edited:

winjer

Gold Member
Notice his system memory usage. He has some heavy stuff running in the background lowering the performance [like multiple browsers open with videos running or something].

That is a good point. I was running the demo with a couple of programs open: the Brave browser with a couple of tabs, and MSI Afterburner.
Despite this, my whole-system RAM usage is around 9GB, while his system is using 13GB. He probably has a ton of programs and bloatware running in the background. That is some very shoddy benchmarking practice.
My VRAM usage is also lower, but it's just a difference of 500MB. That's margin-of-error stuff.

The 2700X is a pretty shitty gaming CPU.
Also are you running the same version he compiled?

Did he publish the package he compiled? If so, I can download it and try it out.

Regarding the 2700X vs 3700X, take a look at this:
OtDX2WW.jpg
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
That is a good point. I was running the demo with a couple of programs open: the Brave browser with a couple of tabs, and MSI Afterburner.
Despite this, my whole-system RAM usage is around 9GB, while his system is using 13GB. He probably has a ton of programs and bloatware running in the background. That is some very shoddy benchmarking practice.
My VRAM usage is also lower, but it's just a difference of 500MB. That's margin-of-error stuff.



Did he publish the package he compiled? If so, I can download it and try it out.

Regarding the 2700X vs 3700X, take a look at this:
OtDX2WW.jpg
It's trading blows with 4-core 10100s and 3300Xs?
 

Fredrik

Member
I don't think that website tests every component it shows. I think they test some parts and then use scaling for the rest. The 12900K results seem to be on par with what's expected, but they probably approximated every other result.
It's bad either way. I used to be able to run the Valley demo on my 1080 Ti at roughly 60fps at 1080p, which made me think things would be okay performance-wise, but these results in the Matrix demo are something else.
Is this demo free for everyone to try, like the Valley? I want to verify how bad it is on my own PC.
 

winjer

Gold Member
It's trading blows with 4-core 10100s and 3300Xs?

That chart's ranking is normalized by 1% low frame rate.
As has been stated many times in this thread, clock speed and IPC matter more than core count.
And the CPU is dominant in this demo, especially at low resolutions like 1080p.

Now mind you, the 2700X is doing 41 fps average with 30 fps lows.
The 3800X is doing 43 fps average with 36 fps lows. That's just a bit higher than what I get on my PC, probably because he uses a 3080 Ti.
Take a look at the result from the 10700K: 45 fps average and 40 fps lows. That's also similar to what Alex is getting on his 10900K.
So all these results match up across various people benchmarking the demo. There are always some differences and margins of error, but performance is in the same ballpark.
All except the results from NXGamer. With his 2700X, he is getting 20-23 fps averages, with 10-14 fps lows.
There is something very wrong with his benchmarks.
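For context on that chart's metric: "1% low" is conventionally computed from a capture of per-frame times, roughly like this (a generic sketch of the usual method, not necessarily that site's exact formula):

```python
def fps_stats(frametimes_ms):
    """Average fps and 1% low fps from a capture of per-frame times (ms)."""
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    slowest = sorted(frametimes_ms, reverse=True)   # worst frames first
    k = max(1, n // 100)                            # the slowest 1% of frames
    one_pct_low = 1000.0 * k / sum(slowest[:k])
    return avg_fps, one_pct_low

# 99 smooth frames at 25 ms plus one 50 ms hitch
avg, low = fps_stats([25.0] * 99 + [50.0])
print(round(avg, 1), round(low, 1))   # -> 39.6 20.0
```

This is why a single hitch drags the 1% low far below the average, which is exactly the gap the chart is ranking by.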
 
I'm not sure what all the panic is about. The engine is scalable and devs will turn features on and off as it suits them. No reason why the trend of performance and fidelity choices won't continue.
Yeah, I'm just confused about why it's "concerning" to DF - does it mean that CPUs have to evolve differently or something?
 

DukeNukem00

Banned
It's bad either way. I used to be able to run the Valley demo on my 1080 Ti at roughly 60fps at 1080p, which made me think things would be okay performance-wise, but these results in the Matrix demo are something else.
Is this demo free for everyone to try, like the Valley? I want to verify how bad it is on my own PC.


Sure. Everyone can download the demo and try it. Epic engineer Andrew Lauritzen gives some more info here regarding performance:

 
When games start being made from the ground up for these consoles with these new engines, 60fps (for a lot of games, at least) will slowly fade into the abyss.
I don't think so. I expect devs will continue the practice of having a performance mode and a quality mode. They will cut back what they need to to get that 60fps.

On top of that, there is a lot of tech that will help keep the 60fps. The big push this generation, outside of IO speed, is efficiency. The XSX has shown this with VRS (a way to maximise performance through efficiency gains), Mesh Shaders (again, efficiency gains), Sampler Feedback Streaming (a way to optimise RAM allocation), DirectML (upscaling through machine learning), and FSR 2.0, which again is about reaching higher resolutions without the GPU cost.
The majority of these options haven't been utilised yet, and when they are, I think performance will hold steady with what we have now.
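To put the upscaling point in numbers: GPU shading cost scales roughly with pixel count, and FSR 2.0's published per-axis scale factors cut that count dramatically. A rough illustration (real savings are smaller because of fixed per-frame costs and the upscale pass itself):

```python
# FSR 2.0 per-axis scale factors, as published by AMD
modes = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}
target_w, target_h = 3840, 2160  # 4K output

for name, s in modes.items():
    w, h = round(target_w / s), round(target_h / s)
    saving = 1 - (w * h) / (target_w * target_h)
    print(f"{name}: {w}x{h} internal, ~{saving:.0%} fewer pixels shaded")
```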
 

VFXVeteran

Banned
These articles should be proof of what I keep saying about how far off we are from having realtime CG visuals. That demo isn't even close to CG, and every single platform out right now struggles to get any reasonable FPS. It always makes me chuckle when people get excited over these demos, thinking games like them are right around the corner and will run at good FPS. Imagine throwing characters, music, AI, physics, etc. into the mix.

Epic puts out "proof of concept" demos every generation so that we can get a glimpse of what to expect at that scale years later. This particular demo will take a couple more iterations of PC GPUs to brute-force into reasonable FPS.
 

Hobbygaming

has been asked to post in 'Grounded' mode.
I don't think so. I expect devs will continue the practice of having a performance mode and a quality mode. They will cut back what they need to to get that 60fps.

On top of that, there is a lot of tech that will help keep the 60fps. The big push this generation, outside of IO speed, is efficiency. The XSX has shown this with VRS (a way to maximise performance through efficiency gains), Mesh Shaders (again, efficiency gains), Sampler Feedback Streaming (a way to optimise RAM allocation), DirectML (upscaling through machine learning), and FSR 2.0, which again is about reaching higher resolutions without the GPU cost.
The majority of these options haven't been utilised yet, and when they are, I think performance will hold steady with what we have now.
If they go the way you believe they will, then the leap in visuals won't be as big. I think they'll revert back to 30fps, maybe even 40fps, but we can agree to disagree.
 

Shmunter

Member
These articles should be proof of what I keep saying about how far off we are from having realtime CG visuals. That demo isn't even close to CG, and every single platform out right now struggles to get any reasonable FPS. It always makes me chuckle when people get excited over these demos, thinking games like them are right around the corner and will run at good FPS. Imagine throwing characters, music, AI, physics, etc. into the mix.

Epic puts out "proof of concept" demos every generation so that we can get a glimpse of what to expect at that scale years later. This particular demo will take a couple more iterations of PC GPUs to brute-force into reasonable FPS.
Every analyst has come out and confirmed a CPU bottleneck, and here you are in your own world blaming the GPU.
 

yamaci17

Member
His results can't be right. His frame rate is way too low compared to mine.
I do have a slightly better PC than him, but not by much.
I have a 3700X and an RTX 2070S. He has a 2700X and an RTX 2070 non-Super.
But the 3700X is just 7% faster in games than a 2700X.

My 2070S is clocked at 1920MHz, resulting in 9.8 TFLOPs.
His 2070 is clocked at 2040MHz, resulting in 9.4 TFLOPs. This is a difference of 4.2%.

In the same spot, I get 38 fps. He gets 23 fps. This is a difference of 65% in performance, in a demo that is CPU bound.
Meanwhile, the two CPUs should have a difference of around 7% in performance.
Either he f***d up really badly with his packaged demo, or he has some performance issue with his PC, which would also explain why in so many of his analyses he gets lower results than the consoles.

Then he says that when he is driving, he drops to 10-14 fps.
On my PC, I get 30-31 fps, with lows of 26 fps when crashing.

Here is a screenshot, in the demo, settings at 3, resolution 1440p with TSR at 75%, meaning it's rendered at a base of 1080p.
LZQ5zBt.jpg


Here is his result, on the exact same spot. He is rendering at 1080p, settings at 3.
QkaIqWu.jpg
136wQF0.png


(3400 cl14 with tight timings might be playing a role here)



And I never get drops into the 14-15 fps range (aside from crashes) with 100% crowd/car density. (BTW, the video was recorded at 3.7 GHz. Don't ask me why; I usually run the system at 3.7 GHz for lower fan speed and power consumption. I still get good enough fps for my tastes and don't need that extra 0.3 GHz. But for that comparison I enabled 4 GHz again.)

I still have to test with the stock 3000 CL15 profile, but I don't think my fps would drop to the low 20s.

Then again, there's no point in using stock 3000 CL15 with these old Zens. They love tight timings and get HUGE gains out of them; literally no other architecture benefits from tight timings as much. And Crucial Ballistix kits are pretty cheap, just like any other kit! My random 3000 CL15 Ballistix kit has been working gracefully and stably for 2.5 years with tight timings at 3400 MHz. I love my 2700X too; I got it for 115 dollars LMAO. I will never let it go unless it fails to reach 60+ fps in games. As long as I get upwards of 60 frames, I'm happy with my CPU.

In Cyberpunk with RT on, I had frame drops into the 50s while driving fast in the city, but that was not enough to make me let go of my CPU. I still love it. Maybe UE5 will force my hand to get a 5600X in the future, but I want that 5600X dirt cheap, at 120-130 dollars :D :D
 
Last edited:

winjer

Gold Member
136wQF0.png


(3466 cl14 with tight timings might be playing a role here)



And I never get drops into the 14-15 fps range (aside from crashes) with 100% crowd/car density. (BTW, the video was recorded at 3.7 GHz. Don't ask me why; I usually run the system at 3.7 GHz for lower fan speed and power consumption. I still get good enough fps for my tastes and don't need that extra 0.3 GHz. But for that comparison I enabled 4 GHz again.)

I still have to test with the stock 3000 CL15 profile, but I don't think my fps would drop to the low 20s.

Then again, there's no point in using stock 3000 CL15 with these old Zens. They love tight timings and get HUGE gains out of them; literally no other architecture benefits from tight timings as much. And Crucial Ballistix kits are pretty cheap, just like any other kit! My random 3000 CL15 Ballistix kit has been working gracefully and stably for 2.5 years with tight timings at 3466 MHz. I love my 2700X too; I got it for 115 dollars LMAO. I will never let it go unless it fails to reach 60+ fps in games. As long as I get upwards of 60 frames, I'm happy with my CPU.

In Cyberpunk with RT on, I had frame drops into the 50s while driving fast in the city, but that was not enough to make me let go of my CPU. I still love it. Maybe UE5 will force my hand to get a 5600X in the future, but I want that 5600X dirt cheap, at 120-130 dollars :D :D


Zen+ does benefit from a memory OC, but it's in the 5-15% range.
The difference we are seeing with NXGamer is ~60%. He is running 3200 MT/s memory; he could be running CL40 and it still wouldn't cause a 60% loss in performance.
 

ethomaz

Banned


Versus



It was better in virtually every way.

You guys are really clouded by your rose-tinted goggles memory of the Elemental Demo. It looks shit by today's standards. Hell, plenty non-UE games for example InFamous Second Son blew it out of the water.

I could not watch last night, so I held off replying in case I was remembering it wrong.

Watched now.

The Elemental demo is way better than AK.
Man, AK isn't even among the best looking games of last generation.

And if you want a same-engine comparison… AK doesn't even reach the level of the UE3 Samaritan demo.



So let's not talk about the Elemental demo, which is on another level.
 
Last edited:
Sure. Everyone can download the demo and try it. Epic engineer Andrew Lauritzen gives some more info here regarding performance:


From that thread, it's the GI that's causing the CPU overload, not Nanite.

10700k
City-Sample-2022-04-21-15-48-14-316.png

City-Sample-2022-04-21-16-05-29-321.png


GI off, and suddenly the GPU is maxed. What's strange is that the reflections are better when GI is set to 1 (?)
City-Sample-2022-04-21-14-44-12-414.png
 
Is that so? What game development house does he work for, and what games has he shipped?

Well, he doesn't claim to be a journalist. Maybe ask NXGamer about that?

"Thompson began programming games at age 7 or 8, and he’s worked in engineering and computer science for the past 30 years. In 2013, a lighter workload at his day job left him time to start NX Gamer as a side project that he hoped would help inform less tech-savvy gamers."


From that, he doesn't appear to be a journalist, just a programmer who runs his NXGamer channel as a side gig.
 
Last edited:

DukeNukem00

Banned
Well, he doesn't claim to be a journalist. Maybe ask NXGamer about that?

"Thompson began programming games at age 7 or 8, and he’s worked in engineering and computer science for the past 30 years. In 2013, a lighter workload at his day job left him time to start NX Gamer as a side project that he hoped would help inform less tech-savvy gamers."



So he has nothing at all to do with game programming or game making. That would explain why he's been a laughingstock in this domain for years and years, and why on Beyond3D they actually opened a thread dedicated to his mistakes, because they got tired of people facepalming at his every video and of him infecting every thread with how wrong he is in absolutely every video he ever makes. Including the video this thread is about, which has been disproven by Alex and Epic devs.
 
So he has nothing at all to do with game programming or game making. That would explain why he's been a laughingstock in this domain for years and years, and why on Beyond3D they actually opened a thread dedicated to his mistakes, because they got tired of people facepalming at his every video and of him infecting every thread with how wrong he is in absolutely every video he ever makes. Including the video this thread is about, which has been disproven by Alex and Epic devs.

So you're calling him a liar?
 

DukeNukem00

Banned
So you're calling him a liar?

A liar? No, but bedroom "programming" has nothing to do with the subject he pretends to know, nor does working in the vast area of computer science mean anything here. John Linneman also wrote train software as a day job in the past. Does that mean he knows anything about game programming? No, it doesn't.

The fact remains that NXGamer has no idea what he's talking about, was called out in the past by people like Durante, and is constantly proven wrong by actual game engineers. The only place that gives him attention is this forum, probably because every one of his videos manages to portray PlayStation as an entity touched by god.
 

sinnergy

Member
Well, he doesn't claim to be a journalist. Maybe ask NXGamer about that?

"Thompson began programming games at age 7 or 8, and he’s worked in engineering and computer science for the past 30 years. In 2013, a lighter workload at his day job left him time to start NX Gamer as a side project that he hoped would help inform less tech-savvy gamers."


From that, he doesn't appear to be a journalist, just a programmer who runs his NXGamer channel as a side gig.
Not in game development..
 

yamaci17

Member
Well, all the stuff he said about how the PS5/Xbox offload work, how the cache scrubbers do extra duty, and how badly the 2700X performs is invalid, since I've shown that a stock 2700X pushes 30+ fps (and with a mild RAM OC it can push 35+ fps), and someone else showed that his 3700X pushes 38-40 fps. In his video, the situation is dramatic: 10-20 fps is common.

Step 1) He should immediately take the video down.
Step 2) He must find the properly cooked build. There's a build out there with extra "profiler" stuff that hammers the CPU even further; maybe he cooked it himself and doesn't know about this. (Then again, even with the profiler-enabled demo, my performance was nowhere near that bad.)
Step 3) If he is indeed using the proper build, then he should take a look at what the problem with his system is.
 

winjer

Gold Member
So you're calling him a liar?

I wouldn't say he is a liar, but his benchmarking methodology is very shoddy.
He also uses a lot of technical verbiage to justify his conclusions, but most of it is inaccurate and some of it is downright wrong.
This is why he manages to impress people who know less about hardware, software, or game making, but fails to impress more knowledgeable people, becoming a bit of a laughing stock.

We live at a time when anyone can create a YouTube channel and pretend to be more than what they really are.
He's not as bad as crap like Red Gaming Tech and Moore's Law is Dead. But he is not doing a good job, and he is doing a lot to spread a fair amount of wrong information.
 
Last edited:
I could not watch last night, so I held off replying in case I was remembering it wrong.

Watched now.

The Elemental demo is way better than AK.
Man, AK isn't even among the best looking games of last generation.

And if you want a same-engine comparison… AK doesn't even reach the level of the UE3 Samaritan demo.



So let's not talk about the Elemental demo, which is on another level.


I'm kinda dumbfounded by this, ethomaz.

Are we not looking at the same demos?

The Elemental Demo looks like ass by today's standards, and the Samaritan demo is not much better. Meanwhile, Arkham Knight shits all over both from a great height.

And I agree AK wasn't the best looking game last gen. TLOUII on PS4 blows the Elemental and Samaritan demos out of the freaking stratosphere. And yes, while TLOUII isn't UE4 or even UE at all, it shows that neither the hardware nor UE's capabilities were ever limited to producing visuals only as good as those two UE4 demos. Both UE and the console hardware delivered visuals that blasted so far past those demos it's not even funny.
 

ethomaz

Banned
I'm kinda dumbfounded by this, ethomaz.

Are we not looking at the same demos?

The Elemental Demo looks like ass by today's standards, and the Samaritan demo is not much better. Meanwhile, Arkham Knight shits all over both from a great height.

And I agree AK wasn't the best looking game last gen. TLOUII on PS4 blows the Elemental and Samaritan demos out of the freaking stratosphere. And yes, while TLOUII isn't UE4 or even UE at all, it shows that neither the hardware nor UE's capabilities were ever limited to producing visuals only as good as those two UE4 demos. Both UE and the console hardware delivered visuals that blasted so far past those demos it's not even funny.
Watched the two demos and the AK video you posted.

Elemental > Samaritan > AK

It is very clear AK made some cutbacks, maybe because it is a full playable game… in AK the surfaces look like plastic and the environment is all static… in Elemental everything seems to be alive: wind, sand, dust, snow, fire, etc. Everything reacts to the events. It also seems to be higher resolution, but with some temporal reconstruction, because some parts have a blur… or maybe it is the AA.

BTW, I had to look for the original Elemental demo because you posted a modified one.
 
Last edited:

VFXVeteran

Banned
I'm kinda dumbfounded by this, ethomaz.

Are we not looking at the same demos?

The Elemental Demo looks like ass by today's standards, and the Samaritan demo is not much better. Meanwhile, Arkham Knight shits all over both from a great height.

And I agree AK wasn't the best looking game last gen. TLOUII on PS4 blows the Elemental and Samaritan demos out of the freaking stratosphere. And yes, while TLOUII isn't UE4 or even UE at all, it shows that neither the hardware nor UE's capabilities were ever limited to producing visuals only as good as those two UE4 demos. Both UE and the console hardware delivered visuals that blasted so far past those demos it's not even funny.
These kinds of arguments are a waste of time if people can't quantify what they claim "blows that away". Every single argument on these boards relating to graphics has been reduced to subjective, meaningless opinions with no objective facts.

If we are going to agree or disagree with what visuals look best - we need to put objective facts in place. What techniques in rendering make a particular game "blow another game away"? What resolution is used for all framebuffers, what do the FX accomplish that the other game doesn't? What about the animation cycles, keyframes, character rigs, etc..?

At some point, we all have to either avoid the subjective comments because they are meaningless or we will all need to get educated on the techniques and how they make a game approach a better approximation to the rendering equation and discuss in a meaningful manner.
 
Last edited:

Lethal01

Member
They are programmers… please… so the Coalition doesn't know what they do? NXG and DF are just reporters…

If you think that this specific demo, and Unreal Engine as a whole, can't be optimized much further, you don't know what you're talking about.
I'd be surprised if upcoming patches don't bring a slew of fixes and changes that make it perform better.
 

Lethal01

Member
These kinds of arguments are a waste of time if people can't quantify what they claim "blows that away". Every single argument on these boards relating to graphics has been reduced to subjective, meaningless opinions with no objective facts.

If we are going to agree or disagree with what visuals look best - we need to put objective facts in place. What techniques in rendering make a particular game "blow another game away"? What resolution is used for all framebuffers, what do the FX accomplish that the other game doesn't? What about the animation cycles, keyframes, character rigs, etc..?

At some point, we all have to either avoid the subjective comments because they are meaningless or we will all need to get educated on the techniques and how they make a game approach a better approximation to the rendering equation and discuss in a meaningful manner.

How much people subjectively like something is by far the most important and interesting part, though.
 
Last edited:
Watched the two demos and the AK video you posted.

Elemental > Samaritan > AK

It is very clear AK made some cutbacks, maybe because it is a full playable game… in AK the surfaces look like plastic and the environment is all static… in Elemental everything seems to be alive: wind, sand, dust, snow, fire, etc. Everything reacts to the events.

BTW, I had to look for the original Elemental demo because you posted a modified one.

This is the original version.

I still maintain that it doesn't look good. The only thing that's impressive by today's standards is the GPU particle effects. The lighting is very basic GI with very poor indirect lighting implementation (barely any bounce lighting at all).

The textures are low res and the surfaces all look horribly flat because it's not using PBR materials anywhere.

Now look at AK: everything is correctly lit, and materials have a tangibility to them because they're PBR. Metal looks like metal, cloth looks like cloth, and concrete looks like concrete. PBR is the single biggest differentiator between the two, and it's the difference between everything looking like flat cardboard in the Elemental demo and looking convincingly real in AK.

AK just simply looks way better than both the Elemental and Samaritan demos.

You may personally prefer the looks of the latter. But that's not the same as trying to assess whether one presents an objectively better graphical presentation.
 

Neys

Neo Member
I think the demo might be CPU cache bound. If somebody has a 5800x3D and can post results in comparable scenes of this thread. I think some ray tracing part is the major issue as you can disable hardware accelerated Lumen and use the software one and get better performance (logical if the game was heavily GPU bound, but strange case here), the other is shaders compilation as I have read on reddit that letting the game run idle for about 30 minutes to 1hour completes shaders compilation and you get a higher GPU usage but how much better performance you get I couldn't verify myself.

With Nanite and hardware accelerated Lumen, do you need to rebuild the BLAS (done on the GPU) all the time with the triangle count/LOD changing all the time(Lumen ray tracing)? We know that raytracing impacts CPU performance too with DXR and Vulkan ray tracing enabled but I couldn't find any information why, and so rebuilding the BLAS all the time could have a heavy CPU cost. The BVH tree (BLAS/TLAS) is done on GPU with both AMD and Nvidia with either DXR and Vulkan Ray tracing, but on Vulkan you can do it on CPU too for TLAS (Nvidia didn't support it and still don't I think, don't know for AMD), Nvidia says here anyway to accelerate it asynchronously with their GPUs, so it shouldn't impact CPU performance. Epic says here again that TLAS has an impact on the rendering thread, so it impacts CPU. It would be useful if somebody could run the demo with the command Stat rendering and show all the stats under the TLAS section here. Also GPU Profiler for BLAS rebuilds (these stats right under BLAS section in particular).

Also do we know if the meshes overlap have been optimized for raytracing as to not ray hit multiple meshes that are hidden at all times for the PC build ? Also the first UE5 PC demo was smaller in scope in terms of objects and far view and so in how much it impacted the GI cost with far field traces (under Far Field section).

If my intuition is correct, ray tracing is heavily cache dependent, and only big L3 or L2 caches can help it, on the CPU side or on the APU in the case of consoles, along with cache scrubbers/flushers (smart ones). So future AMD CPUs with 3D V-Cache, and future console revisions with Infinity Cache on the APU (regular or 3D-stacked), might get us better ray tracing performance. I know that Scott Herkelman (AMD SVP and GM of Radeon) claimed that developers could use the Infinity Cache in other ways, and somebody on Beyond3D said something about storing the BVH tree there, but I still think it's not directly accessible to devs.
 

ethomaz

Banned

This is the original version.

I still maintain that it doesn't look good. The only thing that's impressive by today's standards is the GPU particle effects. The lighting is very basic GI with a very poor indirect lighting implementation (barely any bounce lighting at all).

The textures are low res and the surfaces all look horribly flat because it's not using PBR materials anywhere.

Now look at AK: everything is correctly lit, and materials have a tangibility to them because they're PBR materials. Metal looks like metal, cloth looks like cloth and concrete looks like concrete. PBR is the single biggest differentiator between the two, and it makes the difference between everything looking like flat cardboard in the Elemental demo and looking convincingly real in AK.

AK simply looks way better than both the Elemental and Samaritan demos.

You may personally prefer the looks of the latter. But that's not the same as trying to assess whether one presents an objectively better graphical presentation.

We will agree to disagree, because even granting everything you listed that AK does and Elemental doesn't… the former still looks way better than the plastic look of AK.

Like I said, AK doesn't even look better than the Samaritan demo, though I agree that one is close enough.
 
Last edited:
These kinds of arguments are a waste of time if people can't quantify what they claim "blows that away". Every single argument on these boards relating to graphics has been reduced to subjective, meaningless opinions with no objective facts.

If we are going to agree or disagree about which visuals look best, we need to put objective facts in place. Which rendering techniques make a particular game "blow another game away"? What resolution is used for all the framebuffers, and what do the effects accomplish that the other game's don't? What about the animation cycles, keyframes, character rigs, etc.?

At some point, we either have to avoid subjective comments because they are meaningless, or we all need to get educated on these techniques and how they bring a game closer to approximating the rendering equation, and then discuss them in a meaningful manner.

Your opinion on this subject is well known. That said I largely disagree with you.

While it's possible to make objective observations about a game's graphical presentation, and to objectively assess the physical accuracy of certain visual effects, the overall graphical presentation of a game will invariably be assessed subjectively.

The use of certain artistic effects like film grain, colour grading and chromatic aberration is just as important to the overall visual look of a game as the accuracy of the GI, reflections and indirect lighting effects.

So focussing only on objective technical metrics is a mistake.

The rendering tech serves the artistry, not the other way round. Hence I can understand how ethomaz might feel Batman Arkham Knight looks like plastic, whereas I personally really like the look.

So yeah, I think you're wrong and there's more than enough room for people to be able to claim that Minecraft looks better than RDR2. And we don't have to artificially constrain the discussion to only focus on the objective quality of the rendering techniques used to be able to assess whether a game looks good overall or not.

How much people subjectively like something is by far the most important and interesting thing, though.

Agreed.
 
Last edited: