
Unreal Engine 5 Benchmarks. 6900xt outperforms 3090 at 1440p. 28 GB RAM usage.

Md Ray ANIMAL1975 I was kindly replying to someone. Now, if you don't realize you two are both off topic, please form your own thread. Rea and I finished our convo a long time ago. Unless you're talking about either GPU, I'm not sure why you keep derailing the thread?


Unreal Engine on Epic's PC vs. Unreal Engine on every other PC

Mine looks and runs like Epic's build, as do builds from people running hardware well below the minimum specs. Not sure why people are mistaking the editor's performance for the baked demo. It's pathetic at this point to keep trying to peddle this nonsense.
 

Md Ray

Member
You sound triggered, Md Ray. Go to bed.
No, I'm chill, breh. :messenger_tears_of_joy:
Before you jumped in, I was talking about how we don't know how much RAM is allocated to the OS.
Yes. But what do primitive shaders have to do with that? :messenger_tears_of_joy:
There's no indication of what it uses, but why would 2.5 GB of RAM be out of the norm? Do you have any reason to believe otherwise, Md Ray?
Believe what? I've posted this already 👇
There's no word from Sony yet, AFAIK. But I'd imagine it's similar to XSX's 13.5 GB.
To address your concern, how is that downplaying PS5?
To answer your question, see your post: 👇
if I were using an outdated tech (primitive shaders)
Bringing up something entirely different, like primitive shaders, when the question was about something else. If this is not downplaying then IDK what is. :messenger_tears_of_joy:
 
Bringing up something entirely different, like primitive shaders, when the question was about something else. If this is not downplaying then IDK what is. :messenger_tears_of_joy:
Outdated =/= downplaying. I still prefer combustion over EV, even though it's outdated as fuck. EVs can't power any of my alternators or supply enough juice to run my sound system for any length of time.
 

hyperbertha

Member
Bringing up something entirely different, like primitive shaders, when the question was about something else. If this is not downplaying then IDK what is. :messenger_tears_of_joy:
No sense arguing with insecure pc simps.
 

Md Ray

Member
Outdated =/= downplaying.
Unless you're talking about either GPU
No one brought up primitive shaders in the first place but you, when the question was simply about usable RAM.

Why, what's the correlation between them...?
I'm not sure why you keep derailing the thread?
Don't act so innocent.
 
No one brought up primitive shaders in the first place but you, when the question was simply about usable RAM.

Don't act so innocent.
If you kept up with the thread, it was used as an example of things that haven't been revealed, mentioned alongside the absence of RAM-usage figures. You could just read the thread to see what was said, you know. Not sure what the act is for.


Now that it's been answered twice, you gonna get back on topic, Md Ray, or further derail? Not every thread needs to be about PlayStation.
 

Bo_Hazem

Banned
Outdated =/= downplaying.

I hope you seriously wanna know about primitive shaders: they cull unnecessary geometry data before it hurts GPU/CPU performance, hence "primitive".






In other words, it's like an agnostic, hardware-based Nanite technique.
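To make the culling idea concrete, here's a minimal CPU-side sketch in C++ of the kind of per-triangle rejection a primitive shader stage can perform before the rasterizer ever sees the geometry. It assumes screen-space triangles with counter-clockwise front faces and is purely illustrative; it is not AMD's or Sony's actual pipeline.

```cpp
// Minimal sketch of a primitive-culling pass: back-facing, degenerate, and
// sub-pixel triangles are discarded before rasterization.
#include <array>
#include <vector>

struct Vec2 { float x, y; };
using Tri = std::array<Vec2, 3>; // triangle already projected to screen space

// Twice the signed area; negative => back-facing, zero => degenerate.
float signedArea2(const Tri& t) {
    return (t[1].x - t[0].x) * (t[2].y - t[0].y) -
           (t[2].x - t[0].x) * (t[1].y - t[0].y);
}

// Keep only triangles that could actually contribute visible pixels.
std::vector<Tri> cullPrimitives(const std::vector<Tri>& in, float minArea) {
    std::vector<Tri> out;
    out.reserve(in.size());
    for (const Tri& t : in) {
        float a = 0.5f * signedArea2(t);
        if (a <= 0.0f)   continue; // back-facing or degenerate: kill it
        if (a < minArea) continue; // sub-pixel sliver: kill it
        out.push_back(t);          // survivor goes on to rasterization
    }
    return out;
}
```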
 

LiquidMetal14

hide your water-based mammals
I came in here for some PC GPU discussion and see this is a thread about PS and XB.

Anyways, the results both make sense and don't. I don't like gauging performance with synthetic benchmarks like these unless it's just to see some metrics on specific things I want to compare between my rig and others' findings.
 

Kenpachii

Member
Some interesting results found by a youtuber.

  • 3090 at 4K outperforms the 6900 XT: 10-20% better performance.
  • 6900 XT at 1440p outperforms the 3090: 10-15% better performance.
  • 6900 XT is only 20 TFLOPs; the 3090 is 36 TFLOPs.
  • VRAM usage at both 1440p and 4K is around 6 GB. The demo allocates up to 7 GB.
  • System RAM usage at both 1440p and 4K goes all the way up to 20+ GB. Total usage is 2x more than PS5 and XSX.
  • PS5 and XSX only have 13.5 GB of total RAM available, which means their I/O is doing a lot of the heavy lifting.
  • 6900 XT is roughly 50-60 fps at 1440p, 28-35 fps at native 4K.
  • 3090 is roughly 45-50 fps at 1440p, 30-38 fps at native 4K.
6900 XT 1440p


3090 1440p


4K Comparison. Timestamped.




And that's why I said at the reveal of the demo that PC will just use RAM to swap the data in and out, plus more CPU cores, unless that work shifts to the GPU (RTX IO). Yet tech youtubers, on the other end, mentioned this not even once.
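As a rough illustration of that RAM-swapping approach, here's a hypothetical LRU asset cache in C++. The class and names are invented for the sketch; this is not Epic's streaming code, just the general pattern of filling a large RAM budget and evicting the coldest assets when it overflows.

```cpp
// Hypothetical "system RAM as streaming cache" sketch: assets are pulled
// from disk on a miss and evicted least-recently-used over budget.
#include <cstddef>
#include <cstdint>
#include <list>
#include <string>
#include <unordered_map>
#include <vector>

class AssetCache {
public:
    explicit AssetCache(size_t budgetBytes) : budget_(budgetBytes) {}

    const std::vector<uint8_t>& get(const std::string& assetId) {
        auto it = index_.find(assetId);
        if (it != index_.end()) {                         // hit: bump to most recent
            lru_.splice(lru_.begin(), lru_, it->second);
            return it->second->bytes;
        }
        lru_.push_front({assetId, loadFromDisk(assetId)}); // miss: hit the SSD/HDD
        index_[assetId] = lru_.begin();
        used_ += lru_.front().bytes.size();
        while (used_ > budget_ && lru_.size() > 1) {       // evict coldest assets
            used_ -= lru_.back().bytes.size();
            index_.erase(lru_.back().id);
            lru_.pop_back();
        }
        return lru_.front().bytes;
    }

private:
    struct Entry { std::string id; std::vector<uint8_t> bytes; };

    static std::vector<uint8_t> loadFromDisk(const std::string&) {
        return std::vector<uint8_t>(4u << 20);             // stand-in for real file I/O
    }

    std::list<Entry> lru_;
    std::unordered_map<std::string, std::list<Entry>::iterator> index_;
    size_t budget_;
    size_t used_ = 0;
};
```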

Also found this.

GTX 1060 6GB VRAM
Intel i7-4790K (4 cores)
16GB DDR3 1600

 

Bo_Hazem

Banned
It's just a fucking rocky desert which in the end doesn't look that much better than any other desert...
That's more like movie props than a game-engine showcase

Trees and dynamic, non-rigid objects add a big performance penalty, as they're still rendered the old way. You should expect less than half the performance.

We'll need to see real games to know whether UE5 delivers on the next-gen promise.
 

Md Ray

Member
If you kept up with the thread, it was used as an example
That's such an odd example to use.

And then outright calling primitive shaders = outdated, without providing any data (visual/perf difference) as proof is hilarious.

Now that it's been answered twice, you gonna get back on topic, Md Ray, or further derail? Not every thread needs to be about PlayStation.

This is pretty much on topic for the GPUs now. It's not about PlayStation. Primitive shaders are an AMD GPU feature.
 

Dampf

Member
Why post some video when you have this:


[benchmark chart comparing GPUs in the UE5 demo]


AMD performs about equal to Nvidia. This is a good sign, as UE previously favored Nvidia heavily.

RDNA1 tanks hard though, probably because its feature set is still stone-age compared to Turing, Ampere, and RDNA2.
 

GHG

Gold Member
The whole OP doesn't even mention Sony or PlayStation? Wtf are you talking about 😂😂😂😂?! This thread is about two different GPUs' performance in a demo only released on PC.




Some interesting results found by a youtuber.

  • System RAM usage at both 1440p and 4K goes all the way up to 20+ GB. Total usage is 2x more than PS5 and XSX.
  • PS5 and XSX only have 13.5 GB of total RAM available, which means their I/O is doing a lot of the heavy lifting.

Regardless, it's nothing to be upset about since this is all still in early access; things will improve and evolve over time, especially on PC where DirectStorage is still to come.

It bodes well that the 6900XT is providing double the performance of the PS5 even at this early stage.
 

rofif

Can’t Git Gud
Trees and dynamic, non-rigid objects add so much performance penalty as they're still rendered the old way. You should expect less than half the performance.

We should see real games to see if UE5 would provide the next gen promise.
Exactly. It is maddening. They are pushing this crap to games when it should go to movies.
50 fps on a 3090 at 1440p... for a bare desert. Didn't UE5 promise to "optimize" the polygon count so it should give amazing performance with amazing visuals up close?
Death Stranding's rocky lands are 4K120 and look amazing. I fail to see how this is so much better... I understand it is "infinitely" detailed but cmon
 

Bo_Hazem

Banned




Regardless, it's nothing to be upset about since this is all still in early access; things will improve and evolve over time, especially on PC where DirectStorage is still to come.

It bodes well that the 6900XT is providing double the performance of the PS5 even at this early stage.

PS5 was doing around 40 fps at 1440p though, with cinematic assets, raw, zero optimization, while using software Lumen. It would be interesting if there were a console version to test.

Still, UE5 needs some time, but it's a good start.

Exactly. It is maddening. They are pushing this crap to games when it should go to movies.

You can bet that we'll see many Death Stranding-style indies going forward, minus the compelling story.
 

Dampf

Member
Exactly. It is maddening. They are pushing this crap to games when it should go to movies.
This is a sample project in an early-access engine. You shouldn't expect miracles in terms of performance right now. It's basically running at ultra-extreme settings from the get-go.

The performance is still crazy, as it is rendering so many more polygons than any game to date. You can spam a dozen of these extremely high-detailed Ancient Ones and performance still won't suffer. The polygon count is out of this world. You need glasses if you think Death Stranding even remotely compares.

As for deformable meshes, support is coming. It is still early access after all. I don't think this will be a big deal at all, you will get tons of interactive stuff in future games. The post you quoted is incorrect about non Nanite stuff being rendered the old way. Eventually, it will be using mesh shading on PC and Xbox and primitive shading on PS5, which are much, much faster and better pipelines than vertex shading (what's used today). Future is looking very bright indeed.
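A hedged sketch of why those newer pipelines are faster: geometry is grouped into small clusters ("meshlets") that can be rejected with one cheap bounds test before any per-vertex work happens, whereas the old vertex-shader path pays for every vertex regardless. The structs and numbers below are illustrative, not any engine's real API.

```cpp
// Cluster culling, the core trick of mesh/primitive shading pipelines.
#include <cstddef>
#include <cstdint>
#include <vector>

struct BoundingSphere { float x, y, z, radius; };

struct Plane { float nx, ny, nz, d; }; // inside when nx*x + ny*y + nz*z + d >= 0

struct Frustum {
    Plane planes[6];
    bool contains(const BoundingSphere& s) const {
        for (const Plane& p : planes)
            if (p.nx * s.x + p.ny * s.y + p.nz * s.z + p.d < -s.radius)
                return false;               // sphere fully outside this plane
        return true;
    }
};

struct Meshlet {
    BoundingSphere bounds;                  // cluster bounds for coarse culling
    std::vector<uint32_t> vertexIndices;    // up to ~64 vertices per cluster
    std::vector<uint8_t>  triangleIndices;  // small local triangle list
};

// Old vertex-shader path: every vertex of every triangle gets shaded.
// Mesh/primitive-shader path: one cheap sphere test per cluster, and whole
// meshlets are skipped before any per-vertex work happens.
size_t countSurvivingVertices(const std::vector<Meshlet>& meshlets,
                              const Frustum& frustum) {
    size_t survivors = 0;
    for (const Meshlet& m : meshlets)
        if (frustum.contains(m.bounds))
            survivors += m.vertexIndices.size();
    return survivors;
}
```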
 

Bo_Hazem

Banned
This is a sample project in an early-access engine. You shouldn't expect miracles in terms of performance right now.



I believe they can pull this off, the deformable meshes, and other stuff. For now, let's make games about space, no trees ;)
 

rofif

Can’t Git Gud
"You need glasses if you think Death Stranding even remotely compares"
Hell, I need glasses to see any of this difference. Games look so good nowadays...
 

martino

Member
This is a sample project in an early-access engine. You shouldn't expect miracles in terms of performance right now.
Yeah, this is why we won't see a released game made from scratch using Nanite and its strengths for quite a while...
A game merely using v5 of the engine, that's another story.
Sadly, people tend to mix the two and see things that aren't there when a studio says they use v5.
 

Dampf

Member


I believe they can pull this off, the deformable meshes, and other stuff. For now, let's make games about space, no trees ;)
Another important thing I want to mention is that RT hardware-acceleration for Lumen is literally broken right now.

Lumen should get a great speedup once it's able to make use of dedicated hardware found in modern GPUs and consoles.
 

Bo_Hazem

Banned
Another important thing I want to mention is that RT hardware-acceleration for Lumen is literally broken right now.

To me it looks laggy compared to normal RT as well, something like a few cycles slow, maybe to make it less taxing? You notice it when you move the sun around.
 

Dampf

Member
To me it looks laggy compared to normal RT as well, something like a few cycles slow, maybe to make it less taxing? You notice it when you move the sun around.
Yes it does have a high latency. I expect this to get better in the final release as well.
 

RaySoft

Member
No, but can you provide specs as to what they use? Not sure if they are more efficient than Xbox, as they won't say. And any time anyone has a win in a department, they don't shut down when their opponent states their findings. But then again, if I were using an outdated tech (primitive shaders), I would keep my mouth shut
On what grounds do you conclude that Sony's primitive shaders are outdated tech?
 
On what grounds do you conclude that Sony's primitive shaders are outdated tech?
Are you and the bandwagon gang all here yet? Y'all finish assembling the squad?


In 2017, to accommodate developers’ increasing appetite for migrating geometry work to compute shaders, AMD introduced a more programmable geometry pipeline stage in their Vega GPU that ran a new type of shader called a primitive shader. According to AMD corporate fellow Mike Mantor, primitive shaders have “the same access that a compute shader has to coordinate how you bring work into the shader.” Mantor said that primitive shaders would give developers access to all the data they need to effectively process geometry, as well.

Primitive shaders led to task shaders, and that led to mesh shaders.
 

Whitecrow

Banned
I hope you seriously wanna know about primitive shaders: they cull unnecessary geometry data before it hurts GPU/CPU performance, hence "primitive".

In other words, it's like an agnostic, hardware-based Nanite technique.

This post should be stickied on every fkn VRS/RDNA2 thread. Tired of people acting like the Geometry Engine is just smoke.
 
No news here; we already know that RDNA2 is better than Ampere in pure rasterization. What happened in past generations has inverted: now Nvidia's architecture is the one with "utilization issues".
 
Little is known about Sony's implementation of their Geometry Engine, which also houses their primitive shaders. What we do know is that they've brought a lot of their own customization to it, and comparing that to the older primitive shaders is probably not accurate.
Which was my main point before someone alerted all of you guys to come defend this thread. It's customized, yes, but a spade is still a spade at the end of the day. Who knows, it might even be better, but it still fits under primitive. They were pretty vocal about "SSD, SSD, SSD!!!", so it's a bit odd they have been on hush mode about this.

If you and your squad want to discuss this further, create a thread. This isn't really the place for that discussion, if you haven't noticed yet.
 
Eh. The S only has 7.5 GB of RAM available for games, and the XSX and PS5 can only run it at 1440p 30 fps.

The S will probably run it at 800p if it's lucky and its 7.5 GB of RAM doesn't become a bottleneck.
Couldn't they load half-precision assets and set the resolution to meet target performance with that instead (I'm not sure how the data is stored, or if this is even possible)? It would still be more detailed than what we have now.

Also, if this tech is heavily dependent on I/O, Series X will not be able to overcome the throughput difference with extra memory, so either:
- asset quality is reduced in advance so assets fit in the budget (assuming that this is the limiting factor), or
- pop-in galore will ensue when there are too many new objects to stream in.


My guess is that we are about to reevaluate what our GPUs can and cannot do.
 
I need to give it a try myself; I have two SSDs (3.2 GB/s and 1.3 GB/s) + an HDD.

My video card is a 2060, not beefy... But pretty expensive these days.
 

ZywyPL

Banned
So the VRAM usage is already smaller than some current-gen titles, nice. RAM usage on the other hand skyrockets, but once DirectStorage is utilized that one will go down as well. And with DLSS it will be possible to play at 4K60. Great times ahead.
 

DaGwaphics

Member
Valley of the Ancient Sample

Recommended System Specs (100% Screen Percentage)

  • 12-core CPU at 3.4 GHz
  • 64 GB of system RAM
  • GeForce RTX 2080 / AMD Radeon 5700 XT or higher

Minimum System Specs (50% Screen Percentage)

  • 12-core CPU at 3.4 GHz
  • 32 GB of system RAM
  • GeForce GTX 1080 / AMD RX Vega 64

Those are the specs required for running in the editor, right? People who have compiled it and run it natively outside the editor don't need anywhere near that.
 

Dampf

Member
So the VRAM usage is already smaller than some current-gen titles, nice.
I've always been telling people VRAM consumption is not going to increase next gen, at least when using techniques like Sampler Feedback and Nanite + Virtual Texturing.

These techniques allow for much higher fidelity without the need to increase physical VRAM.
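A toy sketch of that idea: the GPU reports which texture tiles it actually sampled (the role Sampler Feedback plays), and only those tiles are kept resident, so VRAM cost tracks what's on screen rather than the full size of the source textures. The tile packing and type names below are invented for illustration.

```cpp
// Feedback-driven tile residency for a virtual texture.
#include <cstdint>
#include <unordered_set>
#include <vector>

// Pack (mip, tileX, tileY) into a single key; assumes < 4096 tiles per axis.
using TileKey = uint32_t;
constexpr TileKey makeKey(uint32_t mip, uint32_t x, uint32_t y) {
    return (mip << 24) | (x << 12) | y;
}

struct VirtualTexture {
    std::unordered_set<TileKey> resident;   // tiles currently in VRAM

    void uploadTile(TileKey) { /* stand-in for the real disk -> VRAM copy */ }

    // 'feedback' is the set of tiles the GPU actually sampled this frame.
    // Upload only the misses and drop anything that went untouched, so VRAM
    // cost tracks what is visible, not the full size of the source textures.
    void updateResidency(const std::vector<TileKey>& feedback) {
        std::unordered_set<TileKey> wanted(feedback.begin(), feedback.end());
        for (TileKey t : wanted)
            if (resident.count(t) == 0)
                uploadTile(t);
        resident.swap(wanted);
    }
};
```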
 

SlimySnake

Flashless at the Golden Globes
AMD performs about equal to Nvidia. This is a good sign, as UE previously favored Nvidia heavily.
thanks.

TheThreadsThatBindUs Three Jackdaws the 570, which is roughly on par with the 4 TFLOPs Series S (1.5x IPC gain over Polaris), is doing 10 fps at 1440p. This is likely an 800p game to hit 30 fps.

In the other thread, Alex mentioned how Lumen is software-based and equivalent to Medium settings. The High Lumen setting is hardware-accelerated using RT cores and can only do 1080p 30 on both the PS5 and XSX. This would put the Series S version at 540p 30 fps.

It looks like Series S isn't going to be holding back next gen. Devs don't seem to really care that it exists and are just content with pushing fidelity instead of resolution and framerate.
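For what it's worth, the 800p guess roughly checks out under the common GPU-bound rule of thumb that frame rate scales inversely with pixel count (an assumption, not a guarantee):

```cpp
// Sanity check of the 1440p -> 800p estimate, assuming frame time scales
// linearly with pixel count.
#include <cstdio>

int main() {
    double px1440 = 2560.0 * 1440.0;   // ~3.69 MP
    double px800  = 1422.0 * 800.0;    // ~1.14 MP (16:9 at 800 lines)
    double fpsAt1440 = 10.0;
    std::printf("~%.0f fps at 800p\n", fpsAt1440 * (px1440 / px800)); // ~32 fps
}
```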
 

Godfavor

Member
the 570, which is roughly on par with the 4 TFLOPs Series S (1.5x IPC gain over Polaris), is doing 10 fps at 1440p. This is likely an 800p game to hit 30 fps.
I'm not sure, but I think that rasterization in UE5 uses GPU compute shaders to draw meshes, not the hardware-accelerated mesh shading (or the PS5's Geometry Engine equivalent).

My speculation is that it shifts some shader work to draw polygons.
I might be completely wrong though.

Edit: The Unreal Engine 5 demo for PS5 might use the Geometry Engine instead.
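For anyone wondering what "drawing meshes with compute shaders" even looks like, here's a bare-bones edge-function rasterizer in C++: each triangle is scan-converted in plain, shader-like code with a per-pixel depth test and no fixed-function hardware involved. Nanite's real software rasterizer is far more sophisticated; this only illustrates the concept.

```cpp
// Software rasterization of one screen-space triangle into a depth buffer.
#include <algorithm>
#include <vector>

struct V { float x, y, z; };                 // screen-space vertex

float edge(const V& a, const V& b, const V& p) {
    return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

void rasterize(const V& v0, const V& v1, const V& v2,
               std::vector<float>& depth, int width, int height) {
    int minX = std::max(0, (int)std::min({v0.x, v1.x, v2.x}));
    int maxX = std::min(width - 1, (int)std::max({v0.x, v1.x, v2.x}));
    int minY = std::max(0, (int)std::min({v0.y, v1.y, v2.y}));
    int maxY = std::min(height - 1, (int)std::max({v0.y, v1.y, v2.y}));
    float area = edge(v0, v1, v2);
    if (area <= 0.0f) return;                // back-facing or degenerate
    for (int y = minY; y <= maxY; ++y)
        for (int x = minX; x <= maxX; ++x) {
            V p{x + 0.5f, y + 0.5f, 0.0f};
            float w0 = edge(v1, v2, p), w1 = edge(v2, v0, p), w2 = edge(v0, v1, p);
            if (w0 < 0 || w1 < 0 || w2 < 0) continue;  // outside the triangle
            float z = (w0 * v0.z + w1 * v1.z + w2 * v2.z) / area;
            float& d = depth[y * width + x];
            if (z < d) d = z;                // a GPU version would use atomics
        }
}
```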
 
No, but can you provide specs as to what they use? Not sure if they are more efficient than Xbox, as they won't say. And any time anyone has a win in a department, they don't shut down when their opponent states their findings. But then again, if I were using an outdated tech (primitive shaders), I would keep my mouth shut
Primitive Shaders are far from outdated; they actually offer the same functionality as Mesh Shaders, which is to bring compute-shader functionality into the graphics pipeline. The underlying hardware for Mesh/Primitive Shaders is also the same; in fact, AMD have not highlighted any changes to their geometry engine and command processor since Vega, and Mesh Shaders (which are simply an API implementation) are converted into Primitive Shader code on all RDNA 2 cards, which very likely includes the Series X/S.

Now, Primitive Shaders modify the graphics pipeline while Mesh Shaders intend to replace it altogether. Which one is better? Not really a good question, since both offer the same functionality and performance when coded and optimised correctly. I know that tessellation can be programmed a little more easily with Mesh Shaders, but that's about it.

Unreal Engine 5 doesn't offer support for Mesh Shaders as far as I'm aware. I know that they use "hyper-optimised compute shaders", which were featured in the PS5 demo last year, and the same demo also featured Primitive Shaders according to Epic.

In the other thread, Alex mentioned how Lumen is software-based and equivalent to Medium settings. The High Lumen setting is hardware-accelerated using RT cores and can only do 1080p 30 on both the PS5 and XSX.

The results are interesting. I'm kind of surprised at the performance cost of Lumen considering it's a software solution, but then again, true dynamic-bounce global illumination is very expensive on the GPU.
 

DaGwaphics

Member
It looks like Series S isn't going to be holding back next gen. Devs don't seem to really care that it exists and are just content with pushing fidelity instead of resolution and framerate.

Until we actually see something running on XSS and PS5/XSX, it is very presumptuous to assume anything. Epic themselves stated that while they targeted 1080p/30, 1080p/60 (or presumably higher resolutions) was possible and expected. You might see the XSS running at 30 fps with a similar fidelity target as XSX/PS5 at 60 fps. Taking an old GPU that might line up in theoretical TF count when adjusting for IPC increases is not indicative of performance on a part with modern accelerators (that only holds for old software). This can easily be seen by trying to run WDL on the 570 with RT settings equivalent to the XSS.

It will be interesting to see which approach is taken in 3rd-party UE5 releases. Will they lower the fidelity of Nanite/Lumen to work with the XSS (can Nanite/Lumen even be used on the XSS?), or will they auto-generate LODs and continue to work more traditionally on XSS?
 

REDRZA MWS

Member
Eh. The S only has 7.5 GB of RAM available for games, and the XSX and PS5 can only run it at 1440p 30 fps.

The S will probably run it at 800p if it's lucky and its 7.5 GB of RAM doesn't become a bottleneck.
Eh. The engine will be optimized more and more over time, and it will run on cell phones, as every Unreal Engine has since the beginning. No need to worry.
 

CrustyBritches

Gold Member
Has anybody been able to get the requisite SDK installed and package this into a standalone .exe, or does anyone have a link to where somebody has been able to do this?
---
The whole point of UE5 early access is so devs(from hobbyist to professional) can get a look at UE5 functionality and start pushing development towards that. It's not meant to be console fanboy fodder.
 

truth411

Member
Is there any confirmation that PS5 usable RAM is 13.5 GB? I've only found that XSX is confirmed at 13.5 GB for games.
No. Folks just assumed that since that's the case for MS, it must be the case for Sony. In fact, since the PS5 SSD is much faster, they don't need to allocate as much RAM to the OS as MS does. Rumor is the PS5 allocates 2 GB to the OS, thus leaving 14 GB available for games.
 