
Unreal Engine 5 Benchmarks. 6900xt outperforms 3090 at 1440p. 28 GB RAM usage.

SlimySnake

Member
Feb 5, 2013
12,296
34,714
1,260
Eh. The engine will be optimized more and more over time and it will run on cell phones. As every unreal engine since the beginning. No need to worry.
The engine will, but the games won't. You don't see big AAA third-party games running on cell phones.
 
  • Like
Reactions: jose4gg

DeaDPooL_jlp

Member
Sep 20, 2015
2,935
4,412
640
and people were so excited thinking a PS5 would be running games looking like that lol.
To be fair, both "next gen" consoles have to make a lot of sacrifices to run most games at 4K/60fps, and it's only going to become more of a problem as we move forward. A mid-gen refresh will have to come sooner rather than later to try and keep up with technology.

With that said, I'm so glad I built my first gaming PC last year; best decision I've made in a long time when it comes to gaming.
 

REDRZA MWS

Member
Jan 7, 2018
1,014
1,321
450
To be fair, both "next gen" consoles have to make a lot of sacrifices to run most games at 4K/60fps, and it's only going to become more of a problem as we move forward. A mid-gen refresh will have to come sooner rather than later to try and keep up with technology.

With that said, I'm so glad I built my first gaming PC last year; best decision I've made in a long time when it comes to gaming.
That’s why I’m glad MS offers 1440p as an option for my Series X on my LG GX. Still waiting for that update, Sony 😡
 

dxdt

Member
Jun 21, 2015
60
51
390
No. Folks just assumed that since that's the case for MS, it must be the case for Sony. In fact, since the PS5 SSD is much faster, they don't need to allocate as much RAM to the OS as MS does. Rumor is the PS5 allocates 2GB to the OS, leaving 14GB available for games.
I'm confused how a fast SSD can reduce the OS memory footprint unless you're using a portion of it as swap. But swapping during gaming is probably undesirable because of the extra disk access.
 

//DEVIL//

Member
May 28, 2014
2,360
1,774
690
I'm a proud owner of an ASUS 6900 XT, water cooled. But I don't understand.
Wasn't the first demo running at 30fps/1440p on the PS5? How on earth can a 3090 not reach 60 frames at 1440p when that card is probably three times more powerful than a PS5?
Something is off here
 

Kenpachii

Member
Mar 23, 2018
9,062
11,288
815
I'm a proud owner of an ASUS 6900 XT, water cooled. But I don't understand.
Wasn't the first demo running at 30fps/1440p on the PS5? How on earth can a 3090 not reach 60 frames at 1440p when that card is probably three times more powerful than a PS5?
Something is off here

It's another demo, not optimized yet. Just look at the frame times (ms) on that 3090 in the video the thread starter posted.
 
Jun 1, 2016
2,621
3,425
795
I'm a proud owner of an ASUS 6900 XT, water cooled. But I don't understand.
Wasn't the first demo running at 30fps/1440p on the PS5? How on earth can a 3090 not reach 60 frames at 1440p when that card is probably three times more powerful than a PS5?
Something is off here
The original demo wasn't fake, but it wasn't being played either. It's easier to make a QTE run well than an interactive experience.
 
  • Fire
Reactions: jaysius

SlimySnake

Member
Feb 5, 2013
12,296
34,714
1,260
I'm a proud owner of an ASUS 6900 XT, water cooled. But I don't understand.
Wasn't the first demo running at 30fps/1440p on the PS5? How on earth can a 3090 not reach 60 frames at 1440p when that card is probably three times more powerful than a PS5?
Something is off here
Because the 3090 isn't 3x more powerful than the 10 TFLOPs PS5, or any 10 TFLOPs RDNA 2 card on the market.

Nvidia tripled the shader processor count for the 3080, which lets them inflate the TFLOPs figure, but they took some shortcuts to get there, as explained below.

8704 shader cores × 2 FLOPs per clock (one fused multiply-add) × 1.71 GHz boost clock ≈ 29.8 TFLOPs for the 3080. The 3090 has a few more shader cores and hits about 35.7 TFLOPs.
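The napkin math above is just a sketch of peak throughput (the clock figures are NVIDIA's published boost clocks; sustained in-game clocks vary):

```python
# Peak FP32 throughput: cores x 2 FLOPs per clock (one fused multiply-add) x clock (GHz)
def peak_tflops(shader_cores, boost_ghz, flops_per_clock=2):
    """Theoretical single-precision TFLOPs, not real-world performance."""
    return shader_cores * flops_per_clock * boost_ghz / 1000

print(round(peak_tflops(8704, 1.71), 1))   # RTX 3080 -> 29.8
print(round(peak_tflops(10496, 1.70), 1))  # RTX 3090 -> 35.7
```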

The RTX 3000 cards are built on an architecture NVIDIA calls "Ampere," and its SM, in some ways, takes both the Pascal and the Turing approach. Ampere keeps the 64 FP32 cores as before, but the 64 other cores are now designated as "FP32 and INT32." So, half the Ampere cores are dedicated to floating-point, but the other half can perform either floating-point or integer math, just like in Pascal.

With this switch, NVIDIA is now counting each SM as containing 128 FP32 cores, rather than the 64 that Turing had. The 3070's "5,888 CUDA cores" are perhaps better described as "2,944 CUDA cores, and 2,944 cores that can be CUDA."

As games have become more complex, developers have begun to lean more heavily on integers. An NVIDIA slide from the original 2018 RTX launch suggested that integer math, on average, made up about a quarter of in-game GPU operations.

The downside of the Turing SM is the potential for under-utilization. If, for example, a workload is 25-percent integer math, around a quarter of the GPU’s cores could be sitting around with nothing to do. That’s the thinking behind this new semi-unified core structure, and, on paper, it makes a lot of sense: You can still run integer and floating-point operations simultaneously, but when those integer cores are dormant, they can run floating-point instead.
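The utilization argument above can be put in numbers with a toy model (my own illustration, assuming perfect scheduling, one op per core per clock, and no memory stalls; real GPUs are far messier):

```python
# Toy per-SM throughput model for a mixed float/integer workload.
def turing_ops_per_clock(int_fraction, total_ops=128):
    # Turing: 64 dedicated FP32 + 64 dedicated INT32 cores per SM.
    # Whichever side has more work becomes the bottleneck; the other side idles.
    fp_ops = total_ops * (1 - int_fraction)
    int_ops = total_ops * int_fraction
    cycles = max(fp_ops / 64, int_ops / 64)
    return total_ops / cycles

def ampere_ops_per_clock(int_fraction, total_ops=128):
    # Ampere: 64 FP32 cores + 64 flexible FP32-or-INT32 cores per SM.
    # The flexible half absorbs the integer work and runs float otherwise,
    # so all 128 cores stay busy as long as integer work is <= 50%.
    if int_fraction <= 0.5:
        return 128.0
    cycles = (total_ops * int_fraction) / 64  # flexible half is the bottleneck
    return total_ops / cycles

# With the ~25% integer mix NVIDIA quoted, Turing leaves cores idle:
print(turing_ops_per_clock(0.25))  # ~85.3 ops/clock
print(ampere_ops_per_clock(0.25))  # 128.0 ops/clock
```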


Your 6900 XT has 23 TFLOPs, way below the TFLOPs count of both the 3080 and the 3090, yet it outperforms both GPUs.

The 3080 has 3x the TFLOPs of the 2080 but offers only about 1.8x the performance.
 
  • Like
Reactions: Md Ray

//DEVIL//

Member
May 28, 2014
2,360
1,774
690
Because the 3090 isn't 3x more powerful than the 10 TFLOPs PS5, or any 10 TFLOPs RDNA 2 card on the market.

Nvidia tripled the shader processor count for the 3080, which lets them inflate the TFLOPs figure, but they took some shortcuts to get there, as explained below.

8704 shader cores × 2 FLOPs per clock (one fused multiply-add) × 1.71 GHz boost clock ≈ 29.8 TFLOPs for the 3080. The 3090 has a few more shader cores and hits about 35.7 TFLOPs.




Your 6900 XT has 23 TFLOPs, way below the TFLOPs count of both the 3080 and the 3090, yet it outperforms both GPUs.

The 3080 has 3x the TFLOPs of the 2080 but offers only about 1.8x the performance.
Yes, I get that the 3090's TFLOPs are semi-fake. But again, if it's not three times as powerful, it should still easily be twice as powerful. It's not like the 10 TF on the PS5 is accurate either, as that's the whole thing combined with the CPU performance, whereas the 3090 figure is just the GPU.
I guess it's because of the editor, maybe? Not optimized? Or, like other posts suggested, a different type of demo? No clue, but it doesn't add up.
 

Epic Sax CEO

Member
Nov 4, 2019
713
1,412
385
By the way, that's why pure resolution doesn't matter anymore:

 

Lethal01

Member
Jun 15, 2019
2,501
4,127
590
Yes, I get that the 3090's TFLOPs are semi-fake. But again, if it's not three times as powerful, it should still easily be twice as powerful. It's not like the 10 TF on the PS5 is accurate either, as that's the whole thing combined with the CPU performance, whereas the 3090 figure is just the GPU.
I guess it's because of the editor, maybe? Not optimized? Or, like other posts suggested, a different type of demo? No clue, but it doesn't add up.

Guess the customizations made to the Geometry Engine are really paying off.
 
  • Like
  • Thoughtful
Reactions: Md Ray and Rea
Jul 29, 2013
3,622
7,023
1,095
US
I am a proud owner of Asus 6900xt water cooled. But I don’t understand.
Wasn’t the first demo running at 30fps 2k on ps5 ? How on earth 3090 can’t reach 60 frames at 2k when that card is probably 3 times more powerful than a ps5 ?
Something is off here
Doesn't the benchmark posted here show the 6900 XT at 60fps average, while the 5700 XT is at 29fps? That's double.
 
Last edited:
  • Like
Reactions: octiny

//DEVIL//

Member
May 28, 2014
2,360
1,774
690
Ok I laughed lol.
I could be wrong, but I thought when they say the PS5 is 10 TF and the Xbox Series X is 12 TF, they're counting the whole processing power of the console, not just the GPU. Am I wrong?
 
Last edited:

Rea

Member
Jul 7, 2020
992
4,009
445
Ok I laughed lol.
I could be wrong, but I thought when they say the PS5 is 10 TF and the Xbox Series X is 12 TF, they're counting the whole processing power of the console, not just the GPU. Am I wrong?
Yup, you're wrong. 10 TF and 12 TF are measurements of GPU compute power, respectively. Nothing to do with CPU performance; that's another story.
 
  • Thoughtful
Reactions: Md Ray

//DEVIL//

Member
May 28, 2014
2,360
1,774
690
Yup, you're wrong. 10 TF and 12 TF are measurements of GPU compute power, respectively. Nothing to do with CPU performance; that's another story.
Okay so it’s not the whole system. Got it thanks
 
  • Strength
Reactions: Rea

Whitecrow

Member
May 7, 2018
2,330
3,778
615
By the way, that's why pure resolution doesn't matter anymore:

While I mostly agree with you, there are purists who still want the cleanest IQ possible.
I'm not an Nvidia user, so I'm still not sure how DLSS compares to native 4K (if I'm not mistaken, it can even look better), but outside the DLSS realm, native is still king.

Playing my Pro on my C9, most games have very clean IQ, but I can still see some slight blurriness and room for improvement, and I would gladly welcome it in cases where it doesn't affect a stable frame rate.
 
  • Like
Reactions: ZywyPL

ZywyPL

Member
Nov 27, 2018
6,019
10,713
725
By the way, that's why pure resolution doesn't matter anymore:


With all the different reconstruction techniques out there, most of which only blur the image and create artifacts, I'm afraid native resolution is still the only way to guarantee proper image sharpness and clarity.
 
Dec 24, 2020
429
2,571
340
With all the different reconstruction techniques out there, most of which only blur the image and create artifacts, I'm afraid native resolution is still the only way to guarantee proper image sharpness and clarity.
I can hardly notice the artifacting and other side effects of resolution upscaling methods, honestly.

I literally have to watch DF zoom into strands of hair for 2-3 frames to notice such things, but maybe I just have poor eyesight lol
 

99Luffy

Banned
Sep 10, 2016
1,973
309
450
The VRAM usage is interesting. But I'll wait until we see the PS5 scenes running; they looked a lot more impressive than Valley of the Ancients.
 

SlimySnake

Member
Feb 5, 2013
12,296
34,714
1,260
Yes, I get that the 3090's TFLOPs are semi-fake. But again, if it's not three times as powerful, it should still easily be twice as powerful. It's not like the 10 TF on the PS5 is accurate either, as that's the whole thing combined with the CPU performance, whereas the 3090 figure is just the GPU.
I guess it's because of the editor, maybe? Not optimized? Or, like other posts suggested, a different type of demo? No clue, but it doesn't add up.
The 35 TFLOPs 3090 IS 2x more powerful than the 2080. It's the 30 TFLOPs 3080 that's only 1.8x more powerful.




The 6900 XT is offering a linear performance increase over the roughly 10 TFLOPs 5700 XT; it's doing what it should. The Ampere cards don't seem to scale as well in this demo. A 3080 should be at least 1.8x faster than the 2080, but here it's offering only about 1.4x.
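The scaling claim is just a ratio of average frame rates; as a quick sketch using the fps numbers quoted earlier in this thread (approximate, from the posted benchmark):

```python
# Generational scaling = new average fps / old average fps
def scaling(new_fps, old_fps):
    return new_fps / old_fps

# 6900 XT (60 fps) vs 5700 XT (29 fps), as posted in this thread
print(round(scaling(60, 29), 2))  # ~2.07x, close to the ~2.3x TFLOPs ratio (23 / 10)
```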
 
  • Like
Reactions: Md Ray