
AMD: Radeon 7900XTX ($999) & 7900XT ($899) Announced | Available December 13th

Fredrik

Member
I'm all relaxed. It's just funny how everyone has been bashing Nvidia forever (for what, exactly?), then another company puts out false advertising and people are okay with that. :messenger_grinning_sweat:

I know. I think RDNA3 is comparable only to the 8nm RTX 3000 series. And we'll see shortly.
It's because Nvidia has been the leader so long that they can do whatever they want price-wise. Until now. The 4080 needs a price reduction going by these AMD figures, though they could be PR bs.
 
7900 XTX - $999
7900 XT - $899

Both Available December 13th




AMD Radeon RX 7900 XTX & XT

Today, AMD announced its first Radeon RX 7000 desktop cards, starting with the high-end 7900 series.

  • Full Ray Tracing Support
  • Up to 1.7x the performance of the RX 6950 XT
  • Multiple Games with FSR support at launch








AMD Radeon RX 7900 series








AMD confirms the Navi 31 GPU has 58 billion transistors and offers up to 61 TFLOPs of single-precision compute performance, with a 5.3 TB/s chiplet interconnect. Thanks to the 5nm node, Navi 31 has 165% higher transistor density than Navi 2X.
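As a sanity check, the 61 TFLOPs figure is consistent with Navi 31's shader count once RDNA 3's dual-issue FP32 is counted; the CU count and ~2.5 GHz boost clock below are my assumptions, not figures from the announcement above:

```python
# Rough sanity check of AMD's "61 TFLOPs" FP32 claim for Navi 31.
# Assumptions (mine, not from the post): 96 CUs, 64 FP32 lanes per CU
# doubled by RDNA 3's dual-issue, and a ~2.5 GHz boost clock.
cus = 96
fp32_lanes = cus * 64 * 2                    # dual-issue doubles effective FP32 lanes
boost_ghz = 2.5
tflops = fp32_lanes * 2 * boost_ghz / 1000   # 2 ops per FMA, GHz -> TFLOPs
print(f"~{tflops:.1f} TFLOPs")               # prints ~61.4 TFLOPs
```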





AMD Navi 31 GPU

AMD confirms that its new RX 7900 XTX will be up to 1.7x faster than the RX 6950 XT at 4K. The company shared its first performance claims for some popular titles:








AMD RX 7900 XTX vs. RX 6950XT Performance

The Radeon 7900 XTX is the first Radeon card to support the DisplayPort 2.1 display interface, offering up to 8K at 165 Hz or 4K at 480 Hz.
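Rough napkin math shows why the new interface matters: both headline modes need more bandwidth than even DP 2.1's fastest UHBR20 link (~77-80 Gbit/s) carries uncompressed, so they rely on Display Stream Compression; DP 1.4's ~26 Gbit/s payload isn't remotely close. The estimate below counts active pixels only (10-bit RGB) and ignores blanking intervals, so it slightly understates the true requirement:

```python
# Uncompressed video bandwidth estimate (active pixels only, blanking ignored).
def gbps(width, height, hz, bits_per_channel=10, channels=3):
    """Return approximate uncompressed bandwidth in Gbit/s."""
    return width * height * hz * bits_per_channel * channels / 1e9

for name, (w, h, hz) in {"4K @ 480 Hz": (3840, 2160, 480),
                         "8K @ 165 Hz": (7680, 4320, 165)}.items():
    print(f"{name}: ~{gbps(w, h, hz):.0f} Gbit/s uncompressed")
```

Both come out well above 100 Gbit/s, which is why these modes are only reachable with DSC even on DP 2.1.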

Let's recap the event
•RDNA3 announced
•~300mm² GCD die size
•Up to 58 billion transistors (165% more transistors per mm²)
•First DisplayPort 2.1 GPUs on the market (support for up to 4K 480Hz & 8K 165Hz)
•Does not use the problematic 12VHPWR connector (no need to change your PSU, if you have a good one)
•World's first chiplet-based gaming GPU architecture (5nm GCD & 6nm MCD chiplets)
•52 TFLOPS & 61 TFLOPS for the 7900XT and 7900XTX respectively
•2.1GHz & 2.3GHz on the 7900XT and 7900XTX respectively
•Memory bandwidth of 800GB/s & 960GB/s on the XT & XTX respectively (according to AMD product pages)
•7900XT features 20GB of GDDR6 VRAM on a 320-bit bus; 24GB on a 384-bit bus for the 7900XTX
•Second-gen Infinity Cache
•Faster interconnect bandwidth (up to 5.3 TB/s)
•80MB & 96MB of Infinity Cache on the 7900XT & 7900XTX respectively
•Costs between $899 & $999 ($600-700 cheaper than the 4090)
•New/faster display/media engine (Radiance Display Engine; dual media engine up to 7x faster)
•Optimised performance for some encoding/streaming apps (OBS, FFmpeg, Premiere, HandBrake; more apps soon)
•1.54x improvement in perf/watt over RDNA2
•New dedicated AI accelerators with a 2.7x AI improvement over RDNA2 (2 AI acceleration units in each CU)
•1.5x higher RT performance than the previous flagship (can be up to 1.8x higher)
•Total board power of 320W for the 7900XT & 355W for the 7900XTX
•New Adrenalin software with more unified features
•Sneak peek at FSR 3.0 with its new frame-doubling feature

Small correction (thanks AncientOrigin for the input):
The 7900XT has a total board power of 300W, not 320W as in one of my bullet points. Thanks for the correction, guys!
 
Are these fast with ray tracing? I mean, NVIDIA has a pretty clear edge on that front.
Nowhere near the RTX4000 series, but should be similar to the RTX3000 series.

I guess FSR2.2 could be getting very close to DLSS 2.0, with less ghosting and other improvements. Wonder how close FSR3 will be to DLSS 3.0.
 

sinnergy

Member
Nowhere near the RTX4000 series, but should be similar to the RTX3000 series.

I guess FSR2.2 could be getting very close to DLSS 2.0, with less ghosting and other improvements. Wonder how close FSR3 will be to DLSS 3.0.
So AMD is basically a generation behind NVIDIA on ray tracing; that's a shame, really.
 

daninthemix

Member
These cards seem decent, but I don't know what the point of releasing 2 cards within $100 of each other is.

Happy I snagged my 4090.
 
Let's recap the event
•RDNA3 announced
•~300mm² GCD die size
•Up to 58 billion transistors (165% more transistors per mm²)
•First DisplayPort 2.1 GPUs on the market (support for up to 4K 480Hz & 8K 165Hz)
•Does not use the problematic 12VHPWR connector (no need to change your PSU, if you have a good one)
•World's first chiplet-based gaming GPU architecture (5nm GCD & 6nm MCD chiplets)
•52 TFLOPS & 61 TFLOPS for the 7900XT and 7900XTX respectively
•2.1GHz & 2.3GHz on the 7900XT and 7900XTX respectively
•Memory bandwidth of 800GB/s & 960GB/s on the XT & XTX respectively (according to AMD product pages)
•7900XT features 20GB of GDDR6 VRAM on a 320-bit bus; 24GB on a 384-bit bus for the 7900XTX
•Second-gen Infinity Cache
•Faster interconnect bandwidth (up to 5.3 TB/s)
•80MB & 96MB of Infinity Cache on the 7900XT & 7900XTX respectively
•Costs between $899 & $999 ($600-700 cheaper than the 4090)
•New/faster display/media engine (Radiance Display Engine; dual media engine up to 7x faster)
•Optimised performance for some encoding/streaming apps (OBS, FFmpeg, Premiere, HandBrake; more apps soon)
•1.54x improvement in perf/watt over RDNA2
•New dedicated AI accelerators with a 2.7x AI improvement over RDNA2 (2 AI acceleration units in each CU)
•1.5x higher RT performance than the previous flagship (can be up to 1.8x higher)
•Total board power of 320W for the 7900XT & 355W for the 7900XTX
•New Adrenalin software with more unified features
•Sneak peek at FSR 3.0 with its new frame-doubling feature
Total board power for the 7900XT was 300W.
 

Dr.D00p

Member
These cards seem decent, but I don't know what the point of releasing 2 cards within $100 of each other is.

The 7900XT probably has a $100 price reduction built into its launch pricing, just in case Nvidia responds by dropping the price of the 4080.

The 7900XTX will stay at $999 but dropping the 7900XT to $799 would make more sense...but AMD won't do it until they have to.
 

Popup

Member
Good pricing, good tech...but, sigh, availability will suck until well into next year and the scalper cunts will ruin everything at launch.
It's a shame. Lol, maybe we should create our own army of enthusiasts who join the scalper pool, using the same methods to obtain stock, but with the purpose of helping other forum members obtain them for the price paid, rather than making any money
 
Yes, the RX 7900XT really isn't attractive enough at only $100 less. Wish they had scrapped it completely and used their production capacity solely for the XTX. Now we get less stock, because they also have to build the other card.
 

PeteBull

Member
Can anyone help me figure out if I could swap out my GTX 1080 for one of these new AMD cards on this prebuilt PC? Or what would I need to do?

ProcessorIntel Core i7-8700 3.20 GHz
Processor Main Features64 bit 6-Core Processor
Cache Per Processor12 MB L3 Cache
Memory16 GB DDR4 + 16 GB Optane Memory
Storage2 TB SATA III 7200 RPM HDD
Optical Drive24x DVD+-R/+-RW DUAL LAYER DRIVE
GraphicsNVIDIA GeForce GTX 1080 8 GB GDDR5X
EthernetGigabit Ethernet
Power Supply600W 80+
CaseCYBERPOWERPC W210 Black Windowed Gaming Case
Depends on the particular game, but I've got an OC'd 8700K here (about 10-15% faster than yours) and it holds a steady 60fps in every game, including driving in CP2077, which is considered one of the heaviest CPU loads around. TL;DR: the CPU will be your bottleneck in very CPU-heavy games/areas; otherwise you're golden for a stable 60fps at 4K.
IMHO get the RX 7900 XTX, and if you see your fps dip while GPU usage sits below, say, 95% (easy to check by installing MSI Afterburner, it's free), then it's time to upgrade the CPU/mobo/RAM to something more powerful. It doesn't need to be a top-end CPU; something like an AM4 mobo, 32GB of 3600MHz CL15-16 RAM and an R7 5800X3D (the 3D part is important) would do.
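PeteBull's check can even be automated: MSI Afterburner can log hardware stats to a file, and a script can flag the CPU-bound samples. The sketch below assumes a hypothetical CSV with `fps` and `gpu_usage` columns; real Afterburner logs use different column names and a header block, so adjust accordingly:

```python
import csv

def cpu_bound_samples(path, fps_target=60, gpu_threshold=95):
    """Flag samples where fps dipped while the GPU sat below ~95% usage:
    the classic signature of a CPU (or RAM) bottleneck rather than a GPU one."""
    flagged = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            fps = float(row["fps"])
            gpu = float(row["gpu_usage"])
            if fps < fps_target and gpu < gpu_threshold:
                flagged.append((fps, gpu))
    return flagged
```

If the function returns a lot of samples, the GPU is waiting on the rest of the system and a CPU/platform upgrade will help more than a faster card.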
 

Denton

Member
Sigh, guess I'm waiting for reviews, seeing AMD pulled all sorts of shit Nvidia and Intel would be proud of here. Like using a 5900X CPU for the 6950XT but a 7900X for the 7900XTX.
https://images.anandtech.com/galleries/8202/AMD%20RDNA%203%20Tech%20Day_Press%20Deck%2071_575px.png
Then they used their frame generation tech in the RT slide claiming up to 2x performance, but hid the fact that the 7900 was using frame gen while the 6950 used just FSR2.
https://images.anandtech.com/galleries/8202/AMD%20RDNA%203%20Tech%20Day_Press%20Deck%2074_575px.png
Oh wow, what the fuck. Very poor form, AMD

Yes, the RX 7900XT really isn't attractive enough at only $100 less. Wish they had scrapped it completely and used their production capacity solely for the XTX. Now we get less stock, because they also have to build the other card.

That's not how it works. The 7900XT is largely a salvaged XTX die: units where some blocks are defective have them disabled and are sold cheaper.
 

Marlenus

Member
It'll have some clear outliers with 1.7x performance and some more in the 1.4x range, etc. In the end, I'm betting that at review time these will compare to the 4080 in rasterization but with 3090-class RT (and probably worse in the crazy RTGI games coming, like Cyberpunk 2077 Overdrive), for $200 less than the 4080, assuming Nvidia doesn't budge on price or pull a rabbit out of its hat.

What's shocking, as Dictator (Alex from DF) says, is how weak the RT uplift is here. It seems to have scaled linearly from RDNA 2, with no huge improvements like the "leakers" were suggesting. As he says, we're at the dawn of waves of games with full RT, without any way to toggle it off. Sprinkle RTX Remix on top of that and... well, I have to ask: what's the point of so much rasterization performance if you're not a pro gamer playing at 1080p with low settings? This range of cards scoffs at any pure rasterization game thrown at it; it's wasted power left on the table, while RT NEEDS every performance gain it can get.

Not sure I understand the proposition here. Benchmarks will be a cold shower. It'll make the 4080 look appealing in the end. Surprised Pikachu right there, as everyone was down on it.

The 4080 is only about 20% faster in raster than the 3090Ti according to NV's own slides.
 

IDKFA

I am Become Bilbo Baggins
I was looking to build a PC next year.

Looks like AMD are getting my money for GPU with the XTX.
 

tusharngf

Member
Can anyone help me figure out if I could swap out my GTX 1080 for one of these new AMD cards on this prebuilt PC? Or what would I need to do?

ProcessorIntel Core i7-8700 3.20 GHz
Processor Main Features64 bit 6-Core Processor
Cache Per Processor12 MB L3 Cache
Memory16 GB DDR4 + 16 GB Optane Memory
Storage2 TB SATA III 7200 RPM HDD
Optical Drive24x DVD+-R/+-RW DUAL LAYER DRIVE
GraphicsNVIDIA GeForce GTX 1080 8 GB GDDR5X
EthernetGigabit Ethernet
Power Supply600W 80+
CaseCYBERPOWERPC W210 Black Windowed Gaming Case
You need at least an i3-12100 to avoid a bottleneck for the next 2-3 years.
 

Orta

Banned
Hopefully this is the long-awaited bitch slap nVidia has been desperately begging for these past few gens.

Any idea when benchmarks are due?
 

kuncol02

Banned
GOW has no built-in benchmark, but the 4090 maintains 120fps+ at 4K/Ultra according to ComputerBase and TestingGames. This favors NVIDIA, though it needs an asterisk because they could have tested totally different sections.
What's the point of comparing AMD cards with Nvidia cards that are 60% more expensive and use almost twice as much electricity?
 

MidGenRefresh

*Refreshes biennially
What's the point of comparing AMD cards with Nvidia cards that are 60% more expensive and use almost twice as much electricity?

There is no point. Both cards are aimed at different consumers with different expectations.

But it's fun so people will still do it.
 

OZ9000

Banned
Sooooo, as a console gamer who has held off building his own gaming PC (still using a 1070 Alienware), is there any consensus on where these will fall performance-wise?

I’m assuming <4090 but >4080?
Stolen from Reddit:

"Raster performance
  • 4090 144 fps ($1,600)
  • 7900XTX 131 fps ($999)
  • 7900XT 115 fps ($899)
  • 4080 16 110 fps ($1,200)
  • 4080 12 90 fps ($900) - or whatever it's renamed to
For RT it might be more like (I did raster * 0.65 for NV and raster * 0.5 for AMD here)
  • 4090 94 fps ($1,600) 66 fps with new scaling
  • 4080 16 72 fps ($1,200) 51 fps with new scaling
  • 7900XTX 65 fps ($999) 41 fps with new scaling
  • 4080 12 59 fps ($899) 41 fps with new scaling
  • 7900XT 55 fps ($899) 37 fps with new scaling"
If raytracing is important to you then you should go with the 4080. If you want to push for performance otherwise, 7900XTX.

Only a couple of weeks until the 4080 reviews come out. Will be interesting for sure.
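For what it's worth, the quoted Reddit estimate is just a flat "RT tax" multiplier applied to the raster numbers (0.65 for Nvidia, 0.5 for AMD, per the post itself); a quick sketch shows how the RT column was derived, landing within a few fps of the quoted values:

```python
# Reproduce the quoted back-of-envelope RT estimates: raster fps times a
# vendor "RT tax" factor (0.65 for Nvidia, 0.5 for AMD, per the quoted post).
raster = {"4090": 144, "4080 16GB": 110, "4080 12GB": 90,
          "7900XTX": 131, "7900XT": 115}

def rt_estimate(card, fps):
    factor = 0.65 if card.startswith("40") else 0.5
    return fps * factor

for card, fps in raster.items():
    print(f"{card}: raster {fps} fps -> RT ~{rt_estimate(card, fps):.0f} fps")
```

It's obviously a crude model (real RT cost varies wildly per game), which is why the poster hedged it as a guess.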
 

Gaiff

SBI’s Resident Gaslighter
What's the point of comparing AMD cards with Nvidia cards that are 60% more expensive and use almost twice as much electricity?
Top card vs top card. We've been doing this since time immemorial.

Also news to me that 450 is almost 2x 300. You can also power limit the 4090 to 350W and retain 95% of its 450W performance.
 

OZ9000

Banned
There is no point. Both cards are aimed at different consumers with different expectations.

But it's fun so people will still do it.
I just think Nvidia cards represent awful value for money, and nothing anyone says can convince me otherwise.

Didn't the GTX Titan top out at $1,000? Now this is the 'norm' for mid to high end GPUs. Fucking lol.

We need a complete reset of the GPU market IMO. A top end graphics card should not cost more than $1,000.
 

MidGenRefresh

*Refreshes biennially
I just think Nvidia cards represent awful value for money, and nothing anyone says can convince me otherwise.

Didn't the GTX Titan top out at $1,000? Now this is the 'norm' for mid to high end GPUs. Fucking lol.

We need a complete reset of the GPU market IMO. A top end graphics card should not cost more than $1,000.

People who buy 4090s are not looking for "value for money". They're looking for the absolute best card, and as long as this is the case, these cards will sell. There's a market for it.
 

Fredrik

Member
Stolen from Reddit:

"Raster performance
  • 4090 144 fps ($1,600)
  • 7900XTX 131 fps ($999)
  • 7900XT 115 fps ($899)
  • 4080 16 110 fps ($1,200)
  • 4080 12 90 fps ($900) - or whatever it's renamed to
For RT it might be more like (I did raster * 0.65 for NV and raster * 0.5 for AMD here)
  • 4090 94 fps ($1,600) 66 fps with new scaling
  • 4080 16 72 fps ($1,200) 51 fps with new scaling
  • 7900XTX 65 fps ($999) 41 fps with new scaling
  • 4080 12 59 fps ($899) 41 fps with new scaling
  • 7900XT 55 fps ($899) 37 fps with new scaling"
If raytracing is important to you then you should go with the 4080. If you want to push for performance otherwise, 7900XTX.

Only a couple of weeks until the 4080 reviews come out. Will be interesting for sure.
Oof, Nvidia will need some masterful PR to sell the 4080 without a price adjustment.
 

Fredrik

Member
People who buy 4090s are not looking for "value for money". They're looking for the absolute best card, and as long as this is the case, these cards will sell. There's a market for it.
Depends how close AMD comes and how much the 7900XTX can be overclocked. I'd say it's fairly reasonable to think Nvidia's market will shrink this time, especially if they don't change the price of the 4080.
 

Pakoe

Member
Thinking about going full team red this generation and selling my 3080.
I might get some solid gains on ultrawide.
 

Fredrik

Member
Nvidia buyers have already done the necessary mental gymnastics to justify spending a lot more for not much extra.

..They're a bit like Apple customers in that regard.
Lol, holster that gun, warrior! The thing is there's no extra with the 4080… going by the available figures it's literally behind, and more expensive, except maybe for RT. I just don't see how that can work out.

There is of course the possibility that AMD has been trolling everyone with some sneaky PR tricks, Gamers Nexus had some fun with some of the figures, but even before AMD's unveil the 4080 was said to be badly priced.
 

GHG

Member
Lol, holster that gun, warrior! The thing is there's no extra with the 4080… going by the available figures it's literally behind, and more expensive, except maybe for RT. I just don't see how that can work out.

There is of course the possibility that AMD has been trolling everyone with some sneaky PR tricks, Gamers Nexus had some fun with some of the figures, but even before AMD's unveil the 4080 was said to be badly priced.

Both them and Nvidia like to get creative with their marketing slides, it's nothing new. We will need to wait for benchmarks to see the reality but based on AMD's and Nvidia's respective claims (the latter regarding the 4080 cards), AMD are looking like they are in a strong position.
 

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
Every time this happens people cheer and say "game over, Nvidia", only to order an Nvidia card five minutes later, hoping some other fool will keep buying AMD so that one day they can be competitive.
 

GymWolf

Member
I only have 3 questions before jumping on the AMD bandwagon:

-Are UE5 titles gonna suffer on AMD cards, since they use Lumen, which is a form of ray tracing? I don't wanna spend 1000+ euros on something that isn't gonna do 4K60 in those titles (or close to it with some settings down a notch). Has anyone tried the Matrix demo on both a 3090Ti and a 6950 to see how big the performance difference is? UE5 games are gonna be a big thing going forward and I don't want a GPU that performs much worse on that engine.

-People say the vanilla AMD cards overclock pretty low. Is there some kind of physical lock, or is it just a matter of pushing a slider higher in Afterburner to make them identical to a third-party card (apart from how cool the temps are, of course)?

-Do we have any info on when preorders start, European prices, and any hint whether this is gonna be another paper launch?

Thanks.
 
Stolen from Reddit:

"Raster performance
  • 4090 144 fps ($1,600)
  • 7900XTX 131 fps ($999)
  • 7900XT 115 fps ($899)
  • 4080 16 110 fps ($1,200)
  • 4080 12 90 fps ($900) - or whatever it's renamed to
For RT it might be more like (I did raster * 0.65 for NV and raster * 0.5 for AMD here)
  • 4090 94 fps ($1,600) 66 fps with new scaling
  • 4080 16 72 fps ($1,200) 51 fps with new scaling
  • 7900XTX 65 fps ($999) 41 fps with new scaling
  • 4080 12 59 fps ($899) 41 fps with new scaling
  • 7900XT 55 fps ($899) 37 fps with new scaling"
If raytracing is important to you then you should go with the 4080. If you want to push for performance otherwise, 7900XTX.

Only a couple of weeks until the 4080 reviews come out. Will be interesting for sure.
If that's the case, then we're looking at 11% more RT performance in favour of the 4080 and 45% in favour of the 4090, while in raster we're looking at 10% more performance in favour of the 4090. Not too shabby, honestly; I expected a bigger RT advantage in Nvidia's favour should this end up being the case.

AMD has great performance here considering the price (great perf/watt), in addition to a lower board power of 355W vs 450W on the 4090. I imagine had AMD gone for that high a board power, they'd be up there with the 4090, but it's good they went with a balanced efficiency/performance ratio rather than aiming for a "space heater". This reinforces my assumption that a 7950XTX is likely on the cards to compete with the 4090 at that tier, with 390-400W of board power to keep the 2x8-pin cables. Time will tell.

Hopefully Reddit did their research properly and is on the money, as that would make Navi 3x super competitive with Nvidia's offerings. Plus Navi gen 3 now offers dedicated AI acceleration in addition to DisplayPort 2.1, which is great.
 

Tams

Member
It'll have some clear outliers with 1.7x performance and some more in the 1.4x range, etc. In the end, I'm betting that at review time these will compare to the 4080 in rasterization but with 3090-class RT (and probably worse in the crazy RTGI games coming, like Cyberpunk 2077 Overdrive), for $200 less than the 4080, assuming Nvidia doesn't budge on price or pull a rabbit out of its hat.

What's shocking, as Dictator (Alex from DF) says, is how weak the RT uplift is here. It seems to have scaled linearly from RDNA 2, with no huge improvements like the "leakers" were suggesting. As he says, we're at the dawn of waves of games with full RT, without any way to toggle it off. Sprinkle RTX Remix on top of that and... well, I have to ask: what's the point of so much rasterization performance if you're not a pro gamer playing at 1080p with low settings? This range of cards scoffs at any pure rasterization game thrown at it; it's wasted power left on the table, while RT NEEDS every performance gain it can get.

Not sure I understand the proposition here. Benchmarks will be a cold shower. It'll make the 4080 look appealing in the end. Surprised Pikachu right there, as everyone was down on it.
Any developer that doesn't make raytracing an optional setting like every other setting on PC needs to be boycotted. We don't need that restrictive shit.

And RT is overblown anyway. Yes, it looks good, but pretty much only if you stop and look at the surfaces using it.
 

Buggy Loop

Member
Sigh, guess I'm waiting for reviews, seeing AMD pulled all sorts of shit Nvidia and Intel would be proud of here. Like using a 5900X CPU for the 6950XT but a 7900X for the 7900XTX.

Then they used their frame generation tech in the RT slide claiming up to 2x performance, but hid the fact that the 7900 was using frame gen while the 6950 used just FSR2.

Holy shit AMD, what have you done


 

adamosmaki

Member
Ah yes, couldn't imagine the day we'd cheer a $1,000 GPU being good value.
The 7900XTX looks good, though the XT at that price doesn't really make sense. I was expecting $700, maybe $750 at most.
 

hlm666

Member
If that's the case, then we're looking at 11% more RT performance in favour of the 4080 and 45% in favour of the 4090, while in raster we're looking at 10% more performance in favour of the 4090. Not too shabby, honestly; I expected a bigger RT advantage in Nvidia's favour should this end up being the case.

AMD has great performance here considering the price (great perf/watt), in addition to a lower board power of 355W vs 450W on the 4090. I imagine had AMD gone for that high a board power, they'd be up there with the 4090, but it's good they went with a balanced efficiency/performance ratio rather than aiming for a "space heater". This reinforces my assumption that a 7950XTX is likely on the cards to compete with the 4090 at that tier, with 390-400W of board power to keep the 2x8-pin cables. Time will tell.

Hopefully Reddit did their research properly and is on the money, as that would make Navi 3x super competitive with Nvidia's offerings. Plus Navi gen 3 now offers dedicated AI acceleration in addition to DisplayPort 2.1, which is great.

Someone made some comparisons from AMD's numbers over on Beyond3D, and the 4090 was far from just 45% faster in RT. We're going to need proper reviews to have any idea how things pan out; it's frustrating AMD weren't more transparent, so now we have to wait another month to know exactly how these all really perform.


Some preliminary numbers comparing 4090 vs 7900XTX based on published AMD numbers.

Cyberpunk 2077, native 4K:
4090: 40fps
3090Ti: 23fps
6900XT LC: 11fps
7900XTX: 17fps (50% faster than 6900XT LC)
The 4090 is 2.3X faster than 7900XTX, the 3090Ti is 35% faster.

Metro Exodus EE, native 4K:
4090: 87fps
3090Ti: 48fps
6900XT LC: 25fps
7900XTX: 37.5fps (50% faster than 6900XT LC)
The 4090 is 2.3X faster than 7900XTX, the 3090Ti is 28% faster.

Dying Light 2 native 4K:
4090: 44fps
3090Ti: 24fps
6900XT LC: 11fps
7900XTX: 20fps (56% faster than 6900XT LC)
The 4090 is 2.2X faster than 7900XTX, the 3090Ti is 20% faster

Hitman 3 native 4K:
4090: 43fps
3090Ti: 23fps
6900XT LC: 16fps
7900XTX: 26fps (85% faster than 6900XT LC)
The 4090 is 65% faster than 7900XTX
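The "X times faster" figures in the post above are straight fps ratios; for anyone wanting to rerun the arithmetic, here's a small sketch (fps values copied from the quoted post):

```python
# Relative RT performance implied by the quoted per-game fps figures.
# Ratios are simple fps divisions, e.g. Cyberpunk: 40 / 17 ≈ 2.35x.
games = {
    "Cyberpunk 2077":  {"4090": 40, "3090Ti": 23, "7900XTX": 17},
    "Metro Exodus EE": {"4090": 87, "3090Ti": 48, "7900XTX": 37.5},
    "Dying Light 2":   {"4090": 44, "3090Ti": 24, "7900XTX": 20},
    "Hitman 3":        {"4090": 43, "3090Ti": 23, "7900XTX": 26},
}
for game, fps in games.items():
    xtx = fps["7900XTX"]
    print(f"{game}: 4090 = {fps['4090'] / xtx:.2f}x the 7900XTX, "
          f"3090Ti = {fps['3090Ti'] / xtx:.2f}x")
```

Note the 7900XTX figures are themselves estimates (the 6900XT LC numbers scaled by AMD's claimed uplift), so these ratios carry that uncertainty with them.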
 

OZ9000

Banned
Oof Nvidia will have to have some masterful PR to sell the 4080 without a price adjustment
To be honest, even with the likely better RT performance, it still isn't worth getting a 4080 at that price over the XTX.

The 4080 would only be worth it if it cost sub-$1k.
 

Mister Wolf

Gold Member
Someone made some comparisons from AMD's numbers over on Beyond3D, and the 4090 was far from just 45% faster in RT. We're going to need proper reviews to have any idea how things pan out; it's frustrating AMD weren't more transparent, so now we have to wait another month to know exactly how these all really perform.


Some preliminary numbers comparing 4090 vs 7900XTX based on published AMD numbers.

Cyberpunk 2077, native 4K:
4090: 40fps
3090Ti: 23fps
6900XT LC: 11fps
7900XTX: 17fps (50% faster than 6900XT LC)
The 4090 is 2.3X faster than 7900XTX, the 3090Ti is 35% faster.

Metro Exodus EE, native 4K:
4090: 87fps
3090Ti: 48fps
6900XT LC: 25fps
7900XTX: 37.5fps (50% faster than 6900XT LC)
The 4090 is 2.3X faster than 7900XTX, the 3090Ti is 28% faster.

Dying Light 2 native 4K:
4090: 44fps
3090Ti: 24fps
6900XT LC: 11fps
7900XTX: 20fps (56% faster than 6900XT LC)
The 4090 is 2.2X faster than 7900XTX, the 3090Ti is 20% faster

Hitman 3 native 4K:
4090: 43fps
3090Ti: 23fps
6900XT LC: 16fps
7900XTX: 26fps (85% faster than 6900XT LC)
The 4090 is 65% faster than 7900XTX

Starting next year, many more games like Metro Exodus EE will release designed entirely around ray-traced lighting. Silent Hill 2 Remake is using Lumen from UE5 in 2023. I'm not sure whether Jedi: Survivor will use Lumen, but it is a UE game and the developers have already said they intend to use a ray-traced lighting system. Both are slated for a 2023 release. We even got confirmation through a job listing that Starfield will utilize ray tracing.


 