
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

Rentahamster

Rodent Whores
I know he prefers Nvidia, but to say the value proposition of the $1000 AMD card is worse than that of the $1500 NV card is total bollocks

He doesn't say it's worse. He says it's just as bad. Actually listen to the video, and hear what he says and not what you want to hear.

 

Rentahamster

Rodent Whores
It's really sad how seriously you guys take this.
 
Looks like the custom AIB model 6900XT cards have three 8-pin power connectors vs. the two on the 6800XT.

That means they are going to be able to draw a hell of a lot more power than the 6800XT, which means these things are definitely going to overclock like crazy.

The main thing holding back clock speeds and performance scaling at higher clocks is that the 6000 series cards are power limited. We see this especially with the reference models, which don't overclock anywhere near as well as the partner models that have a higher power rating.
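For rough context, here's a quick sketch of the power ceilings those connector counts imply, assuming the nominal PCIe ratings of 150 W per 8-pin connector plus 75 W from the slot (actual board power limits are set below these ceilings by the vendors):

```python
# Rough power-budget arithmetic behind the connector-count point above.
# Assumes nominal PCIe ratings: 150 W per 8-pin connector plus 75 W from
# the slot. Real board power limits sit below these theoretical ceilings.

def max_board_power_watts(eight_pin_connectors: int) -> int:
    return eight_pin_connectors * 150 + 75

print(max_board_power_watts(2))  # 375 W ceiling (two 8-pin, reference-style 6800 XT)
print(max_board_power_watts(3))  # 525 W ceiling (three 8-pin 6900 XT AIB cards)
```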



It appears that premium Radeon RX 6900 XT custom graphics cards will all ship with three 8-pin power connectors. We have already reported on ASRock Taichi, ASUS ROG STRIX LC and PowerColor Red Devil – each of these cards has three connectors. Today we can confirm that Radeon RX 6900 XT GAMING OC from Gigabyte will be no exception.
 

Ascend

Member
For once Asrock is releasing a card that doesn't look terrible. Not great, but not terrible.

Anyway... Watch Dogs Legion patch incoming;

Global
  • We have added the ability to retire Prestige Operatives, which will function like their Permadeath behavior: They will be gone for the remainder of the playthrough. Starting a new game will add the Operatives in that playthrough.
  • Fixed the Unequip button in the store not unequipping equipped items.
  • Fixed an issue where skipping a cutscene could cause the cutscene to continue to play while the player regained control over their operative.
  • Fixed an issue that caused the Welcome Page settings to not correctly reset when choosing the Reset to Default option.
  • Fixed an issue that caused clothes to not show up on a character after having saved or loaded the game while wearing an outfit. This fix is going live for Xbox One and PC and will follow shortly for PlayStation 4.
  • Fixed an issue that could cause reflections to judder and pop in with Raytracing enabled.
  • Fixed incorrectly applying low resolution textures on the Outwear Hoodie item in the "Synthetic Error" clothing pack.
  • Fixed an issue that caused the paint from paint grenades and paint guns to not show up when hitting enemies.
PC
  • Enabled Ray Tracing for Ray Tracing capable AMD GPUs.
  • Fixed an issue that could cause the game to crash when using DirectX12.
  • Fixed shimmering artifacts behind particle effects when DLSS was enabled.
  • Fixed an issue that caused the game to not launch with Korean localization despite selecting Korean localization.
  • Fixed an issue that caused double clicking on the Back button in any Tutorial menu to return to the same submenu they attempted to leave.
  • Fixed several issues that could cause the game to crash due to a UI bug.
  • Fixed missing highlight when hovering over buttons in any Clothing Shops in-game.
  • Fixed an issue that could cause the game to crash after interacting with the Walkie Talkie during the Initiation mission.
  • Reordered the DLSS options to correctly reflect which are available when playing on an NVIDIA GPU.
  • Fixed an issue that caused prompts to not correctly reflect a key binding change.
  • Fixed an issue that could crash the game when switching from DX12 to DX11, and then back to DX12 again.
  • Fixed an issue that could cause a crash when force closing the Ubisoft Connect overlay, while the game is launching.
  • Fixed an issue that could cause the game to crash when controlling a drone.
  • Fixed an issue that could cause the game to crash during a Spiderbot takedown sequence.
  • Fixed an issue that could cause the game to crash if an NPC died.
  • Fixed an issue that could cause the game to crash if it was minimized for more than 5 minutes.
  • Fixed an issue that prevented Photomode pictures from being locally saved after uploading to the cloud.
  • Fixed an issue that prevented players from mapping Parkour and Aimed fire to the same keybinding.
  • Fixed an issue that could cause the game to crash when loading into the Open World or starting a new game.



Full list;

That's a LOT of fixes... But I guess this confirms that the benchmarks with RT between nVidia and AMD that were released were pretty much bogus.
 
UWP games from the MS Store especially perform so damn well on Radeon GPUs. But then again, reviewers usually test the biggest and most popular current AAA titles, because those are the games people are buying the newest cards for, and those rarely happen to be games made with AMD's architecture in mind. That's just the way it is, and there's little to no reason to manipulate the benchmark selection just to show how well Radeons perform in games nobody cares about anymore or has never even heard of.

Yeah, that's undoubtedly true. Jensen just knows where to best invest engineering effort with his field engineers. That's exactly the reason why I needed to get a 3080: I wanted to play Cyberpunk in all its glory. It was clear from the beginning that Nvidia wanted this to be the poster child for Ampere. I think it's safe to assume that this is the biggest co-engineering effort between a GPU manufacturer and a game developer to date.
 
UWP games from the MS Store especially perform so damn well on Radeon GPUs. But then again, reviewers usually test the biggest and most popular current AAA titles, because those are the games people are buying the newest cards for, and those rarely happen to be games made with AMD's architecture in mind. That's just the way it is, and there's little to no reason to manipulate the benchmark selection just to show how well Radeons perform in games nobody cares about anymore or has never even heard of.

Because it's pretty much the same code the Xbox is running, lifted straight from console-first development. That was the whole point of UWP.
 
Anyway... Watch Dogs Legion patch incoming;

PC
  • Enabled Ray Tracing for Ray Tracing capable AMD GPUs.

Full list;

That's a LOT of fixes... But I guess this confirms that the benchmarks with RT between nVidia and AMD that were released were pretty much bogus.

Interesting, looks like the RT issues on AMD GPUs were on Ubisoft's side rather than AMD's side. So it looks like the Ray Tracing was not supposed to even be enabled on AMD GPUs until now. Which makes the whole missing reflections thing make more sense now.

Hopefully we will see someone rebenchmark with RT enabled to gauge the real performance on AMD GPUs.
 

Sun Blaze

Banned
Very interesting benchmarks here... Radeon GPUs scale almost linearly in a multi-gpu configuration;
Saw it, which is cool, but it's mostly older games. Multi-GPU, being completely reliant on driver support, is pretty much dead. That's really too bad, because it would be the only way of getting 4K/60fps/RT in modern games.
 

Ascend

Member
Saw it, which is cool, but it's mostly older games. Multi-GPU, being completely reliant on driver support, is pretty much dead. That's really too bad, because it would be the only way of getting 4K/60fps/RT in modern games.
It doesn't have to be reliant on driver support though. The developers of Ashes of the Singularity implemented multi-GPU in their game, and you could run an nVidia and AMD GPU together. And funnily enough, mixing them gave better performance than using the cards of one brand. And performance was better when AMD was the 'lead' card, likely due to their hardware scheduler. Example;

[Chart: Ashes of the Singularity 4K benchmark, mixed AMD + nVidia multi-GPU results]
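As a rough illustration of why application-level multi-GPU can scale close to linearly, here's a minimal alternate-frame-rendering (AFR) throughput sketch with made-up frame times; it's not Ashes' actual implementation, just the general idea:

```python
# Minimal sketch of alternate-frame-rendering (AFR) throughput with two GPUs.
# Frame times below are made-up illustrative numbers. With strict frame
# alternation, each card only has to deliver every other frame, so the pair
# outputs two frames per slowest-card frame time (ignoring sync/copy overhead).

def afr_fps(frame_time_a_ms: float, frame_time_b_ms: float) -> float:
    return 2 * 1000.0 / max(frame_time_a_ms, frame_time_b_ms)

print(afr_fps(25.0, 25.0))  # two matched 40 fps cards -> ~80 fps, near-linear scaling
print(afr_fps(25.0, 20.0))  # a mismatched pair is held back by the slower card
```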


 

Sun Blaze

Banned
It doesn't have to be reliant on driver support though. The developers of Ashes of the Singularity implemented multi-GPU in their game, and you could run an nVidia and AMD GPU together. And funnily enough, mixing them gave better performance than using the cards of one brand. And performance was better when AMD was the 'lead' card, likely due to their hardware scheduler. Example;



It's either that or the devs do it themselves and most won't be bothered to do it for like 0.5% of the gaming population.
Whatever the case, multi-gpu in this day and age is almost dead.
 

llien

Member
Mindfactory (german pc parts shop) stats for week 50.

Nvidia Units 2720 = 57.14%
Radeon Units 2040 = 42.86%


Radeon Top 5 Selling Brand Lines!

  1. RX 580 = 590 Units.
  2. RX 6800 = 530 Units.
  3. RX 5600XT = 400 Units.
  4. RX 570 = 230 Units.
  5. RX 5500XT = 160 Units.


Nvidia Top 5 Selling Brand Lines!

  1. RTX 3060 Ti 8GB = 765 Units.
  2. RTX 3080 = 540 Units.
  3. GTX 1650 = 355 Units.
  4. RTX 3070 = 295 Units.
  5. GTX 1660 Super = 290 Units.
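Quick sanity check on the share percentages quoted above (unit counts taken from the post, nothing else assumed):

```python
# Recomputing the Mindfactory week-50 share figures from the quoted unit counts.
nvidia_units = 2720
radeon_units = 2040
total = nvidia_units + radeon_units

print(f"Nvidia: {nvidia_units / total:.2%}")  # 57.14%
print(f"Radeon: {radeon_units / total:.2%}")  # 42.86%
```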
 
In more stupid AIB pricing news we have this:



I guess we can add it to the pile of shitty AIB pricing for these cards right now. This model is even more expensive than the equivalent 3080 from Gigabyte, completely stupid.

I can only hope that pricing across the board drops once supply is steady but I'm starting to think this might be the new normal. Yaaay.....
 

llien

Member
This model is even more expensive than the equivalent 3080 from Gigabyte, completely stupid.
Can you actually buy the said equivalent GPU for the claimed price?
If not, I think we know the reason for inflated MSRPs.

I, frankly, do not have issues with actual card manufacturers getting some surplus income per card (which might still mean they earn less overall, if there are production shortages) instead of some random scalpers.
 
Can you actually buy the said equivalent GPU for the claimed price?
If not, I think we know the reason for inflated MSRPs.

I, frankly, do not have issues with actual card manufacturers getting some surplus income per card (which might still mean they earn less overall, if there are production shortages) instead of some random scalpers.

I think this is a pretty bad take. I'm quoting the official MSRP for both cards, as that is all we can really go on. Are retailers adding on their "shortage tax" too? Yeah, it looks like it, as are scalpers.

But this argument I don't think flies as we can only go off the MSRP. If the Nvidia cards are going for above MSRP then so are the AMD cards. It sounds like your argument is that the Nvidia cards are selling at retailers for a higher % above MSRP than AMD cards but I think they are probably around the same. They seem to be charging whatever they can get away with.

Aside from that, the official MSRP sounds like it will hold even after supply shortages are over. I'm hoping the MSRP on these AIB cards drop but they may just keep them where they are and pocket the extra profit. Welcome to the new shitty normal.
 

llien

Member
I think this is a pretty bad take. I'm quoting the official MSRP for both cards, as that is all we can really go on. Are retailers adding on their "shortage tax" too? Yeah, it looks like it, as are scalpers.
Welp, remind me MSRP on 2080Ti.

My point is, you can only compare real MSRPs; if even one side (or both) is cheating, it's apples to oranges.
 

Ascend

Member
It's either that or the devs do it themselves and most won't be bothered to do it for like 0.5% of the gaming population.
Whatever the case, multi-gpu in this day and age is almost dead.
This is true. I do think somewhere down the line it will poke its head above water again.
 
Welp, remind me MSRP on 2080Ti.

My point is, you can only compare real MSRPs; if even one side (or both) is cheating, it's apples to oranges.

I get where you are coming from regarding MSRP vs what you can actually buy them for. But my worry is that this is the official MSRP so after all the supply issues are resolved this card and the other AIB models will probably retain this MSRP, which is stupidly high compared to the reference model MSRP.

But then it might all be pointless as we may just live in a post MSRP world when it comes to GPUs and they might all go for above MSRP at retailers as the new normal now.
 

llien

Member
I get where you are coming from regarding MSRP vs what you can actually buy them for. But my worry is that this is the official MSRP so after all the supply issues are resolved this card and the other AIB models will probably retain this MSRP, which is stupidly high compared to the reference model MSRP.

But then it might all be pointless as we may just live in a post MSRP world when it comes to GPUs and they might all go for above MSRP at retailers as the new normal now.

Ultimately, it's supply/demand, right? If people want to spend that much, why not let them.
 

Ascend

Member
I will not ever buy any card that is more than ~$50 above MSRP. Water cooled ones might be an exception, and even those are limited to $100 above MSRP in my book.
And that is assuming the MSRP is reasonable, which has also been questionable lately for quite a few cards.

I think next year things will normalize though. Maybe not in January, but, I don't expect the shortages to last more than 3 months from this point in time. We'll see.
 

BluRayHiDef

Banned
I will not ever buy any card that is more than ~$50 above MSRP. Water cooled ones might be an exception, and even those are limited to $100 above MSRP in my book.
And that is assuming the MSRP is reasonable, which has also been questionable lately for quite a few cards.

I think next year things will normalize though. Maybe not in January, but, I don't expect the shortages to last more than 3 months from this point in time. We'll see.

MSRP is rising, however. So, what difference do your personal rules make in regard to how much you'll have to spend in the future for a card of any class?

The "80 Ti" class cards used to cost $700, but they now cost $1200 and higher. The "60 Ti" class cards used to cost $300, but they now cost $400. So, if $50 to $100 more than MSRP are your cutoff points, then why aren't the increases in MSRPs themselves cutoff points?
 

Ascend

Member
MSRP is rising, however. So, what difference do your personal rules make in regard to how much you'll have to spend in the future for a card of any class?

The "80 Ti" class cards used to cost $700, but they now cost $1200 and higher. The "60 Ti" class cards used to cost $300, but they now cost $400. So, if $50 to $100 more than MSRP are your cutoff points, then why aren't the increases in MSRPs themselves cutoff points?
Did you miss this line?;

And that is assuming the MSRP is reasonable, which has also been questionable lately for quite a few cards.
nVidia has been price gouging like crazy. It's yet another reason why I refuse to buy their products.

What I look for with the MSRP is simple; the performance per dollar should have at least some improvement compared to the previous generation, looking back at two generations.
If we take the 5700XT at $400, and the 6800XT at $650, considering the performance jump, the 6800XT just passes the test.
Take the 5700XT itself, and compare it to the previous card, the RX 580. The RX580 had an MSRP of $230, the 5700XT of $400. You pay about 75% more for 85% more performance. That one also barely passes the test.
Had the 5700XT failed the test, I would have looked at whether the 6800XT is better value than the RX580 to determine if it's worth buying.
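To make that test concrete, here's a minimal sketch; the performance ratio used is just the rough figure quoted in this post, not benchmark data:

```python
# Minimal sketch of the value test described above: a new card "passes" if its
# performance per dollar is at least as good as its predecessor's.
# perf_gain is the new card's performance relative to the old card (1.0 = equal),
# taken here from the rough figure in the post rather than measurements.

def passes_value_test(old_price: float, new_price: float, perf_gain: float) -> bool:
    return (perf_gain / new_price) >= (1.0 / old_price)

# RX 580 ($230) -> 5700 XT ($400), assuming roughly 1.85x the performance:
print(passes_value_test(230, 400, 1.85))  # True: ~75% more money for ~85% more performance
```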

All that flies out the window when you look at many (not all) of the "recent" nVidia cards.

And obviously when there are good deals, I take them. An R9 Fury (which I am still rocking) for $300 was an amazing deal at the time, since the MSRP was $550. That deal gave me 20% more performance for the same price as an R9 390. If it wasn't for that deal, the Fury would not have been worth buying.
 

BluRayHiDef

Banned
Did you miss this line?;


nVidia has been price gouging like crazy. It's yet another reason why I refuse to buy their products.

What I look for with the MSRP is simple; the performance per dollar should have at least some improvement compared to the previous generation, looking back at two generations.
If we take the 5700XT at $400, and the 6800XT at $650, considering the performance jump, the 6800XT just passes the test.
Take the 5700XT itself, and compare it to the previous card, the RX 580. The RX580 had an MSRP of $230, the 5700XT of $400. You pay about 75% more for 85% more performance. That one also barely passes the test.
Had the 5700XT failed the test, I would have looked at whether the 6800XT is better value than the RX580 to determine if it's worth buying.

All that flies out the window when you look at many (not all) of the "recent" nVidia cards.

And obviously when there are good deals, I take them. An R9 Fury (which I am still rocking) for $300 was an amazing deal at the time, since the MSRP was $550. That deal gave me 20% more performance for the same price as an R9 390. If it wasn't for that deal, the Fury would not have been worth buying.

My question still applies. Are you simply going to stop buying graphics cards if they become too expensive by your standards?

As for your example regarding the 5700 XT being 85% faster than the RX 580 but costing only 75% more, it implies that if a card is faster than its predecessor, then it should cost more. Well, new-generation cards of a particular tier are typically faster than their predecessors, which means that - according to your logic - MSRPs should continually rise indefinitely.

As for the claim that Nvidia is price gouging, well, AMD is doing the same. Their RX 6000 Series cards are almost as expensive as the competition but lack the features of the competition (e.g. AI upscaling and respectable ray-tracing performance).
 

Ascend

Member
My question still applies. Are you simply going to stop buying graphics cards if they become too expensive by your standards?
I will not buy a graphics card until one comes around that I determine to have good enough value. They will inevitably come around.

As for your example regarding the 5700 XT being 85% faster than the RX 580 but costing only 75% more, it implies that if a card is faster than its predecessor, then it should cost more. Well, new-generation cards of a particular tier are typically faster than their predecessors, which means that - according to your logic - MSRPs should continually rise indefinitely.
I don't think you really understood what I look at. Whether or not there is a price rise, the performance gain should be larger than the price increase, if there is one.
Prices of everything rise over time because of inflation (technically it's your time being stolen by printing money, but I won't go into that here).

As for the claim that Nvidia is price gouging, well, AMD is doing the same. Their RX 6000 Series cards are almost as expensive as the competition but lack the features of the competition (e.g. AI upscaling and respectable ray-tracing performance).
AMD always had to undercut nVidia for being the underdog. AMD could never have increased prices by themselves, because all their products would be left on the shelves. All the power is in nVidia's hands here. AMD is doing it because nVidia enabled it. Saying 'but AMD also does it' is a weak argument considering the dynamics of the market and the mind share nVidia holds.

Technically it's in the hands of the consumer, but they are too busy enjoying their dopamine rush to make conscious buying decisions.
 

Buggy Loop

Member
So Nvidia is to blame for AMD's fucked up MSRP (with reference card production stopped). Are you MLID?

I'm all for shitting on them for the 2080 Ti MSRP gouging, but as of now?

Let's not forget that not even 0.1 seconds after AMD pulled ahead of Intel for the first time with the Zen 3 series, the prices went up.

AMD is not your friend, Nvidia is not your friend. As soon as one gets the advantage, they'll be the typical corporation. AMD had to go open source because their market share, and being late to the party in technology, means they have to push their tech to be as mainstream as possible just to be relevant; being proprietary would make no sense, since nobody would adopt it.

The day they find a technology that gives them the edge, you can be sure they'll brand it and make it proprietary. Like they tried with SAM being exclusive to 500-series motherboards, Zen 3, and the 6000 series; although that's a cute attempt, it shows they're not the holy knight you think they are.
 

Ascend

Member
So Nvidia is to blame for AMD's fucked up MSRP (with reference card production stopped). Are you MLID?
Right now, no. But for the price hike since the RTX 2000 series for example, yes.

I'm all for shitting on them for the 2080 Ti MSRP gouging, but as of now?
Note that the RTX 3080 MSRP only looks so good because of the RTX 2080 Ti MSRP. In reality it's not that big of a jump if you look at the years elapsed versus the performance improvement per price, compared to Maxwell and Pascal.

Let's not forget that not even 0.1 seconds after AMD pulled ahead of Intel for the first time with the Zen 3 series, the prices went up.
Yes. By 50 bucks, which is not that much at all.
I was more concerned about the price hike of the X570 over the X470 boards than from Zen 2 to Zen 3.

AMD is not your friend, Nvidia is not your friend. As soon as one gets the advantage, they'll be the typical corporation. AMD had to go open source because their market share, and being late to the party in technology, means they have to push their tech to be as mainstream as possible just to be relevant; being proprietary would make no sense, since nobody would adopt it.
So you agree that they were in no position to dictate prices.

The day they find a technology that gives them the edge, you can be sure they'll brand it and make it proprietary. Like they tried with SAM being exclusive to 500-series motherboards, Zen 3, and the 6000 series; although that's a cute attempt, it shows they're not the holy knight you think they are.
Is that why they are collaborating with Intel to implement the tech on their platform?
There are limitations in Zen 2 and older CPUs when it comes to implementing the technology, if you didn't know.

That being said, AMD might indeed abuse their power if they are ever in that position. Which is exactly why competition is important, and which is exactly why supporting nVidia and only wanting AMD around for cheaper nVidia prices is stupid.

----------------------------

I actually came here to post this and almost forgot... Lol;
 

thelastword

Banned
Well. The "NDF" was preoccupied with the Hardware Unboxed thread...
Man I have not even had time to read that thread and watch Steve's video on it... Should be a gold mine that thread....

Anyhoos, it's looking like AMD will dominate with most of the new titles releasing next year. They have made good inroads getting their hardware out there and into devs' hands, and we are now seeing many more games being developed with AMD in mind, but still not gimping Nvidia. Good guy AMD, it's nice to see.
 

spyshagg

Should not be allowed to breed
No reason to buy either Nvidia's or AMD's current cards this year. One gimps on RAM, the other on RT. If you need both things, wait for the late 2021 GPUs.

If there is a need to upgrade today and you will have to keep it for years, RAM is more important. Neither card will handle RT well enough in 2022 and beyond; it doesn't matter if one does 5fps and the other 20fps. Not playable is not playable.
 

llien

Member
makes a heck of a difference for games
I love how people refer to a handful of mostly "it will gimp your fps for no good reason, on top of forcing you to run your games at lower resolution with temporal anti-aliasing, while mumbling 'buh it's bettah theh native 4k'" as "games".

One gimps ...on RT
Sure, John.

[Chart: Dirt 5 ray tracing benchmark results]

 
No reason to buy either Nvidia's or AMD's current cards this year. One gimps on RAM, the other on RT. If you need both things, wait for the late 2021 GPUs.

If there is a need to upgrade today and you will have to keep it for years, RAM is more important. Neither card will handle RT well enough in 2022 and beyond; it doesn't matter if one does 5fps and the other 20fps. Not playable is not playable.


AMD's cards don't handle RT today, so sure, they won't handle it in 2022. But the 3080 and 90 are top of the line. Why wouldn't they handle RT in 2022, since there won't be anything better until around the same time next year as when Nvidia launched Ampere this year? And even with Ampere, the 2080 Ti didn't suddenly become obsolete. DLSS will make sure to give them a long life. With VRAM, more is always better than less. But what can you do, this was the compromise required to hit the price point.

The 3080 Ti will cost 1200 dollars, I suspect. It will have limited adoption; the 10-gig card will still be the go-to card. We shall simply have to wait and see at what point 10 gigs becomes an issue.
 

Antitype

Member


3080ti ftw. No reason to buy 6800xt or 6900xt imo.

A 3080 Ti with 12GB on a 384-bit bus would have been much better. As it stands, those 20GB on a 320-bit bus will surely inflate the price significantly while not adding a whole lot of performance compared to the 3080. The 3090 is already not that much faster, even at 4K, and with lower bandwidth the 3080 Ti doesn't make a whole lot of sense for gaming IMO.
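For a back-of-the-envelope look at the bandwidth point above (the 19 Gbps GDDR6X speed is an assumption, and the 20GB/320-bit configuration was only rumored at the time):

```python
# Rough memory-bandwidth comparison for the configurations discussed above.
# Assumes GDDR6X at 19 Gbps per pin; bandwidth in GB/s = bus width * per-pin speed / 8.

def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float = 19.0) -> float:
    return bus_width_bits * gbps_per_pin / 8

print(bandwidth_gb_s(320))  # 760 GB/s for a 20GB/320-bit card
print(bandwidth_gb_s(384))  # 912 GB/s for a 12GB/384-bit card at the same memory speed
```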

It's great that AMD is even remotely competitive this year to keep Nvidia's prices (somewhat) in check, but there's indeed no real reason to buy their cards at the price points they went with. A 6800XT at ~150 €/$ cheaper would have been a killer product for those who don't care about RT and everything else Nvidia has to offer, basically those gamers interested purely in raster, but alas...
 

llien

Member

3% lol. How would "big companies" handle that, poor things.


even remotely competitive
That green reality distortion field keeps giving.
This is such a crazy statement in this context.
 

spyshagg

Should not be allowed to breed
I love how people refer to a handful of mostly "it will gimp your fps for no good reason, on top of forcing you to run your games at lower resolution with temporal anti-aliasing, while mumbling 'buh it's bettah theh native 4k'" as "games".


Sure, John.

[Chart: Dirt 5 ray tracing benchmark results]



AMD's RT performance has been reviewed and analyzed in all corners of the internet, and I do not believe for one moment that you are unaware of it, or that you genuinely believe the Dirt 5 RT result is representative of it.

AMD's cards don't handle RT today, so sure, they won't handle it in 2022. But the 3080 and 90 are top of the line. Why wouldn't they handle RT in 2022, since there won't be anything better until around the same time next year as when Nvidia launched Ampere this year? And even with Ampere, the 2080 Ti didn't suddenly become obsolete. DLSS will make sure to give them a long life. With VRAM, more is always better than less. But what can you do, this was the compromise required to hit the price point.

The 3080 Ti will cost 1200 dollars, I suspect. It will have limited adoption; the 10-gig card will still be the go-to card. We shall simply have to wait and see at what point 10 gigs becomes an issue.

Our opinions don't differ that much, except I believe that in a generation where consoles will dictate the baseline (16GB and simple RT), both the 3080 and 3070 are a no-go unless you upgrade yearly. If you do not upgrade yearly, wait another year. If you have to upgrade today, always choose rasterization performance + RAM.

Much like at the birth of pixel shaders, when the tech finally becomes the baseline of all new games, none of the first generations of cards will be enough, DLSS or not.

The most ignored issue is that it is a big gamble to bet that games will start being made fully with RT even on PC, because the market is regulated by what the consoles can do. The PS4 launched in 2013 with 8GB of RAM, and 7 years later 8GB is still all you need on PC GPUs. Consoles dictate the rules, and now they have 16GB and simple RT capabilities. Owners of the 3080 will be crying foul play for years to come with its anemic 10GB of RAM. My 1080 Ti has 11GB.
 

Ascend

Member
AMD's cards don't handle RT today, so sure, they won't handle it in 2022. But the 3080 and 90 are top of the line. Why wouldn't they handle RT in 2022, since there won't be anything better until around the same time next year as when Nvidia launched Ampere this year? And even with Ampere, the 2080 Ti didn't suddenly become obsolete. DLSS will make sure to give them a long life. With VRAM, more is always better than less. But what can you do, this was the compromise required to hit the price point.
Define "handling" RT. I would argue that the nVidia cards don't handle RT today either. Let's take the biggest most important game that has been used to push RT. None of the RTX cards can handle Cyberpunk max settings with RT.

In order to get playable framerates at 4K, you have to run DLSS in Ultra Performance mode, which is literally rendering at 720p and upscaling to 4K (which inevitably doesn't look that great), just to get the game running above a 60fps average, and even then the minimums are below 60. And nobody can dispute that fact.
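For reference, the internal render resolutions behind that point, using the commonly cited per-axis scale factors for each DLSS mode at 4K output (these factors are assumptions, not an official NVIDIA table):

```python
# Approximate internal render resolutions for each DLSS mode at 4K output.
# Scale factors are the commonly cited per-axis ratios, stated as assumptions.

DLSS_SCALE = {
    "Quality": 2 / 3,            # ~2560x1440 internal
    "Balanced": 0.58,            # ~2227x1253 internal
    "Performance": 1 / 2,        # 1920x1080 internal
    "Ultra Performance": 1 / 3,  # 1280x720 internal
}

for mode, scale in DLSS_SCALE.items():
    width, height = round(3840 * scale), round(2160 * scale)
    print(f"{mode}: {width}x{height}")
```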

[Chart: Cyberpunk 2077 early benchmark results with RT and DLSS]



"In 4K/Ultra/RT with DLSS Quality, the RTX3090 was able to only offer a 30fps experience."
If you think this performance is somehow acceptable, especially for a $1500 graphics card, or you classify it as "handling RT" or being "top of the line", then the performance of the 6800 cards in the majority of other games with RT is perfectly acceptable as well. The same applies to lowering RT settings to achieve playable framerates.

The best use of DLSS is still without RT, because it enables something like an RTX 2060 to give playable framerates at 4K. RT is currently a liability, not an asset.

More hilarious is the fact that you think this $1500 graphics card, which can't hit 4K/30fps with maxed-out RT settings today without upscaling and can't even hit 4K/60fps with DLSSB (1080p rendering), is somehow good enough for RT in 2022 🤷‍♂️
 