
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

Buggy Loop

Member
Damn, this game annihilates all graphics cards at native 4K with Ultra settings and ray tracing.

Witcher 2 with ubersampling at 4k will also do the same haha.

Really, there must be a bunch of settings in « ultra » that are there for future proofing, like in almost all CDPR games. I bet dropping a few settings to high would drastically improve performance without any noticeable visual difference (then Digital Foundry does a 400% zoom, stops the frame and counts pixels with a machine...)
 

M1chl

Currently Gif and Meme Champion
Who would buy this? I mean, you'd be buying this card just for right now, or you're saying you don't want ray tracing for 999 USD even though it's probably going to become more common as time goes on? What a bad deal. I know these cards also aren't available, so it's not "hurrr, I can buy my Radeon now", you can't, same as with the RTX 3080/3090... I mean seriously, Radeon cards are lacking big time in their feature set.
 


What a turnaround for AMD in the GPU space. And all while drawing quite a lot less power; the efficiency of RDNA2 kicks Ampere's ass.
 
Looks like AMD created the best-performing $999 1080p gaming card :D


The performance drop at 4K is so bad that it barely matches the 3080.
 

longdi

Banned
Yuck, the 6900 XT is $200 too expensive..
I expect AMD to panic price drop next month once Nvidia confirms the 3080 Ti at $999.

Anyway, no reason to rush for a new GPU with the current slate of games. IMO it makes more sense to wait for the 4080 Ti :messenger_sleeping:
 
Looking at a few of the reviews, the 6900 XT actually seems to perform a little better in general than I assumed beforehand, although not by much.

Pity we will have to wait for good AIB models to see how far it can be pushed with a solid manual OC. I'm predicting 2900 MHz+ for some AIBs, which should give it a very solid performance uplift overall. It is a pity custom AIB models have been pushed back and are not launching close to the reference model.

All in all, the 6900 XT seems to perform really well at 1080p and 1440p, generally beating out the 3090 but falling behind a little in 4K, the same way we saw the 6800 XT fall behind the 3080 in 4K. Pretty solid for a card that is $500 cheaper than the 3090, is much smaller and draws a hell of a lot less power.
 

Bolivar687

Banned


What a turnaround for AMD in the GPU space. And all while drawing quite a lot less power; the efficiency of RDNA2 kicks Ampere's ass.

No one was expecting them to compete in performance, let alone win at 1440p, but taking the performance-per-watt crown is something I don't think anyone could even have expected. Then again, Ampere efficiency is pretty bad, so it's not hard to improve on garbage, as Captain Price says.
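For anyone sanity-checking the efficiency claim: performance-per-watt is just average frame rate divided by board power. A minimal sketch of the calculation in Python (the fps figures below are illustrative placeholders, not numbers from any particular review; 300 W and 320 W are the cards' official board-power ratings):

# Performance-per-watt = average fps / board power (W).
# The fps values are placeholders purely to show the calculation, not measured results.
cards = {
    "RX 6800 XT": {"fps": 100.0, "watts": 300.0},
    "RTX 3080":   {"fps": 103.0, "watts": 320.0},
}

for name, d in cards.items():
    print(f"{name}: {d['fps'] / d['watts']:.3f} fps per watt")

Whichever card's measured fps you plug in, the ratio is what reviewers mean by the efficiency crown.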
 
No one was expecting them to compete in performance, let alone win at 1440p, but taking the performance-per-watt crown is something I don't think anyone could even have expected. Then again, Ampere efficiency is pretty bad, so it's not hard to improve on garbage, as Captain Price says.



But they didn't win at 1440p. Is this going to go the same way as with the 6800 XT, where Techspot was showing them winning at 1440p because they suddenly included 3 AMD-sponsored games for the 6800 review, while 50 other websites showed the 6800 losing at 1440p? Most other websites have the 6900 losing at both 1440p and 4K. The 6900 XT is around the 3080 level, 2% faster.
 

Papacheeks

Banned
I love that no one is taking into account that most developers and engineers have been building games and their engines around Nvidia cards/CUDA cores etc. for the past 5-6 years.

It's going to take time until we see a more even playing field in how engines run on and are optimized for Radeon cards.

The fact that they are competing at all on this level is great to see. I just think they should not be charging a premium for the cards if they are on average underperforming in some areas.

Ray tracing means nothing to me at the moment.
 

Ascend

Member
No one was expecting them to compete in performance, let alone win at 1440p, but taking the performance-per-watt crown is something I don't think anyone could even have expected. Then again, Ampere efficiency is pretty bad, so it's not hard to improve on garbage, as Captain Price says.
I expected it... As soon as the PS5 could run a 2.2 GHz RDNA2 chip in a console, it was pretty much a given that they were doing well in terms of power consumption, especially because it's 36 CUs, the same as the 5700. Major improvements in power must have been in place for this to be achieved. What I did not expect was clock speeds exceeding 2.3 GHz.

But they didn't win at 1440p. Is this going to go the same way as with the 6800 XT, where Techspot was showing them winning at 1440p because they suddenly included 3 AMD-sponsored games for the 6800 review, while 50 other websites showed the 6800 losing at 1440p? Most other websites have the 6900 losing at both 1440p and 4K. The 6900 XT is around the 3080 level, 2% faster.
And how many of those games on other websites are Nvidia-sponsored games...?
 
I love that no one is taking into account that most developers and engineers have been building games and their engines around Nvidia cards/CUDA cores etc. for the past 5-6 years.

It's going to take time until we see a more even playing field in how engines run on and are optimized for Radeon cards.

The fact that they are competing at all on this level is great to see. I just think they should not be charging a premium for the cards if they are on average underperforming in some areas.

Ray tracing means nothing to me at the moment.


How were they making their engines around Nvidia if both consoles have had AMD hardware since 2013? Same as now. Of course ray tracing means nothing when AMD can't compete. Even now, when the biggest game just released with it.
 

Ascend

Member
How were they making their engines around Nvidia if both consoles have had AMD hardware since 2013? Same as now. Of course ray tracing means nothing when AMD can't compete. Even now, when the biggest game just released with it.
Where none of the cards in existence actually give respectable performance with max settings. And somehow you expect this RT feature in these cards to be 'future proof'...? Get real.
 

Sun Blaze

Banned
OK, I'm really annoyed by what has befallen the GPU landscape.

It all started with the GTX Titan back in Feb 2013. A $1,000 single-GPU card was unheard of. Sure, you had the GTX 690, but that was two GTX 680s ($500 each) taped together, so it made sense. The OG Titan had 6GB of VRAM, whereas the biggest NVIDIA offered at the time were 4GB variants of the 680/670 and, later on, the 3GB 780/Ti.
With twice as much VRAM as most cards, it walloped everything in performance but, most importantly, was incredible in compute and mopped the floor with everything in double-precision workloads. Yes, it was expensive, but there 100% was value in owning by far the best gaming and compute card on the market.

As time went by, NVIDIA stripped features away from it until it just became a glorified gaming card. The $2,500 of the last Titan is absolutely bonkers, even with the 24GB of VRAM.

Now we're at a point where even AMD charges $1,000 for what is a single-GPU, gaming-oriented card. The 6900 XT is 2080 Ti levels of bad in terms of value proposition.

I'm really displeased with all of this. Thanks to NVIDIA, $1,000+ top-tier GAMING GPUs are a thing, and AMD, wanting to cash in, just followed suit.

The only losers here are the gamers.
 
Where none of the cards in existence actually give respectable performance with max settings. And somehow you expect this RT feature in these cards to be 'future proof'...? Get real.


1440p, max settings, max ray tracing, DLSS, 60 frames with the 3080 and 3090. Sounds like a winner to me for a game that employs the full suite of current ray tracing effects.
 

Papacheeks

Banned
The fact that the 6900 XT and even the 6800 XT compete with a 3090, which is $1,500, is not being talked about. All we get are dumb Nvidia shills talking about ray tracing and DLSS, which currently runs like shit in Cyberpunk. I'm hearing those Cyberpunk numbers are not indicative of final performance, because DRM was on the review code, which according to CD Projekt Red hinders performance by a decent margin.

I'll wait till next week to see with final drivers and all that. And remember, the DXR/next-gen update for Cyberpunk isn't till next year. So any optimization for team red won't be seen till next year, as Cyberpunk, among a couple of other titles, is an Nvidia-based title.

So I stand by the claim that Radeon cards always age well; let's revisit in 5 months when stock is actually available and developers have had more time.
 
The fact that the 6900 XT and even the 6800 XT compete with a 3090, which is $1,500, is not being talked about. All we get are dumb Nvidia shills talking about ray tracing and DLSS, which currently runs like shit in Cyberpunk. I'm hearing those Cyberpunk numbers are not indicative of final performance, because DRM was on the review code, which according to CD Projekt Red hinders performance by a decent margin.

I'll wait till next week to see with final drivers and all that. And remember, the DXR/next-gen update for Cyberpunk isn't till next year. So any optimization for team red won't be seen till next year, as Cyberpunk, among a couple of other titles, is an Nvidia-based title.

So I stand by the claim that Radeon cards always age well; let's revisit in 5 months when stock is actually available and developers have had more time.


So stating facts, backed with hard data from multiple websites, is Nvidia shilling? We should instead behave like you, waiting for theoretical improvements that don't exist and outright lying about the way Cyberpunk runs, even though we have a benchmark for it running at 60 frames with ultra ray tracing and raster?
 

Buggy Loop

Member
I'll give them that, the 8 additional CUs had better scalability for the price than the 3090; I was expecting a tad lower impact.
The 3090 is kind of in a weird place. It's a Blender farm card. If the 6900 XT can give Nvidia enough motivation to compete in the ~$1000 range, the 3080 Ti hopefully gets priced in that range.

As much as I am impressed by the 3080's performance for the price, the 3090 does not make any fucking sense for gaming now.

So it seems RDNA 2, while personally I'm not interested at the moment given the many tradeoffs just for trading a few blows at rasterization, should scale well with an increase in CUs in 2021's refresh. It seems ready for MCM (multi-chip module), while Nvidia will go MCM with Hopper.
 

Papacheeks

Banned
How were they making their engines around Nvidia if both consoles have had AMD hardware since 2013? Same as now. Of course ray tracing means nothing when AMD can't compete. Even now, when the biggest game just released with it.

Those are all based on old GCN, which at this point runs APIs that can run anything. RDNA 2 is a brand new architecture and will improve with time. Most dev kits for PC were running Nvidia cards. All the rigs and test units at E3 2018-2019 had Nvidia GPUs when they showed off Cyberpunk. Same with Gears 5.
 
The fact that the 6900 XT and even the 6800 XT compete with a 3090, which is $1,500, is not being talked about. All we get are dumb Nvidia shills talking about ray tracing and DLSS, which currently runs like shit in Cyberpunk. I'm hearing those Cyberpunk numbers are not indicative of final performance, because DRM was on the review code, which according to CD Projekt Red hinders performance by a decent margin.

At the same time, the 3080 competes against the 6900 XT and easily takes the performance-per-dollar crown from it.
 

Bolivar687

Banned


Jay's review is interesting. Apparently the review BIOS was limiting his card to 250 W and he had to do some tinkering to get it to the advertised 300 W. It seems like AMD was trying to make the temperatures look better than they are, because its reference cooler really wasn't up for it. It's ironic that AMD was reluctant to let AIBs take a crack at it, because it now seems this card really needs them to figure out the right configuration.

Interestingly, he seems to say the 6800 non-XT is the best overall value card this hardware cycle. I was definitely not expecting that from him but now it's pushing me more towards grabbing one of them if I can find it.
 

MadYarpen

Member



Interestingly, he seems to say the 6800 non-XT is the best overall value card this hardware cycle. I was definitely not expecting that from him but now it's pushing me more towards grabbing one of them if I can find it.

That would be true if you could buy it at normal price. They are overpriced now, quite a lot.
 
Can you provide screenshots for that?

I will have to return the 6800 on Thursday. I took a few screenshots and will try to screenshot the same areas when I put my RTX 3080 back in the rig.

While screenshotting, there seems to be some strange issue with cutting off (or not shading textures in) the ray tracing volume at steeper angles. I didn't notice that when I compared them last time. I was on an older driver revision and I only compared night scenes back then. Maybe I did not notice that it was not shading the non-emissive materials correctly because of the dark.

Back then the RT was glitching out altogether, so that you had just grey areas in the puddles. You could fix that by simply disabling and re-enabling RT in the settings menu.

Furthermore, the hideout area (an indoor area where you have some nice RT effects) now crashes the game completely. It worked fine with the last driver version.
 


Jay's review is interesting. Apparently the review BIOS was limiting his card to 250 W and he had to do some tinkering to get it to the advertised 300 W. It seems like AMD was trying to make the temperatures look better than they are, because its reference cooler really wasn't up for it. It's ironic that AMD was reluctant to let AIBs take a crack at it, because it now seems this card really needs them to figure out the right configuration.

Interestingly, he seems to say the 6800 non-XT is the best overall value card this hardware cycle. I was definitely not expecting that from him but now it's pushing me more towards grabbing one of them if I can find it.



I mean, it's a logical conclusion. The 6800 XT is a card that costs as much as the 3080, yet it's worse in every measurable area. The 6900 XT is a card that's the same as a 3080 in raster, worse again in everything else, and costs several hundred dollars more. These cards are very ill-placed, price-wise, in the current market.

People were saying a few months ago that AMD had Nvidia spooked, and that's why they released early and priced the 3080 at $700. The actual fact seems to be that Nvidia messed with AMD. That's why they have even worse stock than Nvidia: they have this bizarre price on paper that's not actually feasible on the market, and the price is $200 more for the 6800 XT right from the start. AMD must have expected Turing prices and were caught by surprise, and the "$650" price for the XT was just to appear better value.
 

llien

Member
Computerbase:

[chart]

However, in the newest games the 6900 XT is ahead even before SAM and OC:

[chart]

[chart]

The 3090 also seems to have the 3080's illness at lower resolutions:

[chart]

Amazing cooler (noise in games):

[chart]

Power consumption is as promised by AMD, at 6800 XT levels:

[chart]

Very power efficient:

[chart]

It easily OCs to beyond 2.5 GHz. Overclocking the memory can decrease performance.


AMD to panic price drop

When people are buying cards instantly at MSRP + 20%, but AMD is gonna "panic price drop" because NV will squeeze in a card between the 3080 and 3080+10%? :messenger_beaming:
L - logic.
 

Bolivar687

Banned
I mean, it's a logical conclusion. The 6800 XT is a card that costs as much as the 3080, yet it's worse in every measurable area. The 6900 XT is a card that's the same as a 3080 in raster, worse again in everything else, and costs several hundred dollars more. These cards are very ill-placed, price-wise, in the current market.

People were saying a few months ago that AMD had Nvidia spooked, and that's why they released early and priced the 3080 at $700. The actual fact seems to be that Nvidia messed with AMD. That's why they have even worse stock than Nvidia: they have this bizarre price on paper that's not actually feasible on the market, and the price is $200 more for the 6800 XT right from the start. AMD must have expected Turing prices and were caught by surprise, and the "$650" price for the XT was just to appear better value.

Since we're in a one-on-one here, I'd like to make a good faith appeal to you to please turn down the warrioring because you can't honestly think the 6800XT is "worse in every measurable area." Surely you must admit that the 6800XT is irrefutably better in efficiency, overclocking headroom and VRAM. I've also seen that it has better frametimes, such as in the GamersNexus review. Hopefully we can just agree to disagree that the reviewers with larger sample sizes seem to mostly give the 1440p crown to AMD, even if you think it's unfair to include AMD sponsored titles and that they should only keep the Nvidia sponsored ones, for whatever reason.

But the 6900 XT having the same rasterization performance as a 3080 is just not true and I don't really know what else to tell you beyond that.

Finally, with respect to what "people were saying," I think we should be honest there as well - the majority of posts in GAF's speculation threads were from green team members swearing on their lives that Big Navi was going to be a 3060 competitor and probably fall short of that. If AMD didn't have Nvidia spooked, there would be no reason to see them destroying the power efficiency across their lineup to squeeze out every last drop of performance, especially after how big a deal Nvidia and their fans made out of power efficiency over the last few cycles. If Nvidia wasn't scared, there would be no reason to release possibly the worst value proposition of all time in the 3090, at production card prices but without the production card drivers. And there would be no rumors of a 3080 Ti coming in a few months to completely obliterate the 3090 from the hardware stack shortly after it launched. As far as cards not hitting MSRP goes - I'm seeing the exact same retailer markups across both manufacturers, so I don't see how that factors into this conversation either. And I just have no idea what you mean by your last sentence - $700 for the x80 and $499 for the x70 are Turing prices, so I don't see how anyone was caught off guard here.
 

Sun Blaze

Banned
Finally, with respect to what "people were saying," I think we should be honest there as well - the majority of posts in GAF's speculation threads were from green team members swearing on their lives that Big Navi was going to be a 3060 competitor and probably fall short of that. If AMD didn't have Nvidia spooked, there would be no reason to see them destroying the power efficiency across their lineup to squeeze out every last drop of performance, especially after how big a deal Nvidia and their fans made out of power efficiency over the last few cycles. If Nvidia wasn't scared, there would be no reason to release possibly the worst value proposition of all time in the 3090, at production card prices but without the production card drivers. And there would be no rumors of a 3080 Ti coming in a few months to completely obliterate the 3090 from the hardware stack shortly after it launched. As far as cards not hitting MSRP goes - I'm seeing the exact same retailer markups across both manufacturers, so I don't see how that factors into this conversation either. And I just have no idea what you mean by your last sentence - $700 for the x80 and $499 for the x70 are Turing prices, so I don't see how anyone was caught off guard here.
Nah, Samsung just sucks.
 
Since we're in a one-on-one here, I'd like to make a good faith appeal to you to please turn down the warrioring because you can't honestly think the 6800XT is "worse in every measurable area." Surely you must admit that the 6800XT is irrefutably better in efficiency, overclocking headroom and VRAM. I've also seen that it has better frametimes, such as in the GamersNexus review. Hopefully we can just agree to disagree that the reviewers with larger sample sizes seem to mostly give the 1440p crown to AMD, even if you think it's unfair to include AMD sponsored titles and that they should only keep the Nvidia sponsored ones, for whatever reason.

But the 6900 XT having the same rasterization performance as a 3080 is just not true and I don't really know what else to tell you beyond that.

Finally, with respect to what "people were saying," I think we should be honest there as well - the majority of posts in GAF's speculation threads were from green team members swearing on their lives that Big Navi was going to be a 3060 competitor and probably fall short of that. If AMD didn't have Nvidia spooked, there would be no reason to see them destroying the power efficiency across their lineup to squeeze out every last drop of performance, especially after how big a deal Nvidia and their fans made out of power efficiency over the last few cycles. If Nvidia wasn't scared, there would be no reason to release possibly the worst value proposition of all time in the 3090, at production card prices but without the production card drivers. And there would be no rumors of a 3080 Ti coming in a few months to completely obliterate the 3090 from the hardware stack shortly after it launched. As far as cards not hitting MSRP goes - I'm seeing the exact same retailer markups across both manufacturers, so I don't see how that factors into this conversation either. And I just have no idea what you mean by your last sentence - $700 for the x80 and $499 for the x70 are Turing prices, so I don't see how anyone was caught off guard here.


You're right, if we go on power efficiency it's something else. What I was thinking of when I said every measurable area was performance. That's where the 6800 XT has literally nothing going for it, outside the VRAM, which is a quantity that will have no practical purpose until long after this card is dead and buried.

Outside of Techspot, the websites with the largest selection of games all have the 3080 winning at 1440p. The ones that tested more than 10 games. Techspot is the only site which has the Radeon winning. That's from the sites I've looked at; I can't check every single review in the world and count them all.
 

Sky-X

Neo Member
The 6800 XT is a card that costs as much as the 3080, yet it's worse in every measurable area.
... How so? It's better in some games and very competitive. I think Nvidia's better 4K performance is mainly due to its memory bandwidth advantage. AMD managed to get really close in a way we haven't witnessed for years. It's also a very efficient card, and it has 16GB of VRAM. I think it's a great card, only let down a little by its RT performance.
 

Bolivar687

Banned
You're right, if we go on power efficiency it's something else. What I was thinking of when I said every measurable area was performance. That's where the 6800 XT has literally nothing going for it, outside the VRAM, which is a quantity that will have no practical purpose until long after this card is dead and buried.

Outside of Techspot, the websites with the largest selection of games all have the 3080 winning at 1440p. The ones that tested more than 10 games. Techspot is the only site which has the Radeon winning. That's from the sites I've looked at; I can't check every single review in the world and count them all.

Hardware Unboxed also has Nvidia behind at 1440p across 18 games. That's a trend I've seen in almost every video and it wasn't until you guys posted a German aggregator, with some reviews having as few as 8 games, that I saw anything to the contrary. Techspot is not the outlier you're making it out to be.
 

Ascend

Member
OK, I'm really annoyed by what has befallen the GPU landscape.

It all started with the GTX Titan back in Feb 2013. A $1,000 single-GPU card was unheard of. Sure, you had the GTX 690, but that was two GTX 680s ($500 each) taped together, so it made sense. The OG Titan had 6GB of VRAM, whereas the biggest NVIDIA offered at the time were 4GB variants of the 680/670 and, later on, the 3GB 780/Ti.
With twice as much VRAM as most cards, it walloped everything in performance but, most importantly, was incredible in compute and mopped the floor with everything in double-precision workloads. Yes, it was expensive, but there 100% was value in owning by far the best gaming and compute card on the market.

As time went by, NVIDIA stripped features away from it until it just became a glorified gaming card. The $2,500 of the last Titan is absolutely bonkers, even with the 24GB of VRAM.

Now we're at a point where even AMD charges $1,000 for what is a single-GPU, gaming-oriented card. The 6900 XT is 2080 Ti levels of bad in terms of value proposition.

I'm really displeased with all of this. Thanks to NVIDIA, $1,000+ top-tier GAMING GPUs are a thing, and AMD, wanting to cash in, just followed suit.

The only losers here are the gamers.
And yet people get mad at me when I tell them not to buy Nvidia on principle. I guess that virtue is gone nowadays.

They're random games. Some are, some aren't, and some aren't sponsored by either Nvidia or AMD. It doesn't win at 1440p.
And that's the thing. Techspot/Hardware Unboxed deliberately choose their games to have a good distribution of different APIs, differently sponsored games, different engines and a wide range of how old the games are.
Technically, it doesn't win at 1440p, but it doesn't lose either. If you call a 10 fps disadvantage when you're over 150 fps (a gap of roughly 6-7%) a loss, well, I don't know what to tell you.

Techspot is the only site which has the Radeon winning.
Everyone else is using Intel processors.
 
It easily OCs to beyond 2.5 GHz. Overclocking the memory can decrease performance.

Yeah, unfortunately, just like the reference 6800 XT, the reference 6900 XT is too power limited to really OC well. The lack of additional power starves the cores, so you end up with bad scaling or an inability to reach higher clocks on the reference models. By decreasing memory clocks, more power is left for the cores, which is why we see this behaviour on the reference models.
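To illustrate that power-budget argument with a quick sketch (purely a toy model, not AMD's actual power-management logic; the wattages and the cube-root clock/power relationship are assumptions for illustration only):

# Toy model: a fixed board power limit shared between the GPU core and memory.
# Assumption: sustained core clock scales roughly with the cube root of the
# power available to the core (P ~ f * V^2 with V ~ f, so P ~ f^3).
BOARD_LIMIT_W = 255        # hypothetical total power limit
OTHER_W = 35               # hypothetical fan/VRM/misc overhead
BASE_CORE_W = 180          # hypothetical core power at the baseline clock
BASE_CLOCK_GHZ = 2.25      # hypothetical baseline sustained clock

def sustained_clock(memory_power_w: float) -> float:
    """Estimate the sustained core clock for a given memory power draw."""
    core_budget = BOARD_LIMIT_W - OTHER_W - memory_power_w
    return BASE_CLOCK_GHZ * (core_budget / BASE_CORE_W) ** (1 / 3)

# Raising memory power (e.g. via a memory OC) leaves less budget for the cores:
for mem_w in (30, 40, 50):
    print(f"memory {mem_w} W -> core ~{sustained_clock(mem_w):.2f} GHz")

The exact numbers don't matter; the point is that under a fixed limit, every extra watt the memory draws is a watt the cores can't use, so a memory OC can actually cost core clocks.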

We've seen AIB 6800 XT cards maintain 2600-2700 MHz on air, which is pretty crazy. Liquid-cooled models do even better than that, getting some great performance, and that is even with the 6800 XT cards still being somewhat power limited by AMD and artificially capped at 2800 MHz in the BIOS, as well as having their memory clocks limited.

Where this card will really shine is with a manual OC on the eventually released AIB models. The 6900 XT has a BIOS clock limit of 3000 MHz and is a much better binned chip, which means it can boost higher, more efficiently, and maintain that clock more stably. AIB models should really fly with a manual OC.

Having said that, even the reference model is $1,000, which is stupid money for the performance increase over the 6800 XT and 3080. It is objectively a bad buy in a general sense; it is just that its direct competitor, the 3090, is an even worse buy at $1,500, which makes the 6900 XT come out looking pretty good by comparison. But I would recommend people buy the 6800 XT/3080 over the 6900 XT/3090.

I can't imagine how expensive the AIB models for the 6900 XT will be, given the gouging going on by AIBs and retailers across both AMD and Nvidia cards. These shortages can't end soon enough to return some normality to pricing and actually allow people to buy any of these cards from AMD/Nvidia.
 