
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

Xyphie

Member
Review embargo for the 6900 XT should lift tomorrow at 15:00 CET (21h30m from this post).

Spoiler: It'll perform like ~5% better than a 6800XT.
 
In terms of 6900 XT performance, it only has 8 more CUs than the 6800 XT. It should be a better-binned chip, so we can expect it to boost a little higher out of the box, but I'm not expecting massive gains over the 6800 XT. It still has the same default power limit and TBP as the 6800 XT, as far as I know.

I would say probably 5-7% increase over the 6800XT at 4K with stock clocks. I'd love to be pleasantly surprised but that is where I see it falling in general, at least for the reference model.

Where this card should really shine is with manual overclocking on the AIB models. The reference model is still going to be power limited, so it probably won't overclock that well compared to what an AIB model can do.

I can see a good AIB model with a manual OC clocking up to 2900 MHz+; the 6900 XT has a max clock of 3000 MHz in the BIOS, compared to 2800 MHz on the 6800 XT, which should grant it a very solid performance boost. On top of that, it should be a better-binned chip, which should allow it to boost higher and maintain those boosts more stably than the 6800 XT.
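For a rough sanity check on those numbers, here's a back-of-the-envelope scaling estimate. This is a naive sketch that assumes performance scales linearly with CU count and clock speed, which real games won't do:

```python
# Naive upper bounds for the 6900 XT's uplift over the 6800 XT.
# CU counts are the actual specs (80 vs 72); the clocks are the
# BIOS caps mentioned above (3000 vs 2800 MHz), not sustained game clocks.
cu_6800xt, cu_6900xt = 72, 80
clk_6800xt, clk_6900xt = 2800, 3000  # MHz

cu_gain = cu_6900xt / cu_6800xt - 1
oc_gain = (cu_6900xt * clk_6900xt) / (cu_6800xt * clk_6800xt) - 1

print(f"CU-only upper bound:     {cu_gain:.1%}")  # ~11.1%
print(f"CU + max-clock OC bound: {oc_gain:.1%}")  # ~19.0%
```

Since both cards share the same power limit at stock, the real out-of-the-box gain lands well below the CU-only bound, which is why ~5-7% is a reasonable guess.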
 

Rikkori

Member
Good news/bad news I guess
[benchmark chart]

[benchmark chart]
 

wachie

Member
Nice showing by AMD there, I was expecting them to get demolished like in Minecraft.

Not even DLSS can work miracles, at least in its higher quality modes. DLSS Quality improves performance by 90 percent, which is great to see. Except, when your starting point is 20 fps, that still only gets you into the high 30s.
That's on the 3090 at 4K, yeesh. Looks like maybe next-gen of GPUs will be able to play this at 4K with full RT.
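The fps arithmetic behind that quote is simple enough to sketch, using the figures quoted above:

```python
# A +90% uplift from DLSS Quality only helps so much when the
# native starting point is ~20 fps (the 3090-at-4K case above).
native_fps = 20
uplift = 0.90
dlss_fps = native_fps * (1 + uplift)
print(dlss_fps)  # 38.0 -> still only "high 30s"
```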
 

Armorian

Banned
Good news/bad news I guess
[benchmark chart]

[benchmark chart]

This chart is a fucking mess :messenger_astonished:

Basically, ray tracing in Cyberpunk 2077 feels like something that's all or nothing. If you have a sufficiently fast graphics card—RTX 30-series, or maybe 2080 Super—you can run with the RT Ultra preset and DLSS and get a pleasing result. RT Medium is okay as well, but not as visually striking. But if you don't have a high-end RTX card, including AMD's RX 6000 GPUs, for now, it looks like you'll be better off running without ray tracing.
 

Rikkori

Member
It's pretty much what I thought since that German article some time ago. The current cards just aren't good enough, but I'll go ahead and say CDPR isn't a great technical studio either. It's also why I didn't care about spending more for RTX; it would've been wasted money. Pretty disappointed tbh, the GI I'm seeing is just shit, and textures are also still mediocre. It was forgivable for TW3 because of when it came out, but ffs, why can individual modders do better than these 2000+ dev studios? They clearly have better-quality assets made, so just give us the option!

Bleah. 🤮
 

Papacheeks

Banned
This chart is a fucking mess :messenger_astonished:

So I was right about this then? Hate to say I told you so, but engines that were literally developed years ago are not going to be optimized for ray-traced effects. It's still going to take time and full-on engine rewrites before the performance impact becomes much smaller.

Which is why Radeon is not betting their entire R&D costs on it.

Playing this without RT, with everything else cranked, is going to be the way to go until they get the next-gen optimization patch out, which won't be until next year. Which I hear is coinciding with the Witcher 3 update.

It's pretty much what I thought since that German article some time ago. The current cards just aren't good enough, but I'll go ahead and say CDPR isn't a great technical studio either. It's also why I didn't care about spending more for RTX; it would've been wasted money. Pretty disappointed tbh, the GI I'm seeing is just shit, and textures are also still mediocre. It was forgivable for TW3 because of when it came out, but ffs, why can individual modders do better than these 2000+ dev studios? They clearly have better-quality assets made, so just give us the option!

Bleah. 🤮

Actually, they are very technical. The original Witcher 3 engine had a renderer that couldn't be done on consoles, so they redid the entire engine. Their games scale very well, and I would argue that Witcher 2 in 2011 was one of the best-looking titles of its time. Same with Witcher 3, even with the downgrade.

All the engines you're seeing ray tracing in had it added after the fact, i.e. they were not built from the ground up with ray tracing in mind. Almost all games currently in production and coming out in the next year are running on older engine builds.

Unreal 5 isn't even available for use yet, and Crytek is doing a brand-new engine rewrite from what I know (could be a new one from scratch?). But there are many engines being reworked on the PC side, and right now they're being held back by DirectX development.

There's a reason the next Battlefield is next-gen only, and that's because it's focusing on the new API, and ray tracing will 100% be a focus.
 
The 3080 is 10% faster than the 6800 XT at 1440p and a whopping 20% faster at 4K. This is a different class of difference. No ray tracing, of course. No DLSS.

This is a pretty brutal difference between cards that are sold at basically the same street price now. My sincere recommendation would be to either get a 3080 or wait until you can. The 6800 XT simply isn't worthwhile in a market that has the 3080.
 
Seriously... I'm sad that AMD's Ray Tracing iteration is so barebones. It was expected since it's their first step into this, but still... I hoped it would be better.
 

Armorian

Banned
So I was right about this then? Hate to say I told you so, but engines that were literally developed years ago are not going to be optimized for ray-traced effects. It's still going to take time and full-on engine rewrites before the performance impact becomes much smaller.

Which is why Radeon is not betting their entire R&D costs on it.

Playing this without RT, with everything else cranked, is going to be the way to go until they get the next-gen optimization patch out, which won't be until next year. Which I hear is coinciding with the Witcher 3 update.

Are you able to turn off individual RT options and set the quality of each one?

If so, I don't need RT shadows, for example. I think at 2560x1080 I will be able to play this comfortably on a 3070 with the RT settings that interest me (maybe with DLSS Quality).

I don't see a reason to set everything to "ultra" if the visual/performance ratio for a setting is fucked up.
 

RoyBatty

Banned
The 3080 is 10% faster than the 6800 XT at 1440p and a whopping 20% faster at 4K. This is a different class of difference. No ray tracing, of course. No DLSS.

This is a pretty brutal difference between cards that are sold at basically the same street price now. My sincere recommendation would be to either get a 3080 or wait until you can. The 6800 XT simply isn't worthwhile in a market that has the 3080.

At 1080p medium, the 6800 XT beats the RTX 3090 ($650 vs $1,500).

At 1440p medium, the 6800 XT is on par with the RTX 3080.

And the game is an NVIDIA-sponsored title with Hairworks, PhysX, DLSS, RT only for NV, etc.

Nice try.
 
Surely this was expected in an NVIDIA-sponsored title? Same with AMD-sponsored games, where the 6800 XT beats a 3090.


Thing is, the 3080 beats the 6800 XT in everything at 4K and in most games at 1440p, regardless of which GPU vendor's logo appears at the start of the game. That's why I'm saying: if you're gonna spend 800 dollars or more on a GPU, why buy the 6800 XT and get less of everything? Best to wait until you find a 3080. If the prices were very different between them, maybe, but not at the same price.
 

Papacheeks

Banned
Seriously... I'm sad that AMD's Ray Tracing iteration is so barebones. It was expected since it's their first step into this, but still... I hoped it would be better.
But looking at the lackluster performance on NVIDIA's cards too, I would argue it looks like a case of the engine being unoptimized for ray-traced effects.
 
At 1080p medium, the 6800 XT beats the RTX 3090 ($650 vs $1,500).

At 1440p medium, the 6800 XT is on par with the RTX 3080.

And the game is an NVIDIA-sponsored title with Hairworks, PhysX, DLSS, RT only for NV, etc.

Nice try.


Did you maybe look at another game? At 1080p, that useless resolution TechSpot didn't even benchmark for these cards, they are equal. The 6800 XT doesn't beat anyone; they're all the same, because the cards aren't being utilized, the CPU is. The rest of the results are as I posted. Is the obsession with AMD so great that you're inventing results now? Why are you worshipping a company instead of choosing what's best for yourself?

Ah, I see, you've looked at the medium settings results. Scroll down and look at the ultra settings for 1440p.
 

RoyBatty

Banned
Did you maybe look at another game? At 1080p, that useless resolution TechSpot didn't even benchmark for these cards, they are equal. The 6800 XT doesn't beat anyone; they're all the same, because the cards aren't being utilized, the CPU is. The rest of the results are as I posted. Is the obsession with AMD so great that you're inventing results now? Why are you worshipping a company instead of choosing what's best for yourself?

Ah, I see, you've looked at the medium settings results. Scroll down and look at the ultra settings for 1440p.

I had NVIDIA almost all my life; I came from an EVGA RTX 2080 Hybrid.

Now I've ordered a 6800 XT: for the price (similar to a 3070 here), for real VRAM (8 GB is very little for next gen), and because it's better than a 3080 on average at 1440p, beating the 3090 in some games. I already know the cons regarding RT + DLSS.

[1440p average performance chart]




I saw all the results; I just don't like your 'nitpicking' and attitude here, which is very biased. We don't know if the settings have something like Hairworks in there destroying AMD performance. Either way, it's an NVIDIA-sponsored title, like I said before.
 

Armorian

Banned
I had NVIDIA almost all my life; I came from an EVGA RTX 2080 Hybrid.

Now I've ordered a 6800 XT: for the price (similar to a 3070 here), for real VRAM (8 GB is very little for next gen), and because it's better than a 3080 on average at 1440p, beating the 3090 in some games.


I saw all the results; I just don't like your 'nitpicking' and attitude here, which is very biased. We don't know if the settings have something like Hairworks in there destroying AMD performance. Either way, it's an NVIDIA-sponsored title, like I said before.

HW is only killing AMD FPS because they have lower tessellation performance.
 
I had NVIDIA almost all my life; I came from an EVGA RTX 2080 Hybrid.

Now I've ordered a 6800 XT: for the price (similar to a 3070 here), for real VRAM (8 GB is very little for next gen), and because it's better than a 3080 on average at 1440p, beating the 3090 in some games.

[1440p average performance chart]

I saw all the results; I just don't like your 'nitpicking' and attitude here, which is very biased. We don't know if the settings have something like Hairworks in there destroying AMD performance. Either way, it's an NVIDIA-sponsored title, like I said before.


That website is one of the very few that found those results. Almost every other site found the 3080 faster at 1440p.

It may appear as if I'm biased, but the fact of the matter is the 3080 is faster. And better: more features, better next-gen performance. It's just better overall. If you're gonna claim this in relation to its competitor it may appear biased, but that's just how it is. The 3080 is better in every aspect. You can't really sugarcoat it differently.
 

Sun Blaze

Banned
I had NVIDIA almost all my life; I came from an EVGA RTX 2080 Hybrid.

Now I've ordered a 6800 XT: for the price (similar to a 3070 here), for real VRAM (8 GB is very little for next gen), and because it's better than a 3080 on average at 1440p, beating the 3090 in some games. I already know the cons regarding RT + DLSS.

I saw all the results; I just don't like your 'nitpicking' and attitude here, which is very biased. We don't know if the settings have something like Hairworks in there destroying AMD performance. Either way, it's an NVIDIA-sponsored title, like I said before.
3DCenter compiled the data from some of the biggest review sites, notably Eurogamer, TechPowerUp, and Guru3D. It took 17 or 18 of them, and the conclusion is that the 3080 beats the 6800 XT at all resolutions.

Source


[3DCenter meta-analysis chart]


It also gets clobbered in Cyberpunk 2077 at 4K.

[Cyberpunk 2077 4K benchmark chart]
[Cyberpunk 2077 4K benchmark chart]
An NVIDIA-sponsored title, sure, but getting beaten by 15-20% is not good. Throw in the fact that the 3080 has far superior RT for the time being and that AMD has no DLSS equivalent...
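As a side note on how a meta-review like 3DCenter's gets one number out of 17-18 outlets: each site's result is typically expressed relative to a baseline card, and the per-site ratios are then averaged. A minimal sketch, with made-up per-site ratios purely for illustration:

```python
# Combining per-site benchmark results into a single index:
# express each site's 6800 XT average relative to the 3080 (= 1.0),
# then take the geometric mean so no single outlet dominates.
# The ratios below are hypothetical placeholders, not real review data.
from statistics import geometric_mean

site_ratios_4k = [0.93, 0.95, 0.92, 0.96, 0.94]  # hypothetical 6800XT/3080 at 4K
index = geometric_mean(site_ratios_4k) * 100
print(f"6800 XT 4K index (3080 = 100): {index:.1f}")
```

The geometric mean is the usual choice for averaging ratios, since it keeps "A vs B" and "B vs A" comparisons consistent with each other.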
 

Buggy Loop

Member
Grats, you managed to find one of the two out of 17 website reviews of the 6800 XT where the 3080 comes in under the 6800 XT on average at 1440p.


Statistically, the 3080 is above the 6800 XT in rasterization on average, even at lower resolutions, in benchmarks with no RT and no DLSS. I'm not sure why anyone would bother saving the few $ to go AMD at this point when it does not win at anything: not rasterization, not VR, not RT, not AI, not drivers/features.

I guess if you needed a card and this is what you could find, fine. But what if you had both of them in front of you today and had to pick one?
 
So it actually even wins at 1080p? I knew it wins at 1440p because I looked at 13 benchmarks and averaged every score they got. But it actually even wins at 1080p, useless as that may be at this level.

Truly, the way fanboyism works is amazing. That you will cheerlead for a company at your own expense, with your own hard-earned money, for absolutely no reason at all. Just because. And that company doesn't even know we exist. To drop almost a thousand dollars on a card that's worse in every measurable way because you cheerlead for them. Jesus.
 

Sun Blaze

Banned
So it actually even wins at 1080p? I knew it wins at 1440p because I looked at 13 benchmarks and averaged every score they got. But it actually even wins at 1080p, useless as that may be at this level.

Truly, the way fanboyism works is amazing. That you will cheerlead for a company at your own expense, with your own hard-earned money, for absolutely no reason at all. Just because. And that company doesn't even know we exist. To drop almost a thousand dollars on a card that's worse in every measurable way because you cheerlead for them. Jesus.
It's only 4% faster at 1080p/1440p, basically a wash, but still a tiny edge to NVIDIA. It convincingly wins at 4K.

In the 6800 XT's defense, it seems to be a significantly better overclocker, with the Strix LC OC trading blows with and even beating a 3090 when overclocked.
Problem is, for OC results reviewers tend to get golden samples, and it's not the same across the board.
 

Bolivar687

Banned
It's only 4% faster at 1080p/1440p, basically a wash, but still a tiny edge to NVIDIA. It convincingly wins at 4K.

In the 6800 XT's defense, it seems to be a significantly better overclocker, with the Strix LC OC trading blows with and even beating a 3090 when overclocked.
Problem is, for OC results reviewers tend to get golden samples, and it's not the same across the board.

4% faster at 1080p/1440p is a wash, but 6% faster at 4K is a convincing win.

You guys are trying way too hard.

edit: I also noticed that TechPowerUp tested more games than any of the other reviewers on that list, some of whom tested as few as 8 games. So posting that German chart really isn't doing your argument any favors.
 

Sun Blaze

Banned
4% faster at 1080p/1440p is a wash, but 6% faster at 4K is a convincing win.

You guys are trying way too hard.

edit: I also noticed that TechPowerUp tested more games than any of the other reviewers on that list, some of whom tested as few as 8 games. So posting that German chart really isn't doing your argument any favors.
It's actually 3.5 vs 7.4. I just rounded to 4, but I'd say anything above 5% is a win.
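For what it's worth, the percentages being argued over here are just relative fps leads. A quick sketch, with hypothetical average-fps numbers picked only to reproduce the figures quoted above:

```python
# Relative lead of one card over another from average fps.
# The fps values are hypothetical, chosen only to reproduce the
# ~3.5% and ~7.4% figures being discussed.
def lead_pct(fast_fps: float, slow_fps: float) -> float:
    return (fast_fps / slow_fps - 1) * 100

print(round(lead_pct(103.5, 100.0), 1))  # 3.5  (1440p-style gap)
print(round(lead_pct(107.4, 100.0), 1))  # 7.4  (4K-style gap)
```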
 
HW is only killing AMD FPS because they have lower tessellation performance.

While you're right that that's the main reason, the secondary reason is that Hairworks is a closed-source black box, so AMD/Intel can't optimize their drivers for it at all; they have no access to the source code.
 

RoyBatty

Banned
3DCenter compiled the data from some of the biggest review sites, notably Eurogamer, TechPowerUp, and Guru3D. It took 17 or 18 of them, and the conclusion is that the 3080 beats the 6800 XT at all resolutions.

Source

[3DCenter meta-analysis chart]

It also gets clobbered in Cyberpunk 2077 at 4K.

[Cyberpunk 2077 4K benchmark chart]
[Cyberpunk 2077 4K benchmark chart]
An NVIDIA-sponsored title, sure, but getting beaten by 15-20% is not good. Throw in the fact that the 3080 has far superior RT for the time being and that AMD has no DLSS equivalent...

So 3% at 1440p across almost all old games, with almost half the VRAM, and in my country it's 50-70% more expensive. And the 3070 doesn't even beat the 6800 in a single game. Thanks for the summary, but I think I made the right choice.

In this NVIDIA title, at 1440p medium it's the same performance, and at ultra it's 6 FPS more for the 3080. That doesn't look too bad to me. Yep, RT is the difference here, but RT performance is awful even on the 3080 (48 fps minimum, with DLSS!).
 

Sun Blaze

Banned
So 3% at 1440p across almost all old games, with almost half the VRAM, and in my country it's 50-70% more expensive. And the 3070 doesn't even beat the 6800 in a single game. Thanks for the summary, but I think I made the right choice.

In this NVIDIA title, at 1440p medium it's the same performance, and at ultra it's 6 FPS more for the 3080. That doesn't look too bad to me. Yep, RT is the difference here, but RT performance is awful even on the 3080 (48 fps minimum, with DLSS!).
Why do you cheat? 3.7 is much closer to 4 than to 3. Be honest.

If the 3080 is 50-70% more expensive, then yeah, you should go with the 6800 XT.
 

Mister Wolf

Gold Member
At 1080p medium, the 6800 XT beats the RTX 3090 ($650 vs $1,500).

At 1440p medium, the 6800 XT is on par with the RTX 3080.

And the game is an NVIDIA-sponsored title with Hairworks, PhysX, DLSS, RT only for NV, etc.

Nice try.

No one cares about playing at 1080p or using medium settings, which are probably lower than what's on the consoles.
 

Ascend

Member
So with ray tracing set to Ultra, even a 3080 can't hit 60 FPS at 1080p. That's pretty brutal.
I'm not surprised. I've been saying constantly that these cards (i.e. both the 6800 AND the RTX 3000 series) are still too slow for RT. I get hammered for that and spammed with "but DLSS". It's a reality many don't want to face, and it's also the reason that, at this point in time, RT should not be a primary factor in your graphics card upgrade decision.

That website is one of the very few that found those results. Almost every other site found the 3080 faster at 1440p.

It may appear as if I'm biased, but the fact of the matter is the 3080 is faster. And better: more features, better next-gen performance. It's just better overall. If you're gonna claim this in relation to its competitor it may appear biased, but that's just how it is. The 3080 is better in every aspect. You can't really sugarcoat it differently.
They are one of the few websites that test with a Ryzen processor, which is actually the processor many people are using now.
It doesn't 'appear' that you are biased. You're here advertising the RTX 3080 every chance you get.
 

Ascend

Member
Gotta love AMD. Just installed an RX 6800, and Mankind Divided started crashing to desktop whenever I try to load a save. FFS.
Well that sucks...
Are your settings set to fullscreen mode? Try borderless (if it is available). If not, try windowed mode, and make it fullscreen after loading with Alt+Enter.

Try running in DX11 mode if you're using DX12 or vice versa.

Try increasing virtual memory in your Windows settings.

It seems to be a widespread problem with the game, since both AMD and NVIDIA users are encountering this issue.
 

BluRayHiDef

Banned
Good news/bad news I guess
[benchmark chart]

[benchmark chart]

Based on the second chart, my best options are 4K via DLSS Quality mode with Ultra settings and no ray tracing, or 4K via DLSS Performance mode with Ultra settings and ray tracing.

However, dropping from Ultra to Very High with DLSS Quality mode will probably strike the best balance for what I'm looking for.
 

Deleted member 17706

Unconfirmed Member
I had NVIDIA almost all my life; I came from an EVGA RTX 2080 Hybrid.

Now I've ordered a 6800 XT: for the price (similar to a 3070 here), for real VRAM (8 GB is very little for next gen), and because it's better than a 3080 on average at 1440p, beating the 3090 in some games. I already know the cons regarding RT + DLSS.

[1440p average performance chart]

I saw all the results; I just don't like your 'nitpicking' and attitude here, which is very biased. We don't know if the settings have something like Hairworks in there destroying AMD performance. Either way, it's an NVIDIA-sponsored title, like I said before.

If you knew the cons going into it, then more power to you. It's a badass card and totally worth getting if RT and DLSS are things you know you won't care about.
 

BluRayHiDef

Banned
3DCenter compiled the data from some of the biggest review sites, notably Eurogamer, TechPowerUp, and Guru3D. It took 17 or 18 of them, and the conclusion is that the 3080 beats the 6800 XT at all resolutions.

Source

[3DCenter meta-analysis chart]

It also gets clobbered in Cyberpunk 2077 at 4K.

[Cyberpunk 2077 4K benchmark chart]
[Cyberpunk 2077 4K benchmark chart]
An NVIDIA-sponsored title, sure, but getting beaten by 15-20% is not good. Throw in the fact that the 3080 has far superior RT for the time being and that AMD has no DLSS equivalent...

Damn, this game annihilates all graphics cards at native 4K with Ultra settings and ray tracing.
 

Rikkori

Member
Gotta love AMD. Just installed an RX 6800, and Mankind Divided started crashing to desktop whenever I try to load a save. FFS.
Tbf, that game is prone to crashing (especially when going into menus). For what it's worth, it was one of the first games I booted when I got my 6800, and it ran beautifully; it could even do 5K at >30 fps :) It didn't crash once, but I didn't mess around with it too much. It was an older save as well (one I'd also booted with an RX 480 and a Vega 64).
 

MadYarpen

Member
Tbf, that game is prone to crashing (especially when going into menus). For what it's worth, it was one of the first games I booted when I got my 6800, and it ran beautifully; it could even do 5K at >30 fps :) It didn't crash once, but I didn't mess around with it too much. It was an older save as well (one I'd also booted with an RX 480 and a Vega 64).
I will try reinstalling the game, and if that doesn't help I'll write to support, GOG or AMD.
 