
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

Rage Mode basically just increases the power limit, which in turn allows the card to boost higher. It only really increases performance by 1-2%, so it's kind of useless, but I guess it shows AMD is finally getting better at branding, so not all bad.
And is it independent of the rest of the system, or does it hinge on you having an AMD CPU? I thought I read something a while back where if you had both an AMD CPU and GPU, you could reap added benefits.
 
And is it independent of the rest of the system, or does it hinge on you having an AMD CPU? I thought I read something a while back where if you had both an AMD CPU and GPU, you could reap added benefits.

Rage Mode is completely independent of the rest of the system and built entirely into the GPU, so you can turn it on regardless of your CPU.

What you might be thinking of is SAM (Smart Access Memory), which makes use of Resizable BAR support on PCIe under Windows.

SAM currently only works with Ryzen 5000 CPUs and Radeon 6000 GPUs on a 500 series motherboard, although it will likely be ported to older motherboards and maybe even Zen 2 CPUs.

In some games SAM offers no increase, in others a small one; games that benefit tend to see around a 3-6% performance increase, with some outliers gaining 11%.

Eventually SAM will likely work on Nvidia GPUs, Intel CPUs and older Zen CPUs as well, but right now it only works on all-AMD systems.
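
If you want to check whether resizable BAR is actually active on your system without relying on vendor tools, one rough heuristic is to look at the memory heaps the Vulkan driver exposes: with SAM/resizable BAR on, the GPU reports a device-local heap that is also host-visible and far larger than the usual 256 MB window. Below is a minimal sketch assuming the Vulkan SDK is installed; the 512 MB cutoff is just an illustrative threshold I picked, not anything official.

#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    // Bare-bones Vulkan instance, just enough to query the physical devices.
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_0;
    VkInstanceCreateInfo ci{};
    ci.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    ci.pApplicationInfo = &app;
    VkInstance instance;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> gpus(count);
    vkEnumeratePhysicalDevices(instance, &count, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);
        VkPhysicalDeviceMemoryProperties mem;
        vkGetPhysicalDeviceMemoryProperties(gpu, &mem);

        // Look for a memory type that is both device-local (VRAM) and
        // host-visible (CPU-mappable). Without resizable BAR this window is
        // capped at 256 MB; with SAM it covers (nearly) all of the VRAM.
        for (uint32_t i = 0; i < mem.memoryTypeCount; ++i) {
            VkMemoryPropertyFlags flags = mem.memoryTypes[i].propertyFlags;
            if ((flags & VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT) &&
                (flags & VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT)) {
                VkDeviceSize heapSize =
                    mem.memoryHeaps[mem.memoryTypes[i].heapIndex].size;
                std::printf("%s: CPU-visible VRAM heap = %llu MB %s\n",
                            props.deviceName,
                            (unsigned long long)(heapSize >> 20),
                            heapSize > (512ull << 20)
                                ? "(resizable BAR likely active)"
                                : "(standard 256 MB window)");
                break;  // one report per GPU is enough for this check
            }
        }
    }
    vkDestroyInstance(instance, nullptr);
    return 0;
}

It doesn't touch your games or drivers; it only reads back what the driver already reports.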
 

Kenpachii

Member
Only for 1440p. At 4K, it mostly loses to the RTX 3080.

4K is laughable anyway in newer games; look at the best numbers from any of those cards, and it only gets worse from here on out.

Godfall 55 fps
Watch Dogs: Legion 55 fps
Valhalla 45 fps minimum
Dirt 5 67 fps minimum
Horizon Zero Dawn 59 fps minimum
Flight Simulator omegalul
Odyssey 45 fps minimum

1440p is where it's at for PC if you like having more pixels on your screen, but even then I would still stick with these cards at 1080p to get max performance, as even at 1440p the performance is already disappointing in many titles.
 

BluRayHiDef

Banned
4K is laughable anyway in newer games; look at the best numbers from any of those cards, and it only gets worse from here on out.

Godfall 55 fps
Watch Dogs: Legion 55 fps
Valhalla 45 fps minimum
Dirt 5 67 fps minimum
Horizon Zero Dawn 59 fps minimum
Flight Simulator omegalul
Odyssey 45 fps minimum

1440p is where it's at for PC if you like having more pixels on your screen, but even then I would still stick with these cards at 1080p to get max performance, as even at 1440p the performance is already disappointing in many titles.

50 frames per second and higher is still smooth. However, those benchmarks were conducted with the games' settings maxed out (i.e. Ultra quality); hence, 60 frames per second can be attained by lowering the settings to Very High, which is imperceptibly lower in quality than Ultra.

As for 1440p, it looks noticeably inferior to 4K on a 40" screen and larger, and I play on a 55".
 
Rage Mode is completely independent of the rest of the system and built entirely into the GPU, so you can turn it on regardless of your CPU.

What you might be thinking of is SAM (Smart Access Memory), which makes use of Resizable BAR support on PCIe under Windows.

SAM currently only works with Ryzen 5000 CPUs and Radeon 6000 GPUs on a 500 series motherboard, although it will likely be ported to older motherboards and maybe even Zen 2 CPUs.

In some games SAM offers no increase, in others a small one; games that benefit tend to see around a 3-6% performance increase, with some outliers gaining 11%.

Eventually SAM will likely work on Nvidia GPUs, Intel CPUs and older Zen CPUs as well, but right now it only works on all-AMD systems.
Awesome, thank you for the explanation!
 
Hardware Unboxed's review of the RX 6800 XT includes benchmark results for 18 games, which show the card rendering quite a few of them faster than the RTX 3080 at 1440p but slower than the RTX 3080 at 4K.



Godfall (1440p)
RX 6800XT -> 79 FPS Min & 100 FPS Avg
RTX 3080 -> 71 FPS Min & 88 FPS Avg
RTX 3090 -> 80 FPS Min & 104 FPS Avg

Godfall (4K)
RX 6800XT -> 50 FPS Min & 59 FPS Avg
RTX 3080 -> 49 FPS Min & 59 FPS Avg
RTX 3090 -> 55 FPS Min & 68 FPS Avg


Watch Dogs: Legion (1440p)
RX 6800XT -> 70 FPS Min & 85 FPS Avg
RTX 3080 -> 69 FPS Min & 85 FPS Avg
RTX 3090 -> 71 FPS Min & 91 FPS Avg

Watch Dogs: Legion (4K)
RX 6800XT -> 42 FPS Min & 49 FPS Avg
RTX 3080 -> 47 FPS Min & 55 FPS Avg
RTX 3090 -> 50 FPS Min & 61 FPS Avg


Assassin's Creed Valhalla (1440p)
RX 6800XT -> 66 FPS Min & 89 FPS Avg
RTX 3080 -> 56 FPS Min & 75 FPS Avg
RTX 3090 -> 58 FPS Min & 78 FPS Avg

Assassin's Creed Valhalla (4K)
RX 6800XT -> 44 FPS Min & 57 FPS Avg
RTX 3080 -> 42 FPS Min & 56 FPS Avg
RTX 3090 -> 45 FPS Min & 59 FPS Avg


Dirt 5 (1440p)
RX 6800XT -> 104 FPS Min & 117 FPS Avg
RTX 3080 -> 80 FPS Min & 90 FPS Avg
RTX 3090 -> 86 FPS Min & 99 FPS Avg

Dirt 5 (4K)
RX 6800XT -> 72 FPS Min & 80 FPS Avg
RTX 3080 -> 60 FPS Min & 69 FPS Avg
RTX 3090 -> 67 FPS Min & 76 FPS Avg


Death Stranding (1440p)
RX 6800XT -> 140 FPS Min & 161 FPS Avg
RTX 3080 -> 139 FPS Min & 157 FPS Avg
RTX 3090 -> 140 FPS Min & 160 FPS Avg

Death Stranding (4K)
RX 6800XT -> 93 FPS Min & 102 FPS Avg
RTX 3080 -> 97 FPS Min & 107 FPS Avg
RTX 3090 -> 99 FPS Min & 113 FPS Avg


Microsoft Flight Simulator (1440p)
RX 6800XT -> 42 FPS Min & 51 FPS Avg
RTX 3080 -> 40 FPS Min & 50 FPS Avg
RTX 3090 -> 44 FPS Min & 53 FPS Avg

Microsoft Flight Simulator (4K)
RX 6800XT -> 31 FPS Min & 33 FPS Avg
RTX 3080 -> 37 FPS Min & 40 FPS Avg
RTX 3090 -> 41 FPS Min & 44 FPS Avg


Shadow of the Tomb Raider (1440p)
RX 6800XT -> 116 FPS Min & 151 FPS Avg
RTX 3080 -> 119 FPS Min & 154 FPS Avg
RTX 3090 -> 130 FPS Min & 170 FPS Avg

Shadow of the Tomb Raider (4K)
RX 6800XT -> 65 FPS Min & 79 FPS Avg
RTX 3080 -> 71 FPS Min & 87 FPS Avg
RTX 3090 -> 76 FPS Min & 96 FPS Avg


Tom Clancy's Rainbow Six Siege (1440p)
RX 6800XT -> 280 FPS Min & 337 FPS Avg
RTX 3080 -> 273 FPS Min & 327 FPS Avg
RTX 3090 -> 301 FPS Min & 360 FPS Avg

Tom Clancy's Rainbow Six Siege (4K)
RX 6800XT -> 145 FPS Min & 168 FPS Avg
RTX 3080 -> 152 FPS Min & 174 FPS Avg
RTX 3090 -> 176 FPS Min & 201 FPS Avg


F1 2020 (1440p)
RX 6800XT -> 157 FPS Min & 197 FPS Avg
RTX 3080 -> 164 FPS Min & 193 FPS Avg
RTX 3090 -> 176 FPS Min & 203 FPS Avg

F1 2020 (4K)
RX 6800XT -> 100 FPS Min & 124 FPS Avg
RTX 3080 -> 112 FPS Min & 129 FPS Avg
RTX 3090 -> 121 FPS Min & 138 FPS Avg


Gears of War 5 (1440p)
RX 6800XT -> 88 FPS Min & 111 FPS Avg
RTX 3080 -> 90 FPS Min & 107 FPS Avg
RTX 3090 -> 92 FPS Min & 109 FPS Avg

Gears of War 5 (4K)
RX 6800XT -> 56 FPS Min & 66 FPS Avg
RTX 3080 -> 62 FPS Min & 74 FPS Avg
RTX 3090 -> 63 FPS Min & 78 FPS Avg


Horizon Zero Dawn (1440p)
RX 6800XT -> 88 FPS Min & 117 FPS Avg
RTX 3080 -> 83 FPS Min & 118 FPS Avg
RTX 3090 -> 85 FPS Min & 126 FPS Avg

Horizon Zero Dawn (4K)
RX 6800XT -> 57 FPS Min & 66 FPS Avg
RTX 3080 -> 50 FPS Min & 73 FPS Avg
RTX 3090 -> 59 FPS Min & 79 FPS Avg


Assassin's Creed Odyssey (1440p)
RX 6800XT -> 52 FPS Min & 80 FPS Avg
RTX 3080 -> 54 FPS Min & 84 FPS Avg
RTX 3090 -> 56 FPS Min & 86 FPS Avg

Assassin's Creed Odyssey (4K)
RX 6800XT -> 45 FPS Min & 65 FPS Avg
RTX 3080 -> 43 FPS Min & 62 FPS Avg
RTX 3090 -> 45 FPS Min & 66 FPS Avg


World War Z (1440p)
RX 6800XT -> 160 FPS Min & 207 FPS Avg
RTX 3080 -> 162 FPS Min & 190 FPS Avg
RTX 3090 -> 168 FPS Min & 221 FPS Avg

World War Z (4K)
RX 6800XT -> 122 FPS Min & 134 FPS Avg
RTX 3080 -> 98 FPS Min & 118 FPS Avg
RTX 3090 -> 118 FPS Min & 139 FPS Avg


Metro Exodus (1440p)
RX 6800XT -> 114 FPS Min & 157 FPS Avg
RTX 3080 -> 113 FPS Min & 153 FPS Avg
RTX 3090 -> 119 FPS Min & 155 FPS Avg

Metro Exodus (4K)
RX 6800XT -> 90 FPS Min & 105 FPS Avg
RTX 3080 -> 98 FPS Min & 124 FPS Avg
RTX 3090 -> 108 FPS Min & 132 FPS Avg


Resident Evil 3 (1440p)
RX 6800XT -> 153 FPS Min & 190 FPS Avg
RTX 3080 -> 161 FPS Min & 196 FPS Avg
RTX 3090 -> 178 FPS Min & 216 FPS Avg

Resident Evil 3 (4K)
RX 6800XT -> 82 FPS Min & 98 FPS Avg
RTX 3080 -> 88 FPS Min & 104 FPS Avg
RTX 3090 -> 100 FPS Min & 119 FPS Avg


Doom Eternal (1440p)
RX 6800XT -> 234 FPS Min & 308 FPS Avg
RTX 3080 -> 237 FPS Min & 312 FPS Avg
RTX 3090 -> 255 FPS Min & 346 FPS Avg


Doom Eternal (4K)
RX 6800XT -> 147 FPS Min & 175 FPS Avg
RTX 3080 -> 154 FPS Min & 189 FPS Avg
RTX 3090 -> 170 FPS Min & 209 FPS Avg


Wolfenstein: Youngblood (1440p)
RX 6800XT -> 163 FPS Min & 241 FPS Avg
RTX 3080 -> 176 FPS Min & 252 FPS Avg
RTX 3090 -> 185 FPS Min & 271 FPS Avg

Wolfenstein: Youngblood (4K)
RX 6800XT -> 122 FPS Min & 137 FPS Avg
RTX 3080 -> 128 FPS Min & 151 FPS Avg
RTX 3090 -> 134 FPS Min & 170 FPS Avg


Hitman 2 (1440p)
RX 6800XT -> 105 FPS Min & 126 FPS Avg
RTX 3080 -> 103 FPS Min & 123 FPS Avg
RTX 3090 -> 105 FPS Min & 126 FPS Avg

Hitman 2 (4K)
RX 6800XT -> 70 FPS Min & 81 FPS Avg
RTX 3080 -> 72 FPS Min & 88 FPS Avg
RTX 3090 -> 80 FPS Min & 94 FPS Avg



Interesting results they got there. Very dependent on the suite of titles they use.

Sweclockers has the 3080 coming out on top in 9 games out of 12 at 1440p
TechPowerUp has the 3080 4% faster at 1440p across 23 games
Guru3D has the 3080 winning 7 games out of 14, with 1 tie, at 1440p
PC Gamer has 4 games for the 3080 and 4 games for the 6800 XT at 1440p
KitGuru has the 3080 winning at 1440p across 14 games
Igor's Lab has the 3080 slightly on top at 1440p, but the 6800 XT slightly on top with SAM
Golem.de has the 3080 taking 9 games out of 10 at 1440p
ComputerBase.de has the 3080 6% faster at 1440p across 17 games
Hardwareluxx.de has the 3080 taking 6 games out of 10 at 1440p
PC Games Hardware has the 3080 winning in 17 games out of 20 at 1440p


It seems one person is saying he has issues with DX9 games that date back to the 5700 launch.

 

Rikkori

Member
RE DX9 issues (and older undemanding games in general).

Make sure to increase the minimum frequency from 500 to >1500 MHz; the frametime graph used to look like a seismograph in some games because the clock kept dropping very low. Once I did that, the issue was resolved.
 

Rikkori

Member
Also, speaking of problems that never get fixed:

https://www.purepc.pl/test-ryzen-7-5800x-i-radeon-rx-6800-xt-w-miejscach-procesorowych

AMD still suffers in DX11 in all CPU-bottlenecked spots in games.
I would 100% discard those tests. Not because AMD isn't behind in DX11 (they are, that's a fact, and it will never be fixed), but because the results are heavily skewed by bad settings you'd be stupid to use anyway. For example, KCD's shader setting destroys performance and further gimps the CPU limit while showing no visual difference, and WD2 (this applies to Legion too) has issues where shadows in particular at the highest settings obliterate the CPU. That is further compounded by Zen 3 still being behind Intel in such games due to various legacy issues. It's a variant of the Hairworks problem, where some settings hit even Nvidia hard, but they're happy to leave them on because then AMD does even worse unoptimised, instead of using optimised tessellation levels and giving a good experience for all setups. It's just a very poor methodology.

I can tell you for a fact that I have way better performance than that on my i7 6800K + RX 6800 (heh, unintentional match). And there is a lot more to it as there are a lot of non-obvious issues which even reviewers are generally ignorant about, but I don't feel like writing another essay atm.

Don't get me wrong though, there's a lot of value in Nvidia + Intel in such cases; hell, I will build a second PC like that for myself in a year or two, just for the sake of using it in older DX11 titles, particularly KCD (but also Cities: Skylines and others that are very CPU-bound and DX11-bound). Just waiting for DDR5, because memory is crucially overlooked for these situations. Of course, the first PC will be AMD + AMD for modern titles though. ;)
 

rnlval

Member
Also, speaking of problems that never get fixed:

https://www.purepc.pl/test-ryzen-7-5800x-i-radeon-rx-6800-xt-w-miejscach-procesorowych

AMD still suffers in DX11 in all CPU-bottlenecked spots in games.
RTG hasn't implemented DX11 multi-threaded command list submission in its driver.

[Image: DirectX multithreaded rendering performance comparison, from "Understanding DirectX Multithreaded Rendering Performance by Experiments"]
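
For anyone wondering what that capability actually looks like in code: D3D11 lets an application query whether the driver natively supports building command lists on worker threads via CheckFeatureSupport. When DriverCommandLists comes back FALSE, the runtime emulates command lists and replays them through the immediate context, which is where the single-threaded driver overhead people complain about in DX11 comes from. A rough sketch of the query plus a deferred-context round trip (assumes you already have an ID3D11Device, here called device):

#include <windows.h>
#include <d3d11.h>
#include <cstdio>

// Report whether the driver natively supports DX11 multithreaded
// command-list building, or whether the runtime has to emulate it.
void ReportThreadingSupport(ID3D11Device* device)
{
    D3D11_FEATURE_DATA_THREADING threading = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D11_FEATURE_THREADING, &threading, sizeof(threading)))) {
        std::printf("Driver command lists:         %s\n",
                    threading.DriverCommandLists ? "yes" : "no (runtime emulation)");
        std::printf("Concurrent resource creation: %s\n",
                    threading.DriverConcurrentCreates ? "yes" : "no");
    }

    // Worker threads record into deferred contexts either way; only the
    // driver-side cost of building and replaying the command list differs.
    ID3D11DeviceContext* deferred = nullptr;
    if (SUCCEEDED(device->CreateDeferredContext(0, &deferred))) {
        // ... record draw calls here on a worker thread ...
        ID3D11CommandList* cmdList = nullptr;
        if (SUCCEEDED(deferred->FinishCommandList(FALSE, &cmdList))) {
            // The render thread would then call:
            //   immediateContext->ExecuteCommandList(cmdList, FALSE);
            cmdList->Release();
        }
        deferred->Release();
    }
}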
 

Antitype

Member
Also, speaking of problems that never get fixed:

https://www.purepc.pl/test-ryzen-7-5800x-i-radeon-rx-6800-xt-w-miejscach-procesorowych

AMD still suffers in DX11 in all CPU-bottlenecked spots in games.

That one speaks volumes about the state of their drivers:

Even trivial things like this have gone unfixed for over a year now.

With how difficult it was to get an RTX 3080, I was considering going AMD this time around if they had more stock, so I went and looked around the AMD community forums as well as their subreddit, and man, their awful reputation is definitely deserved. The number of bugs these drivers have is staggering.

There are games that are essentially broken, like FFXIII, and it's been known for a very long time, but nobody cares to fix it (I know the FFXIII PC port is trash, but hey, it works on Nvidia and used to work on AMD).

If they can't be bothered to fix previous generations of DirectX and OpenGL, then at least they need to implement some kind of wrapper like DXVK that will finally sort out their performance and compatibility issues once and for all. The kind of performance they have on older APIs is simply unacceptable; it's not like they sell their GPUs at a discount.
 
RDNA 2 white paper when?

Yeah, I'd love to find out when this is due. It should be fascinating for those inclined to dive deep into the architecture details. These white papers normally appear around a month or so after launch, maybe two months max, so I would expect to see it appear by then.
 

Kenpachii

Member
The problem I have with overclocking tests is that these guys always get the golden samples, so the results could be way off from what users will actually get.

Finally a review that's not praising it just for the sake of it. These are good cards, but come on: 6800 XT vs 3080, no DLSS, worse ray tracing performance, 16 GB that isn't useful now while DLSS very much is.
I think these should be cheaper. Saving $50 is not worth losing those features and RT performance in an increasingly RT-focused world.

Yeah, the major complaint I have with the card is the price. I honestly think the 6800 XT should have launched at $499 and the 6800 at $399, but then prices are fucked anyway at this point, so yeah, they can do whatever they want as of now.
 

JRW

Member
Finally a review that's not praising it just for the sake of it. These are good cards, but come on: 6800 XT vs 3080, no DLSS, worse ray tracing performance, 16 GB that isn't useful now while DLSS very much is.
I think these should be cheaper. Saving $50 is not worth losing those features and RT performance in an increasingly RT-focused world.

Yeah, it's been kind of frustrating seeing how many folks seem to brush off the ray tracing performance of AMD's cards, but maybe it's different for those of us who've already owned RTX 20 series cards for a while. There's no way I'm significantly downgrading ray tracing performance / losing DLSS for a few more frames in non-ray-traced games.
 
The problem I have with overclocking tests is that these guys always get the golden samples, so the results could be way off from what users will actually get.

Yeah, this tends to also be true of the general performance of review samples sent to tech tubers/reviewers. Having said that, the AIB models, at least for Radeon right now, are stratified by tier, in that the best-binned chips an AIB partner receives go to their premium top-end variants while the lower-binned chips go to their entry-level or mid-range offerings.

We won't know exactly how it all pans out for a few weeks/months until more people have the cards in their hands to test and the lower tier AIB cards launch.

A general rule of thumb: if you get the highest-tier product from an AIB partner, you will likely be getting one of the best-binned chips that particular AIB received. I think with the likes of Sapphire and PowerColor you will see manual OCs in the 2600-2700 MHz range for most people.

The lower-tier cards will likely not clock/boost as high, but there may not end up being a huge variance in the end.
 

Ascend

Member
Yeah, it's been kind of frustrating seeing how many folks seem to brush off the ray tracing performance of AMD's cards, but maybe it's different for those of us who've already owned RTX 20 series cards for a while. There's no way I'm significantly downgrading ray tracing performance / losing DLSS for a few more frames in non-ray-traced games.
It's understandable that if you already have RT and you want an upgrade for it, AMD is not really an option right now. It would be a similar situation to having a 1080 Ti, looking at the Turing cards, and finding out you have to pay the same price for pretty much the same amount of rasterization performance. In the latter case, you'd get RT as a bonus if you went Turing, while with the AMD cards you'd get a significant rasterization boost as a bonus while remaining relatively stagnant at RT.

For someone just trying to get into RT, or simply not caring about it at all, things are different, because the perspective is completely different. I don't want to revive certain discussions, but I have to say it.
People have been complaining about my potential interest in a 6800 series card while having a 2560x1080 monitor. Before I go on, I'll state that I do not care about RT enough at this point to base my graphics card choice on it. And multiple reviewers can agree on that point.
But... at the moment I showed interest in the 6800 cards with my 21:9 1080p monitor, it was completely forgotten that these cards are perfectly viable for RT at that resolution. In that sense, the card is still a perfectly good choice, especially if you can find one cheaper than the alternatives (which is currently not the case). No one had a problem brushing RT off when it was good enough for 1080p on these cards... Suddenly the whole discussion became about native 4K being so much better than a lower resolution. Apparently, such a card is only "allowed" to be bought for gaming at 4K, basically its weakest point compared to the competition. Otherwise you're made out to be stupid.

This shows a huge disconnect with reality. The variables are altered each time to one's own bias. That is understandable, because the cards are more than the sum of their parts. There are many variables at play, and not everyone is going to use these cards in the same way. You can throw every number or every review out, but a certain group of people are going to value things differently than another group of people, simply because their needs are different. Which is exactly why it is completely absurd to trash not only the cards, but the people interested in these cards.

You call it brushing off, but really, RT is not mandatory at all. Just like with a 3070 you might have to lower settings to stay below 8 GB, you can also lower RT settings on a 6800 card. Priorities are simply different.
At one point, people were trying to shove their own view down others' throats here, going as far as trying to ridicule and borderline bully others. Obviously, things get heated and the pushback becomes harder.

If you do care about RT, that is fine, and it is also fine to state how that means you won't get one of these cards. That being said (this is not directed at you personally), hanging around for pages nitpicking every little detail just to portray these cards in the worst light possible is not fine.

I ultimately have to ask, though... Yes, this is directed at you personally... Why is it frustrating for you that others are "brushing off" AMD's RT performance? You're free to choose your own product based on your own priorities.
 

Ascend

Member
Interesting video regarding the hardware review scene as a whole.
It is timestamped at a specific section, but the whole thing is worth watching.

 
It was pretty obvious how this would pan out.
It's a great reset for AMD, but they still lag behind Nvidia, and it's in the two big next-gen features: RT and DLSS.
AMD is a couple of generations behind Nvidia and is going to take some time to close the gap, if they can.
Interesting that AMD's supersampling won't be ML-related at all.
 
Interesting that AMD's supersampling won't be ML-related at all.

I've also heard rumblings that their FidelityFX Super Resolution may not be ML-based either, or at the very least, if it does use ML, it won't do it in the same way DLSS does. If it is algorithmic in some way, or combines an algorithmic approach with a generic ML approach, then it could potentially work on a very wide selection of games.

Possibly even all titles, which would certainly be a game changer if it doesn't require any work by developers to implement. But right now it is all just speculation; we don't really know exactly how it will work or when it will be available, so take anything we hear before launch with a grain of salt.

Speaking of launch, I have heard some hints from RedGamingTech, who seems to be very good for AMD-related info, that it might launch early next year (so Q1 2021), but he hasn't directly "leaked" it as a confirmed launch date or anything, so it could just be his speculation.

Either way we can assume AMD are hard at work on it so hopefully they release it sooner rather than later.
 

Rentahamster

Rodent Whores
Interesting video regarding the hardware review scene as a whole.
It is timestamped at a specific section, but the whole thing is worth watching.


He's right to be concerned about the shady influencer marketing, but the example he gives is dumb. Not enough data about raytracing performance? They did the benchmark and reported their results. Anything about future potential performance improvements is just speculation at that point.
 

Ascend

Member
He's right to be concerned about the shady influencer marketing, but the example he gives is dumb. Not enough data about raytracing performance? They did the benchmark and reported their results. Anything about future potential performance improvements is just speculation at that point.
Well, after The Riftbreaker benchmarks, an AMD-sponsored title in which they still fall behind in RT, it is highly likely that RDNA2's RT is simply slower than Ampere's. He's likely not aware of that one.
 

Ascend

Member
Driver update today:

Fixed Issues
Lower than expected performance may be experienced on Radeon™ RX 6000 series graphics products in Watchdogs:® Legion and Dirt™ 5.
Lower than expected performance may be experienced on Radeon™ RX 5000/500/400 series graphics products in Godfall™.
Godfall™ is not detected or listed in Radeon Software gaming tab.
Crysis™ Remastered may experience corruption on character models on Radeon™ RX 6800 Series graphics products.
Fixed some intermittent crashes found in Total War™ Saga: Troy and World of Warcraft®: Shadowlands.
World of Warcraft®: Shadowlands may fail to launch when DirectX®12 API is selected on Windows®7 system configurations.
Fixed some intermittent crashes found in Call of Duty®: Black Ops Cold War with DirectX® Raytracing enabled.
HDR on supported Windows 10 desktops might get disabled when DOOM® Eternal™ starts rendering in HDR mode.
Fixed issues found on Adobe™ Illustrator, Adobe™ Premier and FinalWire AIDA64.
Fixed corruption issues in Red Dead Redemption 2 in 1080p resolution on Radeon™ RX 6800 Series graphics products.

Known Issues
Brightness flickering may intermittently occur in some games when Radeon™ FreeSync is enabled, and the game is set to use borderless fullscreen.
Metro Exodus™, Shadow of the Tomb Raider™, Battlefield™ V and Call of Duty®: Modern Warfare may experience intermittent application crashes with DirectX® Raytracing enabled.
Anisotropic Filtering in Radeon™ Software graphics settings is not taking effect in DirectX®9 applications on RDNA graphics products.

Some games may experience stuttering when set to borderless fullscreen and an extended display is connected running the Netflix™ windows store application on RDNA graphics products.
Radeon™ recording and streaming features may fail to enable on AMD Radeon™ HD 7800 series graphics products.
Modifying the HDMI Scaling slider may cause FPS to become locked to 30.
Performance Metrics Overlay and the Performance Tuning tab incorrectly report higher than expected idle clock speeds on Radeon™ RX 5700 series graphics products. Performance and power consumption are not impacted by this incorrect reporting.
Enhanced Sync may cause a black screen to occur when enabled on some games and system configurations. Any users who may be experiencing issues with Enhanced Sync enabled should disable it as a temporary workaround.
Oculus Link users might experience crashes on Polaris and Vega series graphics products.
Flickering might be observed if the Radeon Software Overlay is invoked while Immortals: Fenyx Rising™ is running on an extended display.
Tom Clancy’s Rainbow Six® Siege might experience corruption in Hybrid Graphics scenarios when using the Vulkan API on an extended display.
Screen flickering might be observed when using MSI Afterburner.


I wonder how Watch Dogs: Legion performs now... But... Dirt 5 performance was lower than expected...?
Looking at the list of crashing RT games, there definitely needs to be some optimization there.
The anisotropic filtering issue mentioned earlier in this thread is on the list of known issues. At least it's getting attention, or will get it somewhere down the line.
It's amazing that they are still supporting HD 7000 series cards. Those cards are eight years old.
 

Rentahamster

Rodent Whores
Well, after The Riftbreaker benchmarks, an AMD-sponsored title in which they still fall behind in RT, it is highly likely that RDNA2's RT is simply slower than Ampere's. He's likely not aware of that one.
With or without it, it doesn't matter. His example is still dumb. He's treating speculation as objectivity and the reporting of facts as not being objective. He doesn't have a sound basis for his argument.
 

Ascend

Member
With or without it, it doesn't matter. His example is still dumb. He's treating speculation as objectivity and the reporting of facts as not being objective. He doesn't have a sound basis for his argument.
I disagree. Numbers aren't always the whole story.
It's not speculation to say that the majority of RT titles have been programmed and optimized to work with nVidia RTX cards.
It's not speculation to say that historically, whatever has been optimized for nVidia tends to work abysmally on AMD cards.
It's not speculation to say that AMD's cards perform relatively well in RT games that were optimized for it.
It's not speculation to say based on the information above that it would be too early to draw definitive conclusions about AMD's RT performance.

So even though it is a fact that AMD performs worse in games like Control and Watch Dogs: Legion, that alone is not enough to immediately say that nVidia's RT is better. His stance is one of reserving judgment until more data is available, not speculation.
The only thing that throws all of that out the window is The Riftbreaker, because it is AMD-sponsored and yet nVidia still performs better in it. THAT is the only objective truth.

But people love to jump to conclusions when it suits their own bias.
 


This is the roundup summary of the previous 4 hours' worth of OC tests with the Port Royal benchmark.

So it looks like the cards get just under a 10% performance uplift here from a solid manual OC versus the stock performance of each of these AIB models.

It should be noted that the AIB models already ship with a factory OC, which puts them 3-5% ahead of the reference model.

So in total, if you buy one of these AIB models and apply a good manual OC, you can potentially gain a 13-15% performance uplift over the stock reference model.
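
For anyone checking the arithmetic, the factory and manual OC gains compound rather than add, which is roughly where the 13-15% figure comes from (taking "just under 10%" as 9.5% purely for illustration):

$$(1 + 0.03)\times(1 + 0.095) \approx 1.13, \qquad (1 + 0.05)\times(1 + 0.095) \approx 1.15$$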

Of course, this is just a synthetic benchmark, so any uplifts will vary game by game in the real world. Still impressive results overall.

Now if only the AIB model pricing wasn't so outrageous. Hopefully pricing becomes more sane as stock levels stabilize.
 


I came across this video which I thought was interesting enough to share.

No idea how reliable this guy is, but if true, it could mean big things for supply going forward.
 

MadYarpen

Member
I think I just managed to grab an RX 6800 TUF Gaming.

Overpriced, but I was annoyed by all this waiting and uncertainty.


Tomorrow or on Monday I will find out whether they really had the cards (not in stock any more, obviously) or it was just some error.


Edit - if anyone in Poland is trying to buy one, go to x-kom; it seems to still be available for order. And all the other models can't be added to the cart, so this seems to be deliberate.
 

Armorian

Banned
At 1440p, Ryzen 9 3950X can bottleneck RX 6800 XT.

This was probably posted at some point, but AMD is still miles behind in some DX11 CPU-heavy games. This was the reason I switched from a 290 to a 970; GPU performance was similar, but in some games the AMD card was just killing my 2600K:


[benchmark screenshots]


This is not relevant for new games, but there are probably more titles like this (I wonder how FC4 runs in the first village, a massive CPU hog...)
 

Irobot82

Member
This was probably posted at some point, but AMD is still miles behind in some DX11 CPU-heavy games. This was the reason I switched from a 290 to a 970; GPU performance was similar, but in some games the AMD card was just killing my 2600K:


[benchmark screenshots]


This is not relevant for new games, but there are probably more titles like this (I wonder how FC4 runs in the first village, a massive CPU hog...)
Probably just needs a driver update
 

Armorian

Banned
Probably just needs a driver update

These games have been running like that for years; the problem is just more visible now that GPUs are faster. AMD's drivers simply have much higher overhead in DX11 (at least in some games), and I don't think this will ever be fixed.
 

regawdless

Banned
I disagree. Numbers aren't always the whole story.
It's not speculation to say that the majority of RT titles have been programmed and optimized to work with nVidia RTX cards.
It's not speculation to say that historically, whatever has been optimized for nVidia tends to work abysmally on AMD cards.
It's not speculation to say that AMD's cards perform relatively well in RT games that were optimized for it.
It's not speculation to say based on the information above that it would be too early to draw definitive conclusions about AMD's RT performance.

So even though it is a fact that AMD performs worse in games like Control and Watch Dogs: Legion, that alone is not enough to immediately say that nVidia's RT is better. His stance is one of reserving judgment until more data is available, not speculation.
The only thing that throws all of that out the window is The Riftbreaker, because it is AMD-sponsored and yet nVidia still performs better in it. THAT is the only objective truth.

But people love to jump to conclusions when it suits their own bias.

And what about the 3DMark ray tracing benchmark? I know it doesn't translate 1:1 into gaming performance, but it is a good indicator.
 
Hardware Unboxed has a review of the Asus 6800 XT ROG Strix OC Liquid Cooled:



It looks mighty impressive, reaching 2750 MHz with a manual OC. Obviously this is Asus' premium model, so expect the best-binned chips they could get from AMD, and given it is liquid cooled, this will probably give the best OC/thermal performance of any of the 6800 XT cards.

They only really show Tomb Raider as a barometer of gaming performance, but they do that with all of their AIB reviews, so at least there is a like-for-like comparison for relative performance. Real-world performance will obviously vary from game to game.

In Tomb Raider the ROG Strix had a 4% performance increase out of the box vs the reference 6800 XT at 1440p. With a manual OC it gained an additional 8% over this card's own stock level, which puts it at roughly a 12% performance uplift vs the reference model at stock clocks.

At 4K this card had a 6% performance uplift out of the box vs the reference model at stock clocks. With the manual OC it showed an amazing 19% performance uplift vs the reference 6800 XT at stock clocks.
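
As a sanity check on those numbers, the gains compound rather than add, so the 1440p figure works out as stated, and the 4K result implies the manual OC alone is worth about 12% over the card's own stock clocks:

$$1.04 \times 1.08 \approx 1.12 \;\text{(1440p)}, \qquad 1.19 / 1.06 \approx 1.12 \;\text{(implied manual OC gain at 4K)}$$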

Again, this is only the performance in a single game, but it definitely indicates that these cards see great real-world performance uplifts from overclocking and are genuine overclocking monsters, despite desperate claims to the contrary from some.

And to think that currently all of the 6800 XT cards are artificially limited by AMD in the BIOS in their max core clock, max memory clock and max power limit. Once someone either flashes the 6900 XT BIOS or otherwise tinkers with the BIOS to remove the limits, these cards will likely be able to clock even higher. Simply amazing.

Pity about the artificial limits from AMD though.
 

Armorian

Banned
Hardware Unboxed has a review of the Asus 6800 XT ROG Strix OC Liquid Cooled:



It looks mighty impressive, reaching 2750 MHz with a manual OC. Obviously this is Asus' premium model, so expect the best-binned chips they could get from AMD, and given it is liquid cooled, this will probably give the best OC/thermal performance of any of the 6800 XT cards.

They only really show Tomb Raider as a barometer of gaming performance, but they do that with all of their AIB reviews, so at least there is a like-for-like comparison for relative performance. Real-world performance will obviously vary from game to game.

In Tomb Raider the ROG Strix had a 4% performance increase out of the box vs the reference 6800 XT at 1440p. With a manual OC it gained an additional 8% over this card's own stock level, which puts it at roughly a 12% performance uplift vs the reference model at stock clocks.

At 4K this card had a 6% performance uplift out of the box vs the reference model at stock clocks. With the manual OC it showed an amazing 19% performance uplift vs the reference 6800 XT at stock clocks.

Again, this is only the performance in a single game, but it definitely indicates that these cards see great real-world performance uplifts from overclocking and are genuine overclocking monsters, despite desperate claims to the contrary from some.

And to think that currently all of the 6800 XT cards are artificially limited by AMD in the BIOS in their max core clock, max memory clock and max power limit. Once someone either flashes the 6900 XT BIOS or otherwise tinkers with the BIOS to remove the limits, these cards will likely be able to clock even higher. Simply amazing.

Pity about the artificial limits from AMD though.

We haven't had GPUs that really benefit from an OC in a long time; impressive. We're reaching CPU clocks here, LOL.
 