
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

llien

Member
I kind of expected the 6800XT to come in at or a little above the 3080 in general performance, but it was close.

AMD is ahead in the newest games (AC:V for instance) even without factoring in SAM, which boosts it by another ~5% in many games.

RT perf needs more analysis:

[RT benchmark chart]
 
Last edited:

CrustyBritches

Gold Member
AMD is ahead in the newest games (AC:V for instance) even without factoring in SAM, which boosts it by another ~5% in many games.

RT perf needs more analysis.
It's a strong showing from AMD. Those OC average clocks are ludicrous. Looking forward to 6900 XT reviews.

P.S. - I don't know how I could go without mentioning that AIBs like Sapphire and PowerColor will almost certainly have the best Big Navis with their Nitro+ and Red Devil cards. Can't wait to see those!
[Image: PowerColor Radeon RX 6800 Red Devil]


Looks like something out of DOOM Eternal.🔥
 
Last edited:

llien

Member
So, on the RT front: Watch Dogs: Legion => 6800XT on par with 3080; Dirt 5 => 6800XT ahead of 3080.

In most other games, the 6800XT is quite a bit behind the 3080, but around the 3070, with exceptions like Minecraft, where it performs particularly badly.
 

regawdless

Banned
So, on the RT front: Watch Dogs: Legion => 6800XT on par with 3080; Dirt 5 => 6800XT ahead of 3080.

In most other games, the 6800XT is quite a bit behind the 3080, but around the 3070, with exceptions like Minecraft, where it performs particularly badly.

RT in WD Legion is bugged on the 6800XT and is way lower quality than on a 3080. They are not on par. AMD is looking into it.
 

Ascend

Member
It's not surprising that, despite both now having RT, existing games have been focused on nVidia's hardware, and thus AMD performs worse in them. We will see how things develop down the line. The fact that there is even one example of RT performance scaling differently means that there is more going on and that judgment should be reserved for now. nVidia has the advantage for now, and I wouldn't be surprised if the difference narrows (or even vanishes) over time.
 
So, on the RT front: Watch Dogs: Legion => 6800XT on par with 3080; Dirt 5 => 6800XT ahead of 3080.

In most other games, the 6800XT is quite a bit behind the 3080, but around the 3070, with exceptions like Minecraft, where it performs particularly badly.

Yeah I've been looking a bit more into the RT performance and it doesn't seem quite as bad as I initially thought.

Regarding the outlier performance in Dirt 5, this seems pertinent:



Looks like Dirt 5 has an RT beta branch that people can use. Dirt 5 uses DXR 1.1, whereas all the other RT games that I know of use DXR 1.0.

This in and of itself isn't that odd, but what is strange is how much better the 6800XT performs in RT compared to the 3080 in Dirt 5, when the opposite is true in most other titles. It could mean that games built around DXR 1.1 optimizations end up running better on AMD and worse on Nvidia, unless the developer maintains two code paths to optimize for both. Of course, it could also just be that the game makes light use of RT effects and is heavily optimized for AMD anyway, so we will have to see more examples to be sure.

Interestingly, I don't think Watch Dogs: Legion uses DXR 1.1 (I'm not 100% sure, but I don't think it does), yet because it is optimized for AMD's hardware/RT solution due to the consoles, the 6800XT seems to perform very close to the 3080 in it.

As with anything, we will have to see how the RT landscape develops as drivers mature and more DXR 1.1 / AMD-optimized titles become available.
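For what it's worth, an engine can detect at runtime which DXR tier the driver exposes and pick a code path accordingly. Here's a minimal sketch using the standard D3D12 feature query; the helper name is my own and nothing here is taken from Dirt 5 or Codemasters' engine:

```cpp
#include <d3d12.h>

// Minimal sketch: ask the device for the highest raytracing tier it supports,
// so the renderer can use a DXR 1.1 path (inline RayQuery etc.) when available
// and fall back to the classic DXR 1.0 TraceRay/shader-table path otherwise.
D3D12_RAYTRACING_TIER QueryRaytracingTier(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return D3D12_RAYTRACING_TIER_NOT_SUPPORTED;
    return options5.RaytracingTier;
}

// Usage sketch:
//   if (QueryRaytracingTier(device) >= D3D12_RAYTRACING_TIER_1_1) -> DXR 1.1 path
//   else if (tier >= D3D12_RAYTRACING_TIER_1_0)                   -> DXR 1.0 path
//   else                                                          -> no RT effects
```

Both RDNA2 and Ampere report Tier 1.1, so the difference really comes down to which path a game ships and optimizes for.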
 

Dampf

Member
Yeah I've been looking a bit more into the RT performance and it doesn't seem quite as bad as I initially thought.

Regarding the outlier performance in Dirt 5, this seems pertinent:



Looks like Dirt 5 has an RT beta branch that people can use. Dirt 5 uses DXR 1.1, whereas all the other RT games that I know of use DXR 1.0.

This in and of itself isn't that odd, but what is strange is how much better the 6800XT performs in RT compared to the 3080 in Dirt 5, when the opposite is true in most other titles. It could mean that games built around DXR 1.1 optimizations end up running better on AMD and worse on Nvidia, unless the developer maintains two code paths to optimize for both. Of course, it could also just be that the game makes light use of RT effects and is heavily optimized for AMD anyway, so we will have to see more examples to be sure.

Interestingly, I don't think Watch Dogs: Legion uses DXR 1.1 (I'm not 100% sure, but I don't think it does), yet because it is optimized for AMD's hardware/RT solution due to the consoles, the 6800XT seems to perform very close to the 3080 in it.

As with anything, we will have to see how the RT landscape develops as drivers mature and more DXR 1.1 / AMD-optimized titles become available.


Minecraft RTX received a DXR 1.1 update and that resulted in a huge performance boost across all RTX GPUs. So it's not like DXR 1.1 works better on AMD than on Nvidia.

It is more likely that the AMD-sponsored RT games do not take advantage of BVH traversal in hardware on RTX cards yet, as they were optimized for BVH traversal running on compute shaders. These optimizations could be the reason for the delayed Godfall patch on Nvidia hardware.
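To make the distinction a bit more concrete, below is roughly what "BVH traversal running on compute shaders" means, as a CPU-side toy in C++ with a hard-coded two-leaf BVH (my own illustration, not code from any game or driver). On RDNA2 a loop like this runs in shader code with only the box/triangle intersection tests hardware-accelerated, whereas Turing/Ampere RT cores handle the whole traversal loop in fixed-function hardware:

```cpp
#include <algorithm>
#include <cstdio>
#include <utility>

struct AABB { float lo[3], hi[3]; };
struct Node { AABB box; int left, right; bool leaf; };  // toy node: leaves just report a hit
struct Ray  { float org[3], dir[3]; };

// Standard slab test: does the ray hit the box within [0, tMax]?
// This per-box test is the part RDNA2's Ray Accelerators handle in hardware.
bool HitAABB(const Ray& r, const AABB& b, float tMax)
{
    float t0 = 0.0f, t1 = tMax;
    for (int i = 0; i < 3; ++i) {
        float inv   = 1.0f / r.dir[i];
        float tNear = (b.lo[i] - r.org[i]) * inv;
        float tFar  = (b.hi[i] - r.org[i]) * inv;
        if (inv < 0.0f) std::swap(tNear, tFar);
        t0 = std::max(t0, tNear);
        t1 = std::min(t1, tFar);
        if (t0 > t1) return false;
    }
    return true;
}

int main()
{
    // Tiny hard-coded BVH: a root box split into two leaf boxes.
    Node nodes[3] = {
        { {{-2,-2,-2},{ 2, 2, 2}},  1,  2, false },  // root
        { {{-2,-2,-2},{ 0, 2, 2}}, -1, -1, true  },  // left leaf
        { {{ 0,-2,-2},{ 2, 2, 2}}, -1, -1, true  },  // right leaf
    };
    Ray r = { {-5.0f, 0.5f, 0.5f}, {1.0f, 0.0f, 0.0f} };  // ray marching along +X

    // Iterative, stack-based traversal: this outer loop is what runs in shader
    // code on RDNA2, and what the RT cores on Turing/Ampere replace entirely.
    int stack[32];
    int sp = 0;
    stack[sp++] = 0;                                 // push the root
    while (sp > 0) {
        const Node& n = nodes[stack[--sp]];
        if (!HitAABB(r, n.box, 1e30f)) continue;     // missed this subtree, skip it
        if (n.leaf) { std::printf("hit leaf\n"); continue; }
        stack[sp++] = n.left;                        // otherwise push both children
        stack[sp++] = n.right;
    }
    return 0;
}
```

Which is why an implementation tuned around shader-based traversal doesn't automatically get the most out of RTX hardware.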
 
Last edited:
Minecraft RTX received a DXR 1.1 update and that resulted in a huge performance boost across all RTX GPUs. So it's not like DXR 1.1 works better on AMD than on Nvidia.

It is more likely the AMD RT games do not take advantage of BVH traversal in hardware for RTX cards yet.

Very true, I didn't mean to make it sound like DXR 1.1 is an entirely new API that performs worse on Nvidia. It is obviously built on top of DXR 1.0 with additional features. These new features, which AMD worked closely with MS on, likely allow developers to optimize better for AMD's RT hardware/solution than a pure DXR 1.0 implementation might.

As for Minecraft, seeing as it uses full path tracing rather than hybrid rendering for its ray tracing, it was always going to perform much worse on AMD GPUs regardless of the API used, as Nvidia's RT cores are much more performant than AMD's Ray Accelerators at pure RT calculations once you strip away the additional overhead of normal rasterization.

The reason this approach is feasible in Minecraft and Quake is that they have very primitive graphics and simple geometry. That kind of RT is not really possible with modern graphics quality, which is why 99% of the RT-enabled games you will see will use hybrid rendering. This is also why Minecraft is a bit of an outlier when it comes to the performance delta between Nvidia and AMD with RT enabled, compared to most other games.
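Just to put rough numbers on why full path tracing is so much heavier than hybrid rendering, here's a back-of-the-envelope comparison; the resolution, sample, bounce, and per-effect ray counts are made-up assumptions for illustration, not measurements from Minecraft RTX or any other game:

```cpp
#include <cstdint>
#include <cstdio>

int main()
{
    const std::uint64_t width = 3840, height = 2160;       // 4K
    const std::uint64_t pixels = width * height;

    // Full path tracing (Minecraft RTX style): everything is traced.
    // Assumed: 2 samples per pixel, 4 ray segments (bounces) per sample.
    const std::uint64_t pathTracedRays = pixels * 2 * 4;

    // Hybrid rendering: primary visibility is rasterized; rays are only cast
    // for a couple of effects. Assumed: reflections at half resolution in each
    // dimension (a quarter of the pixels) plus one shadow ray per pixel.
    const std::uint64_t hybridRays = pixels / 4 + pixels;

    std::printf("path-traced rays per frame: %llu\n",
                (unsigned long long)pathTracedRays);           // ~66 million
    std::printf("hybrid rays per frame     : %llu\n",
                (unsigned long long)hybridRays);               // ~10 million
    std::printf("ratio                     : %.1fx\n",
                double(pathTracedRays) / double(hybridRays));  // ~6.4x
    return 0;
}
```

Even with these generous hybrid numbers, the path-traced frame needs several times more rays, and every one of them hits the RT hardware with no rasterization to hide behind.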

In addition to all of that, Minecraft RTX is an Nvidia-sponsored title, so it will be optimized for their RT hardware/GPUs, which gives their cards an even greater advantage than they would already have.
 
Last edited:

Ascend

Member
In addition to all of that, Minecraft RTX is an Nvidia-sponsored title, so it will be optimized for their RT hardware/GPUs, which gives their cards an even greater advantage than they would already have.
Although this is true, back in March (I believe), Digital Foundry released a video of the XSX running path-traced Minecraft. So technically, it should run respectably on the 6800 cards. But it doesn't look like this has been implemented for them (yet).
 

VFXVeteran

Banned
So, on the RT front: Watch Dogs: Legion => 6800XT on par with 3080; Dirt 5 => 6800XT ahead of 3080.

In most other games, the 6800XT is quite a bit behind the 3080, but around the 3070, with exceptions like Minecraft, where it performs particularly badly.

That indicates it's not a true RT card. Minecraft is basically showcasing full-on path tracing. Sorry, but without RT cores, AMD is dead in the water IMO. Wait for RDNA3 for them to correct their rushed product.
 
That feeling when your gut feeling was right all along. Good thing I'm sticking with the leather jacket man again this generation. Had a strong feeling the results would be just that. Hopefully RDNA3 shows vast improvement in the areas that matter, and they bring some competition to Nvidia.
 

waylo

Banned
I understand buying into hype, and I understand being disappointed at the inability to outright purchase a 3000 series GPU, but I just don't understand the market for the AMD cards now that performance is out there (particularly RT performance). If these were like $400, then fuck yeah. But as it stands, the XT is $50 less than a 3080 (and the other is MORE than a 3070). Who the fuck is spending $600+ on a GPU just so they can turn settings down or off? If I spend that, I want to know I can go in and crank my game and also get good performance. Not one or the other.
 
I understand buying into hype, and I understand being disappointed at the inability to outright purchase a 3000 series GPU, but I just don't understand the market for the AMD cards now that performance is out there (particularly RT performance). If these were like $400, then fuck yeah. But as it stands, the XT is $50 less than a 3080 (and the other is MORE than a 3070). Who the fuck is spending $600+ on a GPU just so they can turn settings down or off? If I spend that, I want to know I can go in and crank my game and also get good performance. Not one or the other.
I've been saying this all along, and these clowns laughed and pretended that all the AMD rumors were true... And look how that turned out. This is why I always wait for benchmarks from legitimate sources. Buying into the hype and rumors is the most idiotic thing to do, especially for those who have been burned generation after generation waiting for them to be competitive.
 

Ascend

Member
That feeling when your gut feeling was right all along. Good thing I'm sticking with the leather jacket man again this generation. Had a strong feeling the results would be just that. Hopefully RDNA3 shows vast improvement in the areas that matter, and they bring some competition to Nvidia.
If you honestly think they did not bring any competition to nVidia...

 
Last edited:

The Skull

Member
I've been saying this all along, and these clowns laughed and pretended that all the AMD rumors were true... And look how that turned out. This is why I always wait for benchmarks from legitimate sources. Buying into the hype and rumors is the most idiotic thing to do, especially for those who have been burned generation after generation waiting for them to be competitive.

What are you smoking? The "rumours" were that AMD barely had a 2080 Ti.
 
What are you smoking? The "rumours" were that AMD barely had a 2080 Ti.
If you honestly think they did not bring any competition to nVidia...

Unless either of you two were the ones spreading baseless rumors and talking down on Nvidia, there's nothing to worry about. We just can't pretend the only rumor was that it was on par with 2080 Ti...

Nvidia doesn't have much competition in raytracing or an alternative to DLSS at the moment, does it? If not, then why get triggered over my post?
 

Rickyiez

Member
What are you talking about? Hardware Unboxed was one of the few sites to withhold judgment, investigate more, and then state that the situation was overblown.

No. They deleted their initial video, in which they were so hell-bent on claiming that most RTX 3080s were faulty. And it wasn't them who actually uncovered the situation, it was GN Steve.

Look at this video: they were just reciting research from other sites. Also look at the clickbait thumbnail. Why couldn't they just write the actual reason there instead of trying to give the impression that the 3080 is still faulty?


Further evidence is their site TechSpot, which was spreading the false information: https://www.techspot.com/news/86900-capacitor-issues-causing-rtx-30803090-crashes.html
 
Last edited:

M1chl

Currently Gif and Meme Champion
Radeon is only relevant nowadays because of consoles and cheaper prices. Now, however? nVidia is light years ahead with all of their features and stuff like that. On top of that, something like Infinity Cache must be very expensive to incorporate into a chip; I would not be surprised if Ampere were cheaper to build.

They have rt processors built into every CU.
Exactly, and that way it eats into compute capability, unlike nVidia's solution, if my assumption is right...
 
Last edited:

Ascend

Member
Unless either of you two were the ones spreading baseless rumors and talking down on Nvidia, there's nothing to worry about. We just can't pretend the only rumor was that it was on par with 2080 Ti...

Nvidia doesn't have much competition in raytracing or an alternative to DLSS at the moment, does it? If not, then why get triggered over my post?


In Watch Dogs: Legion, RT doesn't work properly on Radeons and not all rays are visible.
Control is also glitching. I guess the conclusion for now is that RT benchmarks for the 6800 cards are not yet reliable.

But if you care enough about RT to base your graphics card purchase on that, nVidia is a safer deal at this point.
 

VFXVeteran

Banned
Don't bother. We've told him that a gazillion times. But he still touts the same BS constantly. He might still be on nVidia's payroll 🤷‍♂️

I don't believe it. All they've shown is a diagram. The TMU can also do RT? That's like a CPU also doing dedicated physics calculations. If it is in the hardware, it's a very weak design and the reason why their performance is so crappy.

I can bet you a dollar to a dime, RDNA3 will have dedicated RT cores.
 
Last edited:

M1chl

Currently Gif and Meme Champion
What does that mean? You think there are extensions that AMD doesn't support? If that were the case, the game wouldn't run.
I'm getting flashbacks to ancient history, when it was said that OpenCL would pick up and work better than CUDA. Maybe it does to a certain extent, as AMD mining software is generally faster than on a comparable nVidia GPU, but all the people maintaining the code say it's a huge mess and almost not worth it to tune and tinker with.

An nVidia GPU is not just raw power but also dev/feature support (outside of compute), which many people don't take into account.
 

psorcerer

Banned
What does that mean? You think there are extensions that AMD doesn't support? If that were the case, the game wouldn't run.

Nope. I think that RT performance depends on the implementation.
Specific hw acceleration features are a drop in the bucket.
It's similar to hw z-buffer perf: nobody compares hw z-buffer acceleration strategies.

I'm getting flashbacks to ancient history, when it was said that OpenCL would pick up and work better than CUDA. Maybe it does to a certain extent, as AMD mining software is generally faster than on a comparable nVidia GPU, but all the people maintaining the code say it's a huge mess and almost not worth it to tune and tinker with.

That's false. Nobody wrote any decent frameworks for OCL, that's it.
There's nothing messy about having or not having TensorFlow. It just exists and uses CUDA.
 
Last edited:

Ascend

Member
I don't believe it. All they've shown is a diagram. The TMU can also do RT? That's like a CPU also doing dedicated physics calculations. If it is in the hardware, it's a very weak design and the reason why their performance is so crappy.

I can bet you a dollar to a dime, RDNA3 will have dedicated RT cores.
I guess they just made the patent for fun 🤷‍♂️. Do you honestly believe the consoles do RT without any hardware BVH acceleration...? And are you honestly saying that AMD are blatantly lying about what their own hardware supports...?

I think your age is getting to your brain.
 
Last edited:
The majority of RT implementations we will see in the next 5 years will be written for console games.
So they are going to gimp PC versions from now on? Raytracing is better on PC, and devs will leverage that extra power, like they have been and will continue to do. There's a reason comparisons keep being made between PC and console in regards to raytracing: they differ so much. Even AMD GPUs will leave the consoles behind in raytracing now and going forward.
 

VFXVeteran

Banned
Nope. I think that RT performance depends on the implementation.
Specific hw acceleration features are a drop in the bucket.
It's similar to hw z-buffer perf: nobody compares hw z-buffer acceleration strategies.

It's not just implementation. And even if it were, how often have we seen AMD overtake Nvidia through driver support that's maximally efficient with EVERY game released? Their drivers are, or have been, shitty for several years, which is why we don't use their boards here. I'm not seeing any difference in Minecraft that's worthwhile compared to Nvidia boards.
 

longdi

Banned
Disappointment; it would be better at $100 less. Let's see how AMD's super-resolution ML upscaling will help.
Likewise, Nvidia should have more than 8/10GB of RAM.

Time to hunker down with the 1080 Ti still 🤷‍♀️
 
Last edited:

VFXVeteran

Banned
The majority of RT implementations we will see in the next 5 years will be written for console games.

That's not true at all.

Console games = exclusives.

Overall games = everybody else.

There are significantly MORE 3rd-party developers out there than 1st-party. You can't assume that games going forward that use RT would NOT have used it if the consoles didn't have RT support. This is one of those FUD statements that makes people think consoles drive tech, when that's completely false. Nvidia shot first on RT and everyone else is following. Period.
 

Brofist

Member
The majority of RT implementations we will see in the next 5 years will be written for console games.

What is a console game? Any game that comes out on consoles? A game that only comes out on consoles?

This is an outdated term now. Everything, aside from the platform-specific exclusives, comes out on PC. So basically a handful of Sony games. You are saying a handful of Sony games will drive the industry?
 
Last edited: