
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

Radeon Chill.
Radeon Anti-Lag.
Checkerboard rendering.
FidelityFX suite (cross plat).
SAM.
Unified shaders.

"no no, that doesn't count, only proprietary green crap does!".
No problem with that.
Let's be honest, Ray Tracing has been the biggest move in gaming IQ for the last couple of years, and DLSS is the future of moving gaming forward.

Talking about historic stuff going back even to ATI isn't what today is about.
And FidelityFX contains VRS, which came out back in 2018 on Nvidia cards, and Microsoft had a big hand in developing FFX with AMD.

People should really thank Nvidia for doing what they have over the last few years.
Sure, they were overpriced, but they didn't sit on their hands like Intel did when AMD didn't offer any competition. They pushed ahead and introduced tech that we all now take for granted.
 

llien

Member
Ray Tracing has been the biggest move in gaming IQ
Bullshit.
We had shadows, reflections and what not since ages ago.
That is why it is, in fact, so hard to impress with "RT on". On top of that, RT is essentially noise before intensive post-processing, even in a trivial game like Quake. This is a "no post processing" frame (the game is open source, if you wonder how):

[image: r4LJppH.png, raw ray-traced Quake frame before post-processing]


But let's move the goalposts from "AMD doesn't innovate" to "let me cherry pick innovation I like from stinky green camp", shall we.
 
Last edited:
Complaining that RT is noisy "bEfOrE pOsT pRoCeSSiNg" just demonstrates how little you know about it. RT is done in concert with de-noising. And this is true even for non-realtime applications like Blender.
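To put numbers on the noise-vs-samples point, here's a toy sketch (Monte Carlo 101, nothing vendor-specific): a 1-spp estimate of a pixel is nearly pure noise, and the error only falls off as 1/sqrt(samples), which is exactly why realtime RT leans on denoisers and temporal accumulation instead of brute-force sample counts.

    #include <cstdio>
    #include <random>

    // Toy stand-in for "radiance along one random ray": the true pixel value
    // is 0.5, but any single sample is either 0 or 1, i.e. pure noise,
    // much like a 1-spp path-traced frame before denoising.
    int main() {
        std::mt19937 rng(42);
        std::uniform_real_distribution<float> u(0.f, 1.f);
        for (int spp : {1, 4, 64, 1024}) {
            float sum = 0.f;
            for (int i = 0; i < spp; ++i)
                sum += (u(rng) < 0.5f) ? 1.f : 0.f;  // one noisy sample
            // Error shrinks roughly as 1/sqrt(spp); a denoiser fakes the
            // missing samples spatially/temporally instead of waiting for them.
            std::printf("%4d spp -> estimate %.3f (true 0.500)\n", spp, sum / spp);
        }
    }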

Realtime Raytracing is a significant leap forward in IQ and we have Nvidia to thank for that. Not AMD.
 

Ascend

Member
The truth is that for the past 2 years Nvidia has been the only game in town for raytracing, so all the progress made so far has been through Nvidia's solution. And right now they have a combination of superior Raytracing cores AND DLSS, which allows Raytracing AND high resolutions. You don't have to choose one or the other anymore.

AMD still has no answer to this.
Even though this is true, that does not mean one can make the statement that AMD is always two years behind. But people's true colors always unintentionally shine through in their comments.

Supporting an API is nothing like innovating with new tech.
New tech is required to support a new API... 🤷‍♂️ Look at what happened with DX10.1, which AMD was also ahead in...
A couple of years ago AMD was ahead on hardware features, but nobody cared about its hardware advantages then... It's only logical that AMD has inverted the game by adopting features later. Because when they had additional features in their hardware, nobody cared. It seems most people don't even know about it, but obviously they all have an opinion... 🤷‍♂️
 
Last edited:

llien

Member
Most people just want the best GPU they can get. So they choose....
The most expensive card available on the market, right?
How people can spout this level of nonsense in all seriousness is beyond me.

But I have something rather upsetting for ya:

[image: Xfe5dyQ.png]


Complaining that RT is noisy "bEfOrE pOsT pRoCeSSiNg" just demonstrates how little you know about it. RT
Nonsense.
The typical reaction of gamers to that pic (which I'm well aware of) is "no way, it can't be real".
It shows just how things are at the moment.

It also shows why NV is so in love with TAA.

Unless I’m mistaken, it seems like these cards are the best value for a 4K/60FPS experience?
Best value across the board, unless you are into the less-than-a-handful of Nvidia-sponsored RT games that you wish to run with RT on.

If consoles aim at 4K/30fps, there is no GPU on the market that can guarantee you 4K/60fps.
 
Last edited:

Your comments don't seem to have anything to do with anything.

AMD has not had a better GPU than Nvidia in many many years. If the 6900 is better, then people will buy it. What's so complicated about that? But it hasn't actually launched yet now, has it?

For many many YEARS, if you wanted a high-end GPU, Nvidia was the ONLY option. Why is this so difficult for you to come to terms with? When AMD owns the high end, you will see damn near everyone who was buying Nvidia high-end cards switch to AMD. The actual fanboys are in the extreme minority.

This EXACT situation has already happened for CPUs. People used to buy Intel, not because they were Intel fanboys but because they offered superior CPUs. Then, when AMD started putting out quality Zen CPUs, people started flocking to them.

Your comments about Realtime Raytracing are nonsensical. Like you hate it because it was Nvidia who pioneered it, not AMD, so you are in some way obligated to hate it now. Posting a noisy image from Quake like it means something? Seriously, what are you expecting by posting that pic? All it does is show you don't understand raytracing. Using denoising is not some weakness or proof that it's not ready yet. They do the exact same thing for CGI.

 
Last edited:

Brofist

Member
The mental gymnastics you are willing to go through for AMD is quite remarkable. :messenger_ok:

Most people just want the best GPU they can get. So they choose Nvidia.

You on the other hand are PERSONALLY INVESTED in AMD. Like its products are a part of your self-identity.

Nah, it's not like that. I think it's more like he....

But let's move the goalposts from "AMD doesn't innovate" to "let me cherry pick innovation I like from stinky green camp", shall we.


ok nm
 
I think it's Lisa Su's son/daughter posting. There's no way a "normal" person could ever be this infatuated with a company or item. It just can't be healthy. The person even hates the color green, for Christ's sake, lol.
 

llien

Member
For many many YEARS
Lovely switch of subjects, but no, I won't chase you on "many many years".

You hate it.
Perhaps you love it, and that's why you feel hurt when someone criticizes it?
Why do you feel butthurt about someone posting an image actually rendered by rays (with optimizations already applied, by the way)?

Posting a noisy image from Quake like it means something? Seriously, what are you expecting by posting that pic?
Who the hell told you that it means NOTHING, pretty please?
Why are most users SURPRISED and outright DO NOT BELIEVE IT IS TRUE?

You want to claim "Blender uses the same kind of source to render images"? Well, make an argument then: compare what Blender is "denoising" to what has been posted above and think again: do you even have a point?

What's so complicated about that?
You claim that:
1) people are always 100% rational about their purchases
2) FUD doesn't work.

That's patently false.


This EXACT situation has already happened for CPUs. People used to buy Intel, not because they were Intel fanboys but because they offered superior CPUs. Then, when AMD started putting out quality Zen CPUs, people started flocking to them.
Intel had close to zero FUD surrounding its products and its competitor's products.
Whereas with the green stuff, it is a garden of it.

Oh, sweetie, did I hurt your feelings? There is that nice "ignore" button when your mouse is over my avatar.
 
Last edited:

llien

Member
Ironically, FUD is on the pro-AMD side in the CPU world.
Intel's 14nm is much closer to TSMC's 7nm than the naming suggests.

Keep up the good fight, llien! Lisa approves



It is amazing how many of AMD's chronic ailments have been addressed under her rule (and after Raja's departure).

1) Dated look on GPU Control software
2) Drivers (the 5700 series seems to have had hardware issues, which is a flaw on their part, and I hope they do QA better next time; I also wonder if power-supply issues could be handled more gracefully)
3) Embarrassing over-hyping of products
4) Blower coolers on ref cards
 

Senua

Member
Agreed on all points

 

kittoo

Cretinously credulous
Unless I’m mistaken, it seems like these cards are the best value for a 4K/60FPS experience?

It's pretty good, no doubt. It's even with (or better than) the 3080 at 1080p and 1440p, but is, on average, ~7% behind the 3080 at 4K. And since it's 7% cheaper (at MSRP anyway, which is useless these days), they both offer the same price/perf at 4K.
Of course, it has 16GB compared to the 10GB of the 3080, but then the 3080 has far better raytracing and also has DLSS. So in the end, it depends on what you value more. I am sure no one will be disappointed with either of them IF they can get one. Just get whichever one fits your budget and I am sure the experience will be stellar.
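To spell out the math (assuming launch MSRPs of $699 for the 3080 and $649 for the 6800 XT, and taking the ~7% figures as exact): 649 / 699 ≈ 0.93, so the card is ~7% cheaper while delivering ~93% of the 4K frames, and 0.93 / 0.93 ≈ 1.0, i.e. essentially identical frames-per-dollar at 4K.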
 
Last edited:
"AMD Radeon RX 6800 XT gets overclocked to 2.8 GHz with liquid nitrogen - VideoCardz.com"



Well damn, that's impressive.
 

llien

Member
Interesting stats from a German online retailer:

[images: retailer sales/stock charts]
Quite a healthy number of 3070s, a surprisingly high number of 3090s, and the 3080 missing in action (I would be surprised if it weren't)
 
Last edited:
I doubt there is 1 person on Earth who actually uses an AMD card for VR considering how superior Nvidia has always been there, but in case anyone cares, that hasn't changed this gen.

 

Ascend

Member
I doubt there is 1 person on Earth who actually uses an AMD card for VR considering how superior Nvidia has always been there, but in case anyone cares, that hasn't changed this gen.

There we go again... Not always.... Saying on average is fine. Saying always is your bias talking. Once again, it unintentionally shines through...

 

Ascend

Member
Interesting discussions going on here...

[embedded video: Hardware Unboxed Q&A]
Video Index:
00:00 - Welcome Back to Hardware Unboxed
01:22 - Has Nvidia's Poor Launch Helped AMD?
05:07 - How Long Until RDNA2 Optimized Ray Tracing?
07:45 - Is RX 6800 Really Worth Buying Over RTX 3070?
20:37 - Is a 650W PSU Enough?
21:23 - Will RX 6800 Be Highly Overclockable?
22:20 - Will AIBs Use Different VRAM/Cache Amounts?
26:16 - Will Non-Nvidia-RTX Games Perform Better?
31:11 - Driver Stability?
34:03 - Will AIB Launch Lead to More Supply?
38:36 - Why Did AMD Switch Away From Blower Design?
41:59 - Will These Cards See FineWine? (importance of VRAM size)
52:11 - Outro
 
Last edited:

llien

Member
I doubt there is 1 person on Earth who actually uses an AMD card for VR considering how superior Nvidia has always been there, but in case anyone cares, that hasn't changed this gen.


Green reality distortion field?

[image: Pancake-tease.png]
 

waylo

Banned
There we go again... Not always.... Saying on average is fine. Saying always is your bias talking. Once again, it unintentionally shines through...


You AMD guys are wild, man.

"Don't say always because in 15 tests, it wins in 4 of them".

Come the fuck on.

Just can't admit defeat. Nvidia won this time around. AMD cards, while having pretty good raster performance, just aren't a good value. Unfortunately, $50 less for something that offers either equal or worse performance while also not offering viable RT solutions or a DLSS alternative just isn't very attractive. Sorry that hurts your feelings, or that you take that personally for some reason.
 

Ascend

Member
You AMD guys are wild, man.

"Don't say always because in 15 tests, it wins in 4 of them".

Come the fuck on.
That's called precision of language and not exacerbating lies. It's like a team losing 3-1 in football, and saying to the losing team that they never score a goal, while they did score one in the same game. Or are you going to disagree that that is a blatant lie?

Just can't admit defeat.
I am not AMD nor nVidia. I have not been 'defeated', whatever that means. I don't identify myself with corporations.

Nvidia won this time around.
That's your opinion. Considering how everyone was saying that AMD wouldn't get past the RTX 3070, if that, the fact that they are competing at all is a win for AMD, even if, according to those who think only RT and DLSS exist, AMD lost.

AMD cards, while having pretty good raster performance, just aren't a good value.
Good luck with your 8GB on the 3070. I'd rather pay $80 for double the VRAM, thanks, especially since multiple games are already surpassing the 8GB usage limit.
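(For the record, that $80 assumes launch MSRPs: $579 for the 6800 vs $499 for the 3070, for 16GB vs 8GB of VRAM.)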
As for the 6800XT, it's only not a good value if you don't care about SAM, see higher RT performance in current games as a must-have, don't care about power consumption, and see a DLSS-like feature as mandatory right now.
Want an unbiased perspective? Watch the video about future-proofing in my previous post.

Unfortunately, $50 less for something that offers either equal or worse performance while also not offering viable RT solutions or a DLSS alternative just isn't very attractive.
Who says its RT is not viable? If the PS5 can do it with 36CUs, you really think that the PC version with twice the CUs has a non-viable RT solution? Not as fast as nVidia, fine, but to call it "not viable" is, once again, a blatant lie.
And even IF it was not viable, weren't you the one that said this...?;
You do realize PC already has games that have ray tracing as an option? And that PC's without RTX cards can still play them? It's literally a toggle. I don't understand how this is an issue or a concern.
And weren't you using an RTX 2060...????? Why did you buy that? If the RT on the 6800 cards is not viable, neither is the one on the RTX 2060, yet you bought it over the clearly superior 5700XT. What's the deal?

In before, BBBBUUUTTTT DEM DRIVERZZZZZZZZZZZ

Sorry that hurts your feelings, or that you take that personally for some reason.
I dislike dishonesty, putting it mildly. Apathy and carelessness are damaging. You didn't even think AMD could come even close to the levels of the 3080 with anything. Well, they did so in multiple aspects. And now you want to criticize me for calling out people that exaggerate things and perpetuate lies. 🤷‍♂️
 
Last edited:
Interesting; it definitely seems that optimisation for each vendor's RT implementation plays a major role in the final performance.

This game appears to be another example of that, along with Dirt 5. I wonder whether this trend will continue with AMD-sponsored titles and non-sponsored console ports/cross-platform releases. We may end up in a situation where "neutral" non-sponsored titles need separate code paths/branches for RT depending on the detected GPU, if the developers are looking to optimise well for both vendors.
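To illustrate, here is a minimal sketch of what such a per-vendor branch could look like on PC. The RtPreset enum and the preset choices are hypothetical; the DXGI vendor-ID check is the standard way to detect the GPU maker:

    #include <dxgi1_6.h>
    #include <wrl/client.h>
    using Microsoft::WRL::ComPtr;

    // Hypothetical per-vendor RT tuning buckets (illustration only).
    enum class RtPreset { NvidiaTuned, AmdTuned, Generic };

    RtPreset PickRtPreset() {
        ComPtr<IDXGIFactory6> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
            return RtPreset::Generic;
        ComPtr<IDXGIAdapter1> adapter;
        if (FAILED(factory->EnumAdapterByGpuPreference(
                0, DXGI_GPU_PREFERENCE_HIGH_PERFORMANCE, IID_PPV_ARGS(&adapter))))
            return RtPreset::Generic;
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        switch (desc.VendorId) {                        // PCI vendor IDs
            case 0x10DE: return RtPreset::NvidiaTuned;  // e.g. heavier RT reflections
            case 0x1002: return RtPreset::AmdTuned;     // e.g. favour RT shadows
            default:     return RtPreset::Generic;
        }
    }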

Outside of the AMD vs Nvidia RT performance stuff, an interesting point is that the 3090 performs worse with RT enabled than even the 2080 Ti. I can only really think of two possibilities:

  1. The architectural changes from Turing to Ampere, specifically relating to RT, seem (at least in this title) to lead to better performance on Turing than on Ampere, which does seem a little odd, but I suppose not totally impossible.
  2. A more likely scenario is a driver issue with the 3090 in this game? Although it is strange that, without RT turned on, the 3090 is well ahead of the 2080 Ti.
Either way, it looks like the RT landscape/final performance story between these cards may not be fully fleshed out right now, seeing as all of the games tested so far for comparisons are optimised for Nvidia's RT solution/cards.
 
That benchmark makes no sense; OC-ing the 6800 by 10% brings exactly zero perf boost, that is crazy stuff...
I'd say maybe CPU limited (and running on different CPUs), but I doubt CPU load changes much with resolution.
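A crude model of why that can happen (numbers invented for illustration): frame time ≈ max(CPU time, GPU time). If the CPU side takes 10 ms and the GPU side 9 ms, you're already CPU-bound; a 10% core OC cuts the GPU side to ~8.2 ms, and the frame time stays at 10 ms, i.e. a 0% gain.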


Peace & Harmony! :messenger_smiling:

Yeah, I'm not really sure myself; it does seem a little strange, but could it be partially CPU limited at 1440p? Or could it be game engine limits or other weird resource usage?

Regarding OC performance for the 6000 series: we know they clock high, but we don't know exactly how that translates into performance just yet. Let's wait until we see a few more examples once the AIB custom models release, and we should get better data on how they perform.
 

Ascend

Member
No compromise: RTX 3090, but it's terrible value.
RT at the cost of likely having to lower settings in the future due to VRAM: 3080.
Large amount of VRAM at the cost of potentially having to lower RT settings in the future: 6800XT.
Worst option: RTX 3070, due to only having 8GB, so get the 6800 instead.

What's going on? :pie_thinking:

[image: NBhLT23.jpg]
The most obvious conclusion is that RT needs to be optimized for the specific architecture, otherwise it simply doesn't work properly.
Another one might be that AMD's RT is specifically very good with shadows, although that seems very weird, because RT GI is not as heavy as RT shadows. But all of AMD's RT implementations seem to focus primarily on shadows. There must be something more behind that.

Do you have a link to the original video?
 
Last edited:

MadYarpen

Member
This is why I am actually happy that, for example, RT will not be working on AMD cards at launch. It sucks in some way, but on the other hand it gives hope that it will actually work better than in Control, for example, when it's here. And I don't mean it will suddenly be better than NV, but maybe RT turns out to be a playable setting to some extent. It will be interesting to see how much time it takes them.
 

SantaC

Member
It is funny how everyone cares about RT all of a sudden. Nvidia got leapfrogged, so people have to create a new narrative, I guess.
 

MadYarpen

Member
It is funny how everyone cares about RT all of a sudden. Nvidia got leapfrogged, so people have to create a new narrative, I guess.
I think the fact that we are starting the generation where the consoles have some RT capability *might* have something to do with that. And that, for example, Cyberpunk, about to be released, looks crazy good with RT.
 

llien

Member
This is why I am actually happy that, for example, RT will not be working on AMD cards at launch. It sucks in some way, but on the other hand it gives hope that it will actually work better than in Control, for example, when it's here. And I don't mean it will suddenly be better than NV, but maybe RT turns out to be a playable setting to some extent. It will be interesting to see how much time it takes them.
Welp, but even in green-sponsored Control, the 6800XT is only about 10% behind the 2080 Ti.
 
Sooooooo, what are the benchmark comparisons telling us? I wasn't following the news for a while. Is there some kind of average across many games from multiple sources that compares AMD and Nvidia cards?

Go to page 1, post 1 of this thread. I have a bunch of embedded videos for reviews which contain benchmarks and also a link to a review roundup with links to all of the reviews.
 
Yeah, that's all good, but I wish someone would take the time to put all the available data into a spreadsheet so we can analyze it.
 

Ascend

Member
After thinking about it a bit more, maybe the reason that AMD's RT is focused on shadows is that shadows don't need color data. They basically only darken what is already there (extremely simplified, of course). So maybe RT effects that require 'holding' the color data, like reflections, are more demanding on AMD's architecture.
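To illustrate the asymmetry (a toy CPU-side sketch, not AMD's or anyone's actual implementation; the Scene/Hit types are stand-ins): a shadow ray is a boolean visibility query that can stop at the first hit and never touches material data, while a reflection ray needs the nearest hit plus full material/texture shading to produce a color.

    #include <optional>

    struct Vec3 { float x = 0, y = 0, z = 0; };
    struct Ray  { Vec3 origin, dir; };
    struct Hit  { Vec3 pos, normal; int materialId = 0; };

    // Stand-in scene; a real renderer traverses a BVH here.
    struct Scene {
        // Shadow query: ANY hit before the light ends the search.
        // No material or texture data is read; the answer is one bit.
        bool anyHit(const Ray&, float maxDist) const { return false; }

        // Reflection query: must find the NEAREST hit...
        std::optional<Hit> closestHit(const Ray&) const { return std::nullopt; }

        // ...and then shade it: material fetch, texture reads, more bounces.
        // This is the "holding color data" part that costs bandwidth.
        Vec3 shade(const Hit&, int depth) const { return {}; }
    };

    // RT shadow: darken or don't. One bit of output per ray.
    bool occluded(const Scene& s, const Ray& toLight, float lightDist) {
        return s.anyHit(toLight, lightDist);
    }

    // RT reflection: a full radiance estimate along the bounced ray.
    Vec3 reflectRadiance(const Scene& s, const Ray& r, int depth) {
        if (auto h = s.closestHit(r))
            return s.shade(*h, depth + 1);
        return {};  // miss: sky/background color
    }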
 

That is certainly possible and might end up being the case. It is odd that most AMD-sponsored games seem to only be using RT shadows; however, it might be that they know their Super Resolution tech is not ready yet, and that to get playable framerates at something like 4K with many RT effects enabled, they need an upscaling tech like DLSS. Otherwise they would be getting directly compared to Nvidia, and their own sponsored games would be running better on Nvidia cards? At least that is another possibility and something to take into account.

For example, on consoles with only 36 CUs we see good RT reflections in Spider-Man, and we see many RT effects in Watch Dogs (although I know those are running at lower quality/settings vs PC).

We should hopefully know more as the months pass and more games optimized for AMD release, such as console ports etc... It will also be interesting to see how Watch Dogs performs once the driver issue is sorted out. I wonder whether performance will drop or stay roughly the same for that title on AMD cards?
 


waylo

Banned
That's called precision of language and not exacerbating lies. It's like a team losing 3-1 in football, and saying to the losing team that they never score a goal, while they did score one in the same game. Or are you going to disagree that that is a blatant lie?
It's called semantics. Something people often use to not be wrong.

I am not AMD nor nVidia. I have not been 'defeated', whatever that means. I don't identify myself with corporations.
Sure doesn't seem like it judging by this thread.

That's your opinion. Considering how everyone was saying that AMD wouldn't get past the RTX 3070, if that, the fact that they are competing at all is a win for AMD, even if, according to those who think only RT and DLSS exist, AMD lost.
Yes, their basic raster performance is solid. Nobody is taking that away from AMD. But at the end of the day, being on par or slightly worse while also missing out on many useful features isn't really a win.

Good luck with your 8GB on the 3070. I'd rather pay $80 for double the VRAM, thanks, especially since multiple games are already surpassing the 8GB usage limit.
As for the 6800XT, it's only not a good value if you don't care about SAM, see higher RT performance in current games as a must-have, don't care about power consumption, and see a DLSS-like feature as mandatory right now.
Want an unbiased perspective? Watch the video about future-proofing in my previous post.
Allocation =/= usage. Also, as evidenced already, having more VRAM doesn't necessarily make a difference when it's slower.

Who says its RT is not viable? If the PS5 can do it with 36CUs, you really think that the PC version with twice the CUs has a non-viable RT solution? Not as fast as nVidia, fine, but to call it "not viable" is, once again, a blatant lie.
And even IF it was not viable, weren't you the one that said this...?;
That is irrelevant to this discussion. Read the thread you pulled that quote from. Like...what?

And weren't you using an RTX 2060...????? Why did you buy that? If the RT on the 6800 cards is not viable, neither is the one on the RTX 2060, yet you bought it over the clearly superior 5700XT. What's the deal?

In before, BBBBUUUTTTT DEM DRIVERZZZZZZZZZZZ
Hey there genius, the 2060 came out BEFORE a 5700 existed. Like, a full six months before. Guess when I bought my 2060?

I dislike dishonesty, putting it mildly. Apathy and carelessness are damaging. You didn't even think AMD could come even close to the levels of the 3080 with anything. Well, they did so in multiple aspects. And now you want to criticize me for calling out people that exaggerate things and perpetuate lies. 🤷‍♂️
Okay bud.
 