
|OT| AMD Radeon RX 6800/RX 6800 XT Reviews/Benchmarks Thread |OT|

Valued Gamer

Neo Member
Oct 26, 2020
198
357
220
Radeon Chill.
Radeon Anti-Lag.
Checkerboard rendering (idea sketched below).
FidelityFX suite (cross-platform).
SAM.
Unified shaders.
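Since "checkerboard rendering" gets thrown around a lot, here is a minimal sketch of the underlying idea, purely an illustration and not AMD's actual implementation: shade only half the pixels each frame in a checkerboard pattern and reconstruct the rest (real implementations also reproject from the previous frame rather than relying on plain spatial interpolation):

```python
import numpy as np

def checkerboard_reconstruct(frame, parity):
    """Fill pixels where (x + y) % 2 != parity by averaging their shaded
    4-neighbours. Real checkerboard pipelines also reproject the previous
    frame; this is only the spatial half of the idea."""
    h, w = frame.shape
    out = frame.copy()
    for y in range(h):
        for x in range(w):
            if (x + y) % 2 != parity:  # this pixel was not shaded this frame
                neigh = [frame[ny, nx]
                         for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                         if 0 <= ny < h and 0 <= nx < w]
                out[y, x] = sum(neigh) / len(neigh)
    return out

full = np.random.rand(8, 8)  # stand-in for a fully shaded frame
mask = (np.add.outer(np.arange(8), np.arange(8)) % 2) == 0
half = np.where(mask, full, 0.0)  # shade only the (x + y) even pixels
approx = checkerboard_reconstruct(half, parity=0)
print("mean abs reconstruction error:", np.abs(approx - full).mean())
```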

"no no, that doesn't count, only proprietary green crap does!".
No problem with that.
Let's be honest: ray tracing has been the biggest move in gaming IQ for the last couple of years, and DLSS is what will keep pushing gaming forward.

Talking about historic stuff, going back even to ATI, isn't what today is about.
And FidelityFX contains VRS, which came out back in 2018 on Nvidia cards, and Microsoft had a big hand in developing FidelityFX with AMD.

People should really thank Nvidia for doing what they have over the last few years.
Sure, they were overpriced, but they didn't sit on their hands like Intel did back when AMD offered no competition. They pushed ahead and introduced tech that we all now take for granted.
 
  • Like
Reactions: Dampf

llien

Member
Feb 1, 2017
9,375
7,745
895
Ray Tracing has been the biggest move in gaming IQ
Bullshit.
We had shadows, reflections and what not since ages ago.
That is why it is, in fact, so hard to impress with "RT on". On top of that, RT is essentially noise before intensive post-processing, even in a trivial game like Quake. This is a "no post-processing" frame (the game is open source, if you are wondering how):



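For anyone wondering why a raw ray-traced frame looks like static: each pixel is a Monte Carlo estimate of incoming light, and at the roughly one sample per pixel that real-time budgets allow, variance dominates. A toy sketch of that effect (my illustration, nothing taken from the actual Quake source):

```python
import random

def pixel_estimate(true_radiance, spp):
    """Average 'spp' noisy light samples for one pixel. The error shrinks
    like 1/sqrt(spp), which is why 1-spp real-time frames look like static
    before denoising."""
    return sum(random.gauss(true_radiance, 0.5) for _ in range(spp)) / spp

truth = 1.0
for spp in (1, 4, 64, 1024):
    trials = [abs(pixel_estimate(truth, spp) - truth) for _ in range(200)]
    print(f"{spp:5d} spp -> mean abs error {sum(trials) / len(trials):.3f}")
```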
But let's move the goalposts from "AMD doesn't innovate" to "let me cherry-pick the innovations I like from the stinky green camp", shall we.
 

Ascend

Member
Jul 23, 2018
2,650
3,908
515
The truth is that for the past two years Nvidia has been the only game in town for ray tracing, so all the progress made so far has been through Nvidia's solution. And right now they have a combination of superior ray tracing cores AND DLSS, which now allows ray tracing AND high resolutions. You don't have to choose one or the other anymore.

AMD still has no answer to this.
Even though this is true, that does not mean one can make the statement that AMD is always two years behind. But people's true colors always unintentionally shine through in their comments.

Supporting an API is nothing like innovating with new tech.
New tech is required to support a new API... 🤷‍♂️ Look at what happened with DX10.1, in which AMD was also ahead...
A couple of years ago AMD was ahead on hardware features, but nobody cared about its hardware advantages then... It's only logical that AMD has flipped the game by adopting features later. Because when they had additional features in their hardware, nobody cared. It seems most people don't even know about it, but obviously they all have an opinion... 🤷‍♂️
 

llien

Member
Feb 1, 2017
9,375
7,745
895
Most people just want the best GPU they can get. So they choose....
The most expensive card available on the market, right?
How people can spout this level of nonsense in all seriousness is beyond me.

But I have something rather upsetting for ya:



Complaining that RT is noisy "bEfOrE pOsT pRoCeSSiNg" just demonstrates how little you know about it.
Nonsense.
The typical reaction of gamers to that pic (which I'm well aware of) is "no way, it can't be real".
It shows just how things are at the moment.

It also shows why NV is so in love with TAA.

Unless I’m mistaken, it seems like these cards are the best value for a 4K/60FPS experience?
Best value across the board, unless you are into the less-than-a-handful of Nvidia-sponsored RT games that you wish to run with RT on.

If consoles aim at 4K/30 FPS, there is no GPU on the market that can guarantee you 4K/60 FPS.
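The back-of-the-envelope arithmetic behind that last point, assuming console titles are tuned to fill a 30 FPS frame budget:

```python
# A title tuned to fill a 30 FPS budget needs roughly double the GPU
# throughput to fit the same work into a 60 FPS budget at equal settings.
for fps in (30, 60):
    print(f"{fps} FPS -> {1000 / fps:.1f} ms per frame")
print(f"required speedup: {(1000 / 30) / (1000 / 60):.1f}x")
```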
 
Last edited:
  • Fire
Reactions: Bolivar687

wordslaughter

Member
Apr 17, 2019
939
2,626
405
Your comments don't seem to have anything to do with anything.

AMD has not had a better GPU than Nvidia in many, many years. If the 6900 is one, then people will buy it. What's so complicated about that? But it hasn't actually launched yet, now has it?

For many, many YEARS, if you wanted a high-end GPU, Nvidia was the ONLY option. Why is this so difficult for you to come to terms with? When AMD owns the high end, you will see damn near everyone who was buying Nvidia high-end cards switch to AMD. The actual fanboys are in the extreme minority.

This EXACT situation has already happened for CPUs. People used to buy Intel, not because they were Intel fanboys but because Intel offered superior CPUs. Then, when AMD started putting out quality Zen CPUs, people started flocking to them.

Your comments about real-time ray tracing are nonsensical. It's like you hate it because it was Nvidia who pioneered it, not AMD, so you feel obligated to hate it now. Posting a noisy image from Quake like it means something? Seriously, what are you expecting by posting that pic? All it does is show that you don't understand ray tracing. Using denoising is not some weakness or proof that it's not ready yet. They do the exact same thing for CGI.
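For what it's worth, the offline-CGI denoising being referred to can be illustrated with a crude spatial filter. Production denoisers (learned ones fed with albedo/normal buffers, etc.) are far smarter, so treat this only as the principle, not the practice:

```python
import numpy as np

def box_denoise(img, radius=1):
    """Replace each pixel by the mean of its (2r+1)^2 neighbourhood:
    trade a noisy unbiased estimate for a smoother biased one, which is
    the same trade every denoiser makes."""
    h, w = img.shape
    out = np.empty_like(img)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            out[y, x] = img[y0:y1, x0:x1].mean()
    return out

clean = np.outer(np.linspace(0, 1, 16), np.ones(16))   # smooth "converged" image
noisy = clean + np.random.normal(0, 0.3, clean.shape)  # 1-spp-style noise
print("noisy error:   ", np.abs(noisy - clean).mean())
print("denoised error:", np.abs(box_denoise(noisy) - clean).mean())
```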

 

Brofist

Member
Jun 15, 2004
10,688
563
1,690
The mental gymnastics you are willing to go through for AMD are quite remarkable. :messenger_ok:

Most people just want the best GPU they can get. So they choose Nvidia.

You, on the other hand, are PERSONALLY INVESTED in AMD. Like its products are a part of your self-identity.
Nah, it's not like that. I think it's more like he....

But let's move the goalposts from "AMD doesn't innovate" to "let me cherry-pick the innovations I like from the stinky green camp", shall we.

ok nm
 

DonJuanSchlong

Spice Spice Baby
Jul 15, 2020
1,642
3,760
605
The end of today, is tomorrow dust
I think it's Lisa Su's son/daughter posting. There's no way a "normal" person could ever be this infatuated with a company or item. It just can't be healthy. The person even hates the color green, for Christ's sake, lol.
 
  • Like
Reactions: ZywyPL

llien

Member
Feb 1, 2017
9,375
7,745
895
A lovely switch of subjects, but no, I won't chase you on "many many years".

Perhaps you love it and that's why you feel hurt when someone criticizes it?
Why do you feel butthurt about someone posting an image actually rendered by rays (with optimizations already applied, by the way)?

Posting a noisy image from Quake like it means something? Seriously, what are you expecting by posting that pic?
Who the hell told you that it means NOTHING pretty please?
Why are most users SURPRISED and outright DO NOT BELIEVE IT IS TRUE?

You want to claim "Blender uses the same kind of source to render images"? Well, make an argument, then: compare what Blender is "denoising" to what has been posted above and think again: do you even have a point?

What's so complicated about that?
You claim that:
1) people are always 100% rational about their purchases
2) FUD doesn't work.

That's patently false.


This EXACT situation has already happened for CPUs. People used to buy Intel, not because they were Intel fanboys but because Intel offered superior CPUs. Then, when AMD started putting out quality Zen CPUs, people started flocking to them.
Intel had close to zero FUD surrounding its own products and its competitor's products.
Whereas with the green stuff, it is a garden of it.

Oh, sweetie, did I hurt your feelings? There is that nice "ignore" button when your mouse is over my avatar.
 

llien

Member
Feb 1, 2017
9,375
7,745
895
Ironically, the FUD is on the pro-AMD side in the CPU world.
Intel's 14nm is much closer to TSMC's 7nm than the naming suggests.
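A rough version of that comparison, using commonly cited logic-density figures (approximate public numbers, not official cross-foundry measurements):

```python
# Node names vs. reality, using commonly cited logic-density figures:
intel_14nm = 37.5  # MTr/mm^2, Intel's widely cited figure for 14nm
tsmc_n7 = 91.2     # MTr/mm^2, commonly cited for TSMC N7

implied_by_name = (14 / 7) ** 2  # naive area scaling implied by the names
actual = tsmc_n7 / intel_14nm

print(f"names imply {implied_by_name:.1f}x density, actual is ~{actual:.1f}x")
# ~4.0x implied vs ~2.4x actual: the gap is real, but smaller than the names suggest.
```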

Keep up the good fight Ilien! Lisa approves


It is amazing how many of AMD's chronic diseases have been addressed under her leadership (and after Raja's departure).

1) The dated look of the GPU control software
2) Drivers (the 5700 series seems to have had hardware issues, which is a flaw on their part; I hope they do better QA next time, and I also wonder whether power-supply issues can be handled more gracefully)
3) Embarrassing over-hyping of products
4) Blower coolers on reference cards
 

Senua

Member
Apr 30, 2007
3,139
7,086
1,650
Agreed on all points

 

kittoo

Cretinously credulous
Apr 11, 2007
3,173
766
1,460
32
Unless I’m mistaken, it seems like these cards are the best value for a 4K/60FPS experience?
It's pretty good, no doubt. It's even with (or better than) the 3080 at 1080p and 1440p, but on average ~7% behind the 3080 at 4K. And since it's also ~7% cheaper (at MSRP anyway, which is meaningless these days), they both offer the same price/performance at 4K.
Of course, it has 16GB compared to the 3080's 10GB, but the 3080 has far better ray tracing and also has DLSS. So in the end, it depends on what you value more. I am sure no one will be disappointed with either of them IF they can get one. Just get whichever one fits your budget and I am sure the experience will be stellar.
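The price/performance arithmetic above, spelled out with the launch MSRPs (the ~7% performance gap is the figure from the post, not a measurement of mine):

```python
# Launch MSRPs; relative 4K performance normalized to the RTX 3080.
cards = {
    "RX 6800 XT": {"msrp": 649, "rel_4k_perf": 0.93},  # ~7% behind, per the post
    "RTX 3080":   {"msrp": 699, "rel_4k_perf": 1.00},
}
for name, c in cards.items():
    ppd = c["rel_4k_perf"] / c["msrp"]
    print(f"{name}: {ppd * 1000:.3f} relative perf per dollar (x1000)")
# Both land at ~1.43, i.e. effectively identical price/perf at 4K MSRPs.
```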
 

DonJuanSchlong

Spice Spice Baby
Jul 15, 2020
1,642
3,760
605
The end of today, is tomorrow dust
"AMD Radeon RX 6800 XT gets overclocked to 2.8 GHz with liquid nitrogen - VideoCardz.com"



Well damn, that's impressive.
 

llien

Member
Feb 1, 2017
9,375
7,745
895
Interesting stats from a German online retailer:



A quite healthy number of 3070s, a surprisingly high number of 3090s, and the 3080 missing in action (I would be surprised if it weren't).
 
  • Like
Reactions: Ascend
Dec 14, 2008
33,540
1,822
1,240
I doubt there is one person on Earth who actually uses an AMD card for VR, considering how superior Nvidia has always been there, but in case anyone cares, that hasn't changed this gen.

 

Ascend

Member
Jul 23, 2018
2,650
3,908
515
I doubt there is one person on Earth who actually uses an AMD card for VR, considering how superior Nvidia has always been there, but in case anyone cares, that hasn't changed this gen.

There we go again... Not always.... Saying on average is fine. Saying always is your bias talking. Once again, it unintentionally shines through...

 
  • Fire
Reactions: llien

Ascend

Member
Jul 23, 2018
2,650
3,908
515
Interesting discussions going on here...


Video Index:
00:00 - Welcome Back to Hardware Unboxed
01:22 - Has Nvidia's Poor Launch Helped AMD?
05:07 - How Long Until RDNA2 Optimized Ray Tracing?
07:45 - Is RX 6800 Really Worth Buying Over RTX 3070?
20:37 - Is a 650W PSU Enough?
21:23 - Will RX 6800 Be Highly Overclockable?
22:20 - Will AIBs Use Different VRAM/Cache Amounts?
26:16 - Will Non-Nvidia-RTX Games Perform Better?
31:11 - Driver Stability?
34:03 - Will AIB Launch Lead to More Supply?
38:36 - Why Did AMD Switch Away From Blower Design?
41:59 - Will These Cards See FineWine? (importance of VRAM size)
52:11 - Outro
 
  • Like
Reactions: Ryujin and llien

llien

Member
Feb 1, 2017
9,375
7,745
895

waylo

Member
Jan 7, 2018
680
1,173
490
There we go again... Not always.... Saying on average is fine. Saying always is your bias talking. Once again, it unintentionally shines through...

You AMD guys are wild man.

"Don't say always because in 15 tests, it wins in 4 of them".

Come the fuck on.

Just can't admit defeat. Nvidia won this time around. AMD cards, while having pretty good raster performance, just aren't a good value. Unfortunately, $50 less for something that offers either equal or worse performance while also not offering viable RT solutions or a DLSS alternative just isn't very attractive. Sorry that hurts your feelings, or that you take that personally for some reason.
 
  • Like
Reactions: ZywyPL

Ascend

Member
Jul 23, 2018
2,650
3,908
515
You AMD guys are wild man.

"Don't say always because in 15 tests, it wins in 4 of them".

Come the fuck on.
That's called precision of language, and not exaggerating things into lies. It's like a team losing 3-1 in football, and telling the losing team that they never score a goal, when they did score one in that same game. Or are you going to disagree that that is a blatant lie?

Just can't admit defeat.
I am neither AMD nor Nvidia. I have not been 'defeated', whatever that means. I don't identify myself with corporations.

Nvidia won this time around.
That's your opinion. Considering how everyone was saying that AMD wouldn't get past the RTX 3070, if that, the fact that they are competing at all is a win for AMD, even if, according to those who think only RT and DLSS exist, AMD lost.

AMD cards, while having pretty good raster performance, just aren't a good value.
Good luck with your 8GB on the 3070. I'd rather pay $80 more for double the VRAM, thanks, especially since multiple games are already surpassing 8GB of usage.
As for the 6800 XT, it's only a bad value if you don't care about SAM, see higher RT performance in current games as a must-have, don't care about power consumption, and see a DLSS-like feature as mandatory right now.
Want an unbiased perspective? Watch the video about future-proofing in my previous post.
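As a rough illustration of why 8GB can get tight at 4K, here is some back-of-the-envelope VRAM arithmetic; the buffer counts and texture numbers are invented for illustration, not taken from any specific game:

```python
def mib(n_bytes):
    return n_bytes / 2**20

w, h = 3840, 2160
# ~4 render targets at 8 bytes per pixel for a 4K deferred pipeline (invented):
render_targets = 4 * w * h * 8
# 300 unique 4K BC7 textures (1 byte/px) with full mip chains (~4/3 overhead):
textures = 300 * 4096 * 4096 * 4 / 3

print(f"render targets: ~{mib(render_targets):,.0f} MiB")
print(f"textures:       ~{mib(textures):,.0f} MiB")
# Add meshes, shadow maps and (with RT on) acceleration structures, and an
# 8 GB budget stops looking roomy long before a 16 GB one does.
```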

Unfortunately, $50 less for something that offers either equal or worse performance while also not offering viable RT solutions or a DLSS alternative just isn't very attractive.
Who says its RT is not viable? If the PS5 can do it with 36 CUs, do you really think the PC part with twice the CUs has a non-viable RT solution? Not as fast as Nvidia's, fine, but to call it "not viable" is, once again, a blatant lie.
And even IF it were not viable, weren't you the one who said this...?
You do realize PC already has games that have ray tracing as an option? And that PCs without RTX cards can still play them? It's literally a toggle. I don't understand how this is an issue or a concern.
And weren't you using an RTX 2060...????? Why did you buy that? If the RT on the 6800 cards is not viable, neither is the RT on the RTX 2060, and yet you bought it over the clearly superior 5700 XT. What's the deal?

In before, BBBBUUUTTTT DEM DRIVERZZZZZZZZZZZ

Sorry that hurts your feelings, or that you take that personally for some reason.
I dislike dishonesty, to put it mildly. Apathy and carelessness are damaging. You didn't even think AMD could come close to the level of the 3080 with anything. Well, they did, in multiple aspects. And now you want to criticize me for calling out people who exaggerate things and perpetuate lies. 🤷‍♂️
 

Ryujin

Gold Member
Dec 28, 2006
1,491
1,012
1,420
Interesting; it definitely seems that optimisation for each vendor's RT implementation plays a major role in the final performance.

This game appears to be another example of that, along with Dirt 5. I wonder whether this trend will continue with AMD-sponsored titles and non-sponsored console ports/cross-platform releases? We may end up with a case where "neutral" non-sponsored titles need separate RT code paths/branches depending on the detected GPU, if the developers are looking to optimise well for both vendors (a toy sketch of that idea follows below).

Outside of the AMD vs Nvidia RT performance stuff, an interesting point is that the 3090 performs worse with RT enabled than even the 2080 Ti. I can only really think of two possibilities:

  1. The architectural changes from Turing to Ampere relating to RT lead, at least in this title, to better performance on Turing than on Ampere, which seems a little odd but I suppose is not totally impossible.
  2. A more likely scenario is a driver issue with the 3090 in this game? Although it is strange that, without RT turned on, the 3090 is well ahead of the 2080 Ti.
Either way, it looks like the RT landscape/final performance story between these cards may not be fully fleshed out yet, seeing as all of the games tested for comparisons so far are optimised for Nvidia's RT solution/cards.
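A toy sketch of that per-vendor RT path idea, as hypothetical engine-side dispatch rather than any real engine's API (the tuning choices in the comments are rough rules of thumb, not vendor guidance):

```python
def pick_rt_profile(pci_vendor_id: int) -> dict:
    """Hypothetical engine-side choice of RT settings per detected GPU vendor."""
    AMD, NVIDIA = 0x1002, 0x10DE  # PCI vendor IDs
    if pci_vendor_id == NVIDIA:
        # Dedicated RT cores: deeper bounces / heavier per-ray work tend to be fine.
        return {"max_bounces": 2, "rays_per_pixel": 1.0, "inline_rt": False}
    if pci_vendor_id == AMD:
        # RDNA2: inline ray queries and lighter per-ray work often map better.
        return {"max_bounces": 1, "rays_per_pixel": 0.5, "inline_rt": True}
    return {"max_bounces": 0, "rays_per_pixel": 0.0, "inline_rt": False}  # RT off

print(pick_rt_profile(0x1002))
```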
 

llien

Member
Feb 1, 2017
9,375
7,745
895
NV probably needs a driver update to fix whatever is going wrong on Ampere.
That benchmark makes no sense: OC'ing the 6800 by 10% brings exactly zero perf boost, which is crazy stuff...
I'd say it's maybe CPU-limited (and running on different CPUs), but I doubt the CPU load changes much with resolution.
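A toy bottleneck model makes the CPU-limited hypothesis concrete: frame time is roughly the longer of the CPU and GPU legs, so a 10% GPU overclock buys nothing while the CPU leg is longer (my illustration, with made-up timings):

```python
def fps(cpu_ms, gpu_ms, gpu_clock_scale=1.0):
    """Frame rate under a max(CPU, GPU) frame-time model."""
    return 1000 / max(cpu_ms, gpu_ms / gpu_clock_scale)

print(fps(cpu_ms=10, gpu_ms=9))                        # ~100 FPS, CPU-bound
print(fps(cpu_ms=10, gpu_ms=9, gpu_clock_scale=1.10))  # +10% OC, still ~100 FPS
print(fps(cpu_ms=5, gpu_ms=9, gpu_clock_scale=1.10))   # GPU-bound: OC now helps
```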

Peace & Harmony! :messenger_smiling:
 

Ryujin

Gold Member
Dec 28, 2006
1,491
1,012
1,420
Yeah, I'm not really sure myself. It does seem a little strange, but it could be partially CPU-limited at 1440p? Or it could be game-engine limits or other weird resource usage?

Regarding OC performance for the 6000 series, we know that they clock high, but we don't know exactly how that translates into performance just yet. Let's wait until we see a few more examples once the AIB custom models release; then we should get better data on how they perform.