
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

Kenpachii

Member
The HairWorks implementation was too computationally expensive for the result it was producing.

Nobody stopped AMD from:

1) updating their drivers to enforce lower tessellation for The Witcher 3,
2) going to developers to get a tessellation slider implemented,
3) implementing TressFX in the game and talking with the developer.

But what did AMD do? "Bad, bad Nvidia." And the end result is that AMD users have to tinker with their settings and wait forever for meaningful driver updates, while with Nvidia it just works.

So what are people going to buy? The stuff that just works. The end.

This is why I said earlier in some AMD topic that AMD needs to go out and start talking with the developers of meaningful games that are coming out. Nobody gives two flying shits about Dirt 5 or some shitty Godfall hack-and-slash game. Start working with key titles such as Cyberpunk / Metro etc., make sure those games run ray tracing at 1080p 60 fps on the 6800 XT, and ship an AMD option that specifically hits that target on those cards, even if the quality is a bit reduced compared to Nvidia. At least then people have a stable, performing solution going forward, instead of having to wait while all they get is AMD crying about Nvidia, which your consumers don't want to hear.
 
What do you mean?

Repeating the same fucking tropes that show he hasn't used an AMD GPU in the last 10 years.

"bad drivers bla bla"
"software sucks blabla"
Just showing synthetics where Nvidia wins and taking 50% of the benchmark presentation time for that, while leaving out the ones where they don't.

There is enough fair criticism of RDNA 2 (no DLSS, RT perf), but that surely isn't it. I really hate shit like that.

I had more driver issues in my first 6 weeks of Ampere than I had in 6 years of GCN.
Adrenalin software is so good that it took Nvidia a year to copy it with GeForce Experience (which does it better now in most aspects, tbf, but it took them a year to get there).


bonus:

"look at my BF LCD monitor im sitting 30cm in front of" lol. sorry nvidia didn't send you an OLED like linus :messenger_loudly_crying:
 

Antitype

Member
HairWorks heavily utilised tessellation to spam a shitload of triangles where they weren't really needed. At the time, Nvidia had much stronger geometry performance than AMD. This meant that while HairWorks tanked performance even on Nvidia, thanks to their geometry advantage it absolutely destroyed AMD.
Tessellation in general was a big weapon Nvidia used to slap AMD around: introducing triangles where they were not visible cluttered the graphics pipeline and choked AMD GPUs. The visual difference between max tessellation and medium tessellation was negligible, as most of the extra triangles were invisible to the player, but when you benchmark at max settings with the sliders turned all the way up, it disproportionately hits AMD.
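To put rough numbers on why that driver override mattered (a back-of-the-envelope sketch, not profiled data from any real game): for triangle-domain tessellation in D3D11, the generated primitive count grows roughly with the square of the tessellation factor, so clamping from the 64x API maximum down to the 16x override mentioned later in this thread cuts the geometry load by roughly 16x.

```cpp
#include <cstdio>
#include <initializer_list>

// Rough model: triangle-domain tessellation generates primitives roughly
// proportional to the square of the tessellation factor. (HairWorks actually
// tessellates isoline patches for strands, which grow more slowly, but the
// "higher factor = far more geometry for no visible gain" dynamic is the same.)
long long trianglesAtFactor(long long basePatches, long long tessFactor) {
    return basePatches * tessFactor * tessFactor;
}

int main() {
    const long long patches = 10000;  // hypothetical patch count, for illustration
    for (long long f : {8LL, 16LL, 64LL}) {
        std::printf("tess factor %2lld -> ~%lld triangles\n",
                    f, trianglesAtFactor(patches, f));
    }
    // 64x produces ~16x the triangles of a 16x clamp, and at max settings
    // most of them end up subpixel or invisible.
    return 0;
}
```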

It's not really cheating, but it is a little bit sus.

And TressFX was very compute-heavy, thus tanking performance on Nvidia while not being too crippling on AMD. People always like to call out NV for its use of tessellation while giving AMD a pass for its compute usage. They both play to the advantages of their architectures. There's nothing scummy in that.
 

Irobot82

Member
And TressFX was very compute-heavy, thus tanking performance on Nvidia while not being too crippling on AMD. People always like to call out NV for its use of tessellation while giving AMD a pass for its compute usage. They both play to the advantages of their architectures. There's nothing scummy in that.
TressFX was fully open source, which allowed Nvidia to optimize for it. HairWorks is not.
 

Turk1993

GAFs #1 source for car graphic comparisons
It is indeed nice to have both, which the 6800XT also has, in case you conveniently forgot.

I would pick 16GB with medium speed RT over 10GB with slightly higher speed RT.

And oh, when the 3080 becomes RAM limited, you can expect me to come rub it in your face, and I then expect an apology too. You have been warned.
The XT will be more limited by its RT capabilities than the RTX by its RAM limitation. If you're gonna rub it in my face whenever the RTX gets VRAM-limited, I'll rub it in your face whenever the RX falls behind an RTX 3080 in ray-traced games. If you want to play it like that, fine :).
 
And TressFX was very compute-heavy, thus tanking performance on Nvidia while not being too crippling on AMD. People always like to call out NV for its use of tessellation while giving AMD a pass for its compute usage. They both play to the advantages of their architectures. There's nothing scummy in that.

All the tech AMD has distributed to game developers has always been fully open source, with no proprietary black boxes.

And while I really think that's the right mindset/philosophy for moving mankind forward, it obviously did not bode well for AMD in the past, and I kinda wish they weren't so dogmatic about it.
 

Ascend

Member
The XT will be more limited by its RT capabilities than the RTX by its RAM limitation. If you're gonna rub it in my face whenever the RTX gets VRAM-limited, I'll rub it in your face whenever the RX falls behind an RTX 3080 in ray-traced games. If you want to play it like that, fine :).
You already did, so, no change there. 🤷‍♂️
 

Ascend

Member
So... is the 6000 series a wet fart or a success?

Not asking Ilien as he's as biased as it gets.
No DLSS alternative as of right now
RT is inferior to nVidia as of right now
Similar performance
Lower power consumption.

The answer is: it depends on what you want. If you feel like you're gonna die if you don't have DLSS (f)or RT, go nVidia.
Other than that, you can save a few bucks for competitive performance.

It's also interesting how everyone is ignoring the 6800 non-XT.
 
No DLSS alternative as of right now
RT is inferior to nVidia as of right now
Similar performance
Lower power consumption.

The answer is: it depends on what you want. If you feel like you're gonna die if you don't have DLSS (f)or RT, go nVidia.
Other than that, you can save a few bucks for competitive performance.

It's also interesting how everyone is ignoring the 6800 non-XT.
Thanks.

Oh man those RT benches don't look good for the consoles. :(
 

CrustyBritches

Gold Member
So... is the 6000 series a wet fart or a success?
RDNA 2 is a success in the sense that it's competitive up and down the stack, with slightly better perf/$ comparing the 6800 XT to the 3080. Hopefully we'll get a look at some nice AIBs soon. Consoles will commonly be getting at least RT reflections going forward, and you can expect many games to be optimized for RDNA 2.

For as much as certain elements on this board rail against DLSS, it's an important component in Nvidia's strategy to bring RT to market. For the whole suite of hybrid-RT effects (e.g. Control), you must have AI upscaling.

The best part about a competitive AMD is how it forces Nvidia to sell their GA102 at $699. The bummer about everything is the lack of available stock. At least people who need an upgrade have a chance at RDNA 2 in addition to Ampere.
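The arithmetic behind that last point is simple enough to sketch. In its Quality mode, DLSS renders internally at roughly two-thirds of the output resolution per axis, so a 4K frame only shades about 44% of the output pixel count before the upscale (a sketch assuming the published Quality-mode ratio; the other modes use smaller fractions):

```cpp
#include <cstdio>

int main() {
    const int outW = 3840, outH = 2160;  // 4K output
    // DLSS "Quality" renders at ~2/3 of the output resolution per axis.
    const int inW = outW * 2 / 3;        // 2560
    const int inH = outH * 2 / 3;        // 1440
    const double ratio = (double(inW) * inH) / (double(outW) * outH);

    std::printf("internal render: %dx%d = %.0f%% of output pixels\n",
                inW, inH, ratio * 100.0);
    // ~44% of the pixels means ~44% of the per-pixel RT and shading work
    // per frame, which is why hybrid-RT-heavy titles lean on upscaling.
    return 0;
}
```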
 

Ascend

Member
Translated:

"In the individual games, it turns out that AMD's ray tracing implementation performs completely differently depending on the title. There are games where AMD's solution has quite a problem. Control is such a case: the Radeon RX 6800 XT is even 16 percent behind a GeForce RTX 3070, and loses 59 percent of its performance with ray tracing enabled. It is quite possible that the different approach to the ray tracing implementation makes the difference here.
...
But there are also games that suit the AMD accelerators quite well. In Call of Duty: Modern Warfare, for example, the Radeon RX 6800 XT 'only' slows down by 32 percent, not far off the minus 29 percent of the GeForce RTX 3080. That puts the AMD card only 10 percent behind its opponent in this game, and 13 percent faster than the GeForce RTX 3070, which in turn is 3 percent slower than the Radeon RX 6800."
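For anyone wondering where figures like "loses 59 percent" come from: it's the card's frame rate with RT on, relative to the same card's frame rate with RT off. A minimal sketch with made-up frame rates (not the review's actual data):

```cpp
#include <cstdio>

// Performance lost by enabling RT, relative to the same card's RT-off result.
double rtCostPercent(double fpsRtOff, double fpsRtOn) {
    return (1.0 - fpsRtOn / fpsRtOff) * 100.0;
}

int main() {
    // Hypothetical numbers purely to show the calculation: a card that does
    // 100 fps with RT off and 41 fps with RT on has "lost 59 percent".
    std::printf("RT cost: %.0f%%\n", rtCostPercent(100.0, 41.0));
    return 0;
}
```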

 

UrgeLoLUS

Neo Member
Is it bad judgment on my part that I really want to go with an AMD GPU? I just got an X570 motherboard, a 5800X, and 32GB of 3600MHz DDR4, and was planning to go for a 6800 or the XT. But even if the benchmarks are okay, I'm not sure. A $50-100 lower price would make it easier. I've had Nvidia GPUs forever now, and I want them to step up their game and stop milking us PC players for money. The 3000 series looks to be a product of the increased competition they knew they would be up against.
 

BluRayHiDef

Banned
You did, in the context of "number of RT games".

No, I didn't. Your post claimed that there are only about twenty games that support ray tracing and only about a handful that support DLSS. So, I addressed both of those claims.

See for yourself.

There are only 20 known RT titles, including those in the works.
There is only a handful of games supporting NV's DLSS upscaling.
So, "yes" is rather an optimistic take.
 
So in the end, AMD's blustering about how they would have a superior launch was typical bullshit; even though demand for these cards is undoubtedly lower than for Nvidia's offerings, they still sold out everywhere with no obvious sign of the stock they promised. I'm shocked, I tell you. Shocked!

Also, the resident fanboys here seem awfully silent about how AMD paid the developers of Godfall to only enable RT on AMD cards. Strange how they scream incessantly about how evil Nvidia is when things happen that exclude AMD, but now they're completely absent when it's AMD excluding Nvidia. I'm genuinely shocked that this might happen!
 

Ascend

Member
So in the end, AMD's blustering about how they would have a superior launch was typical bullshit; even though demand for these cards is undoubtedly lower than for Nvidia's offerings, they still sold out everywhere with no obvious sign of the stock they promised. I'm shocked, I tell you. Shocked!
Wait till the AIB cards are released before judging.

Also, the resident fanboys here seem awfully silent about how AMD paid the developers of Godfall to only enable RT on AMD cards. Strange how they scream incessantly about how evil Nvidia is when things happen that exclude AMD, but now they're completely absent when it's AMD excluding Nvidia. I'm genuinely shocked that this might happen!
At this point, both nVidia and its users deserve it, considering how long it went on in reverse. This stuff is not good for the gaming market in the long run, but everyone, i.e. nVidia and their users, should get a good couple of years of the reverse to know how it feels, and then we can have balance.
But it's not the same, because ultimately it will work on nVidia. nVidia did a bunch of shady closed shit that never ever worked on AMD hardware.

In other news;
 

Turk1993

GAFs #1 source for car graphic comparisons
No DLSS alternative as of right now
RT is inferior to nVidia as of right now
Similar performance
Lower power consumption.

The answer is: it depends on what you want. If you feel like you're gonna die if you don't have DLSS (f)or RT, go nVidia.
Other than that, you can save a few bucks for competitive performance.

It's also interesting how everyone is ignoring the 6800 non-XT.
I'm more interested in the 6900 XT's performance; if it has RT performance comparable to, or somewhere between, the 3070 and 3080, I might snatch one. Oh, and I agree with 90% of what you wrote there :).
 

psorcerer

Banned
Nobody stopped AMD from:

1) updating their drivers to enforce lower tessellation for The Witcher 3,
2) going to developers to get a tessellation slider implemented,
3) implementing TressFX in the game and talking with the developer.

I think you're coming at me from the wrong direction.
HairWorks was a subpar implementation. In general, NV was always bad at their implementations, mostly because they are not game developers.
So it's pretty understandable that AMD didn't do much about it; they just let it sort itself out.
It was always the case: NV comes out with some badly optimized stuff, and game developers implement it in a much better fashion later.
The same will happen with RT.
 

VFXVeteran

Banned
Translated:

"In the individual games, it turns out that AMD's ray tracing implementation performs completely differently depending on the title. There are games where AMD's solution has quite a problem. Control is such a case: the Radeon RX 6800 XT is even 16 percent behind a GeForce RTX 3070, and loses 59 percent of its performance with ray tracing enabled. It is quite possible that the different approach to the ray tracing implementation makes the difference here.
...
But there are also games that suit the AMD accelerators quite well. In Call of Duty: Modern Warfare, for example, the Radeon RX 6800 XT 'only' slows down by 32 percent, not far off the minus 29 percent of the GeForce RTX 3080. That puts the AMD card only 10 percent behind its opponent in this game, and 13 percent faster than the GeForce RTX 3070, which in turn is 3 percent slower than the Radeon RX 6800."


LOL!

Control uses several RT features vs. COD's one feature. Of course it's going to lag behind less. It's one feature.
 

ethomaz

Banned
Translated:

"In the individual games, it turns out that AMD's ray tracing implementation performs completely differently depending on the title. There are games where AMD's solution has quite a problem. Control is such a case: the Radeon RX 6800 XT is even 16 percent behind a GeForce RTX 3070, and loses 59 percent of its performance with ray tracing enabled. It is quite possible that the different approach to the ray tracing implementation makes the difference here.
...
But there are also games that suit the AMD accelerators quite well. In Call of Duty: Modern Warfare, for example, the Radeon RX 6800 XT 'only' slows down by 32 percent, not far off the minus 29 percent of the GeForce RTX 3080. That puts the AMD card only 10 percent behind its opponent in this game, and 13 percent faster than the GeForce RTX 3070, which in turn is 3 percent slower than the Radeon RX 6800."

I had no idea Modern Warfare even used RT.
 
Nobody stopped AMD from:

1) updating their drivers to enforce lower tessellation for The Witcher 3,
2) going to developers to get a tessellation slider implemented,
3) implementing TressFX in the game and talking with the developer.

But what did AMD do? "Bad, bad Nvidia." And the end result is that AMD users have to tinker with their settings and wait forever for meaningful driver updates, while with Nvidia it just works.

So what are people going to buy? The stuff that just works. The end.

This is why I said earlier in some AMD topic that AMD needs to go out and start talking with the developers of meaningful games that are coming out. Nobody gives two flying shits about Dirt 5 or some shitty Godfall hack-and-slash game. Start working with key titles such as Cyberpunk / Metro etc., make sure those games run ray tracing at 1080p 60 fps on the 6800 XT, and ship an AMD option that specifically hits that target on those cards, even if the quality is a bit reduced compared to Nvidia. At least then people have a stable, performing solution going forward, instead of having to wait while all they get is AMD crying about Nvidia, which your consumers don't want to hear.

They did; there was a slider in the driver (AMD optimised was 16x, iirc).

I had a 290X and a 1060 at the time, in separate computers. HairWorks utterly wrecked my 1060 and I wished it had the AMD option.
 

Pagusas

Elden Member
Now that it's been announced that Nvidia is also getting SAM support, I'm super excited to see how these cards compare like for like with it. Get your DLSS solution in place soon, AMD; I want to see this boxing match!!!
 

Irobot82

Member
Rumor says it's coming with the yearly December driver overhaul. It's just a rumor though, and if it was that close, I suppose we would have heard more about it. So, don't get your hopes up yet.
I am curious about the implementation. They said they want it to work on everything, so I expect it to not be as good as DLSS; but if you can just inject it into any game through the Radeon settings, that would be fucking game-changing.
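For context on why "inject into any game" implies a quality ceiling: a driver-level upscaler only sees the finished frame, with no motion vectors or depth, so the best it can do is spatial filtering. A toy single-channel bilinear upscale, purely illustrative of that class of technique (not AMD's actual algorithm, which was unannounced at the time):

```cpp
#include <cstdio>
#include <vector>

// Toy bilinear upscale of a single-channel image. A driver-injected,
// game-agnostic upscaler can only work from the finished frame like this
// (no motion vectors, no depth), which is the core reason it's unlikely to
// match a temporal, per-game-integrated approach like DLSS.
std::vector<float> bilinearUpscale(const std::vector<float>& src,
                                   int sw, int sh, int dw, int dh) {
    std::vector<float> dst(static_cast<size_t>(dw) * dh);
    for (int y = 0; y < dh; ++y) {
        for (int x = 0; x < dw; ++x) {
            // Map destination pixel centers back into source space.
            float fx = (x + 0.5f) * sw / dw - 0.5f;
            float fy = (y + 0.5f) * sh / dh - 0.5f;
            int x0 = fx < 0 ? 0 : static_cast<int>(fx);
            int y0 = fy < 0 ? 0 : static_cast<int>(fy);
            int x1 = x0 + 1 < sw ? x0 + 1 : sw - 1;
            int y1 = y0 + 1 < sh ? y0 + 1 : sh - 1;
            float tx = fx - x0, ty = fy - y0;
            if (tx < 0) tx = 0;
            if (ty < 0) ty = 0;
            float a = src[y0 * sw + x0], b = src[y0 * sw + x1];
            float c = src[y1 * sw + x0], d = src[y1 * sw + x1];
            dst[static_cast<size_t>(y) * dw + x] =
                (a * (1 - tx) + b * tx) * (1 - ty) +
                (c * (1 - tx) + d * tx) * ty;
        }
    }
    return dst;
}

int main() {
    std::vector<float> frame = {0, 1, 1, 0};       // 2x2 "frame"
    auto up = bilinearUpscale(frame, 2, 2, 4, 4);  // upscale to 4x4
    for (int y = 0; y < 4; ++y, std::puts(""))
        for (int x = 0; x < 4; ++x) std::printf("%.2f ", up[y * 4 + x]);
    return 0;
}
```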
 

rnlval

Member
Since Nvidia has the better-designed and higher-performing ray tracing implementation, I went with that. But bullshit like what happened with Dirt 5 now, or Godfall implementing RT only on AMD and only later on Nvidia, isn't out of the question. At the moment, an AMD-sponsored game means actively gimping performance on Nvidia and removing features that are beneficial for the paying customer. AMD turned out to be the bad guy in the end: high prices on CPUs the first second they saw themselves on top, and now doing shit like this with the graphics cards. There's really not much reason to cheer for them. They're being shitbags.
FYI, AMD's RDNA 2 seems to be geared towards DXR Tier 1.1.



NVIDIA RTX cards can run Dirt 5's DXR path.

Godfall uses DX12U DXR with an unknown tier level, and its developers may not have qualified NVIDIA RTX's DXR implementation at the present time.
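For reference, the tier is something a game can query at startup and branch on, which is the unglamorous way per-vendor gating usually happens. A minimal sketch using D3D12's standard feature-support check (Windows only; device creation and error handling kept to the bare minimum):

```cpp
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    // Create a device on the default adapter.
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 __uuidof(ID3D12Device),
                                 reinterpret_cast<void**>(&device))))
        return 1;

    // D3D12_FEATURE_D3D12_OPTIONS5 reports the device's DXR tier.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5)))) {
        switch (opts5.RaytracingTier) {
            case D3D12_RAYTRACING_TIER_1_1:
                // Adds inline ray tracing (RayQuery), DispatchRays via
                // ExecuteIndirect, GeometryIndex(), etc.
                std::puts("DXR Tier 1.1 supported");
                break;
            case D3D12_RAYTRACING_TIER_1_0:
                std::puts("DXR Tier 1.0 only");
                break;
            default:
                std::puts("no hardware DXR");
                break;
        }
    }
    device->Release();
    return 0;
}
```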
 
So is this gonna be the same old song about DLSS, but for Super Resolution? Because DLSS delivered, and it's continuously delivering. Not only that, but ray tracing; comparable ray tracing at the least. I honestly wish the makers of my CPU made my GPUs too.
 

Rentahamster

Rodent Whores
At this point, both nVidia and its users deserve it, considering how long it went on in reverse. This stuff is not good for the gaming market in the long run, but everyone, i.e. nVidia and their users, should get a good couple of years of the reverse to know how it feels, and then we can have balance.
If it's not good for gamers in the long run, which it isn't, then NO, Nvidia and their users should not get a couple of years in reverse. Tit for tat sucks.

Somebody needs to break the cycle.
 
If it's not good for gamers in the long run, which it isn't, then NO, Nvidia and their users should not get a couple of years in reverse. Tit for tat sucks.

Somebody needs to break the cycle.
Careful there; remember that it is okay when it's "your side". I mean, this is ResetEra-level thinking, but even here it's apparently unavoidable.
 
Just for the record, I'm completely against money-hatting features to lock out the competition, or forcing partnered developers to use proprietary closed tech that won't work on the competition's GPUs.

I don't care if it is Nvidia or AMD doing it; I think it is a really shitty practice. Having said that, I'm not sure that is the case with Godfall. If AMD had some custom extensions for RT, or somehow had a stipulation in their contract that disabled DXR features when an Nvidia card is detected, then we would see that happen across the board, and that doesn't seem to be the case, as Dirt 5 works perfectly fine on Nvidia GPUs.

I would imagine Far Cry 5 (or was it 6?) (another AMD-sponsored title) RT will also work fine on Nvidia GPUs. I think there has to be something more going on with the Godfall situation on the technical side; I just don't know what. It could even just be them holding off due to a lack of optimization in their RT implementation for Nvidia GPUs.

Of course, I could be wrong, and AMD could be picking up Nvidia's shitty practices, which would be really terrible for consumers and the industry. We don't need the RT landscape to turn into a fractured hot mess.

At the moment, until we know more, I'm willing to give AMD the benefit of the doubt, as this seems to go against their company culture, their willingness to push open standards, and their commitment to open source.
 

Senua

Member
Humans.

/shrug
humans and their tribezzzzzz

 

Ascend

Member
If it's not good for gamers in the long run, which it isn't, then NO, Nvidia and their users should not get a couple of years in reverse. Tit for tat sucks.

Somebody needs to break the cycle.
Tit for tat sucks indeed. But sometimes it is the only thing that makes people understand how bad it is, because people are constantly touting nVidia's greatness, whatever weird shenanigans they pull. It is not about getting back at anyone; it's so that maybe then they'll understand how detrimental this is. Unfortunately, I fear it will still devolve into the same fanboy nonsense, but I digress.

As an example, TressFX was also temporarily exclusive to AMD cards. But nVidia was able to fix their performance later, because it was open source. Then suddenly it was rebranded PureHair, and nVidia even advertised with it, despite it originally being AMD tech... The reverse, with, say, HairWorks, was never possible, simply because nVidia keeps everything to themselves.

People have short memories, and they immediately forget the past when a shiny new toy comes out, but I don't. That is why I cannot prioritize an nVidia card when there is an AMD alternative. I'm extremely grateful that the GeForce Partner Program did not gain a foothold, for example. Since that time, I have not bought any Asus, MSI or Gigabyte products, because they were nodding along like a bunch of sheep; I switched to ASRock for motherboards, for example. I value not only gaming, but sustaining the gaming market. There are things that are constructive to it and things that are destructive to it. If you as a company support something that is hugely destructive, I drop you in a heartbeat.

But that's just my 2 cents.

Careful there; remember that it is okay when it's "your side". I mean, this is ResetEra-level thinking, but even here it's apparently unavoidable.
That's kind of funny coming from you. You've been one of the most biased posters since forever.
 

spyshagg

Should not be allowed to breed
I'm extremely grateful that the GeForce Partner Program did not gain a foothold, for example



There have been two major attempts in silicon computing history where one company tried to kill another not by competing with it, but by using backstage tactics devoid of any morality or respect for the free market or the law itself.

Both were made against AMD.



1 - Intel, having already lost some battles to AMD in the 386/486 era, and with no means to compete with the new Athlon 64, paid billions in bribes to stop retailers and OEMs from selling and building PCs with AMD.
This tactic deprived AMD of the money to invest in R&D. It was a giant with its own factories, and it almost went bankrupt. Intel won the CPU war from 2007 until 2019, and it only cost them a $1 billion legal settlement, which they are still trying to dispute. AMD was not expected to survive beyond 2017, and the amazing mindshare it had in the 2000s dissipated.

2 - Nvidia had already used all sorts of scum-of-the-earth tactics to cripple the competition's performance. But then they went a step further and tried to kill the free GPU market with the GeForce Partner Program, anticipating what Navi would start. Nvidia is a great hardware company, but they are also a street gang protected by an army of lawyers.




When I say two attempts were made against AMD, in actuality they were made against you, reading this. When you put money on either of these two companies' products, you can be sure you are betting against yourself.


Navi 2 has arrived. Navi 3 is coming. Competition has hit HARD. The first battle will be a pricing battle. The second will be a war of tactics from Intel and Nvidia that will hurt you even more, and will likely amount to a third assassination attempt on AMD. But AMD has the money to do proper R&D now. Things will get hot.
 

Bluntman

Member
So in terms of RT performance...

The thing is, most of the games currently being benchmarked have ray tracing solutions that are one and a half to two years old, built on DXR 1.0.

DXR 1.0 isn't the problem in itself, but it requires heavy optimization of data movement to run well on specific hardware. Obviously the developers couldn't do that for RDNA 2, since there wasn't even a prototype of RDNA 2 two years ago.

You can see the results; a pretty big drop in performance for the 6800 XT:

[Chart: Control ray tracing benchmark]


[Chart: Metro Exodus RT benchmark]



But moving on to Watch Dogs: Legion, where the developers were provided with hardware by AMD beforehand so they could optimize, the difference to the 3080 is pretty minor:

[Chart: Watch Dogs: Legion RT benchmark]


And then moving on to Dirt 5, which is currently the only game that is optimized for RDNA 2 and uses DXR 1.1 instead of 1.0:

[Chart: Dirt 5 RT benchmark]



So we shouldn't judge RDNA 2 RT performance based on older titles with RT solutions from a time when this hardware didn't even exist. Besides, the RDNA 2 architecture itself was always designed with DXR 1.1 in mind.
 

Ascend

Member
"I think the Radeon RX 6800 XT is a fantastic graphics card when looked at with pure rasterization performance, and so too for the Radeon RX 6800. We are here in the tail end of 2020 with AMD's third-best Big Navi keeping up with and beating the GeForce RTX 2080 Ti... NVIDIA's previous-gen flagship.

We have some impressive power numbers, impressive thermal numbers, 16GB of RAM, HDMI 2.1 connectivity, Infinity Cache, kick ass performance at all resolutions -- including 3440 x 1440, where I think the Radeon RX 6800 has an awesome and snuggly home. There are heaps of great 34-inch 21:9 ultrawide gaming monitors at 3440 x 1440 and 120/144/165Hz that would gel beautifully with the Radeon RX 6800."


 

Armorian

Banned
So in terms of RT performance...

The thing is, most of the games currently being benchmarked have ray tracing solutions that are one and a half to two years old, built on DXR 1.0.

DXR 1.0 isn't the problem in itself, but it requires heavy optimization of data movement to run well on specific hardware. Obviously the developers couldn't do that for RDNA 2, since there wasn't even a prototype of RDNA 2 two years ago.

You can see the results; a pretty big drop in performance for the 6800 XT:

[Chart: Control ray tracing benchmark]


[Chart: Metro Exodus RT benchmark]



But moving on to Watch Dogs: Legion, where the developers were provided with hardware by AMD beforehand so they could optimize, the difference to the 3080 is pretty minor:

[Chart: Watch Dogs: Legion RT benchmark]


And then moving on to Dirt 5, which is currently the only game that is optimized for RDNA 2 and uses DXR 1.1 instead of 1.0:

[Chart: Dirt 5 RT benchmark]



So we shouldn't judge RDNA 2 RT performance based on older titles with RT solutions from a time when this hardware didn't even exist. Besides, the RDNA 2 architecture itself was always designed with DXR 1.1 in mind.

WD RT is broken on AMD


Oh, and Minecraft is using DXR 1.1.

 

Dampf

Member
Bluntman, Dirt 5 in general performs much better on AMD cards, for some... reasons. So it's not really an indication of RT performance, especially considering it only renders RT shadows.
 