
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

Sun Blaze

Banned
For people FOMO'ing over RTX: don't (RTX on vs. off; notice the blurring of reflections - DLSS can't upscale them, so you get the low-res version, as if you were playing at that resolution):

[Comparison screenshots: RTX on vs. RTX off]

If you remember this was pointed out with WD:L too:

[Image: Watch Dogs: Legion comparison]
You were so busy fanboying you didn't realize how little sense your post makes. The shadows in DLSS ON vs. OFF in the third picture are completely different. One has sharp shadows and the other diffused shadows. Might be a case of RT shadows vs. regular ones, mate. Not DLSS.
 

Kenpachii

Member
This is war, never surrender.
Rikkori

Just checked out the DLSS. As others pointed out, you're wrong. I don't see any difference in the reflections.

First you claim some imaginary, magical 20% performance boost for the 6800. Now you post totally wrong comparison pictures trying to trash DLSS.

You don't seem like a happy new GPU owner that is confident in his purchase.

I saw some guy having the same issue where the reflection quality was way lower. Maybe it's a bug, though, the game is riddled with them. Would be interesting to see people with an RTX card make a comparison.
 

Rikkori

Member
You were so busy fanboying you didn't realize how little sense your post makes. The shadows in DLSS ON vs. OFF in the third picture are completely different. One has sharp shadows and the other diffused shadows. Might be a case of RT shadows vs. regular ones, mate. Not DLSS.
RTX = DLSS + RT. The comparisons are with both effects on vs. both off.
The issue is that DLSS can't upscale RT reflections, so you get the effect at the resolution it's upscaling from, hence the blurriness.
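To put rough numbers on "the resolution it's upscaling from", here's a small Python sketch using the commonly cited DLSS 2 scale factors (the factors are approximations from public reporting, not anything official from NVIDIA):

# Approximate DLSS 2 internal render scales (assumed, commonly cited values).
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_resolution(out_w, out_h, mode):
    """Rough internal render resolution before DLSS reconstructs to the output size."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 4K output, Quality mode: screen-space and RT effects are computed at roughly
# 2560x1440 before reconstruction, which is where the softness comes from.
print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)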
 


I saw some guy having the same issue where the reflection quality was way lower. Maybe it's a bug, though, the game is riddled with them. Would be interesting to see people with an RTX card make a comparison.

"way lower"

I already posted an example on the previous page


Here's another one


There is the slightest softness applied to the image, and it's only noticeable in brightly lit areas. At night or in darker-lit areas it's not noticeable. The slightly softer image also renders additional details, like fences or electric lines that appear broken in the distance with native but are complete and fully rendered with DLSS.

As I've said, the pictures posted on the previous page are not about what he's talking about in his fanboy drivel. They're pics with RT on and off. What he thought was a blurry reflection is just how the pavement works: it'll reflect a distorted view back, because it's not a completely flat surface, nor is it a mirror-like reflective material. It's not glass.
 

Sun Blaze

Banned
RTX = DLSS + RT. The comparisons are with both effects on vs. both off.
The issue is that DLSS can't upscale RT reflections, so you get the effect at the resolution it's upscaling from, hence the blurriness.
You're comparing non-RT vs. RT; what are we supposed to see besides the fact that RT has more accurate lighting, shadows and reflections?

You want to compare RT + DLSS vs. native + RT.
 

RedVIper

Banned
First it was better than native resolution.

Now we finally accept that it introduces blurriness.

Hopefully next you guys will be able to accept how it loses quality on a lot of distant objects, but that might be pushing it I guess.

It's better at some stuff and worse at others.

Not to mention people are making retarded comparisons.

The comparison isn't between 4K and 4K DLSS, because those two settings run at completely different framerates.

If you want to compare DLSS with native, you do it with a resolution that gives you similar performance.
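To make that concrete, a tiny sketch of the kind of iso-performance matching being described (the framerates below are made-up placeholders, purely for illustration, not benchmark data):

# Hypothetical fps numbers, only to illustrate "compare DLSS against the native
# resolution that performs similarly".
native_fps = {"1440p": 62, "1800p": 48, "2160p": 35}
dlss_fps = {("2160p", "Quality"): 58}

def fairest_native_match(setting):
    target = dlss_fps[setting]
    # pick the native resolution whose framerate is closest to the DLSS mode's
    return min(native_fps, key=lambda res: abs(native_fps[res] - target))

print(fairest_native_match(("2160p", "Quality")))  # -> "1440p" with these placeholder numbers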
 
First it was better than native resolution.

Now we finally accept that it introduces blurriness.

Hopefully next you guys will be able to accept how it loses quality on a lot of distant objects, but that might be pushing it I guess.


It depends on the implementation; I don't remember people saying it's 100% better than native in every game ever. It's about modern TAA implementations in games vs. a good implementation of DLSS. In Cyberpunk I know for a fact that it's actually beneficial for long distances, because native loses elements of the scene while with DLSS they're rendered. In Death Stranding it's a fact that it's better than native. In Pumpkin Jack it's also a fact. That may be the best DLSS in any game, actually.

The fact that Nvidia has this tech really keeps some people up at night, huh?
 
First it was better than native resolution.

Now we finally accept that it introduces blurriness.

Hopefully next you guys will be able to accept how it loses quality on a lot of distant objects, but that might be pushing it I guess.
You seem to be rather triggered any time it comes to DLSS or ray tracing... Will you say the same when AMD has their own take on DLSS? I can't imagine you would, as you seem to be an alt account of 2 possible hardcore AMD fanboys on this site...

With that being said, would you take negligible visual differences for a much higher framerate? Or indistinguishable visuals, and a much lower framerate? Honest question to gauge whether you are a complete troll, or just an AMD fanboy, or even both.
 

regawdless

Banned
That Cyberpunk DLSS discussion belongs in this thread I think:


Posted comparison screens, you can check them out.
 

Bolivar687

Banned
You seem to be rather triggered any time it comes to DLSS or ray tracing... Will you say the same when AMD has their own take on DLSS? I can't imagine you would, as you seem to be an alt account of 2 possible hardcore AMD fanboys on this site...

With that being said, would you take negligible visual differences for a much higher framerate? Or indistinguishable visuals, and a much lower framerate? Honest question to gauge whether you are a complete troll, or just an AMD fanboy, or even both.
I'm not one of the members policing AMD threads for disallowed opinions and accusing anyone who sees things differently of being an alt. You were likewise asking "what if it was AMD?" in the Hardware Unboxed thread, which is embarrassing now that even Nvidia has formally acknowledged they crossed the line. This obsession with assuming hypocrisy is unhealthy, because you act as if everyone who disagrees with you must be dishonest, even in that thread in particular, where criticism of Nvidia was both warranted and obvious.

I posted that above because several members routinely say that DLSS is generally better than native and, as Soulblighter31 acknowledged above, it's not. When AMD's implementation comes out, I'll likewise acknowledge its benefits and shortcomings but I'm certainly not going to brigade graphics card threads inflating it to be more than it is.

Regarding your last question, it depends on the strength of each implementation. I've always been a high-refresh-monitor guy, so of course I always lower indistinguishable settings if it means a superior framerate, but I'm also predisposed to skepticism about changing native resolution, especially given what I've seen and heard about performance mode.
 

Kenpachii

Member
"way lower"

I already posted an example on the previous page


Here's another one


There is the slightest softness applied to the image, and it's only noticeable in brightly lit areas. At night or in darker-lit areas it's not noticeable. The slightly softer image also renders additional details, like fences or electric lines that appear broken in the distance with native but are complete and fully rendered with DLSS.

As I've said, the pictures posted on the previous page are not about what he's talking about in his fanboy drivel. They're pics with RT on and off. What he thought was a blurry reflection is just how the pavement works: it'll reflect a distorted view back, because it's not a completely flat surface, nor is it a mirror-like reflective material. It's not glass.

It wasn't a slight softness, it was straight up a completely nuked resolution. He was looking at the mirror in his apartment in some random YouTube video I saw. But I guess they fixed it now then.
 

llien

Member
Cyberpunk 2077 in 4K with the "Ultra" graphics preset (including ray tracing) via DLSS Quality Mode on
You are playing Cyberpunk 2077 at 1440p, with a TAA derivative used for anti-aliasing that has been called out, many times, for adding the typical shit TAA derivatives are well known for: blurry images and loss of detail, particularly bad with small, quickly moving objects.
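For context on why TAA-style techniques smear small, fast-moving detail, here's a heavily simplified sketch of the core temporal blend (illustrative only, not any game's actual shader; real implementations add motion-vector reprojection, history clamping and sharpening on top):

# Each output pixel is a blend of reprojected history and the current frame.
# A small alpha gives strong anti-aliasing but smears detail whenever the
# history can't be reprojected correctly, e.g. thin objects moving quickly.
def taa_resolve(history_pixel, current_pixel, alpha=0.1):
    return (1 - alpha) * history_pixel + alpha * current_pixel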
 

llien

Member
The shadows in DLSS ON vs. OFF in the third picture are completely different
This reminds me of the bonkers "there is a door and this is daylight" theory, even though the very corridor in question was shown right in the next post and, spoiler alert, it showed no signs of "daylight" somehow turning it into a dull version with RT on.
 

rofif

Can’t Git Gud
You are playing Cyberpunk 2077 at 1440p, with a TAA derivative used for anti-aliasing that has been called out, many times, for adding the typical shit TAA derivatives are well known for: blurry images and loss of detail, particularly bad with small, quickly moving objects.
You, sir, are full of shit.
DLSS is better than TAA and does not share its flaws. It's not blurry, it doesn't ghost and it's stable in motion. It's the best AA solution on the market right now and it's better than free... it adds performance because it renders at a lower resolution internally.
Even in games like Death Stranding where performance is not an issue, I choose to enable Quality DLSS at 4K since it just looks better in motion and does not break up like TAA on foliage or distant lines. The resulting image is free of jaggies, in motion or not, since it's based on 16K ground truth.

Stop spreading your bullshit. Why do you hate Nvidia so much that you bend facts?
Thanks to DLSS, I was able to play Cyberpunk at maxed Psycho RT settings, 4K output (Balanced DLSS internal setting), locked to 30 fps with no drops ever below that. If I wanted to do 60, DLSS Performance was on point, but the shitty game engine drops to the 30s in the city no matter what settings.
 

Buggy Loop

Member
You, sir, are full of shit.
DLSS is better than TAA and does not share its flaws. It's not blurry, it doesn't ghost and it's stable in motion. It's the best AA solution on the market right now and it's better than free... it adds performance because it renders at a lower resolution internally.
Even in games like Death Stranding where performance is not an issue, I choose to enable Quality DLSS at 4K since it just looks better in motion and does not break up like TAA on foliage or distant lines. The resulting image is free of jaggies, in motion or not, since it's based on 16K ground truth.

Stop spreading your bullshit. Why do you hate Nvidia so much that you bend facts?
Thanks to DLSS, I was able to play Cyberpunk at maxed Psycho RT settings, 4K output (Balanced DLSS internal setting), locked to 30 fps with no drops ever below that. If I wanted to do 60, DLSS Performance was on point, but the shitty game engine drops to the 30s in the city no matter what settings.

llien in a nutshell

Ray tracing :
[blind Cheech Marin GIF]


DLSS :
[closer Spy Kids GIF]
 

regawdless

Banned
, loss of detail,

DLSS actually adds detail because it reconstructs the image. If you have fine lines in the distance, for example, native 1440p fails to reproduce them, leaving gaps in the lines, while DLSS does a visibly better job at it.
In Cyberpunk, DLSS actually makes the image a bit blurrier because the game removes some sharpening effect. I use ReShade either way, and with some light sharpening added, the DLSS IQ is pretty great.

The small negatives that come with DLSS - like some noise on certain edges - are insignificant if you can get the whole raytracing package at very good framerates in exchange.
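For reference, the "light sharpen" added on top of DLSS output is conceptually just an unsharp mask; a minimal NumPy sketch of the idea (not ReShade's actual shader):

import numpy as np
from scipy.ndimage import gaussian_filter  # assumes SciPy is available

def light_sharpen(luma, amount=0.3, radius=1.0):
    """Unsharp mask on a 2D luminance array in [0, 1]: add back a fraction of
    the high-frequency detail removed by a Gaussian blur."""
    blurred = gaussian_filter(luma, sigma=radius)
    return np.clip(luma + amount * (luma - blurred), 0.0, 1.0)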
 
DLSS actually adds detail because it reconstructs the image. If you have fine lines in the distance, for example, native 1440p fails to reproduce them, leaving gaps in the lines, while DLSS does a visibly better job at it.
In Cyberpunk, DLSS actually makes the image a bit blurrier because the game removes some sharpening effect. I use ReShade either way, and with some light sharpening added, the DLSS IQ is pretty great.

The small negatives that come with DLSS - like some noise on certain edges - are insignificant if you can get the whole raytracing package at very good framerates in exchange.
Also llien




 
Seriously. I have a feeling there is like a single person on GAF, with 4 alt accounts, that white knights all day for AMD and shits on Nvidia all night. There's no way someone can be this delusional, yet turn a blind eye to AMD, EVERY. SINGLE. TIME.
It's a carryover from console fanboys as well.
Since their console of choice uses AMD, they also war over AMD vs. Nvidia.
 
D

Deleted member 17706

Unconfirmed Member
Now look at how all tech tubers will begin to embrace SAM, when none of them wanted to include it in their AMD GPU tests, but they use DLSS vs. native to show how much better raytracing is on Nvidia with DLSS... So much hypocrisy...

I imagine folks will start including it, since it's free extra performance. The gains don't look very significant at the moment, though, outside of a few exceptions (Valhalla and Gears 5).
 

Ascend

Member
So I assume it's only for 10k series CPUs? Useless then.
I'm not sure.

It's a carryover from console fanboys as well.
Since their console of choice uses AMD, they also war over AMD vs. Nvidia.
Call me crazy, but this is a 6800 series thread. Not a DLSS or RT or Nvidia thread. It isn't the AMD fanboys who are in a thread they don't belong in and warring...

Another on-topic post;
 
You are playing Cyberpunk 2077 at 1440p, with a TAA derivative used for anti-aliasing that has been called out, many times, for adding the typical shit TAA derivatives are well known for: blurry images and loss of detail, particularly bad with small, quickly moving objects.
It takes a special kind of institutional blindness to actually pretend DLSS is just another form of TAA. There are plenty of white papers and other material out there, if you actually wanted to read something once in your life, explaining what "Deep Learning" is and how the machine-learning algorithms Nvidia runs on game images, on the actual top-10 supercomputer at their HQ in Santa Clara, teach the video cards how to upsample games.

Oh, I doubt you care, but the real purpose of the Nvidia supercomputer is to perform the machine learning for the Nvidia AI self-driving car project; they do the games-related stuff for DLSS when there is availability and downtime from the Nvidia DRIVE work.
 

Bolivar687

Banned
It takes a special kind of institutional blindness to actually pretend DLSS is just another form of TAA. There are plenty of white papers and other material out there, if you actually wanted to read something once in your life, explaining what "Deep Learning" is and how the machine-learning algorithms Nvidia runs on game images, on the actual top-10 supercomputer at their HQ in Santa Clara, teach the video cards how to upsample games.

Oh, I doubt you care, but the real purpose of the Nvidia supercomputer is to perform the machine learning for the Nvidia AI self-driving car project; they do the games-related stuff for DLSS when there is availability and downtime from the Nvidia DRIVE work.
[GIF]
 

Ascend

Member
What kind of raytracing does the game utilize? Only shadows? I think I saw RT reflections in some promo footage. But regarding the small fps hit, definitely seems weird. Would love to know more about their RT implementation.
I think it's shadows only, while using SSR.


In other news;
 

llien

Member


DLSS is better than TAA
Yeah, who would have thought.
Or who has stated otherwise.

Oh, wait, nobody.

Nice strawman though.


There are plenty of white papers
Whitepapers. And not just whitepapers but "plenty of".
And that is what team "hurt green butt" read.
Sure, John.


The whole DLSS "better than native" debacle is an embarrassment.
A human with a half-functioning brain should be able to see both blur and loss of detail in that glorified TAA derivative. (Yes, it's the best TAA derivative that we have, silly boys, it's just not what you claim it to be.)

People need to remember that we have TRIED IT OUT ("can you tell the difference between DLSS upscaled 1440p => 4K and 4K").
Yay, how easy it was.
Guess which of the two pics added blur and wiped out details.
 

llien

Member
supercomputer is to perform the machine learning for the Nvidia AI self-driving
Oh, you clown, that DLSS 1.0 (which failed miserably) song again.
DLSS 2 does not do per-game training; in fact, nobody knows exactly which images it is trained on to do the temporal anti-aliasing.
 

rofif

Can’t Git Gud



Yeah, who would have thought.
Or who has stated otherwise.

Oh, wait, nobody.

Nice strawman though.



Whitepapers. And not just whitepapers but "plenty of".
And that is what team "hurt green butt" read.
Sure, John.


The whole DLSS "better than native" debacle is an embarrassment.
A human with a half-functioning brain should be able to see both blur and loss of detail in that glorified TAA derivative. (Yes, it's the best TAA derivative that we have, silly boys, it's just not what you claim it to be.)

People need to remember that we have TRIED IT OUT ("can you tell the difference between DLSS upscaled 1440p => 4K and 4K").
Yay, how easy it was.
Guess which of the two pics added blur and wiped out details.

Yes, you can tell the difference. 1440p -> 4K DLSS looks better than native 4K.
About the review you posted... again, Steve fails to capitalize on DLSS, RT and other features. The 6800 is faster, but it's also more expensive by quite a bit.
 

Ascend

Member
You posted... again, Steve fails to capitalize on DLSS, RT and other features. The 6800 is faster, but it's also more expensive by quite a bit.
Twice the VRAM makes the higher price justifiable, considering two games already reach the VRAM limit of the 3070.

It's a good thing if Hardware Unboxed does not change their editorial direction. Who else out there tests graphics card performance with over 40 games? That's quite valuable.

Whining about HUB not focusing on RT is pretty much an embarrassment at this point.
 

Sun Blaze

Banned
The 6800's price increase over the 3070 is actually justified: 10-15% faster, 16% more expensive. No DLSS or good RT, but twice the VRAM.

The 6800 is probably the best card in the AMD lineup. The only thing the 6800 XT has over the 3080 is VRAM.
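The 16% figure follows directly from the launch MSRPs ($579 for the RX 6800 vs. $499 for the RTX 3070); a quick sanity check in Python:

# Launch MSRPs: RX 6800 = $579, RTX 3070 = $499.
price_premium = 579 / 499 - 1                 # ~0.16 -> about 16% more expensive
perf_advantage = (0.10, 0.15)                 # the 10-15% claim from the post above
relative_value = [(1 + p) / (1 + price_premium) for p in perf_advantage]
print(round(price_premium, 2), [round(v, 2) for v in relative_value])
# -> 0.16 [0.95, 0.99]: at MSRP the 6800 is slightly behind on perf-per-dollar,
#    before counting the extra 8 GB of VRAM.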
 

Sun Blaze

Banned
Yes, you can tell the difference. 1440p -> 4K DLSS looks better than native 4K.
About the review you posted... again, Steve fails to capitalize on DLSS, RT and other features. The 6800 is faster, but it's also more expensive by quite a bit.
Steve says at the end of the video there is an argument for the 3070 if you value RT and DLSS. Thing is, you're sacrificing a chunk of performance and half the VRAM if you go from a 6800 to a 3070, and that 8GB is starting to show its age.
 
The 6800's price increase over the 3070 is actually justified: 10-15% faster, 16% more expensive. No DLSS or good RT, but twice the VRAM.

The 6800 is probably the best card in the AMD lineup. The only thing the 6800 XT has over the 3080 is VRAM.




That would be the case, but as we know from the AIBs, the price marketed by AMD isn't actually attainable. As I've said before, AMD must've expected Turing prices from Nvidia and they were massively sucker-punched by Nvidia on this front. The actual prices are $700 for the 6800 and $800 to $900 for the 6800 XT. That was right at launch, not scalping. At those price points, the 3070 is more appealing.
 

Bolivar687

Banned


That would be the case, but as we know from the AIBs, the price marketed by AMD isn't actually attainable. As I've said before, AMD must've expected Turing prices from Nvidia and they were massively sucker-punched by Nvidia on this front. The actual prices are $700 for the 6800 and $800 to $900 for the 6800 XT. That was right at launch, not scalping. At those price points, the 3070 is more appealing.
The reference models at launch were generally being sold for MSRP. Also, $500 for the x70 and $700 for the x80 are indeed Turing prices, so I don't see the sucker-punch argument at all. Just looking on Newegg, a lot of Nvidia AIBs are listed at the same premium we see on the red side. That said, there do seem to be more Nvidia cards closer in line with their MSRP, at least from a quick glance, although many of them are the lower-tier models.

Until availability improves, the price comparisons are obviously premature, but if the current trend indeed holds, then I would agree the lower quality binned Nvidia models would be the best value.
 

llien

Member
1440p -> 4K DLSS looks better than native 4K.
Bullshit.
One of your brethren challenged me to figure out which of the pics was upscaled from 1440p to 4K and then had NV's glorified TAA applied to it.
I had no problem figuring it out.
The DLSS one added blur and lost fine detail in the textures.

Normally it would also improve lines, but in that particular pic there wasn't much to improve anyhow.

It is crazy talk.
Just applying AA doesn't magically make it a higher resolution.

Had Nvidia honestly (let me chuckle at Nvidia and honestly in the same sentence) said, "hey, look, we have rolled out a great TAA derivative, it eats quite a bit of performance, but it is great at AA", "oh, and there are even a handful of games that support it", "oh, and it's incompatible with VRS", there would be nothing to object to.

But no, it has to be lied about on "3080 is two times faster than 2080" levels and, wait for it, there is a DF "totally not shills" video to kinda support both claims.
Pathetic.
 