
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

I'm not sure how you arrived at that?
What exactly is a "raytracing implementation"?

What exactly has you confused? The way Nvidia designed their hardware for ray tracing is better than what AMD did. That's why a 3080 destroys a 6800XT in ray tracing, and even a 2080Ti does. Nvidia accelerates the ray tracing features on its dedicated cores, separate units on the chip. AMD also has dedicated hardware, but it's integrated into the compute units. It's less effective and performant. That's why AMD's ray tracing prowess at the end of 2020 is slower than Nvidia's from 2018.
 

psorcerer

Banned
The way Nvidia designed their hardware for ray tracing is better than what AMD did.

How do you know that?

That's why a 3080 destroys a 6800XT in ray tracing, and even a 2080Ti does.

There is no "in ray tracing"; there are a handful of games that use a vendor implementation. I don't see how it's even comparable.

Nvidia accelerates the ray tracing features on its dedicated cores, separate on the board.

No, we don't know that.
What we do know is that the RT cores on Turing could not run traversal workloads while shaders were running on the unit, which means they are actually colocated with the CUs, at least logically.
That changed for Ampere, but nothing changed in NV's diagrams. Hence you cannot state that it's not colocated even now.
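To illustrate why that concurrency detail matters, a toy timing model with made-up numbers (nothing measured, nothing vendor-specific):

```cpp
#include <algorithm>
#include <cstdio>

// Toy timing model of the scheduling claim above, with made-up numbers:
// if BVH traversal cannot overlap shading on the same unit, a frame pays
// for both in sequence; if it can overlap, it pays for the longer of the two.
int main() {
    const double traversal_ms = 2.0; // hypothetical per-frame traversal cost
    const double shading_ms   = 5.0; // hypothetical per-frame shading cost

    double serialized = traversal_ms + shading_ms;          // no concurrency
    double overlapped = std::max(traversal_ms, shading_ms); // full concurrency

    std::printf("serialized: %.1f ms, overlapped: %.1f ms\n",
                serialized, overlapped);
}
```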

AMD also has dedicated hardware, but it's integrated into the compute units.

AMD has RA (Ray Accelerator) units, and they work within the TMU array, which is exactly where NV's RT cores sit too.

That's why AMD's ray tracing prowess at the end of 2020 is slower than Nvidia's from 2018.

You're making a lot of far-fetched conclusions based on pretty limited information.
 

VFXVeteran

Banned
Drivers are irrelevant here.
Current implementations AFAIK use high-level primitives from NV frameworks.

And how does that affect performance specifically on AMD? I'm curious how deep you think it goes. The voxel primitives are freaking axis-aligned bounding boxes, FFS. And aligned to the camera, no less.
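For reference, here's roughly how little code a ray/AABB "slab" test is; a minimal C++ sketch of the textbook algorithm, not either vendor's actual implementation:

```cpp
#include <algorithm>
#include <cstdio>

struct Vec3 { double x, y, z; };

// Textbook "slab" test: a ray hits an axis-aligned bounding box iff the
// parametric intervals where it lies between each pair of parallel planes
// overlap. This is the cheap per-node test BVH traversal hardware runs.
bool rayHitsAABB(Vec3 o, Vec3 invDir, Vec3 lo, Vec3 hi) {
    double t1 = (lo.x - o.x) * invDir.x, t2 = (hi.x - o.x) * invDir.x;
    double tmin = std::min(t1, t2), tmax = std::max(t1, t2);
    t1 = (lo.y - o.y) * invDir.y; t2 = (hi.y - o.y) * invDir.y;
    tmin = std::max(tmin, std::min(t1, t2));
    tmax = std::min(tmax, std::max(t1, t2));
    t1 = (lo.z - o.z) * invDir.z; t2 = (hi.z - o.z) * invDir.z;
    tmin = std::max(tmin, std::min(t1, t2));
    tmax = std::min(tmax, std::max(t1, t2));
    return tmax >= std::max(tmin, 0.0); // intervals overlap in front of the origin
}

int main() {
    Vec3 origin{0, 0, -5};
    Vec3 dir{0.1, 0.1, 1.0}; // axis-parallel rays need the usual inf handling
    Vec3 invDir{1 / dir.x, 1 / dir.y, 1 / dir.z};
    bool hit = rayHitsAABB(origin, invDir, {-1, -1, -1}, {1, 1, 1});
    std::printf("%s\n", hit ? "hit" : "miss");
}
```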
 

VFXVeteran

Banned
How do you know that?



There is no "in ray tracing"; there are a handful of games that use a vendor implementation. I don't see how it's even comparable.



No, we don't know that.
What we do know is that the RT cores on Turing could not run traversal workloads while shaders were running on the unit, which means they are actually colocated with the CUs, at least logically.
That changed for Ampere, but nothing changed in NV's diagrams. Hence you cannot state that it's not colocated even now.



AMD has RA (Ray Accelerator) units, and they work within the TMU array, which is exactly where NV's RT cores sit too.



You're making a lot of far-fetched conclusions based on pretty limited information.

All I'm hearing from you is defensiveness, with nothing to back any of it up, because you don't know what's going on. Take the loss on the chin, dude, and move on. You've got RDNA3 to speculate on next.
 
Last edited:

llien

Member
DLSS will be used by more consumers than SAM. That's even including people who currently use an AMD CPU like myself.
SAM is not proprietary and works in all games; DLSS upscaling works in a handful.

Your assessment is weird. It actually doesn't matter what the numbers are; you can't compare performance at 1440p on one card to 4K on another by just lying that "it looks the same".
 

spyshagg

Should not be allowed to breed
I'll finish my side of the debate by saying this: AMD got too close for the comfort of some, and did it without using the latest memory and with 100 mm² less silicon.

The more you see people acting like fanboys, the more you know the impact this product had. And the war doesn't end in 2020. In 2020 it won the performance-per-mm² crown from Nvidia; 2021 brings Navi3.

Don't keep looking like the Intel fanboys from two years ago. Start thinking clearly about what's relevant in 2021 and keep an eye on Navi3.
 
It's not the same. Those were proprietary tech. The shit AMD does now, with SAM, with ray tracing, with gimping performance in AMD-sponsored games - these are standard features that work on everything. They're just holding them hostage from Nvidia and Intel. They've become complete scum in a very short amount of time. Even their biggest cheerleaders shouldn't praise this. Locking features from competing parts even if they can run them? That's too scummy. They even locked SAM from their own platform to make you buy their latest parts, in a combo no less - mobo, CPU and GPU.

I know I shouldn't bother engaging with you but here goes anyway!

I do agree that not having RT in Godfall does seem strange. If it is something to do with AMD's partnership contract somehow disabling the DXR code from running on Nvidia cards, then that would be really shitty. However, the evidence we have against that being the norm with AMD is that DIRT 5 works perfectly fine on both AMD and Nvidia.

So what could be the reason for Godfall? Well I see two possibilities:

  1. AMD has some kind of proprietary RT extensions that don't work on Nvidia GPUs.

  2. The Godfall developers have only optimized their RT performance for AMD so far (given it is a partner title); they could release the same RT implementation for Nvidia cards, but it would run pretty badly as it is optimized for AMD RT, so they want to wait until they have had time to optimize their RT for Nvidia.
I see number 1 as pretty unlikely, as it goes against AMD's usual practices and commitment to open source; plus we can already see Dirt 5's RT running on Nvidia cards.

I see number 2 as the most likely scenario but we will have to wait and see what happens.

On your second point: AMD does not gimp RT or normal raster performance on Nvidia cards. The games are simply optimized for AMD rather than Nvidia; different architectures have different strengths and weaknesses, and it is all about the optimization. If we took what you said at face value, that would be like claiming almost all Nvidia-sponsored titles (especially RTX ones) are cases in which Nvidia paid the developers to purposely gimp performance on AMD cards, when in fact they are simply optimized for Nvidia rather than AMD.

Of course, we do have concrete examples of Nvidia actually paying to gimp performance on AMD cards in the past, with the Hairworks and tessellation stuff, so we do have a precedent for it in the industry, but so far only from one manufacturer. It is interesting, though, that in Dirt 5 the RT performance is much better on AMD than on Nvidia, simply due to optimizing for AMD's RT hardware. Goes to show how much optimization matters to final performance, especially where ray tracing is concerned.

I do find it funny, though, that you seem so annoyed by the locking of features from competitors, calling it a scummy move, when it has been Nvidia's MO for almost as long as they have existed as a company. I find the irony especially delicious seeing as you are one of the forum's biggest Nvidia cheerleaders, quoting marketing numbers directly from Nvidia slides as fact. 🤷‍♂️
 

ZywyPL

Banned
Overall I think this is a pretty big win for AMD/Radeon compared to their past showings, something a lot of people in this thread seem to be missing.

Reference Cooler:
AMD was notorious for releasing terrible "blower" style reference coolers that ran loud with high temps, making only AIB models viable buys.

This time around their cooler seems to be pretty damn good, essentially matching the expensive custom 3080 reference cooler in heat/noise. That is a huge improvement from where they were in the past. Granted, not everyone cares about reference models, and AIBs will take the lion's share of sales, but this is still a nice win for AMD, showing that they are serious about competing and are willing to up their game and improve on their obvious weak spots.

Power Draw:
For the last few generations of GPUs, the Radeon family was consistently lambasted for higher power draw and hot-running, inefficient designs. This time around, AMD have really focused on improving here and are actually running mostly below their specified 300W. They draw less power across the board compared to Nvidia, which is a huge turnaround from the previous generation's landscape. Another great win for AMD here.

Efficiency:
This goes hand in hand with power draw, but AMD have massively increased their efficiency from RDNA1 to RDNA2 while staying on the same 7nm node, and you have to remember that RDNA1 was already a massive efficiency improvement over Vega. This is pretty impressive stuff and allows these GPUs to run at very high clocks while reducing power draw; they seem to have a really great engineering team behind these RDNA2 cards.

Rasterization Performance:
For the last 5-7 years of GPU releases, AMD was almost always behind in performance at the top end of the stack, sometimes by a lot, sometimes by a little, but by a noticeable amount.

Prior to the RDNA2 reveal, we had many rumours and tons of "concerned" Nvidia fans stating, almost as outright fact, that the very top-end RDNA2 GPU would be at 2080Ti levels of performance and might compete with a 3070. We were told that in an absolute best-case scenario they might scrape out 2080Ti + 10-15% performance.

AMD could never compete with the powerhouse performance of Nvidia's latest GPUs, you see; there were "power king" meme threads, the way it's meant to be played, after all. As later rumours started becoming clearer and it seemed likely that AMD might compete with the 3080, we were told "Yeah, but they will never touch the 3090! lolol". Along comes the 6900XT, competing for $500 less. And here we are: Radeon group has seemingly done the impossible and is competing at the high end across the stack, even against the hail-mary release of the 3090. This is a pretty incredible improvement for Radeon group and cannot be overstated.

Release Cadence:
Previous Radeon GPUs tended to arrive much later than their Nvidia counterparts, often more than a year behind, and often with worse performance, or performance close to a GPU that was about to be replaced by a newer Nvidia model. Here we have essentially release parity, with only a month or two between releases. AMD have finally caught up to Nvidia here, which again is a huge improvement.

Current rumours have AMD set to potentially release RDNA3 this time next year, which would be a huge upset to the normal two-year development/release cycle we see for GPUs. We will have to wait and see how they progress, but they already went from RDNA1 to RDNA2 in a little over a year.
------

The above are some pretty fantastic improvements. In addition, AMD cards were previously missing RT hardware/functionality altogether, and they now have RT performing at around 2080Ti levels or higher depending on the title. Granted, it's not as performant as Nvidia's second generation of ray tracing with Ampere, but so far almost all of the RT-enabled games were optimized/tested against Nvidia cards, most of them sponsored by Nvidia to the point that Nvidia themselves coded much of the RTX functionality for some titles.

When you look at Dirt 5, the first RT-enabled game optimized for AMD's solution, it seems to perform pretty well on AMD cards. It will be interesting to revisit the RT landscape a year from now and see how these cards perform in (non-Nvidia-sponsored) console ports and AMD-sponsored titles.

Nvidia will still have a clear win in RT overall for this generation; if you need the best RT performance, you should definitely go with Nvidia. However, I find it interesting that 2080Ti levels of RT performance (higher in some games) are suddenly considered "unplayable trash", but that is a discussion for another time.

Features:
Much like Nvidia were first to market with MS DirectStorage and rebranded it "RTX IO", and first to market with ray tracing and rebranded DXR as "RTX", AMD here are first to market with resizable BAR support in Windows, part of the PCIe spec. Like Nvidia, they have rebranded the feature, as SAM.

SAM seems like a cool feature that in a lot of cases can offer 3-5% performance increases essentially for free. Some games don't benefit from it at all, and some outliers, such as Forza, gain up to 11%. Overall it seems like a cool piece of tech that AMD are first to bring to the table.

Nvidia are working on adding their own resizable BAR support, but we have no idea how long it will take them to release it, or which mobo/CPU combos will be certified to support it.
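For the technically curious, resizable BAR is something an application can actually observe. A minimal Vulkan sketch (assuming an installed Vulkan loader and driver) that prints each GPU's CPU-visible VRAM heap; without SAM/ReBAR that host-visible, device-local heap is typically capped at 256 MB, while with it enabled it can span nearly all of VRAM:

```cpp
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_0;
    VkInstanceCreateInfo info{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    info.pApplicationInfo = &app;
    VkInstance instance;
    if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t n = 0;
    vkEnumeratePhysicalDevices(instance, &n, nullptr);
    std::vector<VkPhysicalDevice> gpus(n);
    vkEnumeratePhysicalDevices(instance, &n, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);
        VkPhysicalDeviceMemoryProperties mem;
        vkGetPhysicalDeviceMemoryProperties(gpu, &mem);
        // A memory type that is both DEVICE_LOCAL and HOST_VISIBLE lives in
        // the CPU-visible window into VRAM (the BAR). Note the same heap may
        // be reported via several memory types.
        for (uint32_t i = 0; i < mem.memoryTypeCount; ++i) {
            VkMemoryPropertyFlags f = mem.memoryTypes[i].propertyFlags;
            if ((f & VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT) &&
                (f & VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT)) {
                VkDeviceSize bytes =
                    mem.memoryHeaps[mem.memoryTypes[i].heapIndex].size;
                std::printf("%s: CPU-visible VRAM heap %llu MB\n",
                            props.deviceName,
                            (unsigned long long)(bytes >> 20));
            }
        }
    }
    vkDestroyInstance(instance, nullptr);
}
```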

Regarding DLSS, this is definitely somewhere AMD are behind. AMD are currently working on their FidelityFX Super Resolution technology to compete with DLSS. We don't know whether this will be an algorithmic approach, whether it will leverage ML in some way, or possibly a combination. AMD have mentioned that their solution will work differently from DLSS, so we can only guess until we have more information.

What we do know is that it will be a cross-platform, cross-vendor feature that should also work on Nvidia GPUs. If they go with an algorithmic approach, then it may end up working in every game, or a lot of them, as an on/off toggle in the drivers. Of course, this is all speculation until we have more concrete info to go on. They are currently developing it, and it should hopefully release by the end of Q1 2021.
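To make the "algorithmic vs ML" distinction concrete, here is a toy, purely algorithmic spatial upscaler (plain bilinear) in C++. Shipping solutions are far more sophisticated, and this is in no way AMD's actual approach, but the contract is the same: render fewer pixels, reconstruct more.

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Toy, purely algorithmic spatial upscaler (plain bilinear) for a grayscale
// image. Real solutions add edge-aware filtering, temporal accumulation, or
// learned models on top of this basic idea.
std::vector<float> upscaleBilinear(const std::vector<float>& src,
                                   int srcW, int srcH, int dstW, int dstH) {
    std::vector<float> dst((size_t)dstW * dstH);
    for (int y = 0; y < dstH; ++y) {
        for (int x = 0; x < dstW; ++x) {
            // Map the destination pixel center back into source coordinates.
            float sx = (x + 0.5f) * srcW / dstW - 0.5f;
            float sy = (y + 0.5f) * srcH / dstH - 0.5f;
            int x0 = std::max(0, std::min(srcW - 1, (int)sx));
            int y0 = std::max(0, std::min(srcH - 1, (int)sy));
            int x1 = std::min(x0 + 1, srcW - 1);
            int y1 = std::min(y0 + 1, srcH - 1);
            float fx = std::max(0.0f, std::min(1.0f, sx - x0));
            float fy = std::max(0.0f, std::min(1.0f, sy - y0));
            float top = src[y0 * srcW + x0] * (1 - fx) + src[y0 * srcW + x1] * fx;
            float bot = src[y1 * srcW + x0] * (1 - fx) + src[y1 * srcW + x1] * fx;
            dst[y * dstW + x] = top * (1 - fy) + bot * fy;
        }
    }
    return dst;
}

int main() {
    std::vector<float> lowRes = {0, 1, 1, 0}; // a 2x2 "frame"
    std::vector<float> highRes = upscaleBilinear(lowRes, 2, 2, 4, 4);
    for (int y = 0; y < 4; ++y, std::printf("\n"))
        for (int x = 0; x < 4; ++x) std::printf("%.2f ", highRes[y * 4 + x]);
}
```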

Value Proposition:
The 6800XT offers roughly equal rasterization performance to a 3080, with less power draw, and it still runs cool and quiet. It has 16GB of VRAM vs 10GB on the 3080.

It offers RT, but still behind the 3080. I know this is a big issue for a lot of people, but believe it or not, there are tons of people out there who don't care about RT at all right now and won't for the foreseeable future. There are also people who view it as a "nice to have" but don't believe the benefits are currently worth the trade-off, at least until hardware improves to a much higher level and more games are designed heavily around RT.

Personally, I think RT is the future; in 10 years' time almost all games will support RT, and hardware will be fast enough, with engines designed around it, that performance will be great. That time is not quite now, though, at least for me. So I fall into the "nice to have" group at the moment.

AIB models seem to offer some overclocking headroom.

It offers SAM, which is a nice bonus, but it is currently missing a DLSS competitor. It is a little disappointing that Super Resolution was not ready for launch, but they are working on it, so hopefully the wait will not be too long.

Would it be more competitive priced at $600 rather than $650? Definitely, and I can understand the argument that for $50 more you get better RT, DLSS and CUDA; the 3080 is a fantastic card, after all. Of course, the chance of actually getting one anywhere near MSRP, even after the shortages are sorted, is another story altogether, especially for AIB models.

The reality is that AMD, and pretty much all manufacturers, are wafer-constrained on the supply side. AMD's CPUs are very high-margin products built on the same 7nm wafers as the GPUs, the consoles are also produced in huge quantities, and TSMC 7nm is not cheap.

The simple reality is that right now AMD would sell out their entire stock regardless of price, so they have no need to lower prices; they would simply be giving up guaranteed money. AMD are also looking to position themselves out of the "budget brand" market.


Yup, hard to disagree with any of those points. I'd personally still put my money on NV right now, but anyone actually able to get either of the cards should have no reason to complain. As opposed to the RDNA1 cards, Big Navi is not an LTTP product this time around; it's basically closing an almost decade-old gap between AMD and NV, and if they can just figure out some DLSS equivalent, there basically won't be much, if any, difference between the two camps, just like in the good old days.
 

psorcerer

Banned
And how does that affect performance specifically on AMD? I'm curious how deep you think it goes. The voxel primitives are freaking axis-aligned bounding boxes, FFS. And aligned to the camera, no less.

I don't know. But from the past performance of NV-specific solutions, we do know that they don't perform that well on AMD, if at all.

All I'm hearing from you is defensiveness, with nothing to back any of it up, because you don't know what's going on.

That's exactly what I'm saying: nobody can back anything up, yet some strong conclusions are being reached...
 

llien

Member
I'm worried about right now
There is nothing that would make you buy something from team red anyhow, so the only reason you are worried now is that your world of glorious green cards has been shattered by Lisa.
None of your reasons are actual reasons; they're just excuses.
So just relax, man, take it easy. It's just GPUs.




I do agree that not having RT in Godfall does seem strange. If it is something to do with AMD's partnership contract somehow disabling the DXR code from running on Nvidia cards, then that would be really shitty.
The BF5 changelog (can't find it :() mentioned switching from DXR to NV extensions, citing:
a) bugs with DXR
b) perf issues

It might be that sort of issue: the API existing, but not working (akin to how Nvidia sabotaged OpenCL).

This time around their cooler seems to be pretty damn good, essentially matching the expensive custom 3080 reference cooler in heat/noise.
It actually beats it:

[chart: noise-normalized temperature comparison]


So, the question shouldn't be whether or not most of the games that have ever been released support ray tracing and/or DLSS, but whether or not most games being released now or in the future will support them, the answer to which is yes.
There are only 20 known RT titles, including those still in the works.
There is only a handful of games supporting NV's DLSS upscaling.
So "yes" is rather an optimistic take.
 
Last edited:

BluRayHiDef

Banned
yes, in >2022, maybe. Different ball game when we get there.

Here's how I feel about the relevance of ray tracing in regard to buying a graphics card now:

When someone buys a new graphics card, they typically do so because their current card cannot play the latest games, or will not be able to play upcoming games, at their preferred settings, and cannot support the new software technologies of said games.

For example, my GTX 1080Ti cannot run recent DirectX 12 Ultimate titles at 4K and max settings at more than 35 to 45 frames per second in traditional rasterization. Heck, it can't even run Control well at 4K and medium settings, and that's without ray tracing and DLSS (which it obviously does not support).

Hence, when I decided to upgrade to a new graphics card, I didn't think to myself that I wanted a card that could run the latest and upcoming games really well in rasterization only. No, I wanted a card that could run said games really well both in traditional rasterization and with the new software technologies that are ray tracing and DLSS.

Why? So that I could play Control at 4K and max settings with ray tracing enabled and with good frame rates thanks to DLSS... and because most games that are being released now and in the future will support these two technologies.

So, in other words, one doesn't buy a new graphics card for old games but buys one for new games.

So, the question shouldn't be whether or not most of the games that have ever been released support ray tracing and/or DLSS, but whether or not most games being released now or in the future will support them, the answer to which is yes.
 

VFXVeteran

Banned
I don't know. But from the past performance of NV-specific solutions, we do know that they don't perform that well on AMD, if at all.

That's true, but it's an assumption adopted here with no basis in fact. Not good, coming from a professional.

That's exactly what I'm saying: nobody can back anything up, yet some strong conclusions are being reached...

People don't need to back up the results of real-world benchmarks. They are there for all to see. It is what it is for now. No sense in fighting it.
 

MadYarpen

Member
I've just watched a review of the 6800 non-XT, and it is quite good. A perfect card for UW 1440p.

It seems more and more obvious to me that I cannot hope for RT gaming at that resolution without investing in an RTX 3080. I don't want to do that; I can barely justify the 3070/non-XT price. And I didn't buy a 144 Hz monitor to play at 30fps. This pretty much takes RT out of the equation, sadly. For now, that is. Correct me if I'm wrong, but it seems it will take some time before RT reaches the level that allows 60fps+ gaming at ultrawide 1440p in the mid-high price tier. And until then, the 6800 is a better choice than the 3070. In two years, 8GB could start to be a problem on the Nvidia card...
 
I do find it funny, though, that you seem so annoyed by the locking of features from competitors, calling it a scummy move, when it has been Nvidia's MO for almost as long as they have existed as a company. I find the irony especially delicious seeing as you are one of the forum's biggest Nvidia cheerleaders, quoting marketing numbers directly from Nvidia slides as fact. 🤷‍♂️


The features AMD is locking out are hardware-agnostic features. It's not the same as what you accuse Nvidia of doing, nor does it matter; it's wrong when anyone does it. AMD seems to be going hard on bullshit right now, from all angles. I've quoted Nvidia slides, you say? And how was the projected performance for Ampere false? The games run as advertised. They didn't have just slides; they gave DF videos with performance metrics running on screen. They had the Doom video running.

Every game from the AMD presentation runs worse in real life than what they showed. They don't win in any of the games they placed above Nvidia at their conference.
 
Last edited:

Mister Wolf

Member
There is nothing that would make you buy something from team red anyhow, so the only reason you are worried now is that your world of glorious green cards has been shattered by Lisa.
None of your reasons are actual reasons; they're just excuses.
So just relax, man, take it easy. It's just GPUs.





The BF5 changelog (can't find it :() mentioned switching from DXR to NV extensions, citing:
a) bugs with DXR
b) perf issues

It might be that sort of issue: the API existing, but not working (akin to how Nvidia sabotaged OpenCL).


It actually beats it:

[chart: noise-normalized temperature comparison]



There are only 20 known RT titles, including those still in the works.
There is only a handful of games supporting NV's DLSS upscaling.
So "yes" is rather an optimistic take.


I already told you I own an AMD CPU and have bought AMD CPUs exclusively since the Athlon vs Pentium 2 days. When AMD offers something real in the GPU market, I will buy it. Until then, with worse 1440p/4K performance (and at every resolution between them), no special upscaling, and shitty RT performance, they simply aren't worth my money. Even Godfall, the game touting all its VRAM usage, runs the same at 4K on the 3080 with 6GB less. It was all smoke and mirrors.
 
Last edited:

psorcerer

Banned
People don't need to back up the results of real-world benchmarks. They are there for all to see. It is what it is for now. No sense in fighting it.

I have no problem with benchmarks. I do have a problem with far-fetched conclusions.
"Godfall has no RT on NV, hence NV is not capable of RT" - that's the level of discussion I have a problem with...
Or: a game performs better on hardware A, so the Z-buffer implementation on hardware B must suck! Yes, the game uses a Z-buffer, and yes, a hardware-accelerated Z-buffer is used on both A and B, but that doesn't mean the difference in Z-buffer implementations is the only thing that impacts the result.
It doesn't even mean that the Z-buffer implementation has any significant impact.
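A toy illustration of that attribution fallacy, with made-up numbers:

```cpp
#include <cstdio>

// Toy illustration of the attribution fallacy above: two hypothetical GPUs
// with an identical z-buffer cost can still differ by 20% overall, because
// frame time is dominated by other stages. All numbers are made up.
int main() {
    double zbuffer = 1.0;                // identical z-buffer cost on both
    double otherA = 9.0, otherB = 11.0;  // shading, geometry, bandwidth, ...
    double frameA = zbuffer + otherA;    // 10 ms total
    double frameB = zbuffer + otherB;    // 12 ms total
    std::printf("A: %.0f ms, B: %.0f ms (B %.0f%% slower, z-buffer identical)\n",
                frameA, frameB, (frameB / frameA - 1) * 100);
}
```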
 

wachie

Member
I find some of the enthusiasm for AMD kinda weird. 6800XT vs RTX3080.

Let's take, for example, PS5 vs Series X: they have nearly identical performance and cost the same, but imagine the Series X having only around 60% of the raytracing performance, making it basically not feasible to use, AND Sony having DLSS while MS has no alternative.
It would be a bloodbath and I doubt people would celebrate MS for this achievement.

You can think what you want of raytracing, but it will be supported by a lot of upcoming AAA games, and smaller games like Mortal Shell are receiving patches. In one form or another, it will become more and more important. We're talking 650+ bucks enthusiast cards here, and the 50 bucks difference is insignificant at this price point.
Your whole analogy falls apart for one simple reason: Xbox never competed with Sony at the high end (which it would have to for your analogy to hold), whereas AMD finally are competitive in a large swath of titles.

Again, it's weird seeing people UPSET by how close AMD got to Nvidia. Really, it's quite strange.
 
Last edited:

Ascend

Member
And even that chart gives the 6800XT only 0.3%... while other outlets see the 3080 ahead by about the same percentage...
The notion that either side "wins" with those differences is absolutely ridiculous!
And yet you refused to classify it as a draw, and instead classified the 6800XT as worse than the RTX 3080 at all resolutions. But now that the 6800XT is ahead, suddenly you want to call it a draw. Disgusting.
 

martino

Member
How can a ray tracing implementation for AMD leave out Nvidia?
Does it mean it is not using Vulkan/DX features to achieve it? An OptiX-like framework?
(real question)
 

Ascend

Member
The features AMD is locking out are hardware-agnostic features. It's not the same as what you accuse Nvidia of doing, nor does it matter; it's wrong when anyone does it. AMD seems to be going hard on bullshit right now, from all angles. I've quoted Nvidia slides, you say? And how was the projected performance for Ampere false? The games run as advertised. They didn't have just slides; they gave DF videos with performance metrics running on screen. They had the Doom video running.

Every game from the AMD presentation runs worse in real life than what they showed. They don't win in any of the games they placed above Nvidia at their conference.
Tessellation was also hardware-agnostic, but nVidia gimped their own cards with stuff like Hairworks to make AMD run slower. Apparently that's more than fine. 🤷‍♂️

You've all said your piece. You all think nVidia is better. Good for you. Go hang out in the RTX thread and leave this thread for the ones who are actually interested in Radeon, rather than interested in bashing it.
 

regawdless

Banned
Your whole analogy falls apart for one simple reason: Xbox never competed with Sony at the high end (which it would have to for your analogy to hold), whereas AMD finally are competitive in a large swath of titles.

Again, it's weird seeing people UPSET by how close AMD got to Nvidia. Really, it's quite strange.

Is it a respectable showing from AMD? Of course. Are these cards the hyped up Nvidia killers? Not really. I'm all for a neutral assessment of these cards. The downplaying of Nvidia and their features by some AMD warriors here just got so ridiculous that I felt the urge to write that comment.

I agree with you, it's weird to see people being upset by these AMD cards. As weird as defending and worshipping them.

As I've stated before, I really appreciate it and think it is very important that AMD attacks and challenges Nvidia on the high-end.
 

waylo

Banned
I've just watched a review of the 6800 non-XT, and it is quite good. A perfect card for UW 1440p.

It seems more and more obvious to me that I cannot hope for RT gaming at that resolution without investing in an RTX 3080. I don't want to do that; I can barely justify the 3070/non-XT price. And I didn't buy a 144 Hz monitor to play at 30fps. This pretty much takes RT out of the equation, sadly. For now, that is. Correct me if I'm wrong, but it seems it will take some time before RT reaches the level that allows 60fps+ gaming at ultrawide 1440p in the mid-high price tier. And until then, the 6800 is a better choice than the 3070. In two years, 8GB could start to be a problem on the Nvidia card...
So, you spent the money on a 144Hz, ultrawide 1440p monitor, but can't justify the cash to buy a card that runs games on it? A 3080 does everything you want. Framerates. Ray tracing. When you're spending $600 on a card, what's an extra $100? Especially if it means getting the performance you actually want? It's not like you can get either readily right now anyway, so pocket some extra cash for the next month or so and then buy a card you won't wind up regretting.
 

Ascend

Member
The downplaying of Nvidia and their features by some AMD warriors here just got so ridiculous that I felt the urge to write that comment.

I agree with you, it's weird to see people being upset by these AMD cards. As weird as defending and worshipping them.

As I've stated before, I really appreciate it and think it is very important that AMD attacks and challenges Nvidia on the high-end.
When I asked whether they would rather have 10GB with fast RT or 16GB with medium RT, no one answered, because they are not interested in being fair or neutral. They are interested in trashing AMD's cards as much as possible.

There are people out there who are genuinely not interested in RT. But of course, they must be shamed in the interest of preserving the idea that nVidia is better.

The expectation from a lot of the people here was that AMD would not even surpass the 2080 Ti. But the narrative has shifted.

When you're spending $600 on a card, what's an extra $100?
You mean what's an extra $400-$500.
 
Last edited:

rofif

Can’t Git Gud
At the end of the day, it's the same card as the 3080 but with much worse ray tracing performance and no DLSS/features... so if it's almost the same price, why bother?
Ray tracing is a feature you will use today; 16GB probably not.
 
And yet you refused to classify it as a draw, and instead classified the 6800XT as worse than the RTX 3080 at all resolutions. But now that the 6800XT is ahead, suddenly you want to call it a draw. Disgusting.
It's not ahead... anywhere...
You're just an idiot putting rhetoric in the foreground instead of simply acknowledging the facts at hand.
AMD is about even in rasterization, loses horribly in anything RT, and also lacks any feature comparable to DLSS.

This is not "being ahead" or "winning" by any metric.
It's a hard loss...
 
Last edited:

Ascend

Member
You're just an idiot putting rhetoric in the foreground instead of simply acknowledging the facts at hand.
AMD is about even in rasterization, loses horribly in anything RT, and also lacks any feature comparable to DLSS.

This is not "being ahead" or "winning" by any metric. It's a hard loss...
[reaction GIF]
 

regawdless

Banned
When I asked whether they would rather have 10GB with fast RT or 16GB with medium RT, no one answered, because they are not interested in being fair or neutral. They are interested in trashing AMD's cards as much as possible.

There are people out there who are genuinely not interested in RT. But of course, they must be shamed in the interest of preserving the idea that nVidia is better.

The expectation from a lot of the people here was that AMD would not even surpass the 2080 Ti. But the narrative has shifted.

Can we please stop with that stupid VRAM argument? It's just a larger amount of slower VRAM that doesn't help the card at high resolutions, because it gets bandwidth-limited compared to the "only" 10GB 3080.
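For context, the raw numbers behind that bandwidth comparison (public launch specs; note AMD's 128 MB Infinity Cache exists precisely to offset the narrower bus, so the raw figures aren't the whole story):

```cpp
#include <cstdio>

// Back-of-the-envelope memory bandwidth math from the public launch specs.
int main() {
    // bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8
    double r6800xt = 16.0 * 256 / 8; // RX 6800 XT: 16 Gbps GDDR6, 256-bit
    double r3080   = 19.0 * 320 / 8; // RTX 3080: 19 Gbps GDDR6X, 320-bit
    std::printf("6800 XT: %.0f GB/s, RTX 3080: %.0f GB/s\n", r6800xt, r3080);
}
```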

I've also seen a lot of AMD warriors saying that AMD would wipe the floor with Nvidia and even be competitive in ray tracing, downplaying the CUDA cores.

It was hyperbole from both sides, though; Nvidia fanboys also tried to downplay AMD's performance.

Still, at the end of the day, with the info we currently have, I see the following situation:

If both were available, I simply don't see any strong objective reason for an enthusiast who puts down 650 bucks or more for a GPU to go with AMD.

Those 50 bucks make no difference at this price point. Classic rasterization performance is more or less the same, and the 6800XT is already bandwidth-limited at 4K. Even if ray tracing isn't a factor for some people at the moment, it would be strange to throw away the possibility of using it in the future, because at 1440p it's basically not usable on a 6800XT.

Brand wars aside, I just don't see a reason to pretend that a 6800XT is an objectively better choice than a 3080.

See how I didn't even include DLSS.
 
Last edited:
So to sum up, the AMD Radeon RX6000 series seems to deliver on essentially everything AMD promised and matches pretty much all of the rumours from close to release.

All in all, I would say they have a very good product that is quite competitive with Ampere outside of RT. If Dirt 5 is anything to go by, it will be interesting to see how the RT performance looks a year or more from now, when more console ports and AMD-optimized titles have been released.

Even the biggest Nvidia fans can agree that Radeon group have come a really long way and improved on tons of their long-standing shortcomings vs Nvidia. And all this with Radeon group being less than 1/10th the size of Nvidia. It is very impressive no matter how you slice it.

I am a little surprised that so many people say they are disappointed by the cards/results when everything matches what we knew, at the latest, by the reveal date.

  • We knew the card would cost $649 for the reference 6800XT.
  • We knew they would be more power efficient than Nvidia.
  • We knew that they would have high clock speeds.
  • We knew that SAM would offer a small boost in some titles and no boost in others.
  • We knew RT performance would be worse than Ampere.
  • We knew that raster performance would be roughly on par with 3080.
  • We knew that AMD sponsored titles would perform better on AMD.
  • We knew that Nvidia sponsored titles would run better on Nvidia.
  • We had a pretty solid guess that AMD optimized RT would work better on AMD GPUs.
  • We knew that Super Resolution is being worked on but not ready yet.
What exactly were the new shocking or surprising results that disappointed people? We already knew pretty much all of this before these benchmarks released.

Not surprisingly, looking at the names of the posters who seem to be the most "disappointed" and "concerned", it's mostly our resident Nvidia superfans; odd how that seems to work out, isn't it? We have almost the whole bunch of recognizable names posting in here about how disappointed they are, despite the fact that we all know they would never buy an AMD GPU even if it were better across the board than Nvidia and cost $200 less.

The fact that so many of them are this rattled and doing their best to downplay these new GPUs must mean AMD are doing something right. I understand you guys have cemented the narrative in your minds that AMD can't be competitive and Nvidia can't be caught, based on past generations, but things change and you just have to accept the reality that AMD are competitive now. 🤷‍♂️

In fact, the Nvidia superfans should be the ones happiest that AMD is competitive with Nvidia; you know, the "I hope AMD competes so that Nvidia lowers prices and offers better products!" crowd. You can thank AMD for the lower prices on the 3000 series, for the higher VRAM/performance of the lower Nvidia stack (lots of cards being brought up a die), and for a cheaper 3090 (a 3080Ti with 20GB VRAM) likely being announced in January. Why wouldn't you want AMD to compete when it nets you all of these benefits?

I understand you guys are worried about Nvidia being Intel'd, and it is a valid worry if AMD keep improving at this rate. Granted, a big difference is that Nvidia are not resting on their laurels and are still innovating, but I would wager RDNA3 should help close the RT gap, and at some point this generation AMD will have their Super Resolution tech released, so by the time RDNA3 launches, most of those advantages will be invalidated.
 

Rikkori

Member
I did my tally for ray tracing, and the only game I care about is Cyberpunk; maybe also The Witcher 3 if they do it for GI. But RDNA 2 can handle single effects fine (I'd only enable reflections and maybe diffuse illumination for 2077). DLSS I do not care about at all. The 3070 is weak as hell even with ray tracing and more expensive than the 6800, and the 3080 is gold dust that will stick far above MSRP even when stock improves in spring; for my region anyway, which is the only one that matters to me. I also have no doubt this will be a repeat of GCN in how RDNA 2 ages, thanks to the consoles.

Thus I will accept my 6800 tomorrow and keep it. Now back to playing games, and never repeating the mistake of selling my GPU early. Best of luck to everyone else, no matter what GPU you're after. And as always: fuck NV and AMD both; they're corpos, not our friends.

🙏 🙏 🙏

 

mansoor1980

Gold Member
The Radeon 6800 launch press transcript mentions fallback options like FidelityFX SSR and FidelityFX AO... what are those?
 
Last edited:
The Radeon 6800 launch press transcript mentions fallback options like FidelityFX SSR and FidelityFX AO... what are those?

AMD has a GPUOpen initiative where they offer tools and libraries to help developers implement certain features. These are open source and work on both Nvidia and AMD cards (and I guess Intel too). Nvidia used to (still does?) have a similar set of features called GameWorks, but those either only worked on Nvidia cards or tanked performance on AMD cards.

SSR is screen-space reflections: the reflections you see on the ground in games not using ray tracing.

AO is ambient occlusion.
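For the curious, a minimal sketch of what an SSR pass does under the hood; illustrative C++ over a plain depth buffer, not any particular engine's implementation:

```cpp
#include <cstdio>
#include <vector>

struct Vec2 { float x, y; };

// Minimal sketch of screen-space reflections: march a reflected ray in steps
// across the depth buffer and stop where the ray dips behind the visible
// geometry; that pixel's color becomes the reflection.
bool marchReflection(const std::vector<float>& depth, int w, int h,
                     Vec2 p, Vec2 step, float rayDepth, float depthStep,
                     int maxSteps, Vec2* hit) {
    for (int i = 0; i < maxSteps; ++i) {
        p.x += step.x; p.y += step.y; // advance in screen space
        rayDepth += depthStep;        // ray's depth along the march
        int px = (int)p.x, py = (int)p.y;
        if (px < 0 || py < 0 || px >= w || py >= h)
            return false; // left the screen: SSR simply has no data here
        if (depth[py * w + px] < rayDepth) {
            *hit = p;     // ray went behind visible geometry: use this pixel
            return true;
        }
    }
    return false;
}

int main() {
    int w = 8, h = 8;
    std::vector<float> depth((size_t)w * h, 10.0f); // flat wall 10 units away
    Vec2 hit;
    if (marchReflection(depth, w, h, {4, 4}, {1, 0}, 5.0f, 2.0f, 8, &hit))
        std::printf("reflection sampled at pixel (%.0f, %.0f)\n", hit.x, hit.y);
    else
        std::printf("no screen-space hit\n");
}
```

The early-out when the ray leaves the screen is also SSR's core limitation: anything not visible on screen simply cannot be reflected, which is exactly what ray traced reflections fix.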
 

llien

Member
Try harder fanboy....

Why are you guys so upset about AMD rolling out great products?
Does it take a genius to notice that WD:L and Dirt 5 paint a peculiar RT performance picture, and not jump the gun?
How is 3070-level RT (Minecraft is an interesting outlier) "terrible" (in all those 20 games in which most people don't enable RT anyway, for reasons repeated many times)?
 

mansoor1980

Gold Member
AMD has a GPUOpen initiative where they offer tools and libraries to help developers implement certain features. These are open source and work on both Nvidia and AMD cards (and I guess Intel too). Nvidia used to (still does?) have a similar set of features called GameWorks, but those either only worked on Nvidia cards or tanked performance on AMD cards.

SSR is screen-space reflections: the reflections you see on the ground in games not using ray tracing.

AO is ambient occlusion.
I know about Nvidia GameWorks and Nvidia PhysX, as I had a bad experience with those heavily hyped features... what I mean is: are the two features mentioned above like HBAO+ or some new type of SSR developed by AMD?
 
I know about Nvidia GameWorks and Nvidia PhysX, as I had a bad experience with those heavily hyped features... what I mean is: are the two features mentioned above like HBAO+ or some new type of SSR developed by AMD?

 

regawdless

Banned
Why are you guys so upset about AMD rolling out great products?
Does it take a genius to notice that WD:L and Dirt 5 paint a peculiar RT performance picture, and not jump the gun?
How is 3070-level RT (Minecraft is an interesting outlier) "terrible" (in all those 20 games in which most people don't enable RT anyway, for reasons repeated many times)?

What. The. Fuck. Dude.
I've already called you out and quoted you, saying that WD Legion is bugged and has extremely simplified ray tracing, hence the performance.

Stop spreading false information just because of fanboyism.
 
Last edited:

MadYarpen

Member
So, you spent the money on a 144Hz, ultrawide 1440p monitor, but can't justify the cash to buy a card that runs games on it? A 3080 does everything you want. Framerates. Ray tracing. When you're spending $600 on a card, what's an extra $100? Especially if it means getting the performance you actually want? It's not like you can get either readily right now anyway, so pocket some extra cash for the next month or so and then buy a card you won't wind up regretting.
Trying to be a responsible father and husband... What can I say :)

And it's not like the monitor was that expensive.
 
Why are you guys so upset about AMD rolling out great products?
How hard exactly is it to selectively disregard all the test numbers you don't like, if those are a good 95% of them?
I've seen highly paid sugar lobbyists in Brussels not trying as hard as you to lie and deceive, while getting paid a fortune... and you are doing it for free... I'd call that stupid.
 
Last edited:

llien

Member
This thread is an example of what is wrong with the AMD subreddit.
It is overrun by certain types who expected RDNA2 to barely scratch the 2080Ti at most, be power hungry, use some insane memory config to achieve even that, and be behind the 2060 in RT.
The types who grabbed a 3090 / 3080 directly or from scalpers, and now need to justify why.


How hard exactly is it to selectively disregard all the test numbers you don't like, if those are a good 95% of them?
Dude.
While there are 20 RT games listed:
A number of them have not even been released.
One of them (Godfall) only supports AMD.
And most of the remaining handful only support Green (because, as the Battlefield 5 devs figured, NV's DXR implementation was buggy and slow, so they had to write proprietary code).

Where did your 95% come from, pretty please? :messenger_beaming:

We simply haven't seen RT games developed with both manufacturers in mind, and out of the handful we do have, judging by the likes of NV-sponsored games is hardly reasonable.
 
Last edited: