
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

PhoenixTank

Member
It's just silly. We can all like different things.

Some can buy only AMD because that's what they like. Some can buy Nvidia because that's what they like. Some can swap back and forth depending on their budget/game preferences/feature importance.

In the end, we are all PCMR and we are all better than the console peasants.
I think some of the frustration on the AMD side is borne out of old arguments. I've absolutely seen power usage thrown at AMD as a downside in the 480/580 vs 1060 era.
A fair few posters are on record basically saying they only want AMD around to make Nvidia drop prices and would never buy them. That isn't healthy for the long-term state of the market, and AMD being underfunded in some areas still hurts their value proposition today.
Unfortunately it becomes all too easy to lump people together rather than calling individuals on their shit.

Making a huge leap in rasterization, catching up and even exceeding in places, is fantastic from AMD, on a generation where Nvidia didn't sandbag, no less! RT is below par, but I'd say it is fair to give them a bit of time to let the dust settle. Only one game actually had RTX in 2018, after all. I'm not expecting massive improvements, but there do seem to be teething issues out there, i.e. potential bugs/differences between implementations.
If RT is your jam, I get it. It isn't for me yet. I sit here hoping my 1080Ti keeps on kicking for a while yet, because by the time there are a lot of games making good use of this tech, there should be far better cards on offer.
 
It's just silly. We can all like different things. […]
Either point out the bullshit posters so we can all laugh a bit, and then let's move on to talking more about benchmarks and performance. That's more interesting shit anyway. Stop giving a shit about fanboys if the facts speak for themselves.
I think some of the frustration on the AMD side is borne out of old arguments. […]
Regardless of what hardware we pick, we are all part of the PCMR, whether we like the slogan or not. If you are team red, or team green, you are on my side. If I can get in discord, and hit up a bunch of my homies to play, idc what set you claim, as long as we can play TOGETHER. I just don't like the blatant fanboyism, or the ones who only side with one company. You can call me bipolar because I'll switch between several different companies. Whoever has the better shit, I'm all in.
 

Rentahamster

Rodent Whores
Regardless of what hardware we pick, we are all part of the PCMR, whether we like the slogan or not. If you are team red, or team green, you are on my side.
Yup. I can see how tribe mentality forms among console players since each console has its own ecosystem and targeted branding.

But GPUs and CPUs? How fucking stupid is that. We're all gaming on motherfucking Windows in an open system (aside from any Steam vs Epic vs etc shenanigans). It's baffling that some people get so attached to a brand that makes up just one component of a system. They all do the same fucking shit! 😂

Identify your usage scenario, look at the relevant benchmarks for it, make a list of candidates that satisfy your requirements within your budget, and buy one. Who gives a shit what brand it is.
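That procedure is mechanical enough to write down. Here's a toy sketch of it in Python, where every card name, price, and fps figure is a made-up placeholder rather than a real benchmark result:

```
# Filter candidates by budget and by the benchmark that matters for your
# usage scenario, rank by value, ignore the brand entirely.
budget = 700          # dollars
required_fps = 60     # requirement for your scenario, e.g. 4K rasterization

candidates = [
    {"card": "Card A", "price": 699, "fps": 95},
    {"card": "Card B", "price": 649, "fps": 90},
    {"card": "Card C", "price": 499, "fps": 55},
]

shortlist = [c for c in candidates
             if c["price"] <= budget and c["fps"] >= required_fps]
shortlist.sort(key=lambda c: c["fps"] / c["price"], reverse=True)  # value first

for c in shortlist:
    print(f"{c['card']}: {c['fps']} fps at ${c['price']}")
# Buy the one at the top, whatever the brand.
```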
 
Yup. I can see how tribe mentality forms among console players since each console has its own ecosystem and targeted branding. […]
Exactly! I used to rock Bulldozer when everyone was recommending Intel. Finally jumped on board for a while, then went right back to AMD. Also had AMD GPUs, then went Nvidia for the time being. I'll take whoever gives me the most performance. Platform agnostic is the way to go; that's how PC has always been.
 

llien

Member


So this is the performance that's so terrible..... My only gripe is that he has the 3090 in there... I wish he compared the 6800XT to the 3080 and the 6800 to the 3070... Even then you can see the performance for AMD is pretty great in rasterization and RT... When these drivers mature, I think Nvidia is in for a world of hurt...


If they included the 3070, the "terrible RT performance" song would look too awkward, as we are talking about AMD cards beating a $1200 Turing card.

RDNA2 RT performance is great, even ignoring the "sponsored by Greedia" aspect of the titles.
 

Rentahamster

Rodent Whores
Exactly! I used to rock Bulldozer when everyone was recommending Intel. […]
Right. If we all really cared more about marketing than about performance per dollar or max overall performance, we'd all be in some thread about Macs.
 


So this is the performance that's so terrible..... […]




You know these channels are making shit up, right? They don't show anything, just put up some slides with made-up numbers. We have RTX benchmarks from dozens of proper websites already; you don't need to look at no-name YouTube channels who don't have the cards and are just making up numbers. Look at normal websites where you're sure they actually have the hardware and tested it.
 
They have done a tremendous job this time around, but if you want everyone to praise AMD, step back and look at your post history. On any given page, I see nothing but Nvidia bashing from you. You never appreciate anything Nvidia does, so it's crazy that you want others to notice what AMD has done to finally start catching up to Nvidia. Aren't the pot and kettle both black, if I remember correctly?

Don't agree. Because of the 8nm process, the power draw of the 3080 is absolutely horrendous. Anything approaching 350W is a big no-no because of the heat output and the effect it has on your gaming experience.

So yes, it has 5% more performance at 4K, better RT and DLSS, but the fundamentals are bad: it requires a shitload of energy to achieve this. Other small things irk as well: no overclocking headroom, phantom MSRPs.
 
So the 7000 series AMD cards will be the bomb.
I expect their RT to be four times better than the 6000 series, and the addition of ML will also bring them closer.
 
So the 7000 series AMD cards will be the bomb.
I expect their RT to be four times better than the 6000 series, and the addition of ML will also bring them closer.

So wait for Vega RDNA 3?

I think that would be the sensible thing to do for everyone who doesn't have a GPU fetish like me: buy a console now and buy RDNA 3 or Hopper 1.5 years from now.
 
Now that the dust has settled a little bit it is interesting to have another look at some factors regarding performance, price and value proposition.

The 3080 reference has a very slight lead overall in 4K rasterization (3-5%); however, the 3000 series appears to have very little overclocking headroom, even on AIB custom models.

The 6800XT seems to have some room left in the tank for a noticeable overclock on custom AIB models, meaning that a solid card from Sapphire, for example, would likely eliminate that gap entirely (and possibly exceed it?).

I would wager the majority of people buying these cards (or the 3000 series) will be purchasing an AIB model rather than reference, so it will be interesting to see how much extra performance/higher clocks a solid AIB model will achieve. Similarly, I've heard AMD will phase out the reference design in early 2021; unsure about Nvidia, but are their Founders Edition cards normally limited edition? Hopefully someone else can confirm.

Regarding value proposition/price on paper (at least for the US region), we have $649 vs $699 for the 6800XT/3080 respectively. In reality, right now prices are all over the place due to low supply, high demand and scalpers.

Once stock stabilizes I doubt we will see 3080/6800XT reference cards going for anywhere near MSRP unless purchased directly from Nvidia/AMD, whose stores only ever have very limited stock. Realistically we will be left with mostly AIB models to choose from, which will obviously be more expensive than reference for both AMD and Nvidia. While we don't know what that landscape will look like just yet, I would imagine that 3000 series AIB cards will still carry a hefty price premium over their 6000 series counterparts, which means that when all is said and done we could end up seeing a $100+ price differential between an available 3080 and an available 6800XT.

In Europe the story gets even more complicated: even for reference models I have no idea what the price of a 6800XT will be, although I do know it will be higher than in the US. One way or another we will end up paying through the nose over here; whatever the US price is, reference or AIB, expect at the very least a €100 markup, likely more in reality. It will be interesting to see what kind of price differential we are left with between the 3080 and 6800XT once all is said and done; the gap could end up being pretty wide.

Revisiting ray tracing performance for a moment, it doesn't seem as bad as I initially thought. Currently, raw 4K RT is an unplayable joke no matter what card you use, unless you have an upscaling tech like DLSS to bring you up from 1080p/1440p. AMD will eventually have something here, but we don't really know when.

At 1440p, which, let's be honest, is the highest native RT resolution that results in playable framerates for either company, the 6800XT seems to be performing reasonably well, with clearly playable framerates above 60fps in almost all cases. Granted, Nvidia is still a good bit ahead, but the performance is hardly unplayable trash. Bear in mind that outside of Dirt 5, pretty much every RT-enabled game is either sponsored by Nvidia or otherwise optimized for their RT solution, as, simply put, theirs were the only RT-enabled cards on the market to test/code against.

I don't know what is happening with Watch Dogs: Legion in the RT department. We know that certain effects are not rendering properly, but we don't know if this is a case of those rays not being calculated at all, or simply a driver bug preventing the calculated result from rendering correctly to the screen. We've been told this bug will be fixed with a driver update shortly; the only open question is whether the fix will reduce RT performance in this game for RDNA2 GPUs or whether performance will stay relatively the same.

If the performance stays relatively the same, that is an interesting indicator of how future console/cross-platform ports may perform, as they will have at least some level of optimization for AMD's RT solution. Not that I think AMD is suddenly going to match or exceed Nvidia in RT performance; Nvidia will still have the best RT performance this generation, but I think the gap could shrink by a reasonable margin.

It will be very interesting to see how the RT landscape looks a year from now, with driver updates, developer familiarity with AMD's RT solution/DXR 1.1 and how to optimize for it, more AMD-sponsored titles with RT releasing, and more console ports/cross-platform releases. I think we could see performance increase by a noticeable margin once everything has had a little time to mature.
 

evanft

Member
AMD fans in this thread:

2bd.png
 

Ascend

Member
AMD fans in this thread:

2bd.png
Of course you have to try and add fuel to the fire after the dust settled a bit....

In other news...
AMD stated in an interview with PCWorld that its Radeon group is working with Intel to get this feature supported with RX 6000-series GPUs and Intel's latest CPUs and motherboards. The same goes with AMD's Ryzen group, which is working with Nvidia to get Smart Access Memory working with GeForce GPUs.
AMD also stated in the PCWorld interview that this feature simply isn't a "toggle switch" you can just turn on. There will be a good chunk of development and optimization required to get good performance gains from the feature.

 

llien

Member
So yes, it has 5% more performance at 4K, better RT and DLSS, but the fundamentals are bad: it requires a shitload of energy to achieve this. Other small things irk as well: no overclocking headroom, phantom MSRPs.
It looks way better for team red if one focuses on newer games, e.g. computerbase:

With EPIC onboard with AMD, the only major player that I care about yet to be cracked is CDPR.
j62Ma11.png
 

Irobot82

Member
Regardless of what hardware we pick, we are all part of the PCMR, whether we like the slogan or not. […]
Exactly. The whole reason I started PC gaming was to be with my friends. I was using a PS3 at the time and a bunch of my friends gave me some parts to build a PC so I could get back into PC gaming.

I've gone from
CPU
AMD (like 3800+ or something like that, way back in CS Source days)
Intel Core 2 Duo
Intel 3570K (this guy lasted years)
AMD 3700X

GPU
Nvidia 6800 (I think? Also I remember having an ATI before this but no idea what it was)
Nvidia 9800 GTX
Nvidia 260 Core 216 (gifted from a friend)
AMD 7950 (my golden child for years)
Nvidia 1080

We should all just be happy we have a choice in our parts and we can customize to our budgets.

Edit: Corrected some items
 

Rikkori

Member


So this is the performance that's so terrible..... […]

Those are fake, btw; the channel doesn't have cards to test. I don't know where they steal info from, but the 6800 performance is not in line with my own tests (higher for me). Lots of fake channels like this around; you'll learn to spot them in time.
 
Yeah, I mean, Ampere is 16 times faster at RT than Turing; it's reasonable to expect AMD would do at least an 8x bump.


You clearly have no idea what ML is, and it would be beneficial to this thread if you would avoid referencing it.
Thanks for your advice, but I am well aware of what ML is.
But feel free to carry on.
 
So wait for Vega RDNA 3?

I think that would be the sensible thing to do for everyone who doesn't have a GPU fetish like me: buy a console now and buy RDNA 3 or Hopper 1.5 years from now.
AMD appears to always be a generation or two behind Nvidia with tech.
Sure, they can fight on rasterization, but on additional features like Mesh Shading and VRS they were behind Nvidia, and while they have added RT to their GPUs, they are behind Nvidia in RT performance and don't have any answer to DLSS at this point.
 

psorcerer

Banned
Giving it a little thought, it looks like AMD's approach to RT is a much better solution.
They build on two things, essentially:
1. Lots of cache
2. Higher clocks
Both of these are the only real levers for achieving good performance in a bad data-locality situation, which is what RT (or any other GI solution) presents.
Essentially, to win the GI game, a lot of approaches from CPUs need to be implemented in GPUs (we are going to see branch predictors soon).
Obviously any company can fuck up the implementation, but the overall direction is the right one.
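One way to picture that data-locality point: BVH traversal is pointer chasing, where each memory read depends on the result of the previous one, so nothing can be prefetched and neighbouring rays quickly wander into unrelated parts of memory. A toy Python sketch of the effect (my own illustration; the random child links and the bit-twiddling "hit test" are made up, not any vendor's actual traversal):

```
# Each traversal step loads a node whose address depends on the previous
# load, so the hardware cannot prefetch, and two "adjacent" rays soon
# touch disjoint sets of nodes. Random links stand in for a real BVH.
import random

random.seed(0)
N = 1 << 16  # flat array of fake BVH nodes
children = [(random.randrange(N), random.randrange(N)) for _ in range(N)]

def traverse(ray_id, depth=16):
    node, visited = 0, []
    for _ in range(depth):
        visited.append(node)
        left, right = children[node]          # data-dependent load
        node = left if ray_id & 1 else right  # fake hit test picks a branch
        ray_id >>= 1
    return visited

a = traverse(0b1010101010101010)
b = traverse(0b0101010101010101)  # a "neighbouring" ray
print(len(set(a) & set(b)), "nodes shared out of", len(a))
# Almost nothing is shared, so caches miss constantly; a big cache and
# high clocks are exactly the two levers left to pull.
```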
 
Disingenuous at best. The only way they will have respect for AMD is for fans to buy these cards like crazy, just like they did with Ryzen... Ryzen always had good performance, but they nitpicked Ryzen too, never forget... I personally expect these cards to do very well; they are clearly the best-value products out there in performance per dollar and per watt. People will see that and make the right decision, as they now have with Ryzen... The bell tolls...

I remember the threads.

How many people were talked out of buying a first-gen Ryzen to pick up an overpriced Intel at the height of their shortages, only to now be stuck on a dead-end platform.
 

Rikkori

Member
What I'm curious to find out is which settings hit the cache the hardest. If we knew what to tweak, that would help a lot with maintaining an ideal visual/perf ratio.
 

PhoenixTank

Member
Unfortunately I use some applications that are built on CUDA 😭.
Nah, fair. Those I'm referring to tend not to have any sort of reason, or at least aren't forthcoming with it. Vendor lock-in is a bit of a bitch at times.

Let's be honest. If you're in the market for a new card, you'll be happy with either a 3080 or a 6800XT. I'll honestly buy the first one that is actually available.
Probably the reality in this cursed year. If my 1080Ti gave up right now I'd take what I could get, or maybe something mid-range at a discount to tide me over.

AMD fans in this thread:

2bd.png

tCWpDSf.jpg
 

kittoo

Cretinously credulous
AquaticSquirrel is it possible to have the ray tracing effects at, let's say, 1440p while the rest renders at full 4K? Something like they do on consoles, where RT is quarter resolution or something? If that is possible, then that combination would be really good and might tilt me towards the 6800XT.
 
AquaticSquirrel is it possible to have the ray tracing effects at, let's say, 1440p while the rest renders at full 4K? […]

I don't think you can specify the resolution of the RT separately from the normal rendering resolution; I think it is just a quality setting like low, mid, high.
 
Despite some reports, ray-traced reflections are working as intended in WD: Legion on the 6800. It bugs out from time to time, but can be reactivated by disabling and re-enabling it in the settings; no reload required.

It tanks the framerate real hard when you play at 4K (same on the 3080 when DLSS is disabled). 1440p is playable and still looks nice on a 4K TV.
 
I guess it's hilarious watching the AMD fanboy meltdowns in this thread. As a longtime Ascended member of PCMR, I've always used what offered the best features and performance and I've been that way since day 1. I have actually used ATI and AMD products at various points in my gaming career when they offered superior features and performance, and the unfortunate truth is that this has been a vanishingly rare event over the decades.

I remember when I had to pick a video card to play Half-Life 2, and at the time only ATI Radeon had Shader Model 2.0 support, so I had one then. Some years later, only ATI Radeon allowed you to have anti-aliasing in Oblivion, so I had one then too. I can't really think of any other time ATI (later AMD) had actual features and performance Nvidia didn't, which is why I've largely otherwise been using Nvidia cards.

Similarly, there was a time when AMD offered unquestionably superior performance to Intel, during the days of Athlon 64. I had one of those too. And now with Zen, AMD has reclaimed the performance crown from Intel and it happens I'm typing this post on a gaming PC with a 3900X.

So if and when AMD ever offers a graphics product with superior features and performance to Nvidia again, then I'll have AMD again. Big Navi is not such a product, as it lacks key future-proofing features like ray tracing and DLSS, and its rasterization performance is only better at 1440p, while I game in 4K now and have been for several years. There is nothing this AMD product offers me that the Nvidia one doesn't do better, so unfortunately all the fanboy bleating on Earth won't change the fact that I'll still be using Nvidia.
 
I guess it's hilarious watching the AMD fanboy meltdowns in this thread. […]
Fucking well said! I hate when people unintelligently label you as "team green, Nvidia fanboy" simply because you prefer them at this current time and place. Tomorrow things can change, but to them, they have to defend AMD's honor. It's kinda scary to look at the post history of some of the die-hard AMD fanboys on this site. Literally pages upon pages of why AMD is better than Nvidia, why DLSS sucks, why ray tracing is pointless, etc. I can't make this up.
 

Ascend

Member
AMD appears to always be a generation or two behind Nvidia with tech.
Aaaaand here we go again....
Remember when AMD had DX12 cards since 2012, and nVidia only in 2015?

So if and when AMD ever offers a graphics product with superior features and performance to Nvidia again, then I'll have AMD again. Big Navi is not such a product, as it lacks key future-proofing features like ray tracing and DLSS
It doesn't lack ray tracing....

See what I mean, Krappadizzle?
And look who is supporting that post again, rather than correcting that the AMD cards do in fact have RT... If you're truly unbiased, you don't support lies.

I don't think you can specify the resolution of the RT separately from the normal rendering resolution; I think it is just a quality setting like low, mid, high.
Most likely you can change it in one of the .ini files.
 

waylo

Banned
I guess it's hilarious watching the AMD fanboy meltdowns in this thread. […]
In the same boat. My first PC way back in 2004 had Nvidia. Then I started buying ATI cards because they offered more price/performance. I've gone back and forth many times in the last 16 years. Just so happens the last eight or so years have been exclusively Nvidia. Similarly, I've gone back and forth with Intel and AMD. I buy whatever offers the best performance and features for the price. I'm not one side or the other. I'm currently on a 3800X PC (with a 3080).

The simple fact of the matter is that the current offering from AMD just isn't worth it. It would need to be significantly cheaper to be a legit option. Stating this doesn't mean I'm "team green" or "anti-AMD". It just means I vote with my wallet.
 

BluRayHiDef

Banned
"Review average of 17 websites shows 6800 XT to be 7.4% Slower than GeForce RTX 3080"

What about when it's overclocked? The RX 6000 Series has been said to have higher overclockability than the RTX 30 Series.
 

llien

Member
I don't think you can specify the resolution of the RT separately from the normal rendering resolution; I think it is just a quality setting like low, mid, high.
I think it is worth pondering what "RT resolution" actually means.
The rays themselves hardly care what resolution you have.
However, the transformation from noisy dots is where resolution comes in. And I mean really, really noisy: this is a frame as rendered by pure RT (with some optimization tricks) before any denoising (including temporal) is applied:

r4LJppH.png


Now, the "something" into which you turn those pixels does have its own resolution and, I guess, when it's lower, one likely gets away with a smaller number of rays. It could also be that denoising/reconstructing detail is quite time-consuming, so lowering the resolution of that step helps as well.

I guess it's hilarious watching the AMD fanboy meltdowns in this thread.
Coming from a person who doubted RDNA2 would do anything beyond beating the 2080Ti.

...then I'll have AMD again.
Please stick with overpriced green stuff.
 
Aaaaand here we go again....
Remember when AMD had DX12 cards since 2012, and nVidia only in 2015?
Not sure if serious?
Supporting an API is nothing like innovating with new tech.
Turing came out with Variable Rate Shading, Mesh Shaders, hardware-accelerated ray tracing and DLSS back in 2018.
AMD has only come out with VRS, Mesh Shaders and ray tracing in 2020, and still doesn't have an answer for DLSS.
I'm not sure why you need to argue with that.
Nvidia has ruled the GPU space for many years now, and AMD is playing catch up.
I hope they do catch up. I hope they can do to Nvidia what they did to Intel.
But facts are facts.
 
I finally found a review with benchmarks at ultrawide 3440x1440 for the 6800XT and 6800.



It seems to perform quite well, close to its 1440p performance from the looks of it. Although this is only a very small selection of games, it should give us a rough idea.
 
It doesn't lack ray tracing....

You're both kind of right. Sure, the 6800XT HAS ray tracing... BUT... when you start up Cyberpunk on the 10th, will your 6800XT offer you any ray tracing? The answer is no.

Sometime in the future it might, but then the performance will likely be lacking. You'll probably have to choose between high resolutions OR ray tracing. Not both.

The truth is that for the past two years Nvidia has been the only game in town for ray tracing, so all the progress made so far has been through Nvidia's solution. And right now they have a combination of superior ray-tracing cores AND DLSS, which allows ray tracing AND high resolutions. You don't have to choose one or the other anymore.

AMD still has no answer to this.
 

llien

Member
You're both kind of right. Sure, the 6800XT HAS ray tracing... BUT... when you start up Cyberpunk on the 10th, will your 6800XT offer you any ray tracing? The answer is no.

And that:
1) Contradicts NV's promise that "it's using a standard API"
2) Undermines RT's (at least near-term) future in a terrifying way (you can't just develop a DirectX RT game; you have to kamasutra with each manufacturer's card)
3) Godfall also has "just for the record" RT effects... supported only on AMD GPUs so far
Oh, and I mentioned "just for the record", right?
Here is "teh evidence":

RT is off on... the right one:

BQ5l4be.png


RT is... off on this one:

IzKMyxO.png


See which of the two is blurry? Right, that's the "RT on" one (DLSS TAA Crap again?):

rQSNcLt.png


The "generational leap" could be seen here, but, hold on, it's a different time of the day (not to mention it's PC vs console):

t5W0EXH.png



in motion (turn sound off):



The truth is that for the past two years Nvidia has been the only game in town for ray tracing, so all the progress made so far has been through Nvidia's solution.
Except it is Microsoft's official API: DXR.
And in 2 years we got... remind me, how many games worth playing with RT on?
 