
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

thelastword

Banned
The 3090 has RTX and DLSS, which make it better than the 6900 XT. In regard to rasterization, whenever the 6900 XT is faster, it's barely so, which means nothing considering that it isn't always faster.

Ampere is simply better than RDNA2; it's a Jack of all trades while RDNA2 is merely a master of one and only in some comparisons.
Rasterization at Native resolutions. When Super Resolution debuts, we will compare.......SR delivers more frames than DLSS, so I hope you guys don't backtrack....Faster is faster, in many of the latest games, the 6800XT beats the 3090 as well, so put that into perspective and devs have not even developed with the Infinity cache in mind yet, nor with RDNA RT in mind yet....
We already heard it 7 years ago...
RDNA 1 was the new shift; the groundwork began with Polaris and Ryzen 1, where AMD shipped many PC kits to developers to prepare them for AMD-styled architecture. They got bundled Ryzen+Polaris+Vega kits......The 5700XT showed tremendous speed at $400, beating the 2080S in some games, and was the precursor.....RDNA 2 is at pole position already in rasterization across the board.....These consoles have shown, well, the PS5 has shown itself to be very performant in rasterization and RT, so from here on out it's a foregone conclusion....Even in the PS4 gen, the majority of titles were NV focused; see how that has already changed with the number of AMD sponsored titles available since the console launch and RDNA 2's launch.......There will be more of that to come, and even NV sponsored titles can't outdo AMD as they used to, because these games have to run well on consoles and AMD can't be denied anymore as their hardware spans the whole GPU portfolio....
 

Papacheeks

Banned
this review shows how you can change the performance ladder with your games selection:



I think it's fair to say that when it comes to games which were built with the new consoles in mind, RDNA2 performs much more favorably. And I think it's kind of sad that this aspect was lost in so many reviews.


Most tech reviewers don't seem to follow the industry in a very close, intimate way. Gamers Nexus, RedGamingTech and Moore's Law Is Dead seem to be the only ones, and they don't focus just on consoles, they do all tech. But again, that's why there are YouTubers like The Cherno who can show you at an API level what's happening engine-wise with the hardware, so you can kind of formulate a good understanding of how RDNA2 is being used.

Also, RDNA2 is very new, so give it maybe six months before we start seeing more titles that really utilize it.
 

thelastword

Banned
This is some quality comedy :) At this point we can joke around this and have fun.

Truth is, the 6900 is even more redundant than the 6800 XT: 1% faster than the 3080 in raster, 11% slower than the 3090, and worse in everything else. It's a peculiar card. They were too far behind Nvidia to upset the status quo. But it's a good job for them regardless.
I must be looking at different benches; at native resolution, the 6900XT is more performant overall, unless you only want to test Control, Unity, FF15 etc....and a million other old NV titles...
 
I must be looking at different benches; at native resolution, the 6900XT is more performant overall, unless you only want to test Control, Unity, FF15 etc....and a million other old NV titles...




I'm sure you were looking at the wrong benchmarks. The $1,000 AMD card is basically the same as the $700 3080, in raster I mean; it's slower in everything else, at 1440p and 4K. 1080p is so useless at this level that some outlets don't even test that resolution.
 

Sun Blaze

Banned
So the 6900XT wallops the 3090 at $500 less......Better in 1080p, better in 1440p and about par in 4K with Nvidia taking the lead with a mixup of older games at 4K....

It's funny that every time AMD destroys Nvidia in a new title we have to hear "it's AMD sponsored", but some of these goons never made such disclaimers when they stuck to old Nvidia titles to skew results in NV's favor. Pretty much the majority of titles are still NV focused, so what's the point in specifying an AMD title, when the majority aren't....

They do the same for RT; some even use NV DLSS vs AMD native to compare RT.....The 6900XT is a much bigger step up from the 6800XT than the 3090 is from the 3080......Then it has some amazing OC headroom, unlike Nvidia, and it also has SAM.......The crazy thing in some of these benchmarks is that people are still pairing these cards with 8700Ks when there is SAM available.....So SAM is cheating, but DLSS vs AMD native is not....So much bias out there it's not even funny.....

From here on out, you will see the gap widen between AMD vs Nvidia, most titles will be AMD focused because of the consoles or well optimized for RDNA 2......Nvidia is in for a world of hurt, AMD will even catch up in RT as can be seen, very promising performance in the latest titles as opposed to NV developed RT titles like Control etc......Techtubers have no problems, forcing a million NV titles down our throats, but one AMD title and they must let you know....
Huh? According to aggregated scores from various sites, the 3090 beats the 6800XT by 10% at 1080p, 13% at 1440p and 20% at 4K. Meaning it would match or perhaps slightly lose to a 6900XT at 1080p, tie it or beat it at 1440p and beat it at 4K.
No card "wallops" the other and the victory certainly isn't AMD's to claim. Now with an OC the 6900XT should at least match and most likely beat the 3090 but thus far, the 3090 is still the fastest GPU on the market.
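
For anyone who wants to check that inference, the arithmetic is just chaining relative deltas. Here's a minimal sketch in Python; the 3090-over-6800XT figures are the aggregate numbers quoted above, while the ~9% 6900XT-over-6800XT uplift is an assumed round number for illustration, not a measured result.

# Minimal sketch: chaining relative performance deltas.
# The 3090-over-6800XT leads are the aggregate figures quoted above;
# the 6900XT-over-6800XT uplift is an assumed ~9%, purely for illustration.

lead_3090_over_6800xt = {"1080p": 0.10, "1440p": 0.13, "4K": 0.20}
assumed_6900xt_over_6800xt = 0.09  # hypothetical uplift, not a measured value

for res, lead in lead_3090_over_6800xt.items():
    # If 3090 = (1 + lead) * 6800XT and 6900XT = (1 + uplift) * 6800XT,
    # then 3090 / 6900XT = (1 + lead) / (1 + uplift).
    ratio = (1 + lead) / (1 + assumed_6900xt_over_6800xt)
    print(f"{res}: 3090 vs hypothetical 6900XT = {100 * (ratio - 1):+.1f}%")

With those assumptions you get roughly +1% at 1080p, +4% at 1440p and +10% at 4K, i.e. the same "match at 1080p, ahead at 1440p and 4K" picture described above.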

Throw in RT and there really isn't an argument anymore.

Not sure where you got your data from.
 

BluRayHiDef

Banned
When Super Resolution debuts, we will compare.......SR delivers more frames than DLSS, so I hope you guys don't backtrack....

You literally just made this up. There's no proof that Super Resolution delivers more frames than DLSS; as your own post implies, it isn't even available yet.
 

Antitype

Member

[…] RX 6900 XT MBA is already at the end of its life even though it has not yet been launched. Just like the RX 6800 MBA and the RX 6800 XT MBA, production has been one shot for this card. A brand told us that they only have about forty cards for France, not one more […] those who manage to have an RX 6900 XT MBA will have a real collector [item] in their hands.
— Aurélien LAGNY, Cowcotland

Fake MSRP confirmed. Turns out the 3080 (FE and low tier AIB) is actually cheaper than the 6800XT while being better across the board and the 6800 no longer makes any sense when you can get a 3080 for around €50 more (not sure about the difference with USD, but it's likely around that too). Going TSMC 7nm was a terrible mistake.
 
Going TSMC 7nm was a terrible mistake.

This has got to be the funniest comment in this thread by far, and that's saying something! Congrats!

Turns out the 3080 (FE and low tier AIB) is actually cheaper than the 6800XT while being better across the board and the 6800 no longer makes any sense when you can get a 3080 for around €50 more (not sure about the difference with USD, but it's likely around that too).

Eh, prices are crazy across the board right now for both AMD and Nvidia, especially in Europe. They will likely remain so until stock becomes readily available for both brands. Most cards at decent prices are not in stock regardless so the quoted price doesn't matter until you can actually readily buy one.

The ones you can actually occasionally buy are crazy for both brands. I or anyone else in the thread could easily quote prices from retailers showing crazy prices for 3000 series cards for example so your attempted victory lap here rings pretty hollow in reality.

As for phasing out the reference models, this is pretty normal for the most part for both AMD and Nvidia. Although I didn't expect them to be phased out so early in Jan/Feb, supply constraints could be a factor here but we can't really know for sure.

As I've been saying for a while, I would advise anyone looking for either AMD or Nvidia GPUs to wait until early next year for the prices to hopefully become sane along with ample supply. We can revisit pricing then once people can actually buy any of these cards 🤷‍♂️
 

Papacheeks

Banned
This has got to be the funniest comment in this thread by far, and that's saying something! Congrats!



Eh, prices are crazy across the board right now for both AMD and Nvidia, especially in Europe. They will likely remain so until stock becomes readily available for both brands. Most cards at decent prices are not in stock regardless so the quoted price doesn't matter until you can actually readily buy one.

The ones you can actually occasionally buy are crazy for both brands. I or anyone else in the thread could easily quote prices from retailers showing crazy prices for 3000 series cards for example so your attempted victory lap here rings pretty hollow in reality.

As for phasing out the reference models, this is pretty normal for the most part for both AMD and Nvidia. Although I didn't expect them to be phased out so early in Jan/Feb, supply constraints could be a factor here but we can't really know for sure.

As I've been saying for a while, I would advise anyone looking for either AMD or Nvidia GPUs to wait until early next year for the prices to hopefully become sane along with ample supply. We can revisit pricing then once people can actually buy any of these cards 🤷‍♂️

Like I SAID a while back, chip manufacturing for memory, GPUs, boards, etc. is all constrained because of COVID in the East. They don't have all facilities open, they have less than half of their staff working, and only some of the factories are automated.

And what sucks is TSMC is the only place in town currently for 7nm that has good yields.
 

Antitype

Member
This has got to be the funniest comment in this thread by far, and that's saying something! Congrats!



Eh, prices are crazy across the board right now for both AMD and Nvidia, especially in Europe. They will likely remain so until stock becomes readily available for both brands. Most cards at decent prices are not in stock regardless so the quoted price doesn't matter until you can actually readily buy one.

The ones you can actually occasionally buy are crazy for both brands. I or anyone else in the thread could easily quote prices from retailers showing crazy prices for 3000 series cards for example so your attempted victory lap here rings pretty hollow in reality.

As for phasing out the reference models, this is pretty normal for the most part for both AMD and Nvidia. Although I didn't expect them to be phased out so early in Jan/Feb, supply constraints could be a factor here but we can't really know for sure.

As I've been saying for a while, I would advise anyone looking for either AMD or Nvidia GPUs to wait until early next year for the prices to hopefully become sane along with ample supply. We can revisit pricing then once people can actually buy any of these cards 🤷‍♂️

How is it funny when it's the reason they have to inflate their margins that much? It's simply not profitable for AMD to produce those desktop RDNA2 chips at MSRP when they make much higher profits per wafer with Zen3 and are contractually obligated to a certain volume of console SOCs and get nice bonuses when they can go beyond.
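
For what it's worth, the per-wafer opportunity-cost argument can be sketched with the standard dies-per-wafer approximation. The die sizes below are approximate public figures, and the revenue-per-die numbers are entirely made-up placeholders, so treat the output as an illustration of the mechanism rather than real economics.

import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    # Common approximation for usable dies on a round wafer (ignores yield).
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Approximate die sizes; the dollar figures are hypothetical placeholders only.
products = {
    "Zen 3 chiplet (~81 mm^2)": (81, 60),    # assumed revenue per good die
    "Navi 21 (~520 mm^2)":      (520, 350),  # assumed revenue per good die
}

for name, (area, revenue_per_die) in products.items():
    n = dies_per_wafer(area)
    print(f"{name}: ~{n} dies/wafer, ~${n * revenue_per_die:,} per wafer")

Whether Zen 3 really nets more per wafer than Navi 21 depends on real ASPs, yields and packaging costs that these placeholders don't capture; the point is only that the comparison boils down to dies-per-wafer times revenue-per-die.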

Prices are not crazy in Europe if you know where to buy. I got an FE for MSRP in November and it's regularly possible to get ASUS at MSRP on their store as well for example. Just follow those discord/twitter/telegram bots.
 

Bolivar687

Banned
How is it funny when it's the reason they have to inflate their margins that much? It's simply not profitable for AMD to produce those desktop RDNA2 chips at MSRP when they make much higher profits per wafer with Zen3 and are contractually obligated to a certain volume of console SOCs and get nice bonuses when they can go beyond.

Prices are not crazy in Europe if you know where to buy. I got an FE for MSRP in November and it's regularly possible to get ASUS at MSRP on their store as well for example. Just follow those discord/twitter/telegram bots.

Saying that using a smaller node was a mistake is insane, to the extent that you should probably go back and edit that out of your post. I don't know what you think that has to do with MSRP when I'm seeing all cards getting sold for the same inflated premiums from retailers and a poster said they were able to get the 6800 xt for much cheaper than a 3080 in their country.

You do make a good point that there is an opportunity cost to making graphics cards when their production could be redirected to the CPU side more profitably.
 
How is it funny when it's the reason they have to inflate their margins that much? It's simply not profitable for AMD to produce those desktop RDNA2 chips at MSRP when they make much higher profits per wafer with Zen3 and are contractually obligated to a certain volume of console SOCs and get nice bonuses when they can go beyond.

Prices are not crazy in Europe if you know where to buy. I got an FE for MSRP in November and it's regularly possible to get ASUS at MSRP on their store as well for example. Just follow those discord/twitter/telegram bots.

It is funny because TSMC is currently the world's leading silicon fabrication company, by a solid margin. They reach node shrinks more quickly than anyone else and the quality of each of their nodes is second to none. In addition, they have matured their 7nm process very well, to the point where there is likely a low failure rate.

The fact that AMD have chosen TSMC 7nm for their GPUs allows them to run cooler and more power efficiently than if they were to use a competing fab. This in turn aids them in achieving high clock speeds. It also allows AMD to fit more transistors in the same die area, which again improves performance.

Just sticking with GPUs for the moment, Samsung is really the only other fabrication option in town. And they have an objectively worse fabrication process/node which is less power efficient, generates more heat and by most reports seems to have higher failure rates than TSMC. Nvidia tried to strong arm TSMC for cheaper prices and more fabrication time which was already promised to other customers (including AMD). TSMC told them to take a hike and Nvidia then went to Samsung who were offering a much cheaper node.

However this small saving in node cost seems to have resulted in the need for extravagant expensive cooling solutions to keep the temperatures under control so the cost ended up likely being more in the end. This inferior Samsung node contributes somewhat to the 3000 series higher power draw, heat and lower efficiency across the board. This in turn leads to lower performance than they could potentially have on a hypothetical TSMC node.

Right now Samsung are having yield issues for the 3000 series, contributing to the supply issues Nvidia is facing. This of course is ignoring the COVID impact across all fabrication right now.

It is very clear from the end result that Nvidia is the one who made a mistake by choosing Samsung for their wafer fabrication. To see an Nvidia fan try to spin this obvious and fairly undisputed fact to try to draw the opposite conclusion is why I found your comment so funny, given its detachment from reality.

Moving back to TSMC/AMD for a moment: even if some hypothetical third silicon fabrication company were available with the exact same cost and performance as TSMC, and AMD moved their entire stack to this non-existent company that was able to produce more wafers, AMD would still be sending most of their wafers to their console obligations; what was left would go to their datacenter CPUs/GPUs, their DIY desktop Ryzen processors and a good chunk towards the laptop market, which is really opening up for AMD right now.

In this hypothetical scenario, would AMD be able to produce more GPUs than they have now? Certainly, but there would probably still be supply issues, as the bulk would still go elsewhere because it makes more business sense. Although I do agree that having a ton more GPUs now would definitely help them grow market share against Nvidia during this critical time in the GPU market, so in this sense you do have a point, but Nvidia is not faring much better with Samsung/supply right now.

However, coming back to reality for a moment, the only other game in town is Samsung, in which case AMD would need to sacrifice some performance, efficiency and clock speed, and accept a higher power draw, all the while still being supply limited by bad yields at Samsung, as we are seeing with Nvidia right now. To blame TSMC rather than market/logistics realities is pretty foolish and shortsighted, especially given all of the advantages they bring.

However more TSMC capacity is opening up now with Apple moving over to 5nm so hopefully supply for AMD across the board in all of their markets will improve.
 

Antitype

Member
Saying that using a smaller node was a mistake is insane, to the extent that you should probably go back and edit that out of your post. I don't know what you think that has to do with MSRP when I'm seeing all cards getting sold for the same inflated premiums from retailers and a poster said they were able to get the 6800 xt for much cheaper than a 3080 in their country.

You do make a good point that there is an opportunity cost to making graphics cards when their production could be redirected to the CPU side more profitably.

You do know AIBs are selling significantly above MSRP due to how much AMD is asking them for the chip, right? It's a similar case to the 3060 Ti, where MSRP is simply not realistic for AIBs. That's a direct result of the TSMC 7nm node being very expensive and supply being extremely constrained until at least 2021 H2 (when it should become possible to book more capacity again). So sure, the node is superior to Samsung's 8nm, but what's the point if in the end it forces you to sell an inferior product at a higher price than the competition? Now, I realize it's not like they have other options, since afaik Nvidia has fully loaded Samsung's fab and there's no other player in town, but it doesn't change the fact that this node is a bad fit for their product when you factor in the price to the consumer.

Now that there aren't any RX cards produced to be sold at MSRP, it's not a matter of being able to score one at that price point or not, like with Nvidia (which, I'm sorry to say, is possible; I did it and so did many others, they just go OoS very fast); you simply no longer have the option with AMD. The actual MSRP is higher for the foreseeable future.
 
The actual BOM cost for these GPUs, including wafer prices, is far lower than the MSRP. There is significant profit being made by both companies on these GPUs.

Simply put they will all charge whatever they can get away with in the market with regards to competitors and supply/demand economics. The wafer cost while important has little bearing on the final price. Both AMD and Nvidia could drop their whole stack by $100 tomorrow and still be making a tidy profit.
 

Bolivar687

Banned
It is funny because TSMC is currently the world's leading silicon fabrication company, by a solid margin. They reach node shrinks more quickly than anyone else and the quality of each of their nodes is second to none. In addition, they have matured their 7nm process very well, to the point where there is likely a low failure rate.

The fact that AMD have chosen TSMC 7nm for their GPUs allows them to run cooler and more power efficiently than if they were to use a competing fab. This in turn aids them in achieving high clock speeds. It also allows AMD to fit more transistors in the same die area, which again improves performance.

Just sticking with GPUs for the moment, Samsung is really the only other fabrication option in town. And they have an objectively worse fabrication process/node which is less power efficient, generates more heat and by most reports seems to have higher failure rates than TSMC. Nvidia tried to strong arm TSMC for cheaper prices and more fabrication time which was already promised to other customers (including AMD). TSMC told them to take a hike and Nvidia then went to Samsung who were offering a much cheaper node.

However this small saving in node cost seems to have resulted in the need for extravagant expensive cooling solutions to keep the temperatures under control so the cost ended up likely being more in the end. This inferior Samsung node contributes somewhat to the 3000 series higher power draw, heat and lower efficiency across the board. This in turn leads to lower performance than they could potentially have on a hypothetical TSMC node.

Right now Samsung are having yield issues for the 3000 series, contributing to the supply issues Nvidia is facing. This of course is ignoring the COVID impact across all fabrication right now.

It is very clear from the end result that Nvidia is the one who made a mistake by choosing Samsung for their wafer fabrication. To see an Nvidia fan try to spin this obvious and fairly undisputed fact to try to draw the opposite conclusion is why I found your comment so funny, given its detachment from reality.

Moving back to TSMC/AMD for a moment: even if some hypothetical third silicon fabrication company were available with the exact same cost and performance as TSMC, and AMD moved their entire stack to this non-existent company that was able to produce more wafers, AMD would still be sending most of their wafers to their console obligations; what was left would go to their datacenter CPUs/GPUs, their DIY desktop Ryzen processors and a good chunk towards the laptop market, which is really opening up for AMD right now.

In this hypothetical scenario, would AMD be able to produce more GPUs than they have now? Certainly, but there would probably still be supply issues, as the bulk would still go elsewhere because it makes more business sense. Although I do agree that having a ton more GPUs now would definitely help them grow market share against Nvidia during this critical time in the GPU market, so in this sense you do have a point, but Nvidia is not faring much better with Samsung/supply right now.

However, coming back to reality for a moment, the only other game in town is Samsung, in which case AMD would need to sacrifice some performance, efficiency and clock speed, and accept a higher power draw, all the while still being supply limited by bad yields at Samsung, as we are seeing with Nvidia right now. To blame TSMC rather than market/logistics realities is pretty foolish and shortsighted, especially given all of the advantages they bring.

However more TSMC capacity is opening up now with Apple moving over to 5nm so hopefully supply for AMD across the board in all of their markets will improve.

 
Now that there aren't any RX cards produced to be sold at MSRP, it's not a matter of being able to score one at that price point or not, like with Nvidia (which, I'm sorry to say, is possible; I did it and so did many others, they just go OoS very fast); you simply no longer have the option with AMD. The actual MSRP is higher for the foreseeable future.

According to Hardware Unboxed they spoke with AMD directly who said they are working with AIBs to enable them to release GPUs for MSRP or close to it. Supposedly this is to happen in 4-8 weeks. Could be nonsense but all we have to go on so far.

I don't know if this means a price reduction for all AIB cards or simply AIBs launching lower tier/entry level cards for MSRP while keeping the premium stuff priced high.
 

Antitype

Member

I already replied to most of that stuff in my previous post, but just going to address this part:

"To see an Nvidia fan try to spin this obvious and fairly undisputed fact to try to draw the opposite conclusion is why I found your comment so funny, given its detachment from reality."

That always cracks me up when basically you conflate people buying the best products with Nvidia fans. If AMD made a better GPU I'd buy one in a heartbeat. I'm a fan of good products, so obviously I'm not going to go with the worse one when it's more expensive on top of it.
 
I already replied to most of that stuff in my previous post, but just going to address this part:

"To see an Nvidia fan try to spin this obvious and fairly undisputed fact to try to draw the opposite conclusion is why I found your comment so funny, given its detachment from reality."

That always cracks me up when basically you conflate people buying the best products with Nvidia fans. If AMD made a better GPU I'd buy one in a heartbeat. I'm a fan of good products, so obviously I'm not going to go with the worse one when it's more expensive on top of it.

To be clear, I didn't say you were a fanboy, I know you are not claiming I did but just wanted to clarify.

But do I think you are a fan? As in someone who generally prefers Nvidia? Judging by your posts almost certainly. Nothing necessarily wrong with that as everyone has a preference for various things in life including brands but worth pointing out during your silly TSMC was a mistake comment.
 

Bolivar687

Banned
I already replied to most of that stuff in my previous post, but just going to address this part:

"To see an Nvidia fan try to spin this obvious and fairly undisputed fact to try to draw the opposite conclusion is why I found your comment so funny, given its detachment from reality."

That always cracks me up when basically you conflate people buying the best products with Nvidia fans. If AMD made a better GPU I'd buy one in a heartbeat. I'm a fan of good products, so obviously I'm not going to go with the worse one when it's more expensive on top of it.

It's not a worse product, nor is it more expensive. It outperforms Nvidia's equivalents in various games and resolutions, and the cards have more VRAM and better efficiency, overclocking, and frame pacing. I guess you guys have nothing left at this point but to keep saying things that are not true, but it's really hurting the quality of the thread and the forum if you're just going to straight-up lie. I believe you when you tell me you and others managed to get your cards at MSRP, but you are extrapolating the current supply-constrained environment to definitively declare the 6000 series will forever and always be more expensive than the 3000 cards. Do you really not see how insane you sound?

I'm also a fan of good products and always buy the best value - that's why I have been using an Nvidia + Intel set up for years now. But saying "7nm was a mistake" is the gold standard for just how crazy Nvidia fans really are.
 

Ascend

Member
Looks like reference designs are back on the table after all.


I guess this shows that AMD really has a lot to learn about their user base... They really have no idea what they're doing marketing-wise. That's like most companies actually, but AMD always have a way of showing it lol.

In other news...;

Radeon Software Adrenalin 2020 Edition 20.12.1 Highlights
Support For
Radeon™ RX 6900 Series

Cyberpunk 2077™

I wonder if this brought any performance improvements... Most likely not much, if any; otherwise they would likely have mentioned a specific percentage for the performance increase.
 

Antitype

Member
It's not a worse product, nor is it more expensive. It outperforms Nvidia's equivalents in various games and resolutions, and the cards have more VRAM and better efficiency, overclocking, and frame pacing. I guess you guys have nothing left at this point but to keep saying things that are not true, but it's really hurting the quality of the thread and the forum if you're just going to straight-up lie. I believe you when you tell me you and others managed to get your cards at MSRP, but you are extrapolating the current supply-constrained environment to definitively declare the 6000 series will forever and always be more expensive than the 3000 cards. Do you really not see how insane you sound?

I'm also a fan of good products and always buy the best value - that's why I have been using an Nvidia + Intel set up for years now. But saying "7nm was a mistake" is the gold standard for just how crazy Nvidia fans really are.

Review aggregates have the RTX 3080 and 3090 above their AMD counterparts. It's more arguable in the case of the 3090 since it's also a lot more expensive. But the 3080 is superior on all fronts to the 6800XT. It's already been discussed ad nauseam in this thread so I'm not going to list again all the numerous advantages of the Nvidia side. Even the VRAM argument falls flat when you see how it behaves at 4K. The heavier the workload, the more it falls behind GDDR6X. So while a larger buffer sounds good on paper, in practice the bandwidth is just too slow to make a difference. And with sampler feedback + DirectStorage, VRAM usage is not going to balloon when we move past cross-gen.

If going with something that forces your hand to sell an inferior product for more than the better competing product is not a mistake, then I don't know what is. It's not like they even offer any proprietary tech that could make them the better choice for a certain demographic. They have nothing to offer other than raw performance, and unfortunately for them they are behind in that regard. So with the current situation at TSMC you have a worse product that you are forced to sell at the same or higher price than the competition. That's how you go bankrupt in a free market. Or you deliberately produce low amounts of that product and focus on where you currently have an unbeatable product, and with much higher margins at that (Zen3). They're not going to lower their margins until supply is no longer constrained, and that's not going to happen for around 6 months according to reports that have TSMC fully loaded till H2. Now, they could lower their margins if they wanted, but they would be losing money, deliberately lowering their profits. That would surely please their customers, but less so their shareholders. Don't believe them when they say supply will be normalized in 4 weeks. They already lied about the launch stock situation, and it's perfectly understandable why they barely produce any desktop RDNA2 for the time being, as I already said.

Looks like reference designs are back on the table after all.



Sounds like damage control to me. So they suddenly decided today after the news got out that the MSRP was fake, that they would indefinitely continue producing their reference model? Sure...
 

Bolivar687

Banned
Review aggregates have the RTX 3080 and 3090 above their AMD counterparts. It's more arguable in the case of the 3090 since it's also a lot more expensive. But the 3080 is superior on all fronts to the 6800XT. It's already been discussed ad nauseam in this thread so I'm not going to list again all the numerous advantages of the Nvidia side. Even the VRAM argument falls flat when you see how it behaves at 4K. The heavier the workload, the more it falls behind GDDR6X. So while a larger buffer sounds good on paper, in practice the bandwidth is just too slow to make a difference. And with sampler feedback + DirectStorage, VRAM usage is not going to balloon when we move past cross-gen.

If going with something that forces your hand to sell an inferior product for more than the better competing product is not a mistake, then I don't know what is. It's not like they even offer any proprietary tech that could make them the better choice for a certain demographic. They have nothing to offer other than raw performance, and unfortunately for them they are behind in that regard. So with the current situation at TSMC you have a worse product that you are forced to sell at the same or higher price than the competition. That's how you go bankrupt in a free market. Or you deliberately produce low amounts of that product and focus on where you currently have an unbeatable product, and with much higher margins at that (Zen3). They're not going to lower their margins until supply is no longer constrained, and that's not going to happen for around 6 months according to reports that have TSMC fully loaded till H2. Now, they could lower their margins if they wanted, but they would be losing money, deliberately lowering their profits. That would surely please their customers, but less so their shareholders. Don't believe them when they say supply will be normalized in 4 weeks. They already lied about the launch stock situation, and it's perfectly understandable why they barely produce any desktop RDNA2 for the time being, as I already said.



Sounds like damage control to me. So they suddenly decided today after the news got out that the MSRP was fake, that they would indefinitely continue producing their reference model? Sure...

And we've likewise discussed ad nauseam the problem with those aggregates: when you average the percentages from sites that tested 18+ games showing AMD superior at 1440p together with the percentages from sites testing 8 games showing Nvidia with an advantage, you're pretending that an aggregated figure across widely different sample sizes is meaningful.

It's not.
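
A toy illustration of the sample-size point, with entirely made-up per-game numbers: averaging the two sites' headline percentages gives an 8-game suite the same weight as an 18-game one, while pooling the individual game results (or weighting by game count) can flip the conclusion.

from statistics import geometric_mean  # Python 3.8+

# Hypothetical per-game ratios: AMD fps divided by Nvidia fps (>1 means AMD ahead).
site_a = [1.05] * 18  # a site testing 18 games, AMD ~5% ahead in each
site_b = [0.94] * 8   # a site testing 8 games, Nvidia ~6% ahead in each

# Unweighted average of the two sites' headline numbers:
site_average = geometric_mean([geometric_mean(site_a), geometric_mean(site_b)])
print(f"per-site average: {site_average:.3f}")  # ~0.993 -> "Nvidia ahead"

# Pooling all 26 individual game results instead:
pooled = geometric_mean(site_a + site_b)
print(f"pooled over games: {pooled:.3f}")       # ~1.015 -> "AMD ahead"

The numbers are invented; the only point is that "averaged across sites" and "averaged across games" are different statistics.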

We're still waiting for you to demonstrate why 7nm is making things more expensive beyond your own unsupported conjecture, let alone how you think the product with superior performance per watt is somehow the inferior product. Again, how can you tell us that AMD is alone with a unique price problem when retailers are charging the same $150 premium on the AIB models for both brands? Even if you finally started providing concrete evidence (which, of course, you haven't), do you seriously think that this supply-constrained market is indicative of what future prices will be when availability normalizes?

Of course you don't. This is shit posting.
 
Last edited:

Antitype

Member
And we've likewise discussed ad nauseam the problem with those aggregates: when you average the percentages from sites that tested 18+ games showing AMD superior at 1440p together with the percentages from sites testing 8 games showing Nvidia with an advantage, you're pretending that an aggregated figure across widely different sample sizes is meaningful.

It's not.

We're still waiting for you to demonstrate why 7nm is making things more expensive beyond your own unsupported conjecture, let alone how you think the product with superior performance per watt is somehow the inferior product. Again, how can you tell us that AMD is alone with a unique price problem when retailers are charging the same $150 premium on the AIB models for both brands? Even if you finally started providing concrete evidence (which, of course, you haven't), do you seriously think that this supply-constrained market is indicative of what future prices will be when availability normalizes?

Of course you don't. This is shit posting.

Even if you dislike aggregates, most reviews have Nvidia ahead. (btw no need to repost those 2-3 reviews where AMD is better, we've all seen them enough already).

We know AIB could not sell their GPU for MSRP. The reason cited was how expensive the RDNA2 chip is. The RDNA2 chip is produced at TSMC 7nm. A node that is so supply constrained that prices skyrocketed. We know that because it's #1 common sense when demand exceeds supply and #2 the reason Nvidia went with Samsung 8nm for their consumer GPUs.

All Nvidia AIB sell GPUs starting at Founder's Edition MSRP. AMD AIB do not. Again, unrealistic MSRP as far as they're concerned.

Retailers price gouging is another story and it's true at the moment prices are whack in a lot of places, but you can still find RTX30s at MSRP if you look around. You can't with AMD because there's nothing at MSRP and also because there's no stock whatsoever.

If you still don't get it, I give up.
 

thelastword

Banned



Pretty much sums up the situation with reviews at the moment.....

I can trust Hardware Unboxed since he uses the latest games, tests on the latest hardware and drivers, and redoes tests every time he benches.......Take for example AC Odyssey, which has received a huge upgrade in perf on AMD GPUs lately...What you get there is the latest....


Funny that some of these guys test Ghost Recon, but no one is testing the updated Fortnite, which is performing better on AMD; yet when Fortnite was better on Nvidia by a wide margin, it was in every test.......JayzTwoCents did not test Dirt, Watch Dogs, Godfall, RiftBreaker, AC Valhalla, Horizon Zero Dawn, not even Doom Eternal, DMC or Resident Evil 3.....This is just shameless...
 

Buggy Loop

Member



Fake MSRP confirmed. Turns out the 3080 (FE and low tier AIB) is actually cheaper than the 6800XT while being better across the board and the 6800 no longer makes any sense when you can get a 3080 for around €50 more (not sure about the difference with USD, but it's likely around that too). Going TSMC 7nm was a terrible mistake.

I won't consider the 3090 in this discussion because I don't care about Blender machines.

The 3080 is truly the best chip now with the real MSRPs.

Imagine for a moment that rasterization was not even their focus here, because this brute-forcing way of doing things will be gone. We can't see or understand what's coming yet, but no matter: they managed to dedicate a ton of ML crunching power with the tensor cores, plus refined RT and rasterization, and still fought and mostly won in rasterization, even at lower resolutions on average, as we've seen from data aggregations.

From Turing's 89 TOPS to 238 TOPS (Tensor OPs), for what? What are they going to do with it? It's not going to be used solely for DLSS 2.1; for that, Turing was fine enough. Perhaps a DLSS 3.0? Moving the denoiser to tensor cores for games? NPC AI? Game physics predicted by AI thousands of times faster than traditional methods, AI texture upscaling on the fly, automatic lip-sync based on language, lifelike facial animations and models?

I mean, who knows. But to think that with all that silicon dedicated to ML and to well-performing ray tracing (an ASIC implementation takes more silicon, more complexity), it still held its own in rasterization... boggles my mind.
 
I won't consider the 3090 in this discussion because I don't care about Blender machines.

The 3080 is truly the best chip now with the real MSRPs.

Imagine for a moment that rasterization was not even their focus here, because this brute-forcing way of doing things will be gone. We can't see or understand what's coming yet, but no matter: they managed to dedicate a ton of ML crunching power with the tensor cores, plus refined RT and rasterization, and still fought and mostly won in rasterization, even at lower resolutions on average, as we've seen from data aggregations.

From Turing's 89 TOPS to 238 TOPS (Tensor OPs), for what? What are they going to do with it? It's not going to be used solely for DLSS 2.1; for that, Turing was fine enough. Perhaps a DLSS 3.0? Moving the denoiser to tensor cores for games? NPC AI? Game physics predicted by AI thousands of times faster than traditional methods, AI texture upscaling on the fly, automatic lip-sync based on language, lifelike facial animations and models?

I mean, who knows. But to think that with all that silicon dedicated to ML and to well-performing ray tracing (an ASIC implementation takes more silicon, more complexity), it still held its own in rasterization... boggles my mind.


AMD did exactly what this AI researcher said over here:


"nVidia understands what is coming with next-gen game engines, and they’ve taken a very forward-thinking approach with the Ampere architecture. If I saw a competing card released tomorrow which heavily outperformed the GeForce 3080 in current-gen games, it would actually set my alarm bells ringing, because it would mean that the competing GPU’s silicon has been over-allocated to serving yesterday’s needs. I have a feeling that if nVidia wasn’t so concerned with prizing 1080 Ti’s out of our cold, dead hands, then they would have bothered even less with competing head-to-head with older cards in rasterization performance. "


AMD did to the letter what this guy was saying months before AMD released the cards. They went all-in on rasterisation and ignored the actual next-gen features that they themselves put into the next-gen consoles and which Nvidia opened the door to two years ago. They released a GPU that's very fast in games of old.

Right now, 3080 owners are going to have a never-before-seen visual experience in Cyberpunk, while AMD owners who paid $800 to $1,000 for their cards are unable to use the next-gen features. Even after RT is enabled on Radeon cards, the performance isn't there to use it.
 
I won't consider the 3090 in this discussion because I don't care about Blender machines.

The 3080 is truly the best chip now with the real MSRPs.

Imagine for a moment that rasterization was not even their focus here, because this brute-forcing way of doing things will be gone. We can't see or understand what's coming yet, but no matter: they managed to dedicate a ton of ML crunching power with the tensor cores, plus refined RT and rasterization, and still fought and mostly won in rasterization, even at lower resolutions on average, as we've seen from data aggregations.

From Turing's 89 TOPS to 238 TOPS (Tensor OPs), for what? What are they going to do with it? It's not going to be used solely for DLSS 2.1; for that, Turing was fine enough. Perhaps a DLSS 3.0? Moving the denoiser to tensor cores for games? NPC AI? Game physics predicted by AI thousands of times faster than traditional methods, AI texture upscaling on the fly, automatic lip-sync based on language, lifelike facial animations and models?

I mean, who knows. But to think that with all that silicon dedicated to ML and to well-performing ray tracing (an ASIC implementation takes more silicon, more complexity), it still held its own in rasterization... boggles my mind.

Hey man, how's it going?

This is the AMD Radeon RX6000 review/benchmark thread. If you want to advertise for Nvidia or otherwise cream yourself about the 3080 there is a 3080 OT for that.

I'm assuming you accidentally clicked on the wrong thread when you posted this so I'm happy to direct you to the right place.

BTW that leather jacket looks stunning on you, really suits you bro! :)
 

Buggy Loop

Member
Hey man, how's it going?

This is the AMD Radeon RX6000 review/benchmark thread. If you want to advertise for Nvidia or otherwise cream yourself about the 3080 there is a 3080 OT for that.

I'm assuming you accidentally clicked on the wrong thread when you posted this so I'm happy to direct you to the right place.

BTW that leather jacket looks stunning on you, really suits you bro! :)

I was ATI/AMD from 1994 (even 2D cards) to 2016, when I picked up a 1060. After my first Pentium, I was always on AMD processors, even during the rough Phenom II times. So yeah... Were you even born back then? Did you even know what a GPU was?

The difference between me and you is that this is not my first AMD rodeo of deflated hype. I've been warning people not to hype themselves up ever since the rumours. It's again a « poor Volta » situation. As much as I like AMD, they always stumble here and there, in marketing, in price, in drivers...

Nvidia is guilty of fake MSRPs too, like during Turing, but this time it seems to hold since even a few AIBs meet MSRP.

Bring that warrior meme-ing to someone else.
 

Buggy Loop

Member
And engineering-wise, if you don't focus too much on ray tracing or machine learning, that's fine, no really, I don't care; there's a market for that and a place for it. But then I expect that the silicon area, on a better foundry node, should fucking punch the competition Mike Tyson style in rasterization.

I’m not sure how Nvidia is even standing in the ring after this. Honest.
 
I was ATI/AMD from 1994 (even 2D cards) to 2016, when I picked up a 1060. After my first Pentium, I was always on AMD processors, even during the rough Phenom II times. So yeah... Were you even born back then? Did you even know what a GPU was?

The difference between me and you is that this is not my first AMD rodeo of deflated hype. I've been warning people not to hype themselves up ever since the rumours. It's again a « poor Volta » situation. As much as I like AMD, they always stumble here and there, in marketing, in price, in drivers...

Nvidia is guilty of fake MSRPs too, like during Turing, but this time it seems to hold since even a few AIBs meet MSRP.

Bring that warrior meme-ing to someone else.

I don't have any kind of problem or history with you, but you've randomly posted out of the blue, not really related to any ongoing conversation, and your post literally reads like an advertisement for Nvidia.

I don't have a problem with your opinion in and of itself. In fact I think the 3080 is a fantastic card, I've never said otherwise. But you've come into a Radeon thread out of the blue and essentially posted about how amazing you find the 3080 to be, which is completely unprompted and nothing to do with the topic at hand.

Either you accidentally posted in the wrong thread and you meant for your post to be in the 3000 series OT or you are randomly shit posting in a Radeon thread.

Btw I'm 36 and have been around the PC space for a while, spare me the "I'm not a fanboy" dance.

We see it time and time again from shit posters who coincidentally always seem to only post positively about Nvidia and negatively about AMD and mostly in AMD related threads.

If we took fanboys at their word we would have to assume that none exist because every one I've encountered swears they are not a fanboy so....
 
Last edited:
D

Deleted member 17706

Unconfirmed Member
Honestly looks like the bog standard 6800 is really the only "smart" choice out of this lineup, sadly.
 

Buggy Loop

Member
I don't have any kind of problem or history with you, but you've randomly posted out of the blue, not really related to any ongoing conversation, and your post literally reads like an advertisement for Nvidia.

I don't have a problem with your opinion in and of itself. In fact I think the 3080 is a fantastic card, I've never said otherwise. But you've come into a Radeon thread out of the blue and essentially posted about how amazing you find the 3080 to be, which is completely unprompted and nothing to do with the topic at hand.

Either you accidentally posted in the wrong thread and you meant for your post to be in the 3000 series OT or you are randomly shit posting in a Radeon thread.

Btw I'm 36 and have been around the PC space for a while, spare me the "I'm not a fanboy" dance.

We see it time and time again from shit posters who coincidentally always seem to only post positively about Nvidia and negatively about AMD and mostly in AMD related threads.

If we took fanboys at their word we would have to assume that none exist because every one I've encountered swears they are not a fanboy so....

Right, I'll see myself out to the other thread. I'm just flabbergasted: AMD gave out a reference card with an MSRP, which every reviewer then compared against Nvidia. Then they pull it, and no AIBs are at MSRP. I've seen some shit tactics in the past, but this one is a bit too obvious.
 

tusharngf

Member



Pretty much sums up the situation with reviews at the moment.....

I can trust Hardware Unboxed since he uses the latest games, tests on the latest hardware and drivers, and redoes tests every time he benches.......Take for example AC Odyssey, which has received a huge upgrade in perf on AMD GPUs lately...What you get there is the latest....


Funny that some of these guys test Ghost Recon, but no one is testing the updated Fortnite, which is performing better on AMD; yet when Fortnite was better on Nvidia by a wide margin, it was in every test.......JayzTwoCents did not test Dirt, Watch Dogs, Godfall, RiftBreaker, AC Valhalla, Horizon Zero Dawn, not even Doom Eternal, DMC or Resident Evil 3.....This is just shameless...


Not to mention he always talks to his cameraman and makes silly jokes that no one laughs at. I have yet to see this cameraman guy get even a full minute in his videos.
 
Last edited:

Bolivar687

Banned
I agree that Jay has some incredibly obscure game choices, but I thought he was overall fair to the 6000 series considering his past perceived bias, which is something he kind of acknowledged in the 6900 XT review. He did show the flaws of the 6800 XT but had a pretty positive conclusion overall. He praised the hardware in the 6900 XT while saying he has reservations about the software, and, you have to admit, he was getting some really weird results. But he also says in that video that he thinks the 6800 is the best overall value on the market right now, for approaching 3080 performance at a much lower price and handily outclassing the 3070.
 
Last edited:

thelastword

Banned
Not to mention he always talks to his cameraman and makes silly jokes that no one laughs at. I have yet to see this cameraman guy get even a full minute in his videos.
Tbf, I like Jay, I don't think he is a bad YouTuber at heart, but he needed to be called out on this.....It's too blatant. I know he prefers Nvidia, but to say the value proposition of the $1,000 AMD card is worse than that of the $1,500 NV card is total bollocks; if you check, in most benches the 3090 is barely more than 5% better than the 3080, but in most benches the 6900XT is over 11% better than the 6800XT, and in some cases much more....
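
Taking the uplift figures in that post at face value and the launch MSRPs cited in the thread ($649/$999 for the 6800XT/6900XT, $699/$1,499 for the 3080/3090), the step-up value argument works out roughly like this; street prices will obviously shift the numbers.

# Step-up value sketch: extra raster performance per extra dollar, using the uplift
# figures claimed above and the launch MSRPs cited in the thread (not street prices).
pairs = {
    # name: (claimed uplift over the sibling card, sibling MSRP $, halo MSRP $)
    "6900 XT over 6800 XT": (0.11, 649, 999),   # "over 11% better"
    "3090 over 3080":       (0.05, 699, 1499),  # "barely more than 5% better"
}

for name, (uplift, base_price, halo_price) in pairs.items():
    extra = halo_price - base_price
    per_100 = 100 * uplift * 100 / extra  # extra performance (in %) per extra $100
    print(f"{name}: +{uplift:.0%} for +${extra} (~{per_100:.2f}% per extra $100)")

With those inputs the 6900 XT buys roughly 3% of extra performance per additional $100 versus under 1% for the 3090; whether the underlying uplift figures are right is exactly what the benchmarks above dispute.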

Even in the Ghost Recon game he is talking about, the 4K performance of the AMD cards is better than Nvidia's....Yet generally, in most tests on the net with an aggregate of the latest games, AMD has the superior 1440p results and falls back at 4K, yet we saw many tests in Coreteks' video or Hardware Unboxed's video where AMD is ahead at both 1440p and 4K. So clearly these guys have to bench more games, bench more of the latest games.....So many synthetics; seriously, Unigine, Port Royal, who cares, most of these were made with NV in mind....Bench games....2019-2020
 

Ascend

Member
It's always funny how the AMD haters always claim they were AMD fans in the past... As if that would justify their bigotry....

In any case... Jay normally swings strongly to one side, even when he pretends to be neutral. He simply switches sides when he feels like it. He has entertaining content, but is definitely not my main go-to source for benchmarks. That's still Hardware Unboxed.
 
It's always funny how the AMD haters always claim they were AMD fans in the past... As if that would justify their bigotry....

In any case... Jay normally swings strongly to one side, even when he pretends to be neutral. He simply switches sides when he feels like it. He has entertaining content, but is definitely not my main go-to source for benchmarks. That's still Hardware Unboxed.

Slow your roll, bro. Bigotry? Is that the right word?
 

BluRayHiDef

Banned
Here's the most reasonable assessment of RDNA 2 vs Ampere:

Depending on the reviewer and/ or suite of games tested, RDNA 2 may be faster on average than Ampere in rasterization - or Ampere may be faster on average than RDNA 2 in rasterization; however, whenever one architecture is faster than the other in rasterization, typically it is barely so - meaning that the difference in rasterization is negligible.

Hence, what truly separates one architecture from the other are additional rendering technologies, namely artificially intelligent upscaling and ray tracing. As we all know, Ampere blows RDNA 2 out of the water in regard to these two technologies.

As has been revealed, graphics cards need artificially intelligent upscaling to run the latest triple-A title, Cyberpunk 2077, with ray tracing enabled in combination with high settings. Hence, Ampere is the better architecture as it strikes a great balance between rasterization, ray tracing, and artificially intelligent upscaling (the last of which RDNA 2 does not currently have).
 

ZywyPL

Banned
Here's the most reasonable assessment of RDNA 2 vs Ampere:

Depending on the reviewer and/ or suite of games tested, RDNA 2 may be faster on average than Ampere in rasterization - or Ampere may be faster on average than RDNA 2 in rasterization; however, whenever one architecture is faster than the other in rasterization, typically it is barely so - meaning that the difference in rasterization is negligible.

Hence, what truly separates one architecture from the other are additional rendering technologies, namely artificially intelligent upscaling and ray tracing. As we all know, Ampere blows RDNA 2 out of the water in regard to these two technologies.

As has been revealed, graphics cards need artificially intelligent upscaling to run the latest triple-A title, Cyberpunk 2077, with ray tracing enabled in combination with high settings. Hence, Ampere is the better architecture as it strikes a great balance between rasterization, ray tracing, and artificially intelligent upscaling (the last of which RDNA 2 does not currently have).


Yup, pretty much this. IMO AMD should've stuck with 8GB VRAM and undercut NV by $150-200; this would've put the RDNA2 cards in the same position as the 5700 vs Turing back in the day, where the lacking features/RT performance was reflected in the price, and getting a $500-550 6800/XT to play without RT, or with RT but only at 1080p, would have made much more sense.
 

BluRayHiDef

Banned
Yup, pretty much this. IMO AMD should've stuck with 8GB VRAM and undercut NV by $150-200; this would've put the RDNA2 cards in the same position as the 5700 vs Turing back in the day, where the lacking features/RT performance was reflected in the price, and getting a $500-550 6800/XT to play without RT, or with RT but only at 1080p, would have made much more sense.
I don't think that RDNA 2's pricing is problematic for AMD, because demand for it is through the roof anyway.
 

Bolivar687

Banned
Here's the most reasonable assessment of RDNA 2 vs Ampere:

Depending on the reviewer and/ or suite of games tested, RDNA 2 may be faster on average than Ampere in rasterization - or Ampere may be faster on average than RDNA 2 in rasterization; however, whenever one architecture is faster than the other in rasterization, typically it is barely so - meaning that the difference in rasterization is negligible.

Hence, what truly separates one architecture from the other are additional rendering technologies, namely artificially intelligent upscaling and ray tracing. As we all know, Ampere blows RDNA 2 out of the water in regard to these two technologies.

As has been revealed, graphics cards need artificially intelligent upscaling to run the latest triple-A title, Cyberpunk 2077, with ray tracing enabled in combination with high settings. Hence, Ampere is the better architecture as it strikes a great balance between rasterization, ray tracing, and artificially intelligent upscaling (the last of which RDNA 2 does not currently have).

That assessment is more accurate of RDNA1 vs Turing, at the end of the console generation when Nvidia was the only game in town for these two new technologies.

We're now at the beginning of a new console generation with more developers implementing ray tracing, and we now have recently released games where RDNA2 is leading Ampere in RT. However much Nvidia fans want to trivialize that away, these will not be the last games where we see this happen. There are also other big titles like Call of Duty where the implementation is moderate and, while Nvidia has the advantage, the performance hit is manageable and RDNA2 still provides a high-framerate experience with RT enabled. As the generation matures, that is going to become the norm, as opposed to Control and Minecraft, since RT will be implemented around the consoles, with AMD's incoming upscaling solution neutralizing much of the remaining advantage.

I don’t think you can go wrong with either brand and, as almost every review has said, it really comes down to a number of preferences.
 
Last edited:
Yup, pretty much this. IMO AMD should've stuck with 8GB VRAM and undercut NV by $150-200; this would've put the RDNA2 cards in the same position as the 5700 vs Turing back in the day, where the lacking features/RT performance was reflected in the price, and getting a $500-550 6800/XT to play without RT, or with RT but only at 1080p, would have made much more sense.

I strongly disagree on the VRAM; releasing a high-end GPU with only 8GB of VRAM in 2020 would have been a terrible move. You could even argue that the 3080 doesn't have enough VRAM with only 10GB for a high-end 2020 GPU. Not really looking to get into an allocated-vs-used memory discussion, as that's been done to death, but once the 3070 Ti launches with 16GB, the 3080 Ti launches with 20GB and Nvidia switches focus to those, I assume a lot of people will suddenly change their tune on VRAM amounts. In simple terms, having more VRAM is always better than having less.

In terms of pricing, in an ideal world where there were no shortages and there was ample supply of both AMD/Nvidia GPUs, would the 6800XT be more competitive at for example $600 rather than $650? Sure, definitely but right now they are selling everything they can make as demand is through the roof. They will likely continue selling everything they can make for at least the next few months, that includes when supply ramps up. They will have no reason to price drop for a long time if ever as people have shown that they find the cards worth the asking price.

Regarding RT, at 4K native it is essentially unplayable on any card from either AMD or Nvidia. To make it viable, Nvidia uses DLSS, which is great tech; AMD is working on their FidelityFX Super Resolution tech, so hopefully that will be ready sometime early next year. Once that is ready, RT should be fairly playable on these cards, similarly to how DLSS makes it playable on Nvidia cards. In addition, at 1440p the AMD sponsored titles all run at playable framerates right now with RT enabled, plus a bunch of their non-sponsored titles do too.

Granted, that is not available just yet, so we can't currently factor it into the existing value proposition, but we know it is coming next year. Once it is available, the only real gaming advantage the 3080 would have will be more performant RT, but RT should be perfectly playable on both cards in 90% of cases.
 

BluRayHiDef

Banned
That assessment is more accurate of RDNA1 vs Turing, at the end of the console generation when Nvidia was the only game in town for these two new technologies.

We're now at the beginning of a new console generation with more developers implementing ray tracing, and we now have recently released games where RDNA2 is leading Ampere in RT. However much Nvidia fans want to trivialize that away, these will not be the last games where we see this happen. There are also other big titles like Call of Duty where the implementation is moderate and, while Nvidia has the advantage, the performance hit is manageable and RDNA2 still provides a solid experience. As the console generation progresses, that is going to become the norm, as opposed to Control and Minecraft, with AMD's incoming upscaling solution neutralizing much of the remaining advantage.

I don’t think you can go wrong with either brand and, as almost every review has said, it really comes down to a number of preferences.

RDNA 2 beats Ampere in ray tracing only in games optimized for RDNA 2, and even then only in ray-traced shadows. Furthermore, there is proof that in such games, RDNA 2 renders with worse image quality than Ampere.

 
That assessment is more accurate of RDNA1 vs Turing, at the end of the console generation when Nvidia was the only game in town for these two new technologies.

We're now at the beginning of a new console generation with more developers implementing ray tracing, and we now have recently released games where RDNA2 is leading Ampere in RT. However much Nvidia fans want to trivialize that away, these will not be the last games where we see this happen. There are also other big titles like Call of Duty where the implementation is moderate and, while Nvidia has the advantage, the performance hit is manageable and RDNA2 still provides a solid experience. As the console generation progresses, that is going to become the norm, as opposed to Control and Minecraft, with AMD's incoming upscaling solution neutralizing much of the remaining advantage.

I don’t think you can go wrong with either brand and, as almost every review has said, it really comes down to a number of preferences.


The AMD-sponsored titles that run better on RDNA are around two games as far as I know? Dirt, which renders differently on Nvidia vs Radeon, and World of Warcraft? RDNA 2 is so poor at RT that it can mostly only support one feature, the lightest one: shadows. And the implementation is so sparse that reviewers constantly say they can't see a difference, like in Dirt 5. Then in Godfall, you have outright better image quality with RT off than with it turned on.

Then you have another AMD-sponsored game, Riftbreaker, that actually runs far better on Ampere, simply because the RT performance gap is so massive, even though it's an AMD title.

What will actually likely happen in the future is that you will have the low-end RT implementation for consoles and RDNA2 PC GPUs, and multiple levels for GeForce cards, like we have in Cyberpunk now. That will become the norm. Nvidia is going to continue to push for RT; it's not going to stop, and AMD somehow will dominate the market?
 
Last edited: