
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

M1chl

Currently Gif and Meme Champion
Tessellation was also hardware agnostic. But nVidia gimped their own cards with stuff like Hairworks to make AMD run slower. But apparently that's more than fine. 🤷‍♂️

You've all said your piece. You all think nVidia is better. Good for you. Go hang out in the RTX thread, and leave this thread for the ones that are actually interested in Radeon, rather than interested in bashing Radeon.
Ehh what?
 

BluRayHiDef

Banned
There are only 20 known RT titles, including those in the works.
There is only a handful of games supporting NV's DLSS upscaling.
So, "yes" is rather an optimistic take.

Current And Upcoming Titles That [Will] Support Ray Tracing And/Or DLSS:

Source #1:

01. Amid Evil
02. Battlefield V
03. Bright Memory
04. Call of Duty: Modern Warfare (2019)
05. Control
06. Crysis Remastered
07. Deliver Us The Moon
08. Fortnite
09. Ghostrunner
10. Justice
11. Mechwarrior V: Mercenaries
12. Metro Exodus
13. Minecraft
14. Moonlight Blade
15. Pumpkin Jack
16. Quake II RTX
17. Shadow of the Tomb Raider
18. Stay in the Light
19. Watch Dogs: Legion
20. Wolfenstein: Youngblood

Source #2:

21. Call of Duty: Black Ops Cold War
22. Cyberpunk 2077
23. Mount and Blade II: Bannerlord
24. World of Warcraft
25. Xuan-Yuan VII
26. Edge of Eternity
27. Ready or Not
28. Enlisted

Source #3:

29. War Thunder

Source #4:

30. JX3
31. Ring of Elysium
32. Atomic Heart
33. Boundary
34. Convallaria
35. Doom Eternal
36. Dying Light 2
37. F.I.S.T. Forged In Shadow Torch
38. Grimstarr
39. Project X
40. Sword and Fairy 7
41. Synced: Off Planet
42. Vampire: The Masquerade: Bloodlines 2
 

psorcerer

Banned
So they gimped their own cards to own the AMD?

That seems like an exaggeration.
I would say their implementation was not optimized well and relied on a lot of brute force in the wrong places.
Which is pretty "standard" for any vendor-specific stuff (be it NV, AMD, or Intel).
 

Ascend

Member
So they gimped their own cards to own the AMD? That does not really make sense. And yeah, I remember HairWorks being really intensive, but it looked nice.
AMD was weaker at tessellation than nVidia. So what did nVidia do? They used tessellation x64 by default to tank performance on AMD cards. Their cards ran slower than necessary as well, but obviously had a lower performance hit than AMD.
It's one of the main reasons AMD added a tessellation override in their driver, so you could manually put it at x8 or x16, so the performance doesn't tank more than it needs to. nVidia had no such option.
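(A toy sketch of what that driver override conceptually does; this assumes nothing about AMD's actual driver internals, it just shows the clamping idea:)

```cpp
#include <algorithm>

// Hypothetical illustration, not AMD driver code: the app requests a
// tessellation factor (often x64 in HairWorks-era titles) and the driver
// override clamps it to the user's cap (e.g. x8 or x16) before it reaches
// the hardware. D3D11 tessellation factors are limited to [1, 64] anyway.
float ApplyTessOverride(float appFactor, float userCap)
{
    return std::clamp(std::min(appFactor, userCap), 1.0f, 64.0f);
}
```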

[Image: tessellation factor performance comparison]


There's a reason any reasonable reviewer turns off stuff like hairworks in their benchmarks.

I guess it's time for this video again;

 

regawdless

Banned
Current And Upcoming Titles That [Will] Support Ray Tracing And/Or DLSS:

Source #1:

01. Amid Evil
02. Battlefield V
03. Bright Memory
04. Call of Duty: Modern Warfare (2019)
05. Control
06. Crysis Remastered
07. Deliver Us The Moon
08. Fortnite
09. Ghostrunner
10. Justice
11. Mechwarrior V: Mercenaries
12. Metro Exodus
13. Minecraft
14. Moonlight Blade
15. Pumpkin Jack
16. Quake II RTX
17. Shadow of the Tomb Raider
18. Stay in the Light
19. Watch Dogs: Legion
20. Wolfenstein: Youngblood

Source #2:

21. Call of Duty: Black Ops Cold War
22. Cyberpunk 2077
23. Mount and Blade II: Bannerlord
24. World of Warcraft
25. Xuan-Yuan VII
26. Edge of Eternity
27. Ready or Not
28. Enlisted

Source #3:

29. War Thunder

Source #4:

30. JX3
31. Ring of Elysium
32. Atomic Heart
33. Boundary
34. Convallaria
35. Doom Eternal
36. Dying Light 2
37. F.I.S.T. Forged In Shadow Torch
38. Grimstarr
39. Project X
40. Sword and Fairy 7
41. Synced: Off Planet
42. Vampire: The Masquerade: Bloodlines 2

So you're going to tell me that llien blatantly lied about raytracing and DLSS to negatively portray Nvidia? What a twist!

*Mild shock gif*
 

BluRayHiDef

Banned
So you're going to tell me that llien blatantly lied about raytracing and DLSS to negatively portray Nvidia? What a twist!

*Mild shock gif*

I wouldn't say that he lied; I would say that he wasn't aware of the actual number of games that support or will support ray tracing and/or DLSS.
 

psorcerer

Banned
Current And Upcoming Titles That [Will] Support Ray Tracing And/Or DLSS:

Source #1:

01. Amid Evil
02. Battlefield V
03. Bright Memory
04. Call of Duty: Modern Warfare (2019)
05. Control
06. Crysis Remastered
07. Deliver Us The Moon
08. Fortnite
09. Ghostrunner
10. Justice
11. Mechwarrior V: Mercenaries
12. Metro Exodus
13. Minecraft
14. Moonlight Blade
15. Pumpkin Jack
16. Quake II RTX
17. Shadow of the Tomb Raider
18. Stay in the Light
19. Watch Dogs: Legion
20. Wolfenstein: Youngblood

Source #2:

21. Call of Duty: Black Ops Cold War
22. Cyberpunk 2077
23. Mount and Blade II: Bannerlord
24. World of Warcraft
25. Xuan-Yuan VII
26. Edge of Eternity
27. Ready or Not
28. Enlisted

Source #3:

29. War Thunder

Source #4:

30. JX3
31. Ring of Elysium
32. Atomic Heart
33. Boundary
34. Convallaria
35. Doom Eternal
36. Dying Light 2
37. F.I.S.T. Forged In Shadow Torch
38. Grimstarr
39. Project X
40. Sword and Fairy 7
41. Synced: Off Planet
42. Vampire: The Masquerade: Bloodlines 2

01. PS3-level graphics
02. Pastgen
03. Okay
04. Pastgen
05. Pastgen
06. Pastgen, runs like shit
07. Pastgen
08. Pastgen
09. Okay
10. Pastgen

So, of the first 10 games, only 20% have graphics that at least look nice.
20 RT games was an exaggeration it seems...
 

regawdless

Banned
01. PS3-level graphics
02. Pastgen
03. Okay
04. Pastgen
05. Pastgen
06. Pastgen, runs like shit
07. Pastgen
08. Pastgen
09. Okay
10. Pastgen

So, of the first 10 games, only 20% have graphics that at least look nice.
20 RT games was an exaggeration it seems...

Please be a joke post.

Damn it's so hard to tell these days. Has to be a joke post.
 

BluRayHiDef

Banned
01. PS3-level graphics
02. Pastgen
03. Okay
04. Pastgen
05. Pastgen
06. Pastgen, runs like shit
07. Pastgen
08. Pastgen
09. Okay
10. Pastgen

So, of the first 10 games, only 20% have graphics that at least look nice.
20 RT games was an exaggeration it seems...

That doesn't matter; there are still people who play those games. Heck, CSGO is super-duper old but is still very popular. Also, calling Control last gen is absolutely ridiculous; it was released only last year and can only be enjoyed with current-gen technology on a card that supports ray tracing and DLSS 2.0.
 
We simply haven't seen RT games developed with both manufacturers in mind,
Right, everyone who uses MS's general DXR API without any proprietary extensions clearly favours ONE manufacturer. I bet you'd even claim that MS themselves favour Nvidia.


The bullshit some people make up........

01. PS3-level graphics
02. Pastgen
03. Okay
04. Pastgen
05. Pastgen
06. Pastgen, runs like shit
07. Pastgen
08. Pastgen
09. Okay
10. Pastgen

So, of the first 10 games, only 20% have graphics that at least look nice.
20 RT games was an exaggeration it seems...
So we're back to... "my favourite manufacturer sucks at feature xy, so better pretend it doesn't matter"?
 

Ascend

Member
Can we please stop with that stupid VRAM argument? It's just a larger amount of slower VRAM that doesn't help the card at high resolutions, because it gets bandwidth limited compared to the "only" 10GB 3080.
It's actually backwards. It's the 3080 that scales badly at low resolutions, rather than Big Navi scaling badly at high resolutions. How do we know? Compare both the 3080 and the 6800 scaling to the 2080Ti scaling. The 6800 cards are not likely bandwidth limited. At least, not for rasterization.
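For reference, the raw spec-sheet numbers behind the bandwidth argument (a quick back-of-envelope sketch from the published memory specs; note that raw bandwidth deliberately ignores Infinity Cache):

```cpp
// GB/s = (bus width in bits / 8 bytes) * per-pin data rate in Gbps
constexpr double BandwidthGBs(int busWidthBits, double gbpsPerPin)
{
    return busWidthBits / 8.0 * gbpsPerPin;
}

constexpr double rx6800xt = BandwidthGBs(256, 16.0); // 512 GB/s (GDDR6)
constexpr double rtx3080  = BandwidthGBs(320, 19.0); // 760 GB/s (GDDR6X)
```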

The VRAM is not a stupid argument. If the situation was reversed, you all would be screaming the 16GB over the 10GB from the rooftops as yet another advantage for nVidia.

I've also seen a lot of AMD warriors saying that AMD will wipe the floor with Nvidia and even be competitive in raytracing, downplaying the CUDA cores.

It was hyperbole from both sides though. Nvidia fanbois also tried to downplay AMDs performance.
The latter is obviously still happening in this thread, where a 3fps advantage for the 3080 means it beats the 6800XT, but a 3fps advantage for the 6800XT means they are tied.

These 50 bucks make no difference at this price point. Classic rasterization performance is more or less the same. The 6800XT is already bandwidth limited at 4K. Even if raytracing isn't a factor at the moment for some people, it would be strange to throw away the possibility of using it in the future. Because at 1440p, it's basically not usable on a 6800XT.
I don't agree that it's not usable at 1440p. But other than that, at least this stance is reasonable. But still...
Why are we throwing away the possibility of needing more than 10GB in the future, but clinging to needing RT in the future? If the 10GB becomes the limit, you can likely lower a few settings and the card works fine. Pretty much the same as with RT...
But I guess more VRAM is a stupid argument. But when the Fury had 4GB, the 6GB of the 980 was godly... Or even worse, the GTX 1060 3GB vs the RX 480 4GB.... But I guess that was all different, right?

Brand wars aside, I just don't see a reason to pretend as if a 6800xt would be an objectively better choice than a 3080.

See how I didn't even include DLSS.
The 6800XT might be in a better position if it was $599. Then again, whenever AMD has undercut nVidia by a lot, people see their cards as a tier lower while not really being lower.
I understand why people would prefer the 3080. The 6800XT can be a better choice for some people. But people act as if nVidia is always the only choice, and that is simply fanaticism and bias.
 

regawdless

Banned
I understand why people would prefer the 3080. The 6800XT can be a better choice for some people. But people act as if nVidia is always the only choice, and that is simply fanaticism and bias.

I don't want to drag this discussion out, so one last question:

Without fanaticism and bias, purely objective:
For what customer would the 6800XT be a better choice compared to a 3080?

Price basically identical
1080p performance better on the 6800XT
1440p basically the same
4k performance better on the 3080
Raytracing huge advantage for the 3080
No DLSS alternative for the 6800XT

So that's only people who play at 1080p and don't use raytracing, I guess? Which for a 650 bucks card.... is most likely a very small number of people.

You seem pretty biased yourself, while calling others biased.
 

SantaC

Member
I don't want to drag this discussion out, so one last question:

Without fanaticism and bias, purely objective:
For what customer would the 6800XT be a better choice compared to a 3080?

Price basically identical
1080p performance better on the 6800XT
1440p basically the same
4k performance better on the 3080
Raytracing huge advantage for the 3080
No DLSS alternative for the 6800XT

So that's only people who play at 1080p and don't use raytracing, I guess? Which for a 650 bucks card.... is most likely a very small number of people.

You seem pretty biased yourself, while calling others biased.
More VRAM
SAM
[Chart: average gaming power consumption]
 

Ascend

Member
Computerbase.de found the 6800XT to be 4% slower at 4K, 6% slower at 1440p and 1% faster at 1080p.
Then they did a separate section for new games and found the 6800XT to be faster in 5 games out of 7.

[Chart: Computerbase.de relative performance summary]


Price basically identical
If the 3080 was $50 cheaper you would all be screaming that difference from the rooftops.

4k performance better on the 3080
Negligibly better. This can also be classified as the same in the majority of cases.

So that's only people who play at 1080p and don't use raytracing, I guess? Which for a 650 bucks card.... is most likely a very small number of people.
The people who think having 16GB is a better idea than 10GB
The people that care about power consumption
The people that simply want an all AMD build
The people that want to make use of smart access memory right now
The people that care more about frame consistency rather than framerate (go look at the minimums and the frame times)
The people that think RDNA2 being used in consoles is going to give their cards longevity
The people that think AMD's DLSS alternative will do fine
The people that think 80% downscaling with sharpening is good enough for them
The people that think $50 makes a difference
The people that want easy VR hookup (USB-C connection)

You seem pretty biased yourself, while calling others biased.
I openly dislike nVidia, and I will only buy their products as a last resort. Not because of their products, but because of their business practices. That does not mean I don't try to keep things balanced. Go back a few posts. I did say that if you immediately want RT, nVidia is the better choice.
But people are blindly putting nVidia on a pedestal. Like always...

If these cards are so bad, reviewers wouldn't be running headlines like these;
  • TechPowerUp: "AMD Radeon RX 6800 XT Review - NVIDIA is in Trouble"
  • The Verge: "AMD RADEON RX 6800 XT REVIEW: AMD IS BACK IN THE GAME"
  • TweakTown: "AMD Radeon RX 6800 XT Review: RDNA 2 Totally Makes AMD Great Again"
  • PCWorld: "AMD Radeon RX 6800 and RX 6800 XT review: A glorious return to high end gaming"
  • Forbes: "AMD Radeon RX 6800 Series Review: A Ridiculously Compelling Upgrade"
  • PCPer: "AMD RADEON RX 6800 XT AND RX 6800 REVIEW: BIG NAVI DELIVERS"
  • HotHardware: "Radeon RX 6800 & RX 6800 XT Review: AMD’s Back With Big Navi"

Reading this thread, you would think all reviews would read like:
  • "AMD fails to compete again"
  • "RDNA2 a disappointment"
  • "6800XT Review: nVidia still uncontested".

But that is not how it is. So maybe, just maybe, some of you need to check your own biases before calling me biased.
 

regawdless

Banned
More VRAM
SAM
power-gaming-average.png

The above-mentioned performance includes SAM. The VRAM is way slower and the card has less bandwidth compared to the 3080. Show me where this is a clear advantage.

Energy consumption, agreed of course.

So I'll correct myself. The 6800XT is clearly the better choice for people who play at 1080p on their 650+ bucks card, don't use raytracing AND care about energy consumption.
 

SantaC

Member
The above-mentioned performance includes SAM. The VRAM is way slower and the card has less bandwidth compared to the 3080. Show me where this is a clear advantage.

Energy consumption, agreed of course.

So I'll correct myself. The 6800XT is clearly the better choice for people who play at 1080p on their 650+ bucks card, don't use raytracing AND care about energy consumption.
The 6000 series is using Infinity Cache to make up for the bandwidth. So no, it is not slow.
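A rough sketch of the idea behind that claim (the hit rate and cache bandwidth below are illustrative assumptions, not AMD's figures): accesses served by the on-die cache never touch the GDDR6 bus, so the blended figure can land well above the raw 512 GB/s.

```cpp
// Blended bandwidth model: hitRate of accesses served by the cache,
// the rest by DRAM. All values in GB/s; inputs are hypothetical.
constexpr double EffectiveBandwidth(double hitRate, double cacheGBs, double dramGBs)
{
    return hitRate * cacheGBs + (1.0 - hitRate) * dramGBs;
}

// e.g. a 60% hit rate into a ~1.5 TB/s on-die cache:
constexpr double blended = EffectiveBandwidth(0.6, 1500.0, 512.0); // ~1105 GB/s
```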
 

psorcerer

Banned
That doesn't matter; there are still people who play those games. Heck, CSGO is super-duper old but is still very popular. Also, calling Control last gen is absolutely ridiculous; it was released only last year and can only be enjoyed with current-gen technology on a card that supports ray tracing and DLSS 2.0.

People buy cards for future games, AFAIK.
What was the case in the past doesn't matter.

Right, everyone who uses MS's general DXR API without any proprietary extensions
So we're back to... "my favourite manufacturer sucks at feature xy, so better pretend it doesn't matter"?

DXR is not a feature.
It's pretty much a whole new framework.
It should have been called "dynamic rendering pipeline" but it would be harder to sell.
 
What was the case in the past doesn't matter.
So now we should just disregard AMD's current DXR performance and trust guys like you that AMD and MS are gonna work this out together and magically put them within punching range of Nvidia's dedicated hardware solution.

riiiiight.
This gets more ridiculous by the minute.
 

Ascend

Member
So now we should just disregard AMD's current DXR performance and trust guys like you that AMD and MS are gonna work this out together and magically put them within punching range of Nvidia's dedicated hardware solution.

riiiiight.
This gets more ridiculous by the minute.
Not more ridiculous than you claiming the 3080 beats the 6800XT at 1440p

 

Nikana

Go Go Neo Rangers!
Catching up on this today. (or they dropped today, not sure)

It really seems like the price is the biggest head-scratcher for me. The 6800XT trades blows with the 3080, but at only $50 more I feel like I would be inclined to go with Nvidia at this point for its DLSS and RT performance. If I am spending that much on a card, I might as well spend a bit more. A $100 difference would possibly make me bite on AMD.
 
Not more ridiculous than you claiming the 3080 beats the 6800XT at 1440p
Keep on trying to ignore all charts you don't like, fanboy. The cumulative results have been posted here often enough already.
And I never said "beat", I said "equal"... but hey, I know that facts aren't your thing. You made this clear already.
According to all tests I've seen, AMD is at best equal/trading blows on a game-by-game basis. They don't "win" anywhere.
 

Ascend

Member
Keep on trying to ignore all charts you don't like, fanboy. The cumulative results have been posted here often enough already.
And I never said "beat", I said "equal"... but hey, I know that facts aren't your thing. You made this clear already.
I stand corrected. It's kind of hard to keep track of all the trolls in here, I apologize for confusing you with someone else.

That being said.... Let's not call each other names, shall we? Want to call me a fanboy? Prove this post wrong;
Computerbase.de found the 6800XT to be 4% slower at 4K, 6% slower at 1440p and 1% faster at 1080p.
Then they did a separate section for new games and found the 6800XT to be faster in 5 games out of 7.

[Chart: Computerbase.de relative performance summary]



If the 3080 was $50 cheaper you would all be screaming that difference from the rooftops.


Negligibly better. This can also be classified as the same in the majority of cases.


The people who think having 16GB is a better idea than 10GB
The people that care about power consumption
The people that simply want an all AMD build
The people that want to make use of smart access memory right now
The people that care more about frame consistency rather than framerate (go look at the minimums and the frame times)
The people that think RDNA2 being used in consoles is going to give their cards longevity
The people that think AMD's DLSS alternative will do fine
The people that think 80% downscaling with sharpening is good enough for them
The people that think $50 makes a difference
The people that want easy VR hookup (USB-C connection)


I openly dislike nVidia, and I will only buy their products as a last resort. Not because of their products, but because of their business practices. That does not mean I don't try to keep things balanced. Go back a few posts. I did say that if you immediately want RT, nVidia is the better choice.
But people are blindly putting nVidia on a pedestal. Like always...

If these cards are so bad, reviewers wouldn't be running headlines like these;
  • TechPowerUp: "AMD Radeon RX 6800 XT Review - NVIDIA is in Trouble"
  • The Verge: "AMD RADEON RX 6800 XT REVIEW: AMD IS BACK IN THE GAME"
  • TweakTown: "AMD Radeon RX 6800 XT Review: RDNA 2 Totally Makes AMD Great Again"
  • PCWorld: "AMD Radeon RX 6800 and RX 6800 XT review: A glorious return to high end gaming"
  • Forbes: "AMD Radeon RX 6800 Series Review: A Ridiculously Compelling Upgrade"
  • PCPer: "AMD RADEON RX 6800 XT AND RX 6800 REVIEW: BIG NAVI DELIVERS"
  • HotHardware: "Radeon RX 6800 & RX 6800 XT Review: AMD’s Back With Big Navi"

Reading this thread, you would think all reviews would read like:
  • "AMD fails to compete again"
  • "RDNA2 a disappointment"
  • "6800XT Review: nVidia still uncontested".

But that is not how it is. So maybe, just maybe, some of you need to check your own biases before calling me biased.
 

psorcerer

Banned
So now we should just disregard the DXR performance so far

There is no "DXR performance". All the current implementations used high-level NV-specific libraries. AFAIK.
And if you read the DXR 1.0 spec, you can clearly see it's 1:1 the NV RTX spec.
DXR 1.1 adds all the generic interfaces (like inline raytracing).
I'm surprised that AMD even bothered to support DXR 1.0; I think it's because of how the DX12 tier system works, they probably just couldn't refuse 1.0 support.
As I've already said: newer games will need to support PS5/XBSX at interactive framerates, which means their RT implementations will be much better than the current gen ones: faster and less power hungry.
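For what it's worth, the vendor-neutral way an engine discovers which DXR tier a GPU exposes is the same D3D12 call on both NV and AMD; a minimal sketch (Windows-only, error handling trimmed):

```cpp
#include <d3d12.h>

// Query the device's raytracing tier through the standard D3D12 API.
// TIER_1_0 = TraceRay via raytracing pipeline state objects (the DXR 1.0 path);
// TIER_1_1 adds inline raytracing (RayQuery usable from any shader stage).
D3D12_RAYTRACING_TIER QueryRaytracingTier(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return D3D12_RAYTRACING_TIER_NOT_SUPPORTED;
    return opts5.RaytracingTier;
}
```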
 
Prove this post wrong;
That post says exactly the same as I've already said => equal in rasterization.

The 6xxx still absolutely sucks in RT workloads though and has no DLSS... and with that, this discussion is right back at the beginning.

There is no "DXR performance". All the current implementations used high-level NV-specific libraries. AFAIK.
False...
There are only 3 games with NV-specific add-ons, and those don't have RT on AMD cards for that very reason.

As I've already said: newer games will need to support PS5/XBSX at interactive framerates, which means their RT implementations will be much better than the current gen ones: faster and less power hungry.
I call bullshit/damage control/relativizing on this.
You simply have no facts and go with "trust me bro, just wait".
 

Ascend

Member
So I'll correct myself. The 6800XT is clearly the better choice for people who play at 1080p on their 650+ bucks card, don't use raytracing AND care about energy consumption.
Don't be ridiculous. All these cards are viable for 1080p. And they are also all viable for 1440p and 4k.

As for the 6800XT bandwidth, look at all that bandwidth bottleneck at 4k...;

[Chart: World War Z 4K benchmark]

 

llien

Member
Right, everyone who uses MS's general DXR API without any proprietary extensions clearly favours ONE manufacturer. I bet you'd even claim that MS themselves favour Nvidia.

The bullshit some people make up........
You are commenting on a thread in which, literally within a couple of pages, BF5 switching away from the DXR API to green proprietary crap was called out (reasons: bugs, performance hit).

Oh, and people wondering how come something that is using DXR doesn't just work on AMD GPUs.

Oh, and people claiming the DXR-based Dirt 5 does not work on NV GPUs.

"bullshit some people make up" dear god...


By the way, any news about Super Resolution?

1) It is coming
2) It will be sprinkled with buzzwords such as ML to address the respective part of the population
3) It will be cross-platform, like most things AMD (including SAM)


And/Or DLSS:
Lovely trick.
 
As for the 6800XT bandwidth, look at all that bandwidth bottleneck at 4k...;

Well, I think the variability in the benchmark results shows that Infinity Cache doesn't play nice with every situation. In games where cache overflows can't be avoided, it will take hits.

That said, I don't think we will see this in any games going forward, as long as AMD provides game-optimized (day 1) drivers and Nvidia isn't trying to put shit in sponsored/co-engineered games.
 

Bolivar687

Banned
Don't be ridiculous. All these cards are viable for 1080p. And they are also all viable for 1440p and 4k.

As for the 6800XT bandwidth, look at all that bandwidth bottleneck at 4k...;

[Chart: World War Z 4K benchmark]


I don't understand how there are all these benchmarks of the 6800XT beating the 3080 and sometimes even the 3090 at 4K, and yet we keep hearing "it's a 1080p card."

The goal posts have moved so far we're now in a different stadium.
 

regawdless

Banned
Computerbase.de found the 6800XT to be 4% slower at 4K, 6% slower at 1440p and 1% faster at 1080p.
Then they did a separate section for new games and found the 6800XT to be faster in 5 games out of 7.

[Chart: Computerbase.de relative performance summary]



If the 3080 was $50 cheaper you would all be screaming that difference from the rooftops.


Negligibly better. This can also be classified as the same in the majority of cases.


The people who think having 16GB is a better idea than 10GB
The people that care about power consumption
The people that simply want an all AMD build
The people that want to make use of smart access memory right now
The people that care more about frame consistency rather than framerate (go look at the minimums and the frame times)
The people that think RDNA2 being used in consoles is going to give their cards longevity
The people that think AMD's DLSS alternative will do fine
The people that think 80% downscaling with sharpening is good enough for them
The people that think $50 makes a difference
The people that want easy VR hookup (USB-C connection)


I openly dislike nVidia, and I will only buy their products as a last resort. Not because of their products, but because of their business practices. That does not mean I don't try to keep things balanced. Go back a few posts. I did say that if you immediately want RT, nVidia is the better choice.
But people are blindly putting nVidia on a pedestal. Like always...

If these cards are so bad, reviewers wouldn't be running headlines like these;
  • TechPowerUp: "AMD Radeon RX 6800 XT Review - NVIDIA is in Trouble"
  • The Verge: "AMD RADEON RX 6800 XT REVIEW: AMD IS BACK IN THE GAME"
  • TweakTown: "AMD Radeon RX 6800 XT Review: RDNA 2 Totally Makes AMD Great Again"
  • PCWorld: "AMD Radeon RX 6800 and RX 6800 XT review: A glorious return to high end gaming"
  • Forbes: "AMD Radeon RX 6800 Series Review: A Ridiculously Compelling Upgrade"
  • PCPer: "AMD RADEON RX 6800 XT AND RX 6800 REVIEW: BIG NAVI DELIVERS"
  • HotHardware: "Radeon RX 6800 & RX 6800 XT Review: AMD’s Back With Big Navi"

Reading this thread, you would think all reviews would read like:
  • "AMD fails to compete again"
  • "RDNA2 a disappointment"
  • "6800XT Review: nVidia still uncontested".

But that is not how it is. So maybe, just maybe, some of you need to check your own biases before calling me biased.

Ok now that makes sense. Calling people biased, while openly disliking Nvidia and being biased as fuck. What a hypocrite you are.
I could comment on your points and we would go back and forth without an end. But there's really no point in that. Especially when that much ideology is involved.

I don't want to come off as the Nvidia defense force, because I actually like AMD as a company a lot.

Enjoy your still very good and powerful AMD GPU. I'll spend my time in way more positive and totally unbiased threads like the PS5 power king meme thread.
 

Ascend

Member
Ok now that makes sense. Calling people biased, while openly disliking Nvidia and being biased as fuck. What a hypocrite you are.
I would call it being honest. The hypocrites are the ones claiming to be neutral while obviously being biased towards nVidia. You may think I am biased, but, my opinion is more in line with the majority of the reviews and review titles than the ones that claim to be neutral.

I could comment on your points and we would go back and forth without an end. But there's really no point in that. Especially when that much ideology is involved.
I am an idealist. No doubt about that. Doesn't make me wrong though. Because, after all, there is plenty of reason to dislike nVidia as a company. And some of us blindly buy whatever suits us, and some of us want to go the extra mile of voting with our money. Whenever you spend money, you are supporting the whole business, not just the product. And nVidia has mostly been a pain to gamers. But most don't see it.

I don't want to come off as the Nvidia defense force, because I actually like AMD as a company a lot.
Fair enough. You did sound more reasonable than many others here, although, it gets kind of frustrating when so many people that obviously don't care about these cards, jump in here just to trash on it.

Enjoy your still very good and powerful AMD GPU. I'll spend my time in way more positive and totally unbiased threads like the PS5 power king meme thread.
I'm not even sure if I'm going to get one. It depends on the price of the Nitro versions.

As for the unbiased PS5 power king meme thread... I'm not sure if that's a joke or not, but, enjoy the memes. Meme threads are the best on here.
 
But the fact is that the new RDNA2 and Ampere cards are priced very similarly.

And since Nvidia is beating AMD by a huge margin when new exciting features such as RT are used, Nvidia is able to provide better bang for the buck in this case!

To me it seems that AMD using the silicon budget for a big cache like Infinity Cache instead of more CUs or dedicated tensor cores is not working for them. Maybe AMD should try to secure GDDR6X (or something similar) for upcoming RDNA2/3 products and use the available silicon for computational assets.

At this point the RDNA2 GPUs cannot be recommended for new and upcoming AAA/AA titles which, by and large, will use RT and provide support for DLSS.

Really?
Every new game I've seen in benchmarks shows RDNA2 being very competitive - both in rasterization and in RT.
It's the older RTX games (which were likely optimised for Nvidia's RT solution, given that it was the only one available) where RDNA2 does very poorly.

As for using GDDR6X: well, we know that costs a shitload of money. Not just the DRAM modules themselves, which are more expensive, but also the need to account for PAM4 signalling, a thicker PCB to support it, etc. The Infinity Cache allows Navi 21 to perform almost identically to GA102 with half the memory bandwidth and considerably less power. That's a trade-off that is worth making.
And besides, Navi 21 at 519mm2 manages to get very close to GA102 at 627mm2. GA102 also has an extra 2 billion transistors. So maybe if AMD went larger, they could have added in all the extra shit they needed, but that would have come at a cost to yields and potentially power. Absolutely nothing wrong with the design approach they took.

Can you recommend the RX6000 cards? Absolutely. If you get one, are you really going to be disappointed with the end product? How many people use RTX Voice and NVENC? RT and DLSS won't be widely adopted until RTX 4000 and RX 7000 are released.
Those edge features might matter to enthusiasts and in that case go right ahead and buy Nvidia. But to most other people these software features don't really matter. If you need a GPU now, you buy what's available. Right now both GPU's are scarce, but if you see one or the other at a good price, you can go right ahead with either option and know you're getting a solid experience either way. I mean the vast majority of people are still playing on 1080p monitors and below, with a 1060. 4K performance is great if you actually game in 4K, which most people just do not do. If you want high-refresh 1080p or 1440p, AMD is just better. If you want 4K performance, go ahead and get Nvidia.
There are plenty of situations where you can and perhaps should recommend people get Radeon this time around. Maybe that'll mean you might be able to actually get your hands on an RTX3000 card.

AMD can't continue to compete with Nvidia with hardware and software unless people actually buy their products.
 
Last edited:
You are commenting on a thread in which, literally within a couple of pages, BF5 switching away from the DXR API to green proprietary crap was called out (reasons: bugs, performance hit).

Oh, and people wondering how come something that is using DXR doesn't just work on AMD GPUs.

Oh, and people claiming the DXR-based Dirt 5 does not work on NV GPUs.

"bullshit some people make up" dear god...




1) It is coming
2) It will be sprinkled with buzzwords such as ML to address the respective part of the population
3) It will be cross-platform, like most things AMD (including SAM)



Lovely trick.
You must be nearly suffocating from laughing too hard while typing this nonsense...
 

Nikana

Go Go Neo Rangers!
Glaring lack of glossy reflections.


It's 10GB vs 16GB in a market sold out for months to come, I simply do not see AMD dropping price any time soon.

People who bite on NV upscaling are unlikely to switch anyhow.

I don't disagree that they won't drop the price. I just feel like if they were a tad more aggressive and undercut them by $100 with the 6800XT, and the 6800 was closer to the 3070's price even by $40, they would be far more competitive, since they can't match some of those features from Nvidia.
 

LQX

Member
Damn, looks like I will stick to trying to get a 3080. Considering their ray tracing performance and lack of DLSS, these cards should indeed be cheaper. They would be way more compelling if they were.
 

regawdless

Banned
I would call it being honest. The hypocrites are the ones claiming to be neutral while obviously being biased towards nVidia. You may think I am biased, but, my opinion is more in line with the majority of the reviews and review titles than the ones that claim to be neutral.


I am an idealist. No doubt about that. Doesn't make me wrong though. Because, after all, there is plenty of reason to dislike nVidia as a company. And some of us blindly buy whatever suits us, and some of us want to go the extra mile of voting with our money. Whenever you spend money, you are supporting the whole business, not just the product. And nVidia has mostly been a pain to gamers. But most don't see it.


Fair enough. You did sound more reasonable than many others here, although, it gets kind of frustrating when so many people that obviously don't care about these cards, jump in here just to trash on it.


I'm not even sure if I'm going to get one. It depends on the price of the Nitro versions.

As for the unbiased PS5 power king meme thread... I'm not sure if that's a joke or not, but, enjoy the memes. Meme threads are the best on here.

Of course it's a joke with the meme thread. :messenger_tears_of_joy:

I really appreciate you not getting triggered by my post, and still bringing up arguments without getting personal. Hats off to you, sir. You're one of the more rational and self-aware people, even if I'm critical regarding your bias.
My approach might've been too black and white and I guess everyone has some kind of bias, however small or big.

See ya in other threads.
 
So they gimped their own cards to own the AMD? That does not really make sense. And yeah, I remember HairWorks being really intensive, but it looked nice.
HairWorks heavily utilised tessellation to spam a shitload of triangles where they weren't really needed. At the time, Nvidia had much stronger geometry performance than AMD. This meant that while HairWorks tanked Nvidia's performance, due to their advantage in geometry, it absolutely destroyed AMD.
Tessellation in general was a big weapon Nvidia used to slap AMD, introducing triangles where they were not visible to clutter the gfx pipeline and choke AMD GPUs. The visual difference between max tessellation and medium tessellation was negligible, as most of the triangles were invisible to the player, but benchmarking at max settings with the sliders all the way up disproportionately hits AMD.

It's not really cheating, but it's a little bit sus.
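To put rough numbers on the triangle spam (a back-of-envelope sketch; a triangular patch at integer tessellation factor N yields about N² triangles):

```cpp
// Approximate triangles produced by one triangular patch at factor N.
constexpr long long TrianglesPerPatch(long long n) { return n * n; }

constexpr long long atX16 = TrianglesPerPatch(16); //   256 per patch
constexpr long long atX64 = TrianglesPerPatch(64); // 4,096 per patch: 16x the geometry load
```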
 
HairWorks heavily utilised tessellation to spam a shitload of triangles where they weren't really needed. At the time, Nvidia had much stronger geometry performance than AMD. This meant that while HairWorks tanked Nvidia's performance, due to their advantage in geometry, it absolutely destroyed AMD.
Tessellation in general was a big weapon Nvidia used to slap AMD, introducing triangles where they were not visible to clutter the gfx pipeline and choke AMD GPUs. The visual difference between max tessellation and medium tessellation was negligible, as most of the triangles were invisible to the player, but benchmarking at max settings with the sliders all the way up disproportionately hits AMD.

It's not really cheating, but it's a little bit sus.



I'm sure you have documented articles about this, beyond a shadow of a doubt, no? Please don't let it be the myth with the Crysis 2 sea tessellation, which was never true.
 
I'm sure you have documented articles about this, beyond a shadow of a doubt, no? Please don't let it be the myth with the Crysis 2 sea tessellation, which was never true.
Not HairWorks, but this was one of the examples I was referring to with OTT tessellation and lack of proper object culling.

 