
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

Where exactly does it disagree? Because that is exactly how things went.


It's kind of hard to ignore them when they are in a 6800 review thread making the cards look a lot worse than they really are. DonJuanSchlong is one of them. He has already said a gazillion times that the RTX 3080 is better. Good for him. Go to the RTX thread if you're not interested in these cards. But no. He needs to hang around with his gaslighting BS.

As for the supposed "false narrative".... Some people see things, some people see things when they are shown, and some don't see. Some can't see, most don't want to.
Can't believe you are still triggered that I prefer Nvidia. I've told you specifically a million times that I buy whatever gives me the best performance. I switched from Intel to AMD because of that. I switched from AMD to Nvidia for the same reason. If the RX 6000 series provided me with what I need, I would have one in hand right now.

I'm not team blue, green, or red. I'm team performance. That's what matters to me, and most PC gamers. I don't exclusively buy MSI, Asus, Gigabyte, etc., as performance can change amongst manufacturers. This is something you clearly don't understand, as you only worship AMD, no matter how well or how shit they may do. Cringe.

And lastly, please point to where I mentioned the 3080 in this specific thread. Bet you won't find it. But continue to fight for Lisa Su, I'm sure she appreciates it.


People have always cared about native rasterisation and power consumption.

And apparently clockspeeds go over 2600 MHz on air as well.
Clockspeeds are impressive, but it's similar to how a 4.5 GHz CPU can achieve the same results as a 2 GHz CPU. It's all about how the clocks are utilized and the underlying architecture.

Power consumption, not so much. Excessive power draw causes heat, especially with a poor cooler, and heat is what most people actually care about rather than the wattage itself.
 

FireFly

Member
Power consumption, not so much. Excessive power draw causes heat, especially with a poor cooler, and heat is what most people actually care about rather than the wattage itself.
With this generation AMD and Nvidia were limited by power consumption before they were limited by die size. 3080s are more expensive to produce due to the added power circuitry and cooling system, and the 3090 looks to be bottlenecked by having only a 9% higher TDP, despite 21% more compute power.
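FireFly's percentages line up with the public reference-board spec sheets (320 W / 8704 FP32 cores for the 3080, 350 W / 10496 for the 3090). A quick sanity check, with those spec numbers hard-coded as assumptions:

```python
# Reference-board specs (taken from public spec sheets; treat as assumptions):
specs = {
    "RTX 3080": {"tdp_w": 320, "fp32_cores": 8704},
    "RTX 3090": {"tdp_w": 350, "fp32_cores": 10496},
}

# How much extra power budget vs. how much extra compute the 3090 gets:
tdp_gain = specs["RTX 3090"]["tdp_w"] / specs["RTX 3080"]["tdp_w"] - 1
core_gain = specs["RTX 3090"]["fp32_cores"] / specs["RTX 3080"]["fp32_cores"] - 1

print(f"TDP headroom:  +{tdp_gain:.0%}")   # +9%
print(f"Compute units: +{core_gain:.0%}")  # +21%
```

If performance scaled with power at fixed efficiency, the 3090 could only be about 9% faster at stock limits, which is in the ballpark of the gap launch reviews measured at 4K.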
 
Last edited:

SantaC

Member
I am hearing that the infinity cache hit rate is causing 4K to suffer a bit. 128MB is on the small side, but increasing it would also cause latency problems.

I wonder if AMD goes with GDDR6X for 7000 series or is that memory exclusive to nvidia.
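SantaC's hit-rate point can be sketched with a simple effective-bandwidth model: every request that hits the on-die cache is a request the 512 GB/s GDDR6 bus never has to serve, so effective bandwidth scales as `dram_bw / (1 - hit_rate)`. The hit rates below are illustrative placeholders only (AMD's launch slides showed the rate falling as resolution rises for the fixed 128 MB cache), not measured figures:

```python
# Simple amplification model: cache hits never touch the GDDR6 bus,
# so the bus only has to serve the misses.
def effective_bandwidth(dram_gbps: float, hit_rate: float) -> float:
    """Effective bandwidth seen by the GPU, assuming the cache can
    absorb all hits at full speed (optimistic)."""
    return dram_gbps / (1.0 - hit_rate)

DRAM_BW = 512  # GB/s for a 256-bit bus at 16 Gbps GDDR6

# Illustrative hit rates for a fixed 128 MB cache as the working set grows:
for res, hit in [("1080p", 0.80), ("1440p", 0.70), ("4K", 0.58)]:
    eff = effective_bandwidth(DRAM_BW, hit)
    print(f"{res:>5}: hit {hit:.0%} -> ~{eff:.0f} GB/s effective")
```

Under these assumed rates, the 4K figure is less than half the 1080p one, which is consistent with the 6800 XT losing more ground at 4K than at lower resolutions.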
 

Krappadizzle

Gold Member
People have always cared about native rasterisation and power consumption.

And apparently clockspeeds go over 2600 MHz on air as well.
In the roughly 14 years since I switched to PC gaming, power consumption has never been a concern of mine, nor a focus of the reviews I've read in all that time. People will talk about power consumption being high or low, but as far as I understand it was always a concern in regards to heat management, and with most of the cards I've bought over the years, with maybe the exception of the 780, it was never an issue. It's never had any kind of massive influence on whether I bought a card or not. Do people want lower power consumption and better heat management? Sure. Is it gonna stop an enthusiast from purchasing a card because there are cards with better power consumption and heat management but lesser performance? No. No it will not.

At the end of the day, I'd wager people will buy the card that gives them the best performance for the money; if the 3080 were a 900W card it wouldn't stop people from buying it. Maybe those that don't have a strong enough PSU, but if all things are equal, people are gonna buy the card that gives them the "prettiest graphics" on the screen that they can afford. I think the 6800/XT is gonna take a huge amount of the marketplace, but it won't be because it offers great efficiency for anyone; it's because it does super well in a lot of games at a cheaper price than its competitor.

If you don't agree with that, it's all good. We just have differing opinions on what we think people think is important.

I am hearing that the infinity cache hit rate is causing 4K to suffer a bit. 128MB is on the small side, but increasing it would also cause latency problems.

I wonder if AMD goes with GDDR6X for 7000 series or is that memory exclusive to nvidia.

Isn't 6X just a standard? I don't see why it would stop AMD from using it. Nvidia has put HBM2 in some of their server/AI cards, as far as I understand, and they've not used it for any of their consumer GPUs. I don't see why AMD couldn't use 6X either, other than it just being expensive. 6X bandwidth is huge and it'd be great if literally every card in all stacks on both sides used it. I wish they would.
 
Last edited:

llien

Member
Looks like in Cyberpunk 2077, a 6800 XT would be able to play at 1080p with RT Ultra, since a 3070 apparently needs DLSS to get playable framerates at 1440p.




No mention of Radeon GPUs possibly hints at proprietary green crap being used instead of DXR, for the same reasons BF5 did it.
 

FireFly

Member
No mention of Radeon GPUs possibly hints at proprietary green crap being used instead of DXR, for the same reasons BF5 did it.
See my later reply. According to Nvidia nothing is proprietary. Looks like the lack of mention of the 6800 is due to a marketing agreement.
 
[Image: AMD GPU roadmap]


Roadmap shows RDNA3 before 2022 begins.


[Image: AMD Vega-era roadmap]


:messenger_grinning_sweat:
 

Krappadizzle

Gold Member
See my later reply. According to Nvidia nothing is proprietary. Looks like the lack of mention of the 6800 is due to a marketing agreement.

I dunno, I think people expecting decent RT performance with the 6800 in Cyberpunk day 1 are gonna be in for some disappointment. Again, I don't think we'll see it meaningfully implemented in CP2077 for AMD until it gets the support of the next-gen updates. It might work, but I'd bet it'll work the same way ray tracing works on the 1080 Ti: technically it can do it, but it runs like absolute trash. And I would take anything NV says with a giant, GIANT grain of salt. They loosened up restrictions on their PhysX shit too, and AMD never really supported it in any meaningful way either, as far as I know.
 
Last edited:

llien

Member
triggered that I prefer Nvidia.
I doubt anyone in this thread cares which GPUs you buy.
The problem is you feeling the need to come shit on AMD GPUs and it doesn't matter if it is due to strong butthurt or just to justify purchase of an overpriced Fermi2 Ampere or something else.

See my later reply. According to Nvidia nothing is proprietary. Looks like the lack of mention of the 6800 is due to a marketing agreement.
According to them proprietary stuff doesn't even exist on Microsoft OS, which contradicts BF5 changelog.


I think people expecting decent RT performance with the 6800 in Cyberpunk day 1 are gonna be in for some disappointment
Indeed, as goals have moved and 2080Ti is an example of disappointing RT performance.
 
Last edited:

llien

Member
DemonCleaner
You've been presented with a concrete promise from AMD CEO Lisa Su, from about a year ago, to deliver Zen3 and RDNA2 in 2020.

This renders your "AMD never keeps its promises" moot.

Bringing in slides from the Raja period doesn't change that (and it was indeed an embarrassing period for the company; I'm so glad that dude is doing wonders somewhere else).
 
Last edited:

SantaC

Member
DemonCleaner
You've been presented with a concrete promise from AMD CEO Lisa Su to deliver Zen3 and RDNA2 in 2020, and so they did.

This renders your "AMD never keeps its promises" moot.

Bringing in slides from the Raja period doesn't change that (and it was indeed an embarrassing period for the company; I'm so glad that dude is doing wonders somewhere else).
Yeah Raja was hurting AMD more than helping it.

 
Last edited:

FireFly

Member
I doubt anyone in this thread cares which GPUs you buy.
The problem is you feeling the need to come shit on AMD GPUs and it doesn't matter if it is due to strong butthurt or just to justify purchase of an overpriced Fermi2 Ampere or something else.


According to them proprietary stuff doesn't even exist on Microsoft OS, which contradicts BF5 changelog.
Well so far the only RT games that don't run on the 6800 series are Quake 2 RTX and Wolfenstein Youngblood, as they used proprietary Vulkan extensions.

I dunno, I think people expecting decent RT performance with the 6800 in Cyberpunk day 1 are gonna be in for some disappointment. Again, I don't think we'll see it meaningfully implemented in CP2077 for AMD until it gets the support of the next-gen updates. It might work, but I'd bet it'll work the same way ray tracing works on the 1080 Ti: technically it can do it, but it runs like absolute trash. And I would take anything NV says with a giant, GIANT grain of salt. They loosened up restrictions on their PhysX shit too, and AMD never really supported it in any meaningful way either, as far as I know.
Depends on your resolution and expectations. Control averages close to 60 FPS at 1080p with everything maxed out on a 6800 XT, and you can always disable additional ray tracing features – like ray traced shadows, which don't add much.
 
I doubt anyone in this thread cares which GPUs you buy.
The problem is you feeling the need to come shit on AMD GPUs and it doesn't matter if it is due to strong butthurt or just to justify purchase of an overpriced Fermi2 Ampere or something else.


According to them proprietary stuff doesn't even exist on Microsoft OS, which contradicts BF5 changelog.
Obviously Ascend does, as he's constantly dragging my name around and worried about how I spend my money. Preferring one GPU over the other isn't shitting on AMD. Please understand that, since you seem to disregard everything else.

It would be different if I came in this thread, and was saying how bad AMD is, etc. But I didn't. So let's not start spreading b.s.

Finally, calling the most performant GPUs Fermi is literally the definition of being butthurt. Isn't that ironic? You can't even be unbiased for one second. As triggered as you are, I'd imagine you don't have anything green inside your house or closet. You probably even hate trees and mint chocolate ice cream.
 

Krappadizzle

Gold Member
Indeed, as goals have moved and 2080Ti is an example of disappointing RT performance.

As far as we know it runs decently on a 2080 Ti with DLSS. I'm not moving goalposts; I've never set any to begin with. I'm not an enemy here, bub. No need to be so defensive. Even that aside, the 2080 Ti didn't exactly have the best RT performance either; it just had the best of the bunch, which is like being the tallest midget. Great, but still. I HOPE to be wrong, I just think people will be setting themselves up for disappointment. If you go in with that expectation and are disappointed, then you'll see even more people unfairly shitting on AMD cards. It's doing something it's not designed to do, with tech not designed around its hardware.
 

Rikkori

Member
Well so far the only RT games that don't run on the 6800 series are Quake 2 RTX and Wolfenstein Youngblood, as they used proprietary Vulkan extensions.


Depends on your resolution and expectations. Control averages close to 60 FPS at 1080p with everything maxed out on a 6800 XT, and you can always disable additional ray tracing features – like ray traced shadows, which don't add much.
You can do RT Contact Shadows + Transparent reflections at 1440p and keep 60 fps with a 6800. Which are the best imo, but even then I'd probably keep only tr. ref.
Reflections + diffuse lighting add too much noise tbh.

Notice the near macro-blocking, low-res RT reflections in the 2nd pic (all RT effects on; native 4K) compared to RT off in the 1st pic. And trust me, it looks a hundred times worse in actual motion.

[Screenshot 1: RT off]
[Screenshot 2: all RT effects on, native 4K]
 
Last edited:

Antitype

Member
Well, my bad for trusting Nvidia. Looks like a repeat of the Godfall situation. Hopefully this isn't going to be a pattern moving forward.



Why blame Nvidia? The devs surely only got RX GPUs recently, and they've been crunching like madmen for a while now just to meet the release date. They simply don't have the time to support yet another target. Chances are AMD support will come at the same time as the next-gen console patch.
 

The Skull

Member
Why blame Nvidia? The devs surely only got RX GPUs recently, and they've been crunching like madmen for a while now just to meet the release date. They simply don't have the time to support yet another target. Chances are AMD support will come at the same time as the next-gen console patch.

I'd assume it's due to the marketing agreement they have with Nvidia, akin to the one AMD has with Godfall.
 

Kenpachii

Member
Is that another AMD sponsored title? If not, that's pretty mental.

All I know is it's Ryzen-optimized or something, as there's a splash screen for it at startup in the game. However, that 85 fps for the 3080 seems to actually be a thing. The 1080 Ti numbers I got are also a thing; 66 average seems about what I get on ultra at 1080p. The numbers are really, really bad for Nvidia.
 

Krappadizzle

Gold Member
Like what's up with AC Valhalla and AMD cards lol

[Image: AC Valhalla benchmark chart]


that 5700xt straight up sits behind a 3080.
This is 1080p though. Which isn't really what either series of cards is aiming for. What are the 1440p/4k benchmarks at?

can't get VRR or Freesync working via HDMI 2.1 on the LG C9 / E9 :messenger_sad_relieved:

I'm not quite sure how it works on the AMD side, but I know with NV you have to turn on G-Sync in the control panel. I assume AMD's is the same process. You have it turned on, yeah?
 
Last edited:
I'm not quite sure how it works on the AMD side, but I know with NV you have to turn on G-Sync in the control panel. I assume AMD's is the same process. You have it turned on, yeah?

yeah i got it running on my 3080 after the LG firmware update.

but AMD's Adrenalin software says the TV wouldn't be FreeSync compatible. HDMI 2.1 VRR should work anyway, but it doesn't, or at least not correctly: it does prevent tearing if you disable vsync, but it introduces microstutter. Had the same issue with the 3080 before the LG firmware update got the G-Sync mode working.

thought that would be done with the last TV update... dammit
 
Last edited:
Interesting: it seems that RT will not be supported, at least at launch, for Cyberpunk on AMD cards. This is contrary to what Nvidia said, but it matches up with what a previous CDPR employee mentioned, so they are at least consistent.

The way I see it, we only have 3 possibilities for the Cyberpunk and Godfall RT exclusivity:

1. Marketing/Sponsorship agreement with Nvidia/AMD to block functionality on competitor's cards.

2. Proprietary extensions for Nvidia/AMD to assist with RT that don't work on the opposition cards.

3. Only optimizing towards one company set of cards due to sponsorship agreements and not being happy with the default gimped non optimized performance on the opposition cards so waiting until they have time to optimize and release patch to enable it.

If it is option 1, then that really sucks for us as consumers and contributes towards fracturing the RT market into a hot mess. Personally I'm not a fan of those kinds of outcomes. The best case scenario is number 3, but even that doesn't exactly inspire confidence in "write once, run anywhere" DXR calls if you literally have to rewrite a separate RT branch for each GPU manufacturer. It will get even worse once Intel launches their RT-capable GPUs with their own architecture and optimizations.
 

kittoo

Cretinously credulous
yeah i got it running on my 3080 after the LG firmware update.

but AMD's Adrenalin software says the TV wouldn't be FreeSync compatible. HDMI 2.1 VRR should work anyway, but it doesn't, or at least not correctly: it does prevent tearing if you disable vsync, but it introduces microstutter. Had the same issue with the 3080 before the LG firmware update got the G-Sync mode working.

thought that would be done with the last TV update... dammit

LG didn't update the C9 for FreeSync, I believe. Though there was a guide on how to enable it on the C9. Google should tell you, I think.
 

Antitype

Member
You can do RT Contact Shadows + Transparent reflections at 1440p and keep 60 fps with a 6800. Which are the best imo, but even then I'd probably keep only tr. ref.
Reflections + diffuse lighting add too much noise tbh.

Notice the near macro-blocking low-res of the RT reflections in the 2nd pic (all RT effects on; native 4K) compared to RT off in 1st pic. And trust me, it looks a hundred times worse in actual motion.

[Screenshot 1: RT off]
[Screenshot 2: all RT effects on, native 4K]

Looks like it's bugged on AMD, there shouldn't be any macro blocking. There's indeed some kind of temporal noise, but it's very subtle at 4k native or 4k dlss quality mode. Nothing distracting like your screenshot and it's in fact not visible at all in a screenshot.

4k native (maxed out minus SSAO cause it's not needed with RT shadows):
 

wachie

Member
Some of you AMD guys sound like battered spouses. How/why are you guys even coming up with this shit? This simply isn't true and you are trying to create a false narrative. Not even sure why.
It's not a false narrative. First it was "Big Navi is just a 2080 Ti at best," and now that it looks like it's almost at 3090 level (minus RT), suddenly a lot of custom scenarios come into play for AMD to be even remotely viable.

Like I said earlier in the thread, with this level of RT performance, I don't have much confidence now in the next-gen consoles and not sure how much and what level of RT will be commonly available in the next 5-7 years.
 

Rikkori

Member
Looks like it's bugged on AMD, there shouldn't be any macro blocking. There's indeed some kind of temporal noise, but it's very subtle at 4k native or 4k dlss quality mode. Nothing distracting like your screenshot and it's in fact not visible at all in a screenshot.

4k native (maxed out minus SSAO cause it's not needed with RT shadows):
Yup, definitely bugged on AMD. Curious to know though if that means it would be even lower performance if fixed? xD
 

Krappadizzle

Gold Member
It's not a false narrative. First it was "Big Navi is just a 2080 Ti at best," and now that it looks like it's almost at 3090 level (minus RT), suddenly a lot of custom scenarios come into play for AMD to be even remotely viable.

Like I said earlier in the thread, with this level of RT performance, I don't have much confidence now in the next-gen consoles and not sure how much and what level of RT will be commonly available in the next 5-7 years.
People speculate. They always have and always will. It wasn't so out of the question to think that Big Navi was around 2080ti levels considering AMD's rather inconsistent GPU history at that point. Other than unreliable leaks, people can only go on the best information they have based on the histories of whatever they are speculating on. But to create this narrative that NV fans just shit on all the good works of AMD is just plain false. Maybe hearing things that you don't like might be considered "shitting on" but if they are truthful it's not the same as people just unduly shitting on something just to do that. By that mantra, I've shit on NV a ton with their bunk ass 20xx series and this awful paper launch of the 30xx series. Am I just shitting on them too? Or telling the truth?



One thing I never really considered in arguing that people don't care about performance per watt was the laptop GPU market. Holy shit, at that TDP they could slap a 6800M in a ton of shit and really grab that gaming-laptop segment. I don't know why I always forget that segment of the market. I guess I just never think of laptop gaming as a good way to play shit and forget all about it.
 
Last edited:

BluRayHiDef

Banned
...and the 3090 looks to be bottlenecked by having only a 9% higher TDP, despite 21% more compute power.

That's an issue for only the Founders Edition and reference models. The FTW3 Ultra Gaming has a maximum TDP of ~ 415 watts out of the box, which can be increased to a little over 500 watts via an alternate BIOS.
 

MadYarpen

Member
People speculate. They always have and always will. It wasn't so out of the question to think that Big Navi was around 2080ti levels considering AMD's rather inconsistent GPU history at that point. Other than unreliable leaks, people can only go on the best information they have based on the histories of whatever they are speculating on. But to create this narrative that NV fans just shit on all the good works of AMD is just plain false. Maybe hearing things that you don't like might be considered "shitting on" but if they are truthful it's not the same as people just unduly shitting on something just to do that. By that mantra, I've shit on NV a ton with their bunk ass 20xx series and this awful paper launch of the 30xx series. Am I just shitting on them too? Or telling the truth?



One thing I never really considered in arguing that people don't care about performance per watt was the laptop GPU market. Holy shit, at that TDP they could slap a 6800M in a ton of shit and really grab that gaming-laptop segment. I don't know why I always forget that segment of the market. I guess I just never think of laptop gaming as a good way to play shit and forget all about it.
You are kinda correct here :D
 

wachie

Member
People speculate. They always have and always will. It wasn't so out of the question to think that Big Navi was around 2080ti levels considering AMD's rather inconsistent GPU history at that point. Other than unreliable leaks, people can only go on the best information they have based on the histories of whatever they are speculating on. But to create this narrative that NV fans just shit on all the good works of AMD is just plain false. Maybe hearing things that you don't like might be considered "shitting on" but if they are truthful it's not the same as people just unduly shitting on something just to do that. By that mantra,
Please, excuse me, but it was plain obvious with more and more time that AMD didn't just have another 2080 Ti-level product.

I've shit on NV a ton with their bunk ass 20xx series and this awful paper launch of the 30xx series. Am I just shitting on them too? Or telling the truth?
I know this is a rhetorical question but to the hardcore NV crowd, yes you were shitting on them too.
 

thelastword

Banned


So this is the performance that's so terrible.....My only gripe is that he has the 3090 in there... I wish he compared the 6800XT to the 3080 and the 6800 to the 3070......Even then you can see the performance for AMD is pretty great in rasterization and RT....When these drivers mature, I think Nvidia is in for a world of hurt...
 

Ascend

Member
People speculate. They always have and always will. It wasn't so out of the question to think that Big Navi was around 2080ti levels considering AMD's rather inconsistent GPU history at that point.
Actually, it was. If you honestly thought that a GPU with twice the CUs of the 5700XT would only be 30% faster, you simply weren't thinking straight. I made a whole thread about how we should not underestimate Big Navi. Guess what happened? I got mocked and ridiculed, and I wasn't far off either.
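For what it's worth, the naive scaling math backs Ascend up. Treating FP32 throughput as CUs × clock (a crude upper bound that ignores bandwidth and scaling losses), and using approximate reference boost clocks as assumptions:

```python
# Crude throughput ceiling: compute units x clock speed.
# Clock figures are approximate reference boost clocks (assumptions).
def relative_throughput(cus: int, clock_mhz: float) -> float:
    return cus * clock_mhz

rx_5700xt = relative_throughput(40, 1905)   # Navi 10
rx_6900xt = relative_throughput(80, 2250)   # Navi 21 ("Big Navi")

scale = rx_6900xt / rx_5700xt
print(f"Theoretical uplift: {scale:.2f}x")  # ~2.36x
```

Even with imperfect real-world scaling, expecting only ~1.3x out of a part with a roughly 2.4x theoretical ceiling was always the pessimistic end of the range.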

Other than unreliable leaks, people can only go on the best information they have based on the histories of whatever they are speculating on. But to create this narrative that NV fans just shit on all the good works of AMD is just plain false.
I didn't hear anyone talk in here about how amazing their advancements are in comparison to RDNA1, even if according to them, nVidia's card is better.
I didn't hear anyone talk about how their cooler is not shit this time.
Smart Access Memory is swept under the rug.
16GB is swept under the rug.

You are the only one that mentioned a few of these. You are projecting yourself to everyone else in here. Not everyone is like you. Some of them pretend to be neutral yet stealthily post propaganda against these AMD cards. Many of them, before the cards were even released, would keep repeating the 2080Ti level as if it was already a fact. Bad drivers kept coming up, power consumption kept coming up, bad reference cards kept coming up. Basically all the faults of AMD were ballooned, before the cards were even announced.
When the 3080 was released, did you know how many times it was said that nVidia is too far ahead and that AMD will never catch up? I disagreed. You can already guess what happened.

Maybe hearing things that you don't like might be considered "shitting on" but if they are truthful it's not the same as people just unduly shitting on something just to do that.
Claiming that the 6800XT is slower than the RTX 3080 at every resolution is something I consider shitting on the AMD cards. Not only is it not true, it is exactly the narrative you say never happens, yet it happened in this very thread. And it's even more remarkable when, in certain scenarios, the sub-$600 6800 beats even the $1500 RTX 3090. Do you see me going to the RTX or 3000 series thread and spamming that all over their thread? No. But it happens here. Considering the amount of times DLSS alone was mentioned here, you'd think it was an nVidia thread.

By that mantra, I've shit on NV a ton with their bunk ass 20xx series and this awful paper launch of the 30xx series. Am I just shitting on them too? Or telling the truth?
There is a difference between criticizing a card and shitting on it. One entails constructive criticism, and one is bravado which generally entails constant mentioning of all the stuff it is not good enough at, ignoring all the stuff the competitor lacks, and praising every little advantage the competitor has.
The 6800 cards are not perfect. Their RT as of right now seems to be inferior. And as of right now, there is no DLSS alternative. That does not mean that if I prefer to have 16GB with slower RT over 10GB with faster RT, that I have to be bullied for it or called an AMD fanboy. Because yes, I am going to push back.

The same thing happened on the CPU side. Even after the Ryzen 3000 series, there were still some people that say Intel is still superior. Even with the 5000 series, weird outliers are universalized to show how Intel is still better. Thankfully, the sales do not reflect these people, but, unfortunately, on the GPU side they do.

nVidia played with their customers multiple times. The last time was the RTX 2000 series. Suddenly with the 3000 series, everyone forgets and starts praising nVidia to the moon. 3.5GB of usable memory instead of 4GB on the GTX 970? The card was still one of the best-selling cards. The 3GB 1060 not being an actual 1060 but sold under the same name? People still buy nVidia. The RTX 2000 series is the first time some people actually woke up, but one gen later, it's all forgotten.
The RX 570 beating the 1050 Ti while being cheaper and having more VRAM? Sorry, too high a power consumption; no one bought it. The RX 480 drawing an insignificant amount of additional power through the PCIe slot? Better not buy AMD.

It has come to the point where AMD needs to be a little scummy to gain back market share, and that is really fucking sad. Because the user base is like Dory. They forget too quickly and keep making the company that shoves a large log in their ass richer, so that the size of the log keeps growing every time. Hopefully, AMD gets more sales than previously, because honestly, they deserve it.
 
Actually, it was. If you honestly thought that a GPU with twice the CUs of the 5700XT would only be 30% faster, you simply weren't thinking straight. I made a whole thread about how we should not underestimate Big Navi. Guess what happened? I got mocked and ridiculed, and I wasn't far off either.


I didn't hear anyone talk in here about how amazing their advancements are in comparison to RDNA1, even if according to them, nVidia's card is better.
I didn't hear anyone talk about how their cooler is not shit this time.
Smart Access Memory is swept under the rug.
16GB is swept under the rug.

You are the only one that mentioned a few of these. You are projecting yourself to everyone else in here. Not everyone is like you. Some of them pretend to be neutral yet stealthily post propaganda against these AMD cards. Many of them, before the cards were even released, would keep repeating the 2080Ti level as if it was already a fact. Bad drivers kept coming up, power consumption kept coming up, bad reference cards kept coming up. Basically all the faults of AMD were ballooned, before the cards were even announced.
When the 3080 was released, did you know how many times it was said that nVidia is too far ahead and that AMD will never catch up? I disagreed. You can already guess what happened.


Claiming that the 6800XT is slower than the RTX 3080 at every resolution is something I consider shitting on the AMD cards. Not only is it not true, it is exactly the narrative you say never happens, yet it happened in this very thread. And it's even more remarkable when, in certain scenarios, the sub-$600 6800 beats even the $1500 RTX 3090. Do you see me going to the RTX or 3000 series thread and spamming that all over their thread? No. But it happens here. Considering the amount of times DLSS alone was mentioned here, you'd think it was an nVidia thread.


There is a difference between criticizing a card and shitting on it. One entails constructive criticism, and one is bravado which generally entails constant mentioning of all the stuff it is not good enough at, ignoring all the stuff the competitor lacks, and praising every little advantage the competitor has.
The 6800 cards are not perfect. Their RT as of right now seems to be inferior. And as of right now, there is no DLSS alternative. That does not mean that if I prefer to have 16GB with slower RT over 10GB with faster RT, that I have to be bullied for it or called an AMD fanboy. Because yes, I am going to push back.

The same thing happened on the CPU side. Even after the Ryzen 3000 series, there were still some people that say Intel is still superior. Even with the 5000 series, weird outliers are universalized to show how Intel is still better. Thankfully, the sales do not reflect these people, but, unfortunately, on the GPU side they do.

nVidia played with their customers multiple times. The last time was the RTX 2000 series. Suddenly with the 3000 series, everyone forgets and starts praising nVidia to the moon. 3.5GB of usable memory instead of 4GB on the GTX 970? The card was still one of the best-selling cards. The 3GB 1060 not being an actual 1060 but sold under the same name? People still buy nVidia. The RTX 2000 series is the first time some people actually woke up, but one gen later, it's all forgotten.
The RX 570 beating the 1050 Ti while being cheaper and having more VRAM? Sorry, too high a power consumption; no one bought it. The RX 480 drawing an insignificant amount of additional power through the PCIe slot? Better not buy AMD.

It has come to the point where AMD needs to be a little scummy to gain back market share, and that is really fucking sad. Because the user base is like Dory. They forget too quickly and keep making the company that shoves a large log in their ass richer, so that the size of the log keeps growing every time. Hopefully, AMD gets more sales than previously, because honestly, they deserve it.
Ok, how about this: AMD is not only better than Nvidia, they are light years ahead. I mean, like, Nvidia can never catch up now! You happy now? AMD runs cooler, faster, has better ray tracing, better rasterization, better everything. Here's your participation trophy.


One thing some of you guys need to understand is that these companies have different R&D budgets, different ideologies, etc. Nvidia pushes pure performance in addition to other graphical features that are ahead of their time. AMD, on the other hand, matures over time and gains performance periodically, sometimes even gaining the upper hand.


You want everyone to bow down with you and kiss Lisa Su's feet, as if she were the holy grail of all time. Some years Nvidia nails it on the head; other years they only add slightly better rasterization and other features, and the same goes for AMD. They have done a tremendous job this time around, but if you want everyone to praise AMD, step back and look at your post history. On any given page, I see nothing but Nvidia bashing from you. You never appreciate anything Nvidia does, so it's crazy that you want others to notice what AMD has done to finally start catching up to Nvidia. Aren't the kettle and pot both black if I remember correctly?
 

Ascend

Member
Ok, how about this. AMD is not only better than Nvidia. They are light years ahead. I mean like, Nvidia can never catch up now! You happy now? AMD runs cooler, faster, better ray tracing, better rasterization, better everything. Here's your participation trophy.
I'm not a proponent of lies, but it doesn't surprise me that you would think something like this would be satisfying.

One thing some of you guys need to understand is that these companies have different R&D budgets, different ideologies, etc. Nvidia pushes pure performance in addition to other graphical features that are ahead of their time.
You mean like DX10.1? Or Async compute? Or Tessellation?
nVidia's features are rarely ahead of their time. Their main strength is advertising to the gullible.
That does not mean that some of their features are not good. If only they would use their talents to improve the market instead of oppressing it.

AMD, on the other hand, matures over time and gains performance periodically, sometimes even gaining the upper hand.
They were basically neck and neck until nVidia started with their BS tactics to drive AMD out of the market. AMD made its fair share of mistakes in its business, but its lack of R&D was not entirely its fault, because Intel screwed them over, and so did nVidia. It's a miracle they're still here.

You want everyone to bow down with you and kiss Lisa Su's feet, as if she were the holy grail of all time.
If you think that is my motivation or goal, you clearly have not been paying attention.

Some years Nvidia nails it on the head; other years they only add slightly better rasterization and other features, and the same goes for AMD.
Except nVidia still gets praised in the years that they offer basically nothing, while AMD gets trashed even when they have good products like the RX 480 and the 5700XT.

They have done a tremendous job this time around, but if you want everyone to praise AMD, step back and look at your post history. On any given page, I see nothing but Nvidia bashing from you.
Oh really? I never admitted that nVidia's RT seems to be superior, for example?

You never appreciate anything Nvidia does,
The dog can bite your hand only so many times before you don't like it anymore, even if it guards your house well.

so it's crazy that you want others to notice what AMD has done to finally start catching up to Nvidia.
And once again your bias unintentionally shows itself. They started to catch up a long time ago.

Aren't the kettle and pot both black if I remember correctly?
Indeed they are...
 
So everyone with Nvidia cards is now gullible? I'll take being gullible for better performance and features at $50 more than AMD at the moment!


Do you honestly think Nvidia hasn't been trashed for the pricing of the 20xx cards? Even though they came out of the gate two-plus years ago with not only better performance than the 5700 XT, but ray tracing as well, on top of DLSS. Barely anyone cared about DLSS and ray tracing in their infancy upon release, but the rasterization was better than anything the 1080 Ti or AMD could sell at that time, two years ago.


Unlike you, I can be bitten several times by different dogs, but I won't judge the breed based on a single dog's response. I can list countless companies that have released very shitty/amazing products. Just because one lineup is trash doesn't mean the next release can't be a treasure. If I went by your logic, I would never trust ASUS ever again, because I had to get an ASUS AMD motherboard RMA'd twice in a matter of weeks. Yet here I am with ASUS products, because they have improved.


And again, here we are, with me being called biased for saying "AMD is finally catching up to Nvidia". Well, I could be like you and say AMD can't touch Nvidia at ray tracing, doesn't have a technology in place to combat DLSS, etc. But no, I said they have finally started to catch up with their new cards, which not only run cooler than Nvidia's, but with higher clock speeds, and now implement ray tracing! How much praise can I give them before I sound like a complete AMD fanboy/tool like you? Serious question ⁉
 

Ascend

Member
So everyone with Nvidia cards is now gullible? I'll take being gullible for better performance and features at $50 more than AMD at the moment!
Not everyone. But the ones that easily get excited about fancy new toys are also the easiest to manipulate.
That is all aside from the fact that nVidia simply has a larger mind share, and some people buy nVidia because they don't know Radeon even exists. That's partially on AMD, and partially on the part of the gaming community that is biased against AMD.

Do you honestly think Nvidia hasn't been trashed for the pricing of the 20xx cards? Even though they came out of the gate two-plus years ago with not only better performance than the 5700 XT, but ray tracing as well, on top of DLSS. Barely anyone cared about DLSS and ray tracing in their infancy upon release, but the rasterization was better than anything the 1080 Ti or AMD could sell at that time, two years ago.
It's basically the first time that nVidia felt it in their pockets. Thank god for that. Otherwise we would have seen a $1000 3080 and a $2500 3090. When was the last time nVidia was criticized and felt the consequences in their pockets?
You are correct that AMD didn't offer performance at the top end. But this is yet another one of those things. The majority of gamers don't buy top cards; the majority of gamers don't spend more than $300 on a graphics card, and AMD was there all along. But now that they are at the top as well, the goalposts have shifted from AMD not being able to compete at the top tier, to features.
The only way AMD can win is if they are literally perfect. They have to pull a Zen 3 on the GPU front. It shouldn't be that way, but that's how it is.

Unlike you, I can be bitten several times by different dogs, but I won't judge the breed based on a single dog's response.
nVidia is a single dog.

I can list countless companies that have released very shitty/amazing products. Just because one lineup is trash doesn't mean the next release can't be a treasure.
I can forgive a shitty product. Mistakes are bound to happen. I wasn't talking about bad products, but more about deliberate deception of your customers when you know/think you can get away with it.

If I went by your logic, I would never trust ASUS ever again, because I had to get an ASUS AMD motherboard RMA'd twice in a matter of weeks. Yet here I am with ASUS products, because they have improved.
That's bad luck. The Geforce Partner Program is not bad luck. As I mentioned before, after the Geforce Partner Program, I ditched all the companies that supported this. I normally bought Asus boards, but I switched to Asrock. My first Asrock motherboard, I actually had to RMA it. The warranty process went fine, and I am still using Asrock, and I don't see the need to go back to Asus at this point. So again, for me, it's more about the company's policy and values rather than the products.
That being said, I have recommended boards from MSI for example to PC builders that want a system with Zen 3, simply because they are the best value. I will not impose my values on someone else, unless provoked.

And again, here we are, with me being called biased for saying "AMD is finally catching up to Nvidia". Well, I could be like you and say AMD can't touch Nvidia at ray tracing, doesn't have a technology in place to combat DLSS, etc.
I have said those things myself, and at this point they are true. Doesn't mean it will remain that way, which is another one of those things... People expect nVidia to always remain on top with all their technologies. But that is not necessarily true. That doesn't mean that you should buy something right now in the hopes that you get that stuff in the future. You might wait forever, or you might not.
Remember when you laughed at me when I said that AMD can possibly come up with an upscaling solution that doesn't require machine learning but still does the job? Yeah...

But no, I said they have finally started to catch up with their new cards, which not only run cooler than Nvidia's, but with higher clock speeds, and now implement ray tracing! How much praise can I give them before I sound like a complete AMD fanboy/tool like you? Serious question ⁉
You had to slip some of that name-calling in there, didn't you? In any case, this is the first time I've actually seen you write that without following it up with remarks about why all those advances are still worthless.


In the interest of keeping things slightly on-topic...

 

PhoenixTank

Member
You want front row seats with your popcorn, sir? That'll be an extra 5 smackaroos. Payment is only available in bitcoin, or you have to join in. What will it be?
Don't touch me. I will be dealing only with the moneyman, Irobot82.

In the interest of keeping things slightly on-topic...


I feel like I've booted up Teletext again.
Is this how the year ends? Everything out of stock? Enough, 2020... enough.
Appreciate the stream link all the same.
 

Irobot82

Member
Don't touch me. I will be dealing only with the moneyman, Irobot82.

It's just silly. We can all like different things.

Some can buy only AMD because that's what they like. Some can buy Nvidia because that's what they like. Some can swap back and forth depending on their budget/game preferences/feature importance.

In the end, we are all PCMR and we are all better than the console peasants.
 