
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

This AMD flagship should have been HBM based. This has the stench of a real 6800 XT being artificially renamed to 6900 XT, pushing the SKU stack ever higher into exorbitant price margins.

Both vendors' cards also slot in nicely and conveniently between each other's MSRPs. They're not actually headbutting each other in real competition.

I'm a cynical cunt though, so it could just be me.
 

Rentahamster

Rodent Whores
And yet people get mad at me when I tell them to not buy nVidia on principle. I guess that virtue is gone nowadays.
Tech enthusiasts for the most part don't operate on "virtue". We operate on cold hard numbers. Virtue is a secondary consideration.
 


Jay's review is interesting. Apparently the review BIOS was limiting his card to 250W and he had to do some tinkering to get it to the advertised 300W. It seems like AMD was trying to make the temperatures look better than they are, because its reference cooler really wasn't up to it. It's ironic that AMD was reluctant to let AIBs take a crack at it, because it seems now that this card really needs them to figure out the right configuration.

Interestingly, he seems to say the 6800 non-XT is the best overall value card this hardware cycle. I was definitely not expecting that from him but now it's pushing me more towards grabbing one of them if I can find it.


Yeah, I just watched this review now; it sounds to me like he has either a faulty card or something dodgy happening with the driver. We haven't seen anyone else reporting these issues so far, so I can only assume either a hardware fault or a driver issue.

Given the weird temps he is getting, I'm leaning towards a hardware issue; maybe the driver is power limiting due to temps. He also mentioned the fans were barely running even with those high temps, so something is definitely wrong compared to a normally operating card.
 

Rikkori

Member
This AMD flagship should have been HBM based.

Wouldn't really have done much of anything for their RT performance & would've shot their margins to hell, and likely would've had worse implications down the stack (remember - simplified GPU designs = cheaper too, in R&D and fabbing as well). The IC innovation really has been great though, so I don't think you could've gotten a better card otherwise, at least not overall.
 

Ascend

Member
If I bought stuff based on principle, I wouldn't buy anything. I'd be a hunter-gatherer in the middle of the woods.
Haha yeah. That is kind of true though. But like most things in life, it's about balance.

Tech enthusiasts for the most part don't operate on "virtue". We operate on cold hard numbers. Virtue is a secondary consideration.
I understand that perfectly. And that would be perfectly justifiable under normal circumstances. The time of 'normal' is long gone though. This is the point where people have to stop and determine for themselves what is most important.

So many people are saying not to buy from scalpers. What is the difference between scalpers and companies price gouging? Literally nothing, except that in one case there is a middleman in between and in the other there isn't. Yet everyone wants to do everything to stop scalpers, but when the companies themselves want to take advantage, suddenly people don't care (aside from the retailers)... And I cannot understand that. This is especially true for nVidia.

And I'm not being selective here either. People are free to go back and look at my post history. I was basically drooling over the 6800XT Nitro. But when the price of $770 was announced, a full $120 over a sort of reasonable MSRP, I immediately changed my mind and dropped the idea of buying it like a hot iron ball.


Fanaticism gets in the way of objectivity, and, more often than not, it's not really the cold hard numbers, but the idea of having the best cold hard numbers as a dopamine rush. For some, the feeling of wasting money on marginally faster products is stronger than the feeling of thinking they could have had just that tiny little edge over what they currently have. And for some it's the opposite.
Most people are more emotionally driven than they think, especially when they think they are being objective.
 

Bolivar687

Banned
My take on this hardware cycle is that it's great to have both companies at the high end at the same time with different advantages but the video card space is still not in a healthy position, and that's before we even get into stock not being available and MSRP not being relevant.

When Pascal launched in 2016, Nvidia made the consumer pay for the performance gains instead of increasing the value at each price tier. Turing then elevated the price again for performance gains that, for many, were a bit underwhelming. So even before you get to the pent-up demand of a pandemic where everyone is staying at home, and a new console generation giving multiplatform developers the green light to raise the bar visually, a lot of PC gamers who would otherwise upgrade more often have been using the same cards for 4 years now. The jump this generation is good, though clearly not what Nvidia promised at their reveal, but even if we were at MSRP, this generation would only be fine relative to what we've gotten used to over the last two generations; I would certainly not call it good.

Then we come to the reality that these products are not generally available, scalpers have all the breathing room they need, and even retailers are selling these cards far beyond their real value. 3070s and 6800s are not worth $600 just like 3080s and 6800 XTs are not worth the $800 you can buy them for during the few seconds they are available. The Nvidia soldiering is especially toxic this year because while we're arguing about who the real 1440p king is, this is an absolutely abysmal launch for both teams, as the tolerance for getting ripped off is quite high and retailers and AIBs are right there with the scalpers taking advantage of it. When these cards have been available for a bit and we start seeing discounts and rebates, then I might be willing to say these cards offer good value, but that's now looking like it will be months away at best.

A lot of reviewers have been saying we will have to wait until the next hardware cycle to see a battle at Ray Tracing, but it's really pricing where we need renewed competition to have an impact. I actually would say my decision to wait on a PS5 and instead get multiplatform titles on PC for now was a mistake.
 
Last edited:


Couple of interesting things.
Comparing 5700 xt to 6900 xt is interesting.
The 6900 XT has twice the CUs, a higher clock speed, 16GB of RAM at 512GB/s compared to only 8GB of RAM at 448GB/s, and 23 TFLOPs versus 9.75 TFLOPs on the 5700 XT, plus all the extra improvements of RDNA 2 over RDNA 1, and it gives a 111% improvement over the 5700 XT.
I kinda thought it would have done better.
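For reference, this is the back-of-the-envelope math behind those TFLOP figures; a rough sketch using the public rated boost clocks, which are approximate and not what the cards sustain in games:

```python
# FP32 TFLOPs = shaders x 2 ops/clock x clock; RDNA has 64 shaders per CU.
# Clock figures below are approximate rated boost clocks, for illustration only.
def fp32_tflops(compute_units, clock_ghz, shaders_per_cu=64):
    shaders = compute_units * shaders_per_cu
    return shaders * 2 * clock_ghz / 1000.0  # GFLOPs -> TFLOPs

print(fp32_tflops(40, 1.905))  # 5700 XT (40 CUs): ~9.75 TFLOPs
print(fp32_tflops(80, 2.250))  # 6900 XT (80 CUs): ~23.0 TFLOPs
```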
 


....and it gives a 111% improvement over 5700 xt.
I kinda thought it would have done better.


I haven't watched the review yet but I just had to comment on this.

What? That is more than double the performance of the 5700XT, AMD's flagship RDNA1 card. CU/SM scaling is not linear in a general sense with GPUs. Normally having 70-80% scaling is a really really good result.

In this case they are over 100% better with double the number of CUs. That is pretty amazing and is in line with AMD's "NAVI X2" design goals. In what universe is 111% performance increase generation to generation a bad thing?
 
I haven't watched the review yet but I just had to comment on this.

What? That is more than double the performance of the 5700XT, AMD's flagship RDNA1 card. CU/SM scaling is not linear in a general sense with GPUs. Normally having 70-80% scaling is a really really good result.

In this case they are over 100% better with double the number of CUs. That is pretty amazing and is in line with AMD's "NAVI X2" design goals. In what universe is 111% performance increase generation to generation a bad thing?
Did I say it was a bad thing? If so, can you show me where?
I said that, on paper, the 6900 XT is more than 111% more powerful than a 5700 XT.
Higher bandwidth, twice the RAM and a higher frequency give a greater-than-111% increase in pixel fill rate, as well as the supposed performance-per-watt gains over RDNA 1.
Add on top of that the addition of infinity cache as well.
To get less than a 1:1 improvement relative to the TFLOP increase alone isn't what I expected.
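Putting rough numbers on that expectation (a quick sketch using only the figures quoted in the posts above):

```python
# How much of the on-paper TFLOP increase showed up as measured performance.
tflops_5700xt, tflops_6900xt = 9.75, 23.0
measured_gain = 1.11  # the +111% average uplift quoted above

tflop_gain = tflops_6900xt / tflops_5700xt - 1              # ~1.36, i.e. +136% on paper
per_tflop_scaling = (1 + measured_gain) / (1 + tflop_gain)  # ~0.89

print(f"TFLOP increase: {tflop_gain:.0%}, realised per-TFLOP scaling: {per_tflop_scaling:.2f}")
```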
 

Sun Blaze

Banned
Before the dumbshit PR departments from Sony and Microsoft started, no one in the PC enthusiast space gave a damn about TFLOPs.
I never even used them in comparisons until a few years ago to make it simple to understand for the laymen.
Problem now is, people think it's worth a lot more as a metric than it really is.
 

Sun Blaze

Banned
2080Ti is 45% more powerful than 5700xt.
People were wondering if big Navi could beat 2080Ti....
That's what happens when the web is full of idiots. AMD could have made something matching a 2080 Ti in raster with RDNA.
When it was reported that the PS5 was clocked at 2.23GHz with 36 CUs and that AMD had an 80 CU card in the works, how could anyone believe it would only perhaps match a 2080 Ti?
Even at 2GHz it would still beat it easily.
 

Bolivar687

Banned
Newegg was bundling a reference 6800 with a $120 power supply for $779. That means they're selling the reference 6800 for the MSRP of the reference XT variant ($779 - $120 = $659, right around the 6800 XT's $649 MSRP rather than the 6800's $579)!!! WTF!
 

BluRayHiDef

Banned
Nvidia has released official benchmark results of Cyberpunk 2077 running on their RTX 30 Series cards and their RTX 2080 Ti.


[Charts: Cyberpunk 2077 official 4K performance, and DLSS Quality mode performance at 2560x1440 and 1920x1080]
 
this review shows how you can change the performance ladder with your game selection:



i think it's fair to say that when it comes to games which were built with the new consoles in mind, RDNA2 performs much more favorably. and i think it's kinda sad that this aspect was lost in so many reviews.
 
this review shows how you can change the performance ladder with your game selection:



i think it's fair to say that when it comes to games which were built with the new consoles in mind, RDNA2 performs much more favorably. and i think it's kinda sad that this aspect was lost in so many reviews.




i mean, it's not really a surprise. If you choose a handful of games that perform best on one vendor you can skew the results :) even though they don't show an accurate image. If you also lie about Nvidia's results like he does in some of his graphs, the gap is even bigger.
 
i mean, it's not really a surprise. If you choose a handful of games that perform best on one vendor you can skew the results :) even though they don't show an accurate image. If you also lie about Nvidia's results like he does in some of his graphs, the gap is even bigger.


well you could accuse other reviews of doing it the other way around.
that's the ultimate argument for having a big sample size.

i for my part don't accuse anyone of such things and would say he just opted for showing more recent games... as others opted to review within their usual benchmark suite (which is not optimal, but understandable given the time constraints)



anyways, having this datapoint is still valuable, because it shows that Navi 2x seems to fare better in engines that are built with next gen consoles in mind. that said, make no mistake: i'm certain Jensen won't let his driver teams sleep till they narrow the gap in those games.
 

Kenpachii

Member
Nvidia has released official benchmark results of Cyberpunk 2077 running on their RTX 30 Series cards and their RTX 2080 Ti.


[Charts: Cyberpunk 2077 official 4K performance, and DLSS Quality mode performance at 2560x1440 and 1920x1080]

And the 3000 series are all 1080p cards already.
 
well you could accuse other reviews of doing it the other way around.
that's the ultimate argument for having a big sample size.

i for my part don't accuse anyone of such things and would say he just opted for showing more recent games... as others opted to review within their usual benchmark suite (which is not optimal, but understandable given the time constraints)



anyways, having this datapoint is still valuable, because it shows that Navi 2x seems to fare better in engines that are built with next gen consoles in mind. that said, make no mistake: i'm certain Jensen won't let his driver teams sleep till they narrow the gap in those games.


But none of them are built with next gen consoles in mind, they're current gen games. And if we take 2020 games as "new games", PC Games Hardware has a bunch of other games: Desperados 3, Detroit Become Human, Black Mesa, Wolven, Resident Evil 3, and all of those are slower on Radeons.

Thing is, this video just seems to try to find and push a narrative. All his Nvidia scores are lower than what you actually get with Nvidia cards. Dirt 5 has already been removed from one tech channel's benchmark suite because it doesn't show consistent results and is skewed for AMD. Valhalla is an AMD title that runs very badly on every GeForce card; a 5700 XT is the same as a 2080 Ti in that game. If you cherry-pick these examples you'll get an inaccurate image of where these cards sit in the performance chart.
 

BluRayHiDef

Banned
But none of them are built with next gen consoles in mind, they're current gen games. And if we take 2020 games as "new games", PC Games Hardware has a bunch of other games: Desperados 3, Detroit Become Human, Black Mesa, Wolven, Resident Evil 3, and all of those are slower on Radeons.

Thing is, this video just seems to try to find and push a narrative. All his Nvidia scores are lower than what you actually get with Nvidia cards. Dirt 5 has already been removed from one tech channel's benchmark suite because it doesn't show consistent results and is skewed for AMD. Valhalla is an AMD title that runs very badly on every GeForce card; a 5700 XT is the same as a 2080 Ti in that game. If you cherry-pick these examples you'll get an inaccurate image of where these cards sit in the performance chart.

Disregarding price, RTX 3090 is the best card to get. Period. It has the best performance in rasterization, ray tracing, and DLSS. It's the King.
 

Xyphie

Member
If there's one reviewer I think just makes up performance figures it's Coreteks. Guy only ever reviews AMD products yet he has 3090's, 3080's, 3070's, 2080 Ti's and high-end Intel processors sitting around for comparisons? yeahright.gif
 

ZywyPL

Banned
Nvidia has released official benchmark results of Cyberpunk 2077 running on their RTX 30 Series cards and their RTX 2080 Ti.


[Charts: Cyberpunk 2077 official 4K performance, and DLSS Quality mode performance at 2560x1440 and 1920x1080]


I applaud CDPR for not hesitating this time around and not downgrading the game like they did with TWC3, and for actually bumping up the visuals throughout development. And that's all maxed out here; I wonder how the game will perform on more optimized settings. And supposedly the performance might get a bit better with upcoming patches. So yeah, that's how you make a game with PC in mind.


i think it's fair to say, that when it comes to games which were build with the new consoles in mind rDNA2 performs much more favorable. and i think it's kinda sad that this aspect was lost in so many reviews.

Especially UWP games from the MS Store perform so damn well on Radeon GPUs. But then again, the reviewers usually test the currently most popular/biggest AAA titles, because those are the games people are getting the newest cards for, and those rarely happen to be games made with AMD's architecture in mind. That's just the way it is, and there's little to no reason to manipulate the benchmarks only to show how well Radeons perform in games nobody cares about anymore or has never even heard of.
 

Rikkori

Member
I applaud CDPR for not hesitating this time around and not downgrading the game like they did with TWC3
What's there to downgrade? On the rasterisation front they're very much middle-of-the-pack in terms of what they're putting out. They don't impress in any area except the sheer volume of art on screen; kudos to them, they have brilliant artists and a helluva lot of them, but that's just a money thing. On the ray-tracing side, that's a lot less them and a lot more to Nvidia's credit, because that's who does the heavy lifting on these projects. Even still, if you go into the weeds it's clear that the RT they're doing is still quite gimped. Yes, there are a lot of effects, but if you look in detail they're very sparsely used; e.g. the diffuse illumination is not comparable to the level of GI you'd see in Metro Exodus, it's a much more lite version. I'd say we can put it above an SSRGI but not much above that. In fact, proper RTGI alone would be a bigger step up in visual fidelity, especially for characters, than what you see now where the sun is excluded from RT.

For me, this game just doesn't impress technically, because like so many others they fail to do BASIC things right while adding a lot of RT. You cannot make up for the lower texture quality, and lower amount of geometric detail with ray tracing. Just like how Quake RTX is not a next-gen looking game just because it adds pathtracing - you need more than that. The disappointment is that because they don't ship proper HQ assets this is not really a 4K game, it's much more suited to 1440p. Same as what happened with The Witcher 3 where what they shipped was really for 1080p gaming, and only as modders started tweaking textures & LODs did it suddenly start looking like something 4K.

And that's without going into even more important things like animations, npc models, interactivity etc.
 

IDKFA

I am Become Bilbo Baggins
Hmmmm

I'm going to buy a gaming rig in a few weeks. Was going to go with a 3070, but the AMD 6800 appears to be the better card and only slightly more expensive.

Got a lot of thinking to do.
 

llien

Member
Nvidia has released official benchmark results
Seriously, why on planet earth would anyone care about inflated values claimed by nvidia or even refer to them as "official"?

It is just a tool to put pressure on reviewers, nothing else.
The "baseline" suggestion by NV on how games perform on their shit leaked once and it was good 10-15% over the board.
Not to go too far, Huang claimed 1060 was 25% faster than RX 480, in reality it was 10%.


RT though
Given what RT does in the handful of games that support it (even in CP2077 I'd rather keep it off, as to me personally it looks better without), it is a very niche, barely relevant tech to begin with.
And then you have Dirt 5, which is optimized for RDNA2 and beats green cards easily, so, uh, welp...
 

llien

Member
It has the best performance in rasterization,
Even without OC and SAM enabled, it gets routinely beaten by 6900XT and even by 6800XT.

It looks even worse for team green bubble when focusing on the newest games:

[Benchmark charts]


It gets worse for said team at lower resolutions.
 

BluRayHiDef

Banned
Seriously, why on planet earth would anyone care about inflated values claimed by nvidia or even refer to them as "official"?

It is just a tool to put pressure on reviewers, nothing else.
The "baseline" suggestion by NV on how games perform on their shit leaked once and it was good 10-15% over the board.
Not to go too far, Huang claimed 1060 was 25% faster than RX 480, in reality it was 10%.



Given what RT does in the handful of games that support it (even in CP2077 I'd rather keep it off, as to me personally it looks better without), it is a very niche, barely relevant tech to begin with.
And then you have Dirt 5, which is optimized for RDNA2 and beats green cards easily, so, uh, welp...

You sound like a hater. Nvidia's benchmarks are not inflated; if they were inflated, then they wouldn't reveal that the RTX 30 Series cards struggle to run the game at native 4K with Ultra settings.

Perhaps you're upset that Nvidia's cards have DLSS, which makes it possible to run the game at decent frame rates in 4K with Ultra settings (including ray tracing).

What indicates such is your assertion that the game looks better without ray tracing, which is utterly absurd.

While you're being consumed by your anger and hatred, I will be enjoying this game with Very High settings (including ray tracing) in 4K via DLSS Performance Mode at 60 frames per second.
 
I think it's amazing that even with hard data, with numbers, some people still can't accept reality. llien keeps posting the "new games" benchmarks, ignores that the Radeon loses in 2 of them while claiming the Radeon is better in newer games, and also ignores that more new games run worse on Radeon: Black Mesa, Detroit, Desperados 3, Resident Evil 3, Watch Dogs Legion, Doom Eternal, Horizon Zero Dawn and, for sure, Cyberpunk.

The Radeons don't run better in newer games. They run better in exactly those 5 new games. That's it.
 
But then again, the reviewers usually test the currently most popular/biggest AAA titles

You are mostly right but for example if they were benching only the most popular games we would see stuff like CS:GO, Fortnite, PUBG, Rocket League and WoW. Then we see medium-sized games like Control, which has done well for itself but is hardly a 10+ million seller or really AAA at all; it is mostly included due to its solid implementation of RT effects, so in that sense it is a graphical showcase.

Some reviewers are even still benchmarking older titles from 2016 at times. Again, I'm not really disagreeing with your point, as the main thrust of it is right, but it does get a bit more complicated: depending on the titles chosen for review, how many are sponsored by AMD/Nvidia, and how many generally favor each architecture, the results can really skew one way or the other.

But yeah console ports that are not sponsored by Nvidia/AMD seem to perform better on AMD cards in a general sense. While historically UE4 titles for example regardless of console ports or not tend to favor Nvidia cards due to heavy optimizations in the core of the engine for Nvidia.

Anyway I don't really have a strong disagreement with you here or anything, and I'm kind of rambling on a bit at this point but it is interesting to take some of these things into account.
 

BluRayHiDef

Banned
I think it's amazing that even with hard data, with numbers, some people still can't accept reality. llien keeps posting the "new games" benchmarks, ignores that the Radeon loses in 2 of them while claiming the Radeon is better in newer games, and also ignores that more new games run worse on Radeon: Black Mesa, Detroit, Desperados 3, Resident Evil 3, Watch Dogs Legion, Doom Eternal, Horizon Zero Dawn and, for sure, Cyberpunk.

The Radeons don't run better in newer games. They run better in exactly those 5 new games. That's it.
He genuinely comes across as a fanboy; he reacted with hostility to my comment in which I posted Nvidia's official benchmark results for Cyberpunk 2077 running on their cards. His reaction was completely unwarranted because the post was a neutral report of data. Very strange.
 

CuNi

Member
Even without OC and SAM enabled, it gets routinely beaten by 6900XT and even by 6800XT.

It looks even worse for team green bubble when focusing on the newest games:

[Benchmark charts]


It gets worse for said team at lower resolutions.

Nice try cherry picking 1 reviewer.
Here's a website that aggregated multiple benchmark results from different reviews and averaged them.


Both the 6800 and 6800 XT lose.
The 3080 is on average 8% faster and the 3090 20% faster without RT.

Now we'd have to see what the average is between the 6900 and the 3090, though you can bet the 3090 will still beat it by a few percent.
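For context, aggregators like that typically normalise each game's result to one card and then take a geometric mean so no single title dominates; a minimal sketch with made-up numbers:

```python
from math import prod

# Hypothetical per-game FPS for two cards (made-up numbers, for illustration only).
card_a = [120, 95, 140, 60]   # card A across four games
card_b = [110, 100, 125, 58]  # card B across the same games

# Normalise per game, then take the geometric mean of the ratios.
ratios = [a / b for a, b in zip(card_a, card_b)]
geomean = prod(ratios) ** (1 / len(ratios))
print(f"Card A is {100 * (geomean - 1):.1f}% faster on (geometric) average")
```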
 

ZywyPL

Banned
You are mostly right but for example if they were benching only the most popular games we would see stuff like CS:GO, Fortnite, PUBG, Rocket League and WoW

Yeah, I was thinking about those while typing my post, but then I thought that those games run above 60FPS even on 10-year-old toasters anyway; those are not the games people buy a couple-hundred-dollar GPU for. Like really, what's the point of testing, for example, R6 to see whether it's 380 or 410 FPS? It's an absurd framerate either way, no one on the planet has a display capable of actually displaying all those frames, and the game doesn't look spectacular enough to justify spending that much on a GPU; neither do any of the popular MP-only titles. So from the point of view of reviewers, who are constrained by time, such benchmarks are a waste of time. Plus, it's not a secret people want to see benchmarks from the most demanding titles currently out there, because if those run great, so will the less demanding games.
 

llien

Member
It is interesting how the picture changes from resolution to resolution. The most likely explanation is that games where NV wins get CPU limited at lower resolutions, which is just another way of saying what computerbase said: AMD is faster than green in newer games:


[Benchmark charts]


Nice try cherry picking 1 reviewer.
I don't think you've read what that diagram was (it is newest games only), and for the love of god, please stop stating BS about computerbase, a site with one hell of a reputation.


Both the 6800 and 6800 XT lose.
Oh, boy, could you watch what you are commenting on? We are discussing 6900XT.
 

Mister Wolf

Member
Nvidia has released official benchmark results of Cyberpunk 2077 running on their RTX 30 Series cards and their RTX 2080 Ti.


Cyberpunk-2077-Official-4K-Performance.png


cyberpunk-2077-nvidia-geforce-rtx-dlss-quality-mode-2560x1440-performance.png


cyberpunk-2077-nvidia-geforce-rtx-dlss-quality-mode-1920x1080-performance.png

I should be able to get 60fps if I bump the settings besides RT from ultra to high and use DLSS Q with my 2080Ti.
 
llien continues to ignore reality, especially my post where I pointed out that AMD is NOT faster in newer games. It's slower. It's faster in the 5 newer games that are shown on computerbase, and slower in 10 others that are also new games.

Then we continue with the techspot reviews, which have the most AMD-leaning results out of all the reviews.

At this point we're just messing around, no? We're joking?
 

llien

Member
You sound like a hater. Nvidia's benchmarks are not inflated; if they were inflated
You sound like a fanboi.
NV's benchmarks are SHAMELESSLY inflated, and I'm not talking about "Jensen said so... on an official presentation of a product... and it was a lie" (3080 is twice as fast as 2080 isn't it?), but about actual benchmarks in NV's "here are the figures you should get" kit.

And if they were inflated, THEY WOULD NOT MATCH actual benchmarks, and, wait for it, they did not (the 1060 appeared to be only 10% faster, not 25% as claimed).

have DLSS
I'm upset that a SCAM like that exists (claiming that the tech is/does something it clearly does not; no, it does not improve things across the board, it adds blur, it wipes out fine details, it is particularly bad with quickly moving objects, and you have been shown screenshots at least twice, and not just some random screenshots but ones from sites hyping the crap out of that TAA derivative).
As of now, I game exclusively on PS4, could not care less what GPU has what fancy words slapped on it.

I should be able to get 60fps if I bump the settings besides RT from ultra to high and use DLSS Q with my 2080Ti.
You don't need NV's help to reduce the resolution to 1440p. Without the TAA derivative applied to it, it will be much faster too (roughly, 4K to 1440p should double your framerates; half of that gain is eaten by the TAA derivative).
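The rough doubling claim follows from pixel counts; a quick sketch (the DLSS internal resolutions noted in the comments are the commonly cited scale factors for 4K output, not measured figures):

```python
# Pixel-count ratios behind "4K -> 1440p roughly doubles framerates".
def pixels(width, height):
    return width * height

native_4k   = pixels(3840, 2160)   # 8,294,400 px
native_1440 = pixels(2560, 1440)   # 3,686,400 px
native_1080 = pixels(1920, 1080)   # 2,073,600 px

# 2.25x fewer pixels at 1440p; GPU-bound framerates scale roughly with this.
print(native_4k / native_1440)     # 2.25
# DLSS at 4K output renders internally at a lower resolution and upscales:
# Quality mode ~1440p, Performance mode ~1080p (commonly cited scale factors).
print(native_4k / native_1080)     # 4.0
```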
 

Mister Wolf

Member
You sound like a fanboi.
NV's benchmarks are SHAMELESSLY inflated, and I'm not talking about "Jensen said so... on an official presentation of a product... and it was a lie" (3080 is twice as fast as 2080 isn't it?), but about actual benchmarks in NV's "here are the figures you should get" kit.

And if they were inflated, THEY WOULD NOT MATCH actual benchmarks, and, wait for it, they did not (the 1060 appeared to be only 10% faster, not 25% as claimed).


I'm upset that a SCAM like that exists (claiming that the tech is/does something it clearly does not; no, it does not improve things across the board, it adds blur, it wipes out fine details, it is particularly bad with quickly moving objects, and you have been shown screenshots at least twice, and not just some random screenshots but ones from sites hyping the crap out of that TAA derivative).
As of now, I game exclusively on PS4, could not care less what GPU has what fancy words slapped on it.


You don't need NV's help to reduce the resolution to 1440p. Without the TAA derivative applied to it, it will be much faster too (roughly, 4K to 1440p should double your framerates; half of that gain is eaten by the TAA derivative).

I'm looking forward to your reveal that you actually bought a 3000 series GPU.
 

CuNi

Member
It is interesting how the picture changes from resolution to resolution. The most likely explanation is that games where NV wins get CPU limited at lower resolutions, which is just another way of saying what computerbase said: AMD is faster than green in newer games:


[Benchmark charts]



I don't think you've read what that diagram was (it is newest games only), and for the love of god, please stop stating BS about computerbase, a site with one hell of a reputation.



Oh, boy, could you watch what you are commenting on? We are discussing 6900XT.

I never said computerbase has a bad rep. Still, as little as 1 game says, so does 1 reviewer; 1 card cannot be used to determine the average power. You need a bigger sample base. You win some, you lose some; same goes for AMD. Aggregated data shows a better and bigger picture, and that clearly shows green ahead by 8 and 20 percent in raster without RT. So if the 6900 XT isn't 20% faster than the 6800 XT, then it isn't faster than the 3090 either. But sure, try to evade the point, I guess.
 

llien

Member
I'm looking forward to your reveal that you actually bought a 3000 series GPU.

That's impossible territory. NV engages in practices that I strongly oppose; there is no way I'm buying their product.

CuNi
Even after it was brought to your attention, you have still not figured out that we are not talking about the cards you've linked.
 

thelastword

Banned
So the 6900XT wallops the 3090 at $500 less......Better in 1080p, better in 1440p and about par in 4K with Nvidia taking the lead with a mixup of older games at 4K....

It's funny: every time AMD destroys Nvidia in a new title we have to hear "It's AMD sponsored", but some of these goons never make such disclaimers when they stick to old Nvidia titles to skew these results in NV's favor. Pretty much the majority of titles are still NV focused, so what's the point in specifying an AMD title, when the majority aren't....

They do the same for RT, some even use NV DLSS vs AMD native to compare RT.....This 6900XT is much more performant from 6800XT as opposed to 3080 to 3090......Then it has some amazing OC headroom unlike Nvidia and it also has SAM.......The crazy thing on some of these benchmarks is that people are still pairing these cards with 8700K's, when there is SAM available.....So SAM is cheating, but DLSS vs AMD Native is not....So much bias out there it's not even funny.....

From here on out, you will see the gap widen between AMD vs Nvidia, most titles will be AMD focused because of the consoles or well optimized for RDNA 2......Nvidia is in for a world of hurt, AMD will even catch up in RT as can be seen, very promising performance in the latest titles as opposed to NV developed RT titles like Control etc......Techtubers have no problems, forcing a million NV titles down our throats, but one AMD title and they must let you know....
 

BluRayHiDef

Banned
So the 6900XT wallops the 3090 at $500 less......Better in 1080p, better in 1440p and about par in 4K with Nvidia taking the lead with a mixup of older games at 4K....

It's funny: every time AMD destroys Nvidia in a new title we have to hear "It's AMD sponsored", but some of these goons never make such disclaimers when they stick to old Nvidia titles to skew these results in NV's favor. Pretty much the majority of titles are still NV focused, so what's the point in specifying an AMD title, when the majority aren't....

They do the same for RT, some even use NV DLSS vs AMD native to compare RT.....This 6900XT is much more performant from 6800XT as opposed to 3080 to 3090......Then it has some amazing OC headroom unlike Nvidia and it also has SAM.......The crazy thing on some of these benchmarks is that people are still pairing these cards with 8700K's, when there is SAM available.....So SAM is cheating, but DLSS vs AMD Native is not....So much bias out there it's not even funny.....

From here on out, you will see the gap widen between AMD vs Nvidia, most titles will be AMD focused because of the consoles or well optimized for RDNA 2......Nvidia is in for a world of hurt, AMD will even catch up in RT as can be seen, very promising performance in the latest titles as opposed to NV developed RT titles like Control etc......Techtubers have no problems, forcing a million NV titles down our throats, but one AMD title and they must let you know....

The 3090 has RTX and DLSS, which make it better than the 6900 XT. In regard to rasterization, whenever the 6900 XT is faster, it's barely so, which means nothing considering that it isn't always faster.

Ampere is simply better than RDNA2; it's a Jack of all trades while RDNA2 is merely a master of one and only in some comparisons.
 
So the 6900XT wallops the 3090 at $500 less......Better in 1080p, better in 1440p and about par in 4K with Nvidia taking the lead with a mixup of older games at 4K....

It's funny: every time AMD destroys Nvidia in a new title we have to hear "It's AMD sponsored", but some of these goons never make such disclaimers when they stick to old Nvidia titles to skew these results in NV's favor. Pretty much the majority of titles are still NV focused, so what's the point in specifying an AMD title, when the majority aren't....

They do the same for RT, some even use NV DLSS vs AMD native to compare RT.....This 6900XT is much more performant from 6800XT as opposed to 3080 to 3090......Then it has some amazing OC headroom unlike Nvidia and it also has SAM.......The crazy thing on some of these benchmarks is that people are still pairing these cards with 8700K's, when there is SAM available.....So SAM is cheating, but DLSS vs AMD Native is not....So much bias out there it's not even funny.....

From here on out, you will see the gap widen between AMD vs Nvidia, most titles will be AMD focused because of the consoles or well optimized for RDNA 2......Nvidia is in for a world of hurt, AMD will even catch up in RT as can be seen, very promising performance in the latest titles as opposed to NV developed RT titles like Control etc......Techtubers have no problems, forcing a million NV titles down our throats, but one AMD title and they must let you know....



This is some quality comedy :) At this point we can joke around about this and have fun.

Truth is, the 6900 is even more redundant than the 6800 XT: 1% faster than the 3080 in raster, 11% slower than the 3090, and everything else is worse. It's a peculiar card. They were too far behind Nvidia to upset the status quo. But it's a good job for them regardless.
 