
RDNA2 Isn't As Impressive As It Seems

rnlval

Member
Sure, but I am talking about performance per mm² in comparison to Nvidia, not the overall cost. It's conceivable that by having a huge cache AMD are sacrificing performance per mm² for performance per watt, and maybe for a lower total cost overall.
Many SKUs being faster than my existing RTX 2080 Ti is good from my POV.

From https://wccftech.com/amd-radeon-rx-6800-rdna-2-graphics-card-ray-tracing-dxr-benchmarks-leak-out/

Without DLSS, RX 6800 seems to be a reasonable GA102 level GPU.
 

rnlval

Member
I agree that AMD seems to have made the right choice, since otherwise they would have had to reduce absolute performance, or increase power consumption, like Nvidia.
I still plan to buy RTX 3080 Ti AIB OC and maybe RX 6800 XT AIB OC on my second gaming PC rig. More selections = better.

RTX support with Blender3D is a factor for my RTX 3080 Ti AIB OC selection.
 

BluRayHiDef

Banned
So? I'm getting the video card that is available. Fuck the paper launch of Nvidia.

It's not a paper launch. The problem is that insanely high demand has been compounded by suppressed manufacturing and shipping speeds due to the pandemic.

If this were a paper launch, then one person wouldn't be able to have acquired two cards from two different vendors via two different means: an RTX 3080 FTW3 that was bought at Micro Center and a Revel Epic-X RTX 3090 that was ordered via phone from PNY.

 

Armorian

Banned
It is very impressive; not many people (including me) expected AMD to actually compete in the high-end GPU space. Oh, and Nvidia didn't charge ~$1,000 for the 3080 only because they knew what AMD had coming :messenger_tears_of_joy:
 
This is the kind of drivel hit piece I'd expect from Tom's Hardware. I'd argue RDNA2 is more efficient: a smaller bus width and it's still trading blows at 4K? The Infinity Cache seems to be a great innovation. How about we spin this as "Ampere isn't as efficient because it needs more bandwidth and a bigger bus to keep up with RDNA2"?

There's been discussion for a while about Ampere being, yes, not that efficient at any resolution below 4K.
Remember the "doubled CUDA cores"? Those cores were already there; they weren't added, they were just improved to allow them to do FP32 ops all the time, unlike before. But the rest of the GPU's resources weren't scaled proportionally with this change. Those extra FP32 cores alone can't push the whole structure forward; they can only help with their part, and this is where 4K and higher resolutions come in. Working on all those extra pixels is their job; at lower resolutions they go underutilized.
 

BluRayHiDef

Banned
I spent a lot of money on RTX 3080 and 3090 cards so I can't handle the fact these AMD cards might be better and cheaper so I'm gonna make a thread to shit on them and make me feel better.

You may think that that's the case, but it's not. I genuinely have no regret about my purchases; if I wanted to, I could return my 3080 to Micro Center for a full refund since the 30-day return period has not ended yet; however, I'm keeping the card because I'm genuinely satisfied with it.

I regret absolutely nothing.

By the way, you users who will react to this post and have reacted to my other posts with the childish "LOL" emoji seriously need to grow up.
 

mr.dilya

Banned
You may think that that's the case, but it's not. I genuinely have no regret about my purchases; if I wanted to, I could return my 3080 to Micro Center for a full refund since the 30-day return period has not ended yet; however, I'm keeping the card because I'm genuinely satisfied with it.

I regret absolutely nothing.

By the way, you users who will react to this post and have reacted to my other posts with the childish "LOL" emoji seriously need to grow up.

Part of the reason for the reaction to your posts is how you present them. You come off like a snob and a know-it-all. From what I've seen, you're also always keen to show off what you have any chance you get. If it's that type of "show off your stuff" thread, cool, but, for example, nobody asked you for photo evidence of your 3080 and 3090 in here; you just felt the need to post them. I'm sure there are many posters here who have a better system than you who don't feel the need to seek constant validation every chance they get.

Nobody likes to deal with people like that in real life, let alone online... so if you want more hospitable responses to your posts, maybe you need to address how you interact with others. A lot of your posts are tacky and cringe, bruh, just being real.
 

Boss Mog

Member
55" screen. I sit two to three feet away from it.

You may think that that's the case, but it's not. I genuinely have no regret about my purchases; if I wanted to, I could return my 3080 to Micro Center for a full refund since the 30-day return period has not ended yet; however, I'm keeping the card because I'm genuinely satisfied with it.
You have the RTX cards as your desktop background ffs; you're obviously a fanboy throwing a hissy fit and trying to shit on the competition.

By the way, you users who will react to this post and have reacted to my other posts with the childish "LOL" emoji seriously need to grow up.
Says the dude with a bunch of dolls next to his TV...
 

BluRayHiDef

Banned
Part of the reason for the reaction to your posts is how you present them. You come off like a snob and a know-it-all. From what I've seen, you're also always keen to show off what you have any chance you get. If it's that type of "show off your stuff" thread, cool, but, for example, nobody asked you for photo evidence of your 3080 and 3090 in here; you just felt the need to post them. I'm sure there are many posters here who have a better system than you who don't feel the need to seek constant validation every chance they get.

Nobody likes to deal with people like that in real life, let alone online... so if you want more hospitable responses to your posts, maybe you need to address how you interact with others. A lot of your posts are tacky and cringe, bruh, just being real.

I don't care for your opinion or irrelevant diatribes; just ignore me if you don't like my posts and don't post in my threads. Also, I'm not your bruh.
 
ITT: a lot of clueless fanboys that should stick to slinging rocks in the console wars. OP is objectively correct in his calculations and understanding of what's what. AMD may win this round but it's by luck of having a better foundry make their chips rather than some miraculous design brilliance on their part.

Regardless, I'm still sitting comfortably on my 1080 Ti and will happily wait for a 40 series from Nvidia, most likely on 5nm, where the jump over Ampere will be massive. 4080 Ti, here I come.
 

Senua

Member
I don't care for your opinion or irrelevant diatribes; just ignore me if you don't like my posts and don't post in my threads. Also, I'm not your bruh.
Dude, this is the worst way to handle criticism, very John Linneman-esque, and we all know what happened to him :messenger_poop:
 

mr.dilya

Banned
A question, though: would it be safe to say that AMD's first two cards in the lineup are more "future-proof" than the 3080 and 3070 for the VRAM alone? Or are there other considerations? I know about DLSS right now, but it seems to me that for people who don't upgrade their GPU every year, AMD is probably the wiser decision.
 

Athreous

Member
Hmm, sorry, did AMD show their new cards? Are they better and cheaper than the 30xx series?
I might upgrade next year!
 

Herr Edgy

Member
So what? I am still buying the 6800 XT card to shit on Nvidia's lousy practices, like charging 1,500 dollars for the 2080 Ti because there was no competition.
I thought GAF's position was that you are responsible for yourself and companies can do no wrong, given the "why don't you just quit if you don't like crunching at CDPR" attitude of many posters here.

Just don't buy it
 

BluRayHiDef

Banned
A question, though: would it be safe to say that AMD's first two cards in the lineup are more "future-proof" than the 3080 and 3070 for the VRAM alone? Or are there other considerations? I know about DLSS right now, but it seems to me that for people who don't upgrade their GPU every year, AMD is probably the wiser decision.

The RTX 3070 might be in trouble, because it has GDDR6 rather than GDDR6X, and it has only 8 GB of it.

On the other hand, the RTX 3080 should be fine for the foreseeable future because it has GDDR6X, which is very fast and provides the card with 760 GB/s of bandwidth. Hence, the RTX 3080 can swap data in and out of its VRAM fast enough to compensate for its relatively small amount of VRAM compared to the RX 6800 XT's 16 GB of GDDR6.
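For what it's worth, that 760 GB/s figure checks out against the 3080's published memory specs; a quick sketch of the arithmetic (spec-sheet values, 19 Gbps GDDR6X on a 320-bit bus):

```python
# Sanity check on the 760 GB/s figure from the RTX 3080's spec sheet.
data_rate_gbps = 19                  # GDDR6X data rate per pin, Gb/s
bus_width_bits = 320                 # memory bus width
bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8   # bits -> bytes
print(f"{bandwidth_gb_s:.0f} GB/s")  # 760 GB/s
```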
 

PhoenixTank

Member
BluRayHiDef I've heard the "Nvidia will be even better on 7nm" argument since Turing and the lead-up to Navi, but unfortunately it is basically irrelevant until they are actually using TSMC's 7nm node.
For better or worse, they chose Samsung 8nm, a node with different characteristics, and the prospect of a refresh, while plausible, is practically a footnote to your entire post.

There are a few flaws here:
Anandtech estimates that about 6 billion of that 26.8B transistor budget goes toward the Infinity Cache, which throws off the numbers.
https://www.anandtech.com/show/16202/amd-reveals-the-radeon-rx-6000-series-rdna2-starts-at-the-highend-coming-november-18th/2 said:
Doing some quick paper napkin math and assuming AMD is using standard 6T SRAM, Navi 21’s Infinity Cache would be at least 6 billion transistors in size, which is a significant number of transistors even on TSMC’s 7nm process (for reference, the entirety of Navi 10 is 10.3B transistors). In practice I suspect AMD has some optimizations in place to minimize the number of transistors used and space occupied, but regardless, the amount of die space they have to be devoting to the Infinity Cache is significant. So this is a major architectural trade-off for the company.
As mentioned, the real number is likely lower than 6B, but it's still significant, and it makes this transistor-count argument more of a pears-to-apples comparison.
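For reference, a quick sketch of where that ~6B figure comes from, under the same assumptions Anandtech states (128 MB of plain 6T SRAM, ignoring tags and whatever density optimizations AMD actually uses):

```python
# Anandtech-style napkin math for the Infinity Cache's transistor cost.
# Rough ballpark only, per the caveats above.
cache_bytes = 128 * 1024 * 1024     # 128 MB Infinity Cache
cache_bits = cache_bytes * 8
transistors = cache_bits * 6        # 6 transistors per SRAM bit cell
print(f"~{transistors / 1e9:.1f}B transistors")   # ~6.4B
```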

You ignore that some of AMD's die space in the CUs will indeed be for ray tracing acceleration, and give the benefit to Nvidia. Fair argument on Tensor core die space, though.

There is also no good reason to suspect that Nvidia would opt for similarly large GA102 dies on TSMC 7nm rather than more chips per wafer and better yields, which would go some way toward negating some of the benefits of the overall density improvements. I may be brainfarting on this last point.
 
I'm waiting for availability and real world testing. Leaning more toward a 3080 though personally.
I always find it ironic how PC players act like the adults in the room, looking down their noses at "console wars", then realise they're exactly the same in threads like this, except instead of consoles it's overpriced graphics cards they fight over 😆
There are fanboys for everything, even cell phones, operating systems etc. At least they aren't in here celebrating taking content away from another platform. Puts them miles above those sad sacks.
 

thelastword

Banned
Many SKUs being faster than my existing RTX 2080 Ti is good from my POV.

From https://wccftech.com/amd-radeon-rx-6800-rdna-2-graphics-card-ray-tracing-dxr-benchmarks-leak-out/

Without DLSS, RX 6800 seems to be a reasonable GA102 level GPU.
I've seen that, and that's why I was telling folks this is mighty impressive for AMD's first run at RT, yet NV has DLSS on for better frames... So I will want to see the 6800's RT performance with Super Resolution on... That's when things will get interesting, because by the looks of it, the 6800 is already ahead of the 2080 Ti and 3070 when no upscaling technology is used. I shudder to think of AMD's RT performance with SR, since the rumor is that SR is more performant than DLSS in terms of frames...


Here is the same game at 1440p... And these games were not even made with AMD's ray-tracing technology in mind... Hold onto your hats...


[Chart: AMD Radeon RX 6800, Shadow of the Tomb Raider, 1440p (QHD)]
 
I'm waiting for availability and real world testing. Leaning more toward a 3080 though personally.
There are fanboys for everything, even cell phones, operating systems etc. At least they aren't in here celebrating taking content away from another platform. Puts them miles above those sad sacks.
[two benchmark charts]

Hm...

I know they are cherry-picked, but that this is even possible at all says a lot.
I mean, you can find examples of "massive" differences like this when it comes to Ryzen vs Intel too, with gains in favor of Intel. It's not the end of the conversation.
 

martino

Member
So what? I am still buying the 6800 XT card to shit on Nvidia's lousy practices, like charging 1,500 dollars for the 2080 Ti because there was no competition.
A lot of people like me were waiting for high-end competition because of this.
But not day one, and not without games made for next gen tested on the architecture (at least in my case).
 

BluRayHiDef

Banned
I mean, you can find examples of "massive" differences like this when it comes to Ryzen vs Intel too, with gains in favor of Intel. It's not the end of the conversation.

It should be considered that the RX 6000 series cards in those charts are paired with Ryzen 5000 series CPUs, and therefore benefit from the performance boost that is intrinsic to that pairing, and that they also benefit from the overclock mode known as "Rage Mode." However, not all adopters of the RX 6000 series will pair them with Ryzen 5000 series CPUs, and the RTX 30 series cards can be overclocked as well (though they need a lot of power).
 

pr0cs

Member
There's no need for personal grudges against a business; their purpose is to maximize profit however they can.
And it's our job as consumers not to reward them for trying underhanded and questionable tactics just because they haven't had good competition for a while.
Nvidia have been scum for the last few generations and are only now trying to be reasonable because AMD is finally a threat. I refuse to support that behaviour.
 

thelastword

Banned

Best one so far. Hilarious....

The truth is, this has been done so many times by NV: release an expensive card, then, when the competition shows their wares, be ready to cut $500-700 off the prime card with more VRAM and better performance. So where was that performance all along? Why could they not offer their fans/customers that performance and price in the first place? Look, AMD just launched, and they're already readying an Ampere refresh weeks later, most probably with more RAM and at a lower price...

Yet I don't bother too much; Nvidia fans defend anything they do to the death; as an example, this thread here. So whatever they end up spending over and over again on Jensen's hype trains, they deserve... They got robbed spending all that cash on the 2080 Ti, with a 3080 Super offering similar performance later on. Same with the 2080... The biggest hype was for the 35.58 TF 3090, and it can't beat the 23 TF 6900 XT... People will defend NV to the death, no matter how much lower the performance and how much higher the price; I think this is now a fact of life... The only way that can change is for people to start rewarding the companies giving them the best bang for their buck, and that's clearly AMD... It's the only way these proud NV fans will realize better performance per watt and per dollar is far more important for the industry and all gamers as a whole... Only when everybody goes to the red team, including their friends, will they feel silly and left out... Right now they are only riding the brand train instead of the sensible and logical train... Yet the market is about to change drastically in the GPU space, just as the CPU space has changed... Change is inevitable...
 

BluRayHiDef

Banned
Best one so far. Hilarious....

The truth is, this has been done so many times by NV: release an expensive card, then, when the competition shows their wares, be ready to cut $500-700 off the prime card with more VRAM and better performance. So where was that performance all along? Why could they not offer their fans/customers that performance and price in the first place? Look, AMD just launched, and they're already readying an Ampere refresh weeks later, most probably with more RAM and at a lower price...

Yet I don't bother too much; Nvidia fans defend anything they do to the death; as an example, this thread here. So whatever they end up spending over and over again on Jensen's hype trains, they deserve... They got robbed spending all that cash on the 2080 Ti, with a 3080 Super offering similar performance later on. Same with the 2080... The biggest hype was for the 35.58 TF 3090, and it can't beat the 23 TF 6900 XT... People will defend NV to the death, no matter how much lower the performance and how much higher the price; I think this is now a fact of life... The only way that can change is for people to start rewarding the companies giving them the best bang for their buck, and that's clearly AMD... It's the only way these proud NV fans will realize better performance per watt and per dollar is far more important for the industry and all gamers as a whole... Only when everybody goes to the red team, including their friends, will they feel silly and left out... Right now they are only riding the brand train instead of the sensible and logical train... Yet the market is about to change drastically in the GPU space, just as the CPU space has changed... Change is inevitable...

1. When has Nvidia ever released a SKU with better performance and more VRAM at a lower price? Please give me an example.

2. Ampere teraflops are not the same as RDNA2 teraflops. You're comparing apples to oranges.
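To make that concrete, here's where the two paper numbers in the quoted post come from; a quick sketch using published shader counts and boost clocks (peak FP32 = shaders × 2 ops × clock):

```python
# Peak FP32 TFLOPS from published shader counts and boost clocks.
def peak_tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * 2 * boost_ghz / 1000   # 2 FP32 ops (an FMA) per clock

print(f"RTX 3090:   {peak_tflops(10496, 1.695):.2f} TF")  # ~35.58 TF
print(f"RX 6900 XT: {peak_tflops(5120, 2.250):.2f} TF")   # ~23.04 TF
# Ampere's shader count includes the shared FP32/INT32 datapath, which
# games rarely keep fully busy, so its paper TFLOPS overstate delivered
# game performance relative to RDNA2's.
```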
 

Bboy AJ

My dog was murdered by a 3.5mm audio port and I will not rest until the standard is dead
3080 owner and future Zen 3 owner here. I’ll wait for RDNA 2 benchmarks. Happy to swap my 3080 for an AMD card if it’s better. But if I’m also missing out on legit RTX and DLSS, I’ll spend the extra $50 and lose some FPS for the 3080.

I am happy to even be having this discussion. Nvidia thinks they can squeeze every dollar they can out of us. Great to see AMD being competitive again.
 
Some of y'all need to grow up. I'm not trying to defend BluRayHiDef, but I'm in the same boat. I was so close to jumping the gun on the 30XX cards but decided to wait and see what AMD has to offer. I'm actually still waiting on actual benchmarks from outside AMD (I did the same while waiting for the 30XX cards, until reviewers received them). It doesn't matter if it's Intel, AMD, Nvidia, or even PowerVR: whoever has the best ray-tracing performance and the best visuals at high framerates will get my money. If AMD has faster rasterization as well as faster ray tracing, I will buy it. If AMD has faster rasterization but slower ray tracing, I will go with Nvidia.

There's no point in ME (IMO) getting a GPU that isn't the most performant in features that will eventually be in just about every game in the future. If you don't care about the best raytracing performance, AMD is perfectly fine for you.
 

llien

Member
TLDR;

AMD RDNA2 beats Ampere on performance, price, power consumption, and sensible VRAM size, all while using fewer transistors, BUT:

If we use our imagination and write off an arbitrary number of Ampere transistors to "RT" and "Tensor Cores", we can figure that even though Ampere looks like a poor, inferior product, it is actually a very, very good design, somewhere, deep inside. :messenger_beaming:
 

nani17

are in a big trouble
It's not a paper launch. The problem is that insanely high demand has been compounded by suppressed manufacturing and shipping speeds due to the pandemic.

If this were a paper launch, then one person wouldn't be able to have acquired two cards from two different vendors via two different means: an RTX 3080 FTW3 that was bought at Micro Center and a Revel Epic-X RTX 3090 that was ordered via phone from PNY.

Dude, seriously, come on. I own an Nvidia card, but the majority of tech YouTubers call it a paper launch. It's a shit launch, it was bad, and yes, AMD is making them scared, very scared.

There seems to be a trend in your threads of protecting Nvidia. You have a good one; AMD getting better isn't bad for you at all.
 

recursive

Member
Part of the reason for the reaction to your posts is how you present them. You come off like a snob and a know-it-all. From what I've seen, you're also always keen to show off what you have any chance you get. If it's that type of "show off your stuff" thread, cool, but, for example, nobody asked you for photo evidence of your 3080 and 3090 in here; you just felt the need to post them. I'm sure there are many posters here who have a better system than you who don't feel the need to seek constant validation every chance they get.

Nobody likes to deal with people like that in real life, let alone online... so if you want more hospitable responses to your posts, maybe you need to address how you interact with others. A lot of your posts are tacky and cringe, bruh, just being real.
A picture of them both sitting on a desk, no less.
 