
Unreleased AMD Radeon RX GPU Shows Up in OpenVR Benchmark & Outperforms The NVIDIA GeForce RTX 2080 Ti by Up To 17%

Ascend

Member
AMD's recent GPUs have all been very late responses to what Nvidia has long had on the market, and even then they only come out with a marginally better value product, if the only thing you really care about is performance/$ in mainstream benchmarked games. Try a little something outside the mainstream, like OpenGL emulation in CEMU, and performance was atrocious until we finally started getting Vulkan options, which we can't really thank AMD for.
Uh... You're aware that Vulkan would not exist without AMD's Mantle, right?
As for the whole OpenGL emulation on CEMU... How is that AMD's fault? Isn't it the job of the developers to optimize it for AMD hardware?

And let's not go into AMD's old DX9/DX11 CPU overhead issue that they never fixed. These sorts of issues crop up left and right once you go outside the comfort zone that is the mainstream benchmark games, and even those have some doozies that AMD often fails to address. I don't want it to be like that, but that's just reality.
Do you know what that CPU overhead 'issue' actually entails? Let me give you a hint. Why does Crysis, even to this day, run like sh*t on even the latest hardware?
AMD did improve a lot on the wrongfully named CPU overhead issue with driver updates, but I guess either nobody knows or nobody really cares.
Yes, things are like that, but the why is still important.

There's far more to Nvidia's feature support than just Ray-tracing. They've pushed most of the real improvements in the industry, beginning with things like G-Sync. AMD has merely followed suit. Things like Chill, Boost, Async compute: what do these really offer me when AMD GPUs are louder, slower and demand more power regardless? FreeSync was a useful response to proprietary G-Sync, but it would not exist without it either, and more often than not the implementations were sub-par.
How can you say AMD 'follows suit' and use Chill and Boost as an example when nVidia never implemented those? Additionally, this really shows how skewed your perspective really is... I mean...

Who uses smaller nodes first? (In Before "They do it because they need to!")
Who had DX 10.1 features first?
Who came out with a unified shader architecture first?
Who first had tessellation support on their cards?
Who was the first to put sound through HDMI on their cards?
Who had DX12 support first?
Who had concurrent async compute first?

And how exactly did the whole Anti-lag thing go? Oh right... nVidia lied saying they did it first, and then they released a low latency mode to compete with AMD after AMD already released their anti-lag. But somehow it's AMD that follows suit...

But seriously, AMD cards are rarely louder, slower and more power hungry at the same time, especially if taking AIB cards into account. And that's the exact function of Radeon Chill, which no one cares about because it's specifically AMD's. That reduces your power consumption significantly, if you really truly care about that. In reality, people don't care about it, except to trash AMD.

I disagree that FreeSync would not exist without G-Sync. Maybe the timing of it changed, but it was pretty much inevitable. And if we start talking about variable refresh rate over HDMI, nVidia still does not have that. And that is one example that has dragged on for ages on nVidia's side. But I guess people only pay attention when it's AMD dragging their feet and missing certain features....

Then there's the software feature set. AMD has a new pretty GUI, and half the shit in it doesn't work on their newest GPUs as per Gamers Nexus. Nvidia's control panel is old, but at least it works. Then there are features like Ansel, and now Reshade filter support on top of what they already offer. Where are AMD's equivalent OSD features? They took forever even with ReLive, or whatever they call their equivalent of Shadowplay these days. Even if AMD manages to bring some marginally useful software feature that Nvidia doesn't have, image sharpening for example, it's usually replicated in no time, whereas the opposite takes forever or never happens at all. That's the reality of the differences between AMD and Nvidia when it comes to software support and the resources they can dedicate.
Hello variable refresh rate over HDMI again.
Radeon Chill?
ReLive is now practically superior to ShadowPlay. Look at which one is rated best for features...

As for the Gamers Nexus stuff that doesn't work: did you look at the fixed issues list in the latest driver release? Here:

Are there issues? Yes. At least AMD is open about it in their driver releases: which issues are fixed and which are still pending. nVidia's issues are never published like this, but just go to the nVidia forums and you'll see how many issues people have. But somehow, that still is not seen as an issue among the gaming community. It's still true that nVidia has way more resources than AMD, which makes it even more baffling that AMD is slammed for what they offer, rather than praised, considering their limited resources.

I'm not acting like it's bad in itself. Not sure how you got that idea.
Maybe because you said "if this GPU comes some time in H2 and goes right against Nvidia's next gen, suddenly it'll just be another 5700XT at best".
Expressed in this way, it comes across as extremely belittling.

Not sure how you got that idea. I got a 2070S for ~100€ more than what I could've gotten a 5700XT for, for a few reasons. I play games like BOTW on CEMU, and modded Skyrim LE. Performance in these games on Radeon is atrociously bad, due to the neglected OpenGL optimization and the CPU overhead issues I already mentioned. I play on a 4K panel, and even the 2070S isn't fast enough for it; going 5700XT would not help there, and beyond that there's no choice at all. AMD cards also like to suck a bunch of idle power for no reason on multi-monitor setups. I could go on endlessly about all these sorts of little issues that AMD never gets around to addressing, which ultimately makes it more reasonable to pick an Nvidia card. It's so much more than just the perf/$ in currently benchmarked titles, and that's where Radeon starts to stumble.
Remember this?

Didn't think so. Because it only matters when it's AMD.

You seem to forget AMD has a node advantage, one that'll be erased this year.
Node "advantage"... The reason nVidia is practically always later on a node is because they prefer the matured process. Releasing products on a smaller nodes early is not really an advantage at all.
Having said that, even if you correct for node size, the 5700XT is still smaller compared to the 2070 Super. The architecture advantage of RDNA will be reduced but will not go away after nVidia's node shrink, if things remain similar. This might change with AMD's incorporation of hardware RT, but we'll see.
Additionally, AMD has reserved practically the whole of TSMC's 7nm wafers. nVidia would have to go to Samsung for their chips, and it's well-known that TSMC's 7nm tech, both DUV and EUV, are superior to Samsung's at this point where it matters most; yields.

Clock to clock is a useless metric in real world products.
No it isn't. It shows how slow or fast an architecture is. It becomes arguably useless in end products if the clock speeds vary by a lot, like in the case of Gen 1 Ryzen (4GHz) and Intel's CPUs (5GHz). But for GPUs of AMD vs nVidia the clocks aren't really that different, so that makes it extremely relevant.
While Vega was about 15% slower than Pascal, RDNA is equal to Turing (technically it's 1% faster, but that's margin of error stuff). Put differently, RDNA is a 39% uplift in per clock performance compared to Polaris, and 28% over Vega. If you don't understand how huge this is, no one can help you. And this is the reason why some of us are quite optimistic about big Navi and RDNA2.
And if what they just said at CES is true, they should not be underestimated. They updated the Vega CUs and received a 59% uplift in performance. Going by the 28% figure above, that would now put the updated Vega at roughly 24% higher performance per clock than RDNA 1 (1.59 / 1.28 ≈ 1.24)... So.. Yeah...
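For what it's worth, here is a minimal sketch of what a clock-normalized ("clock-to-clock") comparison like the percentages above actually involves. The card names, fps and clock figures are made up purely for illustration:

```python
# Normalize measured performance by the clock each card actually ran at,
# then compare the per-clock results. All values below are invented.

cards = {
    # name: (average fps in some fixed test, average core clock in MHz)
    "arch_A": (100.0, 1900.0),
    "arch_B": (103.0, 1850.0),
}

perf_per_clock = {name: fps / mhz for name, (fps, mhz) in cards.items()}

ratio = perf_per_clock["arch_B"] / perf_per_clock["arch_A"]
print(f"arch_B delivers {ratio:.2%} of arch_A's performance per MHz")
# If real-world clocks are close (as argued above for current AMD vs Nvidia GPUs),
# this per-clock ratio tracks actual performance; if clocks differ a lot
# (e.g. Gen 1 Ryzen vs Intel), it stops predicting the end result.
```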

I'm not saying AMD is dragging because Nvidia has a lot of cards on the market. I'm saying they're dragging because they don't have competitive products out on the market in several segments. Any fool can see that. They had essentially 5700XT and non-XT last year, everything else was either old or effectively useless in terms of moving the market. This year they'll have Navi 20. That's probably it. Nvidia in turn will likely refresh their whole stack or close to it.
As already mentioned, the 5700 series is the go-to card for anything over $300. If someone buys nVidia, they were most likely going to buy nVidia anyway.
Vega 56 and 64 were pretty good deals last year, despite their age. The 5600 series is coming out now, and is the obvious choice for the $200 - $300 range. But I guess it will be considered 'late', because it's AMD.
Polaris, especially the RX 570, was still the king in the $100 - $200 segment, except no one cared about it because it's not an nVidia card. The nVidia cards released later in this segment were never considered 'late', despite Polaris dominating that range for years.

AMD is doing what they need to be doing. They are following their own releases, rather than adapting to nVidia. I don't know if you noticed, but, after a long while, nVidia has released a huge list of cards to actually combat AMD's releases. When was the last time that happened? The momentum has shifted from AMD adapting to nVidia, to nVidia adapting to AMD. nVidia are faster for sure, but I see a change in trend. Most don't see it yet.

If AMD is just competitive, people still go for Nvidia, right? It's not the job of consumers to help failing companies. It's their job to sell the product, and in AMD's case that's going to require a product faster and more efficient than Nvidia.
It's not the job of consumers, but it's a good idea to be conscious of what you're supporting with your money.

People blindly buying nVidia cards is the equivalent of people blindly putting money into loot boxes. And when someone comes around and says that there are no loot boxes in this publisher's game, so buy this game instead, everyone starts saying that the loot boxes add to the gaming experience and that games with loot boxes are superior. And then they go on to say that it's the job of the publisher without the loot boxes to convince people to buy their game. Sounds like those people are already convinced, and that publisher is better off not bothering.

As for what AMD requires, nobody really knows. There was no real competition for the RX 570 for quite a while. No one bought it. Everyone went for the 1050 Ti instead.

When AMD releases a card that is faster and more efficient, people will start talking about drivers.
When AMD fixes the drivers, people will start talking about price.
When AMD lowers the price, people will start talking about ray tracing.
When AMD adds ray tracing, people will start talking about whatever else is required to still justify how nVidia is better.

There's absolutely no worry right now that AMD is going to deliver a product that will dominate Nvidia in the GPU market. They could have a dominating product for half a decade and they still wouldn't be where Nvidia is right now in terms of market leadership.
And do you see that as a good, or a bad thing? To me, that is one of the worst things to happen in this industry.

And I'll say again. People thought the same about Intel. Yeah Yeah, nVidia isn't sleeping like Intel is. I get it. But AMD's CPU brand making a name for itself will inevitably carry over to their GPU sales. Because people are shallow like that. Don't underestimate what can happen.

If AMD can push some next gen console RT/Navi advantage in the PC market, great for them, but we've all heard these stories before. They amounted to less than nothing this gen.
They didn't have an updated architecture to go along with it.

I would like nothing more than AMD to actually offer better products.
So that you can buy nVidia cheaper, right?

As it stands, we're a long way off from that. I don't bemoan people who get themselves an AMD GPU, there's plenty of use cases where they offer plenty of value and sufficient support, aka the mainstream volume segment I talked about.
And for some reason the mainstream doesn't really use AMD cards. Why is that?

I bemoan people who think there's no reason to pick Nvidia other than overpaying for more performance or blind brand worship.
There are always reasons. But the majority of reasons given are either obviously biased or simply excuses. It's basically the same reasons that people use when they prefer an iPhone rather than an Android phone. The funny thing is that the majority of the large companies see the value in AMD. But for some reason, gamers don't. If AMD really was as trash as people make them out to be, they wouldn't be the primary choice for consoles, super computers and even Apple products.

Or people that claim AMD is on par with their feature set outside "that useless ray-tracing feature". That's crassly oversimplifying things. On paper Radeon is always better perf/$. Once you start playing with them (year or two later) and shit don't work right (you don't play that mainstream benchmark), it might not feel that way anymore.
I find this argument quite amusing, considering nVidia generally does way better in mainstream benchmarks than AMD, and AMD products perform better and better over time.
 

thelastword

Banned
[OpenVR Benchmark screenshots (source: reddit)]
And this is what I was saying, this is Open VR where Nvidia traditionally beats the AMD cards, the fact an AMD card beats Nvidia's best card by 17% means it's a beast of a card and even faster in traditional games.....So 30% or more over the 2080Ti seems about right....in traditional games....

Having said this, I hope AMD also boosts its speed in OpenGL, they need to work on a driver that takes care of that once and for all....Leave no stone unturned....
 

VFXVeteran

Banned
Here you go, you believe the 50% Nvidia said no questions asked, no graphs, nothing, but a leak which shows a benchmark, not from AMD mind you, so you would assume bias, so no reason to lie, but you have doubts over such a card beating the 2080ti......I think I heard the narrative sometime ago that AMD would never have a card to beat Nvidia.....So this fightback is essentially the position I knew NV fans would take when AMD did.....

Have you ever used an AMD board or Nvidia board for graphics development? There is a reason why Nvidia has most of the market share in nearly every aspect of GPU computing. Turing is at least 50% faster than Pascal. Their boards are verifiable leaps every 2yrs. We are about to build a mini-GPU server for our simulations that will contain over 50 GPUs all NVLinked together for the massive RAM boost. That is just not available from AMD. I have NO reason to doubt Nvidia will come through on their claims, because they've proven it in the past.

I think you should ask yourself, what's in it for you? If RDNA 2.0 comes out and is indeed faster than the 2080Ti, how does that relate to a console? I've already stated my case for being biased towards Nvidia products. What are your intentions in cheering on AMD?
 

llien

Member
There is a reason why...
People bought slower, power hungry, more expensive Prescott over Athlon.
Or touched Fermi cards.
Or pathetic 950/960/1050/1050Ti GPUs dominating the market.

The "market share is indicative of product quality" is so obviously wrong, it's insane people still bring it up.
 
Uh... You're aware that Vulkan would not exist without AMD's Mantle, right?
As for the whole OpenGL emulation on CEMU... How is that AMD's fault? Isn't it the job of the developers to optimize it for AMD hardware?
Vulkan is not being developed by AMD; they simply gave away the remnants of the failed Mantle API that they didn't have the resources to push. They made that API in an attempt to grapple with their overhead issues, right. The ones they never fixed.

Do you know what that CPU overhead 'issue' actually entails? Let me give you a hint. Why does Crysis, even to this day, run like sh*t on even the latest hardware?
AMD did improve a lot on the wrongfully named CPU overhead issue with driver updates, but I guess either nobody knows or nobody really cares.
Yes, things are like that, but the why is still important.

Yes, games are often unoptimized. Yet they still perform better on one vendor, and that's who you go to if you want to play such games. What people care about is the actual gaming experience in the games they play. "Why" only matters as far as how companies can and should respond to it.


How can you say AMD 'follows suit' and use Chill and Boost as an example when nVidia never implemented those? Additionally, this really shows how skewed your perspective really is... I mean...

Who uses smaller nodes first? (In Before "They do it because they need to!")
Who had DX 10.1 features first?
Who came out with a unified shader architecture first?
Who first had tessellation support on their cards?
Who was the first to put sound through HDMI on their cards?
Who had DX12 support first?
Who had concurrent async compute first?

The issue with such "features" is that they are only means to the end goal, which is providing a faster, quieter, and visually better end result. Nobody "enjoys" async compute, or DX12 in itself. AMD has indeed often been the first to do something (more so during the ATI days, as you might have noticed), yet has not managed to leverage that to any substantial degree. It's worthless to be first when your first implementation sucks and your competition just struts in and shows you how it's done. The tech industry is full of companies that "did it first", yet failed to make a compelling product around said idea.

But seriously, AMD cards are rarely louder, slower and more power hungry at the same time, especially if taking AIB cards into account. And that's the exact function of Radeon Chill, which no one cares about because it's specifically AMD's. That reduces your power consumption significantly, if you really truly care about that. In reality, people don't care about it, except to trash AMD.

They don't care about Radeon Chill, because it's a minor power saving feature that doesn't really save anything at all when we're talking about peak gaming power. Nvidia GPUs clock down too when not maxed. Big deal. As for being louder, slower and more power hungry, I could point you to a million reviews where that's the case. I don't see why you feel the need to try to dismiss it. AMD has had to push their clocks and voltage beyond optimal for several generations now just to hit performance targets to compete. This is just the reality of the situation, and why things like "clock-to-clock" architecture differences matter so little when said architecture cannot hit the required clocks at any reasonable voltage level. This is another case of "AMD is equally efficient" on paper, but in any review you look at they're guzzling power, even on a superior node.
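To put the clocks-and-voltage point in concrete terms, here is a back-of-the-envelope sketch using the standard dynamic-power approximation (power scales roughly with voltage squared times frequency); the scaling numbers are invented for illustration, not measured from any real card:

```python
# Rough illustration of why chasing clocks past the efficient point costs so much power.
# Dynamic power roughly follows P ~ C * V^2 * f, and reaching a higher frequency
# generally requires a higher voltage as well. Numbers below are hypothetical.

def relative_power(freq_scale: float, volt_scale: float) -> float:
    """Power relative to baseline under the P ~ V^2 * f approximation."""
    return freq_scale * volt_scale ** 2

# Hypothetical: a 10% clock bump that also needs 8% more voltage
print(relative_power(1.10, 1.08))  # ~1.28 -> roughly 28% more power for 10% more clock
```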

I disagree that FreeSync would not exist without G-Sync. Maybe the timing of it changed, but it was pretty much inevitable. And if we start talking about variable refresh rate over HDMI, nVidia still does not have that. And that is one example that has dragged on for ages on nVidia's side. But I guess people only pay attention when it's AMD dragging their feet and missing certain features....

So what VRR does, say, an LG C9, probably the most popular gaming TV, currently support? Nvidia has kept G-Sync proprietary for obvious reasons, and AMD would have been no different had they been in the same position. FreeSync over HDMI is a proprietary standard, too. There's very little reason for Nvidia to add their own to the mix at this point. Most of the FreeSync over HDMI panels are pretty trash in terms of VRR range.


Are there issues? Yes. At least AMD is open about it in their driver releases: which issues are fixed and which are still pending. nVidia's issues are never published like this, but just go to the nVidia forums and you'll see how many issues people have. But somehow, that still is not seen as an issue among the gaming community. It's still true that nVidia has way more resources than AMD, which makes it even more baffling that AMD is slammed for what they offer, rather than praised, considering their limited resources.

They are slammed because they are a billion dollar company offering products in a competitive market, and fail to address their competition at the same level. Some people may know the reasons why they fail at that, but it's not our business to give pity points to companies that can't compete on the same level. It's laudable when AMD can do better as they do in CPUs right now, but if they don't, we have no obligation to praise them for coming sorta close with more limited resources, and even less to actually buy inferior products.


Maybe because you said "if this GPU comes some time in H2 and goes right against Nvidia's next gen, suddenly it'll just be another 5700XT at best".
Expressed in this way, it comes across as extremely belittling.
It says nothing about the 5700XT being a bad product, just that it won't beat its competition in performance and has to compete on value. Which the 5700XT absolutely does. Navi 20 in all likelihood won't be alone in being significantly faster than the 2080 Ti when it actually comes out; more likely it'll be a "3080" competitor or something like that, and won't wow anyone, even if the idea of a GPU significantly faster than the 2080 Ti might sound impressive right now.

Remember this?
Didn't think so. Because it only matters when it's AMD.

Of course it matters. It's a thing they fixed. AMD, I don't think, has ever even bothered acknowledging it. For Nvidia it was a bug, for AMD it's a feature.


Node "advantage"... The reason nVidia is practically always later on a node is because they prefer the matured process. Releasing products on a smaller nodes early is not really an advantage at all.
Having said that, even if you correct for node size, the 5700XT is still smaller compared to the 2070 Super. The architecture advantage of RDNA will be reduced but will not go away after nVidia's node shrink, if things remain similar. This might change with AMD's incorporation of hardware RT, but we'll see.
Additionally, AMD has reserved practically the whole of TSMC's 7nm wafers. nVidia would have to go to Samsung for their chips, and it's well-known that TSMC's 7nm tech, both DUV and EUV, are superior to Samsung's at this point where it matters most; yields.

We'll just have to see about this, won't we? But claiming that AMD using the 7nm node now provides them no advantage at all is rather lol-worthy. I guess it was dumb of AMD to do it then.


No it isn't. It shows how slow or fast an architecture is. It becomes arguably useless in end products if the clock speeds vary by a lot, like in the case of Gen 1 Ryzen (4GHz) and Intel's CPUs (5GHz). But for GPUs of AMD vs nVidia the clocks aren't really that different, so that makes it extremely relevant.
While Vega was about 15% slower than Pascal, RDNA is equal to Turing (technically it's 1% faster, but that's margin of error stuff). Put differently, RDNA is a 39% uplift in per clock performance compared to Polaris, and 28% over Vega. If you don't understand how huge this is, no one can help you. And this is the reason why some of us are quite optimistic about big Navi and RDNA2.
And if what they just said at CES is true, they should not be underestimated. They updated the Vega CUs and received a 59% uplift in performance. Going by the 28% figure above, that would now put the updated Vega at roughly 24% higher performance per clock than RDNA 1 (1.59 / 1.28 ≈ 1.24)... So.. Yeah...

Like I said above, there's more to an architecture's real-world performance than some artificial clock-to-clock comparison when other limitations of said architecture come into play in actual products. It's worthless for AMD to have theoretical clock-to-clock parity if they can't push their clocks high enough without massive power consumption. I've heard these stories about how AMD's architecture is on par in theory plenty of times, but none of us live in theory-land where GPUs only run at the clocks where AMD is equally efficient.


As already mentioned, the 5700 series is the go-to card for anything over $300. If someone buys nVidia, they were most likely going to buy nVidia anyway.
Vega 56 and 64 were pretty good deals last year, despite their age. The 5600 series is coming out now, and is the obvious choice for the $200 - $300 range. But I guess it will be considered 'late', because it's AMD.
Polaris, especially the RX 570, was still the king in the $100 - $200 segment, except no one cared about it because it's not an nVidia card. The nVidia cards released later in this segment were never considered 'late', despite Polaris dominating that range for years.

I remember there was a quote by some ATI guy back in the day (Carrell Killebrew maybe?) that was paraphrased something like "The best way to lose a fight is not to show up". AMD hasn't shown up for years now in several categories, and it shows in market share and mind share. Rehashing the same GPUs year over year is just embarrassing at this point.

AMD is doing what they need to be doing. They are following their own releases, rather than adapting to nVidia. I don't know if you noticed, but, after a long while, nVidia has released a huge list of cards to actually combat AMD's releases. When was the last time that happened? The momentum has shifted from AMD adapting to nVidia, to nVidia adapting to AMD. nVidia are faster for sure, but I see a change in trend. Most don't see it yet.
You're right in that they certainly aren't responding to Nvidia's launches beyond price adjustments of their outdated parts. They are slowly bringing a new gen on 7 nm which is something, but I would be careful of expecting that to amount to significant pressure on Nvidia. I guess compared to what they've done in the past few years anything looks significant though. Most don't see anything when there's nothing to look at. Some see things that simply aren't there and never will be.


People blindly buying nVidia cards is the equivalent of people blindly putting money into loot boxes. And when someone comes around and says that there are no loot boxes in this publisher's game, so buy this game instead, everyone starts saying that the loot boxes add to the gaming experience and that games with loot boxes are superior. And then they go on to say that it's the job of the publisher without the loot boxes to convince people to buy their game. Sounds like those people are already convinced, and that publisher is better off not bothering.

People don't just blindly buy Nvidia. They trust the brand because it's proven trustworthy, and recommended by more knowledgeable people. I don't know where you were going with this loot-box analogy but it makes no sense.

As for what AMD requires, nobody really knows. There was no real competition for the RX 570 for quite a while. No one bought it. Everyone went for the 1050 Ti instead.

When AMD releases a card that is faster and more efficient, people will start talking about drivers.
When AMD fixes the drivers, people will start talking about price.
When AMD lowers the price, people will start talking about ray tracing.
When AMD adds ray tracing, people will start talking about whatever else is required to still justify how nVidia is better.

These are all issues that obviously matter to people. And people have constantly mocked Nvidia for its ray-tracing feature being useless, so it's hyperbolic to claim that everyone is somehow unfairly exaggerating its value. At the same time it's a touted key feature in AMD's next gen for some reason. What AMD needs is to succeed in all these required parts, not just a single one. Of course the performance is the main feature, and architectural efficiency is the key to making that happen at a justifiable cost while making them profit.

Nvidia is in their current position because they could leverage their fastest halo products to the point that people in the lower segments simply believe the lower end is superior too. In reality AMD may provide more value there, but Nvidia doesn't even need to match that due to mindshare advantage and having the resources to push the necessary software support, a wide range of products and marketing for it. This stuff is business 101, it's hardly a secret that no one knows. AMD does well in CPUs now simply because they finally managed to make a competitive architecture that's easy to market. For GPUs it doesn't look all that promising right now. Nothing they have shown so far says they'll even catch up to Nvidia. Suddenly, when they're making a single high end GPU, Lisa Su thinks it's very important for AMD, yet years before AMD was dead silent on the matter, arguing for the importance of the volume segments. The reality is they had nothing competitive for the high end, and it was far more important to spend the limited resources on CPUs. A smart move, but also the reason why AMD has nothing of note against the 2080 Ti. I certainly don't blame Raja Koduri for the failures of Vega; he probably did what he could with what he was given.



They didn't have an updated architecture to go along with it.
GCN was that architecture. There's nothing special about Navi 2.0 compared to where GCN stood at the time the previous gen launched, besides the RT unit.


So that you can buy nVidia cheaper, right?
We buy what suits our needs best.

And for some reason the mainstream doesn't really use AMD cards. Why is that?

There are always reasons. But the majority of reasons given are either obviously biased or simply excuses. It's basically the same reasons that people use when they prefer an iPhone rather than an Android phone. The funny thing is that the majority of the large companies see the value in AMD. But for some reason, gamers don't. If AMD really was as trash as people make them out to be, they wouldn't be the primary choice for consoles, super computers and even Apple products.

This kind of thinking is just petty saltiness over other people valuing other things and dismissing it as "biases" or "excuses". Of course your reasons for selecting a product are the only valid and unbiased ones. What a joke. I explained above why and how mindshare works in the more casual audience that makes purchases driven by it, and it doesn't simply come out of some blind bias. It takes time to change brand recognition when it's been a certain way for a long time, but the way it's gotten there is never without a good reason. AMD is not some misunderstood company that's always offered better performance and price, and Nvidia has somehow unjustly acquired its customers by using some shady marketing means and everyone is just too dumb to realize it. It's a ludicrous fanboy mindset that's so full of itself it's not even funny.

I find this argument quite amusing, considering nVidia generally does way better in mainstream benchmarks than AMD, and AMD products perform better and better over time.
I don't quite follow how these two arguments relate in any way. Do AMD products perform better over time, or do they just suck when they launch? It's a matter of perspective. I'd rather take the performance now. If Nvidia performs better in mainstream benchmarks, and it performs even better outside them, yeah, not a lot of reasons to buy AMD, other than enjoying that FineWine benchmark that says ancient GCN card performance increased a few percentage points over Nvidia in just half a decade of driver improvements lol.
 

VFXVeteran

Banned
People bought slower, power hungry, more expensive Prescott over Athlon.
Or touched Fermi cards.
Or pathetic 950/960/1050/1050Ti GPUs dominating the market.

The "market share is indicative of product quality" is so obviously wrong, it's insane people still bring it up.


I feel like you guys are arguing just for the sake of arguing. Can you look at the facts? What do the facts tell you about right now this very second concerning AMD and Nvidia? Which card is the best performer regardless of price? Which card has the best SDK for developers? Which card has the best driver support? Which card runs games the fastest and with more features? Which card is the best choice for chained VRAM GPU servers? Which card implements RT?

Let's try to avoid the imaginary argument that puts AMD in a good light and focus on the facts. Right now, AMD is just not even a consideration for people that ask those questions.
 
I feel like you guys are arguing just for the sake of arguing. Can you look at the facts? What do the facts tell you about right now this very second concerning AMD and Nvidia? Which card is the best performer regardless of price? Which card has the best SDK for developers? Which card has the best driver support? Which card runs games the fastest and with more features? Which card is the best choice for chained VRAM GPU servers? Which card implements RT?

Let's try to avoid the imaginary argument that puts AMD in a good light and focus on the facts. Right now, AMD is just not even a consideration for people that ask those questions.

Driver performance gains are a huge advantage for Nvidia. The resources they pour into maximising game performance pay off very well. The downside is that when your card is no longer getting those optimisations, its performance suffers relative to AMD cards of the same era. AMD cards typically age better (or have a worse start) IME.

On console this advantage doesn't really exist so I think AMD is relatively better off in that market. Would be great to see AMD get to a point where they could support their own cards to the degree Nvidia does.
 

llien

Member
Can you look at the facts?
Which part of "people bought slower, power hungry, more expensive Prescott over Athlon" is not a fact?
Or are you failing to comprehend the conclusion stemming from there: "market share is indicative of product quality" is obviously wrong

Which card has the best driver support?
It is a FUCKING 2020 TODAY, WHAT THE FUCK AM I READING?!?!?!

Which card is the best performer regardless of price?
So, it's ok to buy inferior low-end cards (one product), because there is another niche product? What kind of "logic" is that?
How the hell can that "logic" come from someone who works in IT?
 

VFXVeteran

Banned
Which part of "people bought slower, power hungry, more expensive Prescott over Athlon" is not a fact?
Or are you failing to comprehend the conclusion stemming from there: "market share is indicative of product quality" is obviously wrong


It is a FUCKING 2020 TODAY, WHAT THE FUCK AM I READING?!?!?!


So, it's ok to buy inferior low-end cards (one product), because there is another niche product? What kind of "logic" is that?
How the hell can that "logic" come from someone who works in IT?

I don't know what barked up your tree but you really need to tone it down.
 

Ascend

Member
I feel like you guys are arguing just for the sake of arguing. Can you look at the facts? What do the facts tell you about right now this very second concerning AMD and Nvidia? Which card is the best performer regardless of price?
Why does only the fastest card matter?

Which card has the best SDK for developers?
I wouldn't know. Can you provide a source that says nVidia is better? Because I cannot find any recent ones.

Which card has the best driver support?
Firstly;
Full report for you to read at your own discretion;

Now, let's take a look at how often AMD releases WHQL drivers, for the sake of clarity... If I choose all of them it's going to be too many, so WHQL only...
AMD (https://www.techpowerup.com/download/amd-radeon-graphics-drivers/) :
AMD Radeon Software Adrenalin 19.12.2 WHQL
December 12th, 2019 - What's New

AMD Radeon Software Adrenalin 2019 19.10.1 WHQL
October 17th, 2019 - What's New

AMD Radeon Software Adrenalin 2019 19.9.2 WHQL
September 24th, 2019 - What's New

AMD Radeon Software Adrenalin 2019 19.8.1 WHQL
August 22nd, 2019 - What's New

AMD Radeon Software Adrenalin 2019 19.5.2 WHQL
June 3rd, 2019 - What's New

AMD Radeon Software Adrenalin 2019 19.4.1 WHQL
April 3rd, 2019 - What's New

That's a new WHQL driver every month or every two months. Sure. That's less than nVidia. nVidia's WHQL driver releases vary between once every two months and three a month. But let me ask you a question. Do you like reinstalling drivers a lot? I for one do not like to keep reinstalling drivers every time. Most of the time I install a driver every couple of months, if that. AMD's WHQL releases are more than sufficient, and are on time for pretty much all new games nowadays.
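Just to sanity-check that cadence from the dates listed above, here is a quick sketch; the release dates are the ones quoted from the techpowerup listing, the arithmetic is mine:

```python
from datetime import date

# WHQL release dates quoted above, oldest first
releases = [
    date(2019, 4, 3),    # 19.4.1
    date(2019, 6, 3),    # 19.5.2
    date(2019, 8, 22),   # 19.8.1
    date(2019, 9, 24),   # 19.9.2
    date(2019, 10, 17),  # 19.10.1
    date(2019, 12, 12),  # 19.12.2
]

gaps = [(b - a).days for a, b in zip(releases, releases[1:])]
print(gaps)                   # [61, 80, 33, 23, 56]
print(sum(gaps) / len(gaps))  # ~50.6 days, i.e. a new WHQL driver roughly every seven weeks
```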
Not that WHQL really means anything. WHQL is more a label than anything else. It cannot be some sort of safety net or quality assurance when nVidia's WHQL drivers have killed GPUs multiple times.

Now let's take a look at how quickly drivers are available on game releases. Let's take one of the more well-known games... COD Modern Warfare was released on October 25th.
When were the drivers released?
AMD:
AMD Radeon Adrenalin 2019 Edition Graphics Driver 19.10.2 Hotfix (Release date: October 25, 2019)
Support For:
Call of Duty: Modern Warfare
With ultra presets on the Radeon RX 5700 XT, achieve up to 18% better performance playing Call of Duty: Modern Warfare with Radeon Software Adrenalin 2019 edition 19.10.2 than with Radeon Software Adrenalin 2019 Edition 19.10.1. RS-322

nVidia:
GeForce Game Ready 440.97 WHQL drivers (Release date: October 22, 2019)
Our latest GeForce Game Ready driver delivers day-one support for Call of Duty: Modern Warfare and The Outer Worlds.

Yes. nVidia is technically earlier. But does that really matter in this case? It's not as if the AMD driver is late. Additionally, AMD specifically mentions what improvement to expect compared to the prior driver. No such thing from nVidia. How do we know that something actually changed?
Also a fun note... Take a look at the comments on nVidia's page regarding this driver...
Just to quote a few;
"The 440.97 and the 440.52 beta driver have messed up my blacks in HDR on my c9."

"My geforce experience won't even open giving me an error 0x0003 I need help. "

"Why have we not recieved a response as to why MW isn't showing up under our games in Geforce Experience? "

And look how many people say "same problem" on that last one... So much for "more reliable drivers"...


So we just proved that:
  • AMD has more stable drivers
  • AMD has frequent driver releases
  • AMD has timely driver support for games
  • AMD has superior driver release notes.

What other excuse do you have for the driver support argument?

Which card runs games the fastest and with more features?
What price range are we talking about? If it's the absolute fastest, how's your Titan RTX doing? You're not gonna tell me you don't have one, right? RIGHT?

Which card is the best choice for chained VRAM GPU servers?
Say hi to Google Stadia

Which card implements RT?
Say hi to XSX and PS5.

Let's try to avoid the imaginary argument that puts AMD in a good light and focus on the facts.
I'm sure you'll really love the driver facts provided above.

Right now, AMD is just not even a consideration for people that ask those questions.
Really? Let's see which of your questions are actually really relevant...

Which card is the best performer regardless of price?
Asking the question which card is the fastest regardless of price is a useless question if you're unable/unwilling to pay that anyway.

Which card has the best SDK for developers?
The best SDK for developers question is useless once again, unless you're a developer yourself. So this question is irrelevant for the majority of the gaming market.

Which card has the best driver support?
Driver support we already touched upon enough.

Which card runs games the fastest[?]
Why does a card need to run games the fastest? No one is going to notice 180 vs 160 fps. The real question to ask is which performance you actually need, which products satisfy your criteria and how much they cost.

[Which card runs games] with more features?
Why is the amount of features relevant? The features you're actually going to use are a much better metric, and that is what you should be asking. To me, variable refresh rate over HDMI is a lot more important than RT at this point, for example.

Which card is the best choice for chained VRAM GPU servers?
Asking which card is the best for chained GPU servers is relevant for who? Google? We all know what they chose. Completely irrelevant question.

Which card implements RT?
It's quite funny how running games the fastest and RT are both somehow mandatory questions, even though they are currently pretty much mutually exclusive. In other words, you can't expect to both run games the fastest and run RT. It's one or the other. But in any case... The real question is whether it's worth paying for RT right now or not. I recommend going for the approach described above on your "running games the fastest" question, which again is asking which performance you actually need, which products satisfy your criteria and how much they cost, and then making the choice. If it happens to have RT, so be it. If it doesn't, it really is no big deal.
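As a throwaway illustration of that approach (decide the performance and features you actually need, keep only the cards that meet them, then pick on price), here is a tiny sketch; every card name, price and fps figure in it is hypothetical:

```python
# Filter by required performance and must-have features, then take the cheapest.
# All data below is made up for illustration.

cards = [
    {"name": "card_A", "price": 400, "fps_at_my_settings": 90, "hdmi_vrr": True},
    {"name": "card_B", "price": 500, "fps_at_my_settings": 110, "hdmi_vrr": False},
    {"name": "card_C", "price": 330, "fps_at_my_settings": 70, "hdmi_vrr": True},
]

def pick(cards, min_fps, must_have=()):
    candidates = [c for c in cards
                  if c["fps_at_my_settings"] >= min_fps
                  and all(c[feature] for feature in must_have)]
    return min(candidates, key=lambda c: c["price"], default=None)

# e.g. "I need 80+ fps, and VRR over HDMI matters more to me than RT right now"
print(pick(cards, min_fps=80, must_have=("hdmi_vrr",)))  # -> card_A
```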

None of those questions help someone pick out the best card for their personal use. Your questions are all ego-boosting questions. Not exactly unexpected, considering that a large portion of nVidia buyers do it for bragging rights rather than actually using it for gaming.
 

Dontero

Banned
The AMD PS5 leak points to a dual GPU setting, if anyone still believes that rumour.

Ever heard of MisterXmedia? There won't be any dual GPU solutions. There will only be an SoC, aka one chip with integrated everything. The days of discrete parts are gone simply because it is much more expensive to create such a console.
 

VFXVeteran

Banned
Why does only the fastest card matter?

... <snip>

I don't feel like having this argument. I've worked at companies that only use Nvidia products. I know several developers at gaming companies that use Nvidia GPUs on their PC platforms for initial game design (including 1st party ones owned by Sony) before porting to consoles (both current and next-gen). I don't need to look up articles online about the merits or lack thereof for the two graphics cards. I've compared their software, drivers, built-in APIs, performance, (in some form or another) etc.. using them in both industries (film and games) at the professional level along with several other people.

If you really want to answer those questions yourself, buy both graphics cards, pick up a graphics programming book, install the SDKs and start learning how to use them. Then pick up one of the main rendering engines like Unreal and/or use an off-line animation application like Maya/3DS Max/Houdini using Arnold/Octane/Redshift - all of which use CUDA for path-tracing. Run your tests, program your tools using OpenCL (for AMD) and CUDA (for Nvidia), run your 3D programs in Linux and Windows, etc. then come back and talk about your experiences.
 

Ascend

Member
I don't feel like having this argument. I've worked at companies that only use Nvidia products. I know several developers at gaming companies that use Nvidia GPUs on their PC platforms for initial game design (including 1st party ones owned by Sony) before porting to consoles (both current and next-gen). I don't need to look up articles online about the merits or lack thereof for the two graphics cards. I've compared their software, drivers, built-in APIs, performance, (in some form or another) etc.. using them in both industries (film and games) at the professional level along with several other people.

If you really want to answer those questions yourself, buy both graphics cards, pick up a graphics programming book, install the SDKs and start learning how to use them. Then pick up one of the main rendering engines like Unreal and/or use an off-line animation application like Maya/3DS Max/Houdini using Arnold/Octane/Redshift - all of which use CUDA for path-tracing. Run your tests, program your tools using OpenCL (for AMD) and CUDA (for Nvidia), run your 3D programs in Linux and Windows, etc. then come back and talk about your experiences.
There are companies that only use AMD as well, so, what's your point?
It is a good idea, though, to keep things separate. I was obviously arguing from the perspective of buying a card at retail as a gamer. All those questions you posed are pretty much irrelevant for that.
If you want to argue from the developer perspective, that's fine. But I doubt there are many here that find that useful, and using those arguments to convince gamers to buy nVidia is quite nasty and deceitful.
 

pawel86ck

Banned


Can't wait to hear more details. It looks like AMD will finally beat the 2080 Ti, and maybe even compete with the high-end Ampere series.
 
There are companies that only use AMD as well, so, what's your point?
It is a good idea, though, to keep things separate. I was obviously arguing from the perspective of buying a card at retail as a gamer. All those questions you posed are pretty much irrelevant for that.
If you want to argue from the developer perspective, that's fine. But I doubt there are many here that find that useful, and using those arguments to convince gamers to buy nVidia is quite nasty and deceitful.
His arguments seem more than fair to me. For the better part of the last ~12 years(?) Nvidia has been the top dog for a reason. I find it strange that people are acting like AMD has always been on the top for whatever reason because they have finally put out some good CPUs. Their GPUs have always played second fiddle to Nvidia. I'm glad to see AMD putting pressure on NV, but it's like with this leak people are forgetting that the 2080ti is over 1.5 years old. Which is ancient by technology standards. So for them to have a card that can beat the top-of-the-line 1.5-year-old GPU doesn't nearly seem as exciting to me as to maybe some of the other posters on here. If true, it's great news for everybody overall, but it's not like NV will be sitting on their laurels and not have something in the pipeline to replace their lineup this year. AMD's biggest problem of the last 2(3?) cycles of GPUs is that they compete, sure, but they compete only after coming out 7+ months after the cards they are competing with. They are constantly late to the party.

I'll say that I DO hold some excitement though that AMD has seemingly turned their development issues around and are finally competing on timelines that make NV have to work harder, as NV has become pretty brazen in their pricing structure since there hasn't been ANY competition for them to have to worry about in any significant manner. Overall it's a win for everyone that AMD does well, but it's not like they've dethroned NV on any GPU-related releases yet. I look forward to seeing how much this seemingly new GPU war will benefit the consumer on average overall. I'm in the market for a new GPU as my even more ancient 1080ti still plays everything these days super well and I want a good reason to get a new GPU, which RTX just simply didn't provide.
 
Have you ever used an AMD board or Nvidia board for graphics development? There is a reason why Nvidia has most of the market share in nearly every aspect of GPU computing. Turing is at least 50% faster than Pascal. Their boards are verifiable leaps every 2yrs. We are about to build a mini-GPU server for our simulations that will contain over 50 GPUs all NVLinked together for the massive RAM boost. That is just not available from AMD. I have NO reason to doubt Nvidia will come through on their claims, because they've proven it in the past.

I think you should ask yourself, what's in it for you? If RDNA 2.0 comes out and is indeed faster than the 2080Ti, how does that relate to a console? I've already stated my case for being biased towards Nvidia products. What are your intentions in cheering on AMD?

They're single handedly keeping me a pc gamer by having shit that is semi-affordable? I literally laugh at nvidia's entire product stack. They're either raped at the low end, or they're raping ME at the mid to high. Nah, I'm good bruv.

Anyone who pays over £400 for a mid range GPU is a cunt and has ruined this hobby.

I hope AMD just leaves the high-end sector as it is. Some people want AMD to "compete" only to have cheaper green cards...no interest to buy red at all. Some "fine" examples around here already.


Let nvidia gouge the fools even more with higher prices.


They won't leave, they'll just up their own prices to hit margin and slot in neatly between nvidia products. We're already seeing it... Not suspicious at all.

We'll reach the point that CPU's were at for nearly a decade. Overpriced, incremental bullshit upgrades, people moving to the used market and consoles or just not upgrading at all.
 
They're either raped at the low end, or they're raping ME at the mid to high. Nah, I'm good bruv.

Anyone who pays over £400 for a mid range GPU is a cunt and has ruined this hobby.

Overpriced, incremental bullshit upgrades, people moving to the used market and consoles or just not upgrading at all.
You're not wrong, but the reason NV has been massively over-inflating their cards' prices is because they haven't had ANY competition for a very long time. NV's "high end" 3 years ago was ~$800; the same "high end" 1.5 years ago was $1200. It's only recently that they've been forced to compete, which is evident with their 'Super' series having a $50 price reduction pretty much across the board with the release of AMD's 5700. $1200 for NV's high end was an obvious money grab and really felt like a slap in the face, considering the performance delta wasn't raised nearly as much as the massive price hike would suggest. People that are against AMD doing well are working against their best interest, as AMD doing well only creates a healthy marketplace. I want to upgrade my GPU as it's starting to lose the performance on newer releases that I loved having, but there hasn't been a compelling reason to in the last 2 years. I'm hoping that changes here soon.

WHATEVER GODDAMN CARD GIVES ME THE BEST PERFORMANCE FOR CYBERPUNK 2077 maxed out is what I want.

These mufucka's had better fight for my money:

 

Ascend

Member


Can't wait to hear more details. It looks like AMD will finally beat the 2080 Ti, and maybe even compete with the high-end Ampere series.

Looking good... Hopefully it's as good as it sounds. I already know I'll be getting an RDNA2 card. My R9 Fury, while still doing the job, is getting close to needing an upgrade, particularly due to the amount of VRAM.

His arguments seem more than fair to me. For the better part of the last ~12 years(?) Nvidia has been the top dog for a reason. I find it strange that people are acting like AMD has always been on the top for whatever reason because they have finally put out some good CPUs. Their GPUs have always played second fiddle to Nvidia. I'm glad to see AMD putting pressure on NV, but it's like with this leak people are forgetting that the 2080ti is over 1.5 years old. Which is ancient by technology standards. So for them to have a card that can beat the top-of-the-line 1.5-year-old GPU doesn't nearly seem as exciting to me as to maybe some of the other posters on here. If true, it's great news for everybody overall, but it's not like NV will be sitting on their laurels and not have something in the pipeline to replace their lineup this year. AMD's biggest problem of the last 2(3?) cycles of GPUs is that they compete, sure, but they compete only after coming out 7+ months after the cards they are competing with. They are constantly late to the party.
Was the R9 290X late?

I'll say that I DO hold some excitement though that AMD has seemingly turned their development issues around and are finally competing on timelines that make NV have to work harder, as NV has become pretty brazen in their pricing structure since there hasn't been ANY competition for them to have to worry about in any significant manner. Overall it's a win for everyone that AMD does well, but it's not like they've dethroned NV on any GPU-related releases yet. I look forward to seeing how much this seemingly new GPU war will benefit the consumer on average overall. I'm in the market for a new GPU as my even more ancient 1080ti still plays everything these days super well and I want a good reason to get a new GPU, which RTX just simply didn't provide.
Fair enough.
 
Was the R9 290X late?

My memory is a bit hazy and I'm pretty sure I had a 780 at the time, but yeah, I'd concede to your point, they weren't really late then. Looking up old comparisons, it seems like the 290X was competing with the 780 Ti at the time, and the 290X did mostly beat that card; coming with a higher clock speed and more memory at a lower price made it the better buy. I guess there were a few outlier moments where AMD did compete, but then they'd be blown away very soon after, if memory serves me correctly. Hell, looking back I can't believe I went from the GTX 680 to the 780. What a waste of money.

I can say that I personally have nothing against AMD; the last build I did for my wife I used AMD hardware, as performance/price is much better than what Intel was offering. I've not bought an AMD GPU in a very very long time, as when they finally did get around to competing I would already have an NV card in my system, and whenever I would come around on an upgrade cycle they'd be behind whatever NV was currently offering. Maybe I just kinda fell into the cracks where my upgrade cycles never matched AMD's releases, but it always seemed to me that their 'flagships' always felt late to the party.
 

Xyphie

Member
GTX Titan came out like 6 months before the 290X at $1000. When the 290X came out at $550, nVidia instantly dropped the price of the GTX 780 from $650 to $500 and released the 780 Ti at $700, which was ~10% faster than the 290X.
 

Ascend

Member
My memory is a bit hazy and I'm pretty sure I had a 780 at the time, but yeah, I'd concede to your point, they weren't really late then. Looking up old comparisons, it seems like the 290X was competing with the 780 Ti at the time, and the 290X did mostly beat that card; coming with a higher clock speed and more memory at a lower price made it the better buy. I guess there were a few outlier moments where AMD did compete, but then they'd be blown away very soon after, if memory serves me correctly. Hell, looking back I can't believe I went from the GTX 680 to the 780. What a waste of money.

I can say that I personally have nothing against AMD; the last build I did for my wife I used AMD hardware, as performance/price is much better than what Intel was offering. I've not bought an AMD GPU in a very very long time, as when they finally did get around to competing I would already have an NV card in my system, and whenever I would come around on an upgrade cycle they'd be behind whatever NV was currently offering. Maybe I just kinda fell into the cracks where my upgrade cycles never matched AMD's releases, but it always seemed to me that their 'flagships' always felt late to the party.
You're right that the general trend goes this way at this point. It wasn't like that in the past though.

But I'm glad you acknowledged that the upgrade from the GTX 680 to 780 was a waste of money... Even nowadays a lot of people are doing exactly that with nVidia cards, which is why nVidia has deep pockets that AMD simply cannot compete with. Most AMD users don't have a yearly upgrade cycle mentality, while many nVidia users do. If not, it's two years for many.

I for one buy a single GPU every four years at most. And I haven't bought nVidia since I was screwed over with their GeForce4 MX440.
 