
NVIDIA/AMD (Ampere v RDNA2) war games - what the hell is going on?

supernova8

Banned
So we've had Ampere revealed, and we roughly know what to expect.

RTX 3090 (from $1499) - complete overkill SKU.
RTX 3080 (from $699) - NVIDIA claims 2x the performance of the 2080 (not the 2080 Ti; thanks to those who corrected me). Looks like an over-promise, but a 70-80% gain is believable from the numbers.
RTX 3070 (from $499) - NVIDIA claims better than the 2080 Ti, probably as long as you don't go above 1440p.

On the AMD/RDNA2/Big Navi side we still don't really know.

There's something of a polarisation among tech YouTubers/leakers, with some people thinking Ampere has killed AMD's chances and others thinking AMD is quietly waiting to pounce.

On the understanding that, at this moment in time, we really have no idea, let's think about what could happen in a few months' time. I'll go first.

I think AMD will release a 16GB GPU that destroys the 3070 but doesn't quite beat the 3080. I think they'll release it for $499.
It would be crazy if they also released a 24GB or 32GB (is that even possible?) variant for around $899.


It's perfectly within the realms of possibility that NVIDIA then releases a 3070 Super/Ti later on with 16GB and higher clocks (it's not quite as power hungry as the 3080/90).

Question for everyone:

Why do we think NVIDIA has been so aggressive on pricing?
 
Last edited:

MH3M3D

Member
I'm disappointed with the 3000 cards' low VRAM and high power usage. I think AMD will take advantage of that.
They will probably beat the 3070 and have another card that competes with the 3080. Nvidia will respond with Ti versions and AMD will then lower its prices.
The real winners will be us gamers.
 
Last edited:
RTX 3080 (from $699) - NVIDIA claims 2x performance over 2080 Ti. Looks like an over-promise but 70-80% gain is believable from the numbers.

Actually they claim "up to 2 times" performance of a 2080, not a 2080ti.

In real world scenarios the performance obviously will not match 2x a 2080. As with any early claim or marketing slide, no matter who it is from always take it with a grain of salt until we have real benchmarks.

Regarding the 3080 vs 2080 comparison for Doom Eternal at 4K Nightmare settings that Nvidia showed: those settings rely on a large amount of VRAM to not be bottlenecked. The 2080 only has 8GB of VRAM, which means it does not perform as well at 4K in this title as even the 1080 Ti, which has 11GB. Yes, you heard that right: the 1080 Ti beats out the 2080 at those settings in Doom by around 10+ fps, despite the 2080 being the more powerful card, thanks to the extra VRAM.

This is why Nvidia chose to compare this game with these settings for the 2080 (8GB) vs 3080 (10GB) face off knowing it would make the 3080 look far better than the 2080. People would then extrapolate that out as a baseline performance increase for all titles vs a 2080 and assume a massive power jump. Great marketing certainly, and the 3080 is definitely a nice jump over the 2080 but had they chosen a 2080ti instead (which has more VRAM) for the comparison the results would have been a lot closer. 2080ti averages around 140fps in Doom Eternal with those settings I believe.
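To put a rough number on that VRAM point, here is a back-of-the-envelope sketch of how quickly 4K render targets plus a large texture pool crowd an 8GB card. The buffer list, formats, pool size and overhead figures below are illustrative assumptions, not id Tech 7's actual layout, so treat it as a sketch of the mechanism rather than real measurements.

```python
# Rough, illustrative estimate of 4K render-target memory use.
# Buffer list/formats and the pool/overhead figures are assumptions,
# NOT the engine's actual layout.
WIDTH, HEIGHT = 3840, 2160

def rt_bytes(bytes_per_pixel):
    """Memory for one full-resolution render target, in bytes."""
    return WIDTH * HEIGHT * bytes_per_pixel

buffers = {
    "HDR color (RGBA16F)": rt_bytes(8),
    "Depth/stencil (D32S8)": rt_bytes(5),
    "Normals (RGBA8)": rt_bytes(4),
    "Albedo (RGBA8)": rt_bytes(4),
    "Motion vectors (RG16F)": rt_bytes(4),
    "TAA/post history (RGBA16F)": rt_bytes(8),
}

render_targets_gb = sum(buffers.values()) / 1024 ** 3
texture_pool_gb = 4.5   # placeholder for a high "texture pool size" setting
other_gb = 1.5          # geometry, shaders, driver overhead (guess)

total_gb = render_targets_gb + texture_pool_gb + other_gb
print(f"Render targets alone: {render_targets_gb:.2f} GB")
print(f"Estimated total: {total_gb:.2f} GB against the 2080's 8 GB")
```

Even with conservative placeholder numbers the total gets close to an 8GB budget; bump the texture pool one notch and the card starts spilling, which is the kind of bottleneck being described for the 2080 at those settings.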

Anyway, sorry to sidetrack the thread on this, but I think it is important to set the record straight on any incorrect information and also to contextualize the only "real world" scenario/comparison we have seen with the 3080. I'm not saying it is not a powerful card; I think the 3000 series seems great so far (minus the low VRAM), but I'm trying to bring people back down to earth.

As for your question: "Why do we think NVIDIA has been so aggressive on pricing?"

Simple, they likely know that AMD is cooking up something reasonably competitive with RDNA2, something that can likely compete at the 3080 (flagship high-end) level. They saw how AMD got their shit together and challenged them at the mid range with RDNA1 (the 5700XT etc.), and they don't want to get Intel'd, essentially. This is definitely good for gamers; even the most hardcore green-blooded Nvidia fanboys can't see a problem with this, as (at least for now, perceived) competition from AMD has caused Nvidia to aggressively drop prices. If the Radeon group were still a shitshow, then the 3000 series would likely look/be priced a bit differently from what we see today.
 

llien

Member
RTX 3080 (from $699) - NVIDIA claims 2x performance over 2080 Ti.
Nvidia.
Claimed.
2x performance. (up to, lol, and even shills like DF figure 1.6-1.9 so far)
Over 2080.

As for your question: "Why do we think NVIDIA has been so aggressive on pricing?"

Simple, they likely know that AMD is cooking up something reasonably competitive with RDNA2, something that can likely compete at the 3080 (flagship high end) level.
Most likely case, imo.
 
Last edited:

nochance

Banned
The pricing seems aggressive in comparison to Turing, but it is in line with Pascal.
Nvidia is doing the same thing they always did; Turing was the exception, and AMD hasn't really been able to compete for the last 5-8 years.
 

llien

Member
because we will see a 3080ti and the 3090 is just a ridiculously high margin performance showcase for the 0.001% that are dumb enough to fall for it.

How does "let me make it $500 more expensive" align with "people weren't buying" please?

It does not. "AMD won't threaten that product" is what explains it. Or "I don't really want to sell those, it's just for claiming the halo product".
 
How does "let me make it $500 more expensive" align with "people weren't buying" please?
The 3090 is the new Titan...
That thing is completely and utterly irrelevant for their sales numbers. It's just "performance-crown" image management.

The 3070, and the inevitable 3050/3060, as well as the respective Ti versions, are where they make the sales numbers.
As usual I might add.

It might surprise you, but most people would never consider spending $700+ on a GPU.
 
Last edited:

pawel86ck

Banned
Actually they claim "up to 2 times" performance of a 2080, not a 2080ti.

In real world scenarios the performance obviously will not match 2x a 2080. As with any early claim or marketing slide, no matter who it is from always take it with a grain of salt until we have real benchmarks.

Regarding the 3080 vs 2080 comparison for Doom Eternal at 4K Nightmare settings that Nvidia showed: those settings rely on a large amount of VRAM to not be bottlenecked. The 2080 only has 8GB of VRAM, which means it does not perform as well at 4K in this title as even the 1080 Ti, which has 11GB. Yes, you heard that right: the 1080 Ti beats out the 2080 at those settings in Doom by around 10+ fps, despite the 2080 being the more powerful card, thanks to the extra VRAM.

This is why Nvidia chose to compare this game with these settings for the 2080 (8GB) vs 3080 (10GB) face off knowing it would make the 3080 look far better than the 2080. People would then extrapolate that out as a baseline performance increase for all titles vs a 2080 and assume a massive power jump. Great marketing certainly, and the 3080 is definitely a nice jump over the 2080 but had they chosen a 2080ti instead (which has more VRAM) for the comparison the results would have been a lot closer. 2080ti averages around 140fps in Doom Eternal with those settings I believe.

Anyway, sorry to sidetrack the thread on this, but I think it is important to set the record straight on any incorrect information and also to contextualize the only "real world" scenario/comparison we have seen with the 3080. I'm not saying it is not a powerful card; I think the 3000 series seems great so far (minus the low VRAM), but I'm trying to bring people back down to earth.

As for your question: "Why do we think NVIDIA has been so aggressive on pricing?"

Simple, they likely know that AMD is cooking up something reasonably competitive with RDNA2, something that can likely compete at the 3080 (flagship high-end) level. They saw how AMD got their shit together and challenged them at the mid range with RDNA1 (the 5700XT etc.), and they don't want to get Intel'd, essentially. This is definitely good for gamers; even the most hardcore green-blooded Nvidia fanboys can't see a problem with this, as (at least for now, perceived) competition from AMD has caused Nvidia to aggressively drop prices. If the Radeon group were still a shitshow, then the 3000 series would likely look/be priced a bit differently from what we see today.
There is a comparison with the 2080 Ti in Doom Eternal. The RTX 3080 is 50% (up to 65%) faster, and the 2080 Ti has 1GB more VRAM. It looks like the RTX 3080 will indeed be around 80% faster than a standard RTX 2080, and probably more with RT.

You say the RDNA2 GPU will be competitive, but Ampere will already be much cheaper than Turing, and even if RDNA2 matches Ampere's raster performance, Nvidia will still have the upper hand (DLSS, better compatibility with existing RT games).
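As a quick sanity check on how those percentages chain together, here is the arithmetic. The uplift figures are the marketing claims from this thread, and the 2080 Ti's lead over a plain 2080 is my own assumption (roughly 20-30% at 4K), so the output is only as reliable as those inputs.

```python
# Chain the claimed 3080-vs-2080 Ti uplift with an assumed Ti-vs-2080 gap.
# The uplifts are marketing claims from this thread, not benchmarks;
# the Ti gap is an assumption (roughly 20-30% at 4K).
claimed_uplift_over_ti = (0.50, 0.65)
assumed_ti_gap_over_2080 = (0.20, 0.30)

for gap in assumed_ti_gap_over_2080:
    for uplift in claimed_uplift_over_ti:
        over_2080 = (1 + uplift) * (1 + gap) - 1
        print(f"Ti lead {gap:.0%}, uplift over Ti {uplift:.0%} "
              f"-> roughly {over_2080:.0%} over a vanilla 2080")
```

That lands in roughly the 80-115% range, which brackets the "around 80% faster" reading and only reaches the "up to 2x" claim at the most optimistic end.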
 
People need to stop with this "NVIDIA is aggressive with prices because they know AMD is cooking up something good".

That reasoning is stupid; people made the same argument regarding the 980 Ti and 1080 Ti, and AMD ended up having no answer.

They might or might not have something great in the pipeline; NVIDIA being aggressive with prices isn't an indication of that at all.
 
There is a comparison with the 2080 Ti in Doom Eternal. The RTX 3080 is 50% (up to 65%) faster, and the 2080 Ti has 1GB more VRAM.

Yes it seems Nvidia have recently released a new comparison video with the 2080ti for Doom Eternal. Granted we don't know what settings/drivers etc... they are using and as with anything we will need to wait for independent benchmarks to see the real performance/make an informed comparison. But that looks like a nice uplift for Doom.

It looks like the RTX 3080 will indeed be around 80% faster than a standard RTX 2080, and probably more with RT.

I think 60-70% might be a more realistic figure? Either way an impressive uplift, but again we will need to wait for real benchmarks and comparisons to see how it plays out across a number of titles. Of course with RT on I expect a bigger uplift than pure rasterization as the RT gains for Ampere seem great compared to Turing.

You say the RDNA2 GPU will be competitive, but Ampere will already be much cheaper than Turing, and even if RDNA2 matches Ampere's raster performance, Nvidia will still have the upper hand (DLSS, better compatibility with existing RT games).

When I say competitive I mean roughly equal or higher rasterization performance, you know, the actual "power" of a card that we have always compared. No matter what anyone says, this is still, and will always remain, the most important metric of a card's performance. At the moment there are thousands of games on Steam, for example, that don't use either RT or DLSS, and there will continue to be newer releases that don't support either.

I find it interesting that for Nvidia fans, or just people swept up in the current 3000 series hype train the narrative seems to continue shifting as new rumours come out about what RDNA2 might bring.

First the narrative was "Lol AMD would be lucky to compare to a 3070 with their highest card! Expect mid range performance from AMD as usual!". Then as rumours come out stating that RDNA2 might actually be competitive or might even exceed 3070/3080 in power, then suddenly the narrative shifts to "Well even if they do match/exceed in rasterization, that is not important anymore! Because it is all about RT and DLSS!!".

RDNA2 cards will have Ray Tracing. How performant this will be is anyone's guess. Current rumours seem to suggest better than Turing, but probably not as good as Ampere. So let's assume for now that a best case scenario for RDNA2 would be somewhere between 60-80% as performant as Ampere at RT. That is still pretty solid, especially for a first generation RT product. And dare I say it reasonably competitive? Now maybe it could also be much worse than that figure? Hard to say until we have any actual benchmarks.

As for DLSS, it seems like a fantastic technology. Nvidia's R&D teams have come up with something really great here and I really like to see them continue to advance it. Is it the be all and end all of GPU/Graphics technology? Of course not. Is it still great for games that support it? Yeah it seems like a great feature.

Does it look "better than native 4K"? No of course not. Listen, a lot of people need to realize this: Fake is never better than Real.

DLSS is not perfect, introduces some artifacts and obviously is not as good as native res. Having said that is it "close enough" in a lot of cases? It seems like it might be. I say "Might be" because I have not used it myself so can't comment from first hand experience but most reports have it performing quite well for a small reduction in image quality with a few scaling artifacts here and there.

Now here is the rub, there are only around 10 games that support DLSS right now. Out of the thousands on PC. Most of those recent and older games will never support DLSS. Even a lot of newer games probably won't. The reason is that so far it has to be implemented on a game by game basis, rather than something that can be applied to all games. So trying to use DLSS as some kind of trump card when it is barely supported as of now is a bit silly.

People also seem not to understand that this is not something you can just turn on automatically and have it work for all games. So when people try to downplay actual power (rasterization) and instead hype up DLSS as more important, when it has only a tiny handful of bespoke supported games, that is just silly and won't matter to 90% of people and, so far, 99% of games. Could it become more important/supported in the future? Sure, and if almost every game supports DLSS in 4 years' time then I'll eat my crow.

Will AMD have something to compete/compare with DLSS? So far I haven't heard anything about it so we can probably safely guess it is unlikely. Could they have something in the future either through software updates or later cards? Who knows, maybe. Again it depends on how much DLSS continues to improve/is adopted by developers. Right now anyway I wouldn't base my GPU purchase solely on DLSS/AI Upscaling as the deciding factor or most important feature.

Do I think AMD will be competitive on price? Almost definitely. It has generally been their strategy to undercut Nvidia, so I expect either slightly cheaper or, worst case, equal pricing for the equivalent performance tier from Nvidia. I'm not really trying to downplay Nvidia here and upsell AMD, or even trying to pop the hype bubble for Nvidia/3000 series fans; my only goal is to try to rein in expectations a little and bring everyone back down to earth. Also to correct false information that keeps coming up, such as "Nvidia claimed 2x the performance of a 2080 Ti!" etc.

As with anything, I could be wrong about everything above. Most of what we know about Ampere right now is marketing without any real independent benchmarks and we only have varying rumours about RDNA2, sometimes contradictory. The best and most measured course of action is to wait until both release and have actual benchmarks to compare them.
 

pawel86ck

Banned
Yes it seems Nvidia have recently released a new comparison video with the 2080ti for Doom Eternal. Granted we don't know what settings/drivers etc... they are using and as with anything we will need to wait for independent benchmarks to see the real performance/make an informed comparison. But that looks like a nice uplift for Doom.



I think 60-70% might be a more realistic figure? Either way an impressive uplift, but again we will need to wait for real benchmarks and comparisons to see how it plays out across a number of titles. Of course with RT on I expect a bigger uplift than pure rasterization as the RT gains for Ampere seem great compared to Turing.



When I say competitive I mean roughly equal or higher rasterization performance, you know, the actual "power" of a card that we have always compared. No matter what anyone says, this is still, and will always remain, the most important metric of a card's performance. At the moment there are thousands of games on Steam, for example, that don't use either RT or DLSS, and there will continue to be newer releases that don't support either.

I find it interesting that for Nvidia fans, or just people swept up in the current 3000 series hype train the narrative seems to continue shifting as new rumours come out about what RDNA2 might bring.

First the narrative was "Lol AMD would be lucky to compare to a 3070 with their highest card! Expect mid range performance from AMD as usual!". Then as rumours come out stating that RDNA2 might actually be competitive or might even exceed 3070/3080 in power, then suddenly the narrative shifts to "Well even if they do match/exceed in rasterization, that is not important anymore! Because it is all about RT and DLSS!!".

RDNA2 cards will have Ray Tracing. How performant this will be is anyone's guess. Current rumours seem to suggest better than Turing, but probably not as good as Ampere. So let's assume for now that a best case scenario for RDNA2 would be somewhere between 60-80% as performant as Ampere at RT. That is still pretty solid, especially for a first generation RT product. And dare I say it reasonably competitive? Now maybe it could also be much worse than that figure? Hard to say until we have any actual benchmarks.

As for DLSS, it seems like a fantastic technology. Nvidia's R&D teams have come up with something really great here and I really like to see them continue to advance it. Is it the be all and end all of GPU/Graphics technology? Of course not. Is it still great for games that support it? Yeah it seems like a great feature.

Does it look "better than native 4K"? No of course not. Listen, a lot of people need to realize this: Fake is never better than Real.

DLSS is not perfect, introduces some artifacts and obviously is not as good as native res. Having said that is it "close enough" in a lot of cases? It seems like it might be. I say "Might be" because I have not used it myself so can't comment from first hand experience but most reports have it performing quite well for a small reduction in image quality with a few scaling artifacts here and there.

Now here is the rub, there are only around 10 games that support DLSS right now. Out of the thousands on PC. Most of those recent and older games will never support DLSS. Even a lot of newer games probably won't. The reason is that so far it has to be implemented on a game by game basis, rather than something that can be applied to all games. So trying to use DLSS as some kind of trump card when it is barely supported as of now is a bit silly.

People also seem not to understand that this is not something you can just turn on automatically and have it work for all games. So when people try to downplay actual power (rasterization) and instead hype up DLSS as more important, when it has only a tiny handful of bespoke supported games, that is just silly and won't matter to 90% of people and, so far, 99% of games. Could it become more important/supported in the future? Sure, and if almost every game supports DLSS in 4 years' time then I'll eat my crow.

Will AMD have something to compete/compare with DLSS? So far I haven't heard anything about it so we can probably safely guess it is unlikely. Could they have something in the future either through software updates or later cards? Who knows, maybe. Again it depends on how much DLSS continues to improve/is adopted by developers. Right now anyway I wouldn't base my GPU purchase solely on DLSS/AI Upscaling as the deciding factor or most important feature.

Do I think AMD will be competitive on price? Almost definitely. It has generally been their strategy to undercut Nvidia, so I expect either slightly cheaper or, worst case, equal pricing for the equivalent performance tier from Nvidia. I'm not really trying to downplay Nvidia here and upsell AMD, or even trying to pop the hype bubble for Nvidia/3000 series fans; my only goal is to try to rein in expectations a little and bring everyone back down to earth. Also to correct false information that keeps coming up, such as "Nvidia claimed 2x the performance of a 2080 Ti!" etc.

As with anything, I could be wrong about everything above. Most of what we know about Ampere right now is marketing without any real independent benchmarks and we only have varying rumours about RDNA2, sometimes contradictory. The best and most measured course of action is to wait until both release and have actual benchmarks to compare them.
AMD hasn't been competitive in the high end since the 7970 (Tahiti), and I doubt things will change now, especially given what we already know about RDNA2 (the Xbox Series X uses it) and Ampere. When Turing launched, Nvidia had to train the AI for each game specifically in order to use DLSS 1.0, but now they don't need to do that, so we will most likely see many games supporting 2.0 from now on.
 
Last edited:
AMD hasn't been competitive in the high end since the 7970 (Tahiti), and I doubt things will change now, especially given what we already know about RDNA2 (the Xbox Series X uses it) and Ampere. With DLSS 1.0, Nvidia had to train the AI for each game specifically; now they don't need to do that, so we will most likely see many games supporting 2.0 from now on.
False. R9 290X was competitive with the 780 Ti but generally weaker at launch. Down the line it pulled away but that was mainly due to continued driver support.

It did run hot and was loud as heck though.
 

supernova8

Banned
AMD hasn't been competitive in the high end since the 7970 (Tahiti), and I doubt things will change now, especially given what we already know about RDNA2 (the Xbox Series X uses it) and Ampere. When Turing launched, Nvidia had to train the AI for each game specifically in order to use DLSS 1.0, but now they don't need to do that, so we will most likely see many games supporting 2.0 from now on.

Where did they say that they don't need to train games specifically for it? There's still a DLSS developer program so I don't think it's (yet) true to say DLSS 2.0 'just works'. It still has to be 'implemented'.
 
Nvidia had to train the AI for each game specifically in order to use DLSS 1.0, but now they don't need to do that, so we will most likely see many games supporting 2.0 from now on.

Is this true? I was under the impression that DLSS 2.0 is not a feature you can turn on in the driver level/control panel but something that still had to be trained/implemented on a per game basis. I could be wrong about that but as far as I was aware it still needs to be trained per game, no?

Could someone with more knowledge than me about DLSS chime in here to confirm? My understanding was that not training per game was the "end goal" of where they would like to eventually be with a DLSS 3.0 or something like that. Again I could be wrong.
 

Leonidas

Member
Next gen seems to be going like this

Nvidia still performance leader.
Nvidia still RT leader.
Nvidia still features leader.
Nvidia still perf/watt leader, but not by much.

So let's assume for now that a best case scenario for RDNA2 would be somewhere between 60-80% as performant as Ampere at RT. That is still pretty solid, especially for a first generation RT product. And dare I say it reasonably competitive?

60-80% is not competitive. If that's the case Nvidia will take the top 3-5 spots in every RT benchmark making anything other than RTX a bad buy if you care about RT performance.

Does it look "better than native 4K"? No of course not. Listen, a lot of people need to realize this: Fake is never better than Real.

DLSS is not perfect, introduces some artifacts and obviously is not as good as native res. Having said that is it "close enough" in a lot of cases? It seems like it might be. I say "Might be" because I have not used it myself so can't comment from first hand experience but most reports have it performing quite well for a small reduction in image quality with a few scaling artifacts here and there.

DLSS 2.0 actually does look better than native 4K TAA. I say this as someone who has actually used the technology in a number of games.

Now here is the rub, there are only around 10 games that support DLSS right now. Out of the thousands on PC. Most of those recent and older games will never support DLSS. Even a lot of newer games probably won't. The reason is that so far it has to be implemented on a game by game basis, rather than something that can be applied to all games. So trying to use DLSS as some kind of trump card when it is barely supported as of now is a bit silly.

One of DLSS's goals is to boost RT performance. DLSS is in 100% of the games that have released with RT so far; if this continues, it will be in the games that matter.
 
Last edited:

supernova8

Banned
Actually they claim "up to 2 times" performance of a 2080, not a 2080ti.

In real world scenarios the performance obviously will not match 2x a 2080. As with any early claim or marketing slide, no matter who it is from always take it with a grain of salt until we have real benchmarks.

Regarding the 3080 vs 2080 comparison for Doom Eternal at 4K Nightmare settings that Nvidia showed: those settings rely on a large amount of VRAM to not be bottlenecked. The 2080 only has 8GB of VRAM, which means it does not perform as well at 4K in this title as even the 1080 Ti, which has 11GB. Yes, you heard that right: the 1080 Ti beats out the 2080 at those settings in Doom by around 10+ fps, despite the 2080 being the more powerful card, thanks to the extra VRAM.

This is why Nvidia chose to compare this game with these settings for the 2080 (8GB) vs 3080 (10GB) face off knowing it would make the 3080 look far better than the 2080. People would then extrapolate that out as a baseline performance increase for all titles vs a 2080 and assume a massive power jump. Great marketing certainly, and the 3080 is definitely a nice jump over the 2080 but had they chosen a 2080ti instead (which has more VRAM) for the comparison the results would have been a lot closer. 2080ti averages around 140fps in Doom Eternal with those settings I believe.

Anyway, sorry to sidetrack the thread on this, but I think it is important to set the record straight on any incorrect information and also to contextualize the only "real world" scenario/comparison we have seen with the 3080. I'm not saying it is not a powerful card; I think the 3000 series seems great so far (minus the low VRAM), but I'm trying to bring people back down to earth.

As for your question: "Why do we think NVIDIA has been so aggressive on pricing?"

Simple, they likely know that AMD is cooking up something reasonably competitive with RDNA2, something that can likely compete at the 3080 (flagship high-end) level. They saw how AMD got their shit together and challenged them at the mid range with RDNA1 (the 5700XT etc.), and they don't want to get Intel'd, essentially. This is definitely good for gamers; even the most hardcore green-blooded Nvidia fanboys can't see a problem with this, as (at least for now, perceived) competition from AMD has caused Nvidia to aggressively drop prices. If the Radeon group were still a shitshow, then the 3000 series would likely look/be priced a bit differently from what we see today.

Yeah my bad I got it wrong on the 2080 Ti. Still as you say, it didn't really hit 2x performance even in the cherry-picked DF benchmarks.

With RDNA1 (the 5700XT) I agree they actually competed, but it seems that (if the Steam hardware survey stuff is correct) hardly anyone bought into the 5700XT. I guess the difference between Intel and NVIDIA is that people were annoyed Intel was ripping them off with minuscule improvements while charging more and more, whereas NVIDIA has been charging more and actually providing better performance.

Not quite as stellar as it could have been if they'd had real competition, but I think, for instance, the GTX 1060 was a reasonable card for a reasonable price, and AMD still wasn't really competitive at the time.

I'm with you though I really hope for everyone's sake that AMD brings the heat.
 

supernova8

Banned
People need to stop with this "NVIDIA is aggressive with prices because they know AMD is cooking up something good".

That reasoning is stupid; people made the same argument regarding the 980 Ti and 1080 Ti, and AMD ended up having no answer.

They might or might not have something great in the pipeline; NVIDIA being aggressive with prices isn't an indication of that at all.

Those cards were the prelude to NVIDIA taking the piss, culminating in them taking a massive hot yellow stinky piss with the RTX series.
 

supernova8

Banned
The 3090 is the new Titan...
That thing is completely and utterly irrelevant for their sales numbers. It's just "performance-crown" image management.

The 3070, and the inevitable 3050/3060, as well as the respective Ti versions, are where they make the sales numbers.
As usual I might add.

It might surprise you, but most people would never consider spending $700+ on a GPU.

True, the most I've spent (in recent memory) has been about $350. I would be willing to go up to $450 for a 3070 with 16GB or an AMD equivalent.
 

Ascend

Member
People need to stop with this "NVIDIA is aggressive with prices because they know AMD is cooking up something good".

That reasoning is stupid; people made the same argument regarding the 980 Ti and 1080 Ti, and AMD ended up having no answer.

They might or might not have something great in the pipeline; NVIDIA being aggressive with prices isn't an indication of that at all.
Actually it makes perfect sense. For every one of those cards you mentioned, AMD had something coming. Whether AMD ultimately delivered or not is another story.

For the 980 Ti, there was the Fury X. The Fury X was quite good, except it was hampered by 4GB.
For the 1080 Ti there was the Vega 64. It ended up having poor CU scaling and being power hungry.
For the RTX 2000 series, nVidia knew AMD had nothing, and look what happened to prices.
The moment the 5700XT was on the horizon, nVidia immediately answered with lowered prices and higher performance with their Super cards. And that was only in the areas where AMD had something. The rest stayed overpriced.

Every time AMD has something, nVidia becomes more aggressive in pricing. They do not know how well it will perform, but they do their best to stay ahead of AMD at all cost. And this time, there is the factor of nVidia missing out on the node they actually wanted to use, which was TSMC 7nm. So nVidia knows they are at a slight disadvantage here. Not because of the node alone, but because of the advances of RDNA. Going by the prices, nVidia expects AMD to compete with the RTX 3080, but not the RTX 3090.

If nVidia thought AMD had nothing to compete with, I wouldn't be surprised if we had gotten double the prices that we have now for the RTX 3000 series.
 
Last edited:
I would like to revisit this thread in about a month's time, after it's revealed that AMD has a competitor to the 3080. Then the argument against them will be 100% about DLSS and RT being the deciding factors, and that AMD is still not doing enough despite more memory and lower power draw/better efficiency.
 

pawel86ck

Banned
Is this true? I was under the impression that DLSS 2.0 is not a feature you can turn on in the driver level/control panel but something that still had to be trained/implemented on a per game basis. I could be wrong about that but as far as I was aware it still needs to be trained per game, no?

Could someone with more knowledge than me about DLSS chime in here to confirm? My understanding was that not training per game was the "end goal" of where they would like to eventually be with a DLSS 3.0 or something like that. Again I could be wrong.
You can't turn it on in each game just yet, but Nvidia is no longer relying on individual (per-game) neural networks, but on a single generic neural network, so we can expect DLSS 2.0 support in many games because it will be much easier (and less expensive) to implement.

 
Does it look "better than native 4K"? No of course not. Listen, a lot of people need to realize this: Fake is never better than Real.

DLSS is not perfect, introduces some artifacts and obviously is not as good as native res.
DLSS 2.0 is already at the point where the picture is absolutely indistinguishable from the native resolution as long as you don't do a thorough picture analysis at several hundred percent zoom. In some cases, like Death Stranding, you can even see more detail than with native res.
150-200% of the performance for a picture where the differences aren't visible without 4x zoom... how can anyone even try to downplay this?


RDNA2 cards will have Ray Tracing. How performant this will be is anyone's guess. Current rumours seem to suggest better than Turing, but probably not as good as Ampere. So let's assume for now that a best case scenario for RDNA2 would be somewhere between 60-80% as performant as Ampere at RT. That is still pretty solid, especially for a first generation RT product. And dare I say it reasonably competitive?
How would such a performance gap in any form be competitive? This would only be competitive if the price was also substantially lower.


When I say competitive I mean roughly equal or higher rasterization performance, you know, the actual "power" of a card that we have always compared. No matter what anyone says, this is still, and will always remain, the most important metric of a card's performance. At the moment there are thousands of games on Steam, for example, that don't use either RT or DLSS, and there will continue to be newer releases that don't support either.
Trying to downplay the importance of RT performance, which is about to become a standard feature this November, is absolutely silly...

Now here is the rub, there are only around 10 games that support DLSS right now. Out of the thousands on PC. Most of those recent and older games will never support DLSS. Even a lot of newer games probably won't. The reason is that so far it has to be implemented on a game by game basis, rather than something that can be applied to all games. So trying to use DLSS as some kind of trump card when it is barely supported as of now is a bit silly.
DLSS 2.0 does not require a special ML-network per game anymore.
We can absolutely expect this to be in all bigger releases from now on.
 
Last edited:
Actually it makes perfect sense. For every one of those cards you mentioned, AMD had something coming. Whether AMD ultimately delivered or not is another story.
Yeah, no shit. AMD had "something coming". Almost as if AMD were dealing in semiconductors, with graphics cards as one of their primary products.

AMD sure has something coming, we all know they do. They haven't been secretive about it. The argument that NVIDIA is aggressive with prices because they fear them is dumb and, based on precedent, completely false.

As said before, the reason I believe AMD will deliver this time is purely based on the consoles. If they can give us 2080-class power in sub-200W with 52 CUs, then I have no doubt a full chip will at least crush the 3070.
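For what it's worth, here is the naive scaling version of that console argument. The Series X figures (52 CUs at about 1.825GHz) are public; the "full chip" CU count and clock are pure assumptions, and real GPUs scale well below linearly, so read this as a ceiling rather than a prediction.

```python
# Naive throughput scaling from the Series X GPU to a hypothetical full desktop RDNA2 chip.
# The desktop CU count and clock are assumptions; real scaling is never this linear.
console_cus, console_clock_ghz = 52, 1.825        # Xbox Series X
assumed_big_cus, assumed_big_clock_ghz = 80, 2.0  # hypothetical "full chip"

naive_scaling = (assumed_big_cus * assumed_big_clock_ghz) / (console_cus * console_clock_ghz)
print(f"Naive throughput scaling over the console GPU: {naive_scaling:.2f}x")
# ~1.69x on paper; bandwidth, power limits and CU scaling losses will eat into that.
```

If the console part really is around 2080-class, even a heavily discounted version of that ~1.7x would land above a 3070-class card, which is the crux of the argument being made here.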
 

llien

Member
There is a comparison with the 2080 Ti in Doom Eternal. The RTX 3080 is 50% (up to 65%) faster
Bait for wenchmarks from reputable sources. There are older videos of 2080Ti running that game notably faster.
I still wonder why TPU @btarunr claimed it was 20-30%.

no longer relying on individual (per-game) neural networks
Means the true ML approach (DLSS 1.0, a per-game-trained NN that failed against AMD's FidelityFX, which also runs on NV, by the way) has flopped.
And now all that is left is TAA on steroids, sprinkled with buzzwords.

I wonder how long it would take hackers to grab the NN weights and figure out the topology.

the picture is absolutely indistinguishable from the native resolution as long as you don't do a thorough picture analysis at several hundred percent zoom.

And that does not, with the same reservation, apply to normal upscalers, right?
As in, I can tell whether it is 4K, 1080p or 1440p when watching a 65" screen from a 4.5-meter distance.

The argument you are making is against higher resolution in general.
 
Last edited:

ZywyPL

Banned
I think the reason is the upcoming consoles: with expected $499-599 price tags while supporting 4K, RT, HDR and whatnot, NV would have a really hard time selling $800/$1,200/$2,000 cards to anyone except die-hard PC enthusiasts. Plus the die sizes are much smaller this time around, by about 20%; that's where Turing's ridiculous prices mostly came from.
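On the die-size point, a toy cost-per-good-die model shows why a roughly 20% smaller die matters more than 20%: you fit more die candidates on each wafer and a larger share of them yield. The wafer price, defect density and die areas below are placeholders/approximations, so only the trend is meaningful.

```python
import math

# Toy cost-per-good-die model. Wafer price and defect density are placeholders,
# and the die areas are only approximate; the point is the trend, not the dollar values.
def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius ** 2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return gross - edge_loss

def yield_rate(die_area_mm2, defects_per_mm2=0.001):
    # Simple Poisson defect model: larger dies are more likely to catch a defect.
    return math.exp(-defects_per_mm2 * die_area_mm2)

def cost_per_good_die(die_area_mm2, wafer_cost=8000.0):
    return wafer_cost / (dies_per_wafer(die_area_mm2) * yield_rate(die_area_mm2))

for label, area in [("~750 mm^2 die (Turing-class)", 750.0),
                    ("~630 mm^2 die (Ampere-class)", 630.0)]:
    print(f"{label}: about ${cost_per_good_die(area):.0f} per good die")
```

With these made-up inputs the smaller die comes out roughly a quarter cheaper per good die, which is the direction of the argument even if the absolute numbers are fiction.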


Does it look "better than native 4K"? No of course not. Listen, a lot of people need to realize this: Fake is never better than Real.

And yet it is, without any of the aliasing the native image has, or any of the blurring caused by post-processing AA. Sure, it's not perfect (yet); it can cause some strange artifacts sometimes, as seen in Death Stranding (maybe that's the game's fault, not the tech's?), but at the end of the day you are getting a super clear, razor-sharp image with double the framerate, so needless to say, nothing better has been discovered so far. Except maybe supersampling, like rendering at 8K on a 4K display; sure, it eats a ton of resources, but it is completely flawless in return.
 

Ascend

Member
Yeah, no shit. AMD had "something coming". Almost as if AMD were dealing in semiconductors, with graphics cards as one of their primary products.

AMD sure has something coming, we all know they do. They haven't been secretive about it. The argument that NVIDIA is aggressive with prices because they fear them is dumb and, based on precedent, completely false.
Funny how you're simply dismissing the argument itself with a condescending attitude, and simply repeating the same old statement.

Can't talk to closed minds I guess. Believe whatever you want to.

As said before, the reason I believe AMD will deliver this time is purely based on the consoles. If they can give us 2080-class power in sub-200W with 52 CUs, then I have no doubt a full chip will at least crush the 3070.
At least you do have some sense.
 
And that does not, with the same reservation, apply to normal upscalers, right?
As in, I can tell whether it is 4K, 1080p or 1440p when watching a 65" screen from a 4.5-meter distance.

The argument you are making is against higher resolution in general.
The difference in picture clarity between a classically upscaled image and a DLSS 2.0 reconstructed image is massive.
Test it in Control or Death Stranding and you will see what I mean.

So yes, if DLSS is available I would much rather use that than take the performance hit for native 4k for example. Especially if RT is involved.
 
Last edited:

pawel86ck

Banned
Bait for wenchmarks from reputable sources. There are older videos of 2080Ti running that game notably faster.
I still wonder why TPU @btarunr claimed it was 20-30%.


Means the true ML approach (DLSS 1.0, a per-game-trained NN that failed against AMD's FidelityFX, which also runs on NV, by the way) has flopped.
And now all that is left is TAA on steroids, sprinkled with buzzwords.

I wonder how long it would take hackers to grab the NN weights and figure out the topology.
Nvidia has compared both GPUs in the same scenes, so you can't ask for a fairer comparison. I'm sure it's possible to find less demanding benchmark locations where the 2080 Ti will reach over 100fps as well, but the performance delta should remain the same.

AMD FidelityFX CAS is just a sharpening filter; you may as well turn on the sharpening filter in your HDTV and call it FidelityFX CAS.
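For context on what "just a sharpening filter" means, here is a heavily simplified contrast-adaptive sharpen in NumPy. It is a toy in the spirit of CAS, not AMD's actual FidelityFX shader or its exact weights: it only pushes existing pixels around based on local contrast, and it does not reconstruct detail from previous frames the way a temporal upscaler does.

```python
import numpy as np

def cas_like_sharpen(img, sharpness=0.5):
    """Toy contrast-adaptive sharpen for a float image in [0, 1], shape (H, W, C).
    Simplified in the spirit of FidelityFX CAS; not AMD's actual shader."""
    p = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    c = p[1:-1, 1:-1]                      # centre
    n, s = p[:-2, 1:-1], p[2:, 1:-1]       # north, south neighbours
    w, e = p[1:-1, :-2], p[1:-1, 2:]       # west, east neighbours

    # Local contrast of the cross neighbourhood controls how hard to sharpen:
    # flat or already-saturated regions get little, mid-contrast edges get more.
    mn = np.minimum.reduce([c, n, s, w, e])
    mx = np.maximum.reduce([c, n, s, w, e])
    amount = np.sqrt(np.clip(np.minimum(mn, 1.0 - mx) / np.maximum(mx, 1e-5), 0.0, 1.0))

    weight = -amount * (0.125 + 0.075 * sharpness)   # negative weight on the 4 neighbours
    out = (c + weight * (n + s + w + e)) / (1.0 + 4.0 * weight)
    return np.clip(out, 0.0, 1.0)
```

A uniform region passes through unchanged (the weights cancel out), while edges get a contrast-dependent boost, which is the whole trick: no new information is created.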
 
Funny how you're simply dismissing the argument itself with a condescending attitude, and simply repeating the same old statement.

Can't talk to closed minds I guess. Believe whatever you want to.
I'm dismissing the argument because it's been wrong two generations in a row. What's there to be said, exactly? I'm being condescending because you didn't even bother addressing my argument and instead showed exactly why the belief that they'd be competitive, based on price alone, is stupid, by bringing up garbage cards like the Fury X to prove a point.
 

psorcerer

Banned
Now here is the rub, there are only around 10 games that support DLSS right now.

I think MSFT will release DirectMLSS, which will run on all vendors and be compatible with all DX12U games.

I was under the impression that DLSS 2.0 is not a feature you can turn on in the driver level/control panel but something that still had to be trained/implemented on a per game basis. I could be wrong about that but as far as I was aware it still needs to be trained per game, no?

Any temporal SS/AA solution needs motion vector data to look good. I.e. you still need to implement your game's render engine in a specific way. And frankly, all games that use TAA are good candidates for implementing DLSS. But it's indeed not a free feature; you need to write code for it.
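To make the "you need to write code for it" point concrete, here is a schematic of the per-frame data an engine has to hand any temporal upscaler of this kind. The function and field names are invented for illustration and do not correspond to the real DLSS SDK; the takeaway is simply that jittered rendering, depth and motion vectors have to come from the engine itself, which is why this cannot be toggled on from the driver.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class UpscalerInputs:
    """Per-frame inputs a temporal upscaler needs from the engine.
    Field names are illustrative, not the real DLSS SDK structures."""
    color: np.ndarray            # low-res HDR color, rendered with a sub-pixel jitter
    depth: np.ndarray            # low-res depth buffer
    motion_vectors: np.ndarray   # per-pixel screen-space motion, low-res
    jitter: tuple                # the (x, y) sub-pixel camera jitter used this frame
    reset_history: bool          # true on camera cuts so stale history is dropped

def render_frame(engine, upscaler, frame_index):
    # The engine must jitter its own projection matrix and export motion vectors;
    # this per-title plumbing is the "implementation" being discussed here.
    jitter = engine.halton_jitter(frame_index)
    color, depth, motion = engine.render_low_res(jitter)
    return upscaler.evaluate(UpscalerInputs(
        color=color,
        depth=depth,
        motion_vectors=motion,
        jitter=jitter,
        reset_history=engine.camera_cut_this_frame(),
    ))
```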

DLSS 2.0 actually does look better than native 4K TAA.

It's a fallacy. I.e. it may look much better if the developer was not accurate enough with their MIP levels for 4K. But overall it cannot be better, by definition.
Only if the DNN is trained per game can it look better than native, for obvious reasons.
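One concrete piece of that MIP-level interaction: when a game renders at a reduced internal resolution for an upscaler, the usual recommendation is a negative texture LOD bias so textures are still sampled at output-resolution detail. A quick sketch of the commonly used formula (any extra fixed offset on top of it is implementation-specific):

```python
import math

def texture_lod_bias(render_width, display_width):
    """Commonly recommended negative MIP bias when rendering below output resolution
    for a temporal upscaler, so texture detail matches the display resolution."""
    return math.log2(render_width / display_width)

print(texture_lod_bias(2560, 3840))  # 1440p internal -> 4K output: about -0.58
```

If a game skips this, the upscaled image looks softer than native; if the bias is applied and the history accumulates, the result can look sharper than a native presentation with conservative MIP selection, which is one way the "better than native" impression arises.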
 

psorcerer

Banned
I wonder how long it would take hackers to grab the NN weights and figure out the topology.

Sadly the value of a DNN is in the training set and not the topology. That's why startups are less and less competitive there.
NV have enough resources to make a great training set.
I'm not sure about AMD though.
But I do expect MSFT to train their DNNs as we speak.
 
It's a fallacy. I.e. it may look much better if the developer was not accurate enough with their MIP levels for 4K. But overall it cannot be better, by definition.
Only if the DNN is trained per game can it look better than native, for obvious reasons.
Wrong. ML reigns supreme in pattern recognition of any kind, the easiest cases being letters and basic geometric forms. That's why DLSS 2.0 can already reconstruct details like writing with higher clarity than the native resolution, and this will absolutely get better and better.
 
Last edited:

MadYarpen

Member
Personally I am waiting for the whole picture.
I need a GPU for a UWQHD monitor (most likely), and I want a reasonably priced one. So the 3070 seems the way to go.

I would like to be able to choose between nVidia and competitive AMD. But I expect it will come down to DLSS and RT. And it does not seem likely AMD will be able to compete with this...
 
60-80% is not competitive. If that's the case Nvidia will take the top 3-5 spots in every RT benchmark making anything other than RTX a bad buy if you care about RT performance.

I don't know; let's say at the top-end estimate, 80% is probably "reasonably competitive", with Ampere taking a definite lead. But who knows, maybe it could be 90+% of Ampere? Would that be competitive in your view, or would only a 5% loss to Ampere be competitive? I'm honestly asking because I don't really know how the RT will play out; it's still too early to tell, but it is likely that Ampere will beat out RDNA2 in RT. The speculation at the moment I guess is "by how much?" and "is that amount enough/competitive?"

I seriously doubt RDNA2 RT will surpass Ampere, so it will likely clock in at some percentage below the equivalent Ampere card. Of course, if rasterization is better on RDNA2, then maybe that could balance it out and the final FPS number will actually be pretty close? Honestly it is too early to tell anything concrete.
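One way to see how a rasterization lead could offset an RT deficit in the final frame rate is a toy frame-time split. Every number below is made up purely for illustration, and real GPUs overlap raster and RT work, so this is directional only.

```python
# Toy frame-time model: raster lead vs RT deficit. All inputs are made up,
# and real GPUs overlap this work, so treat the result as directional only.
def fps(raster_ms, rt_ms):
    return 1000.0 / (raster_ms + rt_ms)

ampere_like = fps(10.0, 6.0)               # hypothetical: 10 ms raster + 6 ms RT per frame
rdna2_like = fps(10.0 / 1.10, 6.0 / 0.70)  # hypothetical: 10% faster raster, 70% of the RT throughput

print(f"Ampere-like: {ampere_like:.1f} fps, RDNA2-like: {rdna2_like:.1f} fps "
      f"({rdna2_like / ampere_like:.0%} of the Ampere-like figure)")
```

With those made-up inputs a 30% RT deficit shows up as only about a 10% gap in the final frame rate, which is why "how much of the frame is actually RT work" matters as much as raw RT throughput.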

More than likely you are probably better off going with Nvidia if Ray Tracing is the single most important feature to you and you have to have the best RT performance regardless of price.
 

psorcerer

Banned
Wrong. ML reigns supreme in pattern recognition of any kind, the easiest cases being letters and basic geometric forms. That's why DLSS 2.0 can already reconstruct details like writing with higher clarity than the native resolution, and this will absolutely get better and better.

SS is a case of inventing missing details.
Pattern recognition cannot fill in these details.
You need something that's called "context". And for different games the context can be totally different.
I can give an example: everybody knows that DNNs can stylize paintings. Like, draw me a Madonna but in Van Gogh's style.
But what you expect in gaming is that you can say: draw me a game in this new John Doe game developer style that you have no idea of. And then DNN does that...
There is nothing similar, literally nothing, between Kena: Bridge of Spirits and Death Stranding, yet you expect a DNN to invent perfect details for both?
 
Last edited:

Sentenza

Member
This reads almost like fanfiction with a light icing of wishful thinking on top.

Why do we think NVIDIA has been so aggressive on pricing?
I don't. Not more than they usually do, at least. Which depends on their costs to begin with.

Also, I expect AMD to barely manage to trade blows with a 3070, and at a similar price point, let alone be competitive with the other two.
 
SS is a case of inventing missing details.
Pattern recognition cannot fill in these details.
You need something that's called "context". And for different games the context can be totally different.
I can give an example: everybody knows that DNNs can stylize paintings. Like, draw me a Madonna but in Van Gogh's style.
But what you expect in gaming is that you can say: draw me a game in this new John Doe game developer style that you have no idea of. And then DNN does that...
There is nothing similar, literally nothing, between Kena: Bridge of Spirits and Death Stranding, yet you expect a DNN to invent perfect details for both?
Without a problem... what you call "context" is easily extrapolated from a surprisingly small sample base.
You are sorely underestimating what NNs can do nowadays without much manual input.
 
Last edited:

psorcerer

Banned
Without a problem......

That's not an answer. It cannot.
Either the details will be imperfect or there is an existing context the network was trained on (i.e. realistic games; there's only one "realistic" context, so any training on Quixel assets, for example, will benefit all Quixel-based games, case in point: Death Stranding).
 
That's not an answer. It cannot.
Either the details will be imperfect or there is an existing context the network was trained on (i.e. realistic games; there's only one "realistic" context, so any training on Quixel assets, for example, will benefit all Quixel-based games, case in point: Death Stranding).
Yes, it can.
Again, you don't need a specific context for every single target... that is the whole point of a sample base and a self-learning NN.
 
Last edited:

psorcerer

Banned
Yes, it can.
Again, you don't need a specific context for every single target... that is the whole point of a sample base and a self-learning NN.

There is no training phase in these.
It's only inference, and on a pretty tight budget at that.
But if they do use some fine-tuning setup at development time it may be pretty OK.
 
There is no training phase in these.
Ofc there is. You don't actually think Nvidia ever shuts their NN down or stops adapting their sample base, do you? The learning/training process is never-ending.

It's only inference, and on a pretty tight budget at that.
Correct. Which is why you have specialized hardware to apply it in time.

But if they do use some fine-tuning setup at development time it may be pretty OK.
I think that's a given. You don't just set a flag and have flawless TAA either.
 
Last edited:

supernova8

Banned
to counter attack next gen consoles?

Well, they launched the 780 Ti for $699 the same month the PS4 launched for $399 (and also when AMD had the R9 290X, which was a pretty decent card), so that doesn't really hold true unless you go all the way back to the PS3.
 