
NVIDIA/AMD (Ampere v RDNA2) war games - what the hell is going on?

amigastar

Member
With graphics cards I always go with Nvidia, never been disappointed. I'll get the 3000 series next year. As far as I know AMD has nothing like DLSS.
 
Last edited:

llien

Member
Test it in Control
That game that looks like it's from the mid-2000s, eh? And uses that RTX thing most devs avoid. Inspiring.

ML reigns supreme in pattern recognition of any kind.
This is patently wrong. An NN is a bunch of math equations of a certain kind. By increasing its complexity you could

what NN can do nowadays without much manual input.
An NN can do exactly the same things it could do decades ago.
There is no free lunch: you train an NN, it picks up biases.
If you train per game, you can teach the NN to be biased toward the way the art works in that particular game (e.g. Team Fortress visuals vs. God of War).
The more generic you go, the less help the NN brings.
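To illustrate the per-game vs. generic point with a toy example: a minimal sketch (plain least-squares fitting stands in for "training an NN"; the two "games" are made-up 1D functions, purely an assumption for illustration) showing that a model fit to one game's data matches that game better than a model fit to a mix of games.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "games" with different low-res -> high-res relationships (toy 1D stand-ins).
def game_a(x): return 2.0 * x + 0.5
def game_b(x): return -1.0 * x + 3.0

x = rng.uniform(0, 1, size=(1000, 1))
ya, yb = game_a(x), game_b(x)

def fit_and_eval(x_train, y_train, x_test, y_test):
    # Ordinary least squares with a bias term stands in for "training".
    X = np.hstack([x_train, np.ones_like(x_train)])
    w, *_ = np.linalg.lstsq(X, y_train, rcond=None)
    Xt = np.hstack([x_test, np.ones_like(x_test)])
    return float(np.mean((Xt @ w - y_test) ** 2))

# Per-game model: trained and evaluated on game A only.
per_game_err = fit_and_eval(x, ya, x, ya)

# "Generic" model: trained on a mix of both games, evaluated on game A.
x_mix = np.vstack([x, x])
y_mix = np.vstack([ya, yb])
generic_err = fit_and_eval(x_mix, y_mix, x, ya)

print(f"per-game error on game A: {per_game_err:.4f}")   # ~0
print(f"generic error on game A:  {generic_err:.4f}")    # noticeably larger
```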

Now, that was theory. When applied in practice (DLSS 1.0), we saw that FidelityFX (which also runs on NV, mind you) beats all that data-center magic.

And now we have a somewhat boosted TAA being sold as the second coming.

For the love of god, please stop referring to it by target resolution, it is terribly misleading.
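To make the resolution point concrete, a quick sketch (the per-axis scale factors are the commonly cited DLSS 2.0 mode values, stated here as an assumption rather than something from this thread) of how much is actually rendered before the upscale to a "4K" output:

```python
# Internal render resolution vs. the "target" resolution people quote.
# Per-axis scale factors are the commonly cited DLSS 2.0 values (assumption).
MODES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Resolution actually rendered before the upscale to (out_w, out_h)."""
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

for mode in MODES:
    w, h = internal_resolution(3840, 2160, mode)
    share = (w * h) / (3840 * 2160)
    print(f"{mode:17s}: renders {w}x{h} (~{share:.0%} of the 4K pixel count)")
```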


AMD has nothing like DLSS.
As far as I know, people who always buy NV tend to have very sporadic knowledge of GPU tech.
 
Last edited:

DaGwaphics

Member
It's a fallacy. I.e. it may look much better if the developer was not accurate enough with their MIP levels for 4K. But overall it cannot be better by definition.
Only if the NN is trained per game can it look better than native, for obvious reasons.

That may be true in practice, but not as a rule. The ML inference is additive to the final image; as the neural networks get smarter (which they will, continually, with the DLSS 2.0 implementation), they can more effectively ADD detail to images, even beyond a native 4K image.

The primary advantage to end users is the ability to run at a convincing 4K on systems that may lack the VRAM or bandwidth to do so natively.
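As a rough illustration of what "additive" means here: a minimal sketch (numpy only, with made-up function names; it ignores motion vectors, jitter and the learned per-pixel weighting a real DLSS-style pipeline uses) of how a temporal upscaler accumulates samples from previous frames on top of the current low-resolution frame.

```python
import numpy as np

def upscale_nearest(frame, scale):
    """Naive spatial upscale of an HxWxC frame by an integer factor."""
    return frame.repeat(scale, axis=0).repeat(scale, axis=1)

def temporal_accumulate(low_res_frame, history_high_res, scale=2, alpha=0.1):
    """Blend the upscaled current frame into a high-res history buffer.

    Over many frames the history accumulates sub-pixel detail that no single
    low-res frame contains -- the sense in which the result can exceed what a
    single frame at the render resolution shows. A real implementation would
    warp `history_high_res` by motion vectors and use a network (or
    heuristics) to pick the per-pixel blend weight instead of a fixed alpha.
    """
    current = upscale_nearest(low_res_frame, scale).astype(np.float32)
    if history_high_res is None:
        return current
    return alpha * current + (1.0 - alpha) * history_high_res

# Toy usage: feed a stream of 540p stand-in frames, keep a 1080p history.
history = None
for _ in range(8):
    low_res = np.random.rand(540, 960, 3).astype(np.float32)
    history = temporal_accumulate(low_res, history, scale=2)
print(history.shape)  # (1080, 1920, 3)
```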
 

VFXVeteran

Banned
So we've had Ampere revealed; we roughly know what to expect.

RTX 3090 (from $1499) - complete overkill SKU.
RTX 3080 (from $699) - NVIDIA claims 2x performance over the 2080 (not the Ti, thanks to those who corrected). Looks like an over-promise, but a 70-80% gain is believable from the numbers.
RTX 3070 (from $499) - NVIDIA claims better than the 2080 Ti, probably as long as you don't go above 1440p.

On the AMD/RDNA2/Big Navi side we still don't really know.

We have something of a polarisation between tech youtubers/leakers, with some thinking Ampere has killed AMD's chances and others thinking AMD is quietly waiting to pounce.

On the understanding that, at this moment in time, we really have no idea, let's think about what could happen in a few months' time. I'll go first.

I think AMD will release a 16GB GPU that destroys the 3070 but doesn't quite beat the 3080. I think they'll release it for $499.
It would be crazy if they also released a 24GB or 32GB (is that even possible?) variant for around $899.


It's perfectly within the realms of possibility that NVIDIA then releases a 3070 Super/Ti later on with 16GB and higher clocks (it's not quite as power hungry as the 3080/90).

Question for everyone:

Why do we think NVIDIA has been so aggressive on pricing?

Because they want more people to adopt the PC. It will make going to a unified platform that much easier when Sony/MS decide to abandon their hardware and fully sell their brand and software. (y)
 
If AMD were smart, they would leverage the fact that they make CPUs to add value to their GPUs.

Just imagine if Nvidia made x86 CPUs or owned ARM - they would 100% have figured out a way that using Nvidia GPUs with Nvidia CPUs would provide some extra benefit(s). Hell, maybe they'd add CPUs onto their GPUs.

Why can't AMD leverage SOME kind of advantage? Sure, they make APUs, but so far they're not all that compelling - and they SHOULD BE by now. Having a top-tier CPU on the same die as a top-tier GPU, sandwiched between 3D-stacked memory, SHOULD provide advantages that ONLY AMD can offer. But instead they've completely failed to deliver in this area.

I expect to be disappointed by Big Navi. Prove me wrong AMD. Please.
 
Last edited:

supernova8

Banned
That game that looks like it's from the mid-2000s, eh? And uses that RTX thing most devs avoid. Inspiring.

Interesting point, how many PC games that have been released in the last, say, 2 years have ray-tracing built in without having to install mods?

Given NVIDIA's eagerness to market RTX one would be forgiven for thinking NVIDIA is footing some of the bill (like a slightly more widely used Hairworks).
 

amigastar

Member
That game that looks like it's from the mid-2000s, eh? And uses that RTX thing most devs avoid. Inspiring.


This is patently wrong. An NN is a bunch of math equations of a certain kind. By increasing its complexity you could


An NN can do exactly the same things it could do decades ago.
There is no free lunch: you train an NN, it picks up biases.
If you train per game, you can teach the NN to be biased toward the way the art works in that particular game (e.g. Team Fortress visuals vs. God of War).
The more generic you go, the less help the NN brings.

Now, that was theory. When applied in practice (DLSS 1.0), we saw that FidelityFX (which also runs on NV, mind you) beats all that data-center magic.

And now we have a somewhat boosted TAA being sold as the second coming.

For the love of god, please stop referring to it by target resolution, it is terribly misleading.



As far as I know, people who always buy NV tend to have very sporadic knowledge of GPU tech.
My knowledge is fine, thank you. I don't want to exclude AMD here, but the new 3000 series is looking good over here, that's all I'm saying.
 
Last edited:
This is patently wrong. An NN is a bunch of math equations of a certain kind. By increasing its complexity you could
Yeah, educate yourself on that please....
I work with that tech professionally and that is exactly what it does best...the use case.

An NN can do exactly the same things it could do decades ago.
Aw gee, if only the hardware had evolved since then...

If you train per game, you can teach the NN to be biased toward the way the art works in that particular game (e.g. Team Fortress visuals vs. God of War).
The more generic you go, the less help the NN brings.
This is the same bullshit you started with the RT discussion.
We see excellent results already and yet you just whine about the "how".

And now we have a somewhat boosted TAA being sold as the second coming.
Wow... that is so ignorant I don't even know what to say to this obvious troll attempt.

That game that looks like it's from the mid-2000s, eh? And uses that RTX thing most devs avoid. Inspiring.
More troll nonsense....

I think at this point you've made it abundantly clear that you just follow your "nvidia bad" narrative, no matter what it's about. You can continue your troll bullshit on my ignore list.
 
Last edited:

llien

Member
convinced
troll nonsense....
Fuck off with brigading assholery.

Yeah, educate yourself on that please....
I work with that tech professionally and that is exactly what it does best...the absolutely easiest use case.
Your statement was about math, a front on which you are lacking, I guess.
FidelityFX beating a non-TAA solution is quite notable.

This is the same bullshit you started with the RT discussion.
...so ignorant I don't even know what to say...
"Thank you to telling me something I didn't know", would be good enough. Or just nothing.




 
Last edited:

diffusionx

Gold Member
Well, they launched the 780 Ti for $699 the same month the PS4 launched for $399 (and also when AMD had the R9 290X, which was a pretty decent card), so that doesn't really hold true unless you go all the way back to the PS3.

The PS4 sucked hardware-wise. These consoles are reasonably competent at least, and have been hyped based on hardware capability. I don't think Nvidia has focused on teraflops so much in their unveilings in the past, but my bet is they did that because that's what gamers are talking about online (even if it is a meaningless number, especially on PC).

Basically, all this talk about how the next consoles are 2080-ish in one box for $500 may be pressuring Nvidia to blow away the 2080/1080 Ti and get people to upgrade.
 
Last edited:

psorcerer

Banned
Ofc there is. You don't actually think Nvidia ever shuts their NN down or stops adapting their sample base, do you?

It's generic.

Correct. Which is why you have specialized hw to apply it in time.

I'm not sure that that much hardware is needed. We'll see :messenger_winking:

I think that's a given. You don't just set a flag and have flawless TAA either.

AFAIR, the consensus is that TAA should be custom per-game.
Viewing DLSS as a "generic" part of that custom TAA seems to make a lot of sense.
Anyway, fine-tuning is per-game training.
 

Leonidas

Member
I don't know; let's say at the top-end estimation, 80% is probably "reasonably competitive", with Ampere taking a definite lead. But who knows, maybe it could be 90+% of Ampere? Would that be competitive in your view, or would only a 5% loss to Ampere be competitive? I'm honestly asking because I don't really know how the RT will play out; still too early to tell, but it is likely that Ampere will beat out RDNA2 in RT. The speculation at the moment I guess is "by how much?" and "is that amount enough/competitive?"

90% would be good but it seems highly unlikely. And you also have to take into consideration that so far DLSS is supported in 100% of RT games. If that continues, Nvidia basically gets an extra 50-60% boost on top of whatever RT advantage they likely already have. In that sense I can't see AMD being competitive in RT unless they are close to Nvidia in RT and also have something that is as good as DLSS 2.0...
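Back-of-the-envelope math for that stacking (the 20% raw RT lead is a made-up placeholder; only the 1.5-1.6x DLSS uplift comes from the post above):

```python
# Hypothetical numbers: assume Ampere holds some raw RT lead, then DLSS adds a
# further uplift on top in titles that support both. The RT lead here is a
# placeholder assumption; the 1.5-1.6x DLSS range is the figure from the post.
amd_fps = 60.0                     # baseline RT frame rate for the AMD card
rt_lead = 1.20                     # assumed 20% raw RT advantage for Ampere
for dlss_uplift in (1.5, 1.6):
    nvidia_fps = amd_fps * rt_lead * dlss_uplift
    print(f"DLSS x{dlss_uplift}: {nvidia_fps:.0f} fps vs {amd_fps:.0f} fps "
          f"({nvidia_fps / amd_fps:.0%} of the AMD result)")
# With a 20% RT lead and a 1.5-1.6x DLSS boost, the combined gap is ~1.8-1.9x.
```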
 
It's generic.
And you think that means it's not constantly trained and expanded. I guarantee you that this system hasn't had a free day in the last 3 years.

Anyway, fine-tuning is per-game training.
With the system as it's supposed/marketed to work, it's probably more like feedback to Nvidia with a follow-up extension/adjustment of the sample base, if at all necessary.
Whether the implementation is really that simple will come out via dev interviews or something sooner or later.
 
Last edited:
Question for everyone:

Why do we think NVIDIA has been so aggressive on pricing?

They can hit any price point they want for any performance, within reason. They could've made the 2080 Ti much cheaper but decided to give you a huge die and expensive components.

Now they are going cheaper, perhaps to offset huge SSD costs and make the system affordable. Just a guess.
 

Ascend

Member
I'm dismissing the argument because it's been wrong two generations in a row. What's there to be said exactly? I'm being condescending because you didn't even bother addressing my argument and instead stated exactly why the belief that they'd be competitive based on price is stupid by bringing up garbage cards like the Fury X to prove a point.
I didn't bother addressing your argument? I made a cohesive and logical argument as to why it might be the case that nVidia is reacting to AMD being at least present at the high end. There are reasons as to why this time it's more than just being present, and I didn't get to that in that reply, but I will in this one.

In any case... Did you even HAVE an argument? This is what you said;

People need to stop with this "NVIDIA is aggressive with prices because they know AMD is cooking up something good".

That reasoning is stupid
Not an argument

people made the same one regarding the 980 Ti and 1080 Ti,
I never saw anyone make the argument back then that the prices of the 980 Ti and 1080 Ti were good/cheap because AMD was going to come out with something better. No one considered them cheap. In fact, the pricing of all these cards was in line with their previous generation. The 980 Ti and 1080 Ti had almost the same release price, and so did the 780 Ti. Those prices were expected and normal. So, again, not an argument, but simply a lie about the past.
Now, everyone considers the RTX 3000 series good/cheap, but that is only because of the inflated RTX 2000 prices, which coincided with when it was well known that AMD would not release any high-end cards at all. That you want to dismiss that fact is your problem.
Different situation, different perspective.

AMD ended up having no answer.
Previous events are not a guarantee of future results. But the fact that AMD's products were not up to par is only of partial significance to your 'argument'. At least AMD participated. And I'll say it again: the one moment that AMD did not have a product in the high end, nVidia boosted prices significantly. What explanation do you have for that...?

Additionally, RDNA has proven to be a big jump, considering a 40 CU GPU (5700 XT) performs almost the same as a 60 CU one (Radeon VII). There were no such jumps in the past, not to that degree. There was zero indication that GCN would make any significant advances after Polaris, which was known to be targeted at the mid range.
If you're going to use the past, at least use the whole picture, not only the part that suits your bias. Otherwise it is not really a good argument.
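Rough math behind that comparison (CU counts are the public specs; taking "performs almost the same" as equal performance is the post's premise, and clock-speed differences would eat into the per-CU figure):

```python
# Per-CU comparison implied by the post: 5700 XT (40 CU) ~ Radeon VII (60 CU).
cu_5700xt, cu_radeon_vii = 40, 60
perf_ratio = 1.0                      # premise: roughly equal performance
per_cu_gain = (perf_ratio * cu_radeon_vii) / cu_5700xt
print(f"~{per_cu_gain:.2f}x performance per CU for RDNA vs Vega "
      "(before accounting for clock-speed differences)")   # ~1.5x
```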

They might or might not have something great in the pipeline, NVIDIA being aggressive with prices isn’t indication of that at all.
It's not just the pricing.... I have a few questions for you.

Why are nVidia calling the RTX 3080 their flagship, despite the RTX 3090 existing?
Why is the RTX 3090 called the RTX 3090, and not the RTX Titan X Ampere? It is clearly a Titan class GPU.
Why is the RTX 3090 double the price of the RTX 3080, despite the expected uplift in performance being only around 20%? Even the 2080 Ti was not that expensive compared to the 2080... Not to mention they use the same chip...

The most important question:
Every 80-class GPU has been a 104 chip, while every 80 Ti/Titan-class GPU has been a 102. Suddenly, the 80 non-Ti class GPU is a 102 chip and we don't have a Titan. Why has nVidia shifted their 102 chip down one tier?

GTX 1070 / GTX 1070 Ti / GTX 1080: GP104
GTX 1080 Ti / Titan X Pascal: GP102

RTX 2070S / 2080 / 2080S: TU104
RTX 2080 Ti / Titan RTX: TU102

RTX 3070: GA104
RTX 3080 / RTX 3090: GA102
 
Last edited:
Won't bother debunking the nonsense of your post.

I think AMD can be competitive based solely on what RDNA2 has shown us thus far. None of the stupidity about NVIDIA being afraid and lowering their prices.

They will beat the 3070 without a doubt. Can they match the 3080? Perhaps but this is not a guarantee.
 
Last edited:

Ascend

Member
Won't bother debunking the nonsense of your post.
You use my supposedly not addressing your arguments as an excuse to be condescending (even though I actually did address them), and now you reply with something like this? Hypocrite. How low can you go?

I guess you're out of ammo. And too arrogant to admit it. Guess we're done here.
 
Last edited:
You use my supposedly not addressing your arguments as an excuse to be condescending (even though I actually did address them), and now you reply with something like this? Hypocrite. How low can you go?

I guess you're out of ammo. And too arrogant to admit it. Guess we're done here.
I won't bother addressing it because it wouldn't even be productive. I think NVIDIA lowering their prices has little to do with them being afraid of AMD based on precedents.

I think AMD can reasonably compete based on what we have. We both believe AMD can do well, so why bother going over the price part when it changes nothing about our conclusions?

That'd be arguing just to do it. Pointless.
 
Last edited:

Leonidas

Member
The most important question:
Every 80-class GPU has been a 104 chip, while every 80 Ti/Titan-class GPU has been a 102. Suddenly, the 80 non-Ti class GPU is a 102 chip and we don't have a Titan. Why has nVidia shifted their 102 chip down one tier?

The GTX 780 was based on the same big GPU as the first-generation Titan/780 Ti; they're simply going back to their roots.

They've done it to beat AMD. Same reason they are rumored to be coming out with a new 103 GPU this gen, which will also most likely beat RDNA2.
 
Last edited:

FireFly

Member
I think AMD can be competitive based solely on what RDNA2 has shown us thus far. None of the stupidity about NVIDIA being afraid and lowering their prices.
It's not a question of fear. It's a question of pricing products to compete against the presumed performance of their AMD counterparts. If you have a monopoly in a market, you increase pricing to maximise revenue. If you don't, then you have to price match.

I don't see what is super controversial about this.
 
Last edited:


Interesting showcase of Nvidia's RTXGI. From minute 30 onward he also goes a bit into DLSS and how it just works immediately, without fine-tuning (which is of course possible, too).
Pretty frigging amazing if it's not just PR talk.
 
Last edited:

RoboFu

One of the green rats
Nvidia knows their customers. They know how to market to those customers. It is pretty simple too!

The 3090 is all mindshare. Barely anyone will buy one compared to the number of PC gamers purchasing new GPUs. Hell, if you go by Steam charts barely anyone has a 20xx card, but that is not the point. The point is it's their show horse. It's something for their customers to get behind and use in online arguments.

For whatever reason AMD has been ignoring this. It doesn't matter what it costs as long as it's a monster. They are playing too close to sales numbers and statistics. They see most people buy in the lower-to-mid-end range, so that is where they focus, while completely ignoring that critical mindshare aspect.
 
It's not a question of fear. It's a question of pricing products to compete against the presumed performance of their AMD counterparts. If you have a monopoly in a market, you increase pricing to maximise revenue. If you don't, then you have to price match.

I don't see what is super controversial about this.
Because we've seen this exact scenario in the past.

Here you have an article supposing the reason for the then-unannounced 1080 Ti. It was apparently priced in a way that made buying the Titan X foolish, in anticipation of AMD having a monster in the works. Something which never happened.

If we go back to the first GTX Titan GPU, launched in February 2013, Nvidia used their GK110 chip (first seen in the Quadro K6000—sound familiar?), only with one SMX disabled. At the time, Nvidia had the fastest and second fastest GPUs for gaming, and they weren't in any hurry to reduce prices. It wasn't until October of 2013 that AMD finally had a viable competitor to the GTX 780 and Titan.

When AMD launched their first Hawaii GPU, the R9 290X, they laid claim to the title of the fastest gaming GPU, beating out the GTX Titan in most games. Nvidia's response came one month later with the GTX 780 Ti, which had half the memory of the GTX Titan (and 'slow' FP64 support) but included the fully enabled GK110 chip, along with higher clocks, all at a lower price. Nvidia closed the performance gap, and even if they didn't outright win, they at least had a viable claim to the throne.

Back to the present, we know that AMD is prepping Vega 10 for release—it might make it out in 2016, but 'early' 2017 is more probable. Whether they end up calling it the RX 490, RX Fury, or something else isn't important; Vega will come out, and it could be a performance monster. Best indications are it will have 16GB of HBM2 and 4,096 cores, with higher clocks and significantly better performance than Fury X.


Nvidia spoiled the launch of the Fury X by releasing the GTX 980 Ti. They had more memory, overall better performance (even if there are a few cases where Fury X beat the 980 Ti), and the cost of manufacturing GM200 is significantly lower than Fiji. Looking at GP102 and Vega, assuming the rumors are anywhere close to accurate, Nvidia is going to try to do the same again with the 1080 Ti.
 

Audiophile

Gold Member
They've knocked it out of the park (and I say that reluctantly; technical talent aside, I'm not a big fan of Nvidia), but the only caveat here seems to be VRAM. I expect an eventual Super/Ti variant of the 70/80 will fare better. Someone from Nvidia said it'll be fine because it's just enough for current-gen titles (some examples of which were years old), but moving forward I'm not sure.

Also, the gulf between the 3080 and 3090 seems wonky: more than twice the price for roughly 20% more performance, and yet 2.4x the VRAM.
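For reference, the ratios behind that complaint (prices and VRAM are the announced figures; the ~20% performance gap is this thread's speculation, not a benchmark):

```python
# Ratios behind the 3080 vs 3090 comparison. Prices/VRAM are the announced
# figures; the ~20% performance gap is the speculation from this thread.
price_3080, price_3090 = 699, 1499
vram_3080, vram_3090 = 10, 24          # GB, GDDR6X
perf_gain = 0.20                        # assumed 3090 performance uplift

print(f"Price ratio: {price_3090 / price_3080:.2f}x")          # ~2.14x
print(f"VRAM ratio:  {vram_3090 / vram_3080:.1f}x")            # 2.4x
print(f"$ per relative perf: 3080 = {price_3080 / 1.0:.0f}, "
      f"3090 = {price_3090 / (1.0 + perf_gain):.0f}")          # ~699 vs ~1249
```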
 

FireFly

Member
Because we've seen this exact scenario in the past.

Here you have an article supposing the reason for the then-unannounced 1080 Ti. It was apparently priced in a way that made buying the Titan X foolish, in anticipation of AMD having a monster in the works. Something which never happened.
Well, as I see it there are two claims:

1.) Nvidia price their products based on the expected performance of their AMD counterparts.
2.) Nvidia has an accurate track record in predicting AMD's performance.

The 1080 Ti seems to be a counter example to claim 2.), not claim 1.). For a counter example to claim 1.) I think we would need a situation where AMD was known in advance not to be releasing any competitive cards, and Nvidia still didn't alter the pricing from past generations.
 
Last edited:

Alexios

Cores, shaders and BIOS oh my!
No idea why people keep saying the pricing from Nvidia is aggressive, great, whatever; those are ridiculous prices if you ask me, and Super/Ti/whatever versions are still pending. All models are more expensive than their old counterparts, the jump in power doesn't matter, they'll be outpaced/obsolete/whatever with new models at roughly the same pace as always, but cost more, and some have clearly gimped capabilities like the amount of VRAM to make them less attractive compared to far more expensive models. They have no competition and it shows. Outside scalper pricing due to no stock (which may affect these models too), a 1080 could be had for a way better price than the 3080 MSRP. People who went for a 1070 or a 1080 will have to go down to a medium/low-end model this generation instead of sticking with medium/high if they want to invest a similar amount years later. Nothing about any of this is aggressive pricing unless you mean hostile towards the consumer :(

I dunno if AMD has anything to rival the top of the line from Nvidia, but hey, maybe they'll at least offer better pricing for the mid-tier performance level and in this way cause Nvidia to drop their prices too despite the hype. Maybe they'll be able to offer better stuff eventually now that they're doing well in the CPU space, invest more, etc.
 
Last edited:

supernova8

Banned
Because we've seen this exact scenario in the past.

Here you have an article supposing the reason for the then-unannounced 1080 Ti. It was apparently priced in a way that made buying the Titan X foolish, in anticipation of AMD having a monster in the works. Something which never happened.

Those are examples of where NVIDIA already had a superior card waiting and AMD was late to the party. This time around AMD is supposedly launching at roughly the same time as NVIDIA (or a month or so after), so if NVIDIA wants to pull off the same trick they would have to render some of their own just-released cards obsolete, no?

For example:
1080 - May 2016
1080 Ti - March 2017
(10 month gap)

1070 - June 2016
1070 Ti - November 2017
(1 year + gap)

With RTX 20 series they released the 2080 and 2080 Ti roughly the same time (September 2018).

Most recently we have:

2070 - October 2018
5700 XT - July 2019
2070 Super - July 2019
(9 month gap)

The 5700 XT was competitive with the 2070 (and also holds up against the 2070 Super in some titles), but it came out almost a whole year after the 2070 released. Makes you wonder what would've happened if the 5700 XT had been available back in October 2018.

The little trick that NVIDIA has pulled every time has been a situation where they could essentially sell their initial offering without competition at all for roughly 12 months before AMD comes out with something, and then they release the Super/Ti variants to crush AMD.

I don't see how this trick works if AMD launches their own product almost immediately after NVIDIA launches theirs AND is performance competitive straight away.
 
Last edited:

DaGwaphics

Member
All models are more expensive than their old counterparts

Aren't the launch prices of the 3070/80 identical to the launch prices of the 2070/80? FE were even a bit more, if I'm not mistaken. How are those more expensive? Or are you using current prices?
 

supernova8

Banned
Aren't the launch prices of the 3070/80 identical to the launch prices of the 2070/80? FE were even a bit more, if I'm not mistaken. How are those more expensive? Or are you using current prices?

Founders were $599 and $799 respectively for the 2070 and 2080, so they (the new cards) are definitely cheaper.

Kinda lame that the 3070 only has 8GB when the 2070 had the same 2 years ago. The 2080 Super was $699 at launch.
 
Last edited:

Alexios

Cores, shaders and BIOS oh my!
Aren't the launch prices of the 3070/80 identical to the launch prices of the 2070/80? FE were even a bit more, if I'm not mistaken. How are those more expensive? Or are you using current prices?
The 1080 MSRP was $599 (not counting Founders Edition crap; everyone waits for 3rd party anyway. We'll see if the 3080 third party will be $100 cheaper or not, but I've not heard anything of the sort, and I don't think they specified the announced prices are FE-only either).
 
Last edited:

DaGwaphics

Member
Founders were $599 and $799 respectively for the 2070 and 2080, so they (the new cards) are definitely cheaper.

Kinda lame that the 3070 only has 8GB when the 2070 had the same 2 years ago. The 2080 Super was $699 at launch.

They said 16GB variants would be available. I like the options; especially once DirectStorage gets up and running, you might not need as much VRAM as you did previously. 4K users could pick up the 16GB while 1440p users could stick with the 8GB regardless.

The 1080 MSRP was $599 (not counting Founders Edition crap; everyone waits for 3rd party anyway. We'll see if the 3080 third party will be $100 cheaper or not, but I've not heard anything of the sort).

Not much point in comparing GTX to RTX; RTX brings along bigger dies, so you have to expect a bit of a premium.
 
Last edited:

ZywyPL

Banned
Nvidia knows their customers. They know how to market to those customers. It is pretty simple too!

The 3090 is all mindshare. Barely anyone will buy one compared to the number of PC gamers purchasing new GPUs. Hell, if you go by Steam charts barely anyone has a 20xx card, but that is not the point. The point is it's their show horse. It's something for their customers to get behind and use in online arguments.

For whatever reason AMD has been ignoring this. It doesn't matter what it costs as long as it's a monster. They are playing too close to sales numbers and statistics. They see most people buy in the lower-to-mid-end range, so that is where they focus, while completely ignoring that critical mindshare aspect.

100% this. Not many Steam users have 20xx cards, but the vast majority does have an Nvidia card. That's how it works, because it doesn't matter how much you can spend, 200 or 2000, you'll always take your money to the one that's best. AMD, on the other hand, basically goes straight out saying their entire lineup is nothing but 1080p cards, with one 1440p card... Which is ironic - there are so many more FreeSync displays out there than G-Sync ones, and cheaper at the same time, while AMD is unable to provide GPUs that can actually utilize them. That's such a huge missed opportunity; they could have been THE go-to company when it comes to high-refresh gaming.
 

Rickyiez

Member
There's nothing going on. How can there be war games if AMD has shown nothing yet? Nvidia is clearly way ahead now; they even have the RTX 3080 running Doom Eternal at 4K with an FPS counter.

Until AMD announces something solid, I'm not too interested in Radeon yet. Don't get me wrong, I love AMD, I had a Ryzen 3700X, but Radeon has been disappointing up until now.
 

supernova8

Banned
100% this. Not many Steam users have 20xx cards, but the vast majority does have an Nvidia card. That's how it works, because it doesn't matter how much you can spend, 200 or 2000, you'll always take your money to the one that's best. AMD, on the other hand, basically goes straight out saying their entire lineup is nothing but 1080p cards, with one 1440p card... Which is ironic - there are so many more FreeSync displays out there than G-Sync ones, and cheaper at the same time, while AMD is unable to provide GPUs that can actually utilize them. That's such a huge missed opportunity; they could have been THE go-to company when it comes to high-refresh gaming.

Buying NVIDIA - maaaaan this gon be guuuuuuuud
Buying AMD - hope this fucking works

NVIDIA is generally the obvious choice unless you're strapped for cash and there's some budget Radeon available that'll get the job done. That's why I have my RX 560 paired with this 3900X. Just waiting to unleash this 12-core beast alongside a commensurate GPU.
 

Kenpachii

Member
Won't bother debunking the nonsense of your post.

I think AMD can be competitive based solely on what RDNA2 has shown us thus far. None of the stupidity about NVIDIA being afraid and lowering their prices.

They will beat the 3070 without a doubt. Can they match the 3080? Perhaps but this is not a guarantee.

The whole 3000 series is a reaction to AMD.
 

Ascend

Member
The GTX 780 was based on the same big GPU as the first-generation Titan/780 Ti; they're simply going back to their roots.

They've done it to beat AMD. Same reason they are rumored to be coming out with a new 103 GPU this gen, which will also most likely beat RDNA2.
Well there you go.
 
I guess it's weird to see the AMD superfans digging in their heels even harder now that Nvidia is showing the biggest advantage in technology they have ever had over AMD.

That being said, in the real world, the actual population of AMD superfans are few and far between it seems.


AMD is barely in the top 10, with the RX 580 in rank 9, and its next card is the RX 570, at rank 16. As for the RX 5700 XT, it doesn’t appear until way down the list, somehow below the 2080 Ti despite being about 1/3 of its price. The 5700 XT has about 0.88% of the market.
 
Last edited:

Ascend

Member
I guess it's weird to see the AMD superfans digging in their heels even harder now that Nvidia is showing the biggest advantage in technology they have ever had over AMD.
Not everything is a console/hardware war. What is really weird is the lack of critical thinking towards nVidia. But I guess that was always the case...

Leaving this here...
 

kiphalfton

Member
Nvidia's pricing looks competitive relative to last-gen Turing stuff. That's it. Nvidia overpriced Turing, and here we are thanking our lucky stars that the RTX 3070 is $499 when it should be sitting at $399. We've been fed crap long enough that we now jump at paltry scraps. Before Turing, the x70 was on par with the previous-gen x80 Ti (as when the GTX 970 and 1070 were released).
 
Last edited:
I find it hilarious that the 3090 is viewed as an overkill SKU in anything other than power draw.
The thing is unlikely to be much more than 20% faster than a 3080, but at more than twice the price.

Nvidia really know how to market a product. Damn.
 

LOLCats

Banned
Power usage is a thing for me. We shouldn't need 300W for video cards; it's stupid and lazy. I hope AMD does better with the Big Navi line, but if history tells us anything, they won't.
 

Bolivar687

Banned
I want to believe that AMD has gotten over the Polaris days, when they let power draw and the console business take them off track, assuming the RX 490/490X was taken away from PC consumers to go into the PS4 Pro and Xbox One X. And it does seem like RDNA1 did what it was intended to do: bring gaming performance back up from the compute-focused Vega and Radeon VII cards. I think a sober prediction would be that they'll have two RDNA2 cards, one that competes with the 3070 and one that comes within striking distance of, but likely does not beat, the 3080.
 

Ascend

Member
I find it hilarious that the 3090 is viewed as an overkill SKU in anything other than power draw.
The thing is unlikely to be much more than 20% faster than a 3080, but at more than twice the price.

Nvidia really know how to market a product. Damn.
I share your sentiment. The thing is, they know exactly what to do to make the masses gobble it up. The RTX 3080 and RTX 3090 are the exact same chip. So it's ludicrous that one should be twice the price of the other, but that's how nVidia operates.
 

ZywyPL

Banned
I share your sentiment. The thing is, they know exactly what to do to make the masses gobble it up. The RTX 3080 and RTX 3090 are the exact same chip. So it's ludicrous that one should be twice the price of the other, but that's how nVidia operates.

24GB of GDDR6X ain't cheap; that's a few hundred bucks alone.
 