
Nvidia RTX 30XX |OT|

Ulysses 31

Member
Big fps boosts in Metro Exodus, Doom Eternal, and Crysis 3 at 4K (at least in the starting areas) with the 3090 over the 2080 Ti. 👌

Using 4:4:4 is a bit wonky; exiting games can give funky desktop colors at 120 Hz on my Samsung. Using full RGB instead, which to my knowledge is essentially the same, gives no issues.
 
I've messed up big time. Owned a 2080 Ti. Sold it on eBay for buttons at the 3090 announcement. Had a pre-order for an EVGA 3090 XC3 Ultra, which got delayed until after the 9th of Oct. Found a store which had stock of the MSI 3090 and stated next-day delivery. Ordered the MSI but realised that I'd be going over my allowed overdraft. Panic-cancelled the EVGA pre-order to stop the overdraft and sat around all day waiting for the MSI to arrive. It didn't. Received a "your order has been cancelled" email on the MSI.

Just purchased a 1000 W PSU and an XL case to fit the 3090 in. What a waste. Back to square one.

Thinking about just getting SLI 2080 Tis off eBay while they're going cheap.

Same happened to me, mate: cancelled a card from OCUK for a 3090 on Very, only to get the same message. I must have asked for refunds four times in the last two weeks. I had the same shit with PC World the week before.

Now I've gone from a 2080 Ti to an Intel 630, lol.
 

benno

Member
Has anybody used the EVGA Step-Up program?
Yes. I moved from a 980 Ti to a 1080 when they first came out; it took about 3 months to go through it all. Reading the EVGA forums, they're saying there's a very long wait now, 6 months or so. Also, my upgrade was free back then, but it seems they've upped the price of the 30x0 cards when you step up, so you have to pay; IIRC the 3080 is about $800 and the 3090 is $2,000.


Same happened to me, mate: cancelled a card from OCUK for a 3090 on Very, only to get the same message. I must have asked for refunds four times in the last two weeks. I had the same shit with PC World the week before.

Now I've gone from a 2080 Ti to an Intel 630, lol.

Got an email from Scan with a discount code if I want to return; going to get hold of their chat tomorrow to ask whether, if I return it, I keep my place in the queue.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I've messed up big time. Owned a 2080 Ti. Sold it on eBay for buttons at the 3090 announcement. Had a pre-order for an EVGA 3090 XC3 Ultra, which got delayed until after the 9th of Oct. Found a store which had stock of the MSI 3090 and stated next-day delivery. Ordered the MSI but realised that I'd be going over my allowed overdraft. Panic-cancelled the EVGA pre-order to stop the overdraft and sat around all day waiting for the MSI to arrive. It didn't. Received a "your order has been cancelled" email on the MSI.

Just purchased a 1000 W PSU and an XL case to fit the 3090 in. What a waste. Back to square one.

Thinking about just getting SLI 2080 Tis off eBay while they're going cheap.
Or you can just be patient and wait it out.
 

llien

Member
Despite ignoring a key aspect, pricing, an interesting summary of generational perf bumps:



[Chart: generational performance bumps]

Perf/$ of the top-of-the-line card:

[Chart: perf/$ of flagship cards]

Perf/power:

[Chart: perf/power]

Value:

[Chart: value]

By AdoredTV.

Note that this is the 3090; the figures would be much better (on the value side :)) for the 3080.
 

magaman

Banned
Big fps boosts in Metro Exodus, Doom Eternal, and Crysis 3 at 4K (at least in the starting areas) with the 3090 over the 2080 Ti. 👌

Using 4:4:4 is a bit wonky; exiting games can give funky desktop colors at 120 Hz on my Samsung. Using full RGB instead, which to my knowledge is essentially the same, gives no issues.

What a waste of money. Get more FPS in old games. Congrats on the purchase?
 

rofif

Banned
What a waste of money. Get more FPS in old games. Congrats on the purchase?
Dude, wtf. This kind of salt can be applied to anything:
- new car: oh great, it goes as fast as the last one
- new speakers: oh great, they play your old music better
- new console: oh great, I can play my old backwards-compatible games
Just... c'mon!!!

Ulysses 31, my 3080 is coming this Tuesday! It's replacing an RTX 2070, so that's over 100% faster at 4K :O
 

pawel86ck

Banned
No salt here, as I think it's hilarious. I have zero interest in a 3090, as it's practically useless right now aside from measuring dicks.

A fool and his money :)
B4BPCgamer has concluded his 3090 is not good value for what you pay; however, he also said that when you play games and see the massive difference compared to a 2080 Ti, you really don't think about it. Overall he is very happy with his 3090 purchase. The 3090 can deliver 60 fps at 4K where the 2080 Ti can't, so you can't say it's useless. All you can say is that it's not worth the investment; however, I bet people who buy such a GPU aren't poor, so they don't even think about the price/performance ratio.
 
Yes. I moved from a 980 Ti to a 1080 when they first came out; it took about 3 months to go through it all. Reading the EVGA forums, they're saying there's a very long wait now, 6 months or so. Also, my upgrade was free back then, but it seems they've upped the price of the 30x0 cards when you step up, so you have to pay; IIRC the 3080 is about $800 and the 3090 is $2,000.

Got an email from Scan with a discount code if I want to return; going to get hold of their chat tomorrow to ask whether, if I return it, I keep my place in the queue.
Oof, not sure I'm a fan of a 6-month wait. I'm sure I can probably find a 3070 organically by then.

I was originally going to go with the 3080, but idk if I can stomach spending over $700 on a single computer component...
 

Kenpachii

Member
No salt here, as I think it's hilarious. I have zero interest in a 3090, as it's practically useless right now aside from measuring dicks.

A fool and his money :)

Everything is useless until it is not. The 3090 will age like fine wine for the entire next gen. The 3080, not so much.
 

Spukc

always chasing the next thrill
If the 3090 is only 10% more powerful than the 3080, they are gonna age the same; more RAM can only take you so far in terms of performance.

If Cyberpunk is enough to bring a 3080 to its knees without DLSS, it's gonna be the same for a 3090.
Yet the game launches on the Xbox One 🤣
 

rofif

Banned
If the 3090 is only 10% more powerful than the 3080, they are gonna age the same; more RAM can only take you so far in terms of performance.

If Cyberpunk is enough to bring a 3080 to its knees without DLSS, it's gonna be the same for a 3090.
Sure, but we all know the 3090 is for prosumers or people who don't care about money. The 3080 is the real card here, and for the money it seems to be excellent.
 

Kenpachii

Member
If the 3090 is only 10% more powerful than the 3080, they are gonna age the same; more RAM can only take you so far in terms of performance.

If Cyberpunk is enough to bring a 3080 to its knees without DLSS, it's gonna be the same for a 3090.

Raw performance isn't the main selling point of the card; it's the VRAM, and that's why people are buying it. The same as people bought Titans for the VRAM, to mod Skyrim with; the extra performance was just a bonus.

You're all looking through current-generation goggles. Next generation will see a massive jump in VRAM requirements for multiple reasons, and if you've experienced generation jumps before, you'll know this.

Just look at the specs before and after the last gen jump (a rough sketch of the multipliers follows below).

Minimum requirements for Metro 2033 (the biggest graphical powerhouse game on PC before the PS4 launched):

- 256 MB VRAM minimum, 1 GB RAM minimum
- 512 MB VRAM recommended, 2 GB RAM recommended

Next-generation game (PS4 / Xbox One), AC Unity:

- 2 GB VRAM minimum, 6 GB RAM minimum
- 3 GB VRAM recommended, 8 GB RAM recommended

PS5 / Xbox Series X:

We know they will use more than 3x the current VRAM, because the Xbox Series X spends 10 GB of its memory on VRAM.

Conclusion:

The 3090 is a next-generation card that will age like fine wine simply because of its VRAM, much like how the original Titan aged well.
The 3080 is a supercharged current-gen GPU and reminds me of the GTX 580: faster than the PS4's GPU, but ultimately useless because it couldn't play the games with its 1.5 GB VRAM pool.

Nvidia will drop the 3080 10 GB models like a brick the moment they've got the next new shiny card to sell you. I bet a lot of 1080 Ti / 2080 Ti users won't even consider it an upgrade simply because of this.
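Putting rough numbers on those jumps (a back-of-the-envelope sketch; the figures are the quoted spec-sheet numbers above, not measurements):

```python
# Generational VRAM multipliers from the recommended specs quoted above.
recommended_vram_mb = [
    ("Metro 2033 (pre-PS4 era)", 512),        # 512 MB recommended
    ("AC Unity (PS4 / Xbox One era)", 3072),  # 3 GB recommended
    ("Series X GPU-optimal pool", 10240),     # 10 GB of 16 GB is GPU-optimal
]

for (prev_name, prev_mb), (name, mb) in zip(recommended_vram_mb, recommended_vram_mb[1:]):
    print(f"{prev_name} -> {name}: {mb / prev_mb:.1f}x")
# Metro 2033 -> AC Unity: 6.0x; AC Unity -> Series X pool: ~3.3x,
# which is where the "more than 3x" claim comes from.
```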
 

GymWolf

Gold Member
Raw performance isn't the main selling point of the card; it's the VRAM, and that's why people are buying it. The same as people bought Titans for the VRAM amounts, not so much for the performance increase, when they used them for gaming.

You're all looking through current-generation goggles. Next generation will see a massive jump in VRAM requirements for multiple reasons, and if you've experienced generation jumps before, you'll know this.

Just look at the specs before and after the last gen jump.

Minimum requirements for Metro 2033 (the biggest graphical powerhouse game on PC before the PS4 launched):

- 256 MB VRAM minimum, 1 GB RAM minimum
- 512 MB VRAM recommended, 2 GB RAM recommended

Next-generation game (PS4 / Xbox One), AC Unity:

- 2 GB VRAM minimum, 6 GB RAM minimum
- 3 GB VRAM recommended, 8 GB RAM recommended

PS5 / Xbox Series X:

We know they will use more than 3x the current VRAM, as they have room for it, if not more, simply because PC will have higher settings, and ray tracing is also a thing now; all of it consumes even more memory.

Conclusion:

The 3090 is a next-generation card that will age like fine wine simply because of its VRAM, much like how the original Titan aged well.
The 3080 is a supercharged current-gen GPU and reminds me of the GTX 580.
So a 3080 with 20 GB of RAM is next-gen too??

I'm sorry, but I'm not as optimistic as you. You can have all the RAM you want; stuff like RTX and many other things in games still need a shitload of power, and 10% more is not next-gen in the slightest compared to a 3080 (semantics, I guess).

Of course, I'm talking about using a 3090 strictly for gaming; I don't care about the other stuff.

If a current-gen game (Cyberpunk) can bring a 3090 to its knees without DLSS, I don't care how much RAM it has.

But I'm an ignorant noob, even though I started playing on PC when the game pod was released :ROFLMAO:, so who knows; I'm always pessimistic about these things.
 

Kenpachii

Member
Nvidia is in trouble because of their inferior 8 nm Samsung node.



The video is pretty awful, to be honest.

Nvidia could have made two designs and wanted to go with 7 nm until 8 nm became more interesting, or was simply the only option, and shifted over. It's not uncommon for this to happen. The only thing that matters is that Nvidia got the cards out and, as of now, keeps the performance crown against AMD, which will come out with faster-than-2080 Ti cards.

So where's the trouble here?

Then there's his Titan and power consumption argument. Nvidia, much like any other company, promotes their products the way they want. The 250 W limit isn't even held to by the Titan lineup itself, and why he thinks 250 W is some magical barrier, which it clearly isn't, doesn't help his argument much either.

Then his Titan vs. 290: people bought a Titan for multiple reasons: (1) more VRAM, and (2) an Nvidia GPU with all its features and support. It was indeed also cooler and less power-hungry, but it was not just about power and energy efficiency, which is what he tries to validate his Titan Ampere idea with.

At least he tried, I guess.

Quit watching after 24 minutes though; the logic doesn't hold up. Maybe after that he makes an interesting point, who knows; I won't see it.
 

SantaC

Member
The video is pretty awful, to be honest.

Nvidia could have made two designs and wanted to go with 7 nm until 8 nm became more interesting, or was simply the only option, and shifted over. It's not uncommon for this to happen. The only thing that matters is that Nvidia got the cards out and, as of now, keeps the performance crown against AMD, which will come out with faster-than-2080 Ti cards.

So where's the trouble here?

Then there's his Titan and power consumption argument. Nvidia, much like any other company, promotes their products the way they want. The 250 W limit isn't even held to by the Titan lineup itself, and why he thinks 250 W is some magical barrier, which it clearly isn't, doesn't help his argument much either.

Then his Titan vs. 290: people bought a Titan for multiple reasons: (1) more VRAM, and (2) an Nvidia GPU with all its features and support. It was indeed also cooler and less power-hungry, but it was not just about power and energy efficiency, which is what he tries to validate his Titan Ampere idea with.

At least he tried, I guess.

Quit watching after 24 minutes though; the logic doesn't hold up. Maybe after that he makes an interesting point, who knows; I won't see it.
TSMC is waaaaay superior to Samsung at this point.

AMD will shock the world on October 28th.
 

Rikkori

Member
No it's not, and it shows.

By this logic, a Tesla is a driving videogame console.
Wrong. The 3090 does even worse than the previous Titan RTX in professional workloads, and there's also a Titan Ampere on its way. The 3090 is a gaming card. It's basically Turing all over again, with Nvidia jacking up prices even further, except now they've disguised it as an x90 instead of an x80 Ti and given it more memory.
 

nemiroff

Gold Member
VRAM again... Memory-expanded cards for "neurotic" gamers might end up being the GPU market's biggest cash cow going forward, and perhaps for no good reason other than people being hit with perpetual, non-scientific FUD on internet forums. But wait, I'm not saying it will or won't matter; I just urge people to back up their notions with demonstrable facts before going bombastic on this topic.

The little I know is that much of this is heavily contextual, where even trends in next-gen console development are a factor (and speaking of context: as one case, remember that MSFS, often touted as a next-gen game, seamlessly streams 2 petabytes of textures/data through a 50 Mbit internet line...). Up to a certain point, bandwidth for PC GPUs is often a more important factor than VRAM size, but you don't see that much discussion around it. Even engineers in the field say it's hard to give a definitive general answer here, which I guess is why we see so much ambiguity.
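For a sense of scale on the bandwidth point: peak memory bandwidth is just the per-pin data rate times the bus width. A quick sketch using the published spec-sheet numbers for these cards:

```python
# Peak theoretical memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8.
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

for name, rate, width in [
    ("RTX 2080 Ti (GDDR6)", 14.0, 352),
    ("RTX 3080 (GDDR6X)", 19.0, 320),
    ("RTX 3090 (GDDR6X)", 19.5, 384),
]:
    print(f"{name}: {bandwidth_gb_s(rate, width):.0f} GB/s")
# ~616, ~760, and ~936 GB/s respectively: the 3090's edge is as much
# bandwidth as capacity.
```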
 

pawel86ck

Banned
Is there any reason they didn't include the 680 in those comparison graphs?
It goes directly from the 580 to the 780.
He said the GTX 680 was only midrange Kepler, although Nvidia marketed it as a high-end GPU (x80 was considered high end back then, and now we have Ti and Titan cards). In reality, the 780 Ti was the real high-end Kepler.
 

pawel86ck

Banned
So a 3080 with 20 GB of RAM is next-gen too??

I'm sorry, but I'm not as optimistic as you. You can have all the RAM you want; stuff like RTX and many other things in games still need a shitload of power, and 10% more is not next-gen in the slightest compared to a 3080 (semantics, I guess).

Of course, I'm talking about using a 3090 strictly for gaming; I don't care about the other stuff.

If a current-gen game (Cyberpunk) can bring a 3090 to its knees without DLSS, I don't care how much RAM it has.

But I'm an ignorant noob, even though I started playing on PC when the game pod was released :ROFLMAO:, so who knows; I'm always pessimistic about these things.
I'm sure we will see texture packs for next-gen ports on PC, and with 24 GB, 3090 owners will be able to run games with maxed-out textures at pretty much the same performance.
 

cucuchu

Member
Decided to keep my FE 3080 and not trade for a 3090 (+ paying the difference). At 1440p it's really not worth the extra money, and by the time I get a solid 4K monitor, the 4080 will be on the horizon. The 3090 is still a great card, but I think it's really only worth it (for gaming) if you plan on skipping the 40XX series, or of course if money is a non-issue for you. For the extra $800 I would be spending on a 3090 for a minimal upgrade over the 3080, I can buy a 4080 in 1 to 2 years and get a probably substantial upgrade.
 

magaman

Banned
Dude, wtf. This kind of salt can be applied to anything:
- new car: oh great, it goes as fast as the last one
- new speakers: oh great, they play your old music better
- new console: oh great, I can play my old backwards-compatible games
Just... c'mon!!!

Ulysses 31, my 3080 is coming this Tuesday! It's replacing an RTX 2070, so that's over 100% faster at 4K :O

No, it really can't. The 3090 is overpriced, objectively speaking, on a price/performance basis when compared to the 3080 or 3070. It's just needless dick-wagging, just like the Titan was.

B4BPCgamer has concluded his 3090 is not good value for what you pay; however, he also said that when you play games and see the massive difference compared to a 2080 Ti, you really don't think about it. Overall he is very happy with his 3090 purchase. The 3090 can deliver 60 fps at 4K where the 2080 Ti can't, so you can't say it's useless. All you can say is that it's not worth the investment; however, I bet people who buy such a GPU aren't poor, so they don't even think about the price/performance ratio.

As I said, a fool and his money.

Everything is useless until it is not. The 3090 will age like fine wine for the entire next gen. The 3080, not so much.

Give it literally 3 months and there will be a 3080 with more VRAM, which will render the 3090 all but redundant aside from benchmark boys.
 

llien

Member
So where's the trouble here?
AMD is poised to roll out smaller, more efficient chips with better RAM configs.
At least one of those will be in the vicinity of the 3080, which also puts it close to the 3090.
The money saved on chip manufacturing is lost (with interest) on cooling and power consumption woes.

Nvidia's hardware corps have failed.

Nvidia's FUD corps: DLSS TAA upscaling (not mentioning the original resolution from which you are upscaling is genius) and RT hype still work, but we are talking hardware here.
 

Nydus

Member
What people often seem to forget with a 20 GB 3080 is that it will be faaaaar away from 699€. It seems the driving force behind the high price of the 3090 is the humongous amount of VRAM. Yeah, 20 GB would be awesome for future-proofing. But for now, 10 GB for 699€ is enough, and by the time it isn't, people can upgrade to a 4080. I don't see myself spending 899€ or more for the same performance and just more VRAM that I'd maybe use at 4K.
 

CuNi

Member
Give it literally 3 months and there will be a 3080 with more VRAM, which will render the 3090 all but redundant aside from benchmark boys.

It already is obsolete.
By the time games come out that will actually hit performance if you don't have more than 10 GB VRAM, the next GPU gen will already be out, with more VRAM from the start.
If you push games to the limit on max and shell out for an XX80 or XX90 card, you're one of the people who upgrades every GPU gen or every second one anyway, so worrying about what games might bottleneck you in 3 years is really moot.
 
Amazon says it'll be here by the 7th, but it has yet to ship. I suppose if it's coming from an in-state warehouse it'd only need a couple of days, but I'm still antsy for that confirmation.
 
Is there any reason they didn't include the 680 in those comparison graphs?
It goes directly from the 580 to the 780.


Of course there's a reason =)) Other graphs don't suit the agenda he's trying to push: mismatched comparisons, cherry-picked results, going from site to site and then switching to synthetic benchmarks, so he can push a misleading result. Comparing the GeForce 285 to the 580, the 580 to the 780, the 2080 Ti to the 3090????

He's sorting results from all over the place to make stuff up, to make it look worse than it is. He doesn't specify which resolution; he jumps generations. Some people apparently believe they "got it", that they understand what's going on.

What's going on is we have a new card with one of the biggest performance jumps in history. Why would someone give the slightest fuck about the horseshit that guy spews there, with twisted "value" comparisons? Bottom line is, for almost half the price of the 2080 Ti, you get an 80-90% boost over the base 2080 and 30-50% over the nearly-double-the-price 2080 Ti. You don't need cherry-picked graphs to get that.
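As a rough sanity check on that bottom line, here's perf-per-dollar at launch MSRPs, using the post's own uplift estimates (the performance ratios are the ballpark figures above, not benchmark data):

```python
# Perf per dollar at launch MSRP; performance is relative to an RTX 2080 = 1.0,
# using the rough uplift estimates from the post above.
cards = {
    "RTX 2080":    {"price_usd": 699,  "relative_perf": 1.0},
    "RTX 2080 Ti": {"price_usd": 1199, "relative_perf": 1.35},  # ~30-50% below a 3080
    "RTX 3080":    {"price_usd": 699,  "relative_perf": 1.85},  # ~80-90% over a 2080
}

for name, c in cards.items():
    perf_per_dollar = c["relative_perf"] / c["price_usd"]
    print(f"{name}: {perf_per_dollar * 1000:.2f} perf per $1000")
# The 3080 lands at roughly 2.3x the perf/$ of the 2080 Ti under these numbers.
```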
 

Siri

Banned
Any thoughts on running a 3090 on a 700 W PSU?

Present setup: 2080 Ti, 9900K, LG C9.

Only a small 3090 will fit in my mini-ITX case. The Zotac card should fit. The placement of the power connectors is important, since I'll run out of room at the end; they need to be on top and closer to the middle.

I think a 700 W PSU will probably work if I don't overclock anything.

EDIT: now that I think about it... if I'm not going to overclock the 3090 because of PSU constraints, maybe it would be better to buy a 3080 and overclock that.

At 4K, at max settings, the difference between a 2080 Ti and a 3090 is decent. Not so much with the 3080. That extra 15-20% is actually significant when the frames are below 60 on the 2080 Ti.
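For what it's worth, a back-of-the-envelope power budget (the 350 W figure is the 3090's official board power; the CPU and rest-of-system numbers are ballpark assumptions):

```python
# Rough PSU headroom check for the setup above (all numbers approximate).
psu_watts = 700

load_watts = {
    "RTX 3090, stock board power": 350,  # official spec; transient spikes go higher
    "i9-9900K, gaming load": 150,        # assumption; heavy all-core loads exceed 200 W
    "board / RAM / SSD / fans": 75,      # ballpark assumption
}

total = sum(load_watts.values())
print(f"Estimated load: {total} W of {psu_watts} W ({total / psu_watts:.0%})")
# ~575 W (~82%): workable at stock, but little headroom for overclocking
# or transient spikes, which matches the "don't overclock" conclusion.
```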
 

BluRayHiDef

Banned
It already is obsolete.
By the time games come out that will actually hit performance if you don't have more than 10 GB VRAM, the next GPU gen will already be out, with more VRAM from the start.
If you push games to the limit on max and shell out for an XX80 or XX90 card, you're one of the people who upgrades every GPU gen or every second one anyway, so worrying about what games might bottleneck you in 3 years is really moot.

The console versions of games that are also made for PC typically correspond to their PC counterparts with graphics set to "Medium." Hence, because 10 GB of the Xbox Series X's memory is intended to be used exclusively by its GPU (it's called GPU Optimal Memory), we can expect 10 GB of VRAM to be necessary to run games at merely medium settings at 4K on PC, as soon as next year, when Cyberpunk 2077 is enhanced for the Xbox Series X and PlayStation 5.

Developers have been restricted to 5.5 GB on the PlayStation 4 Pro and 9 GB on the Xbox One X for both CPU and GPU functionality. Hence, we know that less than each amount was available for each system's GPU (less than 5.5 GB for the PlayStation 4 Pro's GPU and less than 9 GB for the Xbox One X's GPU).

Considering this, we can be sure that developers will take full advantage of the 10 GB of VRAM designated for the Xbox Series X's GPU and will allocate an equal amount of VRAM on the PlayStation 5 to ensure parity. The Xbox Series X's and PlayStation 5's more expensive graphical features, such as native 4K resolution, ray tracing, and higher-quality character and object models, will necessitate that much VRAM on each machine's GPU.

Hence, we can conclude that 10 GB will be necessary to run the PC versions of games also developed for the PlayStation 5 and Xbox Series X at merely "Medium" settings at 4K. This will make the RTX 3080 inadequate for 4K gaming in the foreseeable future and will reveal the RTX 3090 as a viable option for 4K gaming. Also, when the 20 GB models of the RTX 3080 are released, the relative cost of the RTX 3090 will be more reasonable, since it doubles as both a gaming card and a workstation card, offering a bit more RAM and functionality.
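The memory budgets behind that argument, laid out as a quick sketch (the pool sizes are public spec-sheet figures; how much of each pool games actually use is the open question):

```python
# Console memory pools cited in the argument above (GB).
pools = {
    "PS4 Pro, dev-available (CPU+GPU)": 5.5,
    "Xbox One X, dev-available (CPU+GPU)": 9.0,
    "Xbox Series X, 'GPU optimal' (of 16 GB)": 10.0,  # the 10 GB @ 560 GB/s pool
}

for platform, gb in pools.items():
    print(f"{platform}: {gb} GB")

# If multiplatform titles budget textures and buffers around the Series X's
# 10 GB GPU-optimal pool, a 10 GB PC card has no headroom above console-level
# settings at 4K -- which is the crux of the post's claim.
```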
 

CuNi

Member
Hence, we can conclude that 10 GB will be necessary to run the PC versions of games also developed for the PlayStation 5 and Xbox Series X at merely "Medium" settings at 4K. This will make the RTX 3080 inadequate for 4K gaming in the foreseeable future and will reveal the RTX 3090 as a viable option for 4K gaming. Also, when the 20 GB models of the RTX 3080 are released, the relative cost of the RTX 3090 will be more reasonable, since it doubles as both a gaming card and a workstation card, offering a bit more RAM and functionality.

I would object to this conclusion.
The majority of PC gamers are not on XX90- or XX80-series GPUs but on XX60. The 3070 will only have 8 GB VRAM, and the XX60 will probably have either the same or 6 GB, so games on PC will still be optimized for well below 10 GB VRAM.
Also, again, just because an application allocates 10 GB of VRAM does not mean it needs that amount to run efficiently and smoothly. PC GPUs are still more powerful than their console counterparts, especially when it comes to ray tracing.

Also, new console generations spin up slowly, so it will take a good while before you see third-party games that take advantage of all the high-end tech in the consoles. Only first-party games will do this, and Sony is not really aiming to port those games to PC right at release. Microsoft, EA, Ubisoft, etc. will all develop games that look the same on PC and consoles, and on PC a 3070 will be sufficient for that.
 

BluRayHiDef

Banned
I would object to this conclusion.
The majority of PC gamers are not on XX90- or XX80-series GPUs but on XX60. The 3070 will only have 8 GB VRAM, and the XX60 will probably have either the same or 6 GB, so games on PC will still be optimized for well below 10 GB VRAM.
Also, again, just because an application allocates 10 GB of VRAM does not mean it needs that amount to run efficiently and smoothly. PC GPUs are still more powerful than their console counterparts, especially when it comes to ray tracing.

Also, new console generations spin up slowly, so it will take a good while before you see third-party games that take advantage of all the high-end tech in the consoles. Only first-party games will do this, and Sony is not really aiming to port those games to PC right at release. Microsoft, EA, Ubisoft, etc. will all develop games that look the same on PC and consoles, and on PC a 3070 will be sufficient for that.

I'm not talking about all PC gamers or all PC game requirements; I'm talking strictly about PC gamers who game at 4K. For 1080p and 1440p, xx60, xx70, and xx80 cards will be adequate, but for 4K gaming they will not be.
 