
Nvidia RTX 30XX |OT|

Rikkori

Member
Ampere cards use GDDR6X so that might make some kind of impact. I'm not an expert on the matter but I'm reading that it provides a power advantage over standard GDDR6. How much and to what practical effect I'm not certain, but something to keep in mind.
Oh, it's going to be fine for this game for sure (with DLSS on especially), but I wanted to note the trend because some people are still in denial about vram requirements going up.
 

Malakhov

Banned
Ampere cards use GDDR6X so that might make some kind of impact. I'm not an expert on the matter but I'm reading that it provides a power advantage over standard GDDR6. How much and to what practical effect I'm not certain, but something to keep in mind.
GDDR6X has double the bandwidth of GDDR6, but so far it hasn't been used that way. Mostly it's used to hit the same speeds with less power and less heat, and it's also easier to overclock. It could reach double the bandwidth someday.
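For anyone who wants to sanity-check those bandwidth figures, the arithmetic is simple. A minimal sketch in Python, using the publicly quoted per-pin rates and bus widths (treat the exact numbers as approximate):

```python
# Memory bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8
def mem_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

# 3080-class card: 19 Gbps GDDR6X on a 320-bit bus
print(mem_bandwidth_gbs(19, 320))  # 760.0 GB/s
# 2080-class card: 14 Gbps GDDR6 on a 256-bit bus
print(mem_bandwidth_gbs(14, 256))  # 448.0 GB/s
```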
 

cucuchu

Member
Oh, it's going to be fine for this game for sure (with DLSS on especially), but I wanted to note the trend because some people are still in denial about vram requirements going up.

Yeah, it's most likely not gonna be enough to max out everything at Ultra at 4K going into late 2021. 1440p is my resolution of choice for PC gaming though, so I'm not too worried about it. Even though I'm gonna try my best to get a 3080 at launch, part of me won't be that disappointed if I'm forced to see what AMD has to show and whether Nvidia responds with a 20GB 3080.
 

Rikkori

Member
Yeah its most likely not gonna be enough to max out everything at Ultra at 4k going into late 2021. 1440p is my resolution of choice for PC gaming though so I'm not too worried about it. Even though I'm gonna try my best to get a 3080 at launch, part of me won't be that disappointed that I'm forced to see what AMD has to show and if Nvidia responds with a 20gb 3080.
It's a deal with the devil, but I'm doing the same. I want a new card ASAP and will just deal with re-selling two years down the line for an upgrade, rather than waiting around for it all to settle. I'll do a more long-term upgrade then, to a proper flagship (4090 or whatever), in conjunction with probably an 8K miniLED TV. I'm still on 4K 60 / 1440p 120 until then, so the 3080 will manage it more easily.
 

YodaBB

Member
I think I'm going to wait for either a 12GB+ 3080 Ti or maybe an AMD 6000 series card. Best of luck to everyone hoping to snag a card this week! :goog_horns:
 

Rygeist

Banned
Super disappointed with some of the leaked benchmarks for the 3080. It doesn't seem to OC well at all. Think I'm just gonna wait for the 3080 Ti.
 

Red-Town

Member
From reddit: [image]
 

CrustyBritches

Gold Member
Fortnite is getting RT, DLSS, and Reflex support on Sept. 17th. Gonna be the new battleground for Ampere vs Big Navi. We'll see how AMD does with RT. Either way this game is gonna fly with DLSS enabled.

Watch Dogs Legion requirements...


"Additionally, the game will support ray-traced reflections and NVIDIA’s DLSS Ultra Performance mode, built to accelerate 8K gaming."

This is the game they'll be bundling with some Ampere cards. Looks like 2080ti can do 4K/Ultra with RT. Ampere cards will put the beatdown on this game.
 

bohrdom

Banned
C'mon guys 10 GB is enough, why do you want more? Didn't you hear what Nvidia said and 'le allocation izuh notuh utilisation'? :p

[Watch Dogs: Legion PC requirements chart]




They are right. Allocation isn't utilization. Engine devs will optimize their engines if they need to. I wouldn't be worried about the 10 GB VRAM requirement, since they developed this card for games to run at 4K @ 60 fps. Consoles are most likely not gonna use more than 10 GB of VRAM anyway.
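If anyone wants to see it for themselves, you can watch VRAM residency while a game runs. A minimal sketch, assuming the pynvml package (NVIDIA's NVML bindings) is installed; note this reports what's resident on the card, which is still closer to "allocated" than "actively used", but it's enough to see whether a game really bumps into the 10 GB ceiling:

```python
import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    # Poll VRAM usage once a second while the game is running.
    for _ in range(10):
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        print(f"VRAM resident: {used_gb:.1f} / {total_gb:.1f} GB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```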
 

kraspkibble

Permabanned.
8GB VRAM for 1440p and 11GB for 4K... ouch, lol.

RTX 3090 it is. If this is what Watch Dogs needs, then I dread to see what Cyberpunk 2077 will need.
 

Ironbunny

Member
8GB VRAM for 1440p and 11GB for 4K...ouch lol.

RTX 3090 it is. if this is what watch Dogs needs then i dread to see what Cyberpunk 2077 will need.

Yeah, that 10GB on the 3080 really is a bummer. :/ Might still have to go for the 3080 and switch to a Super or Ti later myself. The 3090 price makes my eyes water. Interesting to see if the rumours about the FE having better overclocking ability than 3rd-party cards are true. Between the 3080 and the 3090 you are paying for the memory, which imo might very well be worth it if you play at 4K or above.
 
8GB VRAM for 1440p and 11GB for 4K...ouch lol.

RTX 3090 it is. if this is what Watch Dogs needs then i dread to see what Cyberpunk 2077 will need.
Watch Dogs won’t use the full 11gb of VRAM.

The more VRAM the GPU has available, the more the game will use up.

It will run just fine with 10gb. 11gb is recommended only because that’s what the 2080ti has, which at the time of those recommendations, is still the most powerful card on the market.
 
Watch Dogs won’t use the full 11gb of VRAM.

The more VRAM the GPU has available, the more the game will use up.

It will run just fine with 10gb. 11gb is recommended only because that’s what the 2080ti has, which at the time of those recommendations, is still the most powerful card on the market.
Pretty obvious they just list the amount of VRAM the recommended card for those settings has.

WD Legion is not going to use 16 GB of system RAM either.
 

McHuj

Member
The Asus Strix 3080 is really $849? Shit. I may have to just go with the FE. I'm not really interested in OC'ing, but I want the best cooling and the quietest card.
 

Rikkori

Member
They are right. Allocation isn't utilization. Engine devs will optimize their engines if they need to. I wouldn't be worried about the 10 GB VRAM requirement since they developed this card for games to run at 4K @ 60 fps. Consoles are most likely not gonna use more than 10 gb of VRAM anyways.
Engine devs are gonna "optimize" their engines for a particular card? lol. Fucking Ubisoft doesn't even care to optimise games for the platform as a whole, simply pushing out console builds to PCs (à la Assassin's Creed), and you think they care if you have to turn textures down because your card has "only" 10 GB? :messenger_tears_of_joy:

Damn, I wish I had your optimism.
 

Sentenza

Member
Some people seem to be unaware of the fact that just because a game MAY fill as much VRAM as is available, it doesn't mean it NEEDS that much to work properly.
Same goes with system RAM, by the way.
 

BluRayHiDef

Banned
Gddr6x has double the bandwidth of gddr6 but so far they haven't been used that way. Mostly used to get the same speed but using less power for it, less heat etc.. also easier to overclock. Possibly reach double bandwidth someday

Since GDDR6X isn't an official iteration of GDDR, as it hasn't been standardized by JEDEC and was co-developed by Nvidia specifically for their graphics cards, will it ever become official, or will its improvements be incorporated into a new iteration (GDDR7?) that will be official? Is the latter what happened with GDDR5X?
 

Ellery

Member
Looks like the 2X from the 2080 to the 3080 was (like many expected) just the best-case scenario in one game with RTX enabled. Real performance increase looks to be around 60-65% over the 2080, which is still impressive.
 

Rikkori

Member
The tl;dr is 25% over the 2080 Ti, but it's a worse overclocker, so OC for OC it's more like 15% over the 2080 Ti.

What I haven't seen from anyone yet is undervolting tests, which is where I think the card will shine more than Turing did. And on a side note, this bodes well for AMD, as they can absolutely match this in terms of raw performance. TSMC's 7nm just shits on Samsung's node.
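To show where an "OC for OC" figure like that can come from, here's the arithmetic as a quick sketch; the 25% stock lead is from the reviews, but the two overclocking-headroom numbers are purely assumed for illustration:

```python
# Assumed OC headroom figures, only to illustrate the arithmetic behind a
# 25% stock lead shrinking to ~15% once both cards are overclocked.
stock_lead = 1.25       # 3080 vs 2080 Ti at stock (per the reviews)
oc_gain_3080 = 0.04     # assumed ~4% from overclocking the 3080
oc_gain_2080ti = 0.13   # assumed ~13% from overclocking the 2080 Ti

oc_vs_oc = stock_lead * (1 + oc_gain_3080) / (1 + oc_gain_2080ti)
print(f"OC vs OC lead: ~{(oc_vs_oc - 1) * 100:.0f}%")  # ~15%
```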
 
The cooling on the FE cards is, what's the word, inadequate? 57 dBA? That is AMD cooler territory. (n)

Now let's wait for the AIB card reviews so we'll have a comparison. Maybe it's not just the FE cooler that is terrible; the additional power draw over previous cards makes that cooler look much worse than it would on previous ~250 W cards. Most importantly, the huge performance increase is here, just not 2x the 2080.
 

Rickyiez

Member
Looks like the 2X from 2080 to 3080 was (like many expected) just the best case scenario in one game with RTX enabled. Real performance increase looks to be at around 60-65% over the 2080, which is still impressive

It's almost 2 times faster in Doom Eternal at 4K. Anyhow, it's a card that is 80-90% faster than my 1080 Ti and has ray tracing, which is a good enough upgrade for me.

No review thread, btw?
 

CrustyBritches

Gold Member
Looks like the Chinese 3DMark leaks were accurate, with the 3080 getting an 11,449 score in the Port Royal RT benchmark compared to 7,969 for the 2080 Ti. Looks like the 3080 is about 25-30% more powerful than a 2080 Ti.

Quick look at some benchies:
-RDR2-
3080 = 4K/69fps/Max Settings
2080ti = 4K/49fps/Max Settings

-FS 2020-
3080 = 4K/42fps/Ultra Settings
2080ti = 4K/35fps/Ultra Settings

-Death Stranding-
3080 = 4K/106fps/VHQ Settings
2080ti = 4K/81fps/VHQ Settings
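For what it's worth, the per-game uplift from those exact numbers works out as below (pure arithmetic on the fps figures quoted above):

```python
# 3080 vs 2080 Ti uplift at 4K, from the averages listed above.
benches = {
    "RDR2 (Max)":            (69, 49),   # (3080 fps, 2080 Ti fps)
    "FS 2020 (Ultra)":       (42, 35),
    "Death Stranding (VHQ)": (106, 81),
}
for game, (fps_3080, fps_2080ti) in benches.items():
    uplift = fps_3080 / fps_2080ti - 1
    print(f"{game}: +{uplift * 100:.0f}%")
# RDR2 (Max): +41%
# FS 2020 (Ultra): +20%
# Death Stranding (VHQ): +31%
```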
 

Rickyiez

Member
It is for the entire system

I swear some people don't read the context of the graph or the information. They just deduce the conclusion they wanted, so they have a chance to hate or to say something like "terrible". Same thing with the Watch Dogs Legion requirements and the VRAM section.
 
So the absolute latest and greatest 30 TFLOP card can't even get 45 fps in a new release at 4K?

I know FS 2020 melts PCs, but I would have thought a brand new 30 TFLOP card would have been able to handle it better.

With that said, I assumed a 2080 Ti could do better than 35 fps.

I'm assuming it would get 60 fps at 1440p, with DLSS bringing it up to 4K.
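For reference on that last point, DLSS renders internally at a lower resolution and upscales to the output; a small sketch using the commonly cited per-axis scale factors (treat them as approximate):

```python
# Internal render resolution and pixel saving for DLSS at a 4K output.
out_w, out_h = 3840, 2160
scale_factors = {"Quality": 2 / 3, "Performance": 1 / 2, "Ultra Performance": 1 / 3}

for mode, s in scale_factors.items():
    w, h = round(out_w * s), round(out_h * s)
    saving = 1 - (w * h) / (out_w * out_h)
    print(f"{mode}: {w}x{h} internal (~{saving:.0%} fewer pixels than native 4K)")
# Quality: 2560x1440, Performance: 1920x1080, Ultra Performance: 1280x720
```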
 