
Nvidia RTX 30XX |OT|

BluRayHiDef

Banned
I can still run games like RDR2 in 4K on highest settings with 6GB of VRAM, and it's not even being used anywhere near 100%.

The 3080's 10GB of VRAM will easily last until next gen.

RDR2 is a current-gen game; it was designed around the amounts of memory available to developers on the PlayStation 4 (Pro) and Xbox One (X): 5GB of GDDR5 (PS4), 5.5GB of DDR3 (XBO) and GDDR5 (PS4 Pro), and 9GB of GDDR5 (XBO-X). Keep in mind that these amounts cover both CPU and GPU data, so the amount available for GPU data alone is less than this.

For the PlayStation 5 and Xbox Series X, which will each allocate at least 13.5GB of unified memory to games (again for both CPU and GPU data), we can assume that developers will use at least 10GB for GPU data alone, since 10GB of the Xbox Series X's memory is referred to as "GPU Optimal Memory" due to its 560GB/s bandwidth.

So, VRAM requirements for discrete graphics cards will increase this generation.
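A quick back-of-the-envelope sketch of that split. The per-console totals are the figures above; the 30% CPU-side share is purely my assumption and varies from title to title.

```python
# Rough arithmetic only: how much of each console's game-available memory
# might be left for GPU data, under an assumed 30% CPU-side share.
GAME_AVAILABLE_GB = {
    "PS4": 5.0,
    "Xbox One": 5.5,
    "PS4 Pro": 5.5,
    "Xbox One X": 9.0,
    "PS5 / Xbox Series X": 13.5,  # XSX: 10GB of this is the 560GB/s pool
}

ASSUMED_CPU_SHARE = 0.30  # assumption on my part, not a published figure

for console, total_gb in GAME_AVAILABLE_GB.items():
    gpu_gb = total_gb * (1 - ASSUMED_CPU_SHARE)
    print(f"{console:>20}: ~{gpu_gb:.1f} GB plausibly left for GPU data")
```

Even with that generous CPU-side share assumed, the next-gen consoles land right around the 10GB mark for GPU data, which is why a 10GB card looks like the floor rather than the ceiling going forward.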
 

regawdless

Banned
TUF oc. Runs pretty nice and cool :D
But like always, you don't really feel the difference in games.

Mine runs very cool as well, even under extreme stress tests over hours; 71C is the hottest I've seen it. What temps are you getting?
My chip hits the power limit when I go above 1855MHz. Its stock clock is 1740MHz, so I got +115MHz.
You more than doubled that. Congratulations on winning the silicon lottery!
 

DonMigs85

Member
Maybe it's better to just wait for the rumored 7nm TSMC Ampere cards at this point. Or will they just jump straight to 5nm for Hopper?
 

rofif

Banned
Mine runs very cool as well, even under extreme stress tests over hours; 71C is the hottest I've seen it. What temps are you getting?
My chip hits the power limit when I go above 1855MHz. Its stock clock is 1740MHz, so I got +115MHz.
You more than doubled that. Congratulations on winning the silicon lottery!
My 3080 Founders goes to about 1800-1850MHz; I've not tried any OC. It hits 320W pretty often. The fan is inaudible at 1000RPM and still pretty quiet at 1800RPM (its max, it seems). At 320W it hits 76C. Hot, hot, hot, but it's fine.
 

regawdless

Banned
My 3080 Founders goes to about 1800-1850MHz; I've not tried any OC. It hits 320W pretty often. The fan is inaudible at 1000RPM and still pretty quiet at 1800RPM (its max, it seems). At 320W it hits 76C. Hot, hot, hot, but it's fine.

You mean it boosts up to 1850MHz, or is this the "boost clock" shown in GPU-Z?
Because with a boost clock of 1855MHz, my card actually goes up to 2070MHz in game according to MSI Afterburner.
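If you want to see what the card actually runs at in game (rather than the rated boost clock GPU-Z shows), you can log it with a loop like this. It just shells out to nvidia-smi, which ships with the driver; I'm assuming it's on your PATH.

```python
# Log the actual graphics clock, power draw, and temperature once per second
# while a game is running. Assumes nvidia-smi is on the PATH; Ctrl+C to stop.
import subprocess
import time

QUERY = "clocks.gr,power.draw,temperature.gpu"

while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    clock_mhz, power_w, temp_c = [v.strip() for v in out.split(",")]
    print(f"{clock_mhz} MHz  {power_w} W  {temp_c} C")
    time.sleep(1)
```

GPU Boost opportunistically runs above the rated boost clock until it hits the power or thermal limit, which is why a card rated at 1855MHz can sit around 2070MHz in game.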
 
Last edited:

rofif

Banned
You mean it boosts up to 1850MHz, or is this the "boost clock" shown in GPU-Z?
Because with a boost clock of 1855MHz, my card actually goes up to 2070MHz in game according to MSI Afterburner.
I don't know. 1890MHz is the highest I've seen in the Afterburner OSD.
 

Malakhov

Banned
RDR2 is a current-gen game; it was designed around the amounts of memory available to developers on the PlayStation 4 (Pro) and Xbox One (X): 5GB of GDDR5 (PS4), 5.5GB of DDR3 (XBO) and GDDR5 (PS4 Pro), and 9GB of GDDR5 (XBO-X). Keep in mind that these amounts cover both CPU and GPU data, so the amount available for GPU data alone is less than this.

For the PlayStation 5 and Xbox Series X, which will each allocate at least 13.5GB of unified memory to games (again for both CPU and GPU data), we can assume that developers will use at least 10GB for GPU data alone, since 10GB of the Xbox Series X's memory is referred to as "GPU Optimal Memory" due to its 560GB/s bandwidth.

So, VRAM requirements for discrete graphics cards will increase this generation.
I agree, and that's the point. The 3080 will last through next gen, especially with the GDDR6X VRAM they've got going on there. It will last until the next big release; I'm not worried one bit.

For the folks who are worried, I'll gladly take your 10GB version at cost instead of waiting for my TUF 3080, and then you can get the 20GB version ;)
 
Last edited:

Nydus

Gold Member
Mine runs very cool as well, even under extreme stress tests over hours; 71C is the hottest I've seen it. What temps are you getting?
My chip hits the power limit when I go above 1855MHz. Its stock clock is 1740MHz, so I got +115MHz.
You more than doubled that. Congratulations on winning the silicon lottery!
Temps are around 68°C.

The card goes to a dude who can't fit a trio in his case. So I get his trio. ^^
 

Ascend

Member
Everyone hates this guy, but I still think it's worth watching. Short version: a bunch of RTX 3080 cards will be available in October (over 300k units), while the 3090 will have over 30k units. The market will also pretty much be flooded with RTX 3070 cards.

 

RobRSG

Member
Everyone hates this guy, but I still think it's worth watching. Short version: a bunch of RTX 3080 cards will be available in October (over 300k units), while the 3090 will have over 30k units. The market will also pretty much be flooded with RTX 3070 cards.



Short version for this channel: Tales from Quagmire’s ass.
 

saintjules

Member
Everyone hates this guy, but I still think it's worth watching. Short version: a bunch of RTX 3080 cards will be available in October (over 300k units), while the 3090 will have over 30k units. The market will also pretty much be flooded with RTX 3070 cards.



Interesting to note that the 3080 20GB is only for the AIB models (if true).
 
Last edited:

Rbk_3

Member
Everyone hates this guy, but I still think it's worth watching. Short version: a bunch of RTX 3080 cards will be available in October (over 300k units), while the 3090 will have over 30k units. The market will also pretty much be flooded with RTX 3070 cards.


This guy said availability of the 3080 wouldn't be too bad and that the 3090 would be limited. I wouldn't trust a god damn thing out of his mouth.
 

Bboy AJ

My dog was murdered by a 3.5mm audio port and I will not rest until the standard is dead
Please don’t post videos from that liar. He posts garbage just to make money.

Best Buy did a drop today. Did anyone get one? I hope so.
 
Last edited:

mitchman

Gold Member
This guy said availability of the 3080 wouldn't be too bad and that the 3090 would be limited. I wouldn't trust a god damn thing out of his mouth.
What? He said Nvidia would force scarcity for the cards before they were launched; he had a full video about that. Just look at what happened and how accurate it was.
 

Ascend

Member
This guy said availability of the 3080 wouldn't be too bad and that the 3090 would be limited. I wouldn't trust a god damn thing out of his mouth.
Really...? I don't recall him saying that. I recall him saying that there would be many shortages, but that the shortage would be deliberately forced by Nvidia...

 
Last edited:

Krappadizzle

Gold Member
This guy said availability of the 3080 wouldn't be too bad and that the 3090 would be limited. I wouldn't trust a god damn thing out of his mouth.
I've watched quite a lot of his videos and for weeks before release he kept saying there would be huge shortages on the 3080, so I'm not sure where you got your information from. It wasn't from him.

62 on the waiting list for my card.

How do you know how far down the waiting list you are? How would you even check?
 
Last edited:

Ulysses 31

Member
I'm warming up to the Aorus 3090 XTREME with its LED display. 👀

Would be a hard choice if the ROG STRIX and AORUS XTREME were to become available to me at the same time.
 
Last edited:
I'm pretty happy about getting the RTX 3080. I can game at 4K 120Hz with VRR, where before I could only game at 1440p. Since I sit 2 ft from a 77" screen, it makes a big difference.
 

BluRayHiDef

Banned
Another unsuccessful trip to Micro Center for an RTX 3090.

I've visited my local Micro Center every weekday morning this week and whenever they've had EVGA RTX 30 Series cards, they've always been 3080s - no 3090s (from any AIB). When will EVGA 3090s (preferably FTW3s) be at brick-and-mortar stores?
 

Bboy AJ

My dog was murdered by a 3.5mm audio port and I will not rest until the standard is dead
Another unsuccessful trip to Micro Center for an RTX 3090.

I've visited my local Micro Center every weekday morning this week and whenever they've had EVGA RTX 30 Series cards, they've always been 3080s - no 3090s (from any AIB). When will EVGA 3090s (preferably FTW3s) be at brick-and-mortar stores?
Well, since you're there anyway, can you grab me an EVGA 3080? I'd prefer it over my FE that hopefully arrives today :)
 

Dirk Benedict

Gold Member
I'm just chilling out for now. I haven't managed to grab a 3090, but I'm still confident I'll get one before the year is done, and even if I don't, I'm extremely patient.
 

BluRayHiDef

Banned
Quick question: someone told me that upgrading my RAM would improve the 1% low frame rates I get with my RTX 3080, because I'm currently using 32GB of 2400MHz DDR4 (8 x 4GB). Is this true? If I were to upgrade to 3200MHz, would it make a difference? My current setup is quad-channel since I'm using an Asus X99 board. If I do upgrade my RAM, would it matter whether I get four 8GB sticks to keep four channels or two 16GB sticks in dual channel (which would be cheaper)?
 

DeaDPo0L84

Member
Well, my situation isn't so bad because I have an EVGA RTX 3080 FTW3 to hold me over until I can get an RTX 3090.

I'm sure you've answered this multiple times before, but why are you dead set on getting a 3090 given what we know about the performance increase? Especially when you currently have one of the better 3080 options.
 

J3nga

Member
Quick question: someone told me that upgrading my RAM would improve the 1% low frame rates I get with my RTX 3080, because I'm currently using 32GB of 2400MHz DDR4 (8 x 4GB). Is this true? If I were to upgrade to 3200MHz, would it make a difference? My current setup is quad-channel since I'm using an Asus X99 board. If I do upgrade my RAM, would it matter whether I get four 8GB sticks to keep four channels or two 16GB sticks in dual channel (which would be cheaper)?
I doubt that. You're running an Intel CPU, which isn't that demanding on RAM; Ryzen CPUs are way more sensitive to memory speed. And you're running quad channel. Have you tried OC'ing your RAM yet?
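For what it's worth, the raw bandwidth math favors keeping quad channel. This is theoretical peak only (channels x transfer rate x 8 bytes); timings and real-world scaling are a separate question.

```python
# Theoretical peak DDR4 bandwidth: channels * transfer rate (MT/s) * 8 bytes
# per transfer per channel. Rough comparison only; latency/timings also matter.
def peak_bandwidth_gb_s(channels: int, mt_per_s: int) -> float:
    return channels * mt_per_s * 8 / 1000

configs = {
    "Current: 8 x 4GB DDR4-2400, quad channel": (4, 2400),
    "Option A: 4 x 8GB DDR4-3200, quad channel": (4, 3200),
    "Option B: 2 x 16GB DDR4-3200, dual channel": (2, 3200),
}

for name, (channels, speed) in configs.items():
    print(f"{name}: ~{peak_bandwidth_gb_s(channels, speed):.1f} GB/s peak")
```

So the cheaper two-stick 3200 kit would actually be a step down in peak bandwidth from the quad-channel 2400 setup you already have, even though the sticks themselves are faster.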
 
Last edited:

Rbk_3

Member
I'm sure you've answered this multiple times before, but why are you dead set on getting a 3090 given what we know about the performance increase? Especially when you currently have one of the better 3080 options.

Plus I'm pretty sure he said he's running a 4790K. He'd be way, way better off using that extra money to overhaul his system.
 

Rbk_3

Member
I've watched quite a lot of his videos and for weeks before release he kept saying there would be huge shortages on the 3080, so I'm not sure where you got your information from. It wasn't from him.



How do you know how far down the waiting list you are? How would you even check?

Here's where I'm getting that from: a screenshot I sent to one of my buddies on Aug 29, showing "OK Availability" for the 3080.
 

BluRayHiDef

Banned
I'm sure you've answered this multiple times before, but why are you dead set on getting a 3090 given what we know about the performance increase? Especially when you currently have one of the better 3080 options.

1. An impulse that won't go away.

2. The belief that the PC versions of multi-platform games will require more than 10GB of VRAM for graphical settings higher than those of the PS5 and XSX versions, since those consoles will be able to use as much as 10GB for GPU data alone. 10GB of the XSX's 16GB of GDDR6 is called "GPU Optimal Memory," which implies that Microsoft believes 10GB will be necessary for next-gen graphics on console. So PC may need more if its versions of games are designed to look and run better.

3. I also do occasional video editing and would benefit from the buttload of VRAM.

4. The modest 10% to 15% of additional performance that the RTX 3090 offers over the RTX 3080 can push the few games that the 3080 cannot render at 60 frames per second at 4K with maximum settings (e.g. Control) past that threshold; rough math below.
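That last point is just arithmetic; here's a quick sketch of it. The baseline frame rates below are made-up placeholders for illustration, not benchmark numbers.

```python
# If a 3080 lands just short of 60fps at 4K max settings, a 10-15% uplift
# can be enough to clear the threshold. The baselines below are invented
# placeholders, not measured results.
baselines_fps = {"Game A": 53.0, "Game B": 55.0, "Game C": 58.0}

for game, fps in baselines_fps.items():
    low, high = fps * 1.10, fps * 1.15
    verdict = "can clear 60" if high >= 60.0 else "still short of 60"
    print(f"{game}: {fps:.0f} fps -> {low:.1f}-{high:.1f} fps ({verdict})")
```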
 

WakeTheWolf

Member
Everyone hates this guy, but I still think it's worth watching. Short version: a bunch of RTX 3080 cards will be available in October (over 300k units), while the 3090 will have over 30k units. The market will also pretty much be flooded with RTX 3070 cards.


I hope so. I'm considering the 3070.
 

Rentahamster

Rodent Whores
Quick question: someone told me that upgrading my RAM would improve the 1% low frame rates I get with my RTX 3080, because I'm currently using 32GB of 2400MHz DDR4 (8 x 4GB). Is this true? If I were to upgrade to 3200MHz, would it make a difference? My current setup is quad-channel since I'm using an Asus X99 board. If I do upgrade my RAM, would it matter whether I get four 8GB sticks to keep four channels or two 16GB sticks in dual channel (which would be cheaper)?
Google for videos or articles about memory speeds affecting frame rates and frame times. There's a lot out there.

In the meantime you can try to do the same thing for free by overclocking the RAM you already have.


Here's where I'm getting that from: a screenshot I sent to one of my buddies on Aug 29, showing "OK Availability" for the 3080.

In later videos he said that 3080 stock would be limited and 3090 stock even more so.
 

BluRayHiDef

Banned
I doubt that. You're running an Intel CPU, which isn't that demanding on RAM; Ryzen CPUs are way more sensitive to memory speed. And you're running quad channel. Have you tried OC'ing your RAM yet?

I tried OC'ing the RAM yesterday; no matter what I did, my PC would not boot. I think the issue is that even though all eight sticks of RAM that I'm using are the same model, they're from three different batches and therefore don't have the same tolerance levels for voltages and other settings. Four of them came together; three of them also came together with a fourth stick that died; and I bought one of them by itself to replace the one that died.

I also tried OC'ing my CPU; I got it to boot at 4.5GHz, but it would randomly crash after a bit of time while sampling games. Having said that, I didn't notice any improvements in frame rates when sampling games, so I don't think there's any point in me trying to overclock it again.
 

J3nga

Member
I tried OC'ing the RAM yesterday; no matter what I did, my PC would not boot. I think the issue is that even though all eight sticks of RAM that I'm using are the same model, they're from three different batches and therefore don't have the same tolerance levels for voltages and other settings. Four of them came together; three of them also came together with a fourth stick that died; and I bought one of them by itself to replace the one that died.

I also tried OC'ing my CPU; I got it to boot at 4.5GHz, but it would randomly crash after a bit of time while sampling games. Having said that, I didn't notice any improvements in frame rates when sampling games, so I don't think there's any point in me trying to overclock it again.
In most games you're GPU-bound, especially at 4K, so it makes sense that the extra CPU clock has no effect whatsoever. That said, if you're planning to upgrade the whole platform (last time you said you wanted a 5900X, correct me if I'm wrong), there's no rush really, unless you buy just the RAM sticks now and carry them over to the new platform, assuming there are no compatibility issues. Up to you really; neither scenario is clearly better or worse than the other IMO.
 

Krappadizzle

Gold Member
Lots of people trying to convince themselves that a 3090 is a good buy.
Sunk cost fallacy at its finest.

I dunno, MLID has an insufferable air about him, but his ‘prediction’ has played out pretty much as he said.

Almost exactly like he said. Though to be fair, just about anyone paying attention could have made a pretty similar prediction. Let's not forget he also said that the 30xx series would have a "Traversal Co-processor," which was absolutely WAY off.
 
Last edited: