
|OT| Nvidia RTX 30XX |OT|

BluRayHiDef

Member
Aug 21, 2015
2,697
3,437
725
I agree. Control has the most basic texturing of any modern game to date. I can't believe how simple they made the texturing. The PBR shading is also elementary at best.
I guess they wanted to minimize production costs since Quantum Break didn't do well financially.
 

GymWolf

Member
Jun 11, 2019
11,320
16,088
565
Yes, the SSD in the XSX is capable of reading and writing data at only 2.5GB/s. However, it isn't bottlenecked by a bloated operating system and hundreds of background applications. Hence, you'll need one that's faster to compensate for those bottlenecks. Get the drive that I have: Sabrent 2TB Rocket NVMe 4.0 Gen4 PCIe M.2 Internal SSD.
Didn't you say that this one was too slow? Or were you talking specifically about comparing it against the PS5 SSD?!

Also, didn't Nvidia create the same SSD sauce as in the consoles with their latest GPU series, to eliminate those bottlenecks on PC? It was called RTX IO.
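For what it's worth, the basic idea behind RTX IO / DirectStorage-style GPU decompression can be sketched with simple arithmetic: the SSD reads compressed data at its raw speed, and the GPU unpacks it, so effective asset throughput is roughly raw speed times the compression ratio. A minimal sketch (not RTX IO itself; the 2:1 ratio is an illustrative assumption):

```python
# Back-of-the-envelope: why GPU-side decompression raises effective
# streaming throughput beyond the drive's raw read speed.

def effective_throughput_gbps(raw_read_gbps: float, compression_ratio: float) -> float:
    """Effective asset throughput when compressed data is read from the
    SSD and decompressed on the GPU instead of the CPU."""
    return raw_read_gbps * compression_ratio

# Raw speeds quoted in this thread (illustrative).
for name, raw in [("XSX-class SSD", 2.5), ("PCIe 4.0 desktop drive", 5.0)]:
    print(f"{name}: {raw} GB/s raw -> "
          f"{effective_throughput_gbps(raw, 2.0):.1f} GB/s effective at 2:1")
```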
 

GymWolf

Member
Jun 11, 2019
11,320
16,088
565
Too slow for the PlayStation 5, not for the Xbox Series X.
Yeah, I edited my post.

So you think I'm gonna be okay for at least 3-4 years with the same SSD you have, until the consoles get a refresh with maybe an even faster SSD?
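To put the raw speeds being compared here in perspective, a quick sketch of how long a big install takes to stream at different drive speeds (illustrative numbers; ignores decompression and filesystem overhead):

```python
# Rough load-time comparison for a 100 GB install at raw read speed.

def seconds_to_read(size_gb: float, speed_gbps: float) -> float:
    """Time in seconds to read size_gb at a sustained speed_gbps."""
    return size_gb / speed_gbps

drives = {
    "SATA SSD": 0.55,
    "XSX SSD (raw)": 2.5,
    "PS5 SSD (raw)": 5.5,
    "Fast PCIe 4.0 NVMe": 5.0,
}
for name, speed in drives.items():
    print(f"{name}: {seconds_to_read(100, speed):.0f} s for 100 GB")
```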
 

BluRayHiDef

Member
Aug 21, 2015
2,697
3,437
725
Just for the record, why didn't you wait for a faster SSD? Do you think, like me, that devs are not gonna develop around the SSD inside the PS5 but more probably around the SSD inside the XSX?!
It was on sale and I needed an SSD ASAP because I was sick of deleting games to make room for others.
 

gunslikewhoa

Member
Mar 3, 2014
3,014
2,499
630
Micro Center has a PowerSpec with RTX 3080 available in Columbus for $2500. If I were still looking, I would snap this up in a second. I've purchased an open box PowerSpec laptop and desktop in the past and they were both great.

 

regawdless

Member
Sep 21, 2020
236
315
230
What's the power limit on it? What does GPU-Z say the boost clock is? And can you benchmark Time Spy Extreme? I don't bench normal Time Spy because my 7700K bottlenecks it.
OK so I'm home:

Power limit: 109%
Boost clock: 1870MHz
Time Spy Extreme: 7600
Graphics score: 9036
 
  • Like
Reactions: Nydus

kiphalfton

Member
Dec 27, 2018
748
874
395
Has anybody bought a Zotac 3080? I'd rather have an EVGA card, but let's be real, nobody is getting their brand of choice.
 

sertopico

Member
Oct 30, 2012
862
806
565
www.youtube.com

That's some good and bad news, IF that's true. Well, I will wait till next year.
 

Spukc

Member
Jan 24, 2015
14,812
13,407
830

That's some good and bad news, IF that's true. Well, I will wait till next year.
Dude, let me tell you, better wait another year!
Bet something even better will come!

Sorry, ignore that, my m8 just told me if you wait 1 year longer something even better will come!
 

sertopico

Member
Oct 30, 2012
862
806
565
Dude, let me tell you, better wait another year!
Bet something even better will come!

Sorry, ignore that, my m8 just told me if you wait 1 year longer something even better will come!
I don't get your sarcasm. One way or another the shortage will last till the beginning of next year (JHH stated that), unless you wanna pay triple for a card on eBay. No thanks.
 
  • Like
Reactions: TheKratos

Spukc

Member
Jan 24, 2015
14,812
13,407
830
I don't get your sarcasm. One way or another the shortage will last till the beginning of next year (JHH stated that), unless you wanna pay triple for a card on eBay. No thanks.
You have no idea what I just heard, dude.
Here it comes...

If you wait 3 years..
Yup.
You guessed it.. Something even better😱😱😱😱
 

sertopico

Member
Oct 30, 2012
862
806
565
You have no idea what I just heard, dude.
Here it comes...

If you wait 3 years..
Yup.
You guessed it.. Something even better😱😱😱😱
That's not the point, "dude". If there had been enough cards around at reasonable prices, not sold by scalpers, I would have bought one already. Given the current situation, having some more patience is essential, from my personal perspective. Sorry, but this launch has been a disaster; Samsung clearly hasn't been able to handle production of these chips. There are shops ordering 2000 pieces and getting 100 shipped. That is not normal; this 8nm chip was born crippled (despite the great performance).
 

CuNi

Member
Sep 4, 2014
1,141
1,013
765
Germany
That's not the point, "dude". If there had been enough cards around at reasonable prices, not sold by scalpers, I would have bought one already. Given the current situation, having some more patience is essential, from my personal perspective. Sorry, but this launch has been a disaster; Samsung clearly hasn't been able to handle production of these chips. There are shops ordering 2000 pieces and getting 100 shipped. That is not normal; this 8nm chip was born crippled (despite the great performance).
So you say "wait" based on an unverified rumor that they might move production to 7nm?
Even if we give it the benefit of the doubt and assume it's real, those will either be Super cards or cards with the same performance that run slightly cooler, probably at a higher price point.
This is NVIDIA we're talking about. They'll never let a chance to raise prices slip by. Probably $699 for 8nm and $749 for 7nm.
If they plan to move to 7nm, I can't see those cards shipping before summer/winter 2021 anyway, and if you hate Ampere that much for what it is and you're okay with waiting till late 2021 to get a card, do yourself a favor and wait a bit longer, till 2022, for the next architecture.
 
  • LOL
Reactions: Spukc

sertopico

Member
Oct 30, 2012
862
806
565
So you say "wait" based on an unverified rumor that they might move production to 7nm?
Even if we give it the benefit of the doubt and assume it's real, those will either be Super cards or cards with the same performance that run slightly cooler, probably at a higher price point.
This is NVIDIA we're talking about. They'll never let a chance to raise prices slip by. Probably $699 for 8nm and $749 for 7nm.
If they plan to move to 7nm, I can't see those cards shipping before summer/winter 2021 anyway, and if you hate Ampere that much for what it is and you're okay with waiting till late 2021 to get a card, do yourself a favor and wait a bit longer, till 2022, for the next architecture.
I am not telling anybody to do what I do; all I said was I'd rather wait until January and see what happens. We also don't have any ETA on new shipments; I contacted Asus and they have no clue about it. I am not taking that rumour for granted, which is why I used the conditional in my first post: IF that rumor's true, then better wait till next year (FOR ME, better repeat that :D ). Unfortunately I skipped Turing and am stuck with a 1080Ti, which can still run anything at 2K with no problems whatsoever, but I also wanted to test RTX in the upcoming games. I am a bit disappointed, that's it. Regarding the 7nm cards, I would find April/May reasonable for the Super/Ti/whatever-they're-called series. Let's see.
 
  • Like
Reactions: TheKratos and CuNi

CuNi

Member
Sep 4, 2014
1,141
1,013
765
Germany
I am not telling anybody to do what I do; all I said was I'd rather wait until January and see what happens. We also don't have any ETA on new shipments; I contacted Asus and they have no clue about it. I am not taking that rumour for granted, which is why I used the conditional in my first post: IF that rumor's true, then better wait till next year (FOR ME, better repeat that :D ). Unfortunately I skipped Turing and am stuck with a 1080Ti, which can still run anything at 2K with no problems whatsoever, but I also wanted to test RTX in the upcoming games. I am a bit disappointed, that's it. Regarding the 7nm cards, I would find April/May reasonable for the Super/Ti/whatever-they're-called series. Let's see.
I feel you. I'm still on a 970 and sadly cannot wait much longer to upgrade as I plan on getting into VR.
I want to see what AMD has to offer as they already produce on 7nm TSMC. I'm mostly interested in 1080p and 1440p performance anyway but I gotta say if they don't have anything to counter RTX/DLSS I might still go with NVIDIA.
Let's see how this all plays out!
 
Apr 11, 2016
832
420
395
I guess they wanted to minimize production costs since Quantum Break didn't do well financially.

Quantum Break had all the money in the world at its disposal. It was a premiere exclusive from Microsoft. Control was made at a nearly indie level, with a budget of 20-30 million euros. That's peanuts compared to today's budgets. You can see the cuts they had to make from beginning to end while playing. Samey environments that by their nature don't require a lot of detail; they chose that asphalt-looking type of environment. The weapon, which is just one weapon that morphs. The lack of enemy variety, the lack of variety in combat scenarios, in scripting. They really did great with such a low amount of money, but you can tell it was low.
 

nemiroff

Member
Feb 19, 2018
1,330
1,683
465
Then what is the point of GPUs with memory capacities above 5GB (of GDDR6X) or a corresponding amount of the slower GDDR6? As explained in the Reddit post that I embedded in one of my posts above, 5GBs of GDDR6X is enough for current games at 4K Ultra, yet we have 16GB and 20GB variants of the RTX 3070 and RTX 3080, respectively, coming out?
As usual I'll just begin with the disclaimer that I'm not claiming to be especially knowledgeable in this field: I'm talking out of my ass, but with the heart of an engineer. Anyway, I'd say the answer is contextual. An important part is flexibility, because obviously not all code is made the same way.

A TL;DR answer in a perfect world would be that balanced bandwidth can often be more important than size in itself, exactly because of data caching/streaming: if you manage to hold/cache enough data to render your scene(s) at a certain quality, that is "enough" VRAM. But in general, much of what decides "what is enough" relies on factors often not mentioned in these discussions, e.g. market trends and developer wishes, all the way to a strategic lowest common denominator.

In my opinion the rumored 20GB version of the 3080 might end up massively underutilized, at least for the next few years, for the simple reason that there is a 10GB version which Nvidia claims is the flagship of the 30xx generation and which seems to fulfill all of the above criteria. It wouldn't be much of a flagship if it ended up redundant a few months later because it didn't support the majority of high-end settings.
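The caching/streaming point above can be made concrete with rough numbers: what a frame actually touches (render targets plus the streamed-in working set of textures) is far smaller than total card capacity. A minimal sketch with illustrative assumptions (4 bytes per pixel/texel, three render targets):

```python
# Back-of-the-envelope VRAM footprints for the "size vs. bandwidth" point.

def framebuffer_mb(width: int, height: int,
                   bytes_per_pixel: int = 4, buffers: int = 3) -> float:
    """Approximate VRAM for color/depth/intermediate render targets."""
    return width * height * bytes_per_pixel * buffers / 2**20

def mip_chain_mb(width: int, height: int, bytes_per_texel: int = 4) -> float:
    """A full mip chain costs roughly 4/3 of the base texture size."""
    return width * height * bytes_per_texel * (4 / 3) / 2**20

print(f"4K render targets: ~{framebuffer_mb(3840, 2160):.0f} MB")
print(f"One 4096x4096 texture + mips: ~{mip_chain_mb(4096, 4096):.0f} MB")
```

Even a few hundred such textures resident at once sits well under 10GB, which is why streaming quality, not raw capacity, tends to be the practical limit.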
 

BluRayHiDef

Member
Aug 21, 2015
2,697
3,437
725
As usual I'll just begin with the disclaimer that I'm not claiming to be especially knowledgeable in this field: I'm talking out of my ass, but with the heart of an engineer. Anyway, I'd say the answer is contextual. An important part is flexibility, because obviously not all code is made the same way. A TL;DR answer in a perfect world would be that balanced bandwidth can often be more important than size in itself, exactly because of data caching/streaming: if you manage to hold/cache enough data to render your scene(s) at a certain quality, that is "enough" VRAM. But in general, much of what decides "what is enough" relies on factors often not mentioned in these discussions, e.g. market trends and developer wishes, all the way to a strategic lowest common denominator. In my opinion the rumored 20GB version of the 3080 might end up massively underutilized, at least for the next few years, for the simple reason that there is a 10GB version which Nvidia claims is the flagship of the 30xx generation and which seems to fulfill all of the above criteria. It wouldn't be much of a flagship if it ended up redundant a few months later because it didn't support the majority of high-end settings.
So, you think the RTX 3080 20GB will be both a gaming and a production card?
 

sertopico

Member
Oct 30, 2012
862
806
565
I feel you. I'm still on a 970 and sadly cannot wait much longer to upgrade as I plan on getting into VR.
I want to see what AMD has to offer as they already produce on 7nm TSMC. I'm mostly interested in 1080p and 1440p performance anyway but I gotta say if they don't have anything to counter RTX/DLSS I might still go with NVIDIA.
Let's see how this all plays out!
Yup, I have a positive feeling about the upcoming AMD GPUs as well. We all need some proper competition; a monopoly has never been good.
I personally own a G-Sync monitor, so I guess I'll have to go for Nvidia again; changing monitors is out of the question, especially after recently updating my CPU and mobo.

Anyway, you managed to stretch that 970 all the way till 2020, well done. It still offers decent performance at 1080p. :)
 
  • Like
Reactions: CuNi

Dr.D00p

Member
May 23, 2019
670
1,446
350
Same, people have a hard-on for VRAM, ridiculous
It all depends on how long you keep your GPUs.

If you intend to upgrade to the 40xx cards in a couple of years, 10GB will be fine until then. But if you, say, only upgrade every other generation (4-5 years), then 20GB will give the card the legs to last that long without having to make (too many) compromises in the settings menus in years 4 and 5.
 

nemiroff

Member
Feb 19, 2018
1,330
1,683
465
So, you think the RTX 3080-20GB will be a gaming and a production card?
I'm quite frankly sort of puzzled by a 20GB version and why anyone would want it at the expected price difference, at least until there are some tangible use cases from multiple developer sources willing to make a potential niche market happy within the "life expectancy" of a 3080. But this is all conjecture anyway.
 
  • Thoughtful
Reactions: BluRayHiDef

nemiroff

Member
Feb 19, 2018
1,330
1,683
465
Yes. I wouldn't be surprised if the 10GB is phased out completely. But that is my own speculation.
Well, in theory the market could drive this decision by only buying the 20GB version at a larger profit margin. But there's no apparent technical reason why the 10GB should be phased out. And if they did it anyway, at a $200 price hike and without actual demonstrable use cases, it could end up looking like a giant flip-flop and one of the biggest scams in the GPU business. Remember that Nvidia has on multiple occasions explained why they put 10GB of VRAM on the 3080, with no indication of it being under-specced.

I also think $200 seems slightly on the low side, considering they would have to either go with 16Gb GDDR6X dies, which Micron hasn't even released yet (there's no tooling in any fab to make them), or completely redesign the PCBs, and possibly even rebuild the heatsink and card shroud to take care of the extra heat. Edit: Actually they could just use modified 3090 designs, lol.


Disclaimer: CONJECTURE, potentially pure BS.
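The die-density point works out with simple arithmetic: the 3080's 320-bit bus and the 32-bit width of each GDDR6X package fix the module count at ten, so capacity can only scale with per-module density. A sketch (bus width and package width are public figures; the rest is arithmetic):

```python
# Why a 20GB 3080 implies 16Gb (2GB) GDDR6X modules or a redesigned board.

BUS_WIDTH_BITS = 320   # RTX 3080 memory bus width
BITS_PER_MODULE = 32   # each GDDR6X package is 32 bits wide

modules = BUS_WIDTH_BITS // BITS_PER_MODULE  # -> 10 packages

for density_gbit in (8, 16):  # 8Gb shipping in 2020, 16Gb rumored
    capacity_gb = modules * density_gbit // 8
    print(f"{density_gbit}Gb modules x {modules} = {capacity_gb} GB")
```

With only 8Gb modules available, the alternative to denser dies is clamshelling modules on both sides of the PCB, which is what makes the "modified 3090 design" idea plausible.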
 

Krappadizzle

Member
Oct 4, 2011
13,945
3,819
1,170
I had a chance to buy a 3080 today, but it was a Palit and I don't know anything about this brand, so I second-guessed myself and then it was gone a couple of mins later.
Same shit happened to me the other day. I managed to get an EVGA card into my cart, but I had just woken up and was still all groggy, and it took me a minute to realize it was the card I wanted. By the time I realized it, it was already too late.
 

Malakhov

Member
Jun 6, 2004
7,520
1,531
1,790
It all depends on how long you keep your GPUs.

If you intend to upgrade to the 40xx cards in a couple of years, 10GB will be fine until then. But if you, say, only upgrade every other generation (4-5 years), then 20GB will give the card the legs to last that long without having to make (too many) compromises in the settings menus in years 4 and 5.
I can still run games like RDR2 in 4K on highest settings with 6GB of VRAM, and it's not even being used anywhere near 100%.

The 3080's 10GB of VRAM will last until next gen easily.
 
  • Like
Reactions: regawdless