
Is 8GB-10GB of VRAM enough for next gen? Can we settle this once and for all?

Dr_Salt

Member
So as we all know, the new Nvidia RTX 30 series cards, namely the RTX 3070 and the RTX 3080, are shipping with 8GB and 10GB of VRAM respectively. I've read all kinds of arguments for and against the notion that 8GB-10GB is enough for next gen, even at 2K and 4K resolutions at high frame rates.

I am not as tech savvy as some people here, and I think it would be good to have some of our more knowledgeable users chime in on this.

So what do you guys think? Is this amount of VRAM enough for next gen or will they become obsolete after we start seeing our first games developed exclusively for next gen?
 

Celcius

°Temp. member
I don't believe 10gb will be enough for 4K gaming at max settings throughout all of next gen. Reminds me of when I bought a 3gb 780 Ti at the start of last gen and it was great at 1080p for years... then FFXV came along. Upgrading to a 1070 Ti with 8gb made a huge difference and I wasn't even using the optional high-res texture pack... just 1080p max settings.
 

Sejan

Member
I don't know much about graphics cards and VRAM, but I'd imagine it's fine if Nvidia is releasing their Founders Edition cards with that amount. As cheap as these cards are compared to the 20 series, they could easily have doubled the VRAM and added $100 to the price if it made that big of a difference. Even at that price point, many would still have called them a bargain compared to the 20 series.
 
The answer is no.

There will be MANY games soon that will use more than 8 and 10 GB of VRAM.

2021 will be a cross-gen year, but even then we will see lots of games that use more. By 2022, that 8-10 GB of VRAM will be a huge limiting factor for your 3070 or 3080, even though the rest of the GPU is more than up to the challenge.

10GB may seem like enough VRAM right now, but next gen is about to start.

My 1080ti has 11GB and it's 3.5 years old.
 

johntown

Banned
If you don’t plan on running 4K textures on modern games it will be enough.

With 16GB of VRAM I was still not able to run Resident Evil 2 Remake at max settings. I had to lower some features for stability because I demand ultra textures.
 

Rikkori

Member
Here's something to consider: every time people said "X RAM is enough" they ended up with egg on their face. This gen. The gen before it. And on and on it goes. Sadly, if you go back and watch those arguments play out, you see the same discussions today, almost to the letter, as if what was said back then doesn't matter and just has to be repeated.

The simple facts are: VRAM usage is heavily dependent on the game, the resolution, the settings, and the other features made available. There is no universal answer, and we don't know how next-gen games will use VRAM or what settings they will and won't allow.

All I know is that at 4K there are several games which struggle with "just" 8 GB. Hell, there are situations where there are some issues even at 1440p. So looking at it like that, I wouldn't personally buy the 10 GB 3080 unless I expected to sell it off before the next architecture hits. It just makes no sense: texture quality costs almost nothing in terms of performance but has huge image quality implications, same for various image streaming options. Those are just not things I'd want to turn down, ever.
Something else to consider is this: as you add RT you further increase memory requirements, which may tip you over what you would otherwise fit under with just rasterisation. So then you have to make further trade-offs. And don't expect DLSS to save your ass, because not every game will have it, and there will still be major games that partner with AMD, so you can expect Nvidia features not to be there for those.

People who think RTX IO (DirectStorage) will help with this fundamentally misunderstand what that technology does; as said above by id's lead engine programmer (and even Nvidia themselves), it's in no way a substitute for memory. It's simply too slow.
Likewise, improved compression and increased memory bandwidth really end up doing the same thing, getting data into memory faster, but you still need a certain amount per "scene" below which you cannot go and then compensate with more bandwidth.
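To put rough numbers on that per-scene floor, here's a quick back-of-envelope sketch in Python. Every count and size in it is an assumption picked purely for illustration, not data from any real game:

```python
# Back-of-envelope sketch of a single scene's VRAM "floor".
# All counts and sizes are assumptions for illustration only; the point is
# that the floor scales with texture resolution and can't be bought back
# with bandwidth or compression alone.
GB = 1024 ** 3

def texture_bytes(side, bytes_per_texel=1.0, mip_overhead=1.33):
    """Approximate size of one square BC7-style texture plus its mip chain."""
    return side * side * bytes_per_texel * mip_overhead

def scene_estimate_gb(texture_side, materials=200, maps_per_material=3):
    textures = materials * maps_per_material * texture_bytes(texture_side)
    geometry = 1.5 * GB   # vertex/index buffers (assumed)
    targets  = 1.0 * GB   # G-buffer, shadow maps, post-process targets at 4K (assumed)
    rt_bvh   = 1.0 * GB   # ray-tracing acceleration structures (assumed)
    misc     = 0.5 * GB   # constant buffers, driver overhead (assumed)
    return (textures + geometry + targets + rt_bvh + misc) / GB

print(f"~2K textures resident: {scene_estimate_gb(2048):.1f} GB")
print(f"~4K textures resident: {scene_estimate_gb(4096):.1f} GB")
```

Even with made-up numbers like these, the texture pool dominates, and going from 2K to 4K textures is exactly the kind of setting that pushes a scene past 10 GB while costing almost nothing in frame rate.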

If someone wants to buy a GPU to last them the whole console gen, I'd very much recommend against buying a 10 GB card, especially as we know 16 GB and 20 GB cards are on the way as soon as October/November. If you just want a GPU for Cyberpunk and would upgrade again in 2 years, then yeah, go for it.
 

ethomaz

Banned
You need to create new tricks, but as a generational jump it is really small.

Most devs wanted more, but RAM is not that cheap.
 

ACESHIGH

Banned
It should be enough to have good-looking textures at 1080p for a long time.
This is something I have wondered about for a while: how good will the "low" settings visuals of AAA games be?
A lot of games these days look pretty good at low/mid settings. Picture next-gen ones.
 

Rentahamster

Rodent Whores
Is this amount of VRAM enough for next gen or will they become obsolete after we start seeing our first games developed exclusively for next gen?
That depends on what resolution and detail settings you're using.

You can already use a game like DOOM Eternal to really strain a GPU's memory capacity.
 

LordOfChaos

Member
Can't speak for games, but from a 3D modeling standpoint, when it comes time to render a scene, 8 GB of VRAM can get full pretty easily.

I was hoping for 16+ GB for the 3080. I might wait for the 3080 Ti.

That's because they will charge you up the butt for identical silicon with more VRAM attached in the Quadro series.

And that's where the 3090 gets interesting. Overkill for most gamers, but like some Titans before it, it serves as a great baby's-first heavy workstation card, with 24GB of VRAM at the very least.
 

skneogaf

Member
For 1080p and 1440p, yes; at 4K it's already reaching that amount of RAM.

I think VRS and other techniques are likely to reduce RAM usage, but I'd still rather have more than 10GB.

Which is why I'm not buying the 3080 and am instead waiting for the 3080 Ti, or choosing the 3090, which has more RAM than 4K gaming needs.
 
I read someone at reeee stating that FS2020 used 7-8GB with everything maxed out on his rig. He might be wrong though, idk.

Then his GPU only had 8GB of VRAM.

All maxed out at 4K it will use 12.5GB of VRAM (on a Radeon VII, which has 16GB of VRAM) and 22GB of system RAM.

Now some people will argue that it's not really "using" that much but rather "reserving" that much "because it can" ... Is this true?

I don't know which is true because I don't know of any way of testing whether a game is "using" all its VRAM or just "reserving" it.
 

skneogaf

Member
If you don’t plan on running 4K textures on modern games it will be enough.

With 16GB of VRAM I was still not able to run Resident Evil 2 Remake at max settings. I had to lower some features for stability because I demand ultra textures.

That game had incorrect readings on RAM usage: when you put the highest settings at 4K it said you'd use over 11GB (which my 2080 Ti has), but third-party applications that show GPU readings like RAM and FPS said it wasn't using anywhere near 11GB.
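If anyone wants to poke at this themselves on an Nvidia card, below is a rough sketch using the NVML Python bindings (pynvml). Like the overlay tools, it only reports what each process has allocated, not what the game actually touches each frame, so treat the numbers the same way:

```python
# Rough sketch: read VRAM numbers via NVIDIA's NVML (pip install pynvml).
# NVML reports allocations, not what a game actively touches per frame,
# so this still can't settle "using" vs "reserving" on its own.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"GPU total: {mem.total / 2**30:.1f} GiB, allocated: {mem.used / 2**30:.1f} GiB")

# Per-process graphics allocations (may be unavailable without admin rights).
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = proc.usedGpuMemory
    shown = f"{used / 2**30:.1f} GiB" if used is not None else "n/a"
    print(f"PID {proc.pid}: {shown}")

pynvml.nvmlShutdown()
```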
 

Patterson

Member
Flight Sim 2020 is like Crysis though; it's the absolute top of the range, so you don't use it as a benchmark for what the rest of the games will use/need.
I think it's worth considering if you're planning on holding onto the card for a few years. Of course, at anything less than 4K, sure, it's fine, but if he plans on playing maxed out at 4K it's possible to run into limits.

There are also rumors of 16 and 20 GB variants coming out later, maybe in the form of Ti or Super or whatever, but I think those are the ones to get.

Personally I think Nvidia wanted to target these prices to win back some goodwill early and get people excited to buy, and in doing so skimped on memory.
 

Kenpachii

Member
Then his GPU only had 8GB of VRAM.

All maxed out at 4K it will use 12.5GB of VRAM (on a Radeon VII, which has 16GB of VRAM) and 22GB of system RAM.

Now some people will argue that it's not really "using" that much but rather "reserving" that much "because it can" ... Is this true?

I don't know which is true because I don't know of any way of testing whether a game is "using" all its VRAM or just "reserving" it.

The only way to test it is to get GPUs with X amount of VRAM and see when it bottlenecks, by seeing massive stutters. Software can't be used.
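The closest a script gets you is spotting those stutters after the fact in a frame-time log. Here's a rough sketch that assumes you've exported per-frame times in milliseconds to a CSV (the column name is just a placeholder, match it to whatever your capture tool writes):

```python
# Minimal sketch: flag stutter spikes in an exported frame-time log.
# Assumes a CSV with per-frame times in ms under "msBetweenPresents"
# (PresentMon-style name); adjust the column to your capture tool.
import csv
import statistics

def stutter_report(path, column="msBetweenPresents"):
    with open(path, newline="") as f:
        times = [float(row[column]) for row in csv.DictReader(f)]

    avg = statistics.mean(times)
    p99 = sorted(times)[int(len(times) * 0.99)]      # 99th percentile frame time
    spikes = sum(1 for t in times if t > 3 * avg)    # frames over 3x the average

    print(f"avg {avg:.1f} ms | 99th percentile {p99:.1f} ms | big spikes: {spikes}")
    # If the spike count jumps when you raise texture settings while the
    # average barely moves, that's the classic sign of spilling past VRAM.

stutter_report("frametimes.csv")
```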
 

LED Guy?

Banned
No, I do not think the Xbox Series S RAM amount will be enough for the next generation of gaming. Goddammit, even the PS5 and Xbox Series X are barely enough. Developers are out here complaining about the Xbox Series S; they have been complaining about it since 2019, as Jason Schreier has been telling us for a long time.

I hope the Xbox Series S fails miserably so that developers just quit supporting it. Just think of a game like RDR 3, GTA 6, The Witcher 4, Control 2 (if there's one), and many others as well: it's not enough, it's just not enough. Microsoft is out here bottlenecking a whole generation.
 

EctoPrime

Member
With just 8GB of VRAM, the best-case scenario is being able to play a next-gen game at low settings in a few years. 16GB could become the minimum just to compensate for developers going nuts with console data streaming and slow PC drives.
 

Elog

Member
If you like good graphics you want as much VRAM as possible, either by adding more physical VRAM, by adding an advanced I/O solution for faster turn-around, or both. The VRAM pool holds back graphics a lot today, since a game's lowest specification still has to run on a 4 TFLOPS / 6GB VRAM / HDD PC. That limits the amount of geometry games run with, the number of textures in any given environment, and the resolution of those textures; the first two are not scalable but more or less set for any given game, and the lowest common denominator sets that bar.
 

Deleted member 17706

Unconfirmed Member
If you plan to game maxed out at 4K, I don't think it will be enough within a couple of years. 1440p high frame rate, though? I think 10GB will be more than enough.
 
I don't believe it is enough, certainly not for cards that cost $500 and $700 (and those are best-case prices given scarcity).

No one is saying that a 10GB card will have dramatically lower performance simply because a game requires more memory, just that there will certainly be games that run up against this limitation and performance will be impacted (at 4K especially).

We're right at the start of a new gen and 4K will be a resolution even more of us game at, so buying ludicrously expensive cards with the same amount of memory the RX 480 shipped with 4 years ago is quite stupid.

Also, they will drop in value once RDNA2 cards with over 10GB and Ampere cards with 16GB/20GB are revealed, which should be soon.
 

bbeach123

Member
4GB is barely enough for 1080p this gen, and the consoles only have around 8GB of RAM.

Next gen is 16GB at 1440p/4K, so my guess is 8GB will be barely enough later this gen.

But people will upgrade their GPUs within the next 3-4 years anyway, so it's not a problem. I'm sure 8GB will be enough for the next 4 years.
 

cormack12

Gold Member
Dwight Schrute: Oh, no no no, without the battery pack and the optional memory booster, it's barely three pounds.
Ryan Howard: How much memory does it have without the booster?
Dwight Schrute: 50L.
Ryan Howard: I'm sorry, "L"?
Jim Halpert: How much L to a K?
Dwight Schrute: You are really going to want the booster.
 

Orta

Banned
I'm waiting for the Ti variants and hoping they start at 16GB. Wouldn't touch the vanilla 3070/80; they'll be hamstrung by next-gen content/console ports.

Even something like the RE3 remake, if I'm not mistaken, goes over the 8GB VRAM mark.
 

Greeno

Member
Personally I am concerned that 10GB on the 3080 will be a problem. My current card has more.

Anyway, this is relevant, from the lead engine programmer at id Software.


This just continuously tells me that id Tech may not yet be ready for the I/O solutions provided in the new graphics cards and the new consoles. Another developer from id Software was saying that he was expecting 32 GB of memory in the new consoles (when talking about the Series S) and that we should have been at 128 GB.

It seems to me that UE5 is one of a really small set of engines ready for next-gen use. I'm sure the game engines at Sony will also be ready for this, as they seem to have gone all in on their I/O solution.
 

WakeTheWolf

Member
It already maxes out in some games like Warzone, and that's usage, not allocation. I wouldn't go with the 3080. Then again, Nvidia don't really care about screwing over buyers; just look at how the 2080 Ti became a meme lol.
 

ZywyPL

Banned
Duh, what year is it? I thought the days when people believed more RAM = more powerful GPU were two decades ago... Consoles will use something like 6-7GB for the GPU, and those are forced to render at 4K, so 8-10GB for FHD/QHD will be way more than enough. Not to mention the so-called "next-gen" games will stream data from the SSD anyway.
 

GHG

Gold Member
No I don't think it will.

We go through this same conversation at the start of every new generation. Remember when the 2GB on the 680 was supposed to be "enough"?
 
For 1080p and 1440p, yes; at 4K it's already reaching that amount of RAM.

I think VRS and other techniques are likely to reduce RAM usage, but I'd still rather have more than 10GB.

Which is why I'm not buying the 3080 and am instead waiting for the 3080 Ti, or choosing the 3090, which has more RAM than 4K gaming needs.
What do you mean the 3090 has too much RAM for 4K gaming? Not being sarcastic, I'm genuinely curious as to what you mean by that, because I'm leaning 3090 so I can be good for 4K this entire upcoming generation. Thinking the EVGA 3090 FTW3 to be specific. Any opinions?
 

Allandor

Member
Then his GPU only had 8GB of VRAM.

All maxed out at 4K it will use 12.5GB of VRAM (on a Radeon VII, which has 16GB of VRAM) and 22GB of system RAM.

Now some people will argue that it's not really "using" that much but rather "reserving" that much "because it can" ... Is this true?

I don't know which is true because I don't know of any way of testing whether a game is "using" all its VRAM or just "reserving" it.
Well, forget those numbers. Engines do packet loading and cache as much as RAM/VRAM is available, because games today can't assume a fast SSD in the system, so they load many, many things into memory that might be needed (or never will be).
So with fast streaming you would only need a fraction of the memory you need now.
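A toy sketch of what I mean: a streaming cache like the one below happily grows to whatever budget you hand it, so the reported "usage" mostly reflects the budget, not the per-frame working set (all names and numbers here are made up for illustration):

```python
# Toy texture-streaming cache: it fills whatever VRAM budget it is given,
# so reported "usage" tracks the budget, not the true per-frame working set.
from collections import OrderedDict

class TextureCache:
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.resident = OrderedDict()   # texture_id -> size, kept in LRU order
        self.used = 0

    def request(self, texture_id, size):
        """A frame needs this texture; keep it resident until space runs out."""
        if texture_id in self.resident:
            self.resident.move_to_end(texture_id)    # mark as recently used
            return
        while self.used + size > self.budget and self.resident:
            _, evicted_size = self.resident.popitem(last=False)  # evict LRU entry
            self.used -= evicted_size
        self.resident[texture_id] = size
        self.used += size

MB = 1024 ** 2
for budget_mb in (6_000, 16_000):                  # a "6GB" card vs a "16GB" card
    cache = TextureCache(budget_mb * MB)
    for frame in range(100):
        # Each frame touches ~2 GB of textures (100 x 20 MB), drifting slowly,
        # but nothing gets evicted until the budget is full.
        for tex in range(frame * 10, frame * 10 + 100):
            cache.request(tex, 20 * MB)
    print(f"budget {budget_mb} MB -> reported usage {cache.used // MB} MB")
```

Same game, same frames, wildly different "VRAM used" just because the budget changed.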
 
Here's something to consider: every time people said "X RAM is enough" they ended up with egg on their face. This gen. The gen before it. And on and on it goes. Sadly, if you go back and watch those arguments play out, you see the same discussions today, almost to the letter, as if what was said back then doesn't matter and just has to be repeated.

The simple facts are: VRAM usage is heavily dependent on the game, the resolution, the settings, and the other features made available. There is no universal answer, and we don't know how next-gen games will use VRAM or what settings they will and won't allow.

All I know is that at 4K there are several games which struggle with "just" 8 GB. Hell, there are situations where there are some issues even at 1440p. So looking at it like that, I wouldn't personally buy the 10 GB 3080 unless I expected to sell it off before the next architecture hits. It just makes no sense: texture quality costs almost nothing in terms of performance but has huge image quality implications, same for various image streaming options. Those are just not things I'd want to turn down, ever.
Something else to consider is this: as you add RT you further increase memory requirements, which may tip you over what you would otherwise fit under with just rasterisation. So then you have to make further trade-offs. And don't expect DLSS to save your ass, because not every game will have it, and there will still be major games that partner with AMD, so you can expect Nvidia features not to be there for those.

People who think RTX IO (DirectStorage) will help with this fundamentally misunderstand what that technology does; as said above by id's lead engine programmer (and even Nvidia themselves), it's in no way a substitute for memory. It's simply too slow.
Likewise, improved compression and increased memory bandwidth really end up doing the same thing, getting data into memory faster, but you still need a certain amount per "scene" below which you cannot go and then compensate with more bandwidth.

If someone wants to buy a GPU to last them the whole console gen, I'd very much recommend against buying a 10 GB card, especially as we know 16 GB and 20 GB cards are on the way as soon as October/November. If you just want a GPU for Cyberpunk and would upgrade again in 2 years, then yeah, go for it.
OK, so I'm fairly new to PC gaming and you're talking the language I'm trying to hear, so I'd like to hear your opinion if it's OK with you. I have a 2080 Ti build from 2 years back and I'm looking to upgrade again because clearly 4K with ray tracing on the 2080 Ti is not that realistic; it's obviously a much better 1440p card. So as far as the new cards go I'm thinking 3090. Not to sound arrogant, but money isn't an issue for me. I'm leaning specifically towards the EVGA 3090 FTW3 based on the little bit of research I've done. Is it safe to assume that will cover the true next-gen games that we're set to see soon enough? And I'm referring to 4K specifically. Your thoughts seem to cut to the chase, so if you wouldn't mind 🙏 Open to any suggestions you may have. Please and thanks.
 

kikii

Member
If you don’t plan on running 4K textures on modern games it will be enough.

With 16GB of VRAM I was still not able to run Resident Evil 2 Remake at max settings. I had to lower some features for stability because I demand ultra textures.
Is your CPU or RAM the bottleneck?
 

Rikkori

Member
OK, so I'm fairly new to PC gaming and you're talking the language I'm trying to hear, so I'd like to hear your opinion if it's OK with you. I have a 2080 Ti build from 2 years back and I'm looking to upgrade again because clearly 4K with ray tracing on the 2080 Ti is not that realistic; it's obviously a much better 1440p card. So as far as the new cards go I'm thinking 3090. Not to sound arrogant, but money isn't an issue for me. I'm leaning specifically towards the EVGA 3090 FTW3 based on the little bit of research I've done. Is it safe to assume that will cover the true next-gen games that we're set to see soon enough? And I'm referring to 4K specifically. Your thoughts seem to cut to the chase, so if you wouldn't mind 🙏 Open to any suggestions you may have. Please and thanks.
It will definitely cover next-gen, for sure. Obviously there could be games that push some extreme brute-force settings where even compute clusters wouldn't be enough, but it's unlikely we'll see that (in particular, being able to crank up ray counts would dramatically tank performance; most RT now is done with 1 ray per pixel or less). Not that it matters: if it's not happening on a 3090, it's not happening at all.
As for the 3090 FTW3, I think that will be a solid choice; they usually do very good, beefy air coolers. It would be my choice as well. But if you can wait, and since you have a 2080 Ti I don't see why not, I'd wait for the AIO versions. When you have a card that draws this much power, watercooling becomes much more efficient than air cooling, and the Hybrid cards are coming with 240mm radiators, which would be plenty; that way you don't have to set up a watercooling loop on your own, and it's much easier to install. Those will be more of an Oct/Nov launch afaik.
 