
Is 12GB VRAM next on the chopping block, or does it still have some life?

How long do you think 12GB VRAM will last now?


• Total voters: 144

HL3.exe

Member
Bit of an overreaction. Even 8GB is still fine for a couple of years in most games, using console-equivalent settings.

Think of it as: when the next consoles hit, the 'lowest common denominator' ceiling goes up a bit. That's generally when most hardware specs need a refresh. So around 2027. Before that, nah. Plenty of scalability options until then.
 

rodrigolfp

Haptic Gamepads 4 Life
No, I didn't have any problems. The game ran flawlessly for me at high settings at 1440p, with no major impact on image quality.

The mindset of 'everything needs to be completely maxed out to be considered acceptable' is laughably foolish.
"Flawlessly" is one thing. "Acceptable" is another.
 

Silver Wattle

Gold Member
16GB is the sweet spot ATM. 12GB will be fine for at least 2 more years, but once the RT/AI stuff starts kicking up, 12GB will start faltering.
For your situation I suggest waiting until next year to upgrade. Buying sooner is usually preferable, but cyclically the current timing is bad for those with a decent card, since you don't have any real necessity to upgrade.
 

RaySoft

Member
I'm a bit worried about VRAM capabilities now that 8GB is fully outdated and relegated to budget tier. I want a 4070 Super, but paying that much for 12GB of VRAM that could be outdated in 2 or 3 years sounds like a bad decision imo.
Buy a 4070 Ti Super with 16GB of VRAM. Case closed. Try to match the memory specs of the consoles, and you will be golden for the whole generation.
 

AV

We ain't outta here in ten minutes, we won't need no rocket to fly through space
That's not exactly a good example. The game is old. Take a look at The Last of Us 2 pre-patch, or even partly today.

Should have specified Phantom Liberty as well, which seemed more demanding; Alan Wake 2 ran well too. I'm sure it might choke a bit with TLOU, but we're talking about "graphically intensive 4K 60fps games with bells and whistles on" here - 12GB of VRAM is going to be more than enough for people aiming for anything less, for some time. Frankly, if you're aiming for those sorts of specs you've probably already bought a card with 16GB+; I just went a bit cheaper because DLSS is doing most of the work in a lot of these games anyway. Hogwarts was lovely.
 

Kataploom

Gold Member
This is how I see it:

12GB for 1080p
16GB for 1440p
16GB+ for 4K

Haven't had any issues with 12GB at 1440p at all; it's always far from maxed out. Most games top out around 10GB, and those tend to be the demanding or badly optimized ones; most don't even reach 7GB on ultra textures, actually. 12GB of VRAM won't be an issue for a while. You should be more worried about the processing power to push UE5 games than about VRAM if you have a 12GB card.
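If anyone wants to sanity-check their own usage instead of taking my word for it, here's a minimal Python sketch that polls the NVIDIA driver while you play (assuming the nvidia-ml-py package, imported as pynvml, and GPU 0; the one-second interval is arbitrary):

```python
# Minimal VRAM monitor: polls total/used memory on GPU 0 once per second
# and tracks the peak. Assumes the nvidia-ml-py package (import: pynvml).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

peak = 0
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        peak = max(peak, mem.used)
        print(f"used {mem.used / 2**30:.2f} GiB / {mem.total / 2**30:.2f} GiB "
              f"(peak {peak / 2**30:.2f} GiB)")
        time.sleep(1.0)  # arbitrary sampling interval
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

One caveat: the driver reports allocations, not what the game actually needs, and many games grab more than they touch, so treat the number as an upper bound.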
 

squarealex

Member
Just about the worst card you could have picked.
The worst card for what, and for whom? I mainly picked it because I needed 16GB for Stable Diffusion. The card only cost $250 after I sold my 2060S, and I get more rasterization power plus all the Nvidia bullshit people love, like DLSS / Frame Generation.

I just pity the people picking the 3060 8GB (the biggest scam card ever) or the 3060 12GB thinking it's good stuff.
 
I'm a bit worried about VRAM capabilities now that 8GB is fully outdated and relegated to budget tier. I want a 4070 Super, but paying that much for 12GB of VRAM that could be outdated in 2 or 3 years sounds like a bad decision imo.
Depends on what you want it for. Eventually it won't be enough, as always, but 12GB is plenty for 1440p. 16GB is more future-proof, and that's what's necessary for 4K right now at max settings.

Keep in mind this is only at the most unoptimized settings (ultra); once you drop down to high or medium, your VRAM requirements go down. If I were you, I'd save up and wait for the 50 series at the end of the year.
 
The worst card for what, and for whom? I mainly picked it because I needed 16GB for Stable Diffusion. The card only cost $250 after I sold my 2060S, and I get more rasterization power plus all the Nvidia bullshit people love, like DLSS / Frame Generation.

I just pity the people picking the 3060 8GB (the biggest scam card ever) or the 3060 12GB thinking it's good stuff.
The 4060 Ti is decent for AI. The problem is that it trades a significant amount of memory bandwidth for L2 cache, which is fine for games but not so good for AI. That makes it slower than it otherwise could be, since consumer LLMs (70B parameters and smaller) tend to be VRAM-limited first and bandwidth-limited second.
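Rough back-of-envelope on why bandwidth is the second wall: at decode time a local LLM streams more or less all of its weights from VRAM for every generated token, so bandwidth divided by weight size gives a ceiling on tokens/sec. A sketch using the 4060 Ti's 288 GB/s spec (the model sizes are ballpark 4-bit quantized figures, not measurements):

```python
# Upper-bound estimate of LLM decode speed when memory-bandwidth-bound:
# each generated token streams roughly all weights from VRAM once.
BANDWIDTH_GB_S = 288.0  # RTX 4060 Ti spec-sheet memory bandwidth

# Approximate 4-bit quantized weight sizes (0.5 bytes/parameter), in GB.
models = {
    "7B @ 4-bit": 7e9 * 0.5 / 1e9,    # ~3.5 GB
    "13B @ 4-bit": 13e9 * 0.5 / 1e9,  # ~6.5 GB
    "34B @ 4-bit": 34e9 * 0.5 / 1e9,  # ~17 GB -- already over 16 GB of VRAM
}

for name, weights_gb in models.items():
    ceiling = BANDWIDTH_GB_S / weights_gb
    print(f"{name}: ~{weights_gb:.1f} GB weights, <= ~{ceiling:.0f} tok/s ceiling")
```

So first the weights have to fit at all (VRAM-limited), and after that the 288 GB/s sets the speed limit, which is exactly the trade described above.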
 

MastAndo

Member
I just picked up a 4070 Super since the price was right (relatively) and I wanted something that would fit nicely in an SFF case. My 2080 Super was still serviceable but it was cramped in my case and started running super hot. It sounded like a jet engine when in heavy use and I couldn't deal with it.

4K is cool and the 4070 Ti Super was tempting and all, but I've always been content with 1440p with all the graphical bells and whistles, especially given my sitting distance from a 55" OLED. Maybe I'm blind, but I've never noticed all that much of a difference between 1440p and 4K with my setup.

I'll take a glance at the 6000 or 7000 series cards when they drop to see what's out there, but I'm thinking I'm set for 3-4 years, all things considered.
 
No, I didn't have any problems. The game ran flawlessly for me at high settings at 1440p, with no major impact on image quality.

The mindset of 'everything needs to be completely maxed out to be considered acceptable' is laughably foolish.
Yeah, it's dumb. Let me buy hardware and dump the most unoptimized, unnoticeable settings on the planet onto it just so I can go online and yell "MAX SETTINGS". A waste of energy, silicon, and money.

Caveman behavior.
 

Mithos

Member
Just about the worst card you could have picked.
Might be; it might also be the best some can get within their budget.
I mean, it's €500+ (US$540+) for a 4060 Ti 16GB here, and atm the 4070 is DEAD since the 4070 Super is at the exact same price, but those cards are still €650+ (US$700+).
 
This helped a lot. I think I'll be fine saving and getting a 4070/4070 super then.

The Ti/Ti Super felt too expensive, but 16GB would be really future-proof.
If I were you and I had an OK card right now, I'd just save my money for a 50 series card. Buying this late into a GPU generation is not a good move and pretty much never has been. If you wait till the 50 series cards are out, you can buy one of those or get a 40 series card for cheap.
 

Gaiff

SBI’s Resident Gaslighter
If I were you and I had an OK card right now, I'd just save my money for a 50 series card. Buying this late into a GPU generation is not a good move and pretty much never has been. If you wait till the 50 series cards are out, you can buy one of those or get a 40 series card for cheap.
Those cards could be 9 months away, and it could take another year for the 5070 to hit the market. Makes no sense to wait for a 50 series card when they're nowhere close to arriving.
 
Those cards could be 9 months away, and it could take another year for the 5070 to hit the market. Makes no sense to wait for a 50 series card when they're nowhere close to arriving.
If you have an OK card, what's the rush? Besides, card prices drop as soon as a new gen is announced, especially in the used market, where people knee-jerk sell their cards in anticipation.

Not to mention Battlemage and RDNA4 are launching this year; they might be much better than Alchemist/RDNA3 and could provide some price competition on the Nvidia side. This is the worst point in the GPU cycle to buy a new card; it has always been the case.

Best time to buy cards is at the start of a new gen.
 

Gaiff

SBI’s Resident Gaslighter
If you have an OK card, what's the rush? Besides, card prices drop as soon as a new gen is announced, especially in the used market, where people knee-jerk sell their cards in anticipation.

Not to mention Battlemage and RDNA4 are launching this year; they might be much better than Alchemist/RDNA3 and could provide some price competition on the Nvidia side. This is the worst point in the GPU cycle to buy a new card; it has always been the case.
He wants to upgrade to a 4070 so he probably has something like a 3070 at best.
Best time to buy cards is at the start of a new gen.
No. The Super cards came well over a year after the start of Lovelace and the start of the gen only had the 4090 and 4080. It’s nonsensical to tell someone to wait upwards of 9 months for a product. They’re not coming out next week so they’re a non-factor.
 

Hoddi

Member
It's not that 12GB won't be fine for the next couple of years. But I'm not sure I'd spend this kind of money on it in 2024.

My own 2080 Ti still does well with 11GB at 3440x1440 in most cases. But I've still seen a few games where it runs out when I enable advanced features like RT. 12GB is cutting it a bit close and PS5 Pro might also bump the higher-end requirements by another gigabyte or two.

I don't intend to replace my card any time soon. But I wouldn't take less than 16GB if I had to.
 

64bitmodels

Reverse groomer.
Those cards could be 9 months away and it could take another year for the 5070 to hit the market. Makes no sense to wait for a 50 series card when they’re not close to arrive.
I don't think I'm ever gonna be able to get the money for this hardware till months down the line anyway, and I'm fine with my hardware being a bit outdated.
He wants to upgrade to a 4070 so he probably has something like a 3070 at best.
6650 XT. I also have a 5600X and little plan to upgrade that, so I don't want to get a card so powerful that it'll be bottlenecked.
 

WitchHunter

Banned
I'm a bit worried about VRAM capabilities now that 8GB is fully outdated and relegated to budget tier. I want a 4070 Super, but paying that much for 12GB of VRAM that could be outdated in 2 or 3 years sounds like a bad decision imo.
You can always pay a visit to the local chop-shop electronics guru to upgrade the RAM to 24GB.
 

Rocinante618

Neo Member
A 16GB frame buffer is the safest option for the long term, so basically an RTX 4070 Ti Super or RTX 4080. I feel the 4090 is way, way overpriced at the moment.
 

Bergoglio

Member
Like I said, it's not about 4GB of VRAM; it's that more than 12GB can be used, and the 12GB will bottleneck as a result. If games already require 12GB, it doesn't take a lot to hit a wall sooner rather than later.



https://www.tomshardware.com/video-...sis-another-game-that-can-exceed-8gb-vram-use

The 12GB cards all appear to be sufficient for 1440p very high, at least in terms of VRAM capacity, but the RTX 3080 10GB does show a drop in minimum fps relative to the 4070.

Now imagine adding RT; it will peg that 12GB of VRAM even harder.

Like I said, if you want to play it safe, get 12GB for 1080p and 16GB for 1440p.

All I am really saying is that 12GB is not a safe amount to last into the future at 1440p; better to go for more VRAM if you sit on a GPU for longer.
24GB of VRAM is just perfect for 4K and full RT.
 
Those cards could be 9 months away, and it could take another year for the 5070 to hit the market. Makes no sense to wait for a 50 series card when they're nowhere close to arriving.
Yeah, it's like not buying a PS5 and waiting for the PS6.

There's ALWAYS new tech coming out. You'll be waiting forever for the next thing.
 

SF Kosmo

Al Jazeera Special Reporter
12GB will be enough for at least the remainder of this console gen.

Already, the biggest VRAM liability is shoddy ports from consoles with unified RAM, rather than legitimate technical need. But because of that, those demands aren't likely to be driven up until the base spec they're porting from goes up.

As far as well-optimized native PC stuff goes, there's really not as much pressure: DLSS has ended the resolution wars, and texture compression is improving. Increasing RT use has added some need for VRAM, but the fact is that devs who want to make things work in 12GB will be able to for the foreseeable future.
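To put a rough number on the DLSS point: most per-frame render targets scale with the internal resolution, not the output, and DLSS Quality renders at about 67% per axis. A toy sketch (the "four targets at 8 bytes per pixel" figure is an invented ballpark, not any real engine's G-buffer layout):

```python
# Toy illustration: render-target memory scales with internal resolution.
# NUM_TARGETS and BYTES_PER_PIXEL are invented ballpark numbers, not a
# real engine's G-buffer layout.
NUM_TARGETS = 4
BYTES_PER_PIXEL = 8

def targets_mb(width, height):
    return width * height * BYTES_PER_PIXEL * NUM_TARGETS / 2**20

native = targets_mb(3840, 2160)   # native 4K
quality = targets_mb(2560, 1440)  # DLSS Quality for 4K output, ~67% per axis
print(f"native 4K:    ~{native:.0f} MB of render targets")
print(f"DLSS Quality: ~{quality:.0f} MB (~{quality / native:.0%} of native)")
```

Textures usually dominate VRAM and don't scale with resolution, so the total saving is smaller than that ratio suggests, but it's real headroom and part of why 12GB holds up at upscaled 4K.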
 

Sentenza

Member
I'm on a 3080 Ti with 12GB right now.
I won't accept anything less than 16GB for my future 5080, but I would ideally aim above that: 20 or 24GB.
 
He wants to upgrade to a 4070 so he probably has something like a 3070 at best.

No. The Super cards came well over a year after the start of Lovelace and the start of the gen only had the 4090 and 4080. It’s nonsensical to tell someone to wait upwards of 9 months for a product. They’re not coming out next week so they’re a non-factor.
Yep, the best time to buy is at the start of a gen for your product segment. So if you're an xx60 buyer, that's whenever the new xx60 card comes out. That will give you the lowest prices on the old-gen xx60, plus the best performance and features of the newer gen, along with the longest lifetime/relevance.


If you bought a 670 (Kepler) in 2012, that gave you a full 2 years of the latest tech. Buying a 770 (just a refresh of Kepler, like a Super card) in 2013 would give you a minimal bump in performance and no new architectural improvements (such as efficiency, NVENC, tensor cores, RT cores, etc.) or features, because it's still Nvidia selling you Kepler again. But if you waited for the next generation, i.e. the 970 (Maxwell), you'd get a bigger jump in performance and significant architectural improvements and features, such as Maxwell supporting Nvidia Reflex while the 2013 770 doesn't.


You could also opt to wait two generations to upgrade, meaning if you bought a 670 at launch in 2012 you could skip the 970 in 2014 and buy the 1070 (Pascal) in 2016, giving you a big leap in performance, architectural improvements, and features. Buying at the start of the gen for your card line gives you a full lifetime to experience the best tech, meaning you'd get 4 years out of a 670. This has always been the case with Nvidia: buying an 8000 series (Tesla) card at launch was a better deal than waiting for a 9000, 100, or 200 series card, and then upgrading that 8000 series card to the next-generation 400 series (Fermi), or making the two-generation leap to the 600 series (Kepler), was the right move.
 

//DEVIL//

Member
It really depends: for what? 4K gaming? It was obvious from the day games like The Last of Us remake hit PC that you'd need at least 16 gigs for 4K with ray tracing.

If you're gaming at 2K? No, your 12GB is fine for 2 or 3 years to come.

If I were to buy a card now that is not a 4090 / 5090 (considering the 5080 is probably the same performance as a 4090, give or take), then I would pick a 4070 Ti Super.

Or a used 4080 / 4090, depending on your budget.

Or... a used 7900 XTX.
 

Bojji

Member
It really depends: for what? 4K gaming? It was obvious from the day games like The Last of Us remake hit PC that you'd need at least 16 gigs for 4K with ray tracing.

If you're gaming at 2K? No, your 12GB is fine for 2 or 3 years to come.

If I were to buy a card now that is not a 4090 / 5090 (considering the 5080 is probably the same performance as a 4090, give or take), then I would pick a 4070 Ti Super.

Or a used 4080 / 4090, depending on your budget.

Or... a used 7900 XTX.

After a few "pinnacle of optimization" games last year (The Last of Us, Hogwarts, RE4, Ratchet), VRAM requirements actually dropped to a "normal" level for most new games. UE5 games, for example, usually don't go above 8GB.

The PS5 has 12.5GB of memory available to games, and that's for both VRAM and RAM tasks (GPU and CPU needs), so at least at PS5-like settings, 12GB cards should last the entire generation. But yeah, RT adds a potential unknown factor to that, and DLSS 3 is actually the worst VRAM sucker out there.
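Spelling out that console math (the CPU-side share below is a pure guess for illustration; real splits vary per game):

```python
# PS5-equivalence back-of-envelope. 12.5 GB is the commonly cited pool
# available to games; the CPU-side share is an assumed, illustrative split.
PS5_GAME_POOL_GB = 12.5
ASSUMED_CPU_SIDE_GB = 4.0  # guess: game logic, audio, streaming buffers

vram_like_budget = PS5_GAME_POOL_GB - ASSUMED_CPU_SIDE_GB
print(f"implied VRAM-like budget: ~{vram_like_budget:.1f} GB")
# ~8.5 GB under this assumption -- comfortably inside a 12 GB card at
# console-equivalent settings, which is the point above.
```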
 

Gaiff

SBI’s Resident Gaslighter
It really depends: for what? 4K gaming? It was obvious from the day games like The Last of Us remake hit PC that you'd need at least 16 gigs for 4K with ray tracing.

If you're gaming at 2K? No, your 12GB is fine for 2 or 3 years to come.

If I were to buy a card now that is not a 4090 / 5090 (considering the 5080 is probably the same performance as a 4090, give or take), then I would pick a 4070 Ti Super.

Or a used 4080 / 4090, depending on your budget.

Or... a used 7900 XTX.
A 5080 will very likely be much stronger than a 4090 unless we get a Turing 2.0 situation going on. Traditionally, the 80 card has always significantly outperformed the previous king of the hill.

780 > 680 (the 690 was just a dual 680)
980 > 780 Ti
1080 > 980 Ti
3080 > 2080 Ti
4080 > 3090

The sole exception is the 2080, which was originally on par with the 1080 Ti but is now quite a bit faster.
 
This becomes a problem only if you are obsessed with maxing out the game presets.

It never made sense to me: how many people can actually tell "high" from "very high/ultra"?
 

Caffeine

Member
I see a few GPUs coming out in 12GB configurations, so it seems it'll be a standard for at least the next half decade.
 

//DEVIL//

Member
A 5080 will very likely be much stronger than a 4090 unless we get a Turing 2.0 situation going on. Traditionally, the 80 card has always significantly outperformed the previous king of the hill.

780 > 680 (the 690 was just a dual 680)
980 > 780 Ti
1080 > 980 Ti
3080 > 2080 Ti
4080 > 3090

The sole exception is the 2080, which was originally on par with the 1080 Ti but is now quite a bit faster.
Yeah, but the problem is the 4090 is/was too powerful for its own good, even compared to the 4080.

The latest rumors suggest the 5090 is about 60% faster than the 4090 in theoretical numbers.

Assuming that number holds, I'd guess the 5080 is on par with or 10-15% faster than the 4090, but with 16 or 20 gigs vs 24. (Unless Nvidia decides to go above 24 gigs for the 5090 and give 24 to the 5080, which I don't see happening, it being Nvidia and all.)

I am not going to upgrade from the 4090 to the 5000 series unless they add an AI feature that plays as big a role as DLSS did, and that feature is 5000-exclusive. Then I'll kinda want to upgrade.
 
Another shitty PlayStation port came out, which will inevitably get fixed, and people are freaking out again. The overall speed of the card is going to be the issue down the road in 99% of games, not the VRAM.

According to Steam, less than 5% of gamers have more than 12GB. 8GB is far from dead.
 