
8GB of VRAM is not enough even for 1080p gaming.

Spyxos

Gold Member


I have the RTX 3070 and RTX 3060 Ti, both used for 1080p gaming, and as you can see in this video, 8GB of VRAM is not enough at all, even at such a low resolution. Don't even get me started on higher resolutions.

00:00 - Welcome back to Hardware Unboxed
01:25 - Backstory
04:33 - Test System Specs
04:48 - The Last of Us Part 1
08:01 - Hogwarts Legacy
12:55 - Resident Evil 4
14:15 - Forspoken
16:25 - A Plague Tale: Requiem
18:49 - The Callisto Protocol
20:21 - Warhammer 40,000: Darktide
21:07 - Call of Duty Modern Warfare II
21:34 - Dying Light 2
22:03 - Dead Space
22:29 - Fortnite
22:53 - Halo Infinite
23:22 - Returnal
23:58 - Marvel’s Spider-Man: Miles Morales
24:30 - Final Thoughts

 

Spyxos

Gold Member
I'm running The Last of Us on an 8GB RTX 3060 Ti at 1440p, very high settings, at around 85-90fps.

But I think I'll try to get an RTX 5xxx late next year with at least 16GB of VRAM.
04:48 - The Last of Us Part 1
 

Beelzebubs

Member
Cool, you can play a game released before the 3070.
Now try something demanding from 2023 and see how that goes.

BTW, have you seen that Jedi Survivor has a minimum requirement of 8GB of VRAM?
I have, but I'm too cheap to buy it when it first comes out. I'll wait till it's on Game Pass Ultimate and hope it's optimized by then. A lot of stuff at the moment does have huge VRAM requirements, but I'm hoping there will be more refinement, especially with UE5 games starting to come out. I've also got DLSS as an option for scaling; it doesn't have to be native 4K.
 

winjer

Gold Member
I have, but I'm too cheap to buy it when it first comes out. I'll wait till it's on Game Pass Ultimate and hope it's optimized by then. A lot of stuff at the moment does have huge VRAM requirements, but I'm hoping there will be more refinement, especially with UE5 games starting to come out. I've also got DLSS as an option for scaling; it doesn't have to be native 4K.

VRAM usage is not going down. It's going up, as it always has since the first graphics accelerators.
Remember that consoles have 16GB of unified memory, and they are the base for almost every game.
 

Bojji

Member
I have, but I'm too cheap to buy it when it first comes out. I'll wait till it's on Game Pass Ultimate and hope it's optimized by then. A lot of stuff at the moment does have huge VRAM requirements, but I'm hoping there will be more refinement, especially with UE5 games starting to come out. I've also got DLSS as an option for scaling; it doesn't have to be native 4K.

DLSS and FSR don't change VRAM usage that much. Most of it is just game data, and you can't reduce it below a certain amount; potato textures may help, but probably not always.

Nvidia fucked over 3070 Ti/3070/3060 Ti owners, and they did it willingly; even the fucking 3060 had 12GB.
 

01011001

Banned
my 3060ti must have imagined playing Cyberpunk with RT reflections and Alex's optimised settings at 1440p DLSS Quality mode then...

weird...
and even then the VRAM wasn't my limiting factor. it was mostly a CPU and RT performance limitation.

Plague Tale Requiem also runs fine, with again the limiting factor not being the VRAM but indeed the rasterization performance of the card

let's rewrite that headline:
"8 GB of VRAM is not enough even for 1080p gaming, if the developers suck at making a PC version"
 

Spyxos

Gold Member
my 3060ti must have imagined playing Cyberpunk with RT reflections and Alex's optimised settings at 1440p DLSS Quality mode then...

weird...
and even then the VRAM wasn't my limiting factor. it was mostly a CPU and RT performance limitation.

Plague Tale Requiem also runs fine, with again the limiting factor not being the VRAM but indeed the rasterization performance of the card

let's rewrite that headline:
"8 GB of VRAM is not enough even for 1080p gaming, if the developers suck at making a PC version"
This is not about Cyberpunk 2077; I played it without any problems as well. But the game is now almost 3 years old. This is about new games.
 

01011001

Banned
This is not about Cyberpunk 2077; I played it without any problems as well. But the game is now almost 3 years old. This is about new games.

when new games are less ambitious, yet have more issues, it's not the hardware that's the issue, it's shit developers.

and Plague Tale Requiem also has no VRAM issues on the same card, and runs just fine, while looking miles ahead of games that do struggle
 

T4keD0wN

Member
Yeah, Nvidia releasing cards with the same amount of VRAM as the RX 480 years later, at multiple times its price, is pathetic.
Although since it's enough for Plague Tale Requiem without RT and Cyberpunk 2077, I'd say it's more of a game problem.
 

Spyxos

Gold Member
when new games are less ambitious, yet have more issues, it's not the hardware that's the issue, it's shit developers.

and Plague Tale Requiem also has no VRAM issues on the same card, and runs just fine, while looking miles ahead compared to games that do struggle
I can't say anything about Plague Tale Requiem on PC; I played it on console, but yes, it is very pretty.
 

SmokedMeat

Gamer™
I'm running The Last of Us on an 8GB RTX 3060 Ti at 1440p, very high settings, at around 85-90fps.

But I think I'll try to get an RTX 5xxx late next year with at least 16GB of VRAM.

The Last of Us easily eats up over 10GB of VRAM. Are you on DLSS Performance mode with some turned-down settings?
 

SeraphJan

Member
when new games are less ambitious, yet have more issues, it's not the hardware that's the issue, it's shit developers.

and Plague Tale Requiem also has no VRAM issues on the same card, and runs just fine, while looking miles ahead compared to games that do struggle
I prefer Plague Tale Requiem's graphics over The Last of Us Part 1's too.
 

winjer

Gold Member
Yeah, Nvidia releasing cards with the same amount of VRAM as the RX 480 years later, at multiple times its price, is pathetic.
Although since it's enough for Plague Tale Requiem and Cyberpunk 2077, I'd say it's more of a game problem.

It's worse than that. The first mainstream GPU with 8GB of VRAM was the R9 390, from 2015, with an MSRP of around US$320.
Today Nvidia is charging US$500-700 for 8GB GPUs.
 

yamaci17

Member
you can play TLOU with high textures at 1440p or 4K DLSS Performance. 1080p is a cakewalk



- Disable hardware acceleration for Discord + Steam if you use them (the most critical ones)

- Untick GPU accelerated rendering in web views in Steam's settings


- Open CMD and run " taskkill /f /im dwm.exe ". Don't worry, it won't kill DWM for good; Windows restarts it immediately. This will reduce dwm.exe's VRAM usage if your system has been on for a while. You can create a batch file if you want and keep it on your desktop; run it before launching a game.
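The batch file mentioned in that tip could be as simple as this (the file name is my own choice; any name works):

```bat
@echo off
REM reset_dwm.bat -- resets the Desktop Window Manager to reclaim idle VRAM.
REM Windows restarts dwm.exe automatically, so this does not kill the desktop.
taskkill /f /im dwm.exe
```

Save it on the desktop and double-click it before launching a game.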
- Go to Task Manager, Details tab, select columns, and tick "Dedicated GPU memory usage". Observe what gobbles up your VRAM and turn off everything you can.

Ideally you can have:
a) 150-200 MB idle VRAM usage at 1080p/single screen
b) 250-350 MB idle VRAM usage at 1440p/single screen
c) 400-600 MB idle VRAM usage at 4K/single screen


If you can get your idle VRAM usage to 300 MB at 1440p, you can use 7.4GB worth of texture/game data and run the game smoothly without problems (evidenced by the video, and even recording itself takes VRAM).
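To make the arithmetic behind that 7.4GB figure explicit, here's a quick sketch; the ~300 MB safety headroom for the driver/recording overhead is my own assumption, not a measured number:

```python
# Rough VRAM budget: how much of an 8 GB card a game can actually use
# once idle/background usage is subtracted.
# The 300 MB headroom is an assumed safety margin, not a measured value.

def usable_vram_mb(card_mb: int, idle_mb: int, headroom_mb: int = 300) -> int:
    """Return the VRAM (in MB) left over for game data."""
    return card_mb - idle_mb - headroom_mb

CARD_MB = 8 * 1024  # an 8 GB card such as a 3070 / 3060 Ti

for idle in (200, 300, 600):
    gb = usable_vram_mb(CARD_MB, idle) / 1024
    print(f"idle {idle} MB -> ~{gb:.1f} GB left for the game")
```

With 300 MB idle you land at roughly 7.4GB, which matches the figure above; every extra 100 MB of background clutter comes straight out of the game's budget.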

- If you're on W11 and don't use widgets, uninstall them, as they use 100 to 200 MB of VRAM:
Open PowerShell with admin rights
winget uninstall "windows web experience pack"
If it gets installed again through the Microsoft Store, disable it via the Group Policy Editor.

- Or upgrade if you have the money. If you can't reduce your idle VRAM usage below 500 MB, or simply don't want to, devs won't cater to your multitasking needs.

I warned everyone back in 2020 that you would have to sacrifice textures / multitasking ability. I bought a 3070 at MSRP at launch knowing that I'd have to turn things down from Ultra; however, you can still get decent/acceptable image quality out of it.

I will keep using my GPU with the aforementioned tricks, as long as I'm not forced into PS2 textures.
 

K2D

Banned
Sounds more like an architecture/optimization (consoles/SoC) issue than naturally higher spec requirements. Got to wait this out before I jump back into pc gaming.
 

Spyxos

Gold Member
you can play TLOU with high textures at 1440p or 4K DLSS Performance. 1080p is a cakewalk



- Disable hardware acceleration for Discord + Steam if you use them (the most critical ones)

- taskkill /f /im dwm.exe (resets DWM); it will release some VRAM back if your system has been on for a while. You can create a batch file if you want.
- Go to Task Manager, Details tab, select columns, and tick "Dedicated GPU memory usage". Observe what gobbles up your VRAM and turn off everything you can there.

Ideally you can have:
a) 150-200 MB idle VRAM usage at 1080p/single screen
b) 250-350 MB idle VRAM usage at 1440p/single screen
c) 400-600 MB idle VRAM usage at 4K/single screen

If you can get your idle VRAM usage to 300 MB at 1440p, you can use 7.4GB worth of texture/game data and run the game smoothly without problems (evidenced by the video, and even recording itself takes VRAM).

- If you're on W11 and don't use widgets, uninstall them, as they use 100 to 200 MB of VRAM:
Open PowerShell with admin rights
winget uninstall "windows web experience pack"
If it gets installed again through the Microsoft Store, disable it via the Group Policy Editor.

- Or upgrade if you have the money. If you can't reduce your idle VRAM usage below 500 MB, or simply don't want to, devs won't cater to your multitasking needs.

Are there any disadvantages to manually freeing VRAM from the OS like this?
 

yamaci17

Member
Are there any disadvantages to manually freeing VRAM from the OS like this?
No, there isn't. Killing dwm.exe resets it as if you had restarted the computer. Ideally, a computer has its lowest idle VRAM usage right after a clean restart, but over time dwm.exe gets bloated with unnecessary VRAM usage. There are no downsides; I've been using these tactics since 2020 to play at 4K with ray tracing comfortably in many titles.

It's just a shortcut to return to idle VRAM usage without needing to restart the computer.

Analyzing Task Manager > Details > dedicated GPU memory usage is a must.
 

k_trout

Member
No matter how I look at this, it seems like really bad business practice to release a game which 80% of your target audience can't play.
Good luck with unrealistic VRAM requirements; the majority of PC gamers are just going to play other things.
 

Bojji

Member
my 3060ti must have imagined playing Cyberpunk with RT reflections and Alex's optimised settings at 1440p DLSS Quality mode then...

weird...
and even then the VRAM wasn't my limiting factor. it was mostly a CPU and RT performance limitation.

Plague Tale Requiem also runs fine, with again the limiting factor not being the VRAM but indeed the rasterization performance of the card

let's rewrite that headline:
"8 GB of VRAM is not enough even for 1080p gaming, if the developers suck at making a PC version"

when new games are less ambitious, yet have more issues, it's not the hardware that's the issue, it's shit developers.

and Plague Tale Requiem also has no VRAM issues on the same card, and runs just fine, while looking miles ahead compared to games that do struggle

You should check A Plague Tale: Requiem in the video in the OP; it's fine without RT on the 3070, but with RT enabled, performance tanks to single digits.

Cyberpunk is old at this point, and it never had very good textures in the vanilla version.
 

DaGwaphics

Member
8GB will likely continue to be fine at 1080p. People forget that you can turn things down from "Ultra" to "High" with little image quality degradation in most cases, and what that does for the 8GB cards at 1080p is immeasurable. There are simply too many 8GB cards in use for developers to drop support there within this console generation, especially considering the likely best sellers from the 4000 series and 7000 series will be 8GB as well.

Certainly for higher resolutions though, you'll probably need more.
 

Bojji

Member
No matter how I look at this, it seems like really bad business practice to release a game which 80% of your target audience can't play.
Good luck with unrealistic VRAM requirements; the majority of PC gamers are just going to play other things.

Same thing happened to 2GB card owners in 2013 and 4GB card owners sometime after that. Requirements always go up with new consoles.

Damn, my plan was a 5600 + 6600 FHD gaming PC.

Go for an RTX 3060 12GB, or if you have more money, a 6700/6700 XT.
 

01011001

Banned
You should check PTR in video in OP, it's fine without RT on 3070 but with RT enabled performance tanks to single digits.

well I said at console settings... so no RT.

also running the game on Ultra is fucking stupid and isn't representative of how people play the game.
I bet with reasonable settings it runs fine with RT enabled.


Cyberpunk is old at this point and it never had very good textures in vanilla version.

it's an open-world game, so of course textures are less crisp than in a linear game.
but still no issue with RT reflections enabled. and the issues I do get are not VRAM related
 

winjer

Gold Member
10GB with the 3080 is not enough for Forspoken, the RE2/3/4 remakes (4K with RT) and TLOU.
I've personally only had problems with these games, but for the RE games I just disabled RT and then had no issues.

Nvidia initially had plans to release the 3080 with 20GB of VRAM, but they chose to cut it before release.
If Nvidia had kept the 20GB, you could have everything maxed out with no issues.
 

yamaci17

Member
well I said at console settings... so no RT.

also running the game on Ultra is fucking stupid and isn't representative of how people play the game.
I bet with reasonable settings it runs fine with RT enabled.
The game in general is super heavy, to the point where you wouldn't bother with ray tracing if you want high framerates.

But yes, High + ray tracing is perfectly doable on the 3070. You can see that VRAM usage is quite tame at High + ray tracing + 1440p/DLSS Quality.



But the problem is not VRAM; it is the rasterization + ray tracing power itself, or rather the game's demands on those fronts. The game is already super demanding in terms of rasterization, to the point where you wouldn't really bother with Ultra settings even without ray tracing on a 3070. You can see above that even 1440p DLSS Quality gets around 40 FPS. This game with ray tracing enabled is heavier to run than Cyberpunk.

4K DLSS Performance at High settings is perfectly doable too.



Irrelevant, but I will share this one anyway (even better performance without recording; recording causes minor hitches here and there due to extra VRAM pressure).



The only sacrifice you actually make here is reducing texture quality from Ultra to High. Ideally, if you had more VRAM, you could use High settings but keep textures at Ultra. However, the game looks gorgeous both ways, so I'm not really "sweating" over it.
 

The Cockatrice

Gold Member
Ah yes, Hardware Unboxed, the most unreliable source of information. I instantly clicked on the RE4 section because I knew it was BS. First of all, texture quality in RE does not change besides low/medium/high, of course, but the VRAM allocation setting does what the name implies: it just allocates more VRAM. There is no visible difference between the 8GB and 6GB options on High settings.

A Plague Tale runs perfectly fine on 8GB of VRAM, and as for the rest of the big names he pulled out, like Callisto Protocol, TLOU, Hogwarts and fucking Forspoken, do I need to remind everyone that these are absolutely terrible ports? Also, shocking that maxing out settings for almost no visual benefit consumes a lot of VRAM.

You can make Skyrim consume 24GB of VRAM. Jesus, so much misinformation over and over.

A reminder every single time for the oblivious people here: you can absolutely play games at 1080p, 1440p and even 4K if you use DLSS/FSR and don't max out dumb settings that consume A TON of VRAM for no visual benefit, such as shadows (I think max shadows in RE4 consume almost 1GB of VRAM, and you'd need 5000% zoom to notice the difference) or volumetric stuff in clouds/fog/rays. Common sense, which these YouTubers have none of; GAF doomsayers need a reality check.
 

k_trout

Member
Same thing happened to 2GB card owners in 2013 and 4GB card owners sometime after that. Requirements always go up with new consoles.
This is not the same as back then; PC gaming is a lot more lucrative these days, and if you tailor your game to the 1% who can afford the top hardware, you're going to lose out on all that PC gaming money, which makes no sense, especially in the current financial situation. Once again, the myth of PC gaming being about the biggest, fastest and best seems to permeate the internet; for the majority, PC gaming is really about the opposite. Yes, hardware requirements change over time, but the majority of users lag behind those requirements by years. Just look at the big sellers on PC.
 

yamaci17

Member
This is not the same as back then; PC gaming is a lot more lucrative these days, and if you tailor your game to the 1% who can afford the top hardware, you're going to lose out on all that PC gaming money, which makes no sense, especially in the current financial situation. Once again, the myth of PC gaming being about the biggest, fastest and best seems to permeate the internet; for the majority, PC gaming is really about the opposite. Yes, hardware requirements change over time, but the majority of users lag behind those requirements by years. Just look at the big sellers on PC.
It is not the same, not because of the market but because of the actual VRAM targets.

2GB really went obsolete due to being far short of what the consoles had.

However, 4GB never had any problem at 1080p/console settings, similar to the PS4. The PS4/Xbox One usually allocated 3.5-4GB of memory for GPU data and 1.5-2GB for CPU data to games.

You can play most late-generation PS4 games at console settings just fine with 4GB of VRAM;






Problems stem from wanting to go above the consoles in resolution and texture quality with 4GB. And if you have a 1050 Ti/1650 Super/970, you will have to use CONSOLE-equivalent settings to get playable framerates regardless.

In the same respect, the new consoles have around 10GB of allocatable memory for GPU operations and 3.5GB for CPU operations, but they do this at 4K/upscaling with 4K textures. Reducing textures just a bit and using a lower resolution target (1440p/1080p/DLSS combinations) will allow 8GB to last for quite a while yet. The problem stems from angry 3070 users who bought a VRAM-starved GPU for upwards of 1000+ bucks. This is why Hardware Unboxed is so salty about this topic, as the same budget could have gone towards a 6700 XT/6800 with money left over.
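A rough sketch of why "1440p-class" textures need so much less memory than "4K-class" ones: with a block-compressed format like BC7 (16 bytes per 4x4 block, i.e. 1 byte per texel), a texture's footprint grows with the square of its resolution, and a full mip chain adds about a third on top. The figures are illustrative, not pulled from any specific game:

```python
# Size of a single BC7-compressed texture, to show how texture
# resolution drives VRAM cost. BC7 stores 16 bytes per 4x4 block,
# which works out to 1 byte per texel.

def bc7_size_mib(side: int, mips: bool = True) -> float:
    """Approximate size in MiB of a square BC7 texture of the given side."""
    size_bytes = side * side  # 1 byte per texel
    if mips:
        # full mip chain: 1 + 1/4 + 1/16 + ... ~= 4/3 of the base level
        size_bytes = size_bytes * 4 // 3
    return size_bytes / (1024 * 1024)

for side in (4096, 2048):
    print(f"{side}x{side} BC7 + mips: ~{bc7_size_mib(side):.1f} MiB")
```

Halving the texture side length cuts the per-texture cost to roughly a quarter, which is why stepping textures down one notch frees up so much of an 8GB budget.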

For me personally, I always saw the 3070 as a "High settings, reduce textures a bit" card. But I practically got it for almost free, and I'd never pay full price for it. I always discouraged people from getting it.

But people with a 2070 Super/2060 Super/2070 should be fine. They should just reduce background clutter to a minimum and not chase Ultra settings. The 3070/3070 Ti is a bit different, as it can push Ultra with acceptable framerates in certain titles, and that puts people into these situations, sadly.

People who act like 4GB has been dead in recent years (2018 to 2021) are people who believe 4GB is dead for 1440p/high settings, whereas most 4GB cards are not capable of that to begin with. So I don't know where this "4GB was dead in year X" argument began.

4GB of VRAM was enough for almost all PS4 ports at PS4-equivalent settings. Almost.
 

Hugare

Member
Eh, playing with Medium textures is fine when 90% of games look like blurry shit due to bad TAA or other image reconstruction techniques.

Going from the RE4 Remake to the RE4 HD Remaster is like putting on glasses.

The HD Remaster has no post-processing effects blurring the image, so every texture is crisp.

The problem is that most recent games don't scale texture quality properly, 'cause they are made with consoles in mind, so you either play with the intended textures at "High/Ultra" or terrible, vomit-looking "Medium" textures.
 

k_trout

Member

yamaci17

Yeah, totally, but it doesn't matter how we rationalise it; Terraria and other games will outsell all the PS5 releases by a huge factor. The majority of PC gamers will upgrade when they upgrade, not because console releases require it.
 
lol, POS game D4 reserved 18GB of VRAM on my 4090. Can anyone here explain why this happens when we usually can't even see any uplift in image quality?
 

yamaci17

Member

yamaci17

Yeah, totally, but it doesn't matter how we rationalise it; Terraria and other games will outsell all the PS5 releases by a huge factor. The majority of PC gamers will upgrade when they upgrade, not because console releases require it.
Well, for 2GB GPUs, console releases did force it. But yes, the gaming market was much smaller back then; not many made a fuss about the GTX 770 becoming super obsolete. Steam was still evolving into a bigger storefront in those days, and AAA PC gaming was not that big at the time. People were also accustomed to upgrading every 2-3 years at cheap prices.

With GPUs like the 1060/1070/970, mainstream AAA gaming became possible, and many people are now in the ecosystem. And yes, it will be very hard to push them out of it.

However, 8GB is not a "dead" meme VRAM budget anyway. Games tailored for a 10GB budget can easily be scaled down to 8GB (especially considering consoles will target 4K assets/4K upscaling/4K textures, whereas most 8GB GPUs will target 1440p assets/1440p upscaling/1440p textures). Hell, they will have to scale games all the way back to a 5-6GB GPU memory budget for the Series S.

Yes, not being able to run games at Ultra textures purely because of VRAM limitations is very saddening, but some people act like it's the end of the world.
 

RobRSG

Member
I think it is funny how many of these are AMD-sponsored titles that look no better than titles from 2018.

That's a new low for Hardware Unboxed, milking the cow to the max here.

Alas, the biggest question is: how come TLOU Part I runs perfectly on a system with 16GB of total memory, when on PC you need 32GB of RAM + 16GB of video RAM to even display textures properly?
 