
Godfall (PC) requires 12GB VRAM for 4K (max settings)

Type_Raver

Member
Watching the combat trailer, I see some God of War-ish action. What's unclear is the depth of the combat.
If it's similar, then I might check this out for PC.
 

reksveks

Member
The following is relevant for those who own or are interested in the RTX 3080.

Here's a screen capture of the benchmark results for Watch Dogs: Legion running on the RTX 3080. Don't mind that the data displayed by MSI Afterburner/RivaTuner labels the GPU as the RTX 3090; I didn't relabel the data entries when I had the RTX 3080 installed. Anyhow, notice that the amount of VRAM listed as used by the benchmark is 10.13 GB out of 9.84 GB. Is this an error, or did the benchmark pull the additional 0.29 GB from general system RAM (10.13 GB - 9.84 GB = 0.29 GB)?

What makes me think the benchmark did this is the amount of system RAM listed as used when the benchmark ran on the RTX 3080 versus when it ran on the RTX 3090 with the same settings: 7.32 GB vs 6.06 GB, a difference of 1.26 GB. Obviously 1.26 GB is more than 0.29 GB, but I'm assuming the benchmark needed to duplicate the 0.29 GB of data in system RAM to minimize the time it takes to find that data, since system RAM is much slower than VRAM.

What do you guys think?

By the way, Task Manager confirms the benchmark's use of more than 10 GB:

[Task Manager screenshot]

Don't really know why I'm posting this, but if you use Game Bar (Win+G), you can also get CPU/GPU usage; it seems like a nicer option than Task Manager.
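
If anyone would rather pull the same numbers from a script instead of an overlay, here's a rough sketch using nvidia-smi (it ships with the NVIDIA driver); treat it as a sketch to adapt to your own setup:

```python
import subprocess

# Ask the driver (via nvidia-smi) for current VRAM usage; output is plain CSV numbers in MiB
out = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=memory.used,memory.total", "--format=csv,noheader,nounits"],
    text=True,
)
used_mib, total_mib = (int(v) for v in out.strip().splitlines()[0].split(", "))
print(f"VRAM: {used_mib / 1024:.2f} GiB used of {total_mib / 1024:.2f} GiB")
```

Note this reports allocated memory, same caveat as Afterburner, so it won't settle the "allocated vs actually needed" argument either.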
 

Kenpachii

Member
The following is relevant for those who own or are interested in the RTX 3080.

Here's a screen capture of the benchmark results for Watch Dogs: Legion running on the RTX 3080. Don't mind that the data displayed by MSI Afterburner/RivaTuner labels the GPU as the RTX 3090; I didn't relabel the data entries when I had the RTX 3080 installed. Anyhow, notice that the amount of VRAM listed as used by the benchmark is 10.13 GB out of 9.84 GB. Is this an error, or did the benchmark pull the additional 0.29 GB from general system RAM (10.13 GB - 9.84 GB = 0.29 GB)?

What makes me think the benchmark did this is the amount of system RAM listed as used when the benchmark ran on the RTX 3080 versus when it ran on the RTX 3090 with the same settings: 7.32 GB vs 6.06 GB, a difference of 1.26 GB. Obviously 1.26 GB is more than 0.29 GB, but I'm assuming the benchmark needed to duplicate the 0.29 GB of data in system RAM to minimize the time it takes to find that data, since system RAM is much slower than VRAM.

What do you guys think?

By the way, Task Manager confirms the benchmark's use of more than 10 GB:

[Task Manager screenshot]

All of that reported VRAM data is bullshit anyway.

Also, the 3080 has 10,240 MB of VRAM.
 

synce

Member
Still boggles the mind that people paid $1k+ for 10GB of VRAM in 2020. This is the first gen I've ever gone AMD for CPU, and the first for GPU since they were still ATI.
 

rofif

Can’t Git Gud
I fucking don't get it.
Consoles have 16GB of shared RAM.
On PC (like mine) I have 16GB of RAM and 10GB of VRAM (3080 FE, and I paid 700 and even sold Watch Dogs, so a good deal).
Does it really not matter that the VRAM is GDDR6 or 6X or however fast? Does it not matter whether you have 16GB of RAM or even 32? Why are PCs so fucking stupid... Shouldn't the memory swap be instant?
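
For what it's worth, the reason spilling out of VRAM hurts so much is raw bandwidth. These are ballpark spec-sheet figures I'm plugging in, not measurements, so take the exact numbers with a grain of salt:

```python
# Rough, order-of-magnitude bandwidth figures (assumed spec-sheet values, not measured)
gddr6x_3080 = 760   # GB/s - RTX 3080's on-board GDDR6X
pcie3_x16   = 16    # GB/s - the link the GPU crosses to reach system RAM (~32 GB/s on PCIe 4.0)
ddr4_3200   = 51    # GB/s - dual-channel DDR4-3200 system RAM itself

print(f"On-board VRAM vs the PCIe 3.0 spill path: ~{gddr6x_3080 / pcie3_x16:.0f}x faster")
print(f"On-board VRAM vs system RAM's own peak:   ~{gddr6x_3080 / ddr4_3200:.0f}x faster")
```

So whether it's GDDR6 or 6X barely matters once data has to come across PCIe; the swap is anything but instant, which is where the stutter comes from.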
 

rofif

Can’t Git Gud
Still boggles the mind that people paid $1k+ for 10GB of VRAM in 2020. This is the first gen I've ever gone AMD for CPU, and the first for GPU since they were still ATI.
700 for the fastest and amazing-looking GPU with fast GDDR6X memory and a free Watch Dogs: Legion?!
Oh, I feel so failed. The 12GB VRAM requirement news is a bit worrying, but I'm sure it will be fine...
 

Stuart360

Member
Lol, I'll be surprised if it needs more than 8GB of VRAM, maybe 9GB at a push. This is just like all the HD texture packs that 'need' 8GB of VRAM to use but used around 5GB of VRAM on my 980 Ti (Doom insane shadows, the Fallout 4 texture pack, Shadow of Mordor and War, etc.).
 
So how would this pan out if this game were on Series X, even though it has 16GB of RAM but split in two? Also, 4K x 4K textures, which are pretty large, are helped by having Infinity Cache, from what I heard? If the PS5 doesn't have that Infinity Cache, would the PS5 cease to have those dynamic 4K x 4K textures?
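
For scale, here's roughly what a single 4K x 4K texture costs in memory, assuming typical block compression (BC7-class, about one byte per texel) versus uncompressed RGBA8; real games mix formats, so these are only ballpark numbers, not anything taken from Godfall itself:

```python
# Ballpark memory cost of one 4096x4096 texture (assumed formats, not from Godfall)
texels      = 4096 * 4096
rgba8_bytes = texels * 4       # uncompressed RGBA8: 4 bytes per texel
bc7_bytes   = texels * 1       # BC7 block compression: ~1 byte per texel
mip_factor  = 4 / 3            # a full mip chain adds roughly one third

print(f"RGBA8 + mips: {rgba8_bytes * mip_factor / 2**20:.0f} MiB")
print(f"BC7   + mips: {bc7_bytes * mip_factor / 2**20:.0f} MiB")
```

A few hundred of the compressed ones already add up to several gigabytes, which is why streaming and how the console splits its memory matter so much here.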
 

ZywyPL

Banned
GAF - 4K is a waste.
Also GAF - supposedly every single person on the forum actually plays at 4K and 8-10GB is not enough anymore.
 

kraspkibble

Permabanned.
lol RIP 3080 owners.

I knew 10GB wasn't gonna be enough. There are games out TODAY that use all of that, and we're about to start a new generation of consoles, so things are only gonna get worse. The 3080 Ti is DOA too with a pathetic 12GB.

16GB should be the ABSOLUTE minimum for 4K on PC going forward. The only real options are the AMD cards or the RTX 3090.
 

KungFucius

King Snowflake
I fucking don't get it.
Consoles have 16GB of shared RAM.
On PC (like mine) I have 16GB of RAM and 10GB of VRAM (3080 FE, and I paid 700 and even sold Watch Dogs, so a good deal).
Does it really not matter that the VRAM is GDDR6 or 6X or however fast? Does it not matter whether you have 16GB of RAM or even 32? Why are PCs so fucking stupid... Shouldn't the memory swap be instant?

Great question. I have been asking the same thing. Aren't the drivers also helping tell the games how to allocate, too? There has to be more to this. Why else would Nvidia release a "4K" card that can't run 4K games? They can't be that incompetent, can they?

Do us 3080 owners need to scalp it and move over to AMD, or is there more to it?
 

Mista

Banned
12GB? No fucking way. Those new cards can't even make it.

Only people with a 3090. 4K gaming is dumb anyway.

The game doesn't even look that special, so I still find this weird.
 

rofif

Can’t Git Gud
Great question. I have been asking the same thing. Aren't the drivers also helping tell the games how to allocate, too? There has to be more to this. Why else would Nvidia release a "4K" card that can't run 4K games? They can't be that incompetent, can they?

Do us 3080 owners need to scalp it and move over to AMD, or is there more to it?
Yeah, I also have no idea why 4K resolution uses more VRAM. They're the same textures as when the game is set to 1080p.
And the 3080 is top of the line. There will be riots if they replace it with a 20GB model or something. Maybe GDDR6X is faster so we don't need so much of it.
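
One likely part of the answer (a rough sketch, not taken from any specific engine): every full-resolution render target scales with the output resolution even when the textures are identical, and modern renderers keep quite a few of them around:

```python
# Why 4K needs more VRAM even with identical textures: render targets scale with resolution.
def render_target_bytes(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel

BUFFER_COUNT = 10  # made-up but plausible count: G-buffer layers, depth, post-processing, etc.

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    per_buffer = render_target_bytes(w, h)
    print(f"{name}: {per_buffer / 2**20:.1f} MiB per buffer, "
          f"~{BUFFER_COUNT * per_buffer / 2**20:.0f} MiB across {BUFFER_COUNT} buffers")
```

On top of that, many engines scale their texture streaming pool with resolution, so the "same textures" end up held at higher mip levels when you render at 4K.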
 

sobaka770

Banned
Don't really know why I'm posting this, but if you use Game Bar (Win+G), you can also get CPU/GPU usage; it seems like a nicer option than Task Manager.

Watch Dogs at 4K with ultra textures does overflow the VRAM on my RTX 3080, leading to stutter. However, I also believe there's poor optimization going on, because the HD textures request more VRAM than necessary, leading to overflow even when I drop other VRAM-heavy settings such as shadow quality. The memory usage also seems to accumulate the longer I play.
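
If you want to check whether usage really creeps up over a session, here's a minimal logging sketch, assuming the pynvml bindings are installed (pip install nvidia-ml-py); run it alongside the game and look at the trend:

```python
import time
import pynvml  # assumes the nvidia-ml-py / pynvml package is installed

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
        print(f"{time.strftime('%H:%M:%S')}  {mem.used / 2**30:.2f} GiB used "
              f"of {mem.total / 2**30:.2f} GiB")
        time.sleep(30)  # one sample every 30 seconds
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

If the number only ever climbs and never drops after fast travel or a level change, that points at the streaming pool not releasing memory rather than the settings genuinely needing that much.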
 

ZywyPL

Banned
People just aren't ready for the coming VRAM limitations. Still living on hope. We saw this before with Kepler, with Maxwell, and so on. People never learn. 🤷‍♂️



That's so painful to watch... Granted, there are settings other than Ultra, and there's also DLSS, but still, 8 FPS?...
 

Kenpachii

Member
Great question. I have been asking the same thing. Aren't the drivers also helping tell the games how to allocate, too? There has to be more to this. Why else would Nvidia release a "4K" card that can't run 4K games? They can't be that incompetent, can they?

Do us 3080 owners need to scalp it and move over to AMD, or is there more to it?

Artificial need to upgrade. Nvidia has a long history of doing this shit.
 

reksveks

Member
Watch Dogs at 4K with ultra textures does overflow the VRAM on my RTX 3080, leading to stutter. However, I also believe there's poor optimization going on, because the HD textures request more VRAM than necessary, leading to overflow even when I drop other VRAM-heavy settings such as shadow quality. The memory usage also seems to accumulate the longer I play.

Really hope that PC game engines start using SFS (Sampler Feedback Streaming) much quicker than they started using VRS (Variable Rate Shading).
 