
Godfall (PC) requires 12GB VRAM for 4K (max settings)

MadAnon

Member
Oh, we should care about prices. And I never said AMD did better. If we don't care, they'll sell us $1000 "midrange" cards. At some point you don't get your money's worth. Sometimes it's better to wait and upgrade later. Voting with our wallets.
You have turned this into a completely different topic now.
 

MadAnon

Member
No, it's not midrange. That would be the 2060 or the upcoming 3060. And no, that's not low range; that would be anything from the 1650 card or lower.

Also no, I don't expect to play at 4K/ultra/60fps even with a high-end card. But I should be able to play at lower fps at least. But even then, the graphics need to at least fit into the damn thing.
1650 is entry level. You clearly have no idea how these products are categorized.
 
Last edited:
No, it's not midrange. That would be the 2060 or the upcoming 3060. And no, that's not low range; that would be anything from the 1650 card or lower.

Also no, I don't expect to play at 4K/ultra/60fps even with a high-end card. But I should be able to play at lower fps at least. But even then, the graphics need to at least fit into the damn thing.
The 50/50 Ti is low range, the 60/70 is midrange, and the 80/80 Ti is enthusiast (it seems like there is no 3080 Ti, so the 90 is the replacement for it, and possibly for the Titan as well).
 
[Image: vram.png]


And Godfall uses 12GB? Lol
 

VFXVeteran

Banned
No surprise here. I mentioned a few months ago, when I showed Crysis Remake, Marvel Avengers, Horizon Zero Dawn, and other games, that VRAM allocation was quite large. Memory bandwidth is going to be a BIG thing going forward. The shortcut hacks won't be able to sustain the complexity of assets coming through the pipe. Here is, yet again, proof of what I've stressed over and over.
 

marquimvfs

Member
"Partner showcase" lol.

Imagine having 4K x 4K textures and still looking like a slightly better Destiny. A game with these visuals that can't run at 4K with 10 GB of GDDR6x memory is simply badly optimized for that configuration, likely on purpose to promote the "partner."
Now you know how an AMD buyer feels when Nvidia "partners" with devs. (Even if it isn't clear that's really the case here.)
 

VFXVeteran

Banned
"Partner showcase" lol.

Imagine having 4K x 4K textures and still looking like a slightly better Destiny. A game with these visuals that can't run at 4K with 10 GB of GDDR6x memory is simply badly optimized for that configuration, likely on purpose to promote the "partner."

Everything else about the game looks to be really clean and typical of all other games out. The added 4K textures are, of course, more detailed than those of any console equivalent.
 
Last edited:

Mister Wolf

Gold Member

Developed by Nvidia, no DLSS.

Must have been paid off by AMD.

Then there's COD where there's no DLSS.

And F1 2020, which is a Codemasters game with an AMD marketing partnership, but yet... miraculously it has DLSS support.

Your point stands... Exclusively in your own head.

Both fall into category number one.
 

GHG

Member
Both fall into category number one.

What category is that? The one where AMD paid them off to not have DLSS or the one where they didn't?

How do you explain F1 2020 having DLSS despite Codemasters already having a deal with AMD? Did AMD's cheque bounce?
 
No surprise here. I mentioned a few months ago, when I showed Crysis Remake, Marvel Avengers, Horizon Zero Dawn, and other games, that VRAM allocation was quite large. Memory bandwidth is going to be a BIG thing going forward. The shortcut hacks won't be able to sustain the complexity of assets coming through the pipe. Here is, yet again, proof of what I've stressed over and over.
Not really proof of anything, tbh. It's more speculation, and you've still yet to really "prove" that VRAM in the 3080 and below will be an issue. You've constantly talked about allocation, but that's not proof; that's just that, allocation. I'd like to see actual "use" in-game, not hypothetical "allocation" that shows nothing. You could allocate a ton of memory for a ton of different programs; that doesn't mean the game/program/etc. actually needs it. I say all this in the nicest, non-condescending way possible. What does your overlay say is actually physically being used for memory in all those games you mentioned?
 
Last edited:
It will be interesting to see how much better optimized the consoles will be memory-wise, with their SSDs, and MS with SFS and XVA.
 

VFXVeteran

Banned
Not really proof of anything, tbh. It's more speculation, and you've still yet to really "prove" that VRAM in the 3080 and below will be an issue. You've constantly talked about allocation, but that's not proof; that's just that, allocation. I'd like to see actual "use" in-game, not hypothetical "allocation" that shows nothing. You could allocate a ton of memory for a ton of different programs; that doesn't mean the game/program/etc. actually needs it. I say all this in the nicest, non-condescending way possible. What does your overlay say is actually physically being used for memory in all those games you mentioned?

You will be hard-pressed to find articles on actual VRAM usage in a game. In fact, I wouldn't expect you'd get those kinds of readings unless you have a special tool to monitor that.

Games will allocate memory because there's a logical reason they may use it. Maybe not all at the same time, as you'll definitely dump pointers when not in use. Allocation of memory showed less with my 2080 Ti than with my 3090 in the same game. That's not for nothing.

I do think the 3080 may suffer from its 10 GB of VRAM, but we'll need someone to benchmark that to really tell. I mean, it is obvious that some games will go past 10 GB of VRAM this generation, and swapping will slow performance.
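If anyone wants to poll the card themselves rather than rely on articles, here's a minimal sketch using NVIDIA's NVML through the pynvml bindings (assuming an NVIDIA card and that the package is installed). Note that NVML, like Afterburner, still reports what the driver has handed out, i.e. allocation, not what the game actively touches each frame:

```python
# Minimal VRAM polling sketch using NVML via the pynvml bindings.
# Assumes an NVIDIA GPU and `pip install pynvml`.
# NVML reports memory the driver has allocated, not what a game
# actively touches each frame, so treat these numbers as an upper bound.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb = info.used / 1024**3
        total_gb = info.total / 1024**3
        print(f"VRAM allocated: {used_gb:.2f} / {total_gb:.2f} GB")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```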
 
You will be hard-pressed to find articles on actual VRAM usage in a game. In fact, I wouldn't expect you'd get those kinds of readings unless you have a special tool to monitor that.

Games will allocate memory because there's a logical reason they may use it. Maybe not all at the same time, as you'll definitely dump pointers when not in use. Allocation of memory showed less with my 2080 Ti than with my 3090 in the same game. That's not for nothing.

I do think the 3080 may suffer from its 10 GB of VRAM, but we'll need someone to benchmark that to really tell. I mean, it is obvious that some games will go past 10 GB of VRAM this generation, and swapping will slow performance.
I hope that becomes a standard measurement going forward, as it may have an impact on people's purchases. I do think memory will be an issue eventually; I just don't think it'll be an issue for a few years, well after the next round of cards comes out.
 

Abriael_GN

RSI Employee of the Year
Now you know how an AMD buyer feels when Nvidia "partners" with devs. (Even if it isn't clear that's really the case here.)

I don't. I got a 3070 and I couldn't care less about this. 1440p is where it's at for me. I just find it funny and a bit pathetic (whoever does it). Incidentally, AMD "partners" with devs just as much as Nvidia, and always has.
 
Last edited:

Oppoi

Member
Any card under 16 GB is really a budget card. I've mentioned this from day one, and people will realize it sooner rather than later when new games come out that are built for next gen. Which Godfall clearly is, and a simplistic one on top of that.

I'd love to hear the opinion of someone who leaves the wiping to themselves and not to one of their servants.
 
Last edited:

ClosBSAS

Member
None of those cards are 3x more powerful than the PS5. For argument's sake let's take the 5700XT. You think a 3080 is 3x more powerful than a 5700XT?
They are, yes... if a 3080 is twice as powerful as a 2080 Ti, and the PS5 is the equivalent of a 2080, then yes, it's at least, LEAST, 3 times more powerful in raw computing power than the PS5.
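For what it's worth, here's a rough comparison of the quoted peak FP32 figures (numbers pulled from public spec sheets; peak TFLOPS is a crude proxy that ignores architectural differences, especially Ampere's doubled FP32):

```python
# Rough FP32 throughput comparison using publicly quoted peak figures.
# Peak TFLOPS is theoretical, not real-world performance, and Ampere's
# doubled FP32 units don't translate 1:1 into frame rate.
gpus_tflops = {
    "PS5 (RDNA 2)": 10.3,
    "RTX 2080": 10.1,
    "RTX 2080 Ti": 13.4,
    "RTX 3080": 29.8,
}

baseline = gpus_tflops["PS5 (RDNA 2)"]
for name, tflops in gpus_tflops.items():
    print(f"{name}: {tflops:.1f} TFLOPS ({tflops / baseline:.1f}x PS5)")
```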
 
Last edited:
A new console generation is *the* worst time to get a new GPU.

You need to let the new consoles lead and see what develops.

I still remember getting burned buying a GTX 780 on release when the PS4 was settling in. Because Sony put 8 GB of GDDR5 in the PS4, even my build with a 780 and 16 GB of RAM couldn't run newer titles as well as it should have, due to the insufficient memory on the card itself.

Shortly after, the Ti cards started getting released to make up for this, but it was a complete waste.

The same will happen this gen, as always. The 3080 is a stopgap for the real next-gen cards coming next.
 
Last edited:

BluRayHiDef

Banned
The following is relevant for those who own or are interested in the RTX 3080.

Here's a screen capture of the benchmark results for Watch Dogs: Legion running on the RTX 3080. Don't mind that the data displayed by MSI Afterburner/RivaTuner designates the GPU as the RTX 3090; I didn't relabel the data entries when I had the RTX 3080 installed. Anyhow, notice that the amount of VRAM listed as having been used by the benchmark is 10.13 GB out of 9.84 GB. Is this an error, or did the benchmark pull the additional 0.29 GB from general system RAM (10.13 GB - 9.84 GB = 0.29 GB)?

What makes me think that the benchmark did this is the amount of system RAM listed as having been used when the benchmark ran on the RTX 3080 relative to when it ran on the RTX 3090 with the same settings: 7.32 GB vs 6.06 GB, which is a difference of 1.26 GB. Obviously 1.26 GB is more than 0.29 GB, but I'm assuming that the benchmark needed to duplicate the 0.29 GB of data in order to minimize the amount of time it took to find the data, since system RAM is much slower than VRAM.
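A quick sanity check on those deltas, just re-running the arithmetic from the readings above (the variable names are mine, not from the overlay):

```python
# Re-running the overflow arithmetic from the Afterburner/Task Manager readings.
vram_reported_used = 10.13  # GB the benchmark reported using on the 3080
vram_available = 9.84       # GB of usable VRAM reported on the 3080
ram_used_3080 = 7.32        # GB of system RAM in use with the 3080 installed
ram_used_3090 = 6.06        # GB of system RAM in use with the 3090 installed

vram_overflow = vram_reported_used - vram_available  # ~0.29 GB past the card
extra_system_ram = ram_used_3080 - ram_used_3090     # ~1.26 GB more RAM in use

print(f"VRAM overflow:            {vram_overflow:.2f} GB")
print(f"Extra system RAM on 3080: {extra_system_ram:.2f} GB")
```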

What do you guys think?

By the way, Task Manager confirms the benchmark's use of more than 10 GB:

[Image: XUFTUYV.png (Task Manager screenshot)]
 
Last edited:

reptilex

Banned
What's the PS5 equivalent of FidelityFX (CAS), if any? I know they already failed on the checkerboard vs DLSS 2 front, and I have yet to see VRS and variable primitive geometries on PS5, but what about this?
 

rnlval

Member
No way in hell there won't be a 3080 Super in response to the performance and PRICE of that RX 6900.
I wouldn't buy an RTX 3080 with 10 GB of VRAM when my current MSI RTX 2080 Ti Gaming X Trio (AIB OC) has 11 GB of VRAM.

The RTX 3090 is priced like a Titan RTX. I'd rather wait for an RTX 3080 Ti with 20 GB to 22 GB of VRAM.
 
Last edited:

inflation

Member
What's the PS5 equivalent of FidelityFX (CAS), if any? I know they already failed on the checkerboard vs DLSS 2 front, and I have yet to see VRS and variable primitive geometries on PS5, but what about this?
They cracked the same problem with different methods. If you watch The Road to PS5 or follow some of the patent discussion here from a few days ago, you'll know more.
 

xPikYx

Member
Counterplay Games CEO Keith Lee:
At 4K resolution using Ultra HD textures, Godfall requires tremendous memory bandwidth to run smoothly. In this intricately detailed scene, we are using 4K × 4K texture sizes and 12 GB of VRAM memory to play at 4K resolution.



(At 1:07)

Bu****it. I played modded games like The Witcher or Fallout 4 with plenty of 4K or even 8K textures and never heard of a problem like this. This is simply marketing.
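For some back-of-the-envelope context on what 4K × 4K textures actually cost in VRAM (my own illustrative numbers, assuming BC7-style block compression and a full mip chain, not anything from Counterplay):

```python
# Back-of-the-envelope VRAM cost of 4K x 4K textures.
# Assumes BC7-style block compression (~1 byte per texel) and a full
# mip chain (~1.33x overhead); figures are illustrative, not official.
def texture_mb(width, height, bytes_per_texel=1.0, mip_factor=4 / 3):
    return width * height * bytes_per_texel * mip_factor / 1024**2

per_map = texture_mb(4096, 4096)            # ~21 MB per compressed 4K map
maps_per_material = 4                       # e.g. albedo, normal, roughness, AO
per_material = per_map * maps_per_material  # ~85 MB per material

print(f"One 4K map:        {per_map:.0f} MB")
print(f"One material (x4): {per_material:.0f} MB")
print(f"100 materials:     {per_material * 100 / 1024:.1f} GB")
```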
 

rnlval

Member

A 20 GB config leads to a 320-bit bus configuration.
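Rough sketch of why bus width and capacity are tied together on GDDR6/6X cards (my own illustration; the data rate below is the 3080's quoted 19 Gbps):

```python
# How bus width relates to capacity and bandwidth for GDDR6/6X cards.
# Each memory chip sits on a 32-bit channel, so a 320-bit bus means
# 10 chips: 10 x 1 GB = 10 GB (the 3080) or 10 x 2 GB = 20 GB.
bus_width_bits = 320
chips = bus_width_bits // 32
data_rate_gbps = 19  # per-pin data rate (GDDR6X on the 3080)

for chip_gb in (1, 2):
    print(f"{chips} x {chip_gb} GB chips -> {chips * chip_gb} GB total")

bandwidth_gbs = bus_width_bits * data_rate_gbps / 8
print(f"Peak bandwidth: {bandwidth_gbs:.0f} GB/s")  # ~760 GB/s
```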

Note that I have two gaming PCs (with RTX 2080 Ti and RTX 2080 respectively) and one of them doubles as a Blender 3D hardware RT rig. I plan to buy RTX 3080 Ti AIB OC and RX 6800 XT AIB OC.
 

BluRayHiDef

Banned
Counterplay Games CEO Keith Lee:
At 4K resolution using Ultra HD textures, Godfall requires tremendous memory bandwidth to run smoothly. In this intricately detailed scene, we are using 4K × 4K texture sizes and 12 GB of VRAM memory to play at 4K resolution.



(At 1:07)

Contrary to what others have said, this game looks incredible with those 4K x 4K textures. It looks truly next-gen; I'm definitely getting this for PC.
 

nkarafo

Member
1650 is entry level. You clearly have no idea how these products are categorized.
You are wrong. There are lower-tier cards than the 1650, but whatever; even if everything you say is correct, this game still makes the 10 GB 3080 obsolete. Is that a high enough level of card for you?
 