
Godfall (PC) requires 12GB VRAM for 4K (max settings)

GymWolf

Member
AMD with the most obvious jab in the history of obvious jabs.

Maybe next time pick a game that people give a fuck about and that doesn't look like a nice cross-gen title at best :ROFLMAO:

I love this :ROFLMAO:
 
Last edited:

Great Hair

Banned
Try this more recent video for PS5.
It looks better than the PC one.


Yep, definitely better.



[Screenshot: start of the sequence, first i-frame]
[Screenshot: end of the sequence, last i-frame]

At 1:12 to 1:13 I counted 58 frames between two clear frames (i-frames?); those 58 frames between the "i-frames" were blurry, like after getting heavily drunk.

At one point, the whole screen was white. Motion blur and anime fans gonna like this ;)
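A rough sanity check on that count (my own arithmetic, assuming the upload really plays back at 60 fps):

```python
# Back-of-the-envelope: how much time do 58 frames span at 60 fps?
# Assumes the YouTube capture is 60 fps; that's my assumption, not stated in the video.
frames_between_clear_frames = 58
fps = 60

gap_seconds = frames_between_clear_frames / fps
print(f"{frames_between_clear_frames} frames at {fps} fps ~ {gap_seconds:.2f} s between clear frames")
# -> 58 frames at 60 fps ~ 0.97 s between clear frames
```

So the clear frames land roughly one second apart, which lines up with the 1:12-to-1:13 window quoted above.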

 

nkarafo

Member
Is this game supposed to be technically advanced, graphically? Because all I see are garish colors and motion blur.
 

GymWolf

Member
In a few weeks.

Godfall runs better at max settings on an RTX 3080 with 10GB of VRAM than on the RX 6800 XT because we done goofed and put raytracing in this, raytracing whose cost goes up with resolution.
They said that the game utilizes AMD-optimized raytracing; I don't remember the name of the tech.

So maybe even with raytracing on, the AMD cards are gonna be better?!
 

GymWolf

Member
None of those cards are 3x more powerful than the PS5. For argument's sake let's take the 5700XT. You think a 3080 is 3x more powerful than a 5700XT?
Like I said, if we talk only about teraflops, 30 and 36 TF are roughly triple the ~10 TF inside a PS5.
If we talk about VRAM, then no, only the 3090 is barely two times better.
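For reference, a quick back-of-the-envelope on the TF side (my own arithmetic, using the commonly quoted shader counts and boost clocks, so treat the exact figures as approximate):

```python
# Rough FP32 throughput: shaders * boost clock (GHz) * 2 ops per clock (FMA) = TFLOPS.
# Shader counts and clocks below are the usual spec-sheet numbers, not measured values.
def tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * boost_ghz * 2 / 1000  # GFLOPS -> TFLOPS

ps5      = tflops(2304, 2.23)   # ~10.3 TF
rtx_3080 = tflops(8704, 1.71)   # ~29.8 TF
rtx_3090 = tflops(10496, 1.70)  # ~35.7 TF

print(f"PS5 ~{ps5:.1f} TF, 3080 ~{rtx_3080:.1f} TF ({rtx_3080 / ps5:.1f}x), "
      f"3090 ~{rtx_3090:.1f} TF ({rtx_3090 / ps5:.1f}x)")
```

Paper TFLOPS only, of course; how much of that shows up as actual game performance is exactly what's being argued about here.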
 

Great Hair

Banned
How much did AMD pay them to not include DLSS?

Not as much as Nvidia paid Crytek to make AMD cards look bad in Crysis 2 by adding shit underneath the map that needed to be "tessellated", thus destroying the performance on AMD cards during the "Tessellation Wars".

 

GymWolf

Member
If you just look at teraflops like you are, then the 3090 is over 2x more powerful than the 2080ti. Except it isn't.
Fair point.

I was not trying to talk shit about the PS5; I just thought you forgot about the TF inside modern GPUs.
 
Last edited:

nani17

are in a big trouble
3080 owners

 
Last edited:

GHG

Member
90% of games don't need DLSS. Maybe the game from a developer pounding their chest that it uses 12GB of VRAM does though.

The reality is there's not a single game out there that wouldn't benefit from DLSS since it improves image quality with no performance cost.

What is also a reality is the fact that it requires some work on the developer's end, along with co-operation with Nvidia, to get it working, and that's not something everyone wants to do.

Unless Nvidia can find a way to get it implemented across the board at the driver level, we are going to continue to see just a handful of games utilising the tech.

The irony in thinking AMD are throwing money around to stop developers using features that can be used on competitor GPUs. You must have forgotten the history surrounding middleware like PhysX, and that's just one example.
 
Last edited:

Mister Wolf

Gold Member
The reality is there's not a single game out there that wouldn't benefit from DLSS since it improves image quality with no performance cost.

What is also a reality is the fact that it requires some work on the developer's end, along with co-operation with Nvidia, to get it working, and that's not something everyone wants to do.

Unless Nvidia can find a way to get it implemented across the board at the driver level, we are going to continue to see just a handful of games utilising the tech.

Any game that uses raytracing needs to have DLSS as an option. If it doesn't, then the developers come across as foolish, or they were paid not to include it.
 

cucuchu

Member
So this is VRAM allocated, yeah? Not utilized. So if utilization doesn't reach above 10GB, what's the issue (for 3080 owners)? I'm at work so I can't see the content of the OP, so I'm assuming that is the metric being used. Games typically use nowhere near the allocated VRAM. Plus the 3080's VRAM is GDDR6X with higher bandwidth than the VRAM found in the AMD cards/new consoles, so total overall VRAM usage would be lower anyway... unless there is some wonky optimization of assets going on.

I'm still in the camp that they should have included more VRAM on the 3080, but I've yet to see real evidence in practice of the 10GB of higher-bandwidth VRAM actually inhibiting performance in terms of practical VRAM usage.
 
Last edited:
I'm sure 3080 owners will be fine. They can just lower the texture quality. :messenger_winking:
Why would they? Have you not seen RE2 and other games that can "allocate" more than your GPU's VRAM size? Check out the video a couple of posts up about the difference between allocation and utilization. They are not the same.
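For what it's worth, the figure most monitoring overlays report is device-level allocation, not proof the game's working set is that big. A minimal sketch of how that number is typically read, using NVIDIA's NVML via the pynvml package (purely illustrative; requires an NVIDIA GPU):

```python
# Read the "used" VRAM figure that monitoring tools report.
# "used" here means memory allocated on the device, which is not the same as
# the amount a game is actively touching each frame (allocation vs. utilization).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)

gib = 1024 ** 3
print(f"VRAM allocated (used): {mem.used / gib:.2f} GiB of {mem.total / gib:.2f} GiB")

pynvml.nvmlShutdown()
```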
 

Senua

Gold Member
If VRAM utilization is not higher than 10GB (and the 3080 has higher-bandwidth memory than the new consoles anyway), it wouldn't matter. Games do not usually come close to eating up even 60% of the VRAM allocation. So it's a complete non-issue.
Stop talking sense ffs you're ruining the thread.
 

Rikkori

Member
People just don't understand how held back texture quality has been in general (even in many so-called "HD texture packs"). To actually see 4K×4K textures, that's something special, and I'm very happy to see it actually happening; when they update it to UE5 it's going to look straight-up magical.
 
PS5 should have enough VRAM for this.
But what of Series X, which only has 10GB of fast RAM?

Off the top of my head? 3080, 3090, 6800 XT, 6900, if we talk about TF.

PS5 is 10-something vs 30-36. I don't know the AMD GPUs' TF numbers, but they're on par with the Nvidia GPUs, so...

Wait, maybe you were talking about VRAM?
The newer cards have less gaming performance per teraflop.

As for VRAM, if Nvidia's SSD solution ain't up to par, they'll suffer; with Sony, the PS5 can stream dozens of GBs in mere seconds (rough numbers below).

4K gaming is getting close; one more generation of GPUs and it's gonna be amazing. 2022-2023 is gonna be special.
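On the streaming claim, a rough check using Sony's published SSD figures (~5.5 GB/s raw, ~9 GB/s typical with compression; the 3-second window is my own assumption):

```python
# How much data the PS5's SSD can move in a few seconds, at Sony's quoted rates.
for label, gb_per_s in [("raw", 5.5), ("typical compressed", 9.0)]:
    seconds = 3
    print(f"{label}: ~{gb_per_s * seconds:.0f} GB in {seconds} s")
# -> raw: ~16 GB in 3 s
# -> typical compressed: ~27 GB in 3 s
```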
 
Last edited:

nkarafo

Member
So you buy a 3070 to play at 4K, Ultra?
It's a $500 card in 2020, so yes.

Keep in mind that VRAM limitations don't get better at low frame rates. Whether it's 30 or 60fps, the moment VRAM needs to swap you will get massive stutters. That makes the 3070 inferior to next-gen consoles at 4K, despite being more powerful. It means its low amount of VRAM is its bottleneck.
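To put rough numbers on why swapping hurts at either frame rate (my own back-of-the-envelope; the ~16 GB/s PCIe 3.0 x16 figure and the 1 GB of texture data are assumed, not from the thread):

```python
# Frame-time budgets are milliseconds; pulling texture data over PCIe takes tens of milliseconds,
# so a VRAM swap blows the budget at 30 fps just as surely as at 60 fps.
pcie_gb_per_s = 16.0   # assumed: rough effective PCIe 3.0 x16 bandwidth
swap_gb = 1.0          # assumed: size of the texture data that has to come across

budget_60fps_ms = 1000 / 60                    # ~16.7 ms per frame
budget_30fps_ms = 1000 / 30                    # ~33.3 ms per frame
transfer_ms = swap_gb / pcie_gb_per_s * 1000   # ~62.5 ms

print(f"Frame budget: {budget_60fps_ms:.1f} ms at 60 fps, {budget_30fps_ms:.1f} ms at 30 fps")
print(f"Moving {swap_gb:.0f} GB over PCIe: ~{transfer_ms:.0f} ms, i.e. several dropped frames either way")
```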
 

MastaKiiLA

Member
Are PC gamers going to shit up every thread on console games that will appear on PCs? We get it. The PC will be able to handle better graphics, even if the framerate is shitty. I'm not jealous. I just don't care to invest in the improvements. And I have disposable cash. Can we separate console and PC game threads going forward? Because this is kind of annoying.
 

martino

Member
Are PC gamers going to shit up every thread on console games that will appear on PCs? We get it. The PC will be able to handle better graphics, even if the framerate is shitty. I'm not jealous. I just don't care to invest in the improvements. And I have disposable cash. Can we separate console and PC game threads going forward? Because this is kind of annoying.
This is a thread about PC requirements...
Who's bringing their complex in here? I'm not even wondering.
 
Last edited:
Are PC gamers going to shit up every thread on console games that will appear on PCs? We get it. The PC will be able to handle better graphics, even if the framerate is shitty. I'm not jealous. I just don't care to invest in the improvements. And I have disposable cash. Can we separate console and PC game threads going forward? Because this is kind of annoying.
You literally came into a PC thread to bitch about people coming into a console thread. And even so, unless the game is specifically mentioned to be running on a console, nine times out of ten you are looking at PC footage of multiplat games.

Even still, you don't get to dictate what people can post in any thread. If you want focused discussion on a game, either create a thread specifically to discuss what you want, participate in the ones that are currently going, go to GameFAQs, ignore people/posts (there is a button), or shut the hell up.
 
Last edited:
Are PC gamers going to shit up every thread on console games that will appear on PCs? We get it. The PC will be able to handle better graphics, even if the framerate is shitty. I'm not jealous. I just don't care to invest in the improvements. And I have disposable cash. Can we separate console and PC game threads going forward? Because this is kind of annoying.


The PC will handle better graphics, better framerates, better controls, better response, modding. It's just complete 360° domination. Sometimes you just wanna reiterate that.
 
Are PC gamers going to shit up every thread on console games that will appear on PCs? We get it. The PC will be able to handle better graphics, even if the framerate is shitty. I'm not jealous. I just don't care to invest in the improvements. And I have disposable cash. Can we separate console and PC game threads going forward? Because this is kind of annoying.
Are console gamers going to shit up every thread on the PC version of games that are on both platforms? This is getting annoying.



 
Last edited:

Krisprolls

Banned
The reality is there's not a single game out there that wouldn't benefit from DLSS since it improves image quality with no performance cost.

I don't know, the result looks pretty subpar on Watch Dogs Legion. DLSS removes details (see DLSS thread on Legion) and it slightly blurs the picture... People don't seem to think it's the second coming of Jesus anymore. Still a good upscaling technique for sure.
 
Last edited:

MadAnon

Member
It's a $500 card in 2020, so yes.

Keep in mind that VRAM limitations don't get better at low frame rates. Whether it's 30 or 60fps, the moment VRAM needs to swap you will get massive stutters. That makes the 3070 inferior to next-gen consoles at 4K, despite being more powerful. It means its low amount of VRAM is its bottleneck.
It's a mid-range card, aimed at 1440p. If you buy a mid-range card to game at 4K, you need your head checked in the first place.
 

Krisprolls

Banned
It's a mid-range card, aimed at 1440p. If you buy a mid-range card to game at 4K, you need your head checked in the first place.

I love how $500 is midrange now. Nvidia certainly brainwashed people pretty well. Midrange should never be more than $300, like it used to be. I paid even less for my 1070 if I remember correctly.
 
Last edited:

MadAnon

Member
I love how $500 is midrange now. Nvidia certainly brainwashed people pretty well. Midrange should never be more than $300, like it used to.
Who cares what you think about prices? The reality is that mid-range now is $400-500. The RX 5700 XT was a mid-range card starting at $449. Looks like AMD is also brainwashing?

Your memory is fooling you, because the 1070 launched for $380 and the 970 launched for $330. See the trend?
 
Last edited:

Krisprolls

Banned
Oh, we should care about prices. And I never said AMD did better. If we don't care, they'll sell us $1,000 "midrange" cards. At some point you don't get your money's worth. Sometimes it's better to wait and upgrade later. Voting with our wallets.
 

nkarafo

Member
It's a mid-range card, aimed at 1440p. If you buy a mid-range card to game at 4K, you need your head checked in the first place.
No, it's not mid-range. That would be the 2060 or the upcoming 3060. And no, that's not low-end; that would be anything from the 1650 card or lower.

Also no, I don't expect to play at 4K/ultra/60fps even with a high-end card. But I should be able to play at lower fps at least. Even then, though, the graphics need to at least fit into the damn thing.
 
Last edited: