
Daylight (PC) has Absurdly High VRAM Usage (3GB+ @ 1080P)

1st Course

Member
Aug 10, 2012
5,977
0
0
Bench from Russian site GameGPU



Some posts from Guru3D and Steam

more than 3gb vram @1080
ue4 ftw




i'll have to raise afterburner parameters to 4gb now.

next gen we have arrived, seems only some of us can see it though.

1080 no / low aa and dof = 3167mb vram.
btw the game looks lovely, folks hating on it through review scores and vids are letting their noobness show.
game is very nice lookin btw.

the game is actually quite demanding, it takes up close to 3gb of the 8gb ram in my rig and even fills up the 2gigs of vram on my gtx760. and despite it being installed on a samsung 830 ssd, i still get reloading hiccups. i doubt this will be the usual requirement for ue4 games and think it is just very poorly optimized, taking into consideration that it seems pretty unpolished in other aspects as well. had it not come free with my gpu, i'd be pretty mad.

50-60fps on 1080p maxed out with 670 4GB sli, vram usage went as high as 3.4GB
 

bee

Member
Apr 13, 2005
3,881
45
1,475
the same can be said of many games, it's just caching. a 780 has 3gb so it uses 3gb, a 290x has 4gb so it uses 4gb, nothing new
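The caching behaviour described here can be sketched as a toy LRU texture cache (illustrative Python only, not UE4's actual streaming code): the cache happily grows to whatever capacity the card offers, so reported VRAM usage tracks card size rather than what the game strictly needs.

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU cache: textures stay resident until capacity forces eviction."""
    def __init__(self, capacity_mb):
        self.capacity_mb = capacity_mb
        self.used_mb = 0
        self.cache = OrderedDict()  # texture_id -> size_mb

    def request(self, tex_id, size_mb):
        if tex_id in self.cache:
            self.cache.move_to_end(tex_id)  # mark as recently used
            return
        # Evict least-recently-used textures only when we'd exceed capacity.
        while self.used_mb + size_mb > self.capacity_mb and self.cache:
            _, evicted_mb = self.cache.popitem(last=False)
            self.used_mb -= evicted_mb
        self.cache[tex_id] = size_mb
        self.used_mb += size_mb

# The same stream of texture requests against a "3GB" and a "4GB" card:
workload = [(i, 32) for i in range(200)]  # 200 textures x 32MB = 6400MB of assets
for cap in (3072, 4096):
    cache = TextureCache(cap)
    for tex_id, size in workload:
        cache.request(tex_id, size)
    print(cap, cache.used_mb)  # usage fills whatever capacity exists
```

With identical workloads, measured usage is 3072MB on the 3GB card and 4096MB on the 4GB card, which is why "game X uses 3.4GB" on its own says little about the minimum it needs.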
 

brain_stew

Member
Feb 20, 2007
19,261
1
1,215
Thankfully, I ignored PC GAF's advice about 3GB vs 4GB cards. 3GB cards are this gen's equivalent of the 1.2GB cards that came out with the GTX 4xx series.

As soon as the PS4 was announced with 8GB GDDR5 it should have been clear as day to anyone that 4GB would be the minimum to get you comfortably through this generation. 3GB cards should hold on for a while yet, but that's the absolute minimum VRAM capacity that even a midrange user should be aiming for.

People are still recommending stuff like a 2GB GTX 770 every day in the PC thread and it's never sat well with me.

Same applies to the 780 Ti. If you're buying it for the long term then the 6GB option is the only one you should be considering. If you change your card every 12 months then that's a different matter.
 

EmpReb

Banned
Feb 10, 2014
534
0
0
I learned my lesson the last time I got a GPU, when everyone said 1GB of VRAM was enough. Should have gotten a 2GB card that time. Glad I went the 4GB route this upgrade. TF and this game have already proven I was right that VRAM would be the first thing to be used up this gen.
 

andreyblade

Neo Member
Nov 10, 2013
104
0
0
I'm seriously surprised that Epic didn't step in to help Zombie with Daylight considering it's the first UE4-powered game and needs to accurately show off what the engine can do. Here's hoping the studio will blow our minds and eyeballs with Fortnite
 

brain_stew

Member
Feb 20, 2007
19,261
1
1,215
I learned my lesson the last time I got a GPU, when everyone said 1GB of VRAM was enough. Should have gotten a 2GB card that time. Glad I went the 4GB route this upgrade. TF and this game have already proven I was right that VRAM would be the first thing to be choked.

People are always quick to forget the past. Cards like the 320MB 8800 GTS and 512MB 4870 were lauded at release, yet just a year later both were running into serious VRAM constraints. The 2GB GTX 770 will end up with the same unfortunate fate.

If you don't change your card every year and value longevity then you should always take the higher VRAM option.
 

TAJ

Darkness cannot drive out darkness; only light can do that. Hate cannot drive out hate; only love can do that.
Jun 9, 2004
24,937
0
0
How well does the PS4 version run?

The framerate is rough on PS4, which is weird because it's one of the worst-looking games I've seen on PS4 and definitely has the worst IQ I've seen on PS4 out of 26 titles.
 

Cudder

Member
Aug 25, 2010
9,561
3
870
Canada
The framerate is rough on PS4, which is weird because it's one of the worst-looking games I've seen on PS4 and definitely has the worst IQ I've seen on PS4 out of 26 titles.

not to mention the environments are small as fuck. the game runs like crap on PS4, complete joke.
 

mr2xxx

Banned
Mar 12, 2011
8,208
289
825
My 570 GTX is looking at me with sad puppy eyes, he knows he's not long for this world.
 

artist

Banned
May 7, 2006
16,629
0
0
As soon as the PS4 was announced with 8GB GDDR5 it should have been clear as day to anyone that 4GB would be the minimum to get you comfortably through this generation. 3GB cards should hold on for a while yet, but that's the absolute minimum VRAM capacity that even a midrange user should be aiming for.

People are still recommending stuff like a 2GB GTX 770 every day in the PC thread and it's never sat well with me.

Same applies to the 780 Ti. If you're buying it for the long term then the 6GB option is the only one you should be considering. If you change your card every 12 months then that's a different matter.
NVIDIA NVIDIA NVIDIA
 

Duxxy3

Member
Nov 30, 2007
22,331
4
795
New consoles have oodles of ram, so developers stop optimizing ram usage. It's why I was wary of upgrading my PC so soon into the new generation.
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
Jan 29, 2008
36,152
8
0
Australia
Seems like an Unreal Engine 4 thing. Both the realistic rendering and cave demos take up ~2.5GB VRAM, meanwhile the Elemental demo hits 3+GB VRAM.
 

SapientWolf

Trucker Sexologist
Jul 4, 2004
35,733
0
0
New consoles have oodles of ram, so developers stop optimizing ram usage. It's why I was wary of upgrading my PC so soon into the new generation.
The One doesn't really have a lot of VRAM to work with, so multiplatform games are going to have to use texture streaming to accommodate that.

What usually happens on the PC is that there's some setting that provides a small increase in texture quality with a massive increase in VRAM usage (i.e. insane textures in Titanfall). Plus, Daylight isn't exactly the game that convinces me that you need 6+GB of VRAM this gen.
 

brain_stew

Member
Feb 20, 2007
19,261
1
1,215
NVIDIA NVIDIA NVIDIA

I've never hidden the fact that I sit in team green for all manner of reasons (stability, S3D, gsync, faster driver support, less hitching and previously Nvidia Inspector but RadeonPro looks to have surpassed it now) but they're currently dropping the ball in terms of VRAM. At every important pricepoint AMD seem to give you an extra 1GB of VRAM and if you're buying a card to last you 2+ years (which I firmly believe encompasses most gamers) then that's going to make a huge difference.

The GTX 770 has the horsepower to see through this console generation yet it barely has another year or two's life in it because 95% of cards are sold with a piddling 2GB GDDR5.
 

ChawlieTheFair

pip pip cheerio you slags!
Jul 8, 2013
4,573
0
0
New Orleans
Seems like an Unreal Engine 4 thing. Both the realistic rendering and cave demos take up ~2.5GB VRAM, meanwhile the Elemental demo hits 3+GB VRAM.

While that is the case, the game doesn't look as good as those demos. Especially the Elemental one.



I mean it's decent looking, but if I didn't already know it was UE4 I wouldn't have guessed it was.
 

Stimpack

Member
Aug 14, 2012
3,611
1
490
What a ridiculous question!
Great. So what does this mean for current gen graphics cards? 2GB seems to be the average if I'm not mistaken. So no AA without new cards or what? Also have we heard anything else about the whole "unified memory" for the next generation of cards?
 

SparkTR

Member
Jul 16, 2008
9,571
0
0
New consoles have oodles of ram, so developers stop optimizing ram usage. It's why I was wary of upgrading my PC so soon into the new generation.

Well this game in particular apparently runs like ass on PS4, with terrible IQ as well I think. Might be worth it for devs to start optimizing again.
 

Eusis

Member
Apr 15, 2011
36,667
1
705
As soon as the PS4 was announced with 8GB GDDR5 it should have been clear as day to anyone that 4GB would be the minimum to get you comfortably through this generation. 3GB cards should hold on for a while yet, but that's the absolute minimum VRAM capacity that even a midrange user should be aiming for.

People are still recommending stuff like a 2GB GTX 770 every day in the PC thread and it's never sat well with me.

Same applies to the 780 Ti. If you're buying it for the long term then the 6GB option is the only one you should be considering. If you change your card every 12 months then that's a different matter.
It's part of why I felt my mentality of "get a PS4 now, upgrade the PC later" was smartest. Granted, everyone's got different priorities, or even enough money to just get a Titan Z to go with their PS4 AND XB1. But if you want the newest games guaranteed to look great, and you want to make sure your PC truly destroys consoles at a relatively reasonable price, you'll have to wait for the consoles to age some and computer hardware to advance even further; being merely somewhat ahead will get you stomped on in a year or two, I suspect.

Plus these days I feel like the average PC will do for most indie games. Maybe not this one, but then it's on consoles (and didn't seem to get a great reception anyway) and anything that IS a problem can be something to look forward to when you build a new PC.
 

JLeack

Banned
Dec 14, 2012
3,106
0
0
California
My Ivy Bridge 3570K and GTX 570 can barely run this game despite it looking... not that great. Performance, or lack thereof, is my biggest gripe.
 

Prophet Steve

Member
Sep 16, 2009
12,961
0
920
Netherlands
Compare frame rates between graphics cards with lower and higher amounts of VRAM and then I'll start to worry. This is a very unreliable way of measuring.
 

thematic

Member
Oct 24, 2005
490
0
0
where are the people that always said "you're not gonna use > 2GB for ANY GAME @ 1080p!!!!", "by the time you need > 2GB, your chip won't be enough to render it!!!"?

i got a GTX 760 2GB...
4GB cards look brighter than yesterday :(
 

Thoraxes

Member
Jan 12, 2009
28,526
0
0
soundcloud.com
where are the people that always said "you're not gonna use > 2GB for ANY GAME @ 1080p!!!!", "by the time you need > 2GB, your chip won't be enough to render it!!!"?

i got a GTX 760 2GB...
4GB cards look brighter than yesterday :(

Those people were talking about RAM, not VRAM.
 

SighFight

Member
Sep 20, 2013
432
15
500
Germany
where are the people that always said "you're not gonna use > 2GB for ANY GAME @ 1080p!!!!", "by the time you need > 2GB, your chip won't be enough to render it!!!"?

i got a GTX 760 2GB...
4GB cards look brighter than yesterday :(

Just because a game uses all the memory that is available, it doesn't necessarily mean there is a performance gain.
Edit: and 4GB on a GTX 760 would still be a waste of money.
 

Lingitiz

Member
Sep 19, 2010
9,349
0
0
abrasiontest.wordpress.com
This being the first UE4 game, released before Epic has put out one of their own, and Zombie Studios being notorious for poor optimization, makes this no surprise. Luckily the game seems to be pretty poor and doesn't even look all that great, so we're not missing out on much.
 

Eusis

Member
Apr 15, 2011
36,667
1
705
Those people were talking about RAM, not VRAM.
If you're talking about only ever needing 2GB of RAM in your computer, that's even dumber.

Though the fact that I needed 2GB to max out Max Payne 3 a few years back should've indicated that with these consoles it absolutely wouldn't cut it.
 

Durante

Member
Oct 1, 2006
48,836
1
0
peter.metaclassofnil.com
I really think that we need some more detailed benchmarks, e.g. a 770 with 2 GB vs. one with 4GB, before jumping to conclusions. It might just be that the engine tries to actively cache as much as possible, but without a large performance impact.

Anyway, I got a 4GB 770 :p
 

riflen

Member
Feb 12, 2013
2,402
0
0
I've never hidden the fact that I sit in team green for all manner of reasons (stability, S3D, gsync, faster driver support, less hitching and previously Nvidia Inspector but RadeonPro looks to have surpassed it now) but they're currently dropping the ball in terms of VRAM. At every important pricepoint AMD seem to give you an extra 1GB of VRAM and if you're buying a card to last you 2+ years (which I firmly believe encompasses most gamers) then that's going to make a huge difference.

The GTX 770 has the horsepower to see through this console generation yet it barely has another year or two's life in it because 95% of cards are sold with a piddling 2GB GDDR5.

GPU designers are not creating GPUs to last 2+ years, that's something that price-conscious consumers have chosen to expect. They are creating GPUs to play games that exist today. Whether a GPU is still useful 2+ years later is up to developers, who design their games with a VRAM budget in mind.
The VRAM allocation on GPUs is a product of the bus width employed by the GPU design. 256-bit and 512-bit = 1,2,4,8,16,32GB, 384-bit = 1.5,3,6,12,24GB. After the bus width is finalised, the next consideration is cost and whether the GPU has the throughput to utilise the VRAM.
A 2GB 770 will play UE4 games fine. You might have to stick to 1920x1080 and (gasp) turn down some of the more VRAM-hungry settings.
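The bus-width point above can be sketched numerically: capacities come in those steps because each 32-bit memory channel pairs with one memory chip, so total VRAM is channel count times chip density. The chip densities below are illustrative, and doubled (clamshell) configurations simply double the result.

```python
def vram_options(bus_width_bits, chip_densities_gb=(0.25, 0.5, 1.0, 2.0)):
    """Possible VRAM capacities in GB, assuming one memory chip
    per 32-bit channel (chip densities here are illustrative)."""
    channels = bus_width_bits // 32
    return [channels * d for d in chip_densities_gb]

print(vram_options(256))  # e.g. the 2GB/4GB GTX 770 (256-bit)
print(vram_options(384))  # e.g. the 3GB/6GB 780 Ti (384-bit)
```

A 256-bit bus yields the 2/4/8GB family while a 384-bit bus yields 3/6/12GB, which is why a "3.5GB" or "5GB" card essentially never appears.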
 

GhostTrick

Banned
Jan 11, 2012
16,580
12
0
I really think that we need some more detailed benchmarks, e.g. a 770 with 2 GB vs. one with 4GB, before jumping to conclusions. It might just be that the engine tries to actively cache as much as possible, but without a large performance impact.

Anyway, I got a 4GB 770 :p



Well then good sir, start a benchmark with your 4GB 770... then remove the fans and cut half of the memory chips, then run a new benchmark :p
 

thematic

Member
Oct 24, 2005
490
0
0
Those people were talking about RAM, not VRAM.

nope, it's VRAM

Just because a game uses all memory that is available it doesn't necessarily mean that there is a performance gain.
edit and 4GB on a GTX 760 would still be a waste of money.

so, if most developers use 4GB VRAM for OMGULTRA texture quality, 2GB VRAM can run it without a performance drop?

If you're talking about only ever needing 2 GB of RAM in your computer you're even dumber.

Though the fact I needed 2 GB to max out Max Payne 3 a few years back should've indicated that with these consoles that absolutely wouldn't cut it.

LOL yeah
 

wiggleb0t

Banned
Mar 29, 2010
1,236
0
0
where are the people that always said "you're not gonna use > 2GB for ANY GAME @ 1080p!!!!", "by the time you need > 2GB, your chip won't be enough to render it!!!"?

i got a GTX 760 2GB...
4GB cards look brighter than yesterday :(

There is heaps of misinformed advice given on these forums. A common comment is that SLI/Crossfire is only for people who run benchmarks... It's simply moronic/childish stupidity that doesn't warrant a response.

The 2GB vs 4GB VRAM question on a 256-bit bus was 'tackled' via a 'review' of the 680, though that was at 1000MHz lower RAM speed, and 7xx-series RAM can take a +1000ish overclock. Unfortunately, the 'review' that was often quoted tested Skyrim and then concluded there was no point in having the 4GB model, allowing no room for different scenarios or for readers to use their own reasoning, so you get perpetual cycles of misinformed comments.

"This is NeoGaf" goes both ways from techheads to utter stupidity.
 

Durante

Member
Oct 1, 2006
48,836
1
0
peter.metaclassofnil.com
so, if most developers use 4GB VRAM for OMGULTRA texture quality, 2GB VRAM can run it without a performance drop?
You don't really understand the argument being made. It's not that there won't be any performance drop, it's that if a modern engine uses large amounts of memory for caching, the performance drop with less memory will not be nearly as severe as it would be were you to run out of GPU memory in a traditional scenario (where all the assets in GPU memory are constantly needed).

We don't know if this is the case here or not, but just knowing how much memory the game uses alone doesn't tell us the whole story.
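The distinction being drawn above — cache spill versus working-set spill — can be put into a toy cost model. Every number here is invented for illustration; the point is only the shape of the penalty, not real timings.

```python
def frame_penalty_ms(vram_mb, hot_mb, cached_mb,
                     miss_ms_per_mb=0.01, cold_touch_rate=0.02):
    """Toy cost model (all parameters invented).
    hot_mb:    assets the renderer touches every single frame
    cached_mb: everything the engine would *like* to keep resident
    Spilled hot data must be re-streamed every frame; spilled cold
    (cache-only) data costs something only when it is actually requested."""
    hot_spill = max(0, hot_mb - vram_mb)
    cold_spill = max(0, cached_mb - max(vram_mb, hot_mb))
    return hot_spill * miss_ms_per_mb + cold_spill * miss_ms_per_mb * cold_touch_rate

# Engine opportunistically caches ~3.2GB, but only 1.5GB is needed per frame:
print(frame_penalty_ms(2048, hot_mb=1536, cached_mb=3276))  # ~0.25 ms: cache spill is nearly free
# Versus a game that genuinely needs 3GB every frame on a 2GB card:
print(frame_penalty_ms(2048, hot_mb=3072, cached_mb=3072))  # ~10.24 ms: constant thrashing
```

Under this model a 2GB card loses almost nothing against a game that merely *caches* 3GB+, but falls off a cliff against one whose per-frame working set exceeds 2GB — which is exactly why usage numbers alone can't settle the question.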
 

Eusis

Member
Apr 15, 2011
36,667
1
705
I wonder if there was frequent hitching? You CAN go over your VRAM, but then the card may need to keep loading stuff in and thus cause performance hiccups, like what happened with The Witcher 2 before I jumped to a newer card. Never mind that the card I'm using (a 560 Ti with 1GB of RAM) is on there and reports <20 FPS. I'm not Dennis; I'd want games to be reasonably smooth most of the time. And while this isn't a game that needed maxing out, I do wonder how bad some next-gen games will look on this card once developers start really using the new systems, even if it roughly holds its own against them as it is.

But that's more fodder for "I want a $200-ish upgrade that destroys consoles and runs the stuff consoles can't handle pretty well", and while that might be doable now, and hardware always gets better anyway, the longer the wait the wider the gulf.

EDIT: There's also the fact that eventually they just stop optimizing well for older hardware, even if said older hardware could destroy consoles.