
Daylight (PC) has Absurdly High VRAM Usage (3GB+ @ 1080P)

As soon as the PS4 was announced with 8GB GDDR5 it should have been clear as day to anyone that 4GB would be the minimum to get you comfortably through this generation. 3GB cards should hold on for a while yet, but that's the absolute minimum VRAM capacity that even a midrange user should be aiming for.

People are still recommending stuff like a 2GB GTX 770 every day in the PC thread and it's never sat well with me.

Same applies to a 780 Ti. If you're buying it for the long term then the 6GB option is the only one you should be considering. If you change your card every 12 months then that is a different matter.

I'm glad someone said it. A lot of people were dismissing this, but in the back of my head I always thought "this is going to bite us in the ass".

Nvidia could make like AMD and start piling on the VRAM in their cards, though. Then again, I guess they need to leave a reason to upgrade their cards ;)
 

dark10x

Digital Foundry pixel pusher
Is this a joke? That's pretty harsh.
I wonder how the Zombie Studios of today compares to the original team?

They've always been kind of hit or miss, though, really. They were originally responsible for Spec Ops: Rangers Lead the Way (which was awful), among other things.

I was surprised to learn that they developed ZPC (Zero Population Count), though. It was based on Bungie's Marathon 2 engine and was just insanely weird.
 

manzo

Member
While that is the case, the game doesn't look as good as those demos. Especially the Elemental one.

(screenshot: daylight-5.jpg)


I mean it's decent looking, but if I didn't already know it was UE4 I wouldn't have guessed it was.

OH GOD MY EYES

Goddammit, bloom and other effects never made my eyes water like chromatic aberration does.
 

thematic

Member
You don't really understand the argument being made. It's not that there won't be any performance drop, it's that if a modern engine uses large amounts of memory for caching, the performance drop with less memory will not be nearly as severe as it would be were you to run out of GPU memory in a traditional scenario (where all the assets in GPU memory are constantly needed).

We don't know if this is the case here or not, but just knowing how much memory the game uses alone doesn't tell us the whole story.

Okay, I understand the "memory for caching" thing...
but my question still stands: is 2GB of VRAM going to be enough when developers use higher-resolution textures / "assets constantly needed" for next-gen games @ 1080p?

Or do current games already use "the ultra highest texture resolution/quality" for 1080p? Because while I play Tomb Raider (2013) on Ultimate (using approx. 1.6GB of VRAM), some textures still feel "low-res", especially when zoomed in (leaves, rocks, etc.).

Now, if the next-gen consoles all have 8GB of RAM/VRAM (combined) and use most of it for texture data (remember the consoles only have roughly 7790/7870-level hardware), will a PC with 2GB of VRAM (and similar hardware) be enough for "Ultimate" settings without toning it down?

Basically, what I don't agree with is: "you won't use more than 2GB of VRAM for 1080p".
Will 1080p really not use over 2GB of VRAM?
Or is it just another 2GB vs. 4GB "benchmark" with a game that doesn't use "OMGULTRAHIRES" textures?
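To make the caching point from the quote above concrete, here is a hypothetical sketch of an engine-style texture cache (not actual UE4 or Daylight code; all names and sizes are made up). It fills whatever VRAM budget it is given, so reported "usage" scales with the card, but when the budget shrinks it just evicts the least-recently-used textures and re-streams them later instead of falling over:

```cpp
// Hypothetical sketch of a VRAM texture cache that fills whatever budget it is
// given and evicts least-recently-used entries when space runs out.
// Not actual UE4 or Daylight code; names and sizes are made up for illustration.
#include <cstdint>
#include <iostream>
#include <list>
#include <string>
#include <unordered_map>
#include <utility>

class TextureCache {
public:
    explicit TextureCache(std::uint64_t budgetBytes) : budget_(budgetBytes) {}

    // Request a texture: a hit is free, a miss "uploads" it after evicting the
    // least-recently-used textures until the new one fits.
    void request(const std::string& name, std::uint64_t sizeBytes) {
        auto it = index_.find(name);
        if (it != index_.end()) {
            lru_.splice(lru_.begin(), lru_, it->second);  // cache hit: mark most recent
            return;
        }
        while (used_ + sizeBytes > budget_ && !lru_.empty()) {  // evict until it fits
            used_ -= lru_.back().second;
            index_.erase(lru_.back().first);
            lru_.pop_back();
            ++evictions_;
        }
        lru_.emplace_front(name, sizeBytes);  // "upload" to VRAM
        index_[name] = lru_.begin();
        used_ += sizeBytes;
    }

    std::uint64_t usedBytes() const { return used_; }
    std::uint64_t evictions() const { return evictions_; }

private:
    using Entry = std::pair<std::string, std::uint64_t>;  // texture name, size in bytes
    std::uint64_t budget_;
    std::uint64_t used_ = 0;
    std::uint64_t evictions_ = 0;
    std::list<Entry> lru_;                                               // front = most recently used
    std::unordered_map<std::string, std::list<Entry>::iterator> index_;
};

int main() {
    const std::uint64_t MiB = 1024ull * 1024ull;
    const std::uint64_t budgets[] = {3072 * MiB, 2048 * MiB};  // a "3GB" and a "2GB" card

    // Same made-up access pattern against both budgets: a sliding window of
    // 16MiB textures, the sort of thing a streaming level produces.
    for (std::uint64_t budget : budgets) {
        TextureCache cache(budget);
        for (int frame = 0; frame < 3000; ++frame)
            for (int t = 0; t < 40; ++t)
                cache.request("tex_" + std::to_string((frame / 10 + t) % 300), 16 * MiB);
        std::cout << budget / MiB << " MiB budget -> " << cache.usedBytes() / MiB
                  << " MiB resident, " << cache.evictions() << " evictions\n";
    }
}
```

Under this toy pattern both budgets end up reporting full "usage" and neither breaks; the smaller one just evicts and re-streams more often. That's the difference between "uses 3GB" and "needs 3GB", and a usage number on its own can't tell you which one you're looking at.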
 

2San

Member
As soon as the PS4 was announced with 8GB GDDR5 it should have been clear as day to anyone that 4GB would be the minimum to get you comfortably through this generation. 3GB cards should hold on for a while yet, but that's the absolute minimum VRAM capacity that even a midrange user should be aiming for.

People are still recommending stuff like a 2GB GTX 770 every day in the PC thread and it's never sat well with me.

Same applies to a 780 Ti. If you're buying it for the long term then the 6GB option is the only one you should be considering. If you change your card every 12 months then that is a different matter.
How so? People recommending the GTX 760/770 are doing so based on the numbers we do know. The main point is that if you get a GTX 770 4GB you might as well get a GTX 780 for a bit more, and it will destroy the 770 in the vast majority of situations, even in the future. The GTX 760 does not have the computational power to justify 4GB of RAM, based on the numbers we do have. PC-GAF tries to give you the best value for your money. If you want to waste money on very specific scenarios, be my guest.
 

Tak3n

Banned
Worth noting that nearly all GTX cards from the 660 onwards were coming with a code for this game... and they have 2GB.

Not to mention the requirements:

Operating System: Microsoft Windows 7 / Microsoft Windows 8
Processor: Intel Core i5 @ 2.8 GHz / AMD Phenom II X4 @ 3.2 GHz
RAM: 2 GB
Video card: GeForce GT 630 1GB DDR3, Radeon HD 5570 1024MB or higher
Sound card: DirectX 9.0c compatible
Free space on hard disk: 4.1 GB
 

MisterM

Member
It ran fine last night on my 7870 XT 2GB @ 1080p high settings; I got a constant Vsync'd 60. Although that was without the Nvidia-specific stuff (Bokeh DOF, PhysX & HBAO+, I believe).

It was still a terrible game though. Why did I pre-order :'(
 
This was obvious the moment we learnt about the amount of RAM in the new consoles (regardless of OS footprint at launch).

It is not the best time to invest in a new GPU if you are just a regular user that upgrades once in a while, not when affordable solutions that will last you for years are around the corner.
 

HowZatOZ

Banned
Wait, so my 770 4GB is going to clock out at roughly 4GB usage and not even hit 60FPS @ 1080p properly? Who the hell built this game?!
 

R_Deckard

Member
OH GOD MY EYES

Goddammit, bloom and other effects never made my eyes water like chromatic aberration does.

Get used to Chromatic Aberration, dude. ALL UE4 games use it LOADS; look at the Infiltrator, it is loaded with it.

As are The Division, The Order: 1886 and even the latest DriveClub trailer... Nice easy way to look SNAZZY and save on AA!

Edit: Also, the unified memory was always going to cause issues; the next thing will be PCIe 3.0... Dat 16GB/s limit gonna feel smaller and smaller!
 

R1CHO

Member
This was obvious the moment we learnt about the amount of RAM in the new consoles (regardless of OS footprint at launch).

It is not the best time to invest in a new GPU if you are just a regular user that upgrades once in a while, not when affordable solutions that will last you for years are around the corner.

Not that obvious.

First, only one of the consoles has GDDR5.

Second, even if the PS4 has 8GB of GDDR5, apparently only 6 are available to developers, and of that, how much is going to be used as VRAM by the average game?

If I had to bet, I would say closer to 2GB than 4.

On the other hand, PC gamers are free to set the rendering resolution and a lot of other parameters that impact VRAM consumption, but for around-1080p gaming it is not crazy to believe that 2GB GPUs are OK.
 

Durante

Member
Get used to Chromatic Aberration, dude. ALL UE4 games use it LOADS; look at the Infiltrator, it is loaded with it.

As are The Division, The Order: 1886 and even the latest DriveClub trailer... Nice easy way to look SNAZZY and save on AA!
He doesn't need to get used to it, as long as a game is UE4 it should be very simple to disable :p
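(For the curious: the usual UE4 route, assuming a game hasn't locked down its console variables, is the scene colour fringe cvar. Something along these lines in the game's Engine.ini, or typed into the console if it's accessible, should kill the effect; the exact file location varies per game and these names are from memory, so treat it as a sketch rather than gospel.)

```ini
; Illustrative only; cvar names quoted from memory, file location varies per game.
[SystemSettings]
r.SceneColorFringeQuality=0
r.SceneColorFringe.Max=0
```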
 

SmartBase

Member
So the game uses all the VRAM that's available to it? How is that absurd? Plenty of games do that.

Feel like I'm missing something here.
 
Get used to Chromatic Aberration, dude. ALL UE4 games use it LOADS; look at the Infiltrator, it is loaded with it.

As are The Division, The Order: 1886 and even the latest DriveClub trailer... Nice easy way to look SNAZZY and save on AA!

Also, the unified memory was always going to cause issues; the next thing will be PCIe 3.0... Dat 16GB/s limit gonna feel smaller and smaller!

It is a post processing effect that is very easy to enable but also very easily disabled. I did it for the Reflections demo.

With CA: (screenshot)

Without CA: (screenshot)

Do we know of any games that genuinely use up 4GB of memory, rather than the caching method mentioned?

Does BF4 use over 2GB on Ultra?

I think there are some titles, but anti-aliasing and higher resolutions have a big effect.

I thought 4.5GB of VRAM was considered low-end?

Hell no, very few cards even have that much.

So the game uses all the VRAM that's available to it? How is that absurd? Plenty of games do that.

Feel like I'm missing something here.

People want to overreact. It is possible that 2GB may soon not be enough, but this is not the proof.
 

Eusis

Member
Hadn't heard of Chromatic Aberration before.

Makes games look like I'm supposed to be wearing 3D glasses... and makes me wonder how they'd look if you WERE wearing them.
People want to overreact. It is possible that 2GB may soon not be enough, but this is not the proof.
It's still best to wait a while, I think, unless you need a replacement right now. Wait until the true next-gen games start coming out, the ones that are multiplatform and don't bother with PS3/360, like Arkham Knight or The Witcher 3, and see what kind of computer hardware those demand. If you pick something just because it's above what the consoles are marked as being, you may be in for a rude awakening, depending on what you even want out of performance.
 

DarkFlow

Banned
The amount of FUD being spread in here is off the charts, I really don't know where to begin. For one, the game looks to be using whatever amount of VRAM is available, so it scales. That really means nothing by itself. The other thing is, unless you plan on running games above 1080p, 2-3GB is more than enough to run anything you can throw at it. As far as comparing this to the HD twins, they have to share that 8 gigs with EVERYTHING, and on the PS4, 2GB of it is already locked up by the OS.
 

Eusis

Member
In this specific case it might also be a good idea for horror: go for a more unreal and surreal look to the world. In something trying to look realistic with normal vision it'd just look stupid, though.
As far as comparing this to the HD twins, they have to share that 8 gigs with EVERYTHING, and on the PS4, 2GB of it is already locked up by the OS.
This sounds like a double-edged sword to me, though. On one hand a game may not use that much VRAM because it's busy juggling that pool with everything else, but on the other a game might decide to just blow most of it on video data, and then for a video card to compare it needs to have a much larger pool to work with. Ergo, better to see what actually happens with the multiplats if you want to make a safe, relatively future-proof purchase, if that's your goal.
 

riflen

Member
The amount of FUD being spread in here is off the charts, I really don't know where to begin. For one, the game looks to be using whatever amount of VRAM is available, so it scales. That really means nothing by itself. The other thing is, unless you plan on running games above 1080p, 2-3GB is more than enough to run anything you can throw at it. As far as comparing this to the HD twins, they have to share that 8 gigs with EVERYTHING, and on the PS4, 2GB of it is already locked up by the OS.

I thought it was proven to be 3GB for the PS4 OS at this time? 5GB for your game, with another 512MB available under certain circumstances, if possible.

http://www.neogaf.com/forum/showthread.php?t=782997

So said Naughty Dog recently.
 

JaseC

gave away the keys to the kingdom.
The amount of FUD being spread in here is off the charts, I really don't know where to begin. For one, the game looks to be using whatever amount of VRAM is available, so it scales. That really means nothing by itself. The other thing is, unless you plan on running games above 1080p, 2-3GB is more than enough to run anything you can throw at it. As far as comparing this to the HD twins, they have to share that 8 gigs with EVERYTHING, and on the PS4, 2GB of it is already locked up by the OS.

Both the PS4 and X1 currently reserve 3GB for the OS.
 

Knurek

Member
Get used to Chromatic Aberration, dude. ALL UE4 games use it LOADS; look at the Infiltrator, it is loaded with it.

As are The Division, The Order: 1886 and even the latest DriveClub trailer... Nice easy way to look SNAZZY and save on AA!

Why would devs want to simulate looking at a game through a pair of bad/cheap corrective glasses?
 

Dr Dogg

Member
Seems like an Unreal Engine 4 thing. Both the realistic rendering and cave demos take up ~2.5GB VRAM, meanwhile the Elemental demo hits 3+GB VRAM.

Yeah, the Elemental demo you posted in your thread used every last MB of my VRAM at the default settings. Though aside from one tiny little hitch in the first few seconds, assets didn't have the usual streaming issues associated with UE.
 

Zoned

Actively hates charity
Seems like the developers didn't bother to optimize it. Ended up producing bomba material, though.
 

Eusis

Member
Why would devs want to simulate looking at a game through a pair of bad/cheap corrective glasses?
Trippier moments. I imagine most any visual effect has its place; the question is what that place is.

Also, graphics are probably driven by novelty just as much as by being technically impressive, so funny filters and post-processing would count. Something that just makes it feel fresh.
 

gtrunner

Banned
That's what's happening. I was playing it with an overclocked GTX 670 2GB; it would run smooth until I turned around in a circle, then hiccup.

But even 780 Tis only have 3GB of VRAM; it would be pointless upgrading to one if you are going to run into the same VRAM issue next year with bigger games.

Happy I haven't upgraded yet, I would be pissed.
 

derExperte

Member
I'll believe more than 2GB is needed when someone shows a frametime benchmark of a 2GB vs. a 4GB card.

I'll believe it when a game that doesn't run like poop almost everywhere and looks like it's actually using the VRAM for something needs more than 2GB to look good. Not sure why anyone is drawing any conclusions from this here.
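For what it's worth, the kind of frametime comparison asked for above mostly comes down to logging per-frame render times on each card and comparing the worst frames rather than the averages, since VRAM thrashing shows up as spikes that an FPS counter smooths over. A minimal sketch, assuming a made-up log format of one frame duration in milliseconds per line:

```cpp
// Hypothetical sketch: compare frametime logs (one frame duration in ms per line)
// from a 2GB and a 4GB card. Averages hide VRAM stutter; percentiles expose it.
// File names and log format are assumptions, not any particular tool's output.
#include <algorithm>
#include <fstream>
#include <iostream>
#include <numeric>
#include <string>
#include <vector>

static void report(const std::string& label, std::vector<double> ms) {
    if (ms.empty()) return;
    std::sort(ms.begin(), ms.end());
    double avg = std::accumulate(ms.begin(), ms.end(), 0.0) / ms.size();
    double p99 = ms[static_cast<std::size_t>(0.99 * (ms.size() - 1))];  // worst 1% of frames
    std::cout << label << ": avg " << avg << " ms (" << 1000.0 / avg << " fps), "
              << "99th percentile " << p99 << " ms\n";
}

int main(int argc, char** argv) {
    // Usage: frametimes 2gb_log.txt 4gb_log.txt   (file names are made up)
    for (int i = 1; i < argc; ++i) {
        std::ifstream in(argv[i]);
        std::vector<double> ms;
        for (double v; in >> v;) ms.push_back(v);
        report(argv[i], std::move(ms));
    }
}
```

Feed it one log per card and compare the 99th-percentile numbers: if the 2GB card is genuinely running out of memory rather than just filling a cache, that is where it shows.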
 
A lot of PS4 developers seem to be allocating 2GB for VRAM usage from what I've seen... which is the sweet spot for 1080p gaming, as well as leaving at least 3GB for everything else.
 
On the other hand, PC gamers are free to set the rendering resolution and a lot of other parameters that impact VRAM consumption, but for around-1080p gaming it is not crazy to believe that 2GB GPUs are OK.
8GB of RAM (GDDR5 or DDR3 doesn't really matter in this case) means that GPUs with 2GB of VRAM are not going to cut it the moment devs start using that as the new baseline, especially if you want to go above console settings.

If you only upgrade once in a while you want something that is future-proof (and most want something affordable too). 2GB of VRAM may be enough right now, but it won't be for long.
 

mrklaw

MrArseFace
It's part of why I felt my mentality of "get a PS4 now, upgrade the PC later" was smartest. Granted, everyone's got different priorities, or even enough money to just get a Titan Z to go with their PS4 AND XB1, but if you want the newest games guaranteed to look great, and you want to make sure your PC truly destroys consoles at a relatively reasonable price, you'll have to wait for the consoles to age some and computer hardware to advance even further; being merely somewhat ahead will get you stomped on in a year or two, I suspect.

Plus these days I feel like the average PC will do for most indie games. Maybe not this one, but then it's on consoles (and didn't seem to get a great reception anyway) and anything that IS a problem can be something to look forward to when you build a new PC.

I think this time, though, it is just about RAM. The power of current cards should see you safely through, but you'd want 4GB to be safe.
 

DarkFlow

Banned
8GB of RAM (GDDR5 or DDR3 doesn't really matter in this case) means that GPUs with 2GB of VRAM are not going to cut it the moment devs start using that as the new baseline, especially if you want to go above console settings.

If you only upgrade once in a while you want something that is future-proof (and most want something affordable too). 2GB of VRAM may be enough right now, but it won't be for long.
That's not how it works. That 8GB is not VRAM; it's really 5GB of everything-RAM, since 3 is used by the OS. You can't lump it all together and say "ha, they have so much VRAM", because they don't. They have to use a lot of that RAM the same way a PC uses its RAM to load and run shit.
 

riflen

Member
I think this time, though, it is just about RAM. The power of current cards should see you safely through, but you'd want 4GB to be safe.

In fact, this time it could also be very much about your PC's CPU.
The consoles have programmable GPUs (GPGPU) that can and will be used by developers to vastly improve performance for certain workloads. The GPU will take on tasks that would otherwise be left for the CPU to handle (and which it's not very good at doing).

The hardware fragmentation that is inherent in the PC space means that developers cannot rely on the user's GPU having sufficient GPGPU capability. So they don't take advantage of GPGPU as often, and the burden is placed on the CPU for calculations such as the interaction of particles and other complex physics simulation.

This situation is one reason why you see Nvidia paying developers to include their PhysX technology in titles. The developer just cannot spend the time and money on implementing features that an unknown number of users can take advantage of effectively.
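As a toy illustration of the kind of work being described here (all names and numbers made up, not anyone's actual engine code): each particle's update reads only last frame's state and writes only its own slot, which is exactly the shape of job a console can dispatch to the GPU as a compute/GPGPU pass, while a PC port that can't count on GPGPU support often ends up running the same loop on the CPU.

```cpp
// Toy sketch of a data-parallel particle update. On the consoles the loop body
// could be dispatched as one GPU compute job, one thread per particle; on PC,
// without a guaranteed GPGPU path, it often stays on the CPU as written below.
#include <cmath>
#include <cstddef>
#include <iostream>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

// The per-particle "kernel": simple gravity towards the origin plus integration.
// No particle writes to another, which is what makes the work trivially parallel.
inline Particle updateParticle(Particle p, float dt) {
    float r2 = p.x * p.x + p.y * p.y + p.z * p.z + 1e-4f;
    float inv = 1.0f / (r2 * std::sqrt(r2));               // ~ 1 / r^3
    p.vx -= p.x * inv * dt; p.vy -= p.y * inv * dt; p.vz -= p.z * inv * dt;
    p.x += p.vx * dt; p.y += p.vy * dt; p.z += p.vz * dt;
    return p;
}

int main() {
    std::vector<Particle> prev(100000, Particle{1.f, 2.f, 3.f, 0.f, 0.f, 0.f});
    std::vector<Particle> next(prev.size());

    // CPU fallback: a plain serial loop over the same per-particle function.
    for (std::size_t i = 0; i < prev.size(); ++i)
        next[i] = updateParticle(prev[i], 1.0f / 60.0f);

    std::cout << "first particle now at (" << next[0].x << ", " << next[0].y
              << ", " << next[0].z << ")\n";
}
```

The per-particle function is the bit that would become the compute shader; the surrounding loop is just the CPU fallback riflen is talking about.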
 