michaelius
I'll believe more than 2GB is needed when someone shows a frametime benchmark of a 2GB vs a 4GB card.
We have already established Zombie Studios is a garbage developer.
As soon as the PS4 was announced with 8GB GDDR5 it should have been clear as day to anyone that 4GB would be the minimum to get you comfortably through this generation. 3GB cards should hold on for a while yet, but that's the absolute minimum VRAM capacity that even a midrange user should be aiming for.
People are still recommending stuff like a 2GB GTX 770 every day in the PC thread and it's never sat well with me.
Same applies to a 780 Ti. If you're buying it for the long term then the 6GB option is the only one you should be considering. If you change your card every 12 months then that's a different matter.
I wonder how the Zombie Studios of today compares to the original team?

Is this a joke? That's pretty harsh.
While that is the case, the game doesn't look as good as those demos, especially the Elemental one.
I mean it's decent looking, but if I didn't already know it was UE4 I wouldn't have guessed it was.
You don't really understand the argument being made. It's not that there won't be any performance drop, it's that if a modern engine uses large amounts of memory for caching, the performance drop with less memory will not be nearly as severe as it would be were you to run out of GPU memory in a traditional scenario (where all the assets in GPU memory are constantly needed).
We don't know if this is the case here or not, but just knowing how much memory the game uses alone doesn't tell us the whole story.
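A toy model of the distinction being argued here (all numbers are made up, purely to illustrate the shape of the argument):

```python
# Toy model: frame-time impact of having less VRAM than a game "uses".
# Every number below is hypothetical.

def frame_time_ms(base_ms, overflow_mb, penalty_per_mb, miss_rate=1.0):
    """Extra per-frame cost of fetching overflowed data over the bus.

    miss_rate < 1.0 models the cache scenario: only a fraction of the
    data that no longer fits in VRAM is actually touched in a frame.
    """
    return base_ms + overflow_mb * penalty_per_mb * miss_rate

base = 16.7          # ~60 fps baseline
overflow = 512       # MB that no longer fit in VRAM
penalty = 0.05       # ms per MB streamed in per frame (made up)

# Traditional scenario: everything in VRAM is needed every frame.
hard = frame_time_ms(base, overflow, penalty, miss_rate=1.0)

# Cache scenario: most of the extra memory was opportunistic cache,
# so only ~10% of the evicted data is touched per frame.
soft = frame_time_ms(base, overflow, penalty, miss_rate=0.1)

print(f"hard overflow: {hard:.1f} ms, cache overflow: {soft:.1f} ms")
```

Same amount of "missing" memory, very different frame-time hit, which is why the raw usage number alone tells you so little.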
How so? People recommending the GTX 760/770 are doing that based on numbers we do know. The main point is that if you get a GTX 770 4GB you might as well get a GTX 780 for a bit more, and it will destroy the 770 in the vast majority of situations, even in the future. The GTX 760 does not have the computational power to justify 4GB of RAM, based on the numbers we do have. PC-GAF tries to give you the best value for your money. If you want to waste money on very specific scenarios, be my guest.
OH GOD MY EYES
God damnit, bloom and other effects didn't make my eyes water like chromatic aberration does
This was obvious the moment we learnt about the amount of RAM in the new consoles (regardless of OS footprint at launch).
It is not the best time to invest in a new GPU if you are just a regular user that upgrades once in a while, not when affordable solutions that will last you for years are around the corner.
He doesn't need to get used to it, as long as a game is UE4 it should be very simple to disable
Get used to Chromatic Aberration dude, ALL UE4 games use it LOADS look at the Infiltrator it is loaded with it.
As is The Division, The Order 1886 even the latest DriveClub trailer...Nice easy way to look SNAZZY and save on AA!
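For what it's worth, in UE4 games that expose the console or read `Engine.ini`, something along these lines usually kills the effect (the cvar is a stock UE4 one; a given game may rename, lock, or ignore it):

```ini
; Engine.ini tweak: r.SceneColorFringeQuality controls UE4's scene
; colour fringe, which is what the chromatic aberration effect uses.
[SystemSettings]
r.SceneColorFringeQuality=0
```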
Also the unified memory was always going to cause issues; the next thing will be PCIe 3.0... Dat 16GB/s limit gonna get smaller and smaller!
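The 16GB/s figure is roughly the theoretical ceiling of a PCIe 3.0 x16 slot; back-of-envelope:

```python
# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding, 16 lanes in an x16 slot.
gt_per_s = 8e9
encoding = 128 / 130   # usable payload bits per transferred bit
lanes = 16

bytes_per_s = gt_per_s * encoding / 8 * lanes   # /8 converts bits to bytes
gb_per_s = bytes_per_s / 1e9
print(f"PCIe 3.0 x16: ~{gb_per_s:.2f} GB/s per direction")
```

About 15.75 GB/s per direction, which is why "16GB/s" gets thrown around.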
Do we know of any games that genuinely use up 4GB of memory, rather than via the cache method mentioned?
Does BF4 use over 2GB on Ultra?
I thought 4.5GB of VRAM was considered low-end?
So the game uses all the VRAM that's available to it? How is that absurd? Plenty of games do that.
Feel like I'm missing something here.
It's still best to wait a while, I think, unless you need a replacement right now. Wait until the true next-gen games start coming out, the multiplatform ones that don't bother with PS3/360, like Arkham Knight or The Witcher 3, and see what kind of computer hardware those demand. Pick something just because it's above what the consoles are marked as being and you may be in for a rude awakening, depending on what you even want out of performance.

People want to overreact. It's possible that 2GB may not be enough soon, but this is not proof of it.
Why would anyone ever want to do this?
This sounds like a double-edged sword to me, though. On one hand a game may not use that much VRAM because it's busy juggling the pool with everything else, but on the other a game may decide to blow most of it on video, and then a video card would need a much larger pool to compare. Ergo, better to see what actually happens with the multiplats if you want to make a safe, relatively future-proof purchase, if that's your goal.
I only got the 2GB 680 OH GOD WHAT HAVE I DONE
The amount of FUD being spread in here is off the charts, I really don't know where to begin. For one, the game looks to be using whatever amount of VRAM is available, so it scales. That really means nothing by itself. The other thing is, unless you plan on running games above 1080p, 2 - 3GB is more than enough to run anything you can throw at it. As far as comparing this to the HD twins, they have to share that 8 gigs with EVERYTHING, and on the PS4 2 of it is already locked up by the OS.
I thought it was proven to be 3GB for the PS4 OS at this time? 5GB for your game, with another 512MB available under certain circumstances.
http://www.neogaf.com/forum/showthread.php?t=782997
So said Naughty Dog recently.
Both the PS4 and X1 currently reserve 3GB for the OS.

Well shit, that's even worse.
Seems like an Unreal Engine 4 thing. Both the realistic rendering and cave demos take up ~2.5GB VRAM, meanwhile the Elemental demo hits 3+GB VRAM.
Why would devs want to simulate looking at a game through a pair of bad/cheap corrective glasses?

Trippier moments. I imagine almost any visual effect has its place; the question is what that place is.
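The effect itself is trivial, for what it's worth: sample the R, G and B channels at slightly different offsets, like a cheap lens focusing the wavelengths at different points. A minimal sketch (pure Python, shifting whole pixels rather than sampling sub-pixel like a real post-process shader would):

```python
def chromatic_aberration(row, shift=1):
    """Offset the red channel from slightly right and the blue channel
    from slightly left along a scanline of (r, g, b) pixels.
    Edges clamp to the nearest pixel."""
    n = len(row)
    out = []
    for x in range(n):
        r = row[min(n - 1, x + shift)][0]  # red sampled slightly right
        g = row[x][1]                      # green stays put
        b = row[max(0, x - shift)][2]      # blue sampled slightly left
        out.append((r, g, b))
    return out

# A single white dot on black smears into coloured fringes:
scanline = [(0, 0, 0), (0, 0, 0), (255, 255, 255), (0, 0, 0), (0, 0, 0)]
print(chromatic_aberration(scanline))
```

The white dot comes out with a red fringe on one side and a blue fringe on the other, which is exactly the look people are complaining about.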
I'll believe more than 2GB is needed when someone shows a frametime benchmark of a 2GB vs a 4GB card.
8GB of RAM (GDDR5 or DDR3 doesn't really matter in this case) means that GPUs with 2GB of VRAM are not going to cut it the moment devs start using that as the new baseline, especially if you want to go above console settings.

On the other hand, PC gamers are free to set rendering resolution and a lot of other parameters that impact VRAM consumption, so for around-1080p gaming it's not crazy to believe that 2GB GPUs are OK.
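Resolution alone moves the needle a lot. A rough estimate of raw render-target sizes (assuming 4 bytes per pixel and a handful of full-screen buffers; real engines keep many more, plus compression and MSAA change everything):

```python
def buffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Rough size of a few full-screen render targets
    (e.g. back buffer + depth + one G-buffer layer)."""
    return width * height * bytes_per_pixel * buffers / 2**20

print(f"1080p: {buffer_mb(1920, 1080):.0f} MB")
print(f"4K:    {buffer_mb(3840, 2160):.0f} MB")
```

The framebuffers themselves turn out to be small; the real VRAM eater is texture data, but higher rendering resolution also pushes engines to keep higher-resolution mips resident, so the two tend to move together.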
It's part of why I felt my mentality of "get a PS4 now, upgrade the PC later" was smartest. Granted, everyone's got different priorities, or even enough money to just get a Titan Z to go with their PS4 AND XB1, but if you want the newest games guaranteed to look great, and you want to make sure your PC truly destroys consoles at a relatively reasonable price, you'll have to wait for the consoles to age some and computer hardware to advance even further; being merely somewhat ahead will get you stomped on in a year or two, I suspect.
Plus these days I feel like the average PC will do for most indie games. Maybe not this one, but then it's on consoles (and didn't seem to get a great reception anyway) and anything that IS a problem can be something to look forward to when you build a new PC.
I'll believe more than 2GB is needed when someone shows a frametime benchmark of a 2GB vs a 4GB card.
That's not how it works. That 8GB is not VRAM; it's really 5GB of everything-RAM, since 3 is used for the OS. You can't lump it all together and say "ha, they have so much VRAM", cause they don't. They have to use a lot of that RAM the same way a PC uses its RAM to load and run shit.
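Rough budget math under that assumption (3GB OS reserve per the posts above; the CPU-side split is entirely made up for illustration):

```python
total = 8.0           # GB of unified GDDR5
os_reserve = 3.0      # GB claimed by the OS (per the posts above)
game_pool = total - os_reserve

# Hypothetical split: a console game still needs working memory for
# game logic, audio, streaming buffers, etc. -- the same stuff a PC
# keeps in system RAM rather than VRAM.
cpu_side = 2.0
vram_like = game_pool - cpu_side

print(f"game pool: {game_pool} GB, of which ~{vram_like} GB is 'VRAM-like'")
```

Under those (invented) numbers the consoles' effective "VRAM" lands nearer 3GB than 8GB, which is the point being made.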
If you only upgrade once in a while you want something that is future-proof (and most want something affordable too). 2GBs of VRAM may be enough right now, but it won't be for long.
Well yeah this is at 4K resolution. No one recommends a 2GB card for that resolution.
Notice the 2GB 680 spikes compared to the 4GB version.
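This is why frametime comparisons, not average fps, are what to ask for: averages hide exactly these spikes. A sketch of the kind of number you'd pull out of a frametime log (the logs here are made up):

```python
def percentile(samples, p):
    """Nearest-rank percentile of a list of frame times in ms."""
    s = sorted(samples)
    k = max(0, min(len(s) - 1, round(p / 100 * len(s)) - 1))
    return s[k]

# Two hypothetical logs with the SAME average frame time: one smooth,
# one with the hitching you'd expect when VRAM overflows.
smooth = [16.7] * 100
spiky  = [14.0] * 95 + [68.0] * 5   # avg ~16.7 ms, but with big spikes

for name, log in (("smooth", smooth), ("spiky", spiky)):
    avg = sum(log) / len(log)
    print(f"{name}: avg {avg:.1f} ms, 99th pct {percentile(log, 99):.1f} ms")
```

Both logs would report ~60 fps average; only the percentile (or a frametime plot like the one above) exposes the 2GB card's spikes.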
True.
I think this time, though, it is just about RAM. The power of current cards should see you safely through, but you'd want 4GB to be safe.