So I lied. The game actually uses 5.8GB of VRAM when playing single player at 1440p, and only 4.8GB of VRAM when running the benchmark.
You need a card with 6GB of VRAM if playing on ultra, just a heads up.
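For a rough sense of why ultra-quality textures eat VRAM this fast, here's a back-of-envelope sketch. The asset sizes and counts are illustrative guesses on my part, not the game's actual budget:

```python
# Back-of-envelope texture VRAM estimate. The asset sizes and counts here
# are illustrative guesses, not the game's actual budget.

def texture_bytes(width, height, bytes_per_texel, mipmaps=True):
    """VRAM for one texture; a full mip chain adds roughly one third."""
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mipmaps else base

# One uncompressed 4096x4096 RGBA8 texture with mips:
one_4k = texture_bytes(4096, 4096, 4)
print(f"{one_4k / 2**20:.0f} MB per 4K texture")          # ~85 MB

# ~70 of those resident at once already lands near the 5.8 GB figure:
print(f"{70 * one_4k / 2**30:.1f} GB for 70 resident")    # ~5.8 GB
```

Real engines stream and compress, so the true numbers differ, but the order of magnitude is the point.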
That EVGA 780 Ti 6GB is looking pretty good right about now...
Ultra texture details look virtually identical to the high texture details in those screenshots; something is clearly wrong.
I don't think it's too much to expect games with next-gen as a baseline (and thus much higher VRAM requirements) to have improved texture quality all round.
That article got updated. Now you can really see a difference.
I still don't get why you would need these huge amounts of VRAM...
Bumping for new page. Guys, this is the person you want listening to you if you've got any potential solutions!

People saying the FEAR 3 SLI bits work, please post a shot of your NV Inspector window, a game screenshot showing scaling, your game settings, and your system config, because tests on my Win 7 x64, TITAN SLI system show 10-15% at best:
Yea, I get why it's happening, but the new consoles are capable of using their video memory to good effect, so it's kind of lousy to get massively increased requirements without it being put to good effect like it could be.

Of course, I agree. But the point is these massive VRAM requirements have little to do with graphical advances; they're merely a shift in resource allocation as a result of the new consoles.
Watch Dogs was really the first evidence of this. Suddenly a game required 4GB for ultra textures that didn't look even remotely special.
Is it me, or does AC Revelations have no AO?
If you actually go to the site and see them in fullscreen, it's quite a difference. The ground especially. The rock face is definitely recolored and higher resolution. The ground is simply higher resolution.

So Ultra changes grey to brown...
8GB of shared memory, not only VRAM.

I think console ports are going to become more and more something to avoid going forward. Devs have 8GB of VRAM to play with on consoles, which can easily result in PCs needing something like 6GB for a lazy port that uses just 2GB of system RAM. Same with The Evil Within needing 4GB to even run.
I don't think so. VRAM requirements will go up, but playing at console quality won't be a problem at all. With or without consoles, more VRAM will always be a good thing to have.

The only good thing is that PCs will eventually catch up, because consoles obviously stay completely static in their capabilities. So eventually PCs will again easily be able to handle anything consoles throw at them, as high-VRAM cards become standard and cheaper.
It's going to be a bit rough for a while, though, I reckon.
At least we're seeing some marked advantage. Shame that 99% of people have to use the fairly ugly High textures.
You'd think, but it's clear that the PC is going to require more VRAM than consoles to do the same thing. And consoles have quite a lot to start with now.
We all thought the 970 was some great deal by Nvidia for once, but in reality they've just been preparing for this VRAM apocalypse. It's probably that cheap because more and more games will come out that a 970 won't even be able to max out.
There should have been more clarity about this sea change in PC requirements ages ago. How many people bought high-end 2GB or 3GB cards in the past year, when Nvidia/ATI knew those people were probably pissing money away?
I have an MSI GTX 970 with 4GB of VRAM, and with the HD texture pack installed my Nvidia driver crashed within ten minutes of playing the game. Put the textures on high and not a single problem since, so I guess you really do need 6GB to max out the game.
Mind that I have 'High Quality' enabled for textures in my Nvidia global settings, so there isn't any texture compression going on outside of the game. Turning it down might improve performance, but I doubt it will make the game look much better than the regular high textures, since you'll be slightly blurring the textures if you do. Might be worth a try, though?

Anyway, none of that changes the fact that the VRAM usage is completely ridiculous. The game doesn't look that impressive, and some of the textures are kinda shitty even on the higher setting. The game seems to run fine for the most part and I wouldn't call it a bad port, but with the way these textures look there clearly could have been some extra optimization.
E: Going to try and see if I run into any problems when I put the Nvidia texture setting on 'Quality'. Will report back.
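On the compression point, for context: block compression is where the big VRAM savings normally come from. A quick sketch with the standard BC format ratios (whether and where this game uses them is an assumption on my part):

```python
# VRAM cost of one 4096x4096 texture (with mips) under common GPU formats.
# These are the standard block-compression ratios; which formats this game
# actually ships is unknown.

BYTES_PER_TEXEL = {
    "RGBA8 uncompressed": 4.0,
    "BC1/DXT1":           0.5,   # 8 bytes per 4x4 block
    "BC3/DXT5, BC7":      1.0,   # 16 bytes per 4x4 block
}

texels = 4096 * 4096
for fmt, bpt in BYTES_PER_TEXEL.items():
    mb = texels * bpt * 4 / 3 / 2**20   # +1/3 for the mip chain
    print(f"{fmt:>20}: {mb:5.1f} MB")   # 85.3 / 10.7 / 21.3 MB
```

So an 8x difference between uncompressed and BC1 on the same source art, which is why format choices matter far more than any driver toggle.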
Ultra preset, 1080p, haven't downloaded the 'Ultra' textures so it'll be defaulting to the 'high' textures.
i7 4770K @ 4.5GHz
GTX 780 SLI (2x)
SLI off (stutter to 25fps also seemed to happen before it loaded):
SLI on (the stutter to 10fps happened before it loaded):
I'm reasonably happy with that. I'm using the F.E.A.R. 3 SLI compatibility bit that Romir suggested (I had to add the game's executable to get it to work).
I'm on a G-Sync monitor and it looked rather good performance-wise. Can't play it at the moment, though; just wanted to bench it and run. G-Sync is capping my FPS at 144, so I'll redo that SLI run afterwards to see if I can get a higher average.
I don't really consider that a good thing. With the disparity in power between consoles and PCs this time around, it's a pretty sad state of affairs that we have to wait years before we can handle next-gen games at proper settings without some high-end GPU.
I don't think any games are using that technology yet.

Okay, so that whole Nvidia optimization stuff might actually be pretty amazing. I've been testing for a short while with the 'Quality' texture setting in the Nvidia control panel, which in this game doesn't look noticeably worse than 'High Quality', and my VRAM usage is hovering around just 3.5GB with both Ultra and the HD texture DLC enabled. I'm pretty sure it's all working as intended, as textures do seem to be a bit sharper.
Going to play for a longer period of time to see if it isn't just my current in-game location not being that GPU-intensive or anything.
Sadly, I did not get performance as good on my setup (although SLI certainly helped). Running a 3570K @ 4.5GHz, 8GB of RAM, 2x GTX 780 in SLI (PNY reference designs) and Windows 7 64-bit:
SLI disabled, ultra settings except textures on high, 1080p:
SLI enabled, SLI bits and rendering settings from F.E.A.R. 3, ultra settings except textures on high, 1080p:
I didn't notice any graphical glitches during the benchmark with SLI.
Inspector settings:
Note: you have to add the game's exe to the existing Nvidia profile, as it does not point to the correct exe in the Steam version.
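For anyone comparing runs: the scaling figures people are quoting are just the percentage gain of the SLI average over the single-GPU average. A trivial helper (the fps numbers below are placeholders, plug in your own):

```python
def sli_scaling(avg_single_fps, avg_sli_fps):
    """Percentage frame-rate gain of SLI over a single GPU."""
    return (avg_sli_fps / avg_single_fps - 1.0) * 100.0

# Placeholder numbers: 48 fps single-GPU vs 55 fps with the bits applied
print(f"{sli_scaling(48, 55):.0f}% scaling")   # ~15%
```

Good SLI profiles historically land in the 60-90% range, which is why 10-15% reads as "the bits barely work."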
This set of images seems to be one of the only ones showing a sizeable difference. I'm guessing the majority of texture assets may be the same as high or something?
Developers only have access to around 5GB-5.5GB of RAM on the PS4 and Xbox One, however. In total. That's for everything.
I suspect that discrepancy is due to the CPU; the recommended settings suggest an i7 3770. I've also got an overclock applied to these cards, nothing drastic, but something.
What are you talking about? High is similar to the settings on the consoles, which use 3GB of VRAM.
That's still more than 99.9% of the GPU market.
Right now, it's not a good thing. It's a horrible thing. The price of premium PC gaming has literally just shot up big time.
Note also that the PS4 version seems to have trouble getting noticeably above 30fps.
And obviously for PCs to maintain the upper hand graphically, it will require more. We're not just looking for parity with consoles; we're used to superiority.
He clearly says the total amount of RAM, so not only VRAM. VRAM for Killzone has been 3GB, and that can't increase a whole lot.
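To put those numbers side by side: on a unified-memory console the "VRAM" is just a slice of one pool. A rough sketch (the OS reservation and the 3GB Killzone figure are approximate, from public reports):

```python
# Rough unified-memory budget on a PS4-style console.
# The OS reservation and the per-game split are approximate public figures,
# and individual titles divide their budget differently.

total_ram   = 8.0                      # GB of shared GDDR5
os_reserved = 2.5                      # approx. system/OS reservation
game_budget = total_ram - os_reserved  # ~5.5 GB total for the game

gpu_share = 3.0                        # e.g. Killzone's reported video share
cpu_share = game_budget - gpu_share    # leftover for game code and data

print(f"game: {game_budget:.1f} GB total = "
      f"{gpu_share:.1f} GPU + {cpu_share:.1f} CPU")
```

So "8GB of VRAM on consoles" overstates it considerably: the graphics side gets maybe 3-4GB in practice.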
You can run the game perfectly fine, stop overreacting.
Well, we don't actually know yet what is comparable to the next-gen consoles. Nobody has done any comparison shots so far; you're assuming. And many people do not have 3GB cards either, much less 6GB cards.
Pretty sure Andy from Nvidia here even said that the main memory benefits aren't being utilized by any games yet.

Also, you don't need to support that technology as a game; it's done through the drivers. I think it already works.
Come on dude. An average-looking game requiring a Titan for its ultra settings all of a sudden? See the writing on the wall.
Rehosted those:
HIGH:
ULTRA:
What's wrong with that? You can still play with good enough visual quality on mid-range machines.
GTX 770 4GB
i5 2500K @ 3.3GHz
8GB Ram
Ultra @ 2560x1440
A consistent 30fps with a GTX 770 is reasonable to me. Dunno why 6GB of VRAM is recommended.
This isn't about what is 'good enough'. This is about noticing a worrying trend. We're getting mediocre-level textures but massive increases in the requirements to use them, to the point where only a tiny minority of PC gamers will even be able to run these mediocre-looking textures. Quite possibly, people without 3-4GB GPUs are looking at a step *back* in texture quality unless the developer puts some effort into optimizing memory usage for PC.
You are completely overreacting.