
Evil Within System & Hard Disk Requirements (PC/Consoles), strongly suggests 4GB VRAM

Akronis

Member
So what's the difference between vram, total vram, shared vram? I'm so lost. 8GB of system ram and a 4GB 770 here.

VRAM is the memory that is actually on your GPU. Shared VRAM is just normal RAM that is being used as VRAM (not nearly as fast as VRAM on the GPU though). Total VRAM is a combination of GPU VRAM and system memory that is being used as VRAM.
 
VRAM is the memory that is actually on your GPU. Shared VRAM is just normal RAM that is being used as VRAM (not nearly as fast as VRAM on the GPU though). Total VRAM is a combination of GPU VRAM and system memory that is being used as VRAM.

This is all so confusing with the way Bethesda worded things then.

edit: can someone tell me how to find my total VRAM in Windows 8.1 then, please?
 
In some ways, he is right. Sometimes PC gaming can cause an insane amount of buyer's remorse. Some of the folks here spent 500 bucks on a GPU less than six months ago, only to find out it was less than optimal for the upcoming games.

I believe he's not right at all. There's always something better, more advanced or faster around the corner. Always. It's how technology works. How do you think 780Ti owners might feel now that the 970 more or less matches their card's performance and costs half as much? How do iPhone 5 owners feel about the much better iPhone 6? The owner of a $500 graphics card will have a much better experience than either consoles can provide regardless of memory requirements. This VRAM fear mongering has gotten out of hand, not to mention the issue of artificially inflated minimum specs in general. My suggestion is for everyone to calm down and wait for actual performance results before panicking. Even in the worst case scenario, let's say that they'll have to drop texture quality from Ultra to Very High. What's the big deal?

In any case, reading comments like "I have a 780Ti with 3GBs of VRAM, I'll go with the console version" makes me want to pull my hair out in frustration.
 

Kezen

Banned
[Image: screenshot of Bethesda's Steam Community reply]


From the Steam Community. A guy asked Bethesda and they replied that the 4GB refers to "Total Shared Memory" ... see picture.

For example: my 770 has 2GB dedicated VRAM and 2GB shared VRAM. Together that's 4GB. It's a weird way of posting requirements ... but well. I thought I should post it.

Hum.
 
This is all so confusing with the way Bethesda worded things then.

edit: can someone tell me how to find my total VRAM in Windows 8.1 then, please?

Hit the Windows key, click the magnifying glass in the top right-hand corner, type dxdiag in the box and open the application. Go to Display and it will show you the approx. total memory (in my case, with a 2GB GTX 760, it shows 3999 MB total memory).
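
For anyone who'd rather script it than click through dxdiag, here's a rough Python sketch of my own (Windows only, nothing official) that asks the driver how much dedicated VRAM it reports. dxdiag's "approx. total memory" is basically that dedicated amount plus whatever chunk of system RAM Windows sets aside as shared graphics memory, which is why a 2GB card can show ~4GB total.

```python
# Rough sketch (Windows only): query the dedicated VRAM the driver reports,
# as a cross-check against dxdiag's "Approx. Total Memory" figure.
# Caveats: relies on the legacy wmic tool, and AdapterRAM is a 32-bit field,
# so cards with more than 4GB of VRAM can be misreported.
import subprocess

def dedicated_vram():
    out = subprocess.check_output(
        ["wmic", "path", "Win32_VideoController",
         "get", "AdapterRAM,Name", "/format:list"],
        text=True,
    )
    cards = {}
    ram = None
    for line in out.splitlines():
        line = line.strip()
        if line.startswith("AdapterRAM="):
            ram = int(line.split("=", 1)[1] or 0)
        elif line.startswith("Name="):
            cards[line.split("=", 1)[1]] = ram
    return cards

if __name__ == "__main__":
    for name, ram in dedicated_vram().items():
        print(f"{name}: ~{ram / 2**30:.1f} GB dedicated VRAM reported")
```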
 

nkarafo

Member
Hit the Windows key, click the magnifying glass in the top right-hand corner, type dxdiag in the box and open the application. Go to Display and it will show you the approx. total memory (in my case, with a 2GB GTX 760, it shows 3999 MB total memory).
It's the same for me (4071) and I only have a 512MB card. Does that mean even I can play this game with maxed textures? That would be funny.
 

Akronis

Member
It's the same for me (4071) and I only have a 512MB card. Does that mean even I can play this game with maxed textures? That would be funny.

You could (unless it hard-locks the setting if it detects lower than 4GB of on-card VRAM), but you'll probably encounter stuttering while it streams the textures due to the shared VRAM being much slower than GPU VRAM.
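
To put rough numbers on why that spillover stutters, here's a quick back-of-envelope sketch; the bandwidth figures are ballpark quoted peaks (GTX 770 GDDR5, dual-channel DDR3-1600, PCIe 3.0 x16), not anything measured.

```python
# Back-of-envelope only: why texture data that spills out of on-card VRAM hitches.
GB = 2**30

pools = {
    "GTX 770 GDDR5 (on-card)":         224e9,   # ~224 GB/s peak
    "Dual-channel DDR3-1600 (system)": 25.6e9,  # ~25.6 GB/s peak
    "PCIe 3.0 x16 link (to the GPU)":  15.8e9,  # ~15.8 GB/s per direction
}

spill = 1 * GB  # hypothetical 1 GiB of textures that didn't fit on the card

for name, bandwidth in pools.items():
    print(f"{name}: ~{spill / bandwidth * 1000:.0f} ms to move 1 GiB")

# The on-card pool shifts that in a few milliseconds; anything parked in system
# RAM has to cross the PCIe bus at a fraction of the speed, and that gap is the
# momentary stutter you see while textures stream in.
```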
 

nkarafo

Member
You could (unless it hard-locks the setting if it detects lower than 4GB of on-card VRAM), but you'll probably encounter stuttering while it streams the textures due to the shared VRAM being much slower than GPU VRAM.
Nah, I probably won't see a difference as my VRAM is DDR3, just like my system RAM :p

Heck, my system RAM is probably faster, actually... It's 1600 MHz DDR3 while the VRAM is clocked at 900 MHz, according to CPU-Z
 

Skyzard

Banned
Keep in mind, though, that you'll be able to tell when the frame buffer on the video card is full and the spillover into the appropriated 2GB of system RAM happens. Whatever game you're playing will stutter for a few moments while data is loaded from the slower system RAM instead of the much faster dedicated VRAM built onto the video card itself.

Yeah fuck that, that sucks.

They must be kidding with the 4GB VRAM requirement. Textures look average at best.

The game looks fun but yeah, why the hell is it reaching 4GB.
 
People shouldn't worry too much.

I find texture quality isn't that big of a deal compared to other graphical features such as AO, AA, object density, etc.

For one thing, I know my i5 (quad core, no HT) + 780 Ti 3GB desktop will vastly outperform the i7 (quad core + HT) + 860M 4GB on my laptop in both IQ and FPS.
 

Gbraga

Member
I really think it's more likely that the e-mail dude is wrong.

Sure, it would make more sense that this game doesn't really need 4GB of VRAM, but no other game recommends shared VRAM; that makes absolutely no sense, not even other Bethesda games. If it were at least some weird Bethesda-only thing, fine, but it isn't even that.
 

JaseC

gave away the keys to the kingdom.
The game looks fun but yeah, why the hell is it reaching 4GB.

MegaTextures can be up to 128k by 128k in size and this doesn't come cheap. The draw of the tech is that an entire level becomes a blank canvas for artists and the memory footprint is low compared to texturing the level with a similar amount of unique detail using the traditional method. The downside, as many have noted, is that for the textures to breathe they need a lot of space -- even 50GB isn't quite enough for a game with many different locations, as evidenced by Wolf14's slightly uneven texture quality.
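
Some back-of-envelope numbers on why it doesn't come cheap; the 128x128 page size is the one usually cited for id Tech 5's virtual texturing, while the rest (uncompressed RGBA8, the resident-page budget) is my own assumption for illustration.

```python
# Rough numbers behind the "megatextures don't come cheap" point.
side_texels = 128 * 1024                 # a 128k x 128k virtual texture
bytes_per_texel = 4                      # RGBA8, ignoring mips and compression

full_size = side_texels ** 2 * bytes_per_texel
print(f"Uncompressed 128k x 128k texture: {full_size / 2**30:.0f} GiB")   # ~64 GiB

# That obviously never sits in memory whole: on disk it lives heavily compressed
# (hence even ~50 GB installs feeling tight), and at runtime only a pool of
# recently visible pages is kept resident on the GPU.
page_bytes = 128 * 128 * bytes_per_texel          # one streamed page
pool_pages = 4096                                 # hypothetical resident budget
print(f"Resident page pool: ~{pool_pages * page_bytes / 2**20:.0f} MiB")  # ~256 MiB
```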
 

gelf

Member
I believe he's not right at all. There's always something better, more advanced or faster around the corner. Always. It's how technology works. How do you think 780Ti owners might feel now that the 970 more or less matches their card's performance and costs half as much? How do iPhone 5 owners feel about the much better iPhone 6? The owner of a $500 graphics card will have a much better experience than either consoles can provide regardless of memory requirements. This VRAM fear mongering has gotten out of hand, not to mention the issue of artificially inflated minimum specs in general. My suggestion is for everyone to calm down and wait for actual performance results before panicking. Even in the worst case scenario, let's say that they'll have to drop texture quality from Ultra to Very High. What's the big deal?

In any case, reading comments like "I have a 780Ti with 3GBs of VRAM, I'll go with the console version" makes me want to pull my hair out in frustration.

The big deal in this case is that Bethesda's statements make it sound like you'll struggle to run it even on low texture quality. I'm not sure I believe that either, but it is concerning for those of us who don't keep up with technology that fast.

If I'm uncertain about my PC's capabilities I'll usually go with the console version because it's a more easily known quantity: I can look at reviews, read a Digital Foundry article, and know exactly how the game will perform, then make a call on whether that's good enough for me.

It's too early for me to make that call, though; I'll wait and see what people who are crazy enough to risk paying full price on a game their PC doesn't meet the advertised requirements for think of it. I can understand why people feel more at ease getting the console version, though, especially if you feel you have to have it day one.
 

nkarafo

Member
MegaTextures can be up to 128k by 128k in size and this doesn't come cheap. The draw of the tech is that an entire level becomes a blank canvas for artists and the memory footprint is low compared to texturing the level with a similar amount of unique detail using the traditional method. The downside, as many have noted, is that for the textures to breathe they need a lot of space -- even 50GB isn't quite enough for a game with many different locations, as evidenced by Wolf14's slightly uneven texture quality.
All that sounds much worse to me, compared to the traditional method.
 

jett

D-Member
MegaTextures can be up to 128k by 128k in size and this doesn't come cheap. The draw of the tech is that an entire level becomes a blank canvas for artists and the memory footprint is low compared to texturing the level with a similar amount of unique detail using the traditional method. The downside, as many have noted, is that for the textures to breathe they need a lot of space -- even 50GB isn't quite enough for a game with many different locations, as evidenced by Wolf14's slightly uneven texture quality.

If you thought Wolf had uneven texture quality... Rage is much worse. Entire walls filled with macroblocking. I'll never forget the hospital level, never seen something like that in a game. This technology is just a dead-end. It was supposed to make development easier and quicker, but Rage took forever to come out.
 
All that sounds much worse to me, compared to the traditional method.

It has its pros and cons. Personally I like the way Rage and Wolfenstein look. Overall you've got a lot more detail across any given scene, even though you've obviously got less 'DPI' per in-game inch.
 

Dr Dogg

Member
All that sounds much worse to me, compared to the traditional method.

There's a good thread on Reddit that goes into quite a bit of detail on the pros and cons of MegaTextures. One of the benefits is how unique a landscape can look compared to the more traditional tiled approach.

http://www.reddit.com/r/Games/comments/265ixh/virtual_texturing_id_tech_5_or_why_do_wolfenstein/

Some pics from the OP over there.
Skyrim
[Skyrim screenshot]

Rage
[Rage screenshot]

MegaTextures work brilliantly when viewed from a certain distance, but sticking your nose/camera right up to them is going to make them look awful. Sadly that's just the way the tech is. There are even more in-depth talks that Carmack gave during the QuakeCon 2010 and 2013 keynotes about them.
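
A quick texel-density comparison makes that trade-off concrete; the level and texture sizes below are made-up round numbers, not anything from id.

```python
# Made-up round numbers: why unique megatexturing looks great at a distance
# but soft up close compared to a small tiled texture.
def texels_per_metre(texture_side_texels, covered_metres):
    return texture_side_texels / covered_metres

# One 128k-texel-wide virtual texture uniquely covering a 1 km stretch of level:
mega = texels_per_metre(128 * 1024, 1000)   # ~131 texels per metre

# A 2048x2048 tiling texture repeated across a wall every 2 metres:
tiled = texels_per_metre(2048, 2)           # 1024 texels per metre

print(f"Unique megatexture: ~{mega:.0f} texels/m")
print(f"Tiled 2048 texture: ~{tiled:.0f} texels/m")
# The tiled wall has roughly 8x the close-up detail but repeats every 2 m;
# the megatexture never repeats, which is exactly the Skyrim vs Rage trade-off
# in the shots above.
```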
 

Skyzard

Banned
^ Okay, that's acceptable if it makes a difference like that; that looks a lot better. Hopefully we won't get any levels like the hospital jett mentions. I really did not like the way Skyrim looked from afar.

MegaTextures can be up to 128k by 128k in size and this doesn't come cheap. The draw of the tech is that an entire level becomes a blank canvas for artists and the memory footprint is low compared to texturing the level with a similar amount of unique detail using the traditional method. The downside, as many have noted, is that for the textures to breathe they need a lot of space -- even 50GB isn't quite enough for a game with many different locations, as evidenced by Wolf14's slightly uneven texture quality.

Thanks for the info. Hopefully we should be able to get a sense of the benefit then, right? Or is it mostly for time-saving during development? Guess we'll see when we see the levels.
 

MayMay

Banned
Maan, what. I really hope my 3GB 780 is gonna be able to play it alright. I was so excited for the game, but that kinda put a dampener on it >,< I'm not gonna upgrade until a 1080 or something. Why can't Bethesda just stop using this engine? Sure, there are some positives, but way more negatives.
 

JaseC

gave away the keys to the kingdom.
If you thought Wolf had uneven texture quality... Rage is much worse. Entire walls filled with macroblocking. I'll never forget the hospital level, never seen something like that in a game. This technology is just a dead-end. It was supposed to make development easier and quicker, but Rage took forever to come out.

Yeah, but Rage's textures were created with the storage space of two X360 DVDs in mind, while Wolf14's were not (the game is literally twice the size of Rage) and neither were TEW's (judging from the HDD space requirement), so Wolf is the more appropriate comparison.

Edit: WRT Rage's size, I forgot to subtract the ~2.3GB DLC. Fixed.
 
I can't believe this still hasn't been properly addressed and now there is more conflicting information.

I'm convinced this is going to run like shit.
 

JaseC

gave away the keys to the kingdom.
Do you think those versions will use megatextures too?

It's going to look like shit if they do right?

It's going to look about as awful as Rage. The texture quality in general is so bad that Carmack patched in a sharpening filter, but the game arguably looks worse with it enabled.
 
Do you think those versions will use megatextures too?

It's going to look like shit if they do right?
Wolfenstein looked like shit on these systems so I don't expect much from Evil Within.

For comparison's sake:

Wolfenstein X360 vs PS4
[X360 vs PS4 wall texture comparison screenshot]


But again, Wolfenstein looks like that because of 60 fps while Evil Within will most likely be 30 fps.
 

nkarafo

Member
MegaTextures work brilliantly when viewed from a certain distance, but sticking your nose/camera right up to them is going to make them look awful.
That sounds like the worst way to texture a first-person shooter.

But a good way to texture a racing game, flight sim, aerial combat game, or on-rails shmup where you usually don't examine stuff up close.
 
I wouldn't rest easy just yet. This is such a weird way of wording everything and I wouldn't be surprised if the support staff who replied here wasn't 100% sure what they were saying or what the person was asking.

Thanks. I wish they would officially straighten this out.

Keep in mind, though, that you'll be able to tell when the frame buffer on the video card is full and the spillover into the appropriated 2GB of system RAM happens. Whatever game you're playing will stutter for a few moments while data is loaded from the slower system RAM instead of the much faster dedicated VRAM built onto the video card itself.

Thanks for this advice. This isn't as simple as I thought :( It's so crazy that this game is asking for so much VRAM for great settings.
 

Cyriades

Member
Hit the Windows key, click the magnifying glass in the top right-hand corner, type dxdiag in the box and open the application. Go to Display and it will show you the approx. total memory (in my case, with a 2GB GTX 760, it shows 3999 MB total memory).

Mine says 11127 MB

[dxdiag screenshot]


What does that mean? Is that 11GB reserved for VRAM?!
 

dreamfall

Member
Yeah, what the hell is going on with that email?

Shared VRAM? I wish they'd be clearer. Though I do understand that Wolfenstein's performance on my machine (i7 3770K, GTX 680 2GB, 8GB RAM) had some issues; id Tech 5 seems pretty demanding.

Still debating which platform to buy this on.
 
Wolfenstein looked like shit on these systems so I don't expect much from Evil Within.

For comparison's sake:

Wolfenstein X360 vs PS4
[X360 vs PS4 wall texture comparison screenshot]


But again, Wolfenstein looks like that because of 60 fps while Evil Within will most likely be 30 fps.

I don't think 30 fps is going to improve texture detail on any system.
 

Nemmy

Member
So... I guess I'll deal with low/medium settings then.

Hopefully I won't end up with worse IQ than on PS360 because that would be bloody ridiculous.
 

Sanctuary

Member
Did anybody else purchase a 970 or 980, only to read the recently released VRAM requirements of upcoming games and wish you'd waited for an 8GB model?

*raises hand*

Personally I have zero problem with games becoming this power hungry, but they have to show something for it, and the hardware developers need to stop being stingy with memory - specifically Nvidia; AMD typically has a bit more memory on competing cards.

My plan was to wait until spring of 2014 to build a new PC, when Nvidia was supposed to release the 800 series. When it was announced that wasn't actually going to happen, I just built a new PC last November and picked up a GTX 780, thinking it would be more than enough for any of the multiplatform games. Not a single game I've played this year so far, aside from Skyrim loaded with a bunch of mods, needed the power/VRAM of the GTX 780. Those kinds of games weren't starting to hit until now (it seems, at least), and I'll actually end up needing a better card to run at higher resolutions without dropping below 60fps over the next six months or so.

That is, if this is actual dedicated VRAM they're talking about, and not shared VRAM...

I believe he's not right at all. There's always something better, more advanced or faster around the corner. Always. It's how technology works. How do you think 780Ti owners might feel now that the 970 more or less matches their card's performance and costs half as much? How do iPhone 5 owners feel about the much better iPhone 6? The owner of a $500 graphics card will have a much better experience than either consoles can provide regardless of memory requirements. This VRAM fear mongering has gotten out of hand, not to mention the issue of artificially inflated minimum specs in general. My suggestion is for everyone to calm down and wait for actual performance results before panicking. Even in the worst case scenario, let's say that they'll have to drop texture quality from Ultra to Very High. What's the big deal?

In any case, reading comments like "I have a 780Ti with 3GBs of VRAM, I'll go with the console version" makes me want to pull my hair out in frustration.

It's not really like that at all. People simply assumed they had more than enough power to run games at better resolutions and framerates than the now current gen consoles. That's where the remorse comes in. Sure, some people want to "keep up with the Joneses" but that's not what's going on here if this vague information has any merit.
 
It's not really like that at all. People simply assumed they had more than enough power to run games at better resolutions and framerates than the now current gen consoles. That's where the remorse comes in. Sure, some people want to "keep up with the Joneses" but that's not what's going on here if this vague information has any merit.

They do. They really, absolutely do. It has been the case for every single next-gen multiplatform game released thus far and it will certainly be the case going forward. The already significant power gap is going to get even wider. That's why I don't understand what all the complaining is about.
 
I believe he's not right at all. There's always something better, more advanced or faster around the corner. Always. It's how technology works. How do you think 780Ti owners might feel now that the 970 more or less matches their card's performance and costs half as much?

In any case, reading comments like "I have a 780Ti with 3GBs of VRAM, I'll go with the console version" makes me want to pull my hair out in frustration.

I feel a bit better than that, 780Ti is closer to a 980 and better in some cases :p
 

Gvaz

Banned
I have a 2GB 750 Ti and it says I have 4GB of shared memory. This is coming from a 1GB 4890 lol

I doubt that's what they mean
 

Skyzard

Banned
To posit a frame of reference for current GPU hardware: the 8800GTX launched back in 2006, around the same time as the PS3 and a year after the Xbox 360, with 768MB of VRAM (50% more than either console's total amount of RAM). The 8800GTX was a $600-650 monster of its day that trashed both consoles in performance. In comparison, the 780/290/970-level stuff of today has at least as large an advantage over the latest consoles, perhaps even more. However, those cards do not have more VRAM than either system has in total, not even the same amount, and potentially not even the same amount actually usable for GPU-centric assets. It is not unprecedented for more VRAM than what we've generally been given to be utilized, given the strength of current GPUs. The 780/Ti in particular is far more than 4x as powerful as the old 8800GTX, yet it only has 4x the VRAM. So where's the problem here?

The problem is that GPU manufacturers (especially Nvidia) have not been scaling VRAM amounts up properly with increases in power; they've been very deliberately restricting VRAM to bare minimums outside of their new (ridiculous) premium Titan line. As the (arguably) leading GPU manufacturer in the world, they should have known requirements would rise, yet they have elected to do nothing about it, or perhaps have even actively restricted VRAM in the hope that people would have to upgrade arbitrarily over it. Assigning blame between the developers and the hardware manufacturers is a tricky line to walk, but at the end of the day the end result is the same. Nvidia should have, and probably did, foresee this, but they've taken no preventative measures and have been far too cheap on VRAM. People with less can make do, certainly, but this will restrict high-end hardware from rendering the high-resolution textures it is fully capable of, and, in SLI especially, it might even face significant AA or resolution limitations. The 780s in particular having only 3GB, and the 770s 2GB, is actually rather disgusting, and regrettably I sold my 780 for a sidegrade to a 970 because I was already slamming into my 780's VRAM wall in certain scenarios (like modded Skyrim, Space Engine, Watch Dogs, and certain games downsampled from 4K). It's a disgusting realization, but the blame rests squarely on the hardware manufacturers for this situation, regardless of whether developers are using too much or not; it is Nvidia's (and AMD's) job to create the balanced hardware necessary to render the latest and upcoming software and to make smart engineering decisions for their products, and Nvidia has dropped the ball miserably here because it's a lucrative business opportunity. No excuses.

Fully agree.
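
For anyone who wants the ratios the quoted post is describing spelled out, here's a quick sketch; the figures are nominal launch specs from memory (total console RAM vs dedicated card VRAM), so treat them as approximate.

```python
# Dedicated card VRAM versus total console memory, per generation.
MB = 1
GB = 1024 * MB

consoles = {
    "X360 / PS3 (total RAM)": 512 * MB,
    "PS4 / Xbox One (total RAM)": 8 * GB,
}
cards = {
    "8800 GTX (2006)":         (768 * MB, "X360 / PS3 (total RAM)"),
    "GTX 780 / 780 Ti (2013)": (3 * GB,   "PS4 / Xbox One (total RAM)"),
    "GTX 970 (2014)":          (4 * GB,   "PS4 / Xbox One (total RAM)"),
}

for card, (vram, era) in cards.items():
    ratio = vram / consoles[era]
    print(f"{card}: {vram / GB:.2f} GB VRAM = {ratio:.2f}x {era}")

# The 8800 GTX shipped with 1.5x its rival consoles' entire memory pool;
# a 780 Ti ships with well under half of the new consoles' pool, which is
# the imbalance the quoted post is complaining about.
```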
 

nkarafo

Member
I doubt that's what they mean
I agree because, like I said, I get the same number and I have a measly 512MB card. If they mean shared VRAM, then even I can play it on recommended settings, and that makes even less sense than the game requiring that much actual dedicated VRAM.
 

DarkFlow

Banned
Quite right, and the "GDDR5 gap" is only going to grow. Indeed, there are some very alarming trends emerging here:

Consider that in early 2013, Nvidia's latest and greatest flagship card debuted with 6GB of VRAM. By late 2014, however, Nvidia's latest flagship cards could only manage a paltry 4GB of VRAM.

Now, if this data rich and highly predictive pattern continues, and there's absolutely no remotely conceivable reason whatsoever to think that it won't, then in 2015, Nvidia's newest flagship card will sport no more than 2GB of VRAM, a small fraction of the memory available on the current generation consoles.
You take trolling to new heights, I admire your art.
 

nkarafo

Member
Indeed, there are some very alarming trends emerging here
The only alarming trend, in my opinion, is that these games that we've been told require 4GB+ of VRAM look atrocious for what they ask. They require a lot more than older PC games that looked far better.

This whole "I want more but I offer less" attitude is what disappoints me the most. I mean, do you really want to use that much memory? Then at least look better because of it, damn it!
 

AsfaeksBR

Member
Last week we posted our recommended PC system specs for The Evil Within, and in turn, we received plenty of feedback. While we still recommend this benchmark to experience the game in all its gory glory, we also recognize the need to provide information on how you can play the game with a wider range of PCs.
Minimum requirements for The Evil Within can now be found below. You won’t be experiencing the game at 1080p and you’ll likely need to turn off some features, but you will still be able to have a great experience with the game.

Minimum Requirements
OS: Windows 7/8.1
GPU: GTX 460 or an equivalent 1 GB VRAM card
CPU: i7 or an equivalent 4+ core processor
RAM: 4 GB
HDD: 50 GB
If you meet the recommended specs, you’re in for the ideal experience. The game looks amazing with full-screen anti-aliasing, full shadow quality, motion blur, tessellation, SSAO, and 1080p visuals.


PC users can pre-order The Evil Within today on Steam — where we’ve updated the game page to account for both the minimum and recommended system requirements, or your favorite digital or physical reseller.

Found at BethBlog
 