
The Evil Within PC Info from Bethesda

Isn't The Witcher 3 currently using something like 1-2GB of VRAM?
They said that? I've watched and read about everything I can on Witcher 3 and haven't heard this. I highly, highly doubt this though. I'd imagine W3 will require a 3GB card at a minimum.
You're talking about developers who know PC very well.

But then again. There is this. Though one could argue that Witcher and Witcher 2 both had pretty bad issues at launch.
 
We'll have to wait and see if more dedicated ports can solve this VRAM issue, or if it's not actually that easy a problem to solve and most games are just going to raise the requirements. Right now it's certainly not looking good, and the real graphical beasts haven't arrived yet.

What's clear is that there has been a sudden, large jump in VRAM requirements, entirely down to the new next-gen consoles - it's just a question of how big that jump is. Either way, I'm pretty sure biting the bullet and going from a 2GB to a 4GB card was the right thing to do.
 

Skyzard

Banned
Thanks... Here's the video.

With +toggle com_synctotime the framerate is unlocked, but the game's speed becomes dependent on it. So if the framerate goes over 60, the game becomes faster; if it drops below, the game gets slower - it's really challenging to aim that way :) I recommend trying it for yourself, it's quite hilarious. It may be hard to judge by the video, but it's really going bananas.

So the game is twice as fast at 60 vs 30?
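A minimal sketch of why the speed scales like that, under the assumption (taken from the posts above) that the game advances its logic by a fixed amount every rendered frame. The names and numbers are illustrative only, not actual id Tech 5 code:

```python
DESIGN_FPS = 60.0  # the framerate the game logic was tuned for

def game_speed(actual_fps: float) -> float:
    """Relative game speed when logic steps once per rendered frame.

    One fixed-size step per frame means `actual_fps` steps per second,
    versus the DESIGN_FPS steps per second the designers intended.
    """
    return actual_fps / DESIGN_FPS

print(game_speed(30.0))   # 0.5 -> half speed, matching the 30 fps cap
print(game_speed(60.0))   # 1.0 -> normal speed
print(game_speed(120.0))  # 2.0 -> everything runs twice as fast
```

This matches the observation later in the thread that 30 fps runs at half speed in Wolfenstein and anything above 60 runs faster.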
 

Spaghetti

Member
had the same reaction.

did all this come from a community manager?
 

Fractal

Banned
Instead of moving away from 30 FPS, the industry is doing its best to keep it going strong, even spreading it on the PC. How disturbing...

But hey, at least I like the part where they openly admit the game is a basic port not programmed to properly utilize the PC memory architecture. The honesty is refreshing!
 
They said that? I've watched and read about everything I can on Witcher 3 and haven't heard this. I highly, highly doubt this though. I'd imagine W3 will require a 3GB card at a minimum.


But then again. There is this. Though one could argue that Witcher and Witcher 2 both had pretty bad issues at launch.

I swear I read a post here where someone said that not too long ago (not this thread).
 

wildfire

Banned
I don't get this mentality at all. If a dev thinks the game is best experienced at 30fps, why is that a bad thing?

more fps doesn't always = more better.

But it usually does because the game is more responsive.

Considering that Resident Evil 1 worked as a game because of that lack of responsiveness, I don't think anyone should be concerned about the consoles being the lead platform and this game being frame-capped.
 

bro1

Banned
They said that? I've watched and read about everything I can on Witcher 3 and haven't heard this. I highly, highly doubt this though. I'd imagine W3 will require a 3GB card at a minimum.


But then again. There is this. Though one could argue that Witcher and Witcher 2 both had pretty bad issues at launch.
We've already seen that almost all of these pc requirements are complete bullshit. This game will run fine on an i5 2500k and a gtx 660 on a 1080p monitor.
 
Thanks... Here's the video.

That was quite a challenge actually, id-tech5 just hates my system. My poor old i7-920 (OC'd to 3.6 GHz) is getting strangled by the streaming, and my R9 290X never did perform well with id-tech5 (that's probably because of AMD's neglect of OpenGL, however). And the recording software also takes a heavy toll on my framerate (dropping it by 50% or so), so I had to drop the quality quite a bit to get some high fps.

With +toggle com_synctotime the framerate is unlocked, but the game's speed becomes dependent on it. So if the framerate goes over 60, the game becomes faster; if it drops below, the game gets slower - it's really challenging to aim that way :) I recommend trying it for yourself, it's quite hilarious. It may be hard to judge by the video, but it's really going bananas.
Wasn't there another command that supposedly fixed the sync issues? Could you try that?
 
Why does the PC recommended ask for 4 GB of VRAM?

The PS4 and Xbox One both have 8 GB of unified RAM which can be used as both system and video memory. Because our PC version is functionally identical to those platforms, we recommend 4 GB of system memory, and 4 GB of VRAM for the best experience.

Can I run it on a card with less than 4 gigs of VRAM?

Yes. Please refer to the minimum requirements above. You won’t be experiencing the game at 1080p and you’ll likely need to turn off some features, but you will still be able to have a great experience with the game.


WTF, that better be some miscommunication or else that is some class A bullshit.

more fps doesn't always = more better.

Up to a point (which is not less than 60) it is, in games at least.
 

Seanspeed

Banned
Instead of moving away from 30 FPS, the industry is doing its best to keep it going strong, even spreading it on the PC. How disturbing...

But hey, at least I like the part where they openly admit the game is a basic port not programmed to properly utilize the PC memory architecture. The honesty is refreshing!
The issue here is likely that we're dealing with a Japanese team who probably don't have a lot of experience developing for PC. We might consider ourselves fortunate we're getting a PC port at all.

Maybe if it does well, their next effort will be a bit more PC-friendly.
 
You're right too, it's hard for me to explain. I like seeing 60fps more than seeing 30fps. Makes me happy of course to see all that extra info.

Could be talking out my arse, but maybe it's to do with your brain filling in less of the info itself at 60 as opposed to 30... in a sense that makes it less immersive - but then the increased agency balances that back.

I noticed this especially when I was trying to find the best settings for Watch Dogs - it was super noticeable; dropping from 50->40->30 increased the cinematic feel :S

I never prefer 30fps, always 60. But there's something there with the cinematic feel. Not that it's worth it.

I don't know what you are experiencing, but just don't call it cinematic. It is in no way connected to the way cinema framerate works.
 

UnrealEck

Member
What's clear is that there has been a sudden, large jump in VRAM requirements, entirely down to the new next-gen consoles

I don't think that's true. Which games are you referring to? This game is using megatextures which are known to use up a lot of VRAM. Do we know what texture quality the PS4 and Xbone are using?
Mordor is the other one that stood out, due to an optional texture pack, though matching console textures doesn't require a large jump in VRAM.
The only other game I can think of is Watch Dogs.
 

wildfire

Banned
Going faster than 30fps is only a "problem" when a development team is myopic enough to tie elements of the game logic to that framerate. So whatever fucking design decisions led to your game being best experienced at a sub-par technical level are BAD decisions, and I'm sure as shit not gonna pay anywhere near full price to see what, if any, good design elements exist in spite of that. There are plenty of other games I can spend that money on.


If you cared about game design you would care to ask questions about what the game is doing rather than make assumptions on technical performance.
 
If you cared about game design you would care to ask questions about what the frame capping allows rather than make baseless assumptions.

Baseless assumptions? We have plenty of examples detailing exactly what he is talking about. In relation to this game? Why should we think otherwise when almost all signs point exactly to what he is saying.
 

Conezays

Member
I'm going to assume my 3GB 780 Ti can handle this game fine, right? Right?

You'd think they would know those specifications are ridiculous.
 

Fractal

Banned
The issue here is likely that we're dealing with a Japanese team who probably don't have a lot of experience developing for PC. We might consider ourselves fortunate we're getting a PC port at all.

Maybe if it does well, their next effort will be a bit more PC-friendly.
Could be the case, true. Reminds me of Dark Souls; even back then, the devs openly said their only goal was to make the game playable on the PC, with no PC-specific improvements.

I'll wait for a heavy discount on this one, but still, their honesty is appreciated.
 

Teremap

Banned
If the game uses even 3.1GB of VRAM, of course they can't recommend you have a 3GB or lower card. We know Killzone Shadow Fall used 3.5GB of VRAM, so I don't think it's really unreasonable for the game to recommend 4GB (especially considering how the megatexture technology works).

At any rate, it'll be interesting to see how the game actually looks in-person (i.e. not in a compressed video). I feel more and more vindicated for splurging on 4GB GTX 670s instead of going for the cheaper option.
 

~Kinggi~

Banned
Nice to have answers. Maybe next time they can deliver them sooner rather than later.

Only thing I'm worried about is whether they will hard-lock the 1080p setting behind 4GB cards or if they are simply saying you can set whatever you want but performance might be bad.
 

wildfire

Banned
Baseless assumptions? We have plenty of examples detailing exactly what he is talking about. In relation to this game? Why should we think otherwise when almost all signs point exactly to what he is saying.

The assumption he made was that tying game logic to framerate is poor game design.

I edited my post to not say it is baseless before you finished your response, but his assumption is still just that, an assumption. It isn't inherently bad to tie logic to frames.
 

Fractal

Banned
Agree to disagree. I am not being hyperbolic.
There's not a single game on the market where more FPS hurts the experience in any way (unless you unlock the frame rate of a game that's hard-locked to 30 FPS through modifications, which breaks the animations, etc.).

Sure, there are games where more FPS doesn't make for much of an improvement, but there are none where it damages the experience. More FPS is always an improvement, sometimes slight, and sometimes very noticeable.
 
What's the status on the game's framerate and aspect ratio?

Shinji Mikami and the team at Tango designed The Evil Within to be played at 30fps and to utilize an aspect ratio of 2.35:1 for all platforms. The team has worked the last four years perfecting the game experience with these settings in mind. For PC players, we’ll provide debug commands on how you can alter the framerate and aspect ratio, but these commands and changes are not recommended or supported and we suggest everyone play the game as it was designed and intended for the best experience.


Do they really think people are that dense? Although I am surprised it has a console command - very rare these days.
 

Rhaknar

The Steam equivalent of the drunk friend who keeps offering to pay your tab all night.
Could be the case, true. Reminds me of Dark Souls; even back then, the devs openly said their only goal was to make the game playable on the PC, with no PC-specific improvements.

I'll wait for a heavy discount on this one, but still, their honesty is appreciated.

Dark Souls was still my game of the generation, imagine that :/
 
The assumption he made was that tying game logic to framerate is poor game design.

I edited my post to not say it is baseless before you finished your response, but his assumption is still just that, an assumption. It isn't inherently bad to tie logic to frames.

How is it not? BioShock did this and it made for some screwy physics, and so did L.A. Noire, etc. They had their physics (logic) tied to framerate and it caused issues. Tell me how it's better to tie logic to framerate? I legitimately don't know and would love to find out.
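For contrast, the usual way to decouple logic from render framerate is a fixed-timestep accumulator: the simulation always advances in identical increments no matter how fast frames are drawn. A hedged sketch (illustrative names and numbers, not code from any of the games mentioned):

```python
SIM_DT = 1.0 / 60.0  # physics/logic always steps in 1/60 s increments

def advance(position: float, velocity: float,
            frame_time: float, accumulator: float):
    """Consume one rendered frame's worth of real time in fixed steps."""
    accumulator += frame_time
    while accumulator >= SIM_DT:
        position += velocity * SIM_DT  # identical step at 30 or 120 fps
        accumulator -= SIM_DT
    return position, accumulator

# One simulated second at 30 fps rendering (30 frames of 1/30 s each):
pos, acc = 0.0, 0.0
for _ in range(30):
    pos, acc = advance(pos, 10.0, 1.0 / 30.0, acc)
print(round(pos, 3))  # ~10.0 units - the same distance at any framerate
```

The render loop can run as fast or slow as the hardware allows while the simulation stays deterministic, which is exactly what breaks when logic is stepped once per frame instead.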
 

UnrealEck

Member
People seem to think framerate matters only to camera panning. Everything that moves can be smoother, more fluid and life-like with higher framerates.
In a 2D point-n-click adventure, if you have animations going on like people in the background walking around, talking, trees blowing, cars driving by etc, it'll all look nicer, more fluid, smoother with a higher framerate.
 

Kriken

Member
4GB of VRAM for 1080p 30fps? Lol, fuck that, sounds like a bad port

Edit: For clarification, I'm disappointed; I was really hoping the PC port would be okay. It seems ridiculous to require 4GB of VRAM just to achieve 1080p, and the lack of framerate options should not exist in 2014
 

Seanspeed

Banned
The game is not even in the top 25 on Steam best sellers only a few days from release.
I'm not surprised. They gave no minimum requirements for the longest time and initially made it sound like 99% of PC gamers didn't have the hardware to even *run* the game.

And while we have minimum requirements now, they are still high and requiring a 4GB card just to play at 1080p(according to them) is also going to turn off a ton of potential customers.

I'm certainly in no hurry to buy this until I know more about how it performs.
 

Skilletor

Member
Turn based strategy games benefit from higher frame rates? Point and click adventure games?

yes and yes? Aren't point and click games controlled with a mouse? Aren't mouse controls aided by higher framerates?

How many people have you seen complain about the framerate of FFT:WotL on PSP?

There's not a single game on the market where more FPS hurts the experience in any way (unless you unlock the frame rate of a game that's hard-locked to 30 FPS through modifications, which breaks the animations, etc.).

Sure, there are games where more FPS doesn't make for much of an improvement, but there are none where it damages the experience. More FPS is always an improvement, sometimes slight, and sometimes very noticeable.

I agree? lol. I think you're responding to the wrong person. :p
 

Fractal

Banned
Dark Souls was still my game of the generation, imagine that :/
Well, I never said it's a bad game. I enjoyed it a lot as well, and soon I'll finally start with DS2.

I was simply commenting on the port quality. Without Durante's fix, it was abysmal.
 

nbthedude

Member
Because I actually like gameplay. 30fps is sluggish and makes the game feel less responsive to me. This has nothing to do with a vision and everything to do with getting the game to run on consoles.

And screw this cinematic talk. I can't wait for movies to finally take the plunge into smoother framerates too. I'm tired of archaic minded people thinking that films need to run at a low ass frame rate. Why not go back to 12fps like silent films had if you want that OG cinematic feel? I can't wait for a new generation of people to grow up on something other than 24fps films. This is getting annoying.

You kind of just proved his point rather than negating it. Even if it were true that 30fps makes the game more "sluggish", what if the designers want the game to feel "more sluggish", or at least have designed it to react more slowly in that manner for tension?

In this case, though, I think it's just a matter of the animations being tied to frame rates. There are a few people here, like Durante, who would know how much extra work that would require to undo, but most people have no idea and just go around shouting bullshit about "extra work" and "sloppy port" without any qualification to their statements or knowledge of what the fuck they are talking about.
 

blaidd

Banned
So the game is twice as fast at 60 vs 30?

Yeah, but in Wolfenstein 60 fps is normal speed, and 30 fps is half the speed. Above 60 it gets faster.

Here's a gif I found on Guru3D:


Wasn't there another command that supposedly fixed the sync issues? Could you try that?

I just did, it doesn't work.

The console spews this out: Warning: r_syncAtEndFrame cannot be set in Retail! - so it's a developer-feature which got disabled.

Hopefully, it will work in The Evil Within. I'm going to play the game anyway if I have to; it made a good impression when I was playing it - there aren't many classic survival horror games anymore.
 