So they pretty much confirmed that the consoles were the leading platforms and this is a direct console port. That explains the high VRAM requirement, since the PC version is "functionally identical." I don't know how I feel about this; I think developers should target PC hardware first, then console hardware.
Really my only reservations are with Bethesda's crap handling of all PC info.

The only Mikami game I didn't enjoy was P.N.03.
A good game at higher fps is always better.

I don't get this mentality at all. If a dev thinks the game is best experienced at 30fps, why is that a bad thing?
more fps doesn't always = more better.
I like the sound of advanced camera controls.
It is. The real reason developers recommend 30fps is that they couldn't get the game running past 30 on consoles, and now have no interest in doing extra work for the PC version. The PC platform is meant for customization. If you like 30fps, sure, but let me have my options, and I'll see for myself whether the 30fps cinematic experience is better or not. Regarding this game, just look at that bullcrap statement: 8GB on consoles, so 4+4 on PC. Fuck that. And "we'll provide debug commands for unlocked frame rate, but do that at your own risk." Fuck that as well.

I don't get this mentality at all. If a dev thinks the game is best experienced at 30fps, why is that a bad thing?
more fps doesn't always = more better.
I feel what you're saying, but maybe some of us don't like anyone else (developer or otherwise) telling us how to best experience a game we purchased with our own money.
I'm assuming it was made for current gen, because have you seen a last gen console port that recommends 4GB of VRAM?
Shadow of Mordor recommends 6GB.
Because the 30fps = cinematic thing is bullshit.

but if 30fps will create a better, cinematic mood, why not?
But developers dictate how you experience their games in all sorts of ways that we have no say in, why shouldn't 30fps be any different?
I like most of my games in 60fps (especially racing games) but if 30fps will create a better, cinematic mood, why not?
And it's a current gen game. Plus, the 6GB is for the optional Ultra HD texture pack. The last gen version is being done by a whole different company, ported down from current gen.
You'll be fine. See: Wolfenstein.

Minimum is an i7 processor?
WTF is this shit?
I guess it won't load on my shiny, new i5 then...
It's on last gen too. Same as Evil Within. That's what I am asking him. How are you concluding it's a current gen game? What criteria are you using?
... For an ultra texture pack not included in the base game. Your point?
The PC performance thread should be glorious.
Popcorn, or the salt?

The popcorn stench will last for days.
I don't agree it makes it nicer to watch at all. Even seeing Gamersyde footage of Ryse at 60fps is glorious and so much better than at 30fps. And that's about as 'cinematic' a style of game as you can get.

I think 30fps does feel more cinematic.
Which makes it nicer to watch sometimes but always more frustrating to play.
After like 15 minutes of playing you forget about the lack of cinemaness and just appreciate the increased responsiveness and visual info.
If you think less smooth = cinematic then why not play the game locked at 15fps or something? That's like twice as cinematic as the devs intend. That'd be Oscar worthy!
That's cool. Some people argue that 900p is sometimes better looking than 1080p, too.

haha, guys, calm down. I'm not trying to convert you or anything. 30fps does wonders for me when a game is cinematic, sometimes the fluidity of 60fps breaks me out of the game and makes it feel too "gamey". I know most of you guys won't agree, but there it is.
Framerates in movies/TV work differently than in games.

60fps didn't help The Hobbit any. In fact, I'd say it made it worse.
But I digress, I'm not in like-minded company. I'm all for the option for 30/60.
I am not saying you are wrong for believing that, but that statement never ceases to bug me.

sometimes the fluidity of 60fps breaks me out of the game and makes it feel too "gamey".
We are talking about games here. 60fps is more to do with response and fluidity.

60fps didn't help The Hobbit any. In fact, I'd say it made it worse.
But I digress, I'm not in like-minded company. I'm all for the option for 30/60.
You'll be fine. See: Wolfenstein.
Also, King_Moc, we seem to share tastes. Outside of the game's issues on Xbone, how was it?
Going faster than 30fps is only a "problem" when a development team is myopic enough to tie elements of the game logic to that framerate. So whatever fucking design decisions lead to your game being best experienced at a sub-par technical level are BAD decisions and I'm sure as shit not gonna pay anywhere near full price to see what if any good design elements exist in spite of that. There are plenty of other games I can spend that money on.
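The "game logic tied to framerate" problem that post describes can be sketched in a few lines (hypothetical names, not from any particular engine): if movement advances a fixed amount per frame under the assumption of 30fps, unlocking the framerate changes game speed, whereas scaling by each frame's delta time keeps behavior identical at any framerate.

```python
# Sketch of framerate-dependent vs framerate-independent update logic.
# All names are hypothetical; this is not from any specific engine.

SPEED = 120.0  # intended movement speed, in units per second

def update_fixed(pos, frames):
    """Moves a fixed amount per frame -- only correct if the game runs at 30fps."""
    step = SPEED / 30.0  # 4 units per frame, baked-in 30fps assumption
    for _ in range(frames):
        pos += step
    return pos

def update_delta(pos, frames, fps):
    """Scales movement by delta time -- correct at any framerate."""
    dt = 1.0 / fps  # seconds elapsed per frame
    for _ in range(frames):
        pos += SPEED * dt
    return pos

# One second of simulation:
print(update_fixed(0.0, 30))        # 120.0 -- as designed at 30fps
print(update_fixed(0.0, 60))        # 240.0 -- double speed once unlocked to 60fps
print(update_delta(0.0, 60, 60.0))  # ~120.0 -- unaffected by the framerate
```

This is why debug-unlocked framerates come with an "at your own risk" warning: any system written like `update_fixed` (physics, animation timing, cooldowns) silently speeds up or breaks.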
Why does the PC recommended ask for 4 GB of VRAM?
The PS4 and Xbox One both have 8 GB of unified RAM which can be used as both system and video memory. Because our PC version is functionally identical to those platforms, we recommend 4 GB of system memory, and 4 GB of VRAM for the best experience.
Can I run it on a card with less than 4 gigs of VRAM?
Yes. Please refer to the minimum requirements above. You won't be experiencing the game at 1080p and you'll likely need to turn off some features, but you will still be able to have a great experience with the game.
I've got my fingers crossed, cuz the game definitely has a ton of potential.

I could only play chapter 9, but it felt pretty clear that this is basically a survival horror Resident Evil 4. Some more supernatural looking elements to it, and the enemies felt more threatening though. Ammo seemed like it's going to be pretty scarce, I was having to run away a lot. This is where the zoomed in camera screws you over. The FOV is so narrow, it's difficult to turn and run towards something with any degree of accuracy.
I'm pretty convinced that if the PC version can get round these issues, it's going to be stellar though.
Right, so hardcoded to 30fps, but you can remove them through the debug? Hopefully that works without problems then. I really have a hard time believing those black bars were due to "artistic vision". That, combined with how closely zoomed in the camera was and the framerate, made this a nightmare to control on the XB1 version.
Isn't The Witcher 3 currently using something like 1-2GB of VRAM?
Great post!
Would love to see those videos!
You're talking about developers who know PC very well.