VisceralBowl
I had that as well.
I'm assuming CRU looks like this when you open it?
http://i.imgur.com/8glpMJC.jpg
Yep.
Hmm....
If you haven't already restarted then I would do that, if you have then I dunno lol.
30 fps on PC definitely feels jittery to me, compared with console games running at 30 on the exact same display. Max Payne 3's pre-rendered cutscenes, for instance, look pretty bad on PC compared to console because of this jittery movement.
My monitor is not liking this; it's showing a black screen and saying unsupported refresh rate (Dell U2312HM). I think I will try the 1080p24 mode on my HDTV for a "cinematic experience".
I'm a framerate guy, but this could be useful if I decide to play the new Assassin's Creed game (ubi pls), so thanks.
Really? It shouldn't be a problem for prerendered stuff. Otherwise 24~30fps videos would look like crap on PC compared to PS3, for example, but they don't.
Maybe they're just encoded differently on PC, or the fact that you were just playing at 60fps impacted your perception of the 30fps cutscene?
I thought I was the only one who was bothered by this. PC games at 30fps/60Hz feel choppy with constant stutters. But on consoles 30fps doesn't bother me at all.
If my hardware struggles with putting out 60 frames per second, I let it do as much as possible. Why chop it down to 30? If it happily does 50, that's nice.
I would assume it's because you're using a mouse. 30fps is totally fine with a controller, but with a high fidelity input device like a mouse it can appear choppy.
They can, just takes some custom settings like I stated above.
Never use in-game VSync, always force adaptive 1/2 through Inspector. That coupled with a 31/30/29 fps cap should do the trick.
Which cap you need depends on the individual game and also your display.
That is correct. The reason some people think console games look smoother at lower framerates is not the refresh rate the system drives the TV or monitor at (a TV doing frame interpolation can make things look smoother, but it adds a ton of input lag and should be avoided at all costs), and it is not some magical motion blur that the PC lacks.
It is simply that controller sticks move smoothly on an axis, and mice do not.
Exactly. It's not simply a matter of "using a mouse" when, in fact, I'm mostly using a controller anyway. If you slowly rotate the camera using a controller, it becomes insanely obvious that something is interfering with the frame consistency.
Wrong. It's a frame pacing issue, as Dark10x said. A proper 30fps cap should deliver a steady stream of frames 33ms apart, but most caps, especially those found in-game, generally don't do this. I've found MSI Afterburner to be the best solution, coupled with 1/2 refresh rate vsync.
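To make the "steady stream of frames 33ms apart" point concrete, here's a rough Python sketch of the kind of analysis a frame-time overlay does (purely illustrative; not how Afterburner or any real tool is actually implemented). Given frame presentation timestamps, it computes the per-frame deltas and flags the worst one; a plain fps counter would hide this entirely:

```python
def frame_deltas(timestamps_ms):
    # Per-frame times: differences between consecutive presentation timestamps.
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def pacing_report(timestamps_ms):
    deltas = sorted(frame_deltas(timestamps_ms))
    return {
        "avg_ms": sum(deltas) / len(deltas),   # what an fps counter reflects
        "worst_ms": deltas[-1],                # what you actually feel as judder
    }
```

A well-paced 30fps capture and a badly-paced one both report ~33ms average (i.e. "30 fps"), but the badly-paced one shows spikes near 50ms in `worst_ms`.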
I do not recommend the solution in the OP. As someone using a plasma: outputting at 30Hz would strobe enough to give you a seizure.
I mostly game on a plasma (both PC and consoles) and would never dream of using any sort of LCD display as a primary gaming monitor. Aside from the motion resolution issues, the main improvement is simply black levels. LCDs have to fake contrast, and monitors tend to be worse than LCD TVs in that regard. IPS is just about the worst thing for black levels at this point.
I'd like to try this 31fps business, but there's no 31fps option in my Nvidia Inspector:
You can type any number you want in there.
So how do consoles handle a plasma? Surely the smoother look would only work on LCD/LED.
Sorry if the question is daft, just trying to get my head around this interesting subject.
I think since consoles are hooked up to TVs for the most part, there is some frame interpolation done by the TV. Some TVs are better at this than others, but it could explain the increased fluidity.
That's just not true. Frame interpolation is a thing only introduced in somewhat modern LCD panels (and occasionally in certain plasmas). It has absolutely nothing to do with this issue.
If you've ever played a PC game that's been limited to 30 fps, you've likely noticed that it isn't as fluid as console games at 30 fps. The FPS counter always reads 30, but something is amiss: the screen seems to stutter continually whenever you pan the camera around. The reason is the discrepancy between your monitor's refresh rate and the game's frame rate. When your monitor's refresh rate is 60Hz and your game is running at 60 FPS, everything is fine.
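Here's a small Python sketch of why this happens (numbers are illustrative). A 60Hz monitor refreshes every ~16.7ms; a well-paced 30fps game holds each frame for exactly two refreshes. A badly-paced cap can deliver the same 30 frames per second with uneven gaps, e.g. alternating one and three refreshes, and an fps counter cannot tell the two apart:

```python
REFRESH_MS = 1000 / 60  # one 60 Hz refresh interval, ~16.7 ms

even_paced = [2 * REFRESH_MS] * 30               # every frame held 2 refreshes
badly_paced = [REFRESH_MS, 3 * REFRESH_MS] * 15  # alternating 1 and 3 refreshes

for name, times in (("even", even_paced), ("uneven", badly_paced)):
    fps = 1000 * len(times) / sum(times)          # both read exactly 30
    spread = max(times) - min(times)              # 0 ms vs ~33 ms of judder
    print(f"{name}: {fps:.0f} fps, frame-time spread {spread:.1f} ms")
```

Both lists sum to one second and thirty frames, so the counter says "30 fps" either way; only the frame-time spread reveals the stutter.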
Yeah, that's the reason PC gamers detest 30fps: a game at 30fps on PC and at 30fps on console, under similar (locked) conditions, will feel different. A console game at 30fps is smoother. It's hard to explain the smoothness of a locked 30fps to someone who can't get it on PC.
This isn't some fantasy.
No it does not. The effect is observable using a gamepad on the PC.
Part of this feeling has nothing to do with Hz/fps syncing (which, I'll admit, is a real phenomenon for those who want to play at a flat 30fps), but rather that a mouse exacerbates the effect of the input lag and how your brain interprets it.
A mouse is 1:1; a controller definitely is not.
You guys realize that the displayed or reported frame rate only gives you half the story about a given game's performance, right?
What matters is the timing of those frames. You get 30 frames displayed in the span of one second, but that doesn't tell you how the frames are distributed IN that second.
To achieve a super smooth locked frame rate look, your timing HAS to be such that frames are distributed evenly in that second. So for 30fps, you want a 33ms frame time, i.e. the time to render each frame.
Next step is to have your display refresh be a multiple of that target frame rate so as to avoid judder due to pull down methods.
If you guys are actually reading this post, I highly recommend checking this link out for more information:
http://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking
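The even-distribution idea above can be sketched as a simple frame limiter. This is a hypothetical illustration (no real game or tool works exactly like this): the trick is to schedule against absolute deadlines rather than sleeping a fixed amount after each frame finishes, so render-time variance doesn't bleed into the pacing:

```python
import time

TARGET_S = 1 / 30  # 33.3 ms per frame for a locked 30 fps

def run_frames(render_frame, n_frames):
    # Sleep until the next absolute deadline, not "TARGET_S after we finished".
    # A fast frame then gets a longer sleep, a slow frame a shorter one, and
    # presentation stays an even 33.3 ms apart instead of drifting.
    next_deadline = time.perf_counter() + TARGET_S
    for _ in range(n_frames):
        render_frame()
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        next_deadline += TARGET_S
```

With the display refresh set to a multiple of the target (e.g. 60Hz for 30fps), each evenly-paced frame then lands on exactly two refreshes, which is what avoids the pull-down judder.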
And this jerky issue the OP is referring to has little to do with being 30fps specifically, as the same issue can persist even at 60 or 120fps.
Also, for those with displays that can actually display a 30hz refresh, DO NOT DO THIS.
You will be exposing yourself to potentially getting a really bad headache or migraine.