
It shouldn't be just 30 or 60 fps.

Dude, calm down... This is a discussion message board... If you start a topic, expect people to comment on it. Especially if you basically say "30fps is too slow... That's a fact" and shoot down everybody who says it's fine for them.
You are right but it was the tone of his message that got me, not his opinion.
 
Stop trying to make devs conform to some weird arbitrary standard on console.

If you want to customize your experience, you already have a PC.

A lot of us really don't care about framerate to that extent unless it's certain genres.

Is it really all that arbitrary? When making games, you are basically limited by what the TV can run in terms of framerates.

Check your TV's input. Your TV is pretty much running at 60hz all the time, unless you're watching a movie and have that 24p movie setting turned on. You can't pick random stuff like 51/52/53.

Most game console OSes/UIs run at 60fps (50 for PAL), all the way back to even 15 years ago with the PS2/Xbox/GC.

60 is the highest one that everyone has, 30 is the next, and is twice as bad, objectively.
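Putting rough numbers on that: the gap between 30 and 60 is best seen in frame time, the wait between updates, which is what you actually feel as lag and judder. A quick illustrative sketch:

```python
# Frame time: how long each frame stays on screen. The step from
# 30fps to 60fps halves it, which is why the difference is so visible.
for fps in (30, 60, 120):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
```

So a 30fps game holds each image twice as long as a 60fps one, and any input can sit for up to 33ms before the rest of the pipeline even starts.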

Not really arbitrary, not really democratic.

Also, why are you perturbed that we care? Is 60fps going to hurt the games you play, when you don't even care most of the time? What would you be missing out on, playing games designed to run at *Native Resolution*/60fps?

If the primary aim of the game is gameplay/combat/reflex in any shape or form, 60fps is the right option. If you want to make a movie/story/non-reflex game, 60fps for player input required tasks like menus would be a nice compromise, but most of the time we can't even get that.
 
I don't think EU panels actually refresh at 50hz, they can just accept a 50hz signal for the PAL regions
I don't know, it's just that the games I mentioned (GTA5, Project Cars) offer me a 50hz option. I don't manually set my TV to 50hz or anything, my default is 60, yet these options are still there, and when I select them, the panel changes to 50hz.

Nope, I stopped paying attention after that. 30fps is perfectly playable.
Yes. Playable. But still slow since it blurs everything out and has noticeable jumps between frames, compared to smoother rates. The point of this thread is to find a rate which is playable, inexpensive and smooth/sharp enough so you won't have to deal with the 30fps blurriness or the sacrifices of 60fps.

I mean, come on, 30fps is the minimum acceptable frame rate.
 
I dislike lower and unstable framerates but I'm astonished by the "slow" talk. This isn't about speed, this is about fluidity and responsiveness.
 
I think next gen will be when we get 60fps at 1080p.

If they go to 4K then it'll need to be 30fps and I'd take 1080p 60fps over 4k 30fps.

Framerate should be top priority. Drop it to 720p if needed. Just give me 60fps.
C'mon, I did not get a PS4 to play in blurry-arse 720p. I don't even want 900p.
 
I disagree. Even on consoles, 40 to 50 fps is not jarring and feels a lot smoother than 30fps. Second Son ran at around 40 fps and felt much better than the 30fps option.
Agreed, the framerate in ISS, First Light, and unlocked Shadow Fall felt much better than locked 30fps.

I don't get the comments suggesting that "these consoles struggle to hit 60fps". 60fps is a design decision, just as 30fps is. If a developer wants their title to be as stable as Metro, Wolfenstein, MGS5, or Forza, they will work towards that, with 1080p resolution intact.

Let's not forget that these consoles do have many 60fps titles, so there's no struggle "to hit that metric". Further to that, the weakest console this gen, the WiiU, has the most 60fps titles by far, and the PS2 was basically a 60fps machine. 60fps is not some new discovery in console gaming at all.
 
Let's not forget that these consoles do have many 60fps titles, so what exactly is "struggling to hit that metric"? Yet the weakest console this gen, the WiiU, has the most 60fps titles by far, and the PS2 was basically a 60fps machine. 60fps is not some new discovery in console gaming.
Well, 99% of NES, Master System, Mega Drive, SNES, etc games are all 60fps. A few 30fps games looked really bad back then and stood out (like Sonic Spinball). Only home computers had lower frame rate standards (which is ironic because now it's the opposite). Oh, and 3D polygon games (Starfox, etc).

...even the Atari 2600 has all of its games running at 60fps. The whole 30fps standard started with the Saturn/PS1/N64 because we wanted 3D textured graphics but those machines were barely capable of doing that. Also there were many arcade ports from superior machines that the consoles couldn't handle. So they settled at 30fps.
 
Why lock frame rate ? Unlocked at all times.

Question: developers over the past 10 or so years, as far as I can tell, have started locking frame rates at things like 60/90/120 FPS on PC, sometimes with the reasoning that certain things may "break" if the frame rate is too high, yet I can't remember many older games like Half-Life having these problems. Is there something about modern development that's causing this?
 
Why lock frame rate ? Unlocked at all times.

Question: developers over the past 10 or so years, as far as I can tell, have started locking frame rates at things like 60/90/120 FPS on PC, sometimes with the reasoning that certain things may "break" if the frame rate is too high, yet I can't remember many older games like Half-Life having these problems. Is there something about modern development that's causing this?
Too many console ports/multiplatforms maybe?

Half-Life was a PC exclusive initially. I don't think games that were developed with PCs in mind have this problem.
 
Why lock frame rate ? Unlocked at all times.

Question: developers over the past 10 or so years, as far as I can tell, have started locking frame rates at things like 60/90/120 FPS on PC, sometimes with the reasoning that certain things may "break" if the frame rate is too high, yet I can't remember many older games like Half-Life having these problems. Is there something about modern development that's causing this?

Well, Half-Life is a PC game; this usually happens when something gets ported from consoles to PC.

Besides, similar things did happen: there are a number of games that have their framerate unlocked but have everything play much faster on more powerful PCs.
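A toy sketch of that speed-up (the names here are mine, purely for illustration): if movement is applied once per rendered frame, the same wall-clock second covers twice the distance at twice the frame rate.

```python
def distance_per_second(speed_per_frame, fps):
    # Logic runs once per rendered frame, so one real-world second
    # at a higher frame rate simply runs the update more times.
    pos = 0.0
    for _ in range(fps):
        pos += speed_per_frame
    return pos

print(distance_per_second(1.0, 60))   # 60.0
print(distance_per_second(1.0, 120))  # 120.0 -- the game runs twice as fast
```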
 
this technology needs to be widespread.

People are never going to pay an extra $150+ per TV/monitor for this tech. The <1% of the market that are enthusiasts, sure. Everyone else? Nope.

Moreover, AAA games make 80-90% of their revenue on consoles. So no dev working on an expensive title is going to target the PC market first, let alone the 82 people who will be playing with G-Sync monitors.
 
Well, 99% of NES, Master System, Mega Drive, SNES, etc games are all 60fps. A few 30fps games looked really bad back then and stood out (like Sonic Spinball). Only home computers had lower frame rate standards (which is ironic because now it's the opposite). Oh, and 3D polygon games (Starfox, etc).

...even the Atari 2600 has all of its games running at 60fps. The whole 30fps standard started with the Saturn/PS1/N64 because we wanted 3D textured graphics but those machines were barely capable of doing that. Also there were many arcade ports from superior machines that the consoles couldn't handle. So they settled at 30fps.

I kinda think that once we got over that early hump, doing 60fps really wasn't a problem at a hardware level anymore.

The problem was the changing mindset of the games industry. Games started being pushed to be BIGGER N' BETTER, or to simply emulate/copy movies as much as possible, around the PS2/XB/GC era, and this came to a head last gen. Pushing hardware too far and lowering fps standards for the sake of graphics, losing sight of important gameplay necessities, like 60fps, along the way.

Take for instance Killzone 2. That game was bananas in terms of looks, but in its quest to be a futuristic WWII with graphics that had to live up to a CG hype video, the devs dumped control responsiveness out the window in favor of a fancy, latent-as-hell rendering pipeline. Shooting at dudes has never been so difficult in any other game I have ever played.

Same thing for Heavenly Sword. That game had massive Hollywood-style battles and super impressive movie performance capture...and forgot it was an action game. Shit loads of slowdown, super sluggish and unresponsive controls.
 
Yeah, until TV manufacturers decide to use FreeSync or G-Sync tech in their panels, we're stuck with the 30 or 60 option.

In my opinion I would happily take 900p/60fps over 1080p/30fps, as fps will always trump res in my eyes.
 
50Hz? Aren't most monitors/TVs outside of Europe unable to support that anyway? I know that in Europe it's a standard, but I thought that outside of here most TVs don't have it because it was never used outside of here.

There are more PAL territories than NTSC territories.
Monitors often don't support 50hz - even in PAL territories.

For TVs, I don't know, but they should use the same panels. 50hz should be patchable with firmware.

If Xbox One and PS4 would allow a 50hz option that could be forced by the OS, games would still be optimized for 60hz, but those that have some drops would run more smoothly if you have a 50hz TV.

Should be a no-brainer to implement such an option. Remember the whole 900p-gate and the Digital Foundry discussions. If Xbox One had this option, some games could run better than on PS4 even though the console is weaker.
 
I kinda think that once we got over that early hump, doing 60fps really wasn't a problem at a hardware level anymore.

The problem was the changing mindset of the games industry. Games started being pushed to be BIGGER N' BETTER, or to simply emulate/copy movies as much as possible, around the PS2/XB/GC era, and this came to a head last gen. Pushing hardware too far and lowering fps standards for the sake of graphics, losing sight of important gameplay necessities, like 60fps, along the way.
I agree. I used to enjoy 60fps arcade racing games on my PS2/Xbox. Now I can only do that on PC.

Also, many of the best looking games of that era were 60fps. Rogue Leader 2, Metroid Prime, F-Zero GX, Rallisport Challenge 2...

go figure.


No it's not. Your whole story falls apart here.
Many people will disagree with you. That's why we are trying to find what would make most people happy, including those who want graphics first.


I think some researcher said 48fps is the best compromise for wanting good graphics and smooth, responsive controller input rates.
Looks like a good number to me.


Nothing wrong with that at all
It's wrong for non-cinematic games that are not supposed to look like movies.
 
Seems some people have never played The Witcher 3 if they say 60fps games are never jarring. I don't know what the hell is happening there, but sometimes it doesn't even seem like 60fps, given the awful responsiveness of the controls in that game.

The Witcher 3 has that classic Euro Jank which the movement patch made somewhat better but it's still heavily there.
 
Higher is always better, but variable refresh rate is the only viable option.

If the system can sustain a 200fps rate sure, why not (and displays will get better and faster), but the best way to get the most out of hardware is having the gpu tell the display what to do and not the other way round.

I'm sure next-gen consoles will have something similar (and TV producers will somehow adapt).

Get a g-sync/free-sync display guys.
 
No it's not. Your whole story falls apart here.

.

Mass market is fine with 30fps, just like they are fine with DVD over Blu-ray, Spotify over CD, and LCD over plasma/OLED. And there is nothing wrong with that. This stuff is just entertainment and distraction. We aren't talking about medical equipment here.

And once you have to start paying for things like a mortgage and retirement savings, it is a lot easier to accept DVD, LCD, Spotify, and 30fps.
 
I agree. I used to enjoy 60fps arcade racing
It's wrong for non-cinematic games that are not supposed to look like movies.
Guy's post was implying that games that take inspiration from and try to emulate movies and such are doing it wrong

That's what I was commenting on, not the framerate aspect
 
I'm surprised how so many people in this thread are suggesting variable frame rates as the solution. In previous (frame rate wars) topics it seemed to me that it's some kind of a no-no, like screen tearing or input lag.

I guess we have to thank G-Sync for that?
 
Nothing wrong with that at all

There's nothing wrong with a 30fps cinematic game that puts being a movie first over being a game.

There is something wrong with a cinematic game that aims to put gameplay first (or a large enough component), but then caps the framerate at 30fps.

You can be cinematic (read: set pieces, controlled/non-standard camera angles, controlled pacing) and still run at 60fps. Cinematic =/= lower framerates; that's just something that people who watched The Hobbit at 48fps made up to justify how much they disliked various attributes pertaining to the look of the movie.

The Witcher 3 has that classic Euro Jank which the movement patch made somewhat better but it's still heavily there.

That's probably less Euro-jank and more of an animation priority mindset towards movement+combat. Many people had the same problem with the FFXV demo.
 
I think this is turning into more of a display discussion than a console/developer one. Developers, or publishers in particular, want to make their games flashy to appeal to a wide audience. If displays could handle anything thrown at them, consoles could happily throw out variable framerates and people would be happy. Maybe with FreeSync (is Adaptive-Sync the open standard it's based on?) future consoles and TVs will be able to do just that.

I may kinda see what you're saying.

We just had a 4K TV installed a few months ago and it has this strange feature to make 30fps video look like 60fps through, I'm assuming, frame interpolation.

When I use that feature on games, they actually look 60fps-smooth even when they were originally 30fps. I wonder if TVs will ever be able to make that work perfectly?
 
Well, 99% of NES, Master System, Mega Drive, SNES, etc games are all 60fps. A few 30fps games looked really bad back then and stood out (like Sonic Spinball). Only home computers had lower frame rate standards (which is ironic because now it's the opposite). Oh, and 3D polygon games (Starfox, etc).

...even Atari 2600 has all of it's games running at 60fps. The whole 30fps standard started with Saturn/PS1/N64 because we wanted 3D textured graphics but these machines were barely capable of doing that. Also there were many arcade ports from superior machines that the consoles couldn't handle. So they settled at 30fps.
Well, I can understand why N64 games weren't 60fps by and large, but even the PS1 had a bevy of 60fps games. It's so strange, because some of the best looking games at the time were 60fps with high quality assets. I guess since recent games have sacrificed such fidelity in graphics to hit 60fps, people feel the consoles are struggling, when other games have been able to maintain high fidelity graphics and resolution whilst also holding 60fps. That's a great balance imo.

Well, yes, I remember games like Tobal on PS1 and how impressive that game was, and MGS2 on PS2; they were probably the best looking and most technically proficient games when they released.
60FPS is actually the middle ground, because 60FPS is too slow also.
I fear that this type of sentiment may soon gain ground. I'm sure you will eventually have people saying that 60fps is a slideshow. I find I'm good with framerates at 45-60fps generally, 50-60fps more so. That's why I don't bother to upgrade my monitor to 120+ refresh, but I have a feeling that when 120Hz and 240Hz monitors start rolling in as standard, everything else will be called a slideshow.

As for now, my only issues are games that could do 60fps but don't, and games that fall below 30fps, especially consistently. All these GPU-taxing effects are not worth sub-30 frames; developers need to get their act together and do a proper job.
 
When i use that feature on games, they actually look 60 in smoothness even when they were originally 30fps. I wonder if TV's will ever be able to make that work perfectly?
I also have this on my old LCD TV (4 years old). It makes everything look like it moves smoother but there are visual artifacts and a lot of input lag. I don't know if this can be improved so it won't add any more lag.
 
Well... the information in this link pretty much proves that 30fps is indeed "slow".



Is that the modern equivalent of David Perry games? (Earthworm Jim, Aladdin, etc)
I wouldn't really know, never played those games. From watching some critique on The Witcher, and hanging around increasingly insane FFXV threads, animation priority is the culprit, but 30fps (and below, often) certainly doesn't help either game.
I may kinda see what you're saying.

We just had a 4K TV installed a few months ago and it has this strange feature to make 30fps video look like 60fps through, I'm assuming, frame interpolation.

When I use that feature on games, they actually look 60fps-smooth even when they were originally 30fps. I wonder if TVs will ever be able to make that work perfectly?

Any additional processing done by the TV creates lag. Bad for games, period.

Natively done by the game? Actually not detrimental to response time, apparently. Source 1. Source 2. I still would rather have native output, because interpolation causes too many artifacts.
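A toy example of why interpolation artifacts, for the curious (purely illustrative, not how any specific TV works): a set that only blends pixel values between two frames, with no motion estimation, produces a ghosted average rather than a real in-between image; and it has to hold back a frame to do even that, which is where the input lag comes from.

```python
def blend(frame_a, frame_b, t=0.5):
    # Naive interpolation: per-pixel blend of two frames. Real TVs
    # estimate motion vectors instead; where estimation fails, they
    # fall back to something like this, and you see ghosting.
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

# A white pixel moving across a black background doesn't slide --
# both positions just turn grey:
print(blend([255, 0], [0, 255]))  # [127.5, 127.5]
```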
I fear that this type of sentiment may soon gain ground. I'm sure you will eventually have people saying that 60fps is a slideshow. I find I'm good with framerates at 45-60fps generally, 50-60fps more so. That's why I don't bother to upgrade my monitor to 120+ refresh, but I have a feeling that when 120Hz and 240Hz monitors start rolling in as standard, everything else will be called a slideshow.
Join uuuuuusssssss, only then can you truly understaaaaaaaand.

It's, like, the best thing ever.
 
Nope, I stopped paying attention after that. 30fps is perfectly playable.

And yet here you are.

This thread is for those of us that are sensitive to sub-60 framerates. If you're one of the lucky people that aren't, congratulations, this problem doesn't exist for you. Can you allow us to have this discussion undisrupted?

On topic, it seems to me g-sync is the ultimate solution to this problem. Is anyone in here optimistic about a future with g-sync capable consoles and TVs or is it a pipe dream?
 
I always found it interesting how different people have different acceptable limits when it comes to frame rate. It seems obvious when we talk about clarity, since some of us have better eyesight, so why do I have such a hard time understanding why some people are fine with 30fps? Is it actually a thing? Are there people who find 30fps truly smooth and responsive enough (I guess they would be the lucky ones)? Perhaps it's a confirmation bias towards their favourite game or platform or something?

I play Insurgency with my missus now and then and she seems happy at about 16fps on her laptop! She even played Evolve with me at 9fps, NINE! It was only at this point that she remarked that it was too "laggy". She is blown away when she plays on my PC at 60, but she always remarks that the graphics are so much better and doesn't mention frame rate unless I do first.

I have to agree completely with the OP. As a PC gamer I find 30fps to be the lowest acceptable frame rate. At 30fps I feel like I can just barely say that I'm technically playing a video game. It's funny, then, that I find just 15 frames more to be perfectly acceptable. 45fps is the real lowest limit for enjoying a video game, with 60 being pretty nice. I assume I'd fall in love with 120.
 
And yet here you are.

This thread is for those of us that are sensitive to sub-60 framerates. If you're one of the lucky people that aren't, congratulations, this problem doesn't exist for you. Can you allow us to have this discussion undisrupted?

On topic, it seems to me g-sync is the ultimate solution to this problem. Is anyone in here optimistic about a future with g-sync capable consoles and TVs or is it a pipe dream?

G-Sync is Nvidia's proprietary solution that only works with Nvidia GPUs. FreeSync is the alternative developed by AMD that has the advantage of being (to copy & paste from Wiki) "royalty-free, free to use, and has no performance penalty" (G-Sync apparently has a 2% performance penalty). There are some differences between the two *Syncs, but they're largely similar. I also read that Intel are going to be using FreeSync in their integrated GPUs.

I think the biggest thing to note is that they're both currently only compatible with DisplayPort, so unless a future HDMI standard pops up, people will have to rely on a select few TV manufacturers putting out DisplayPort TVs (assuming they don't already).
 
60 fps is already too slow. It was impressive in the 90s, but it's time our standards advance a little. Games don't look real smooth until you get past 100 fps.
 
Agreed, the framerate in ISS, First Light, and unlocked Shadow Fall felt much better than locked 30fps.
Every time I read that opinion I literally cringe. My eyes just can't take it. Those games look truly abysmal in motion with an unlocked frame-rate. Locked 30fps is infinitely better in those instances.

I can accept Tomb Raider DE on PS4, though. It stays close enough to 60fps so as not to be a huge problem.
 
60 fps is already too slow. It was impressive in the 90s, but it's time our standards advance a little. Games don't look real smooth until you get past 100 fps.
Let's not get carried away here... Even though I agree with you (I can't play Quake 3 at anything lower than 85fps, and only on a CRT monitor), you can see that most people won't accept anything higher than 30fps as the standard (because of the extra visual effects), let alone higher than 60...

Besides, most TVs/monitors are 60hz so let's just work within that limit for the time being.
 
Seems some people have never played The Witcher 3 if they say 60fps games are never jarring. I don't know what the hell is happening there, but sometimes it doesn't even seem like 60fps, given the awful responsiveness of the controls in that game.

Animation priority.

A number of cinematic games would have a similar feeling at 60.
 
This thread is for those of us that are sensitive to sub-60 framerates. If you're one of the lucky people that aren't, congratulations, this problem doesn't exist for you. Can you allow us to have this discussion undisrupted?

It's not much of a discussion if all you're after is an echo chamber of people who agree with you. That shouldn't really be what this is about.

On topic, I think a locked 30 is just fine. 60 is cool if it's there, but it's not a huge selling point.
You stop noticing after a while, and nobody outside of enthusiasts is really bothered about it.

Pretty graphics and cool effects are much more interesting to most people.
 
I don't think 60fps will become standard even next gen; too many devs think 30fps is cinematic, or want the graphics looking amazing for publicity, and make the sacrifice.

A lot more developers are embracing it however and even make it a mark on the back of retail packaging now as a feature. So I think even more will make it their goal over time.
 
It's not much of a discussion if all you're after is an echo chamber of people who agree with you. That shouldn't really be what this is about.

On topic, I think a locked 30 is just fine. 60 is cool if it's there, but it's not a huge selling point.
You stop noticing after a while, and nobody outside of enthusiasts is really bothered about it.

Pretty graphics and cool effects are much more interesting to most people.
The discussion in this particular topic is mostly about what can be done for the FPS standard to be improved a bit, without too many visual sacrifices, so that everyone can be happy and have a better standard for the gaming industry, because 30fps is too low for many and 60 is too "expensive".

There are too many 30vs60fps threads here already. In them, every second or third post looks exactly like yours: someone always says "locked 30fps is fine" and "60 is cool if it's there", and that most people won't care, and that graphics matter more to most people. We know this already; it's probably the most common kind of post on NeoGAF. So we're trying to have a different discussion here.
 
30fps for videogames is too slow. Let's face it for once.

Your first line is incredibly wrong, OP. 30 FPS is absolutely fine; it only becomes bad when there are dips below it. Dips below 60fps are not nearly as notable, but dips below 30 are definitely easily picked up.

Games are absolutely fine if they're 30 FPS with decent smoothing.
 
Join uuuuuusssssss, only then can you truly understaaaaaaaand.

It's, like, the best thing ever.
I may have to get a more cutting edge rig before I try it, but I may just do so soon.
Every time I read that opinion I literally cringe. My eyes just can't take it. Those games look truly abysmal in motion with an unlocked frame-rate. Locked 30fps is infinitely better in those instances.

I can accept Tomb Raider DE on PS4, though. It stays close enough to 60fps so as not to be a huge problem.
Tomb Raider was fine too, but since ISS was a much faster paced game than TR, the unlocked framerate felt much better. I'm sure if ISS was 55-60fps for 99% of the time it would have felt even better still.

I do believe higher frames are better, but it's more important in some genres than others.
 
I wonder when 60fps will become standard with all the other graphics options maxed out.

Playstation 5?

Console manufacturers are motivated to lower the hardware cost to create a profit on the front end, so expect to see a similar situation to the Xbox One and PS4 next generation, where the hardware is actually way behind instead of pushing the envelope technically like the 360/PS3.

Developers are motivated to push visuals which help with the marketing process. The majority of commercials are run at 30 FPS, prior to 2015 YouTube videos were locked to 30 FPS, and screenshots are at 0 FPS.


The more power you give devs, the more they will use it elsewhere instead of on framerate.

It's a never ending cycle.

I think devs should start prioritizing 60fps regardless of hardware.
I'm glad to see what some are doing, like 343 with Halo and EA with Battlefield, for example. This is how it should be: 60fps over everything.

Call me selfish, but I would prefer it if developers targeted 30 FPS on consoles and pushed visual fidelity, so the PC port can push the framerate and still have decent visuals. ;)

Why lock frame rate ? Unlocked at all times.

Question: developers over the past 10 or so years, as far as I can tell, have started locking frame rates at things like 60/90/120 FPS on PC, sometimes with the reasoning that certain things may "break" if the frame rate is too high, yet I can't remember many older games like Half-Life having these problems. Is there something about modern development that's causing this?

I've asked for the technical reason for tying game logic to frame rate in previous threads. Apparently it is less performance intensive from a CPU perspective and easier to program on the front end.
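For the record, the usual alternative is a fixed timestep: accumulate the real time each frame took and run the simulation in constant-size chunks, so rendering faster doesn't change game speed. A minimal sketch (all names here are illustrative, not from any particular engine):

```python
STEP_MS = 16  # fixed logic step, roughly 60 updates per second

def advance(pos, speed_per_ms, frame_ms, accumulator_ms=0):
    # Add the real time this frame took, then run logic in fixed
    # STEP_MS chunks; leftover time carries into the next frame.
    accumulator_ms += frame_ms
    while accumulator_ms >= STEP_MS:
        pos += speed_per_ms * STEP_MS
        accumulator_ms -= STEP_MS
    return pos, accumulator_ms

# One slow 160ms frame advances the game by the same amount as
# four fast 40ms frames would in total:
print(advance(0.0, 1.0, 160))  # (160.0, 0)
```

It's a little more bookkeeping than just multiplying movement by the frame delta, which lines up with the "easier to program on the front end" explanation above.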
 