
It shouldn't be just 30 or 60 fps.

You're overthinking it. 60fps should be something devs strive for, and it should be something console hardware is a little bit better prepared to hit. This used to be the standard 2 gens ago. Why bother complicating things further with solutions that aren't as good?

All console hardware is and always will be more than prepared to hit 60fps... it's simply a matter of developers choosing how to allocate their available resources.

I wonder when 60fps will become standard with all the other graphics options maxed out.

Playstation 5?

Never, because developers will always be able to push better graphics at 30fps than they would at 60...
 
I know most informed gamers prefer 60 fps, but let's be honest. Does it really matter if you're playing something like The Wolf Among Us or Pillars of Eternity? As long as it's not 10 fps those games are perfectly playable sub 60 fps. However, FPS and fighting games should be mandatory 60 fps.
 
50hz being ignored is kinda interesting... but doesn't that have the problem where a missed vsync'd frame drops to 25hz now? That would be pretty darn jarring.
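To put numbers on that: with vsync on, a frame can only be shown on a refresh boundary, so missing one deadline holds the previous frame for a whole extra interval. A toy sketch (my own made-up helper, not any real graphics API) showing how a just-missed 20 ms slot at 50hz lands you at 25fps:

```python
# Sketch of why a missed vsync'd frame halves the rate (hypothetical numbers).
import math

def effective_fps(refresh_hz, render_ms):
    """Frame rate you actually see with vsync on, given render time per frame.
    A frame waits for the next refresh boundary after rendering finishes."""
    interval_ms = 1000.0 / refresh_hz
    intervals_used = math.ceil(render_ms / interval_ms)
    return refresh_hz / intervals_used

print(effective_fps(60, 15))  # fits every 16.7 ms slot -> 60.0
print(effective_fps(60, 18))  # just misses -> 30.0
print(effective_fps(50, 21))  # just misses a 20 ms slot -> 25.0
```

So yes, the 50hz case degrades to 25fps under the same vsync rule that takes 60hz down to 30, which is exactly why it would feel jarring.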
 
Sure, it's been said, but they would just push more textures and polygons at 50 to give the same results.

What people are saying is that developers shouldn't push too hard, with pop-in and other things like variable resolutions, to hit 60 fps.

They should cut back on textures and polygons a lot more to gain fps, and it would work fine on the systems at 60, as it should.
 
After playing some of Wolfenstein and MGS5, yes, framerate is king. Neither game is particularly good looking, but they feel great in motion.

I should probably just get a gaming PC.
 
In my case, 50hz. I don't know if this is because I live in the EU, so the TVs support that, but mine does. And some games detect it and offer me 1080p/50hz. GTA5 and Project Cars offer this option.

I guarantee that GTA 5 on current gen consoles does not run anywhere near 50fps.
 
We'll see high frame rates for VR, I guess.

I run Witcher 3 at ... 25 fps (half refresh).

It definitely feels worse than 30 fps, but better than 30 with regular dips.
 
I know about this already. That's why I'm talking about TVs and monitors being able to support more than just 60hz and its divided rates.

Wouldn't it be great if TVs and consoles all had variable refresh rate technology like G-Sync or FreeSync? Next-gen consoles better have something like this.
 
If we are talking about what refresh rate would be ideal, then the human perceptual limen should be taken into account. From what I remember, testing has shown that the human refresh rate perceptual limen is somewhere around 333 hz or greater. So technology has a ways to go yet.

Also, I think perceptual testing has shown there is a sort of liminal knee at around 44 hz, where things are perceived as being far smoother than just a few Hz slower.

48 hz should be the bare minimum for action games IMO.

Part of the reason higher refresh rates have been slow to roll out is that they're tough to market: you have to actually experience one to feel the difference it makes, and then it's like "oh wow, that's really nice."
 
I wonder when 60fps will become standard with all the other graphics options maxed out.

Playstation 5?

they are wrong. more graphics is more assets, more work, aka more $$$. essentially AAAA games. its easier to optimize the current fidelity. and i expect VR to be standard. making fps a priority.

also screw capital letters.
 
60 fps will never be the standard.

I think it will be the norm with PS5 or PS6. You look at games like Star Wars, Uncharted 4, The Order, and Driveclub and you think just how much better can graphics get, and these games already have a huge budget just for graphics alone.
 
I think it will be the norm with PS5 or PS6. You look at games like Star Wars, Uncharted 4, The Order, and Driveclub and you think just how much better can graphics get, and these games already have a huge budget just for graphics alone.

That's what everyone said with PS4 and Xbox One, but even PS5 will have its limits, especially if they're going to target mostly 4K games.
 
G-Sync is Nvidia's proprietary solution that only works with Nvidia GPUs. FreeSync is the alternative developed by AMD that has the advantages of being (to copy & paste from Wiki) "Royalty-free, free to use, and has no performance penalty" (G-Sync apparently has a 2% performance penalty). There are some differences between the two *Syncs, but they're largely similar. I also read that Intel are going to be using FreeSync in their integrated GPUs.

I think the biggest thing to note is that they're both currently only compatible with DisplayPort, so unless a future HDMI standard pops up, people will have to rely on a select few TV manufacturers putting out DisplayPort TVs (assuming they don't already).
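For anyone wondering what adaptive sync actually buys you, here's a toy comparison (my own sketch with made-up helper names, not any real API): a fixed 60hz vsync'd display has to round every frame up to the next refresh boundary, while an adaptive-sync display just refreshes whenever the frame is ready, within the panel's supported range.

```python
# Toy model: how long one frame stays on screen, fixed refresh vs adaptive sync.
import math

def display_time_fixed(render_ms, refresh_hz=60):
    interval = 1000.0 / refresh_hz
    # Must wait for the next refresh boundary after rendering finishes.
    return math.ceil(render_ms / interval) * interval

def display_time_adaptive(render_ms, min_ms=1000 / 144, max_ms=1000 / 30):
    # Shown as soon as it's ready, clamped to the panel's supported range.
    return min(max(render_ms, min_ms), max_ms)

render = 22.0  # a frame that just misses a 16.7 ms slot (~45 fps pace)
print(round(display_time_fixed(render), 1))     # 33.3 ms -> judders at 30 fps
print(round(display_time_adaptive(render), 1))  # 22.0 ms -> smooth ~45 fps
```

That's the whole pitch: the same 45fps-ish game feels smooth instead of stuttering between 60 and 30.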

I see, thanks a lot for the informative post. I'm not very informed about variable framerates yet, so thanks.

It's not much of a discussion if all you're after is an echo chamber of people who agree with you. That shouldn't really be what this is about.

On topic, I think a locked 30 is just fine. 60 is cool if it's there, but it's not a huge selling point.
You stop noticing after a while, and nobody outside of enthusiasts is really bothered about it.

Pretty graphics and cool effects are much more interesting to most people.

So you haven't bothered reading the OP or even the title. For fuck's sake.

This thread's whole discussion is about finding alternatives to 30 and 60 FPS. If you're fine with 30 FPS, this is literally worthless to you; you already have it your way. Coming into the thread saying "30 FPS is fine!" adds zero to the discussion except derailment: you are not interested in alternatives at all. It's, frankly, nothing short of shitposting.

Also SMFH at the "you stop noticing after a while". Thanks for letting me know what I or anyone more sensitive notices or not at a given time. This is the exact kind of condescending, close-minded shit that we get in every thread.

Notice the derailment yet?
 
Would certainly be a good thing, although I really don't get the 30/60 FPS bitching. I barely see a difference anyway.

Not saying there is no difference, it's just that I do not notice anything if not confronted with shit like 30 FPS rear view mirrors (F1 2015) or exactly the same game running side by side in 30/60 FPS.
 
The more power you give devs, the more they will use it elsewhere instead of on framerate.

It's a never ending cycle.

I think devs should start prioritizing 60fps regardless of hardware.
I'm glad to see what 343 is doing with Halo, and EA is doing with Battlefield, for example. This is how it should be, 60fps over everything.

The problem is, with console games, literally the first thing people will start bitching about is "bad" graphics. While framerate realistically has a greater impact, far fewer people are invested in it, compared to "LOOK, I CAN SEE A BLURRY TEXTURE, THIS IS SHIT, LAZY DEVS". So it's not all that difficult to see why many developers go for that graphical wow factor, even when they can't maintain solid performance.

I mean, look at Halo 5 and Warzone. It's pretty crazy that that mode still manages to hold a perfect 60fps at basically all times with so much going on, on relatively dated specs. But the loss of IQ and texture resolution required has led to a pretty consistent stream of certain people complaining about the visuals, even calling it "unacceptable".
 
I mean, look at Halo 5 and Warzone. It's pretty crazy that that mode still manages to hold a perfect 60fps at basically all times with so much going on, on relatively dated specs. But the loss of IQ and texture resolution required has led to a pretty consistent stream of certain people complaining about the visuals, even calling it "unacceptable".
And yet, the graphics in Halo 5 are still "fine". The game looks "fine" and it's perfectly playable.

Replace "graphics" with frame rate and "looks" with runs and you have the main argument of people who are ok with the minimum standard of 30fps.

I mean, it's fine when the frame rate is just "fine", "ok", or "playable". And everyone hates it when someone says "60fps or bust" (which is understandable). But it seems that this doesn't apply to graphics. Graphics have to be "PERFECT or bust". Being "fine" is not good enough. Won't buy. Canceling preorder. Unsubbing.
 
Every time I read that opinion I literally cringe. My eyes just can't take it. Those games look truly abysmal in motion with an unlocked frame-rate. Locked 30fps is infinitely better in those instances.

No, it really isn't. The improved smoothness and control response of, say, a 45 fps average is well worth it, unless there's something weird going on in console games with unlocked framerates that I don't know about.
 
I think AMD did have some prototype demo of HDMI FreeSync, but it would probably take another version of HDMI to add Adaptive-Sync support, and then it would need to be adopted into all new TVs, and even after that it would take years before devs could consider building their game around a dynamic refresh rate. Other than gaming, TVs don't really need dynamic refresh for anything, so I don't see it being adopted that fast.
 
My computer can't handle 60FPS for games when on high settings so I usually opt to lock them at 30FPS (so it doesn't fluctuate), and when I play a 60FPS console like the Wii U I don't seem to notice that much.
 
You're overthinking it. 60fps should be something devs strive for, and it should be something console hardware is a little bit better prepared to hit. This used to be the standard 2 gens ago. Why bother complicating things further with solutions that aren't as good?

What? 60 fps is a priority issue, not a hardware issue.
 
I'll never understand not prioritizing frame rate. 60fps is just so glorious and smooth. Having higher fidelity is meaningless to me if it chokes along at some peasant fps.
 
I wonder if TVs will support Freesync when PS5 and the next Xbox will enter the market.

I hope so, I'd take freesync as standard over 4K any day of the week.

I just wish someone would come up with a version that worked with current standards, so it would work on any HDMI device.
 
The more power you give devs, the more they will use it elsewhere instead of on framerate.

It's a never ending cycle.

I think devs should start prioritizing 60fps regardless of hardware.
I'm glad to see what 343 is doing with Halo, and EA is doing with Battlefield, for example. This is how it should be, 60fps over everything.

That's one good thing about Nintendo, most of their exclusives run at 60fps and still look gorgeous.
 
Well, I'm personally one who finds 60FPS very important, but that's because I play fighters, character action games, arena FPS, and other genres where the better response time and smoother visuals are a necessity.

Anyway, where can I find out more about this G-Sync and Freesync stuff? It seems to be touted across this thread as a good solution, and I'll probably do some searching on my own, but if anybody has a good place to start, even a thread here, it would be quite helpful. Does it cause any sort of input latency on its own?
 
I wish people would stop bringing up past consoles and saying dumb things like "well, PS2 had lots of 60 fps games". The bar has been raised quite a bit graphically since then, and 60 fps is much more taxing on the hardware nowadays. If every game looked like a PS2 or SNES game with similar effects, then every game would be 60 fps, but let's all be honest: the average consumer isn't going to accept PS2 graphics in today's market.


Also, to the Halo 5 conversation above: yes, graphically the game is fine, and I am currently playing through the campaign. The problem with Halo 5 is that it's just ok, and lots of concessions were made to get the game running at 60fps.
 
I wish people would stop bringing up past consoles and saying dumb things like "well, PS2 had lots of 60 fps games". The bar has been raised quite a bit graphically since then, and 60 fps is much more taxing on the hardware nowadays.
The hardware has also raised the bar by itself. Today's consoles are easily 100 times more powerful than a PS2, enough to handle those graphical standards.

Like everyone says, it's not about hardware. It's about priorities. And 2 gens ago, frame rate was a bigger priority in some genres, like Racing.
 
No, it really isn't. The improved smoothness and control response of, say, a 45 fps average is well worth it, unless there's something weird going on in console games with unlocked framerates that I don't know about.
I will take unlocked over 30fps 100% of the time.
 
I prefer to think of 60 fps as the standard, and anything below that as not hitting the standard.

With the exception of early 3D games, it used to be that way.

Now we are even getting 30 fps racing games.
 
Yeah, it shouldn't be just 30 or 60.

Just like games shouldn't be just 720p or 1080p in the year of our lord 2015.


because they should be just 1080p and just 60fps
 
Yes. Playable. But still slow since it blurs everything out and has noticeable jumps between frames, compared to smoother rates. The point of this thread is to find a rate which is playable, inexpensive and smooth/sharp enough so you won't have to deal with the 30fps blurriness or the sacrifices of 60fps.

I mean, come on, 30fps is the minimum acceptable frame rate.

Nah, the tail end of last generation actually convinced a few people to lower their bar. 22fps is the minimum 'acceptable' framerate now and I'm pretty sure we'll be seeing some of that in the next few years on console
 