
I'm "fine" with 30fps. It's "OK". But everything else has to be as good as possible?

When you're like me, someone who notices the difference between playing at 30 and 60 frames per second but has never detected much of an advantage or felt a significant improvement when the frame rate is doubled, then 30fps with a high resolution like 1080p is easily the preferable option. Give me that boost to graphical fidelity and such; it helps build immersion.
 
Is it? Because it's the complete opposite for me. And if you don't have a huge monitor/TV, the difference in resolution is definitely small, while the 30/60fps difference is noticeable on any monitor of any size.
I'll argue 1080p is a significant difference, but with some visual styles it can be more subtle, and it sure as fuck doesn't jump out as much as going to 60 FPS. It's the motion equivalent of jumping from unstretched 480p to 1080p.

Well, I suppose 240p to 480p would work better, being an exact multiple. That's low enough that you WILL notice and appreciate it; certainly I imagine most of us noticed how much sharper DC/PS2/GC/Xbox games were over SS/PS1/N64.
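The resolution jumps in the posts above are easy to put in numbers. A rough sketch (the 4:3 and 16:9 frame sizes are assumed standard values) showing the pixel counts involved and why 240p-to-480p scales cleanly as an exact multiple:

```python
# Rough pixel-count comparison for the resolution jumps mentioned above.
# Widths/heights assume standard 4:3 (240p/480p) and 16:9 (720p/1080p) frames.
resolutions = {
    "240p (4:3)": (320, 240),
    "480p (4:3)": (640, 480),
    "720p (16:9)": (1280, 720),
    "1080p (16:9)": (1920, 1080),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")

# 240p -> 480p doubles each axis exactly (4x the pixels), which is why
# that jump scales without blurry non-integer filtering.
w1, h1 = resolutions["240p (4:3)"]
w2, h2 = resolutions["480p (4:3)"]
print("exact multiple:", w2 % w1 == 0 and h2 % h1 == 0)
```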
 
On framerates, what can you do on consoles? You have to live with what you get, but that was always the case, and it's only fanboy ammunition if you want to have a discussion about it. If you can spare the money, you buy a potent PC and avoid it.
I would agree with you if the 6th generation of consoles hadn't happened.
 
Is it? Because it's the complete opposite for me. And if you don't have a huge monitor/TV, the difference in resolution is definitely smaller, while the 30/60fps difference is always noticeable on any monitor of any size.
Perhaps because I'm playing right next to a somewhat large (27") monitor.

That said, lots of effects > both 1080p and 60fps.
 
I can see the difference between 720p & 1080p but I seriously don't give a fuck. High-end graphics and IQ in general don't excite me much anymore; good-looking graphics are nice to have, but they're not the reason I play videogames.

Things like stylized art styles & high frame rates are much more important to me than resolution/high-end graphics engines, and if we're talking mainly about next-gen titles, I'm much more excited to see what devs will do (if they bother) with enemy AI, physics/destructibility and of course new and interesting concepts this gen. So yeah, graphical fidelity is pretty low on my list of what I want to see from next gen.
 
I would agree with you if the 6th generation of consoles hadn't happened.
Honestly, I feel like 60 FPS on that generation is something of a case of rose-tinted glasses. A lot of games did hit 60, yes, but there were also loads and loads that were 30 FPS and at best jumped to 60 occasionally (quite a few JRPGs later in the PS2's life, even ones with basic visuals like the Persona games), and even some games that struggled to hold 30 FPS at all (Shadow of the Colossus most notably, along with the GTAs and other games).

Still, I do find it frustrating how 60 FPS isn't really considered the standard for racing games. The nature of car rendering means you can easily have fantastic looking games at BOTH a high resolution and FPS as Gran Turismo demonstrated many, many times... so why is it that we so often fail to see that, and even have it willingly thrown aside for visuals even now? Hopefully Driveclub actually impresses, and NFS can possibly be written off as a casualty of being cross gen while having weird online code, but it just seems absurd to me.

EDIT: Also, I think it's probably worth looking at WHAT games failed to be a good, stable 60 or even 30 FPS, and where they came from. I don't recall as many 60 FPS games on Xbox as on PS2 or GC, in fact what I recall from the Xbox roughly matches what I've typically gotten with last gen.
 
Preferences. Some people want more eye candy, some people want a smoother frame rate.

Is it? Because it's the complete opposite for me. And if you don't have a huge monitor/TV, the difference in resolution is definitely smaller, while the 30/60fps difference is always noticeable on any monitor of any size.

Well, this is an enthusiast forum, so it's very likely that the large majority has a big gaming screen.

I'm sitting 2 meters away from my 50" TV, and there's definitely a huge difference between native and upscaled resolution. Also, edge flickering/high-contrast edges are extremely visible at low resolutions, even on small screens.
 
I was very surprised at how quiet the reaction to Driveclub being 30fps was, even though it was mostly expected. Stuff like that is just ridiculous. I blame the screenshots and the ever-growing hunger for more detail.

We should get direct feed of games in motion in small clips instead of screenshots and see if things change.
 
I've never been a big frame rate snob. Recently I built myself a rather beastly rig. So far my main comparison has been FF XIV. I've played quite a bit on PS3, PS4, and PC. I gotta say that playing @ 59 fps on PC with max graphics is sublime. I never thought it would make that big of a difference to me, and I rather despise the "PC master race" mentality, but I am officially a convert. I could still play on PS4, but the experience is compromised, and I couldn't even think about playing on PS3 again.
 
Well, this is an enthusiast forum, so it's very likely that the large majority has a big gaming screen.
Right, an enthusiast forum. And as an enthusiast, I like my games running as smoothly as possible, not only because I see the graphics more clearly at smooth frame rates but also because the games are more responsive, and there is a difference in how the game feels and plays.
 
I was very surprised at how quiet the reaction to Driveclub being 30fps was, even though it was mostly expected. Stuff like that is just ridiculous. I blame the screenshots and the ever-growing hunger for more detail.

We should get direct feed of games in motion in small clips instead of screenshots and see if things change.
Yeah, I think if YouTube allowed 60 FPS videos we'd be seeing a huge, huge change. Hopefully they add that eventually; once publishers/developers start using it to gain traction, it'll likely become more and more desired. Even when they use PC footage instead of console footage, it's a difference that's harder to forget and will raise expectations on that front.
 
I don't give a toss about seeing 60fps in my console games. I don't care if your pulse pounding action game is 30fps on average. I don't care if the racing game isn't hitting 60 frames. My reflexes are not nearly good enough to make a statistically significant difference at either one even if I can "tell the difference" between them in terms of image smoothness.
 
I was very surprised at how quiet the reaction to Driveclub being 30fps was, even though it was mostly expected. Stuff like that is just ridiculous. I blame the screenshots and the ever-growing hunger for more detail.

We should get direct feed of games in motion in small clips instead of screenshots and see if things change.

Quiet? It spawned one of the most hyperbolic threads in recent memory.
 
I've always chosen image quality over framerate. I even played Crysis on PC at sub-30 FPS in order to get some eye candy. So I'm perfectly fine with 30 FPS in console games. Sure, 60 FPS is a more fluid experience, but I've never really been used to it, so I can't really demand it.
 
Honestly, I feel like 60 FPS on that generation is something of a case of rose tinted glasses. A lot of games did hit 60, yes, but there were also loads and loads that were 30 FPS that at best jumped to 60 occasionally (quite a few JRPGs later on in the PS2's life, even ones with basic visuals like the Persona games) and even some games that struggled to be 30 FPS at all (Shadow of the Colossus most notably, along with the GTAs and other games.)
The thing is that during the 6th generation, frame rate standards increased. There were more 60fps games compared to the 5th generation and more games running at a steady 30fps without drops. It was a huge improvement over the previous gen. But the 7th gen was worse than the 6th in this regard, which is unacceptable. I thought that, technically at least, games got better as technology improved. If the graphics were worse than the previous gen, everyone would find it unacceptable. But a much worse frame rate standard was OK?
 
I have found 720p to be way more noticeable than 30fps. I couldn't play RDR because I just found it blurry at 720p. Also, compare something like the PS3 UI when you're in a 720p game to when you're just on the menu and it's 1080p.
 
In a game like Dark Souls, where you need to be able to react to an enemy's attack animation within the first 1-2 frames, framerate becomes pretty important. Same for fighting games.
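The "first 1-2 frames" point above can be put in milliseconds. A quick sketch (the `frame_time_ms` helper is just illustrative) showing how much shorter the same frame-counted reaction window is at 60fps:

```python
# Time budget per frame: a "1-2 frame" reaction window, as described for
# Dark Souls above, is half as long in wall-clock time at 60fps as at 30fps.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60):
    ft = frame_time_ms(fps)
    print(f"{fps}fps: {ft:.1f} ms per frame, 2-frame window = {2 * ft:.1f} ms")
```

Note the flip side: at 60fps the animation is also sampled twice as often, so the attack cue appears on screen sooner as well.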
 
I can quite happily take 30fps/720p, 60fps/1080p, or a mix in between. Naturally, the higher the better, but if it requires a serious amount of work on my behalf to obtain that in a PC game then I'm not going to bother.

If push came to shove, though, and I really had to make a decision, I would go with 60fps over 1080p.
 
Fighting, racing, action, shooter and sports games should be 60 fps. The immersion is completely different. I replayed Tomb Raider on my PS4 and I felt like I was playing a different game.
In games like RPGs, MMOs, strategy etc. it does not matter if a game is 30 fps (although 60 fps is always better).
If a gamer cannot understand the difference, they should google "60 fps vs 30 fps" and see it for themselves. For me, it matters. I prefer 720p/60fps to 1080p/30fps.
 
Movies are not like games. Games need to be smooth and responsive because you interact with whatever happens on screen.

Games running at 30fps are responsive. They may be less smooth in their animation but they're responsive, and for plenty of people it's enough.

I think if you had a bunch of games on these new systems that visually looked no better than last gen games, but ran at 60fps, the average consumer would be turned off. People want better looking games.
 
In other words, why is it that frame rate doesn't matter but every other visual aspect has to be "next-gen" or as close to perfection as possible? Why aren't we as demanding about it? It's not that smooth 60fps games were never a thing on consoles. In the PS2/GC/Xbox gen there were so many 60fps games that I thought 60fps would be a standard in the next gen. But now, two generations later, we are still stuck at that low standard, and when it comes to frame rates our expectations are as low as during the PS1/N64 days. What the hell happened?

The short answer: we stopped caring.

We live in the social media age where communication between players, developers and the media is better than ever. As the #NoDRM campaign clearly demonstrated, if we want something badly enough we will make it happen. So, the challenge is to convince those who don't care enough about frame rates that it's something that will make not just your game playing experience better, but theirs as well. Until the masses are convinced and more pressure is placed on developers and publishers to shoot for higher frame rates I doubt anything will change anytime soon. I wouldn't wait around for PS5 or Xbox Two to make it happen either, I'm sure 4K res will be the primary goal by then.
 
People have different opinions on how games should look and play. Personally I prefer 1080p over 60fps, but depending on the game I do notice a difference in framerate. It's all about different priorities.
 
So, from my memory and experience:

It's been a FUCKLOAD of times that I've been really pissed off by a poor framerate in a game.
It's been hmm... ZERO(?) times that I've been really pissed about a game's resolution?

And even though I can't be 100% sure right now that I was never pissed about resolution, I am 100% sure that resolution never stopped me enjoying a game I liked on paper.
Piss-poor framerates, dips and input lag, on the other hand... oh boy...

Therefore, in my mind resolution is more tied to "eye candy" stuff (which of course I like), while framerate and response are tied to "gameplay" stuff.
So, in conclusion, whoever puts eye candy before gameplay is, in my opinion, not a good judge of the things that matter to me as a player.
 
The thing is that during the 6th generation, frame rate standards increased. There were more 60fps games compared to the 5th generation and more games running at a steady 30fps without drops. It was a huge improvement over the previous gen. But the 7th gen was worse than the 6th in this regard, which is unacceptable.
I doubt that. Do you have some solid numbers for it?

Many PAL TVs didn't even have 60Hz support during the 6th gen.
 
At this point, I just deal with it on consoles. But still, I would take 720p/60FPS over 1080p/30FPS any day. Smoothness and less input lag are my priority (and everything else, that benefits from 60FPS).
 
Most of the time, console games that do target 60fps also get lambasted for their graphics (Rage, COD, Ground Zeroes) by the other half, so it's lose-lose when it comes to GAF.

Wipeout HD is the only game I can think of that pleased both graphics and framerate loyalists.
 
Every game should be 60fps@1080p. End of story.
 
30FPS games are very playable and enjoyable.
720p games are visually blurry, and it detracts from my experience a lot.

I value graphical progression over maintaining 60FPS when I can handle 30 just fine.

30fps also makes games visually blurry.

If the image is static, 30fps is fine... but if a texture is moving, it remains crystal clear at 60fps and blurs badly at 30fps.
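One way to see the motion-blur point above in numbers: at a fixed scroll speed, each frame the image jumps twice as far at 30fps as at 60fps. A rough sketch (the 960 px/s pan speed is an assumed example value, and larger per-frame jumps are what read as blur/judder on sample-and-hold displays):

```python
# How far a moving texture jumps between consecutive frames at a given
# scroll speed. Bigger per-frame jumps look blurrier/more juddery.
def step_pixels(speed_px_per_sec: float, fps: float) -> float:
    return speed_px_per_sec / fps

speed = 960.0  # assumed pan speed: half a 1080p screen width per second
for fps in (30, 60):
    print(f"{fps}fps: {step_pixels(speed, fps):.0f} px jump per frame")
```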
 
M°°nblade;111240331 said:
I doubt that. Do you have some solid numbers for it?

Many PAL TVs didn't even have 60Hz support during the 6th gen.
Just compare the racing game genre between the 6th and 7th generations. That alone is enough.

PAL TVs without 60Hz support ran games at 50fps/50Hz, which is still a great deal smoother than 30fps. But they couldn't display 30fps either; you were playing those games at 25fps.
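The 50Hz/25fps numbers above follow from frame rates having to divide the display's refresh rate evenly. A small sketch (the `achievable_rates` helper is just illustrative):

```python
# On a fixed-refresh display, smooth frame rates are refresh/N for integer N.
# So a 50Hz PAL set gives 50 or 25fps where a 60Hz NTSC set gives 60 or 30fps.
def achievable_rates(refresh_hz: int, max_divisor: int = 4) -> list[float]:
    return [refresh_hz / n for n in range(1, max_divisor + 1)]

print("PAL 50Hz :", achievable_rates(50))
print("NTSC 60Hz:", achievable_rates(60))
```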
 
I'm mostly fine with 30 FPS games on consoles but only if it is a locked framerate. There are exceptions to that though; all sports, first-person shooters and racing games should be 60 FPS.

I was extremely impressed with how smoothly Assassin's Creed IV ran on the PS4 as it was a perfect 30 FPS with no apparent framerate drops or screen tearing. It might have its roots in the last generation but it also looked great at 1080p. I actually ended up enjoying it more than I did on my PC (i7-4770K, 16 GB, GTX 780, Windows 8.1 Pro) where it ran at a more erratic 25-60 FPS (and before anyone asks, capping the game to 30 FPS oddly resulted in a game that felt less smooth than it did on the PS4...don't ask me why that is though!). Metal Gear Solid V: Ground Zeroes on the PS4 is similarly impressive with its locked 1080p60 framerate.

What I don't like too much are games with unlocked framerates such as Bound by Flame which I'm playing on the PS4. The way the framerate wavers between 30 FPS during hectic combat to 60 FPS during emptier sections of the game affects both the feel of the controls and the smoothness of the camera to distracting effect. I had the same issue with both Killzone: Shadow Fall and inFamous: Second Son but thankfully patches added 30 FPS caps for those (but sadly not Tomb Raider). It is impressive to see the PS4 pushing out 30+ FPS at 1080p but developers really need to target either 30 or 60 FPS and optimise the games to stay at those framerates rather than having unlocked ones.

I will add that I'm disappointed that Driveclub is targeting 30 FPS, though. In my view, all exclusive racing games should run at 60 FPS on the current-gen systems to maintain the smoothness of motion and responsive controls that simulations require. OK, so Driveclub isn't exactly a simulation, but it is the PS4's first exclusive racer, and that framerate is going to draw unfavourable comparisons with the Xbox One's Forza Motorsport 5 launch title. I will still buy the game, but I wish the developers had prioritised the framerate over prettier graphics. From what I've seen of the game, it doesn't look drastically superior visually to the Xbox One's Forza except for the addition of dynamic lighting, but we already know the PS4 has the additional power to handle that with ease and still run Driveclub at 1080p60.
 
It depends on how fast or precise the gameplay is. For something like a traditional RPG, 30 fps is fine, but when you start adding precise timing and camera control, like with Dark Souls, it starts to become less fine.
 
Because 99% of people who play games would find GAF's constant panty soiling about this sort of thing embarrassing.

People don't care about either frame rate or resolution as long as it runs, and looks good. If it's 'invisible' or 'not a problem', then the numbers are irrelevant.

People care if the game is fun. They could not give a flying if it runs at this or that frame rate as long as it's not problematic.
And it's better like that.
 
I mainly game on a pretty high spec PC but I jump back to consoles every now and then. I played GTA V and adapted quickly to the variable and low frame rate. I am not going to not buy a game because of frame rate.

But with a PC I will dial back settings to achieve 60fps. Given the choice I will always go for the higher frame rate.

As an example I had Castlevania Lords of Shadow on the 360 and just couldn't get on with it. It felt sluggish and hurt my eyes on a 42" screen. I bought the PC version and it was like night and day difference. The game felt so much better with the higher frame rate (plus the resolution helped too).

Because 99% of people who play games would find GAF's constant panty soiling about this sort of thing embarrassing.

People don't care about either frame rate or resolution as long as it runs, and looks good. If it's 'invisible' or 'not a problem', then the numbers are irrelevant.

People care if the game is fun. They could not give a flying if it runs at this or that frame rate as long as it's not problematic.
And it's better like that.

GAF is a lot more enthusiast-orientated though; we talk about things like this in a lot more detail. I admit some responses to frame rate can seem a bit over-the-top.

I am not a car anorak but if I went on a car forum they would talk about things I don't give a flying-what's-it-called about.
 
I legitimately can't see the difference between 720p and 1080p. Framerate is a much bigger deal to me.

Try switching it in a game right now. The difference is extremely noticeable. As is the difference between a solid 30fps and a solid 60fps.

There's almost no way you won't notice the difference if you really pay attention.

-

60fps is ideal of course, simply because it allows for greater responsiveness and precision. It's not always absolutely required, but it always makes for a smoother, more reliable experience.
 
Just compare the racing games between the 6th and 7th generations. That alone is enough.
Racing games like Need for Speed have never been 60fps. The first Forza on the original Xbox was 30fps as well. PGR? Also 30fps. WRC? 30fps.

I only remember the Gran Turismo series and a couple of futuristic racers like Wipeout and F-Zero being 60fps, just like they were last gen.


PAL TVs that didn't support 60Hz ran games at 50fps/50Hz. They couldn't play games at 30fps either; you were playing at 25fps.
Which means all those people only saw a 'stable 25fps' framerate during the 6th generation.
 
I mainly game on a pretty high spec PC but I jump back to consoles every now and then. I played GTA V and adapted quickly to the variable and low frame rate. I am not going to not buy a game because of frame rate.
That's what I did, but only because there wasn't a better way to run GTA V anyway. I had to deal with it.
 
It really depends on the game. Action, fighting and racing games are WAY better at 60 FPS because you'll get less lag and better response from the game, and these are genres where a split-second decision can be a life/death decision.

Other genres can be very playable in 30FPS but of course 60 is always the better option.
 
M°°nblade;111241390 said:
Racing games like Need for Speed have never been 60fps. The first Forza on the original Xbox was 30fps as well.

I only remember the Gran Turismo series and a couple of futuristic racers like Wipeout and F-Zero being 60fps.
What are you talking about?

Most racing games ran at 60fps during the 6th gen; do some research. The Colin McRae games were all 60fps. Now they run at 30. The WRC games were all 60fps. Now they run at 30. Most exclusive racing games ran at 60fps, like the RalliSport series on Xbox. I'm not going to make a huge list, but if you want you can research it yourself. Here's a hint: on Xbox 360, if you exclude retro XBLA games like Daytona USA, there are only 3 or 4 racing games that run at 60fps! In the whole library!


Which means all those people only saw a 'stable 25fps' framerate during the 6th generation.
We also had many more 50fps games back then, compared to 60fps games now.
 
What are you talking about?

Most racing games ran at 60fps during the 6th gen; do some research. The Colin McRae games were all 60fps. Now they run at 30. The WRC games were all 60fps. Now they run at 30. Most exclusive racing games ran at 60fps, like the RalliSport series on Xbox. I'm not going to make a huge list, but if you want you can research it yourself. Here's a hint: on Xbox 360, if you exclude retro XBLA games like Daytona USA, there are only 3 or 4 racing games that run at 60fps! In the whole library!
You're ignoring that they were interlaced too, though. Make a list of progressive-scan 6th-gen 60fps racing games and compare that.
 
You're ignoring that they were interlaced too, though. Make a list of progressive-scan 6th-gen 60fps racing games and compare that.
Are you implying that a 60fps interlaced game isn't considerably smoother/more responsive than a 30fps progressive-scan game?
 
What are you talking about?

Most racing games ran at 60fps during the 6th gen; do some research.
I'm pretty, pretty, pretty sure NFS, Forza and PGR did not run at 60fps.
Is there any source material we can rely on?

We also had many more 50 fps games back then, compared to 60fps games now.
Also, most early PS2 titles aimed for 60fps thanks to Sony: the early SDK only exposed interlaced scan modes, which required 60Hz to get 640x480.
 
In other words, why is it that frame rate doesn't matter but every other visual aspect has to be "next-gen" or as close to perfection as possible?
Framerate *does* matter, even for the general audience.
It's just that most people aren't informed enough to have prior knowledge of the topic.
It also doesn't help that a better framerate doesn't show in screenshots, nor in YouTube videos.
But if you let Mr Clueless play the same game at 30 and then 60 FPS, he WILL notice the difference.
 