
Give me 1440 and 60fps or give me death

Armorian

Banned
60fps, how cute.

Let me tell ya, playing at 165fps is damn good. I'd rather cut out my eyes than play at 60fps.

Next gen should have a standard of 4k 60fps but we all know that won't happen.

You will be fucking blind after NG hits then. CPU demands will skyrocket on PC, and no CPU will be able to run some new games at 120FPS+ (just like you can't do that in AC games right now).

Just give me enjoyable games.

Imagine the same enjoyable game in two versions, 4K 30FPS and 1080p/1440p 60FPS: it plays better at the higher framerate. And with Ryzen CPUs, devs have no excuse not to include options like that unless they're doing some crazy physics etc. (which won't be common, I imagine).
 

DelireMan7

Member
I never noticed the framerate drop in Blighttown in Dark Souls (around 15-20 FPS according to Google), so I guess 30 FPS is fine for me xD

But I guess if you're used to a higher standard, it's difficult to go back. (The technical aspects of games were never a concern for me, so I really can't judge.)
 

carsar

Member
1440p is slightly better than 1080p, and it still looks ugly/blurry with TAA. 4K is far better and more detailed.
 

pawel86ck

Banned
For me it all depends on the game. A gamepad is not as accurate as a gaming mouse, and in the X360/PS3 era there were many games where I was happy with 30fps; however, on Xbox One and PS4 things have changed and 30fps no longer feels so responsive to me. For example GTA5: on Xbox 360 and PS3 I can aim easily even without the auto-aim feature, but on the Xbox One and PS4 versions I'm struggling with aiming (aiming feels heavy and unresponsive). The same with Rise of the Tomb Raider: on X360 I could aim without problems, while on Xbox One X I was struggling, so the 60fps option was welcome.

Games are clearly more laggy now at 30fps compared with the X360 era, and I'm guessing it has something to do with multithreading. There are games on PC that feel heavy when all CPU threads are active, but when I set the thread affinity to just one core, the game suddenly becomes responsive.

On PS5 and XSX I would want more than 30fps because I don't want laggy controls. 60fps would be great, but maybe even 45fps would be a good idea in the VRR era.
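A minimal sketch of the thread-affinity trick described above, using Python's psutil on PC; the process name is a placeholder, and whether this actually helps depends entirely on the game:

```python
# Minimal sketch: pin a running game process to a single CPU core with psutil.
# "game.exe" is a placeholder name; adjust for the actual game.
import psutil

def pin_to_single_core(process_name="game.exe", core=0):
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == process_name:
            proc.cpu_affinity([core])  # restrict the whole process to one core
            print(f"Pinned {process_name} (pid {proc.pid}) to core {core}")

pin_to_single_core()
```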
 

JeloSWE

Member
For me it all depends on the game. A gamepad is not as accurate as a gaming mouse, and in the X360/PS3 era there were many games where I was happy with 30fps; however, on Xbox One and PS4 things have changed and 30fps no longer feels so responsive to me. For example GTA5: on Xbox 360 and PS3 I can aim easily even without the auto-aim feature, but on the Xbox One and PS4 versions I'm struggling with aiming (aiming feels heavy and unresponsive). The same with Rise of the Tomb Raider: on X360 I could aim without problems, while on Xbox One X I was struggling, so the 60fps option was welcome.

Games are clearly more laggy now at 30fps compared with the X360 era, and I'm guessing it has something to do with multithreading. There are games on PC that feel heavy when all CPU threads are active, but when I set the thread affinity to just one core, the game suddenly becomes responsive.

On PS5 and XSX I would want more than 30fps because I don't want laggy controls. 60fps would be great, but maybe even 45fps would be a good idea in the VRR era.
The reason games can feel heavy even at 60fps is that the game can take multiple frames to output the final image. Tekken 7, which is pretty good nowadays, started out with 7 frames of input delay, and after community outcry it was reduced down to 4 frames.
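To put those frame counts in milliseconds, the conversion is just frames of delay times the frame time; a quick sketch (the numbers reuse the figures from the post above):

```python
# Input delay in milliseconds = frames of delay * (1000 / fps).
def input_delay_ms(frames_of_delay, fps):
    return frames_of_delay * 1000.0 / fps

print(input_delay_ms(7, 60))  # ~116.7 ms (Tekken 7 at launch, per the post above)
print(input_delay_ms(4, 60))  # ~66.7 ms after the reduction
print(input_delay_ms(4, 30))  # the same 4 frames at 30fps would be ~133.3 ms
```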
 

diffusionx

Gold Member
Impossible for all the console exclusives on PlayStation, and those are often the ones I like the most. Just give all games a 60fps performance mode and I'll be happy. For everything else I always buy the best Nvidia card.

Well, that's the problem - with a console, the developers decide that, and they won't always do what you want.
 

JeloSWE

Member
Well, that's the problem - with a console, the developers decide that, and they won't always do what you want.
I know, and that is why I and many others are raising our voices, so that maybe more developers take notice. At my game studio (not that big, and I won't name it here) I was the one responsible for pushing for 60fps in our games; it makes a huge difference in how smooth the game feels.
 
Seeing how RDR 2 was native 4K/30 fps on the One X, I don't see how next gen can't be 4K and 60 FPS.

This is only true if all games from now on looked just as good as RDR 2 and no better.

But what if Rockstar wants to make RDR 3 with three times as many NPCs, more complex animations, more detailed grass and trees, better simulations of wildlife, longer draw distance, more dynamic lighting, more detailed models, and so on? It's all about tradeoffs. You can use CPU/GPU horsepower for much more than just higher resolution and more frames.

The Xbox One S is probably capable of rendering Quake 3 at 4K120. Does that mean there's no reason the Xbox One S couldn't render RDR2 at 4K120?
 

ACESHIGH

Banned
People, please stop posting this false equivalence that 4K 30 FPS equals 1080p 120 FPS... You need to take the CPU into account as well. Ubisoft is the western king of unoptimized games. Their lazy asses will just code the games well enough that they hit 30 FPS on the console CPUs.

A rock-solid 60 FPS open-world Ubisoft game is a pipe dream on consoles. Even old AC games have trouble hitting 60 FPS on PCs with overpowered processors... They'll just hammer one core like they always do in their Anvil/Dunia engines and call it a day.
 

truth411

Member
As an owner of a 4K TV I honestly could not give a fuck about native 4K resolution; what I do care about, though, is them frames. Give me 1440p and 60 fps on consoles, it's the smart move for gameplay, fluidity etc.

It's pathetic that near the end of the next-gen consoles' life cycle, which will be around 2027, we will still be playing and talking about games at 30 frames per second.
I think it's been reported that the Demon's Souls remake does this.
 

JeloSWE

Member
I prefer much more detailed visuals at native 4K with slight juddering at 30fps
The problem with that is that 4K resolution only makes a difference when the camera is still; you need to pair higher resolutions with higher frame rates (and preferably BFI) to benefit from them in motion. If you can get a game running at 120fps with BFI, then you can really appreciate the extra detail brought by the resolution. But at 30fps it's a waste in most games whenever the camera moves.
 
Give me 1080p 280/360Hz!

Once you go 280Hz, you never go back.

I'd much rather use my GPU horsepower to process higher framerates than a higher resolution.
 

carsar

Member
The problem with that is that 4K resolution only makes a difference when the camera is still; you need to pair higher resolutions with higher frame rates (and preferably BFI) to benefit from them in motion. If you can get a game running at 120fps with BFI, then you can really appreciate the extra detail brought by the resolution. But at 30fps it's a waste in most games whenever the camera moves.
Same with 60fps. You only see the real 4K resolution at up to about 60 pixels per second of motion at 60Hz, and 30 pixels per second at 30Hz/fps (roughly one pixel of movement per frame). Yes, BFI with 0.2-0.5ms MPRT is another story, because it can deliver 3840 pixels/sec = crystal-clear 4K at almost all eye-tracked speeds. But in the case of 30/60fps I still need to stop the camera if I really want to see all the details and beauty of modern games. And native pixel-to-pixel 4K looks much better in statics. 120Hz + BFI is the only case where I can enjoy and appreciate games in motion. The problem is that BFI ruins peak brightness and HDR. I really hope to play at 4K 120Hz + BFI + HDR in the future without significant problems, but for now I prefer 4K HDR 30fps instead of 4K 120Hz + BFI with low brightness and worse colors. 60fps is pathetic: just 2 times better in terms of motion clarity, and it is still low-res. It is like comparing 120p and 240p resolution; both are bad, so I just ignore this aspect. We need 30-60 times better motion resolution to really enjoy motion, just like we need really high static resolution to avoid most graphical problems and artifacts (blur, lack of detail, aliasing).
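A rough sketch of the persistence math behind this: on a sample-and-hold display, the perceived smear while eye-tracking is roughly the panning speed multiplied by how long each frame stays lit, which is why low-persistence BFI helps so much. The numbers below are illustrative, not measurements.

```python
# Approximate sample-and-hold motion blur:
# smear (px) ~= eye-tracked speed (px/s) * persistence (s)
def motion_blur_px(speed_px_per_s, persistence_ms):
    return speed_px_per_s * persistence_ms / 1000.0

speed = 1000  # a moderate camera pan, in pixels per second
print(motion_blur_px(speed, 1000 / 30))  # 30fps sample-and-hold: ~33 px of smear
print(motion_blur_px(speed, 1000 / 60))  # 60fps sample-and-hold: ~17 px
print(motion_blur_px(speed, 0.5))        # strobing/BFI at 0.5 ms MPRT: ~0.5 px
```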
 

JeloSWE

Member
Same with 60fps. You only see the real 4K resolution at up to about 60 pixels per second of motion at 60Hz, and 30 pixels per second at 30Hz/fps (roughly one pixel of movement per frame). Yes, BFI with 0.2-0.5ms MPRT is another story, because it can deliver 3840 pixels/sec = crystal-clear 4K at almost all eye-tracked speeds. But in the case of 30/60fps I still need to stop the camera if I really want to see all the details and beauty of modern games. And native pixel-to-pixel 4K looks much better in statics. 120Hz + BFI is the only case where I can enjoy and appreciate games in motion. The problem is that BFI ruins peak brightness and HDR. I really hope to play at 4K 120Hz + BFI + HDR in the future without significant problems, but for now I prefer 4K HDR 30fps. 60fps is just 2 times better in terms of motion clarity and it is still low-res. It is like comparing 120p and 240p resolution; both are bad, so I just ignore this aspect. We need 30-60 times better motion resolution to really enjoy motion.
For me, 120fps BFI is where things start to get great. I can play 1080p/1440p 120Hz BFI HDR with very little drop in nits on my Sony ZF9 because it has a very good BFI system. It just looks spectacular, and at 120Hz there is almost no flicker, leagues better than the tiresome 60Hz BFI flicker that I just can't stand. Still, 60fps both feels and looks 2 times better than the completely unacceptable 30fps. FFS, I almost got a headache and my eyes tired much more quickly when I played through TLOU2, due to the 30fps sample-and-hold blur and having to constantly pan the camera around for threats and resources. Even without BFI, 120Hz is quite good in itself.
 

UltimaKilo

Gold Member
How unfortunate that DLSS-type technologies aren't more mature by now. I really hope any mid-generation console refreshes are able to hit 4K/120fps.
 

Schmick

Member
1440p is definitely the sweet spot. I'm wishing the PS5 supports 1440p purely so I can share my 1440p monitor with my PC whilst my XSS sits in my living room connected to my 1080p TV. But unfortunately it's unlikely, completely ruining my plans.
 

DeaDPo0L84

Member
Seeing how RDR 2 was native 4K/30 fps on the One X, I don't see how next gen can't be 4K and 60 FPS.

RDR2 was 4K/30fps on the X1X, but the important part is what quality of assets they were using. On PC you have sliders to customize almost every aspect of the game. On console they have to lower the quality of the overall assets to maintain a solid 30fps experience. This means worse textures, lighting, grass detail, anti-aliasing, FOV, water detail, population density, etc.

If they put the exact same game on XSX without upping the quality of the assets, then sure, it'll probably hit 60fps. BUT they know gamers get excited when they see something flashy on boot-up, so they'll simply bump up all the assets, praise the upgrade in visuals, and you'll be stuck at 30fps.
 

Razvedka

Banned
I will go to my grave not understanding GAF's fetish with 60fps console gaming and the persistent inability to understand why developers don't prioritize it. Thankfully there always seems to be a clean-up crew of a handful of posters who 'lay it out' for people, but invariably a new thread is created about it or it's talked about in other ones.

If this is something you seriously care about as a gamer, you're likely to be perpetually disappointed unless you have a PC.

This isn't a 'PC Master Race' post, it's more to do with market forces and resulting priorities from a technical perspective. How a game looks, how it makes you feel looking at it, holds far more 'marketing sway' than the label '60fps'.

Personally, I wish these consoles had some alternative to DLSS 2.0 and the gamer could choose 'native' vs 'fake' 4k and opt for even prettier graphics or higher frames. But I accept that's not reality, and it would also entail more developer work (which means expense).
 

DeaDPo0L84

Member
This is only true if all games from now on looked just as good as RDR 2 and no better.

But what if Rockstar wants to make RDR 3 with three times as many NPCs, more complex animations, more detailed grass and trees, better simulations of wildlife, longer draw distance, more dynamic lighting, more detailed models, and so on? It's all about tradeoffs. You can use CPU/GPU horsepower for much more than just higher resolution and more frames.

The Xbox One S is probably capable of rendering Quake 3 at 4K120. Does that mean there's no reason the Xbox One S couldn't render RDR2 at 4K120?

This is where so many people get confused. Seeing that a game is 4K/60 does NOT mean it's using the best assets. I'd rather take 1440p/120fps with nearly all settings maxed than 4K/30/60 with low (maybe high on a couple) settings.
 
Amen. The "industry wisdom" surrounding framerate never ceases to baffle me:

  • "Higher resolution matters more to gamers than framerate" is a proven falsity.

  • "The most popular games run at 30fps, just look at GTA!" -- Also false. Rockstar is the exception, not the rule.

  • "Amazing graphics sell more than high framerate" is an old wive's tale that has never been proven. The best selling video game of all time is a graphically simple game that runs at 60fps on PC, all consoles, and even on iPhone. Graphically-rich games and high-framerate games both sell well: there's no evidence of consumers choosing one over the other.


  • "Consoles have always had 30fps as a standard" is utter nonsense. Console games have had a 60fps standard since at least the NES (1983). The only period when 60fps games didn't dominate the console market was during the 5th gen, with the rise of 3D graphics (1994, PS1). The highest-selling console game of each year since the following gen (2001, PS2) has been a 60fps game 14/19 times.

 

Armorian

Banned
Just 60fps minimum, whatever the resolution. Even 720p, I don't care.

30fps should not be acceptable anymore, regardless of the game genre. Even turn-based RPGs.

NO, nothing below 1080p. This resolution looks great on native 1080p screens (obviously) and even good on 4K screens thanks to perfect 2x/2x integer scaling. Last gen was so shit thanks to the idiotic resolutions they used when the majority of TVs were 1080p.
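What "perfect 2x/2x scaling" means in practice: each 1080p pixel maps to an exact 2x2 block of 4K pixels, so no interpolation blur is introduced. A minimal numpy illustration (the blank frame is just a placeholder):

```python
# Nearest-neighbour integer upscale of a 1080p frame to 4K (2x in each axis).
import numpy as np

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)  # placeholder image
frame_4k = frame_1080p.repeat(2, axis=0).repeat(2, axis=1)
print(frame_4k.shape)  # (2160, 3840, 3): every source pixel became a clean 2x2 block
```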
 
If the next generation is really going to be as expensive as rumored and we have to deal with $70 games, then I think we should get 4K/60fps. It would be a shame if we have to deal with lower resolutions and bad frame rates yet again, literally 7+ years later, after having to deal with it for so long this generation. I understand if all games can't hit 4K or 60fps, but then we'd better get some gorgeous games and noteworthy visual leaps.
 

Naru

Member
As an owner of a 4K TV I honestly could not give a fuck about native 4K resolution; what I do care about, though, is them frames. Give me 1440p and 60 fps on consoles, it's the smart move for gameplay, fluidity etc.

It's pathetic that near the end of the next-gen consoles' life cycle, which will be around 2027, we will still be playing and talking about games at 30 frames per second.
I agree. As someone with an HDR 4K TV I would always prefer 1440p60 to anything higher that only runs at 30fps. Framerate >> resolution and HDR. But it would probably be more work for devs to optimize a game to run at 60.
 

nkarafo

Member
NO, nothing below 1080p. This resolution looks great on native 1080p screens (obviously) and even good on 4K screens thanks to perfect 2x/2x integer scaling. Last gen was so shit thanks to the idiotic resolutions they used when the majority of TVs were 1080p.
Fair enough. 720p did appear rather washed out on my previous 1080p TV.
 

Neo_game

Member
1440p is slightly better than 1080p, and it still looks ugly/blurry with TAA. 4K is far better and more detailed.

This gen many games on the Xbox One were 720p, same as the 360. But obviously the upgrade from the 360 was pretty big. 1440p, I think, is the sweet spot. The console specs are pretty good, but at 4K the upgrade is going to be rather disappointing.
 

Fbh

Member
I don't care too much and still don't think 30fps is the end of the world.
I would like to see more games offer options, though. I know lowering the resolution isn't some magical solution that will allow every 30fps console game to suddenly run at 60fps, but in the cases where it's possible I'd definitely like to see more games offer options like Nioh: a fully resolution/visuals-focused mode at 30fps, a fully performance-focused mode at 60fps, and a mode that tries to balance them both.
 

carsar

Member
This gen many games on the Xbox One were 720p, same as the 360. But obviously the upgrade from the 360 was pretty big. 1440p, I think, is the sweet spot. The console specs are pretty good, but at 4K the upgrade is going to be rather disappointing.
My first game rendered at 4K was The Witcher 2 in 2012. Sad to see someone still playing at 720p or 1080p in 2013, and even in 2020.
 
With a controller I'm not so sure that extra 60fps would be felt; with a mouse it obviously is...
Even with a controller. It doesn't just feel better to control, it's visually smoother too, and once you get used to it and go back down, 60 doesn't look or feel nearly as smooth to you as it once did.
 

Neo_game

Member
Red Dead Redemption 2 looks amazing at native 4K on the Xbox One X, wth am I reading in this thread?

Do not forget that RDR2 runs at 864p on the Xbox One and 1080p on the PS4 this gen, and those were the base consoles. I know there will be the Xbox Series S, which is not good. But shouldn't the PS5 and Series X be considered the base consoles for this gen? You will find less of an upgrade for RDR3 if they target 4K for the PS5 and the X.

My first game rendered at 4K was The Witcher 2 in 2012. Sad to see someone still playing at 720p or 1080p in 2013, and even in 2020.

How many games do you play at native 4K? Do you know the percentage of gamers who game at 4K? I think it would not be more than 1%, including PC.
 

Mister Wolf

Gold Member
1440p minimum, 60fps minimum. If a game isn't reaching both of those minimums then I'm not trying to hear that it looks visually impressive.
 