Oh simple farmer, 1000fps is where it's truly at. 120fps is for peasants. 240fps should be the minimum acceptable.
60fps? How cute.
Let me tell ya, playing at 165fps is damn good. I'd rather cut out my eyes than play at 60fps.
Next gen should have a standard of 4K 60fps, but we all know that won't happen.
Just give me enjoyable games.
Quote: "60fps feels better than playing a game at 30; I'll take a 'little' blurriness if it means a better-functioning game."

Quote: "1440p is slightly better than 1080p, and it still looks ugly/blurry with TAA. 4K is far better and more detailed."
Reading this from 1.8m on my 75" TV
Quote: "If you want 60fps for every game, get a PC. That's the only way."

Impossible for all the console exclusives on PlayStation, and those are often the ones I like the most. Just give all games a 60fps performance mode and I'll be happy. For everything else I always buy the best Nvidia card.
Quote: "For me it all depends on the game. A gamepad is not as accurate as a gaming mouse, and in the X360/PS3 era there were many games where I was happy with 30fps; on Xbox One and PS4, however, things have changed, and 30fps is no longer so responsive to me. For example GTA5: on Xbox 360 and PS3 I can aim easily even without the auto-aim feature, but in the Xbox One and PS4 version I'm struggling with aiming (it feels heavy and unresponsive). The same with Rise of the Tomb Raider: on X360 I could aim without problems, and on Xbox One X I was struggling, so the 60fps option was welcome.

Games are clearly more laggy now at 30fps compared with the X360 era, and I'm guessing it has something to do with multithreading. There are games on PC that feel heavy when all CPU threads are active, but when I set thread affinity to just one core, the game suddenly becomes responsive.

On PS5 and XSX I would want more than 30fps because I don't want laggy controls. 60fps would be great, but maybe even 45fps would be a good idea in the VRR era."

The reason games can feel heavy even at 60fps is that the game can take multiple frames to output the final image. Tekken 7, which is pretty good nowadays, started out with 7 frames of input delay; after community outcry that was reduced to 4 frames.
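To put those frame counts in milliseconds: at 60fps each frame lasts 1000/60 ≈ 16.7ms, so the delay is just frames × frame time. A quick Python sketch (the function and the 30fps comparison are mine, only illustrating the arithmetic):

    # Convert frames of input delay to milliseconds at a given framerate.
    def frames_to_ms(frames: int, fps: float = 60.0) -> float:
        return frames * 1000.0 / fps

    print(frames_to_ms(7))            # Tekken 7 at launch: ~116.7 ms
    print(frames_to_ms(4))            # after the patches:  ~66.7 ms
    print(frames_to_ms(4, fps=30.0))  # same 4 frames at 30fps: ~133.3 ms

The last line is also why the same engine-side delay feels twice as heavy in a 30fps mode.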
Quote: "Well, that's the problem - with a console, the developers decide that, and they won't always do what you want."

I know, and that is why I and many others are raising our voices, so that maybe more developers take notice. At my game studio (not that big, and I won't name it here) I was responsible for pushing for 60fps in our games; it makes a huge difference in how smooth a game feels.
Seeing how RDR 2 was native 4K/30fps on One X, I don't see how next gen can't be 4K and 60fps.
Quote: "As an owner of a 4K TV I honestly could not give a fuck about native 4K resolution; what I do care about, though, is them frames. Give me 1440p and 60fps on consoles, it's the smart move for gameplay, fluidity, etc."

I think it's been reported that the Demon's Souls remake does this.
It's pathetic that near the end of the next-gen consoles' life cycle, which will be around 2027, we will still be playing and talking about games at 30 frames per second.
Quote: "60fps feels better than playing a game at 30; I'll take a 'little' blurriness if it means a better-functioning game."

I prefer the much more detailed visuals of native 4K with slight juddering at 30fps.
Quote: "I prefer the much more detailed visuals of native 4K with slight juddering at 30fps."

The problem with that is that 4K resolution only makes a difference when the camera is still; you need to pair higher resolutions with higher frame rates (and preferably BFI) to benefit from them in motion. If you can get a game running at 120fps with BFI, then you can really appreciate the extra detail brought by the resolution. But at 30fps it's wasted in most games whenever the camera moves.
Quote: "The problem with that is that 4K resolution only makes a difference when the camera is still; you need to pair higher resolutions with higher frame rates (and preferably BFI) to benefit from them in motion. […]"

Same with 60fps. You can only see the real 4K resolution at motion speeds up to 60 pixels per second at 60Hz, and 30 pixels per second at 30Hz. Yes, BFI with 0.2-0.5ms MPRT is another story, because it can deliver 3840 pixels/sec = crystal-clear 4K at almost all eye-tracked speeds. But at 30/60fps I still need to stop the camera if I really want to see all the detail and beauty of modern games. And native pixel-to-pixel 4K looks much better in statics. 120Hz + BFI is the only case where I can enjoy and appreciate games in motion. The problem is that BFI ruins peak brightness and HDR. I really hope to play at 4K 120Hz + BFI + HDR in the future without significant problems, but for now I prefer 4K HDR 30fps over 4K 120Hz + BFI with low brightness and worse colors. 60fps is pathetic, just two times better in terms of motion clarity, and it is still low-res. It is like comparing 120p and 240p: both are bad, so I just ignore this aspect. We need 30-60 times better motion resolution to really enjoy motion, just as we need really high static resolution to avoid most graphical problems and artifacts (blur, lack of detail, aliasing).
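The arithmetic behind those MPRT figures, as I understand it, is the usual persistence rule: perceived smear while eye-tracking ≈ tracking speed (pixels per second) × persistence (seconds). A rough Python sketch (the pan speed and persistence values are my own examples):

    # Eye-tracking motion blur estimate: smear ~= speed * persistence.
    def smear_px(speed_px_per_s: float, persistence_ms: float) -> float:
        return speed_px_per_s * persistence_ms / 1000.0

    # A pan crossing a 4K frame in one second (3840 px/s):
    print(smear_px(3840, 1000 / 30))   # 30fps sample-and-hold:  ~128 px of smear
    print(smear_px(3840, 1000 / 60))   # 60fps sample-and-hold:  ~64 px
    print(smear_px(3840, 1000 / 120))  # 120fps sample-and-hold: ~32 px
    print(smear_px(3840, 0.5))         # BFI at 0.5 ms MPRT:     ~1.9 px

The same rule gives the "60 pixels per second at 60Hz" figure: 60 px/s × 16.7ms ≈ 1 px of smear, i.e. full 4K detail only survives very slow camera movement without BFI.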
Quote: "Same with 60fps. You can only see the real 4K resolution at motion speeds up to 60 pixels per second at 60Hz, and 30 pixels per second at 30Hz. […]"

For me, 120fps BFI is where things start to get great. I can play 1080p/1440p 120Hz BFI HDR with very little drop in nits on my Sony ZF9, because it has a very good BFI system. It just looks spectacular, and at 120Hz there is almost no flicker; it's leagues better than the tiresome 60Hz BFI flicker that I just can't stand. Still, 60fps both feels and looks two times better than the completely unacceptable 30fps. FFS, I almost got a headache, and my eyes tired much more quickly, when I played through TLOU2, thanks to the 30fps sample-and-hold blur and having to constantly pan the camera around for threats and resources. Even without BFI, 120Hz is quite good in itself.
Quote: "Seeing how RDR 2 was native 4K/30fps on One X, I don't see how next gen can't be 4K and 60fps."
This is only true if all games from now on look just as good as RDR 2 and no better.
But what if Rockstar wants to make RDR 3 with three times as many NPCs, more complex animations, more detailed grass and trees, better simulations of wildlife, longer draw distance, more dynamic lighting, more detailed models, and so on? It's all about tradeoffs. You can use CPU/GPU horsepower for much more than just higher resolution and more frames.
The Xbox One S is probably capable of rendering Quake 3 at 4K120 - does that mean there's no reason the Xbox One S couldn't render RDR2 at 4K120?
Just 60fps minimum, whatever the resolution. Even 720p, I don't care.
30fps should not be acceptable anymore, regardless of the game genre. Even for turn-based RPGs.
Quote: "As an owner of a 4K TV I honestly could not give a fuck about native 4K resolution; what I do care about, though, is them frames. Give me 1440p and 60fps on consoles, it's the smart move for gameplay, fluidity, etc."

I agree. As someone with an HDR 4K TV, I would always prefer 1440p60 to anything higher-resolution but only 30fps. Framerate >> resolution and HDR. But it would probably be more work for devs to optimize a game to run at 60.
Quote: "NO, nothing below 1080p. This resolution looks great on 1080p-native screens (obviously) and even good on 4K screens thanks to perfect 2x/2x scaling. Last gen was so shit thanks to the idiotic resolutions they used when the majority of TVs were 1080p."

Fair enough. 720p did appear rather washed out on my previous 1080p TV.
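The "perfect 2x/2x scaling" point is just integer math: upscaling looks clean when every source pixel maps to a whole block of panel pixels. A quick check (my sketch):

    # Scale factors from common render resolutions to a 3840x2160 panel.
    def scale_factor(w: int, h: int, panel=(3840, 2160)):
        return panel[0] / w, panel[1] / h

    print(scale_factor(1920, 1080))  # (2.0, 2.0): clean integer scaling
    print(scale_factor(2560, 1440))  # (1.5, 1.5): fractional, needs filtering
    print(scale_factor(1280, 720))   # (3.0, 3.0): also integer, for what it's worth

Whether a given TV actually uses integer scaling when it can is another matter, but it explains why 1440p can look softer on a 4K panel than its pixel count suggests.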
1440p is slightly better than 1080p, and it still looks ugly/blurry with TAA. 4K is far better and more detailed.
Quote: "This gen many games on Xbox were 720p, same as the 360. But obviously the upgrade from the 360 was pretty big. 1440p I think is the sweet spot. Console specs are pretty good, but at 4K the upgrade is going to be rather disappointing."

My first game rendered at 4K was The Witcher 2 in 2012. Sad to see someone still playing at 720p or 1080p in 2013, and even in 2020.
Quote: "1440p is the sweet spot for next-gen consoles IMHO. 4K is just too big a jump, and the console power is wasted crunching those pixels at the cost of graphics settings and fps."

Red Dead Redemption 2 looks amazing at native 4K on the Xbox One X; wth am I reading in this thread?
Quote: "With a controller, not so sure if that extra 60fps would be felt; with a mouse it obviously is..."

Even with a controller. It doesn't just feel better to control, it's visually smoother too, and once you get used to it and go back down, 60 doesn't look or feel nearly as smooth to you as it once did.