
I am now a believer of higher framerate over resolution

Killer8

Member
The trade-off between resolution and framerate feels especially not worth it this generation. Many times it's 1440p60 versus 4K30, so it's not even like you are taking that much of a reduction in resolution. I saw a PS5 comparison of the FF7 remake between the quality mode and framerate mode. You basically got a small reduction in clarity in exchange for 30 extra frames. Factor in sitting meters from the TV and it's barely even noticeable.
 
Vperform reduces the resolution all the way down to 240p OP. Not worth it in my opinion.
What if they offered an ultra quality mode that ran at 10fps?

The thing is, going to 1440p or 1800p (in general) to get a 60fps mode is 100% worth it, even if other effects are reduced in quality at this point (the quality can still be much higher than last gen, especially as we see in the new Ratchet & Clank game).
 
😂😂😂😂😂😂😂
 

svbarnard

Banned
Sorry posted thread by accident before finishing my post
OP, have you seen this thread by chance?


Believe it or not, the old CRT TVs we had showed a very minimal amount of motion blur because of the way CRTs worked; when we upgraded to flat panel TVs we actually got worse motion blur. Modern LCD/OLED displays have a horrible amount of motion blur at 30fps or even 60fps. In fact, listen to me: to get the same motion clarity as a CRT TV, you would need to be playing at 1,000fps on an LCD/OLED with a 1,000Hz refresh rate. So the LCD/OLED would have to be capable of displaying a thousand frames per second. Now, I understand this is probably entirely new information for many of you, as it originally was for me too, but it's totally true!

Yes, I wish the entire TV industry would realize that we don't need to go any higher than 4K, but we definitely need higher frame rates. In fact, we need to go all the way up to 1,000 frames per second, which means we need TVs with 1,000Hz refresh rates. It's ridiculous that the TV industry is pushing toward 8K resolution yet keeping the refresh rate at 60Hz.

Here, this will show you what I'm talking about: you'll see that even at 120fps the motion blur is still pretty darn bad on flat panel TVs, but it actually starts to look okay at 240fps. When it says "sample-and-hold displays" it's referring to flat panel TVs. Also, see how smooth it looks at 1,000fps? That's how smooth the old CRT TVs looked, essentially zero motion blur.
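Those numbers can be sanity-checked with back-of-the-envelope arithmetic: on a full-persistence (sample-and-hold) display, an object your eye is tracking smears across roughly pan speed times frame time, in pixels. A rough sketch in Python (the function name and the 1,000 px/s pan speed are illustrative assumptions, not figures from any chart):

```python
# Rough estimate of perceived motion smear on a sample-and-hold display.
# Assumes the panel holds each frame for the full frame time (persistence
# = 1/fps) while the eye tracks the moving object across the screen.

def perceived_blur_px(pan_speed_px_per_sec: float, fps: float) -> float:
    """Smear width in pixels: pan speed times how long each frame is held."""
    frame_persistence_sec = 1.0 / fps
    return pan_speed_px_per_sec * frame_persistence_sec

# For a 1,000 px/s camera pan, the smear shrinks with frame rate:
for fps in (30, 60, 120, 240, 1000):
    print(f"{fps:>4} fps -> {perceived_blur_px(1000, fps):5.1f} px of smear")
```

At 30fps the smear is roughly 33 pixels wide, at 240fps about 4, and only at 1,000fps does it drop to a single pixel, which is why 1,000Hz is the figure people quote for CRT-like clarity.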


I want to add that there is a way to see what a display with zero motion blur looks like today, when it's extremely hard to get your hands on a CRT TV, and that's VR! VR headsets use OLED panels, and motion blur is extremely noticeable in VR; apparently, if the motion blur could not have been eliminated, it would have prevented the widespread adoption of VR. So there is a technique called motion blur reduction, where basically the backlight (or the panel itself) strobes with each frame: if the game is running at 90fps, the display strobes 90 times a second. That essentially eliminates all motion blur. If you read into the history of VR, the early prototypes had horrible motion blur which ruined the experience, so they had to find a way to eliminate it. Why did the early VR prototypes have bad motion blur? Because modern flat-panel LCDs/OLEDs have really bad motion blur!
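The strobing trick can be expressed with the same arithmetic: lighting the image for only a fraction of the frame time (the duty cycle) cuts the effective persistence, and with it the smear, by that same fraction. A rough sketch under stated assumptions (the 20% duty cycle and 1,000 px/s pan speed are illustrative, not specs from any real headset):

```python
# Effective motion smear on a strobed (low-persistence) display.
# Instead of holding the frame for the whole refresh, the panel lights it
# for only duty_cycle of the frame time, so persistence shrinks accordingly.

def strobed_blur_px(pan_speed_px_per_sec: float, fps: float,
                    duty_cycle: float = 1.0) -> float:
    """Smear width in pixels; duty_cycle=1.0 is ordinary sample-and-hold."""
    persistence_sec = duty_cycle / fps
    return pan_speed_px_per_sec * persistence_sec

# 90fps VR, 1,000 px/s pan: full persistence vs. a 20% strobe.
full = strobed_blur_px(1000, 90)          # ~11.1 px of smear
strobed = strobed_blur_px(1000, 90, 0.2)  # ~2.2 px of smear
```

A 90Hz strobe at a 20% duty cycle gives roughly the motion clarity of a 450fps sample-and-hold display, which is why strobing was enough to make VR usable without thousand-hertz panels.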
 

//DEVIL//

Member
I love how the OP is happy with 60 fps... when I am fighting for nothing less than 120 frames on my PC lol.

I honestly can't tell the difference between 4K and 2K, especially with all these new upscaling rendering technologies (mainly DLSS, but others too).

I wouldn't mind playing full HD with proper upscaling as long as I get everything at ultra graphics with 120 frames minimum.

As for the PS5/Xbox Series X, I will never buy a game that is 30 fps only. Devs can really screw off for all I care; not adding the option of at least 60 frames is just pure laziness, and they are the people that need my money, I do not need their games. Plenty of fish in the sea (there are other games I will enjoy).
 

THE DUCK

voted poster of the decade by bots
For me I can't do 1080p, it's just too.......unrefined now?
But definitely I can do 1440p, it looks great; I'd take that and sacrifice RT on most games to hit 60fps. All these games recently converted to 60 have changed my opinion, I think; Wreckfest, GoW, and a whole bunch more just feel a lot better.
 

njean777

Member
I am perfectly fine running 1080p/60fps over 4K/30fps. At least give the option in every game (PS5/XSX); it doesn't hurt anybody, and everybody can be happy with what they choose.
 

Shai-Tan

Banned
I've been the type of person who jacks up graphics and plays at 35fps, but Nvidia drivers in the last few years have had pretty horrible stuttering in many games. That's my problem, more so than absolute frame rate. DLSS at least lets me jack options up without sacrificing frame rate too much.
 

Neo_game

Member
Personally, 30fps is not a problem for me; I can adjust to it. I do not play online or competitive games. Having said that, the faster the game runs, the better you can play. I also do not care about high resolution. IMO graphics detail is more important.
 

G-Bus

Banned
Yeah, if Sony keeps this 60fps thing up I might get used to it. As of now I definitely notice a difference, but it doesn't bother me and is forgotten shortly after starting.

I'm sure as we get further into the generation, some (most?) developers will decide that 30 will be required to meet their ambitions.
 

Fredrik

Member
60+ fps and G-Sync is where it's at; there is no turning back after that. Doesn't matter if it goes from 70 to 110 fps, everything looks smooth as butter. It's why I play 99% of multiplats on PC now.

Stable 120 fps in the living room through FPS Boost on XSX was transformative though; give me more of that and I'll consider occasionally leaving the PC desk.

30fps is crap.
 

BadBurger

Is 'That Pure Potato'
I used to not really care much about fps performance either. But after playing games on my PC (75Hz monitor) and only 60 fps PS5 for several months, I decided to install Horizon Zero Dawn to get the feel of it again and revisit some battles - and boy was that 30 fps now extremely noticeable to me. I can't believe I ever thought it looked fine, it was weird and jerky. It felt like when you're awakened from a very deep, dream-filled slumber and for those first twenty or thirty seconds things seem to be moving slower than they should be.

My next PC upgrade will be a new monitor, one with HDR and 144Hz. I just haven't gotten around to researching them yet. It seems like every time I asked the internet everyone had conflicting opinions so I'm just going to have to set aside some time and figure it out myself.
 

rolandss

Member
Still having a 1080p TV and a PS5, the choice is kinda made for me, but I'm totally fine with 1080p; it looks great, and with 60fps it looks even better.
 

DeaDPo0L84

Member
I used to not really care much about fps performance either. But after playing games on my PC (75Hz monitor) and only 60 fps PS5 for several months, I decided to install Horizon Zero Dawn to get the feel of it again and revisit some battles - and boy was that 30 fps now extremely noticeable to me. I can't believe I ever thought it looked fine, it was weird and jerky. It felt like when you're awakened from a very deep, dream-filled slumber and for those first twenty or thirty seconds things seem to be moving slower than they should be.

My next PC upgrade will be a new monitor, one with HDR and 144Hz. I just haven't gotten around to researching them yet. It seems like every time I asked the internet everyone had conflicting opinions so I'm just going to have to set aside some time and figure it out myself.
Funny, cause just last night I had to hook up my PS4 since my CX doesn't natively have HBO Max (dafuq?). I decided to throw on Destiny 2, a game I played on this exact same console for close to 500 hours and never once thought there was a problem. Instantly, when I gained control of the character, it felt like I was playing on some archaic piece of gaming tech from my childhood; it was honestly nauseating.

I tried to do an Override mission, but the slow-ass fps, narrow FOV, and general unpleasantness caused me to bounce 5 minutes in. Keep in mind I migrated to a 144Hz 38" ultrawide paired with an RTX 3090, so my Destiny 2 experience got quite the face-lift. But damn, experiencing it the old way again was a revelation.
 
The last 30 fps game I played was The Medium. During the walking segments it wasn't even that bad; you're walking through a static picture after all. But once you start inspecting things and moving the camera around... it feels like complete ass. 30 fps needs to die.
 

Justin9mm

Member
It really depends. On console with my 4K TV, 1080p at 60fps is not acceptable for me, but 1440p at 60 is more than fine. It needs to be 1440p minimum for 60fps.

Obviously on PC with a monitor it's a different story.
 

hemo memo

Gold Member
Yeah, 60fps should be the minimum now. Going back to 30fps is suffering. Lower the resolution as much as you like, but 60fps is a must.
 

Boss Mog

Member
I always laugh when I hear people say they prefer the 30fps fidelity modes. That higher fidelity is only noticeable in static screenshots; when the game is in motion it becomes a choppy, blurry mess, especially when the camera pans horizontally. So you can't even appreciate the higher resolution unless you stop and keep the camera fixed in place. During gameplay, 30fps actually makes it seem like you have less fidelity. 60fps performance mode all the way.
 

Whitecrow

Banned
30 feels choppy after getting used to 60.
But then you also get used to 30 after a while.

And then, if you go from 30 to 60, it feels soooo smooth.

But that doesn't mean that 30 is unplayable, or that your brain will break if you play at 30.

It's annoying that at this day and age, people still complain about 30 JUST BECAUSE OF THE TRANSITION FROM 60.
When will people learn.
 
They all realise it at some point.
The realization scales with available hardware. Last gen everyone was fine with 30 fps because, aside from PC, there was no alternative. Now that there's fairly cheap access to 60 fps gaming, everyone loves it. Same thing will happen with ray tracing btw :messenger_winking:
 

RPSleon

Member
Beyond the added feel it gives the game, it's actually a better visual upgrade, and more noticeable in motion, than a higher resolution. 60fps makes the game look better. I also tried to go back to fidelity mode and could see a huge negative difference in smoothness.

I'm all for 60fps as standard. I play CoD in the 120fps mode exclusively too.
 

ZywyPL

Banned


Once you go 60FPS you'll never go back.


It's annoying that at this day and age, people still complain about 30 JUST BECAUSE OF THE TRANSITION FROM 60.
When will people learn.

Learn what? People play the games and feel something's wrong; you can't bend the physics, only try to deny them, and that won't have any actual effect. And no, motion blur doesn't fix anything either. It was talked to death during the PS360 gen how badly most games controlled, while at the same time CoD was universally praised for its controls, for how it plays. RAGE, while not a huge commercial success, was also praised for its controls, because Carmack put in all the effort so the game would run at 60Hz. Many people (like myself) left the console market and moved entirely to PC once they read that PS4/XB1 games yet again ran at garbage 30FPS. What people get used to is pretty visuals; the WOW effect fades away really, really quickly, within minutes literally, and then you're left with something that's not enjoyable to play at all, exhausting even.

What's annoying is that at this day and age people still defend 30FPS. But the reasons are rather obvious: people can't fight their console wars on the internet without pretty screenshots; they need that ammo. And the best part is that those who desperately defend 30FPS can't decide what they really want either: 4K is a waste, RT is a gimmick, but at the same time 1080p is last-gen and unacceptable, where no longer than a year ago many claimed that 1080p, just like 30FPS, was also enough and we didn't need more, and that it was better to put the resources into better visuals. But again, people play games, not watch them, let alone static screenshots on the internet. In motion, a.k.a. when actually playing, image and motion clarity are what matter most, at least for most people.

But luckily, I'm pretty sure Sony and MS are tracking the data on how many people use the performance modes versus the fidelity ones, and seeing people's overall feedback, it's safe to say that 60FPS is (finally!) here to stay, especially when all it takes is lowering the resolution a bit.
 

Marvel

could never
I'm the same now on PS5, always gonna choose 60fps. Feels and looks way better.

Putting performance RT mode on Ratchet immediately when it arrives.
 

nkarafo

Member
Good. That was the first step.

Next step is a high speed, VRR monitor and 120fps. 60fps is just acceptable. 120fps is king.


Enjoy your 1080p games in 2023 lmao
30 fps in 2023 is far more laughable, lmao


It's annoying that at this day and age, people still complain about 30 JUST BECAUSE OF THE TRANSITION FROM 60.
When will people learn.
I like how you use the "at this day and age" argument to defend 30fps. As if 30fps in this day and age should be an acceptable standard, when it used to be 60fps in arcade 3D games because 30fps was unacceptable.
 

Whitecrow

Banned
Once you go 60FPS you'll never go back.




Learn what? People play the games and feel something's wrong; you can't bend the physics, only try to deny them, and that won't have any actual effect. And no, motion blur doesn't fix anything either. It was talked to death during the PS360 gen how badly most games controlled, while at the same time CoD was universally praised for its controls, for how it plays. RAGE, while not a huge commercial success, was also praised for its controls, because Carmack put in all the effort so the game would run at 60Hz. Many people (like myself) left the console market and moved entirely to PC once they read that PS4/XB1 games yet again ran at garbage 30FPS. What people get used to is pretty visuals; the WOW effect fades away really, really quickly, within minutes literally, and then you're left with something that's not enjoyable to play at all, exhausting even.

What's annoying is that at this day and age people still defend 30FPS. But the reasons are rather obvious: people can't fight their console wars on the internet without pretty screenshots; they need that ammo. And the best part is that those who desperately defend 30FPS can't decide what they really want either: 4K is a waste, RT is a gimmick, but at the same time 1080p is last-gen and unacceptable, where no longer than a year ago many claimed that 1080p, just like 30FPS, was also enough and we didn't need more, and that it was better to put the resources into better visuals. But again, people play games, not watch them, let alone static screenshots on the internet. In motion, a.k.a. when actually playing, image and motion clarity are what matter most, at least for most people.

But luckily, I'm pretty sure Sony and MS are tracking the data on how many people use the performance modes versus the fidelity ones, and seeing people's overall feedback, it's safe to say that 60FPS is (finally!) here to stay, especially when all it takes is lowering the resolution a bit.

Good. That was the first step.

Next step is a high speed, VRR monitor and 120fps. 60fps is just acceptable. 120fps is king.



30 fps in 2023 is far more laughable, lmao



I like how you use the "at this day and age" argument to defend 30fps. As if 30fps in this day and age should be an acceptable standard, when it used to be 60fps in arcade 3D games because 30fps was unacceptable.
This "Only what I think is correct" mentality is childish and stupid, and I won't waste my energy fighting it.

Either you are smart enough to understand how preferences work and what others are trying to say to you, or you're not.
Profit.
 

nkarafo

Member
This "Only what I think is correct" mentality is childish and stupid, and I won't waste my energy fighting it.

Either you are smart enough to understand how preferences work and what others are trying to say to you, or you're not.
Profit.
It's fine if your preference is 30fps. But the argument "in this day and age" doesn't make sense. You are defending 30fps (which is your preference, and that's fine), but that is a lower standard than it used to be. As time passes and technology improves, we expect standards to get higher, or at least stay where they were, but not lower. The "in this day and age" argument implies you want higher standards.
 

ZywyPL

Banned
This "Only what I think is correct" mentality is childish and stupid and I wont waste my energy fighting this.

ohtheirony.jpg... Just go to some of the recent R&C threads and check out how many people already say they'll choose either of the 60FPS modes before even having the game. Same deal with Miles Morales: barely anyone bothered with RT reflections or 4K. Demon's Souls: I didn't see even a single person here claiming they prefer the 30FPS mode. Same deal with all the third-party games out there. The 60 vs 120FPS modes on consoles are debatable, but 30 vs 60 is a no-brainer; there's no discussion. I mean, why even bother with a PS5/XSX if you want to play the exact same games at the exact same 30FPS? Stick with a PS4/XB1 and call it a day.
 

KAL2006

Banned
The trade-off between resolution and framerate feels especially not worth it this generation. Many times it's 1440p60 versus 4K30, so it's not even like you are taking that much of a reduction in resolution. I saw a PS5 comparison of the FF7 remake between the quality mode and framerate mode. You basically got a small reduction in clarity in exchange for 30 extra frames. Factor in sitting meters from the TV and it's barely even noticeable.

The problem is when we get to next-gen-only games and developers push for 1440p/30FPS. If that happens, I'd still want a 1080p/60FPS option. My Sony TV has a decent upscaler, so even 1080p content looks good.
 

Rentahamster

Rodent Whores
I put Spider-Man on quality mode and immediately the game looked choppy, almost like I could see the frames. I then put the game on performance RT mode and it was like a night and day difference.
You could have been experiencing crosstalk, which makes a low framerate all the more apparent if your monitor is implementing some kind of blur reduction.
 