
I actually prefer Spider-Man (PS5) at 30fps.

No matter what information is provided to people like you, or how much we try to convince you with data that your post is the worst thing posted on this board today, you still believe you made a great post.
And no matter how many times people like you try to convince others, you still don't realize that some people aren't as sensitive to frame rate as others and don't notice as much of a difference, yet you still believe you made a great gotcha post.
 

Arun1910

Member
I think part of this is conditioning.

I played Spider-Man at 30fps on PS4 and thought it was fine, but I got used to it.

When I played it at 60fps on PS5 it felt so damn weird (which is weirder considering I mostly play 100fps+ on PC), almost unnatural as I was conditioned to the 30fps mode (I played through Spider-Man and its DLC twice in the past.)

I stuck with 60fps for a few hours, switched it back to 30fps, and then it was like I was watching a slideshow.

Conditioning is funny.
 

ResurrectedContrarian

Suffers with mild autism
Many people agree that movies look markedly worse at high frame rates, so I think that alone demonstrates that 60fps isn't universally better than 30fps, and that there is a certain feeling to 30fps which is more of a contribution to style than a technical limitation. For comparison, chromatic aberration is also "wrong" and "objectively" less accurate, yet it works for some games because it simulates the appearance of film. I see nothing which makes keeping 30fps somehow more irrational than using aberration; both are stylistic choices in certain situations.

You can always play at 60 and just blink really fast to simulate the cinematics when needed.

basically the equivalent of emulator scanlines, but for framerate instead of resolution
 
I think part of this is conditioning.

I played Spider-Man at 30fps on PS4 and thought it was fine, but I got used to it.

When I played it at 60fps on PS5 it felt so damn weird (which is weirder considering I mostly play 100fps+ on PC), almost unnatural as I was conditioned to the 30fps mode (I played through Spider-Man and its DLC twice in the past.)

I stuck with 60fps for a few hours, switched it back to 30fps, and then it was like I was watching a slideshow.

Conditioning is funny.
But couldn't you then condition back to 30fps?

To expect every game to be 60fps is kinda unrealistic especially as the hardware ages and people wanna see better and better visuals. Something has to give and it's usually the frame rate since it's harder to sell a frame rate to someone.
 

nkarafo

Member
Many people agree that movies look markedly worse at high frame rates, so I think that alone demonstrates that 60fps isn't universally better than 30fps, and that there is a certain feeling to 30fps which is more of a contribution to style than a technical limitation.
Like I already mentioned, the frame rate standard of movies was not an artistic choice. It was a cost-effective compromise. Everyone who prefers it only does so because they got used to it. If they really wanted the best regardless of cost, movies would have been 60fps from the start and you would be singing a different song.

Remember: this was a cost-reduction compromise, not an artistic choice.
 
Like I already mentioned, the frame rate standard of movies was not an artistic choice. It was a cost-effective compromise. Everyone who prefers it only does so because they got used to it. If they really wanted the best regardless of cost, movies would have been 60fps from the start and you would be singing a different song.

Remember: this was a cost-reduction compromise, not an artistic choice.
When I watch movies, I want to be removed from reality, not feel like I'm looking out a window at some actors. And seriously, 24fps actually hides a lot of mistakes: stunt doubles are easy to spot and fight scenes look like two grown men play-fighting at high frame rates.
 

nkarafo

Member
When I watch movies, I want to be removed from reality, not feel like I'm looking out a window at some actors. And seriously, 24fps actually hides a lot of mistakes: stunt doubles are easy to spot and fight scenes look like two grown men play-fighting at high frame rates.
You can try to cherry-pick anything you like to justify your defence of this low-fps standard. It doesn't change the fact that it was chosen as a balance between convincing-enough motion and cost reduction, plus some technical issues that had to do with 1920s tech. It was not chosen because it had something better to offer or because it added anything artistically. It was only to save money.

What you describe is just what your brain has gotten used to. You would never think that way if the standard had been higher from the start. And in that case, if anyone showed you a 24fps movie you would probably say it looks like shit.
 

Arun1910

Member
But couldn't you then condition back to 30fps?

To expect every game to be 60fps is kinda unrealistic especially as the hardware ages and people wanna see better and better visuals. Something has to give and it's usually the frame rate since it's harder to sell a frame rate to someone.
Yeah, of course. But at that point, why would I want to?
 

ResurrectedContrarian

Suffers with mild autism
Like I already mentioned, the frame rate standard of movies was not an artistic choice. It was a cost-effective compromise. Everyone who prefers it only does so because they got used to it. If they really wanted the best regardless of cost, movies would have been 60fps from the start and you would be singing a different song.

Remember: this was a cost-reduction compromise, not an artistic choice.

This is still much more complex to disentangle than you suggest. You could similarly point out that the historical reasons for the widescreen format were primarily to differentiate the theater from the home and give people a reason to show up instead of watching TV, so studios chose the simplest method of showing off, which was to widen the view in the theater (this was a later development in response to TV; movies in theaters had the squarer ratio for a long time until this marketing change). That rectangular format eventually became standard for us all, since it was associated with high-production movies rather than the squarer format that says "made-for-TV" to our eyes. In other words, the initial origins of something do not tell us whether it was a positive development or better than its predecessors or successors.

Even with a new generation of younger, always-online consumers who don't have a long history of watching traditional films, notice that all the new television shows produced for Netflix etc. still keep the standard framerate, outside of the occasional documentary-focused program. When directors and cinematographers experiment with framerates, there is almost universally a recognition that something in the 24-30fps range feels more artistic and "soft" than the harsher documentary feel of 60fps. More isn't always better. I think this will continue to be debated and experimented upon, but there is almost no real movement towards high framerates in serious filmmaking or television in 2021, despite cost-effective digital cameras.
 

Audiophile

Gold Member
I explained to you (but you didn't respond) how your "emotional response" is based on technicalities and compromises the movie industry made 100 years ago. It's not an artistic decision, it's just cheaper. Same with cartoons: despite first-generation cartoons making use of the same (already compromised) frame rate as movies, they decided to halve that as well to make them cheaper.

Your "emotional response" is nothing more than you being used to a cost-cutting standard.

Even videogames started life filling every refresh the TV could produce with a different frame. Any videogame that repeated the same frame every 2 refreshes was a compromise to show more complex graphics on weak hardware. Someone probably thought, "eh, if movies get away with it, games can do it too".

Most people have this idea that graphics are sacrificed for 60fps and that a smooth frame rate comes with compromises. But it's the other way around: frame rate is sacrificed for more visual effects, and more detailed graphics come with compromises. Making full use of the refresh rate your TV can provide should be the default.
Oh, I'm aware of why many of these standards and limitations exist (and apologies for missing your prior post). However, citing the reasons why those limitations are there in the first place doesn't negate the subjective responses we have to them. Nor does it negate their use for artistic purposes since then.

Almost all creative endeavors are constrained by some sort of limitation, usually as a matter of cost. But those limitations can become an important characteristic of the content itself and how it makes us feel.

I concede that higher framerates are technically superior, I just don't think they're fitting for all content. Higher framerates get you closer to reality, lower framerates the opposite. There are of course, as you allude to, ~100 years of history in movies that inform us as to what feels "cinematic", "grand" or "fantastical" as a matter of what we're used to, but I'd argue the intrinsic characteristics of those lower standards play their part too.

I'm all for the highest numbers, the best standards, greater flexibility and pushing tech as far as possible. But as a means of providing more possibilities and a wider canvas to creators to present their work faithfully. I'd argue the default should be whatever aligns most with creative intent, whether as a result of what they deem an acceptable compromise given a set of limitations or if it is a direct result of their actual creative vision. And, in the context of videogames being an interactive visual medium as opposed to a passive one, additional options are great.

You can try to cherry-pick anything you like to justify your defence of this low-fps standard. It doesn't change the fact that it was chosen as a balance between convincing-enough motion and cost reduction, plus some technical issues that had to do with 1920s tech. It was not chosen because it had something better to offer or because it added anything artistically. It was only to save money.

What you describe is just what your brain has gotten used to. You would never think that way if the standard had been higher from the start. And in that case, if anyone showed you a 24fps movie you would probably say it looks like shit.
As above, it's not a matter of why these standards came about. It's a matter of how they affect us.
 
You jest but even with cartoons, higher fps is better

[Embedded GIFs comparing standard and higher frame rate animation]
True. But how many of those pretty GIFs run at 60 FPS?
I explained to you (but you didn't respond) how your "emotional response" is based on technicalities and compromises the movie industry made 100 years ago. It's not an artistic decision, it's just cheaper. Same with cartoons: despite first-generation cartoons making use of the same (already compromised) frame rate as movies, they decided to halve that as well to make them cheaper.

Your "emotional response" is nothing more than you being used to a cost-cutting standard.

Even videogames started life filling every refresh the TV could produce with a different frame. Any videogame that repeated the same frame every 2 refreshes was a compromise to show more complex graphics on weak hardware. Someone probably thought, "eh, if movies get away with it, games can do it too".

Most people have this idea that graphics are sacrificed for 60fps and that a smooth frame rate comes with compromises. But it's the other way around: frame rate is sacrificed for more visual effects, and more detailed graphics come with compromises. Making full use of the refresh rate your TV can provide should be the default.

Well, the fact that 24fps once was a technical limitation shouldn't invalidate the fact that it can be an artistic choice today.
 

dcx4610

Member
Never will I prefer the option of 30 over 60. 30 is a compromise in gaming; there is no benefit to it the way there is with film.

It sucks that in 2021 we are still having to choose.
 

Represent.

Represent(ative) of bad opinions
It's WAY better at 30FPS on PS5.

Visual fidelity is way more important. 60fps is hardly even noticeable; ray tracing makes a huge difference.
 

RafterXL

Member
Imagine being so brainwashed by a *limitation* that you start perceiving it as a feature.

Thirty fps is inferior, in all circumstances... period. The Stockholm syndrome from console gamers who can't get over the abuse of 30fps and embrace a better option is sad.

If there were a 60 fps Spider-Man option with all the bells and whistles and a 120hz version with turned down settings you'd be here extolling the virtues of 60fps and how it's superior to 120hz. The fact is, the *only* reason why 30 fps has *ever* been acceptable is because developers have used the power to make things prettier.
So is there anything inherently wrong with preferring the graphical bells and whistles over framerate?
No, but just admit why you're doing it. If you could get the same exact graphics at twice the framerate you'd choose it. Your love for 30 fps begins and ends with the fact that you have no choice.
 

tommib

Member
You can try to cherry-pick anything you like to justify your defence of this low-fps standard. It doesn't change the fact that it was chosen as a balance between convincing-enough motion and cost reduction, plus some technical issues that had to do with 1920s tech. It was not chosen because it had something better to offer or because it added anything artistically. It was only to save money.

What you describe is just what your brain has gotten used to. You would never think that way if the standard had been higher from the start. And in that case, if anyone showed you a 24fps movie you would probably say it looks like shit.
I think you need to stop talking about cinema.

When you've built a whole history of masters like Welles, Ford, Hitchcock, Kubrick, who had all the money and access to whatever technology they could have at the time - bloody Kubrick worked with NASA for his camera lenses - and not for one second did these masters of their craft think of changing the format to make a better picture, your whole "conditioning" point falls flat.

You think you're seeing beyond the Paul Thomas Andersons and the Scorseses of today? It's about legacy, the weight of film history and elegance.

Games are a whole different species. OP was just saying he prefers 30fps. There's no conditioning. You're coming across as an oracle and beacon of light, undermining everyone else - and being super patronizing.
 

nkarafo

Member
True. But how many of those pretty GIFs run at 60 FPS?
I don't think a 60fps cartoon exists; I'm not aware of a single one. The 24fps ones are already extremely demanding even by today's low standards. I think it's reasonable to suggest 24 unique frames per second is the maximum the most patient artist will draw by hand. Still, it looks much better than the average 10-12fps cartoon. I assume 60fps would look amazing. I'm talking about distinct, hand-drawn frames, not some kind of interpolation.
 

DustQueen

Banned
Well...
Ok...


Web slinging and traversal are basically automated.
Combat is Arkham City + evade.
So it is also half automated.
It's all nicer in 60... not that much is required from the player for it not to be playable at 30.
Plus the writing and plot play it as safe as possible.

So no wonder some people try unconventional methods to make it... more grand.
 

EDMIX

Member
This comes down to preference. I agree with what the user is saying for several titles. I hated 60fps for Mass Effect, Dead Space, The Last Of Us and a few other titles. It's not to say I hate 60fps in general, as many games I play with it feel and look great, but some titles look odd and it makes them look like a fucking soap opera, or like you're watching the news vs a fucking cinematic game.

This is 100% a preference, to each their own thing.

Many can go on and on talking about how 60fps or more is "factually" better, but this is a subjective thing when it comes to how someone FEELS about the setting. For fuck's sake, that would be like telling an artist 8K is objectively better, so they better not use that old 16mm camera..... you know, just fuck what they are going for, fuck the art direction, fuck the actual artistic context of the film, but yea bro 8k da best bro /s

In a technical sense I do not disagree...... we are not fucking talking about that here, folks, we are talking about a subjective idea of how people FEEL about the setting. This thread going back and forth literally proves it's something people can have lots of different opinions on, thus not an ironclad thing.
 

Shut0wen

Member
I know it has become a bit of a trope that 30fps is closer to 24fps and therefore "more cinematic". But this is one of those games where it holds absolutely true for me.

While I am in absolute agreement that 60fps feels more responsive and has less undesirable judder, and I do prefer 60fps for many game types (first person, survival, sim, arcade and many other third-person games, etc.), Spider-Man just feels better to me at 30fps in terms of a sense of momentum, the visual believability, how the assets appear and just my overall emotional response to the game.

The heightened sense of reality can bring about a sense of mundanity that makes the experience feel less fantastical. Just as in movies, where most scenes at HFR break down and literally come across as sets and actors in dress-up, in certain games that are trying to be conveyed as intrinsically cinematic or epic experiences, the more rudimentary nature of assets becomes more apparent with increased temporal resolution.

Insomniac also have exceptional motion blur, and the 60fps modes do not appear to have an appropriately adjusted shutter speed for it. With this also comes that sense of momentum: 60fps is giving me more information but it feels mundane, whereas at 30fps I feel like I have weight and I'm swinging faster. The motion is implied within each frame and my brain is filling in the gaps to create something greater than the sum of its parts, whereas at 60fps I'm just getting my current position.

Again, I'm totally in favour of 60fps for most other games, as well as it being an option for games like this. But for games like this, that present the characters and the world to you in a somewhat cinematic or epic manner as opposed to presenting the world to you as if you're the character, I hope making a 30fps option available remains the norm (just as ND did with the recent TLOUII PS5 patch, where you can choose either in the menu).

While of course movies and games are different things, there is obviously some cross-over, particularly in games such as Spidey. But one example I'd also like to give in respect of movies, and in a different regard (spatial resolution and shutter speed vs temporal resolution), would be Mission Impossible: Fallout. There are a few "LIEMAX" scenes near the end shot on a Panavision 8K digital camera at 1.90:1 (as opposed to the 35mm film 2.35:1 scenes elsewhere). From what I can gather these are meant to make the scope feel bigger, to make the action scenes more clear and epic. In reality, these shots overload my subconscious with visual information (high-frequency detail, smoother motion) and subsequently they felt more mundane, too real, and removed me from the movie; the fixed shots of Cavill in the chopper just felt like someone stuck a high-res camera in a chopper on a documentary.

Don't get me wrong, I'm in favour of all kinds of approaches (and especially options for the end user) across different content of different types, but I think it's important to take into account the visual texture of the medium in addition to the content, and how it impacts people's emotional response to it. And of course, when it comes to games, if it is 30fps it has to be rock-solid, with tight frame-pacing, no tearing and as little latency as possible.

What I don't like is when a game originally released at 30fps is updated for new platforms and is only available at 60fps. Days Gone did this on PS5, and while I personally do prefer 60fps for it, there should always be an option to revert to the original fps. I can't recall if it was The Nathan Drake Collection or The Last Of Us Remastered, but one of them released on PS4 with only 60fps available and I absolutely hated it; the texture of it not only robbed it of the feels, but the last-gen geometry stuck out like a sore thumb.

So yeah, to summarise: I think Spidey on PS5 just feels better at 30fps, greater sense of momentum, greater emotional response, more aesthetically pleasing image and more artistically in keeping with the content. And I personally prefer to trade-off some responsiveness and visual comfort in favour of that.
I get laughed at for this, but I actually don't mind playing a game at 30fps, especially when it's a stable 30. People forget that games in the early 2000s were all 30fps, but at least it was stable on some consoles and early PCs. Don't get me wrong, I'd prefer a game to be 60, but if it's less I can't see how that is a problem.
 
Let's apply some reasoning here. Let's take the NES as a baseline. Everyone loved the NES. Great system. 256x224-ish on the resolution. 60fps. Sure, a game could have slowdown, though that was universally considered bad. I'm sure the OP thinks game slowdown is desirable by default... because it makes him feel like Neo in the Matrix or some other bullshit. Like, you can't really speak logically with gramophone hipsters rambling about how warm vinyl sounds, ya know?

Anyway, taking that baseline, let's extrapolate. We're on to 4K console gaming, baby! 3840x2160! So let's see, that's uhh... 8,294,400 pixels. And the NES resolution works out to... 57,344... so let's divide that out... that's about 145x the number of pixels... so let's just multiply that by our 60fps baseline frame rate and we get... well, console gaming at 4K really should be targeting about 8,678 fps. Though really, I always thought comparing raw pixel counts when they spread in two dimensions was a little silly, so I'll do you a solid and further divide that by 2... so 4,300 and change for the FPS target...
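
(If anyone wants to sanity-check the arithmetic, here's a quick Python sketch. The pixel-count scaling and the halving fudge are just my own back-of-the-envelope assumptions, not any established rule for how frame rate "should" scale with resolution.)

```python
# Back-of-the-envelope version of the scaling above.
nes_pixels = 256 * 224        # 57,344 pixels per frame
uhd_pixels = 3840 * 2160      # 8,294,400 pixels per frame
baseline_fps = 60             # NES baseline

scale = uhd_pixels / nes_pixels          # ~144.6x the pixel count
naive_target = baseline_fps * scale      # ~8,678 fps if fps scaled with pixel count
fudged_target = naive_target / 2         # ~4,339 fps after the "divide by 2" fudge

print(f"{scale:.1f}x pixels -> {naive_target:,.0f} fps naive, {fudged_target:,.0f} fps fudged")
```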

Obviously I'm being just a touch farcical here... but if you consider 30fps to be the 'good enough' frame rate (which I don't, I'd give you 60), and pair it with what I think is a 'good enough' resolution (I'd say 640x480 or 800x600 really were 'good enough' in the same vein; they're really only kinda nasty in the modern age due to the fixed-pixel display tech we all use), your complementary frame rate for 4K is pushing 120Hz.

So for 30fps as cinema... the concept might not even be completely without merit. 24fps films have a natural motion blur from the shutter speed, and that reminds me of the blur I see when something moves quickly across my vision. When you actually track something with your vision it tends to be pretty clear, but I can agree when I look at that 60fps Will Smith action thing upthread that parts of it look off. But not everything looks good at 24fps. An entire industry was built around trying to make it look as good as possible, but to some extent it's a mirage. Yes, the blurriness at times is nice... other times, if they're trying to film some melee battle or something, they will cut the shutter speed to get rid of the blur. You get really choppy frames that show more clearly what's going on. It's an aesthetic that maybe you're conditioned to enjoy, but if you give it more frames and give people a side-by-side comparison, I doubt anyone would take choppy clarity over fluid clarity. And at higher frame rates you can add motion blur in post-production if blur truly is a virtue of the old standard that people won't ever let go. The movie industry has had 100 years to figure out how to make things look good at 24fps.
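
To put rough numbers on the shutter point, here's a tiny Python sketch of how much blur gets baked into each frame at the common 180-degree shutter. The frame rates are just example values, not what any particular film or game actually uses.

```python
# Exposure time per frame for a given shutter angle, which roughly determines
# how much motion blur is baked into each frame. 180 degrees is the common
# film default; other angles and frame rates are purely illustrative.
def exposure_time(fps, shutter_angle_deg=180):
    return (shutter_angle_deg / 360.0) / fps   # seconds of exposure per frame

for fps in (24, 30, 60):
    t = exposure_time(fps)
    print(f"{fps} fps @ 180 deg shutter -> {t * 1000:.1f} ms of blur per frame (1/{round(1 / t)} s)")
```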
 
It's perfectly fine that for some the 30fps "look" is preferable to 60fps or higher. It's closer to the framerate at which movies and TV shows run, after all. But it's personally unfathomable to me if someone says they think playability is equal or even better at 30fps vs. 60fps or higher. 60fps objectively and measurably results in lower input lag, aids reaction times, enables more precise aiming and movement, and just feeds a lot more visual information that greatly enhances gameplay. That is not up for debate. It is kind of amusing how some people don't notice this. But hey, if that's what you prefer, that's what you prefer.
 

Stooky

Member
True. But how many of those pretty GIFs run at 60 FPS?


Well, the fact that 24fps once was a technical limitation shouldn't invalidate the fact that it can be an artistic choice today.
With 2D animation it's a cost and time issue. Most 2D animation is animated on 2's: 12 drawings a second, each held for two frames to make 24. With realtime graphics it's art direction and hardware limits.
In cinema it's 24fps for multiple reasons. The more frames you add, the more detail your brain registers, and things start looking fake. You see that an actor is wearing prosthetics. The sets start looking fake. CG effects cost more with the added production and render time. In some cases the acting starts to fall flat. At 24fps your brain does more work to fill in what's not there. In my opinion, the higher the framerate, the more you lose impact; punches and kicks don't hit as hard. In fighting games, melee at 60fps will sometimes hold or exaggerate a few frames on impact so hits register and the player can feel the impact.
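
That hold-a-few-frames-on-impact trick is usually called hitstop. A bare-bones Python sketch of the idea, with made-up class names and frame counts purely for illustration (every game tunes this differently):

```python
# Minimal "hitstop" sketch: freeze the simulation for a few frames when a hit
# connects so the impact reads clearly, even at 60fps. Values are illustrative.
HITSTOP_FRAMES = {"light": 4, "heavy": 10}

class FightSim:
    def __init__(self):
        self.freeze_frames = 0

    def on_hit(self, strength):
        # Heavier hits hold the pose longer.
        self.freeze_frames = max(self.freeze_frames, HITSTOP_FRAMES[strength])

    def tick(self, dt):
        if self.freeze_frames > 0:
            self.freeze_frames -= 1   # hold both characters in place this frame
            return
        self.step_simulation(dt)      # normal per-frame update

    def step_simulation(self, dt):
        pass  # move characters, resolve attacks, etc.
```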
 

hussar16

Member
All you guys are clowns wanting 60fps "smooth" when the real issue here is the monitors, not the framerate. The TVs you are playing on have bad motion handling compared to something like the CRTs of old, where a game at 30fps looked buttery smooth and sharp. 30fps would actually still be the norm in games if today's TVs had the motion handling of CRT monitors. Not many people think about this fact; they think that a higher framerate means better or clearer, when all it does is add a soap-opera effect and make it less cinematic.
 

Neo_game

Member
console players 2012: the human eye can't see more than 30fps so who cares
console players 2021: 60fps is a must

Well said. I have no problem with 30fps and I am sure most can adjust if they want to. For me the resolution bump is probably not worth it though; I prefer better graphics as the compromise in fps. Not sure if it was Linus or someone else who said 60fps makes graphics look better in motion, since you see more at 60fps as each frame arrives some 16ms sooner. I am not sure I agree with that, as I can see low-res shadows, flickering, lower texture quality, draw distance etc. during gameplay as well. I guess it depends on the game. Some look really clean with little to no artifacts, and it's unnoticeable during gameplay.
 

ahmyleg

Banned
The lower the fps, the easier it is for graphical imperfections and unrealistic/wonky animation to hide, so there's merit in your view, IMO. I still wouldn't trade it for the fluidity of higher fps, though.
 

Whitecrow

Banned
Yes, there are also good and bad opinions lol
And who can judge what is a good or a bad opinion?
Because there are also smart people who can rationalize and analyze things, take into account different points of view, and reach a conclusion from the data they have, and dumb people who just say 60fps always, because yes, out of their arse.
 

tommib

Member
So what I get from the back and forth here is that there are people who don't feel violated by playing a game under 60fps and are cool with whatever frame rate is available, and then the elitist visionaries who won't play a game under 60fps.

And the "play whatever frame rate" crowd is the one being accused of being conditioned and having low IQ. Got it.

And it's not an attack on you, nkarafo, you make super valid points - I just don't see the need to bring film history into the discussion since that has solidified as an art form (with real artists cementing it). We're talking games anyway.
 
Marginal in its benefits or perceived effect. Similar to doubling the price of an iOS game from $1 to $2. It’s double the price but it’s still marginal in its impact on the pocket of the buyer.
Agree to disagree. Aside from 30FPS being juddery the additional input lag is like playing with your hands submerged in a bucket of molasses. Even 60FPS isn't great.
 
Agree to disagree. Aside from 30FPS being juddery the additional input lag is like playing with your hands submerged in a bucket of molasses. Even 60FPS isn't great.
Actually input lag can be drastically reduced even with 30fps games. I read an interview with the developers of DriveClub and they process the inputs before the graphics so you get almost 60Hz input response but with 30fps gameplay. For example, Mortal Kombat X at 60fps has an input lag of 107ms whereas DriveClub had 116ms at 30fps. That's a difference of 9ms which would be indistinguishable.

The fact is, console developers have tricks to get the input response as fast as possible even on 30fps games. Try playing Horizon Zero Dawn on PS4/PS5 and you'll notice the input response is very fast and doesn't feel like it lags behind at all. Same thing with Spider-Man remastered at 30fps... the controls feel very responsive to me.
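
I don't know DriveClub's actual pipeline, but the usual trick, as I understand it, is to sample the controller as late in the frame as possible so a 30fps game doesn't pile a whole extra frame of lag on top. A rough Python sketch of that idea, with made-up function names:

```python
import time

RENDER_HZ = 30  # the game still renders at 30fps

def run_frame(poll_input, simulate, render):
    # One ~33ms frame that polls input twice: once to drive the simulation and
    # once right before rendering, so the frame reflects the freshest input.
    frame_start = time.perf_counter()
    state = poll_input()                 # early poll drives gameplay logic
    simulate(state, 1.0 / RENDER_HZ)
    late_state = poll_input()            # late poll just before the frame is built
    render(late_state)
    # Sleep off whatever is left of the 33.3ms budget.
    elapsed = time.perf_counter() - frame_start
    time.sleep(max(0.0, 1.0 / RENDER_HZ - elapsed))
```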
 

Wonko_C

Member
When I watch movies, I want to be removed from reality, not feel like I'm looking out a window at some actors. And seriously, 24fps actually hides a lot of mistakes: stunt doubles are easy to spot and fight scenes look like two grown men play-fighting at high frame rates.
This sounds so weird to me. The only movie I was able to watch in HFR was the first Hobbit, and despite hearing all those "soap opera" criticisms beforehand, I was floored. I don't know if it was a combination of HFR and 3D, but watching the movie that way had this unexplainable dreamlike quality to it, like I was watching a living, magical theater play.

I really hope they bring back HFR+3D for Avatar 2 & 3 in my city. Watching them without it will feel like it's missing something.

All you guys are clowns wanting 60fps "smooth" when the real issue here is the monitors, not the framerate. The TVs you are playing on have bad motion handling compared to something like the CRTs of old, where a game at 30fps looked buttery smooth and sharp. 30fps would actually still be the norm in games if today's TVs had the motion handling of CRT monitors. Not many people think about this fact; they think that a higher framerate means better or clearer, when all it does is add a soap-opera effect and make it less cinematic.
I still play my NES on a CRT and you can clearly notice when a game is 30fps there. Games like Contra, Super Mario Bros. and Castlevania all run at 60fps, while Double Dragon I/II/III and River City Ransom run at 30fps (all of them from Technos, strangely enough). There's some crap that runs even lower than 30fps, like Tiger Heli and 1942, which were converted from the arcade by the same dev.

And it's not a matter of "I can see it now because the Internet taught me about it". I always noticed some games looking "choppier" than others, even back in the mid-80's.

If anyone is curious, playing these same games in emulators yields the same results, even with current flat panel displays.
 
This sounds so weird to me. The only movie I was able to watch in HFR was the first Hobbit, and despite hearing all those "soap opera" criticisms beforehand, I was floored. I don't know if it was a combination of HFR and 3D, but watching the movie that way had this unexplainable dreamlike quality to it, like I was watching a living, magical theater play.

I really hope they bring back HFR+3D for Avatar 2 & 3 in my city. Watching them without it will feel like it's missing something.
Yeah when I watch movies, I want to be removed from the movie, not in it. I don't wanna feel like I'm looking at reality or watching a live play. I want it to look like a movie. High frame rate movies look cheap to me and you can notice all the imperfections and the clothing looks like costumes and when characters hit each other you can tell they're faking it. It's just offputting.
 

Wonko_C

Member
Yeah when I watch movies, I want to be removed from the movie, not in it. I don't wanna feel like I'm looking at reality or watching a live play. I want it to look like a movie. High frame rate movies look cheap to me and you can notice all the imperfections and the clothing looks like costumes and when characters hit each other you can tell they're faking it. It's just offputting.
TBH when I watched The Hobbit I didn't notice any of that. And I've been told that exact same thing countless times, but I just can't see it. Maybe someone could point out what to look for?
 
TBH when I watched The Hobbit I didn't notice any of that. And I've been told that exact same thing countless times, but I just can't see it. Maybe someone could point out what to look for?
I don't really think it's something that can be pointed out. You just get a different feeling from watching it, like you're looking at a live play or looking out your window. It just looks like grown-ass adults in costumes playing make-believe. It's just the way I see it.
 