
i am starting to like chromatic aberration/film grain...etc

ZehDon

Member
Lol what? I wasn't aware this was a thing. Film grain was just baked into the medium until digital filmmaking removed it entirely; what is this "worked for decades to remove it" thing lol?

I don’t get the outrage. Like you guys want to see the pure pixels or something. It’s so silly.
Removing grain from film via capture and processing methods has been around for a very long time; it's been the selling point for many film processing houses, as well as different photographers, for as far back as I can remember. It was considered quite an art form when chemically treated film could be produced without grain. Higher capture rates and better capture processing result in less motion blur. They've been the selling point of a lot of hardware for decades.
Film grain is just another tool for visual art design. Same as any other.

Games or films, it doesn't matter; it's all just a choice by the artist. Judge the implementation, but view it as the artist intended.

In film, you know, where physical film with actual grain exists, grain exists as an imperfection in the process of filming something. When a video game, a medium that cannot use film, implements it, it's simply imitating something else to borrow established connections.

For example, in film, film grain can be intentionally used to produce "raw", "gritty", and "dirty" images. This is primarily because audiences have a connection between film grain and those feelings due to the history of film. In years gone by, typically cheaper films, such as the grindhouse classics and b-tier horror films, couldn't pay for the expensive processes necessary to reduce grain, or weren't skilled enough in their treatment of their film to clean it up. As a result, overly grainy images were often the trademark of cheaper productions or low-skilled artists. This is why we see a lot of horror movies with heavy film grain: they're typically pretty cheap productions. Nowadays, film grain can be a stylistic choice as well as a simple result of the process of filming something. That's a different conversation, because film grain exists naturally in film.

Today, when you make a big budget AAA video game and decide to slather the frame with artificial grain and artificial motion blur, you're just copying something else to piggyback off the connections I described above. Rather than trying to create something unique for video games, it's a hell of a lot easier to just copy/paste from cinema and call it a day. The sum total of the artistic decision is "film grain = raw/gritty/scary in movies, and I want that too". That's just lazy in my book. I feel for the artists who spend their days making high-quality textures only for the resulting image to look like Jimmy from Film School didn't know how to use the damn camera.
 

The_Mike

Member
I'm speechless. These features are the most disturbing and annoying settings ever created, and those behind them should face a firing squad.

TLoU2 wouldn't have looked as good as it did without film grain. I don't even think they gave you the option to turn it off.
 

mxbison

Member
The irony here is that cinema has been working hard for decades to remove film grain and motion blur from their images, while game developers have been working hard to add it in because they think it's "cinematic". It's sad.

Are they still doing that? Because that high framerate tech in The Hobbit looked awful and I've never seen it used again.

Motion blur is natural; I don't see the point in trying to remove it.
 

Keihart

Member
Yep, I too think they have their place.
I don't get the appeal of super clean visuals in everything; cleaner doesn't necessarily equal better. It's just like this obsession with upressing old-ass games, putting all their ugly bits in the spotlight, or people modding multiplayer games to have fewer lighting effects because that way they can "see" better.
 

Md Ray

Member
So if a game is brilliant, 4K, locked 60 fps, with chromatic aberration, you will find the game unplayable?
Yes? For example, in RE2 with CA disabled, the image quality looked soooooo much cleaner and miles better than with it enabled. It gets difficult to play as soon as you enable it. Kinda like switching to 30fps immediately after getting used to 60fps.
 

Bankai

Member
To me, it all depends on the game. When playing a slick, fast-paced shooter (like Doom), I want it disabled. It distracts from the overall image.

But in more atmospheric games, like Oxenfree or Life is Strange, I leave it on (if there's a toggle).
 

ZywyPL

Gold Member
Film grain can sometimes help to set the mood when the game is set in the '50s, '60s, etc., but other than that it just makes things look worse. CA, on the other hand, there's no excuse for this garbage; it's like the game is trying to achieve that 3D effect from the 90s, except the red/cyan glasses are nowhere to be found... Motion blur depends on the execution. I personally don't mind a well-implemented, subtle motion blur, but in most games it's simply exaggerated and all over the place. It does help to add the sense of speed in racing games, tho.
 

HoodWinked

Member
CA is one of the better visual effects to come into use for blending a scene, though it's easy to get heavy-handed with it. I think when it's used subtly, most people would say the scene looks better but may not be able to tell why.

Film grain feels a bit artificial. Games are real-time, so noise that's supposed to be a result of playback from an analog recording doesn't make sense, and runs counter to immersion. They have this in Mass Effect and Destiny, which is dumb, but it works better in horror games going for a particular aesthetic.

Vignetting I dislike; by confining the image, it makes the scene feel more like a display. Our visual system already does this: our eyes send visual data to the brain, which fills in the gaps, and the edges of our peripheral vision fade into a void.
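For anyone curious what these effects actually do under the hood: chromatic aberration is, at its simplest, the color channels being offset from one another. Here's a minimal NumPy sketch of that channel-offset idea; it's illustrative only, since real implementations usually scale the offset radially from the screen center rather than shifting uniformly:

```python
import numpy as np

def chromatic_aberration(img: np.ndarray, shift: int = 2) -> np.ndarray:
    """Crude chromatic aberration: offset the red and blue channels
    horizontally in opposite directions, leaving green in place."""
    out = img.copy()
    out[..., 0] = np.roll(img[..., 0], shift, axis=1)   # red shifted right
    out[..., 2] = np.roll(img[..., 2], -shift, axis=1)  # blue shifted left
    return out

# A tiny 1x8 RGB test image: a single white pixel on black.
img = np.zeros((1, 8, 3), dtype=np.uint8)
img[0, 4] = 255
ca = chromatic_aberration(img, shift=1)
print(ca[0, 5, 0], ca[0, 4, 1], ca[0, 3, 2])  # red at x=5, green at x=4, blue at x=3
```

The white pixel fringes into separated red/green/blue spots, which is exactly the colored edge fringing people notice on high-contrast borders in games.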
 
Perhaps in horror games like Bloodborne & Alien Isolation it works, as it makes the visuals more unsettling. But in general I find Chromatic Aberration ridiculous and turn it off where possible.

Film grain & motion blur can be ok, depending on the implementation. I usually turn them off though, like in Spyro remastered, where the screen turns to vaseline when you spin the camera.
This.
 
One thing I must add is that CA reminds me of the colour in those Sega Genesis games, which is a console very close to my heart
 
The use of CA in Bloodborne alone should be justification. It's a great artistic tool.
In film, you know, where physical film with actual grain exists, it exists as imperfections in the process of filming something. When a video game, a medium that cannot use film, implements it, its simply imitating something else to borrow established connections.
Video games can use film. It would not be hard to scan in film to use for sprites or background art.

Besides, games are already imitating films in so many ways: camera, lighting, staging, dialogue, action, sound design/foley, etc. Depth of field and bloom are effects a camera operator would know how to get. I don't get why we need to draw the line at this one specific thing.
 

SoraNoKuni

Member
I like it when it's used creatively, during some events in the game and cutscenes.

When there's no reason and they just apply it as a general artistic layer, it really depends on the game.
 

Larxia

Member
I hate chromatic aberration with a passion; it's one of my worst enemies in life, along with lens distortion.
I always hated it in games because I found it dumb. Like MeatSafeMurderer just said, the cinema industry has constantly been researching ways to reduce these effects as much as possible; it doesn't make sense to add these lens errors to a 3D scene, which is by nature perfect in quality.

And I hate it even more now that I actually suffer from it in my daily life, because of glasses, actually made a thread here about this a while ago:

So yeah, definitely a huge nope for me. Film grain bothers me "less", but I still hate it and will turn it off as soon as possible. I also turn off depth of field (unless it's a very cinematic game, like recently Man of Medan, where it made more sense, but I hate depth of field in actual gameplay), motion blur, vignette, and stuff like this. Just give me a clear image.

I find it baffling that people / the industry are so into higher and higher resolution screens, 4K / 8K, HDR tech and so on, only to then add a bunch of horrible layers that destroy the final image.
 

Soodanim

Member
I can see how with the right mix of display, game, and person it would be a desirable combination. Personally, I prefer games to be gameplay first and visuals second, so I’m not a fan of CA. I love it in an old film photo, though.

Same goes for film grain and the rest of it. I like my game image clean, but I’m usually sat at a monitor hoping for high frame rates. If I was at a nice big 4K OLED with HDR, I might feel differently about it.

To look at it a different way, it’s like the high frame rate discussion that happened a few months back. The guy from the UFO Test said that all motion blur should be done by the human eye/brain, it doesn’t need extra thrown at it. That’s how I feel about most extra effects in games.
 

ZehDon

Member
The use of CA in Bloodborne alone should be justification. It's a great artistic tool.

Video games can use film. It would not be hard to scan in film to use for sprites or background art.

Besides, games are already imitating films in so many ways: camera, lighting, staging, dialogue, action, sound design/foley, etc. depth of field and bloom, these are effects a camera operator would know how to get. I don't get the reason we need to draw the line at this one specific thing.
Lol, yes, because that's clearly what I meant. Video games cannot be produced on film. Period. Record gameplay and then attempt to play that footage the same way you played the video game with a controller. You can't. Video games require a computer to produce frames in order to use the player's input. Film is incapable of this. A video game's frames do not inherently have grain (barring path-traced images, of course) unless an artistic dunce puts it there to pretend they're a movie.

As for "games are already imitating films", you've misapplied the term 'imitation'. A video game camera is not an imitation of a cinematic camera; they're wholly disparate. You're just seeing the same word used twice and drawing incorrect conclusions. Video games are not imitating camera, lighting, staging, dialogue, sound design, or foley. These are simply inherent aspects of a video game; a video game has these things, it is not pretending to have these things.
 

kyliethicc

Member
It's undeniable that these effects can enhance a game's visuals and art direction. Of course, if misused they can detract from the game's presentation, and they're certainly not things that fit every game, but for the right game they both can add a lot.

I specifically appreciate how Naughty Dog used these two effects on The Last of Us Part II. Given the game's tone, setting, and overall art direction, both film grain and chromatic aberration fit quite well alongside the other post-processing effects Naughty Dog use (motion blur, depth of field, lens distortion, lens flares, etc.)

I also think these effects have been used effectively in many other games. Just to name a few: Uncharted 4, Uncharted: The Lost Legacy, God of War, Death Stranding, Call of Duty: Modern Warfare, Ratchet & Clank: Rift Apart, Spider-Man, Spider-Man: Miles Morales, Ghost of Tsushima, and more.

I get that for some Bloodborne's use of chromatic aberration is too much at times, but even there I don't have that much of an issue with it. It does fit the game's sorta creepy surrealist aesthetic.
 

ZehDon

Member
... a "defence" that uses one game as an example, comprised of a contaminated A|B test? I'd suggest looking at some other games as well to make your point, because one game doesn't make for much of a "defence".

Speaking on the examples you've created, the use of the Noir filter contaminates any attempt at an "A|B" test of preference for Film Grain and Chromatic Aberration. In an A|B test, when testing [X], [X] must be the only difference between A and B. I'd recommend re-doing them without any filter, so the only difference between the images is actually the Film Grain and Chromatic Aberration, as the thread title suggests. I'd also recommend creating a two-frame GIF with labels for this type of test. The free version of Flickr prevents easy swapping back and forth for a proper comparison, whereas a GIF will make it much easier to show whatever difference you're attempting to show.
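For what it's worth, the two-frame GIF suggestion is trivial with Pillow. This sketch synthesizes two solid-color frames so it runs anywhere; in practice you would replace them with `Image.open(...)` on your two labeled screenshots:

```python
from PIL import Image

# Stand-ins for the two screenshots; substitute Image.open("a.png"), etc.
frame_a = Image.new("RGB", (64, 64), (200, 50, 50))   # "effect on"
frame_b = Image.new("RGB", (64, 64), (50, 50, 200))   # "effect off"

# Write both frames into one looping GIF, one second per frame.
frame_a.save(
    "comparison.gif",
    save_all=True,
    append_images=[frame_b],
    duration=1000,  # milliseconds per frame
    loop=0,         # loop forever
)

gif = Image.open("comparison.gif")
print(gif.n_frames)  # 2
```

The frames alternate in place, so any difference between the two images pops out immediately, which is the whole point of an A|B comparison.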
 

ckaneo

Member
Film grain can look alright depending on what the game is going for. I'm not quite sure I understand the purpose of chromatic aberration, though.
 

GymWolf

Gold Member
Film grain can add some atmosphere, but its clearest use is to hide low-detail parts of the game.

CA looks good...if you are blind i mean.

But I'm a particular case: I always disable even blur and DoF. I want clarity in my video games, not blurriness and stuff out of focus. Fuck the cinematography.
 

Kuranghi

Gold Member
Due to this thread bump I gave my whole family chromatic aberrations as a gift and now they all look at me funny. Thanks OP.

Yeah, I hate it in almost every form it's ever used, outside of showing an actual CA effect on an input/display that would really have it, i.e. 1 out of every 1000 times it's used in games.

Film grain is also badly used in games 99% of the time, imo. They use it in RE2 Remake to hide low-bit-depth banding in shadowed areas, so it breaks the IQ if you turn that setting off on PC. I still turned it off, since it reduces detail so much even in the middle distance, and setting the brightness sliders "correctly" means you hardly ever see those overbrightened shadowed areas in gameplay anyway. You'll still see the banding a bit in cutscenes, though, annoyingly.
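The banding-hiding trick described here is essentially dithering: adding sub-step noise before quantization trades visible gradient steps for fine grain. A rough NumPy sketch of the idea, not RE2's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(gradient: np.ndarray, levels: int, dither: bool) -> np.ndarray:
    """Quantize a [0, 1] gradient down to `levels` steps, optionally
    adding sub-step noise first. The noise is what film grain
    incidentally provides for low-bit-depth shadow areas."""
    g = gradient.copy()
    if dither:
        # Noise amplitude of half a quantization step.
        g += rng.uniform(-0.5, 0.5, size=g.shape) / (levels - 1)
    return np.round(np.clip(g, 0.0, 1.0) * (levels - 1)) / (levels - 1)

ramp = np.linspace(0.0, 1.0, 1024)
banded = quantize(ramp, levels=8, dither=False)   # 8 hard, visible bands
dithered = quantize(ramp, levels=8, dither=True)  # same 8 levels, edges scattered

print(len(np.unique(banded)))  # 8 distinct bands
```

Both outputs use only 8 values, but the dithered one scatters the band boundaries into noise, which the eye averages into a smoother gradient; that's why turning the grain off can expose banding.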
 

Kuranghi

Gold Member
But i'm a particular case, i always disable even blur and dof, i want clarity in my vg, not blurriness and stuff out of focus, fuck the cinematography.

I used to like the DoF effect, but I think, in large part due to massively increased rendering resolutions murdering the framerate when combined with a properly high-quality DoF effect, devs have mostly switched back to a shitty, inaccurate post-processing version of it like in the old days (in Phantom Pain, the DoF on every setting below Extra High/Max looks shit in comparison to Max).

RE2/3 Remake has a really nice high-quality effect, but it's also misaligned a lot of the time and partially blurs out faces (Marvin in the cutscene of RE2 Remake's first encounter with him in the main hallway, and the first encounter with the old hag in Village), which is just trash and ruins the cutscene imo. Not to mention it's spectacularly unoptimised for native 4K, so when a cutscene shows a close-up of one or several characters' faces and hair with SSS + DoF on at native 4K, it can halve the framerate when 99.99% of the rest of the game was locked to the refresh rate. So it's totally not worth turning on, from that perspective.
 

Kuranghi

Gold Member

What are you meaning to show here? Grain?

The right image has a much higher rendering resolution and doesn't have TAA/reconstruction on, so it's sharper and has no ghosting. Also, they captured images direct from the engine output according to their specifications, as opposed to the left being DF's capture of the game's final output to screen.

As far as I know, the ghosting (on the edge of her face) has nothing to do with CA; it's a temporal artifact.
 

Trimesh

Banned
I would be perfectly happy if both of them went away completely forever. I would add lens flare to the list too, since that's another artefact of technical limitations that has been incomprehensibly added to technologies that don't suffer from it.
 