Attention All PC Game Devs! When you add motion blur and chromatic aberration to your game, remember this

walking-dead-get-the-fuck-outta-here.gif

fuck-you.gif

cookie-going-to-hell.gif
Fuck you, CA rocks like a motherfucking boss.
 
I actually don't mind chromatic aberration in Bloodborne.
Actually, I like it in scary games.
I love it - it gives this CRT/VHS feeling in games with realistic graphics.

That said, I completely understand people not liking it. It can strain the eyes if overused and smear texture detail.

In the end, if it's an OPTION, it doesn't hurt anyone.
 
Why PC devs in particular though? At least you can remove these effects from most PC games, unlike on consoles. Even if there isn't a graphics setting, there's bound to be an .ini file or a mod that can remove them.
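For Unreal Engine 4 games specifically (just one common case; cvar names vary by engine and not every title respects them), something like the following in the game's Engine.ini usually does the trick:

[SystemSettings]
r.SceneColorFringeQuality=0 ; chromatic aberration off
r.MotionBlurQuality=0 ; motion blur off
r.DepthOfFieldQuality=0 ; depth of field off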
 
I was actually going to say that TXAA looked really good in some games and was a great step forward in anti-aliasing, then I saw the rest of your message mentioning TXAA lol. Maybe you tried it in games where it wasn't very well implemented? I remember it being really good in Assassin's Creed 3 and Black Flag; it didn't feel blurry at all and gave a very clean-looking image. In some other games like ArcheAge, however, it was absolutely awful. But the tech, when used properly, looked really nice.

The best anti-aliasing I have ever seen, though, is SGSSAA. It was a very specific thing you could force through Nvidia Inspector, giving near-perfect, CGI-like image quality, but it was very demanding, needed MSAA to work, and only worked on DX9 games, plus maybe some rare DX11 ones.
Games should generally prioritize sharpness over a lack of jaggies, but the industry has largely spoken. Well, "AAA" western devs anyway. Thankfully some TAA has gotten better, and we also have things like DLSS, which keeps improving. And we are starting to hit higher resolutions like 1440p, which can help offset the blur TAA creates. And yes, I thought TXAA looked crap in everything. Everything, including game design, just seems to want to ape the Hollywood filmic style.

The biggest problem with most TAA, even Unreal 4's, is that at 1080p it just destroys sharpness and looks even less sharp than native 720p with no AA. That is just stupid and a waste of pixels. At 1440p it's much sharper, but again, less sharp than plain old 1080p.

The second problem is ghosting, which again destroys clarity.

So yeah, I like sharpness, and TAA is its sworn enemy.
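For anyone wondering where the softness and the ghosting actually come from, here is a minimal numpy sketch of the exponential history blend at the heart of a typical TAA resolve (function and parameter names are mine, and real implementations first reproject the history buffer with motion vectors):

import numpy as np

def taa_resolve(current, history, alpha=0.1):
    # Each output frame is ~10% new pixels and ~90% accumulated
    # history. That accumulation is where both the anti-aliasing
    # and the softness come from.
    return alpha * current + (1.0 - alpha) * history

def clamp_history(history, local_min, local_max):
    # Standard anti-ghosting trick: clamp reprojected history to the
    # min/max of the current frame's 3x3 neighborhood so stale colors
    # (disocclusions, moving objects) can't trail behind.
    return np.clip(history, local_min, local_max)

The blur falls straight out of the math: most of every output pixel is old, resampled data. Ghosting is what you get when that old data no longer matches the scene, which is why the neighborhood clamp exists.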
 
Games should generally prioritize sharpness over a lack of jaggies, but the industry has largely spoken. Well, "AAA" western devs anyway. Thankfully some TAA has gotten better, and we also have things like DLSS, which keeps improving. And we are starting to hit higher resolutions like 1440p, which can help offset the blur TAA creates. And yes, I thought TXAA looked crap in everything. Everything, including game design, just seems to want to ape the Hollywood filmic style.

The biggest problem with most TAA, even Unreal 4's, is that at 1080p it just destroys sharpness and looks even less sharp than native 720p with no AA. That is just stupid and a waste of pixels. At 1440p it's much sharper, but again, less sharp than plain old 1080p.

The second problem is ghosting, which again destroys clarity.

So yeah, I like sharpness, and TAA is its sworn enemy.
I don't agree. I hate jaggies and especially shimmering. Even TAA is not enough most of the time. DLSS Quality is the only solution that comes close to ground truth for now.
I'll take slight blur for that postcard-clean look.
 
Having options is good. On console it's not always possible to disable them.

I tend to turn motion blur off in most games because I don't have any need for it when playing at 160fps. If I'm playing a game at a low framerate like 60fps, I might turn it on. CA doesn't really bother me, actually, but hey... if you don't like it you can turn it off or mod it out. Again, not always possible on console.
 
I'm not going to say CA and MB should never exist, but there should always be options to toggle them when they're there. I shouldn't have to dig into config files to turn off CA.
 
Whatever it was that Nintendo used in Twilight Princess can fuck off. Played it on an emulator recently with the bloom turned off... muuuuch better.
 
I don't agree. I hate jaggies and especially shimmering. Even TAA is not enough most of the time. DLSS Quality is the only solution that comes close to ground truth for now.
I'll take slight blur for that postcard-clean look.
Surely you run everything at native 4K on your PC. I would never use TAA on PC if I had a choice. Try The Witcher 2 at 4K with ubersampling and tell me if you still want TAA lol.

Also, we should ask why TC lumped motion blur into the same category as CA, which is basically hemorrhoids for the eyes and is absolutely never good. :messenger_dizzy:
 
I've actually come around to liking motion blur and chromatic aberration in some games. I feel like, as games have become super realistic-looking, it kinda fits to have that filmic look.
 
Surely you run everything at native 4K on your PC. I would never use TAA on PC if I had a choice. Try The Witcher 2 at 4K with ubersampling and tell me if you still want TAA lol.

Also, we should ask why TC lumped motion blur into the same category as CA, which is basically hemorrhoids for the eyes and is absolutely never good. :messenger_dizzy:
Native 4K is jaggy as fuck. Ubersampling is 8K downsampled to 4K.
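That downscale step is the whole trick: every output pixel becomes the average of several honestly rendered samples, which is why it kills jaggies and shimmer without any temporal smearing. A toy numpy sketch, assuming a plain box filter and even dimensions (real drivers use fancier filters):

import numpy as np

def box_downsample_2x(img):
    # Average each 2x2 block of rendered samples into one output
    # pixel, e.g. one 8K -> 4K step. img is an HxWx3 float array
    # with even H and W.
    h, w, c = img.shape
    return img.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))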
 
Games should generally prioritize sharpness over a lack of jaggies, but the industry has largely spoken. Well, "AAA" western devs anyway. Thankfully some TAA has gotten better, and we also have things like DLSS, which keeps improving. And we are starting to hit higher resolutions like 1440p, which can help offset the blur TAA creates. And yes, I thought TXAA looked crap in everything. Everything, including game design, just seems to want to ape the Hollywood filmic style.

The biggest problem with most TAA, even Unreal 4's, is that at 1080p it just destroys sharpness and looks even less sharp than native 720p with no AA. That is just stupid and a waste of pixels. At 1440p it's much sharper, but again, less sharp than plain old 1080p.

The second problem is ghosting, which again destroys clarity.

So yeah, I like sharpness, and TAA is its sworn enemy.

I have come upon a somewhat bizarre way of getting a sharp picture with no jaggies on my big 75-inch TV.

Basically, the 4K image on my Philips Ambilight is pretty soft. 1440p looks even blurrier.

I noticed that 1080p on my cable TV box looks a lot sharper; the TV must be doing some processing and sharpening.

So now I output my PC and PS5 at 1080p to the TV and use DRS factors to downscale from 4K or 1440p (the PS5 downscales automatically).

The result is a really sharp picture with little to no jaggies. It's bizarre, like I said, but the image is much more pleasing to me than outputting at 4K.
 
I'm adding depth of field to the list. I don't care how much peeps like bokeh, it's still shit, just like normal DoF. Especially abominations like Octopath Traveler, though at least there are mods to disable that crap on PC.

I do agree with just adding toggles to turn these off. I get that some peeps like them, so no need to take them away from them and their shitty tastes. Just let me have a clear and clean image.
 
Motion blur has to be the dumbest idea anyone ever had in video game graphics.
The person who invented this crap should be taken to the gulags.
An option to turn motion blur off is absolutely essential in ALL games.
In fact, the default should be to have this crap turned off.
It's surprising how many games still have motion blur on by default. It's like devs don't play their own games.

Chromatic aberration is also a bad idea, as are film grain and lens flares.
Depth of field is a problem too, although it can work well in cutscenes and photo mode.
Bloom is also a bad idea.
 
No, if done right they can be good. Games are like movies in this respect: every director/DP has their own vision, and a clear image is not always the best thing. For example, the chromatic aberration in Outlast makes perfect sense and adds a lot to the atmosphere; same for the film grain in The Evil Within. Anyway, you can usually turn everything off in the settings, so it's not a problem.
 
The three things I always turn off are chromatic aberration, film grain and lens flares, which developers totally overdo. Motion blur I actually prefer in a lot of games.
 
You need motion blur at 30fps on OLED.

At 60fps, no. But *some* per-object motion blur effects can still look cool.
I don't know, but can I be honest with you? I disliked the motion blur in HZD and Spider-Man when I played them on PS4. I liked the image more without motion blur. The blurring of the entire scene when you move the camera was too unsettling for me, and I wasn't bothered by motion judder. Maybe my monitor handled the judder well, or something like that.

I wonder if I'm alone in this or not. Then again, you said OLED. Why OLED specifically?
 
CA is so stupid.
CA is a fucking lens DEFECT.

We spend thousands of dollars on lenses to avoid it in photography, so why, devs, would you think we need it?!
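And the funny part is how cheap the fake defect is. The usual post-process just samples each color channel at slightly different coordinates so the channels split toward the screen edges. A toy numpy version (the function name and strength parameter are my own; real shaders do this per pixel on the GPU with proper filtering):

import numpy as np

def chromatic_aberration(img, strength=0.004):
    # Scale the red channel's sampling coordinates out from the screen
    # center and the blue channel's in, so the channels separate more
    # the farther you get from center. img is an HxWx3 float array.
    h, w, _ = img.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    cx, cy = (w - 1) / 2, (h - 1) / 2
    out = np.empty_like(img)
    for ch, scale in ((0, 1 + strength), (1, 1.0), (2, 1 - strength)):
        sx = np.clip((xs - cx) * scale + cx, 0, w - 1).astype(int)
        sy = np.clip((ys - cy) * scale + cy, 0, h - 1).astype(int)
        out[..., ch] = img[sy, sx, ch]
    return out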
 
LOL, I turn both off. I've never liked blurring my game; I don't see the point of paying hundreds of pounds for GPUs just to blur the screen lol.
 
Screen-space motion blur sucks ass, but per-object motion blur gives the game a far more refined look. Sadly it's pretty expensive, and you need many samples to make it look good/realistic.
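The sample count is exactly where the cost lives. A minimal numpy sketch of the idea, assuming you already have a per-pixel, screen-space velocity buffer (names are mine; real implementations run in a shader, often with tile-based max-velocity tricks to cut the tap count):

import numpy as np

def object_motion_blur(img, velocity, samples=8):
    # Average 'samples' taps along each pixel's velocity vector.
    # More taps = smoother streaks and linearly more cost, which is
    # why making it look good/realistic is expensive.
    # img: HxWx3 floats, velocity: HxWx2 in pixels per frame.
    h, w, _ = img.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    acc = np.zeros_like(img)
    for i in range(samples):
        t = i / (samples - 1) - 0.5  # taps centered on the pixel
        sx = np.clip(xs + velocity[..., 0] * t, 0, w - 1).astype(int)
        sy = np.clip(ys + velocity[..., 1] * t, 0, h - 1).astype(int)
        acc += img[sy, sx]
    return acc / samples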
 
Nah, when done right these cinematic effects enhance the visual experience imo.

In competitive FPS games, though, I will admit that I usually turn all of the effects off because they can be distracting and unnecessary. In any story-driven single-player game, or any game where visuals have a large impact on the experience, these effects are important. The absence of such effects really makes the visuals look fake, unnatural, and/or uncanny.
 
I have come upon a somewhat bizarre way of getting a sharp picture with no jaggies on my big 75-inch TV.

Basically, the 4K image on my Philips Ambilight is pretty soft. 1440p looks even blurrier.

I noticed that 1080p on my cable TV box looks a lot sharper; the TV must be doing some processing and sharpening.

So now I output my PC and PS5 at 1080p to the TV and use DRS factors to downscale from 4K or 1440p (the PS5 downscales automatically).

The result is a really sharp picture with little to no jaggies. It's bizarre, like I said, but the image is much more pleasing to me than outputting at 4K.
Damn, something is definitely wrong with your TV lol. Play with your settings, try game mode, etc. There's no way a 1080p image should look better than a 4K image. On some TVs there is less image processing at 1080p than at 4K, so that could be what's happening here, or your TV is faulty.
 
Do people actually like motion blur?

Motion blur is a beautiful effect and adds tremendously when the devs care to implement it correctly (like PGR4 and Driveclub).
Also, 60fps + motion blur is paradise, much more so than 60fps alone, and that's a hill I'm willing to die on.
 
CGI is typically much blurrier than video games.
Hm? What you consider sharpness is probably false sharpness, caused by aliasing creating fake detail. I personally don't see this as a positive kind of sharpness, and CGI / pre-rendered footage is usually the definition of perfect image quality. It might seem "blurrier" if you're used to something oversharpened, but it's a more natural look for sure.
If I could have games with CGI image quality, I would be pretty happy.
 
I'd rather depth of field got this hate, but anything that increases options on locked/closed systems is a good thing.
 
I have come upon a somewhat bizarre way of getting a sharp picture with no jaggies on my big 75-inch TV.

Basically, the 4K image on my Philips Ambilight is pretty soft. 1440p looks even blurrier.

I noticed that 1080p on my cable TV box looks a lot sharper; the TV must be doing some processing and sharpening.

So now I output my PC and PS5 at 1080p to the TV and use DRS factors to downscale from 4K or 1440p (the PS5 downscales automatically).

The result is a really sharp picture with little to no jaggies. It's bizarre, like I said, but the image is much more pleasing to me than outputting at 4K.
It's possible, especially if the device is outputting from the same native resolution via downscaling, i.e. your PS5 downscaling 4K to 1080p, then your TV scaling it back up again. I noticed that this produced a superior image on my Sony TVs, although the reason I did it was to retain 4:4:4 chroma in HDR games back when the TV and PS4 Pro were both limited to HDMI 2.0.

Your TV simply does more processing when it has to upscale than the PC or PlayStation does.
 
I don't know, but can I be honest with you? I disliked the motion blur in HZD and Spider-Man when I played them on PS4. I liked the image more without motion blur. The blurring of the entire scene when you move the camera was too unsettling for me, and I wasn't bothered by motion judder. Maybe my monitor handled the judder well, or something like that.

I wonder if I'm alone in this or not. Then again, you said OLED. Why OLED specifically?
I turn off camera-based motion blur on my X900E LCD even at 30fps.

The difference on OLED is that the response time is so fast that each frame sits on screen fully crisp for its whole interval, resulting in a lot of judder at just 30fps, so you need motion blur to help smooth that out. At 60fps I turn camera-based motion blur off on OLED too.
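Rough numbers, for anyone curious (my own back-of-envelope, not from the post above): at 30fps each frame sits on screen for ~33 ms. Pan the camera so something moves at 1,000 pixels per second and it jumps ~33 pixels between frames; an OLED draws every one of those positions razor sharp, so you perceive discrete steps (judder), while a slower LCD's response time smears consecutive frames together and hides it. Camera motion blur fakes that smear in the render instead.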
 
I turn off camera-based motion blur on my X900E LCD even at 30fps.

The difference on OLED is that the response time is so fast that each frame sits on screen fully crisp for its whole interval, resulting in a lot of judder at just 30fps, so you need motion blur to help smooth that out. At 60fps I turn camera-based motion blur off on OLED too.
Yes, I think what you say is true. LCD screens create enough naturally induced blur that I don't even need additional motion blur.

I already see lots of blur in movement. OLED is fantastic though.
 
I don't mind those effects; actually, I think motion blur is important for the coherence of the animation. Per-object motion blur is amazing. Camera motion blur? Not so much.
 
it takes me 6 beers to achieve motion blur IRL
6 beers actually turns the IRL motion blur setting from 50 to 70+. When you are dead sober and turn your head without blinking, you have motion blur. It's virtually imperceptible due to you being so used to it.
 
3D coders always ask, "how do I make this scene look like it was shot with a vintage camera?" Never, "why would I want to make this scene look like it was shot with a vintage camera?"
 
6 beers actually turns the IRL motion blur setting from 50 to 70+. When you are dead sober and turn your head without blinking, you have motion blur. It's virtually imperceptible due to you being so used to it.
Not true in most cases. When you are moving IRL, your eyes are almost always fixed on a point, and whatever you are staring at is not blurry. People are not cameras; their eyes can move independently of the direction their heads are facing.
 
It's possible, especially if the device is outputting from the same native resolution via downscaling, i.e. your PS5 downscaling 4K to 1080p, then your TV scaling it back up again. I noticed that this produced a superior image on my Sony TVs, although the reason I did it was to retain 4:4:4 chroma in HDR games back when the TV and PS4 Pro were both limited to HDMI 2.0.

Your TV simply does more processing when it has to upscale than the PC or PlayStation does.

Yes, exactly. I don't dislike the 4K image; it's just that the downscale-then-upscale, combined with the pixel density of 4K, gives the image zero jaggies, and I think the TV adds a nice sharpening filter. Plus I was getting picture dropouts on a long 15m HDMI cable, so win-win.

I know it's not the purist's way, but if it looks good to me, that's all that counts.
 