Fuck you, CA rocks like a motherfucking boss.
[quote]I actually don't mind chromatic aberration in bloodborne.[/quote]
I love it - it gives this CRT/VHS feeling in games with realistic graphics.
Actually I like it in scary games.
[quote]Games should generally prioritize sharpness over lack of jaggies, but the industry has largely spoken. Well, "AAA" western devs anyway. Thankfully some TAA has gotten better, we have things like DLSS and that's getting better as well, and we are starting to hit higher resolutions like 1440p, which can help fix the blur TAA creates. And yes, I thought TXAA looked like crap on everything. Everything, including game design, just seems to want to ape Hollywood's filmic style.[/quote]
I was actually going to say that TXAA looked really good in some games and was the first big improvement in anti-aliasing, then I saw the rest of your message mentioning TXAA lol. Maybe you tried it in games where it wasn't very well implemented? I remember it being really good in Assassin's Creed 3 and Black Flag - it didn't feel blurry at all and gave a very clean-looking image. In some other games like ArcheAge, however, it was absolutely awful. But the tech, when used properly, looked really nice.
The best anti-aliasing I have ever seen, though, is SGSSAA. It was a very specific thing you could tweak through Nvidia Inspector, giving a perfect, CGI-like image quality, but it was very demanding, it needed MSAA to work, and it only worked in DX9 games (maybe some rare DX11 ones).
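For anyone who never played with it: SGSSAA is essentially brute-force supersampling - shade many sub-pixel samples, average them down. A minimal numpy sketch of that core idea (the "scene" is a toy diagonal edge I made up, and I use a plain regular sub-pixel grid rather than the sparse/rotated sample patterns real SGSSAA ties to the MSAA pattern):

```python
import numpy as np

def scene(x, y):
    """Toy stand-in for a renderer: white above a diagonal edge,
    black below, evaluated at an exact sample position."""
    return (y > 0.7 * x).astype(float)

def render(width, height, samples_per_axis=1):
    """Supersample with an n x n grid of sub-pixel samples and
    average them down. (Real SGSSAA uses sparse/rotated sample
    positions; a regular grid keeps the sketch simple.)"""
    n = samples_per_axis
    offsets = (np.arange(n) + 0.5) / n  # sample positions inside a pixel
    ys, xs = np.mgrid[0:height, 0:width].astype(float)
    img = np.zeros((height, width))
    for oy in offsets:
        for ox in offsets:
            img += scene(xs + ox, ys + oy)
    return img / (n * n)

aliased = render(64, 64, samples_per_axis=1)  # 1 spp: hard stair-step edge
smooth = render(64, 64, samples_per_axis=4)   # 16 spp: gradient along the edge
```

With `samples_per_axis=4` you shade 16x the pixels per frame, which is exactly why it was so demanding.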
[quote]Games should generally prioritize sharpness over lack of jaggies, but the industry has largely spoken. Well, "AAA" western devs anyway. Thankfully some TAA has gotten better, we have things like DLSS and that's getting better as well, and we are starting to hit higher resolutions like 1440p, which can help fix the blur TAA creates. And yes, I thought TXAA looked like crap on everything. Everything, including game design, just seems to want to ape Hollywood's filmic style.

The biggest problem with most TAA, even Unreal 4's, is that at 1080p it just destroys sharpness and looks even less sharp than native 720p with no AA. That is just stupid and a waste of pixels. At 1440p it's much sharper, but again, less sharp than plain old 1080p.

The second problem is ghosting, which again destroys clarity.

So yeah, I like sharpness, and TAA is its sworn enemy.[/quote]
I don't agree. I hate jaggies and especially shimmering. Even TAA is not enough most of the time. DLSS Quality is the only solution that is close to ground truth for now.
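For anyone wondering why TAA trades sharpness for stability: it blends each new frame into a running history buffer. A deliberately minimal 1-D sketch with invented toy data (real TAA also jitters the camera and reprojects/clamps the history - ghosting is largely what you see when that clamping fails, which this toy version skips entirely):

```python
import numpy as np

def taa_accumulate(frames, alpha=0.2):
    """Minimal sketch of TAA-style temporal accumulation: an
    exponential moving average over frames. Low alpha = stable but
    soft; stale history = ghost trails behind moving objects."""
    history = frames[0].astype(float)
    for frame in frames[1:]:
        history = (1 - alpha) * history + alpha * frame
    return history

# A bright 1-pixel object moves one pixel per frame over black.
frames = []
for t in range(8):
    f = np.zeros(16)
    f[t] = 1.0
    frames.append(f)

out = taa_accumulate(frames, alpha=0.2)
```

The object's current position only ever reaches `alpha` of full brightness (the softness complaint), while its old positions still carry decaying energy (the ghost trail).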
[quote]Latest example is Metroid Dread - the film grain is really bad and it can't be turned off. Like, WTF, at least give us an option to turn it off.[/quote]
Film grain is only in the areas where the EMMI is still alive. Kill the EMMI and the grain is gone.
[quote]I don't agree. I hate jaggies and especially shimmering. Even TAA is not enough most of the time. DLSS Quality is the only solution that is close to ground truth for now.[/quote]
Surely you run everything at native 4K on your PC? I would never use TAA on PC if I have a choice. Try Witcher 2 at 4K with ubersampling and tell me if you still want TAA lol.
I'll take slight blur for that postcard-clean look.
[quote]Whatever it was that Nintendo used in Twilight Princess can fuck off. Played that on an emulator recently with the bloom turned off.. muuuuch better.[/quote]
This is just sacrilege.
[quote]Surely you run everything at native 4K on your PC? I would never use TAA on PC if I have a choice. Try Witcher 2 at 4K with ubersampling and tell me if you still want TAA lol.[/quote]
Native 4K is jaggy as fuck. Ubersampling is 8K downsampled to 4K.
Also, we should ask why TC lumped motion blur in the same category as CA, which is basically hemorrhoids for the eyes and is absolutely never good.
Games should generally prioritize sharpness over lack of jaggies, but the industry has largely spoken. Well, "AAA" western devs anyway. Thankfully some TAA has gotten better, we have things like DLSS and that's getting better as well, and we are starting to hit higher resolutions like 1440p, which can help fix the blur TAA creates. And yes, I thought TXAA looked like crap on everything. Everything, including game design, just seems to want to ape Hollywood's filmic style.

The biggest problem with most TAA, even Unreal 4's, is that at 1080p it just destroys sharpness and looks even less sharp than native 720p with no AA. That is just stupid and a waste of pixels. At 1440p it's much sharper, but again, less sharp than plain old 1080p.

The second problem is ghosting, which again destroys clarity.

So yeah, I like sharpness, and TAA is its sworn enemy.
it takes me 6 beers to achieve motion blur IRL
[quote]30fps games need blur[/quote]
They don't. They are blurry by definition on modern TVs. Everything is blurry on modern TVs.
[quote]it takes me 6 beers to achieve motion blur IRL[/quote]
Best post.
[quote]it takes me 6 beers to achieve motion blur IRL[/quote]
I only need 5 strong IPAs.
[quote]You need motion blur at 30fps on OLED. At 60fps, no. But *some* per-object motion blur effects can still look cool.[/quote]
I don't know, but can I be honest with you? I disliked motion blur in HZD and Spider-Man when I played those games on PS4 - I liked the image more without it. The blurring of the entire scene when you move the camera was too unsettling for me, and I wasn't bothered by motion judder. Maybe my monitor handled the judder well, or something like that.
[quote]I have come to some bizarre way of getting a sharp picture with no jaggies on my big 75-inch TV.

Basically the 4K image on my Philips Ambilight is pretty soft. 1440p looks even blurrier.

I noticed on my cable TV box that 1080p looks a lot sharper - the TV must be doing some processing and sharpening.

So now I output my PC and PS5 at 1080p to the TV and use DRS factors to downscale from 4K or 1440p (the PS5 downscales automatically).

The result is a really sharp picture with little to no jaggies. It's bizarre, like I said, but the image is so much more pleasing to me than outputting at 4K.[/quote]
Damn, something is definitely wrong with your TV lol. Play with your settings, try game mode, etc. - no way a 1080p image should look better than a 4K image. On some TVs there is less image processing at 1080p than at 4K, so that could be what's happening here, or your TV is faulty.
Do people actually like motion blur?
[quote]CGI is much blurrier than video games typically.[/quote]
Hm? What you consider sharpness is probably false sharpness, because aliasing creates fake detail. I personally don't see this as positive sharpness, and CGI / pre-rendered footage is usually the definition of perfect image quality. It might seem "blurrier" if you're used to something over-sharpened, but it's a more natural look for sure.
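The "aliasing creates fake detail" point is easy to demonstrate: point-sampling a pattern finer than the pixel grid (beyond the Nyquist limit) manufactures a bold low-frequency pattern that isn't in the scene, while averaging many sub-samples per pixel - effectively what supersampled or pre-rendered CGI does - resolves it to near-flat. A numpy sketch with made-up numbers:

```python
import numpy as np

# Stripe pattern with a period of 0.8 px - finer than one pixel,
# so the pixel grid cannot represent it (Nyquist needs >= 2 px/period).
period = 0.8
def signal(x):
    return np.sin(2 * np.pi * x / period)

xs = np.arange(64, dtype=float)

# One point sample per pixel: aliasing invents a slow, high-contrast
# false pattern that reads as crisp "detail" that isn't really there.
point_sampled = signal(xs)

# 64 averaged sub-samples per pixel: the unrepresentable stripes
# correctly average out toward flat gray.
subs = (np.arange(64) + 0.5) / 64
supersampled = np.array([signal(x + subs).mean() for x in xs])
```

`point_sampled` swings at nearly full contrast even though the real stripes are too fine to see; `supersampled` stays close to zero, i.e. the "softer" image is the truthful one.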
Here's a video of it on and off. I think the off wins.
[quote]I have come to some bizarre way of getting a sharp picture with no jaggies on my big 75-inch TV.

Basically the 4K image on my Philips Ambilight is pretty soft. 1440p looks even blurrier.

I noticed on my cable TV box that 1080p looks a lot sharper - the TV must be doing some processing and sharpening.

So now I output my PC and PS5 at 1080p to the TV and use DRS factors to downscale from 4K or 1440p (the PS5 downscales automatically).

The result is a really sharp picture with little to no jaggies. It's bizarre, like I said, but the image is so much more pleasing to me than outputting at 4K.[/quote]
It's possible - especially if the device is outputting at native resolution via downscaling, i.e. your PS5 downscaling 4K to 1080p and then your TV scaling that back up again. I noticed on my Sony TVs that this produced a superior image, although the reason I did it was to retain 4:4:4 chroma in HDR games when the TV and PS4 Pro were both limited to HDMI 2.0.
[quote]I don't know, but can I be honest with you? I disliked motion blur in HZD and Spider-Man when I played those games on PS4 - I liked the image more without it. The blurring of the entire scene when you move the camera was too unsettling for me, and I wasn't bothered by motion judder. Maybe my monitor handled the judder well, or something like that. I wonder if I'm alone in this or not. Then again, you said OLED - why OLED specifically?[/quote]
I turn off camera-based motion blur on my X900E LCD even at 30fps.
[quote]I turn off camera-based motion blur on my X900E LCD even at 30fps.[/quote]
Yes, I think what you say is true - LED screens create enough natural blur that I don't even need additional motion blur.
The difference on OLED is that the response time is so fast that the TV is basically holding each frame on screen, unchanged, for the full frame interval, resulting in a lot of judder at just 30fps - so you need motion blur to help smooth that out. I turn off camera-based motion blur on OLED at 60fps.
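The sample-and-hold arithmetic behind that judder is straightforward. A back-of-the-envelope sketch (the smear estimate is the standard rough pan-speed x hold-time approximation, and the pan speed is an invented example number):

```python
def hold_time_ms(fps):
    """A sample-and-hold display (fast OLED, most LCDs) keeps each
    frame on screen for the full frame interval."""
    return 1000.0 / fps

def tracking_smear_px(pan_speed_px_per_s, fps):
    """While the eye tracks a moving object, a statically held frame
    smears across the retina by roughly pan speed x hold time."""
    return pan_speed_px_per_s / fps

print(hold_time_ms(30))   # ~33.3 ms per frame at 30fps
print(hold_time_ms(60))   # ~16.7 ms per frame at 60fps
print(tracking_smear_px(600, 30))  # 20 px of perceived smear per frame
```

At 30fps each discrete frame sits on screen twice as long as at 60fps, which is what reads as judder on a display too fast to blur it for you.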
[quote]They don't. They are blurry by definition on modern TVs. Everything is blurry on modern TVs.[/quote]
Well, they need blur on PC monitors, that's for sure.
[quote]Well, they need blur on PC monitors, that's for sure.[/quote]
Current PC monitors use the same technology as TVs. You get motion blur on monitors just as much as on TVs. They are all LCD/LED/OLED and all prone to the issue.
[quote]it takes me 6 beers to achieve motion blur IRL[/quote]
6 beers actually turns the IRL motion-blur setting from 50 to 70+. When you are dead sober and turn your head without blinking, you have motion blur; it's virtually imperceptible because you're so used to it.
[quote]6 beers actually turns the IRL motion-blur setting from 50 to 70+. When you are dead sober and turn your head without blinking, you have motion blur; it's virtually imperceptible because you're so used to it.[/quote]
Not true in most cases. When you are moving IRL, your eyes are almost always fixed on a point, and what you are staring at is not blurry. People are not cameras; their eyes can move independently from the direction their heads are facing.
[quote]It's possible - especially if the device is outputting at native resolution via downscaling, i.e. your PS5 downscaling 4K to 1080p and then your TV scaling that back up again. I noticed on my Sony TVs that this produced a superior image, although the reason I did it was to retain 4:4:4 chroma in HDR games when the TV and PS4 Pro were both limited to HDMI 2.0.[/quote]
Your TV is simply doing more processing when it has to upscale vs. the PC or PlayStation.
[quote]You need motion blur at 30fps on OLED. At 60fps, no. But *some* per-object motion blur effects can still look cool.[/quote]
Don't need motion blur on my plasma TV.