
At 4K does AA become unneeded?

bomblord1

Banned
About 3 million pixels short, actually.


Then you don't know what to look for.


Aliasing is aliasing, man, however it's caused. You can't just say, "Oh, those jaggies don't count."

It's a static image created to be viewed on a 480p screen. If I were to extract the actual texture file, it would be tiny (like, super tiny), and it's also not a vector image that can be scaled.

It can't be blown up without aliasing. Put that image on a 32K monitor and it would still have aliasing.
 

ref

Member
It depends how tolerant you are to aliasing in my opinion.

At 1440p, I'm alright with only SMAA. It's not 100% cleared up, but the only time I notice it is if I'm standing still. By that account, I imagine I'd be fine with 4K with no AA.

At 1080p I could not play games without 4x MSAA and some form of post-process AA. That was about equivalent to 1440p with SMAA.
 

tuxfool

Banned
It depends how tolerant you are to aliasing in my opinion.

At 1440p, I'm alright with only SMAA. It's not 100% cleared up, but the only time I notice it is if I'm standing still. By that account, I imagine I'd be fine with 4K with no AA.

You're obviously assuming the right conditions though, like distance from the screen and PPI.
 
I agree; that's why it crossed my mind that 4K on an "average"-sized monitor at an "average" sitting distance may not need AA at all.

Maybe I didn't express myself clearly. I wanted to say that your Dolphin shot (rendered at 2560x2112 with 2x MSAA) has less edge aliasing than it would've if it was rendered at 4K without any AA. And I think the aliasing is pretty noticeable in your shots, at least in the second one; and that's not even the worst case.
 

Jinko

Member
I think 4K with no AA is perfectly fine, though there are many who are obsessed with image fidelity who would disagree.
 

bomblord1

Banned
Maybe I didn't express myself clearly. I wanted to say that your Dolphin shot (rendered at 2560x2112 with 2x MSAA) has less edge aliasing than it would've if it was rendered at 4K without any AA. And I think the aliasing is pretty noticeable in your shots, at least in the second one; and that's not even the worst case.

How is that possible? Wouldn't smaller pixels create less aliasing than "edge blurring" (I know that's an oversimplification of what Anti-aliasing does but it gets the point across)?
 

FrunkQ

Neo Member
The higher your pixel density, the less you need AA... but you will probably always need a layer of subtle AA to deal with the weird interference you get from overlaid regular patterns and transparencies, which seems to happen a lot in games with chain-link fences etc.
 

tuxfool

Banned
How is that possible? Wouldn't smaller pixels create less aliasing than "edge blurring" (I know that's an oversimplification)?

Blurring reduces high-frequency information, hence reducing the necessity for anti-aliasing. You could blur things into a single color smudge, and in the process not need any AA.
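To make that concrete, here's a quick 1-D sketch (my own illustration, not anything from this thread): point-sampling a stripe pattern too fine for the pixel grid produces pure aliasing, while averaging many samples per pixel (what supersampling effectively does) removes the unresolvable high-frequency detail and leaves a smudge instead.

```python
# "Render" a fine stripe pattern at a low pixel count, once by point-sampling
# and once by averaging (a box filter), to show why removing high-frequency
# detail before sampling reduces aliasing.

def stripes(x):
    """High-frequency scene: black/white stripes, 40 cycles across [0, 1)."""
    return 1.0 if int(x * 80) % 2 == 0 else 0.0

def point_sample(pixels):
    # One sample at each pixel center: every sample happens to land on a white
    # stripe, so the fine pattern aliases into a bogus solid white.
    return [stripes((i + 0.5) / pixels) for i in range(pixels)]

def box_filtered(pixels, taps=16):
    # Average many samples per pixel (like SSAA): detail too fine to resolve
    # averages out to mid-gray instead of aliasing.
    out = []
    for i in range(pixels):
        s = sum(stripes((i + (j + 0.5) / taps) / pixels) for j in range(taps))
        out.append(s / taps)
    return out

hard = point_sample(10)   # all 1.0: the stripes aliased away entirely
soft = box_filtered(10)   # all 0.5: the "single color smudge" described above
```

The smudge carries no false detail, which is exactly the trade being discussed: blur sacrifices sharpness to avoid presenting frequencies the pixel grid can't represent.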
 

bomblord1

Banned
Blurring is reducing high frequency information, hence reducing the necessity for anti aliasing.

Shouldn't smaller pixels (a higher monitor resolution at the same screen size) create less visible aliasing than downsampling from the same resolution? After all, downsampling still leaves us with a 1080p output, meaning the pixels can very well be visible when viewed up close, where a 4K monitor resolution should mitigate that to a greater degree.
 

tuxfool

Banned
Shouldn't smaller pixels create less aliasing than downsampling from the same resolution? After all, downsampling still leaves us with a 1080p output, meaning the pixels can very well be visible when viewed up close, where a 4K monitor resolution should mitigate that to a greater degree.

Smaller pixels (and more of them in a given area) create less aliasing because the presentable bandwidth is greater. But perceptually it could be no different; it depends on your eyesight, the level of blurring, and the amount of information that needs to be resolved.
 

Damian.

Banned
Really depends on the game/environment. I downsample 4k resolutions on my 55" 1080p TV I sit 7 feet away from and the vast majority of the time, I don't need any AA at all to make jaggies virtually invisible. Alien Isolation is the only game I've run into so far that still has noticeable jaggies at a 4k downsample.
 
4K is not nearly a high enough resolution to make AA redundant. Also, using still images, especially of old games with extremely simple graphics, is not a smart way to judge the situation. As detail gets finer and poly counts continue to increase, aliasing gets worse.
 

bomblord1

Banned
4K is not nearly a high enough resolution to make AA redundant. Also, using still images, especially of old games with extremely simple graphics, is not a smart way to judge the situation. As detail gets finer and poly counts continue to increase, aliasing gets worse.

Really? According to this handy PPD calculator I just found, AA is more than likely not needed on a 22" screen @ 4K from a viewing distance of 2 feet.

Edit: I plugged in the wrong numbers. AA may only be needed at 2 feet in medium- and high-contrast areas, and is unneeded at 4.

http://phrogz.net/tmp/ScreenDens2In.html
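For reference, the pixels-per-degree (PPD) math behind calculators like the one linked can be sketched in a few lines. This is my own back-of-the-envelope version, not the linked page's actual code; the function name and the ~60 PPD threshold (20/20 vision resolving roughly one arcminute) are my assumptions.

```python
import math

def pixels_per_degree(diag_in, h_px, v_px, dist_in):
    # Panel width from its diagonal, assuming square pixels.
    width_in = diag_in * h_px / math.hypot(h_px, v_px)
    pitch_in = width_in / h_px                       # physical size of one pixel
    # Angle (in degrees) that a single pixel subtends at the viewing distance.
    deg_per_px = math.degrees(2 * math.atan(pitch_in / (2 * dist_in)))
    return 1 / deg_per_px

# The 22" 4K case from the post above, viewed from 2 feet (24 inches):
ppd_2ft = pixels_per_degree(22.0, 3840, 2160, 24.0)   # roughly 84 PPD
ppd_4ft = pixels_per_degree(22.0, 3840, 2160, 48.0)   # doubling distance roughly doubles PPD
```

Both values clear a ~60 PPD threshold, which lines up with the calculator's verdict that AA is mostly unneeded in that setup, though high-contrast edges and motion can still reveal aliasing.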
 

bomblord1

Banned
I could probably see specular and subpixel aliasing with that. Most people probably could.

Shimmer and the snap-crackle-pop of whitish pixels is pretty obvious.

I plugged in the wrong numbers. AA is needed at 2 feet in medium- and high-contrast areas but unneeded elsewhere.
 

Portugeezer

Member
No, it depends on the size of the TV and how far you sit from it. 4K doesn't add anti-aliasing; it just makes each bit of aliasing less and less of the whole picture.
 
Really? According to this handy PPD calculator I just found, AA is more than likely not needed on a 22" screen @ 4K from a viewing distance of 2 feet.

Edit: I plugged in the wrong numbers. AA may only be needed at 2 feet in medium- and high-contrast areas, and is unneeded at 4.

http://phrogz.net/tmp/ScreenDens2In.html

My own eyes and what I see tell me that's wrong. BF4 at 200% scaling on my 24-inch 1920x1200 monitor from 5 feet has massive amounts of temporal aliasing and shimmer. Adding 4xAA with 4xTrSSAA helps a lot, although even that isn't as temporally stable as my native res with TXAA.
 
I play on a 27" 4K Asus monitor and I turn off AA. The very little aliasing I get doesn't bother me at all. I'd rather get the extra FPS from not using AA. I only turn it on when I can easily hold 60 FPS and might as well max everything out.
 
This is Dark Souls 2 @ 5K downsampled with ultra SMAA. You can still see aliasing on the tree on the right. Even 8K doesn't completely eliminate it.

ds2aa.jpg
 

bomblord1

Banned
This is Dark Souls 2 @ 5K downsampled with ultra SMAA. You can still see aliasing on the tree on the right. Even 8K doesn't completely eliminate it.

That's downsampled to 1080p though. That tree's branches are literally 1 pixel wide; they're going to look aliased. Unless you increase the number of pixels, the tree will stay aliased.
 

M3d10n

Member
I can see jaggies in 1080p phone screens, so no. I'll always take AA over a higher resolution since LCD screens have an inherent "sharpen" filter to them that makes jaggies more noticeable. Also, the human eye is damn good at picking up contrasting detail.
 
Well, first, that tree's branches are only 1 pixel wide there; unless you increase pixel density, that tree will stay aliased.

Second, I'm referring to 4K monitors in the OP, not 4K and up downsampled.

You should make that clear then! You only mention resolution in the OP. I'm curious though, is there even a difference in aliasing on a 4k image downsampled to a 1080p display vs a 4k image on a 4k native display? Maybe that's a dumb question. I have no experience with 4k displays.
 

Durante

Member
You should make that clear then! You only mention resolution in the OP. I'm curious though, is there even a difference in aliasing on a 4k image downsampled to a 1080p display vs a 4k image on a 4k native display?
Absolutely (depending on your visual acuity and how much of your FoV the screen in question fills).
 

Eusis

Member
I wouldn't say unneeded, but you won't need very heavy, performance-killing forms, that's for sure.
That's my thought: you'll still need it, but you don't need to use AS much because it does get harder to notice the jaggies and so it takes less to make them effectively vanish.
 

scitek

Member
I just got a 4K TV, and no, jaggies aren't eliminated by the higher res. This is entirely dependent on the game, seating distance, etc., though I can live with no AA most of the time, to be honest.
 

bomblord1

Banned
I just got a 4K TV, and no, jaggies aren't eliminated by the higher res. This is entirely dependent on the game, seating distance, etc., though I can live with no AA most of the time, to be honest.

I guess it should be noted that the assumption is the game is rendering and outputting at 4K.
 

dr_rus

Member
This is Dark Souls 2 @ 5K downsampled with ultra SMAA. You can still see aliasing on the tree on the right. Even 8K doesn't completely eliminate it.

I wish people would stop showing 1080p images and saying that these are 4K/8K simply because they were downsampled from these resolutions. 1080p is 1080p. Even 4K will give you a much higher clarity and detail which you won't be able to get with any kind of downsampling to 1080p.

As for the OP's question, it really depends on the (perceived) size of the screen and how good your eyes are. I would say that 4K won't be enough for me on a 24" computer display.
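The point about 1080p staying 1080p can be shown with a toy example (a minimal sketch assuming a plain 2x2 box filter; real downsamplers use fancier kernels): detail a native 4K display would show gets averaged away when the output is 1080p.

```python
# Downsampling a grayscale image to half resolution with a 2x2 box filter:
# whatever a native high-res display would show crisply is averaged away.

def box_downsample_2x(img):
    """Halve each dimension by averaging 2x2 blocks of grayscale values."""
    h, w = len(img), len(img[0])
    return [[(img[y][x] + img[y][x+1] + img[y+1][x] + img[y+1][x+1]) / 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# A one-pixel-wide bright "branch" on a dark background, like the DS2 tree:
native = [[0, 255, 0, 0],
          [0, 255, 0, 0],
          [0, 255, 0, 0],
          [0, 255, 0, 0]]
small = box_downsample_2x(native)
# The crisp branch survives at native resolution, but after downsampling it
# becomes a dimmer half-gray column: the detail is discarded, not just smoothed.
```

That's the difference between downsampled-to-1080p and native 4K: the former trades the extra pixels for smoother edges, the latter actually presents them.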
 