
Anisotropic Filtering in 2016

Muzicfreq

Banned
After reading an article on PS4 Pro enhancements for Horizon Zero Dawn, I ran into a bullet point that just made me wonder:

Just how hard is this crap to implement on consoles for devs?
16x anisotropic filtering is super cheap and a great step up for IQ. I have been using it on even low-end cards with at most a 1-2 fps difference.

There will be shadow maps and anisotropic filtering quality enhancements. This will increase the quality of texture sampling, resulting in more detailed environment textures.

It's 2016... This shouldn't be a thing. If anything, even 8x is fine.

I'm sorry, but this is getting old, and consoles can pump this out more easily now than ever.
 

Flandy

Member
Have we ever had an explanation as to why PS4 games don't tend to have it?
On my PC I just force 16xAF at the driver level since it basically costs nothing
 

Easy_D

never left the stone age
AF has been a must-have for me on PC since the early 2000s; it makes no sense that it's not a standard, yeah. It should have been standard since last gen at the very least.
 

jelly

Member
I guess console parts just aren't like-for-like with PC. Maybe it's the CPU, maybe it's the GPU, maybe they need to use every bit of juice for other things. Consoles do punch above their weight, so it wouldn't surprise me if they had nothing left for AF.

What amazes me, though, is that Rise of the Tomb Raider on Xbox 360 had better AF than the Xbox One and PS4 versions!

If devs could turn on 16xAF as easily as PCs do, I think they would have done it, so there must be a performance penalty that isn't worth it.
 

Jonnax

Member
I remember someone explaining it was due to the single memory bus on the consoles, which has to handle both CPU and GPU.

I remember a graph showing that the PS4's effective memory bandwidth drops the more of it you allocate to the CPU.

I'm by no means an expert in graphics, but I'll try to find the chart.

I think it was this:
GkqNjpH.jpg
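The graph's point can be sketched with a toy model. The 176 GB/s figure is the PS4's theoretical GDDR5 bandwidth; the contention penalty is a made-up illustrative number, chosen only to show the shape of the curve (CPU traffic breaking up efficient GPU burst accesses costs more than its raw share):

```python
# Illustrative (made-up) model of shared-bus contention on a unified-memory
# console: every GB/s the CPU pulls costs the GPU more than 1 GB/s of its
# effective bandwidth, because CPU accesses interrupt GPU burst transfers.
TOTAL_BW_GBS = 176.0        # PS4's theoretical GDDR5 bandwidth
CONTENTION_PENALTY = 2.5    # hypothetical: GPU GB/s lost per CPU GB/s used

def gpu_bandwidth(cpu_gbs: float) -> float:
    """Effective GPU bandwidth left once the CPU is using cpu_gbs GB/s."""
    return max(0.0, TOTAL_BW_GBS - CONTENTION_PENALTY * cpu_gbs)

if __name__ == "__main__":
    for cpu in (0, 5, 10, 20):
        print(f"CPU {cpu:>2} GB/s -> GPU {gpu_bandwidth(cpu):6.1f} GB/s")
```

On a PC this pressure doesn't exist in the same way, since the GPU has its own dedicated VRAM pool.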
 

Durante

Member
My AF story is that I bought a PS3 around its launch thinking that now, finally, all games will have decent anisotropic texture filtering, after I had enjoyed that for at least half a decade on PC already.

Now here we are one decade later and it's still not a given.
 

Muzicfreq

Banned
I guess console parts just aren't like-for-like with PC. Maybe it's the CPU, maybe it's the GPU, maybe they need to use every bit of juice for other things. Consoles do punch above their weight, so it wouldn't surprise me if they had nothing left for AF.

What amazes me, though, is that Rise of the Tomb Raider on Xbox 360 had better AF than the Xbox One and PS4 versions!

The problem with this argument is that they're closer to PCs now than they've ever been. Not only that, but some companies even claim their games can run above 30fps but lock it down... Well, they could put some of that power toward better AF.
 

Easy_D

never left the stone age
I guess console parts just aren't like-for-like with PC. Maybe it's the CPU, maybe it's the GPU, maybe they need to use every bit of juice for other things. Consoles do punch above their weight, so it wouldn't surprise me if they had nothing left for AF.

What amazes me, though, is that Rise of the Tomb Raider on Xbox 360 had better AF than the Xbox One and PS4 versions!

If devs could turn on 16xAF as easily as PCs do, I think they would have done it, so there must be a performance penalty that isn't worth it.

What use are those hi-res textures when they turn into a blurry mess a couple meters away from you? Like, tone shit down, get 16x AF in there, bam, all your textures now look incredible, it's a huge improvement.
 

Cmerrill

You don't need to be empathetic towards me.
8x-16x AF should be standard on every game, as should 60fps and 1080p. Screw the extra bells and whistles.

Games age much better when these kinds of practices are followed. Unfortunately, we get sub-1080p, sub-30fps, and sometimes no AF.
 

enemy2k

Member
Yeah. You'd think 16x AF would be stock on every console by now. What's the reason it isn't? Wish a legit dev would chime in.
 

jelly

Member
My AF story is that I bought a PS3 around its launch thinking that now, finally, all games will have decent anisotropic texture filtering, after I had enjoyed that for at least half a decade on PC already.

Now here we are one decade later and it's still not a given.

I always think the new gen will have no pixelated shadows. Disappointed every time. That is an expensive effect, though.
 

burgerdog

Member
What use are those hi-res textures when they turn into a blurry mess a couple meters away from you? Like, tone shit down, get 16x AF in there, bam, all your textures now look incredible, it's a huge improvement.

This right here. Games that use trilinear or 2xAF look so bad a few feet away from the player.
 

-griffy-

Banned
I thought this wasn't gonna be a thing anymore when I got a PS3. I thought surely now, finally, we could get decent AF on consoles too, since this is such an important part of image quality. I mean, I first really experienced what texture filtering was and how it affects textures with the first Half-Life game! The stair-stepping of the metal walkways with plain mipmapping, vs. the smoothness of trilinear filtering, vs. the sharpness of proper anisotropic filtering. In 2016, why bother with such high-resolution textures if you are just going to let them turn to blurry shit 5 feet away from the camera? You are practically disrespecting your artists' work at that point.

We've had threads about this before, though, and I believe it was suggested to be a bandwidth issue with console architecture vs. PCs, where the setting is effectively free at this point.
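For anyone wondering what "anisotropic" actually buys over trilinear: here's a rough sketch of how a GPU derives the number of taps and the mip level from the screen-space texture-coordinate derivatives, loosely following the approximation in the EXT_texture_filter_anisotropic spec (the function name and the epsilon guard are my own):

```python
import math

def sample_footprint(ddx, ddy, max_aniso=16):
    """Sketch of GPU mip/anisotropy selection from UV derivatives.

    ddx/ddy are the (du, dv) texel-space derivatives along screen x and y.
    Returns (taps, lod): taps along the footprint's long axis, and the mip
    level actually sampled. Trilinear is the max_aniso=1 special case.
    """
    p_x = math.hypot(*ddx)                         # footprint extent along x
    p_y = math.hypot(*ddy)                         # footprint extent along y
    p_max = max(p_x, p_y)
    p_min = max(min(p_x, p_y), 1e-9)               # avoid division by zero
    taps = min(math.ceil(p_max / p_min), max_aniso)
    lod = math.log2(max(p_max / taps, 1e-9))       # sharper mip than isotropic
    return taps, lod

if __name__ == "__main__":
    # A floor seen at a grazing angle: footprint 8x longer than it is wide.
    # AF takes 8 taps at mip 0; trilinear would blur by sampling mip 3.
    print(sample_footprint((1, 0), (0, 8)))
```

The extra taps are why grazing-angle surfaces (floors, roads) stay sharp with AF and smear without it: isotropic filtering has to pick a mip blurry enough to cover the footprint's long axis.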
 

Muzicfreq

Banned
This right here. Games that use trilinear or 2xAF look so bad a few feet away from the player.

Though I will admit one thing about this: in areas where there's grass and other plants, you don't notice it unless you look specifically for it. But once you get out of those areas and it's flat ground... boy, does it show.

If this is a bandwidth issue, maybe find something visually minor to take out.
There are things devs add to games now that tank performance even though the effect is so small most won't notice.

Crysis 3 is a good example: going from High to Very High grants nearly nothing except a huge loss in performance on a 750 Ti.
 

Lathentar

Looking for Pants
The problem with this argument is that they're the closest to PCs now than they ever have been in the past. Not only this but some companies even claim their games can run higher than 30fps but lock it down... Well they could use some of that power to toss at better AF

Yet they still have a unified pool of memory, which is the problem, unlike your PC, which has memory for the CPU and memory for the GPU.
 

Muzicfreq

Banned
Yet they still have a unified pool of memory, which is the problem, unlike your PC, which has memory for the CPU and memory for the GPU.

Okay, but if that is the issue, how is it not so with the PS4 Pro? They're adding 1GB (iirc) for the OS in the background, but games still run on the unified pool.

PS4 Pro players who possess a standard 1080p HDTV will still be able to get “far better image quality.” This will be done by supersampling, a high-quality anti-aliasing technique that lets the game internally render at a higher resolution (close to 4K) before shrinking it down to the final 1080p resolution. This will result in more detail, including smoother edges, and a more stable image.

SSAA is far more bandwidth-demanding, yet somehow it isn't an issue alongside anisotropic filtering.
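To put rough numbers on that comparison, here's a back-of-envelope sketch (all figures illustrative, not measurements) of why the two costs scale differently: supersampling multiplies the work for every pixel, while AF only adds texture taps on the fraction of pixels covering grazing-angle surfaces:

```python
# Back-of-envelope comparison of SSAA vs AF cost scaling.
# Illustrative numbers only: the 1.8x scale approximates a near-4K internal
# render downsampled to 1080p; the AF figures are hypothetical averages.

def ssaa_pixel_cost(scale: float) -> float:
    """Relative pixel workload when rendering at `scale`x per axis."""
    return scale * scale

def af_tap_cost(avg_extra_taps: float, anisotropic_fraction: float) -> float:
    """Relative texture-tap workload if only a fraction of pixels need
    extra anisotropic taps (head-on surfaces degrade to ordinary filtering)."""
    return 1.0 + avg_extra_taps * anisotropic_fraction

if __name__ == "__main__":
    print(ssaa_pixel_cost(1.8))    # every pixel shaded and written ~3.24x
    print(af_tap_cost(4.0, 0.15))  # texture taps up ~1.6x, nothing else grows
```

So the comparison isn't apples to apples: a console that budgets for supersampling is paying across the whole pipeline, whereas AF's cost is concentrated in the texture units and their bandwidth.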
 

Locuza

Member
[...]
16x Anisotropic filtering is super cheap and a great step up for IQ. I have been using this on even low end cards with at most 1-2 fps difference.
[...]
Why I can't hold all this super cheap 16x AF:
supercheapd4xqo.jpg

https://www.computerbase.de/2011-12/test-amd-radeon-hd-7970/7/

There are other examples with far less performance difference, but if I remember correctly, AF is very costly on Radeons in the recent Tomb Raider title.
In addition, 16x is simply wasteful: you never need to force 16x globally, and in many cases you won't even see a difference compared to 4x.

Just look at Rise of the Tomb Raider or Fallout 4:
http://images.nvidia.com/geforce-com/international/comparisons/rise-of-the-tomb-raider/alt/rise-of-the-tomb-raider-anisotropic-filtering-interactive-comparison-001-16x-vs-4x.html?ClickID=ca7pakwszwaqfsffsvwqqvipai7lelelvnsn

http://images.nvidia.com/geforce-com/international/comparisons/fallout-4/fallout-4-anisotropic-filtering-interactive-comparison-001-16x-vs-4x.html?ClickID=dkzckx0mh0kwnmnnm20ww2rckrzobobo2ymy
 

Widge

Member
Wasn't there a trend this gen where the XB1 would have AF but the PS4 wouldn't? Is there something up with Sony architecturally?
 

Muzicfreq

Banned
Why I can't hold all this super cheap 16x AF:
supercheapd4xqo.jpg

https://www.computerbase.de/2011-12/test-amd-radeon-hd-7970/7/

There are other examples with far less performance difference, but if I remember correctly, AF is very costly on Radeons in the recent Tomb Raider title.
In addition, 16x is simply wasteful: you never need to force 16x globally, and in many cases you won't even see a difference compared to 4x.

Just look at Rise of the Tomb Raider or Fallout 4:
http://images.nvidia.com/geforce-com/international/comparisons/rise-of-the-tomb-raider/alt/rise-of-the-tomb-raider-anisotropic-filtering-interactive-comparison-001-16x-vs-4x.html?ClickID=ca7pakwszwaqfsffsvwqqvipai7lelelvnsn

http://images.nvidia.com/geforce-com/international/comparisons/fallout-4/fallout-4-anisotropic-filtering-interactive-comparison-001-16x-vs-4x.html?ClickID=dkzckx0mh0kwnmnnm20ww2rckrzobobo2ymy

Can I ask why the FO4 one has 2 different foliage placements? o_O?
 

ymgve

Member
Why I can't hold all this super cheap 16x AF:
supercheapd4xqo.jpg

https://www.computerbase.de/2011-12/test-amd-radeon-hd-7970/7/

There are other examples with far less performance difference, but if I remember correctly, AF is very costly on Radeons in the recent Tomb Raider title.
In addition, 16x is simply wasteful: you never need to force 16x globally, and in many cases you won't even see a difference compared to 4x.

Just look at Rise of the Tomb Raider or Fallout 4:
http://images.nvidia.com/geforce-com/international/comparisons/rise-of-the-tomb-raider/alt/rise-of-the-tomb-raider-anisotropic-filtering-interactive-comparison-001-16x-vs-4x.html?ClickID=ca7pakwszwaqfsffsvwqqvipai7lelelvnsn

http://images.nvidia.com/geforce-com/international/comparisons/fallout-4/fallout-4-anisotropic-filtering-interactive-comparison-001-16x-vs-4x.html?ClickID=dkzckx0mh0kwnmnnm20ww2rckrzobobo2ymy

Do you have anything about how AF impacts video cards that isn't from a five-year-old article?
 

MDave

Member
Imagine if the Switch had 16xAF by default: the most unlikely place to finally see it as standard on consoles.
 

Sakujou

Banned
MS tried to establish 4xAA on consoles; everyone failed.

I love Microsoft for driving the industry to the next level. It's just sad when devs decide not to do this.

Same goes for AF.

I don't care about graphics anymore these days. If the game is 60fps with no dips, I'm happy.

Games like Ikaruga or Radiant Silvergun need slowdown, but that's on purpose.
 
Why I can't hold all this super cheap 16x AF:
supercheapd4xqo.jpg

https://www.computerbase.de/2011-12/test-amd-radeon-hd-7970/7/

There are other examples with far less performance difference, but if I remember correctly, AF is very costly on Radeons in the recent Tomb Raider title.
In addition, 16x is simply wasteful: you never need to force 16x globally, and in many cases you won't even see a difference compared to 4x.

Just look at Rise of the Tomb Raider or Fallout 4:
http://images.nvidia.com/geforce-com/international/comparisons/rise-of-the-tomb-raider/alt/rise-of-the-tomb-raider-anisotropic-filtering-interactive-comparison-001-16x-vs-4x.html?ClickID=ca7pakwszwaqfsffsvwqqvipai7lelelvnsn

http://images.nvidia.com/geforce-com/international/comparisons/fallout-4/fallout-4-anisotropic-filtering-interactive-comparison-001-16x-vs-4x.html?ClickID=dkzckx0mh0kwnmnnm20ww2rckrzobobo2ymy


You realize those percentages could amount to half a frame or something, right? I haven't seen a difference in performance with AF on since the early 2000s.
 

Ombala

Member
I don't tend to think about it a lot when playing on console. Maybe because I sit a lot further away from the TV than from my PC screen?
 

burgerdog

Member
AF 16x
bf1_2016_11_02_18_18_01usk.png


AF Off
bf1_2016_11_02_18_18_3uu6n.png


Not very meaningful impact on performance, less than 1 FPS.

Both of those shots look exactly the same. With AF off, the textures would look super blurry a few feet away from the player. You might have 16xAF turned on at the driver level (like everyone should).
 

Durante

Member
Even if high-quality AF does cost 15% of performance in some particular scenario, as far as I am concerned it's way more worth it than many other effects games waste performance on these days. Of course, holding such views is why I primarily play on PC.
 

Locuza

Member
Can I ask why the FO4 one have 2 different foliage placements? o_O?
Some games place parts of their foliage/assets randomly; don't ask me why.

Do you have anything about how AF impacts video cards that isn't from a five-year-old article?
Unfortunately, ComputerBase stopped doing AA/AF scaling tests in recent years, and I don't have a good, recent source on the matter.

But there are some simple takeaways: the TMU design is more or less the same between GCN Gen 1 and Gen 2.
If you are a console dev, you will have to pay quite an amount of performance if you want to use 16x globally.
Instead of literally wasting resources, when most players are sitting in front of a TV playing the games in motion, you are better off tweaking the AF settings per texture and only using higher AF where it really matters.
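In spirit, the per-texture tuning described above could look something like this; the categories and levels below are entirely hypothetical, but the principle is to spend taps where grazing angles are common (floors, roads) and skip them where they aren't:

```python
# Hypothetical per-texture AF policy of the kind a console dev might ship.
# All category names and levels are made up for illustration.
AF_POLICY = {
    "terrain":  16,  # large, near-horizontal surfaces: worst-case anisotropy
    "road":     16,
    "wall":      4,  # mostly viewed head-on; 4x is often indistinguishable
    "prop":      4,
    "particle":  1,  # effects gain nothing from AF
    "ui":        1,
}

def af_level(category: str) -> int:
    """AF level for a texture category, defaulting to a modest 4x."""
    return AF_POLICY.get(category, 4)

if __name__ == "__main__":
    for cat in ("terrain", "wall", "ui"):
        print(cat, "->", af_level(cat), "x AF")
```

A global driver override like PC players use is the opposite of this: it forces the worst-case level on every sampler, which is exactly the "wasteful" scenario Locuza is pointing at.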
 

-griffy-

Banned
Wasn't there a trend this gen where the XB1 would have AF but the PS4 wouldn't? Is there something up with Sony architecturally?

No, because most games have the exact same AF on both platforms (and it's usually pretty subpar on both, even in cases where the XO is better), several of the games that had worse AF on PS4 were patched to fix it and bring it in line with XO, and several games have better AF on PS4.
 

antyk

Member
AF 16x
bf1_2016_11_02_18_18_01usk.png


AF Off
bf1_2016_11_02_18_18_3uu6n.png


Not very meaningful impact on performance, less than 1 FPS.

Are you sure the 2nd image is AF Off? They're hardly different.

Anyway, I think it all comes down to three things:
1) shared bandwidth to the RAM from GPU and CPU (the graph posted above)
2) very careful optimising of resources on consoles: on PC you're mostly above 60fps anyway, so losing 1-2 frames by forcing 16x AF isn't a big deal; on consoles, when you target 30fps, losing 1 or 2fps is already noticeable (and the game becomes an "unstable mess coded by lazy devs" in DF threads ;)), so the most you usually see is 4x AF, which is passable
3) on PS4 the process of enabling it seems (seemed?) to be particularly difficult, as some games shipped without AF altogether when it was present in the X1 version.

So yeah, not an ideal situation :)
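The frame-budget point in (2) is just arithmetic: the same fps drop costs far more frame time at 30 fps than at 60 fps, and a console's budget is a hard ceiling:

```python
# Why "losing 1-2 fps" is not one fixed cost: converting fps deltas into
# milliseconds of frame time shows the asymmetry between 30 and 60 fps.

def frame_time_ms(fps: float) -> float:
    """Milliseconds available per frame at a given frame rate."""
    return 1000.0 / fps

def cost_of_fps_drop(base_fps: float, drop: float) -> float:
    """Extra milliseconds per frame when fps falls from base_fps by drop."""
    return frame_time_ms(base_fps - drop) - frame_time_ms(base_fps)

if __name__ == "__main__":
    print(round(cost_of_fps_drop(60, 2), 2))  # ~0.57 ms: noise on a PC
    print(round(cost_of_fps_drop(30, 2), 2))  # ~2.38 ms out of a 33.3 ms budget
```

So a setting a PC player shrugs off can eat roughly 7% of a 30 fps console's entire frame budget, which is why devs settle for 4x or skip it.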
 
I wonder if the regular PS4 has some hardware issue with some forms of AF.

Like Sony overlooked something when making the damn thing.
 

-griffy-

Banned
I wonder if the regular ps4 has some hardware issue with AF.

Like Sony overlooked something when making the damn thing.

No, because most games have the exact same AF on both platforms (and it's usually pretty subpar on both, even in cases where the XO is better), several of the games that had worse AF on PS4 were patched to fix it and bring it in line with XO, and several games have better AF on PS4.
 
I thought this was gonna not be a thing anymore when I got a PS3. I thought surely now, finally, we can get decent AF on consoles too since this is such an important part of image quality.

I thought pop-ups would be gone once we hit the PS2/GC/Xbox era. They were. Then they came back with a vengeance.
 

Tovarisc

Member
Both of those shots look exactly the same. With AF off, the textures would look super blurry a few feet away from the player. You might have 16xAF turned on at the driver level (like everyone should).
I don't see a single difference in AF in your post.
AF is being forced somewhere, because texture quality looks near identical in those shots. The AF Off shot should look much worse.
Is that the same picture?

Looking on my phone though but it doesn't look different.
are you sure that's AF off? cuz it just looks like AF 8x

i opened each image in a new tab and they looked almost identical

Yeah, I messed up. I had forgotten that I forced it through the driver, so I redid the shots after removing the driver forcing.

Setting being changed between Low and Ultra:

Low;
bf1_2016_11_02_18_41_18uy6.png


Ultra;
bf1_2016_11_02_18_41_owudl.png
 

Tovarisc

Member
Er, I think you've messed up again. Both those shots have exactly the same level of anisotropic filtering.

A) I didn't upload or link the same picture twice.

B) I changed only the AF value in-game.

C) In the drivers, all AF forcing was disabled and set to let the application enforce the AF setting.

Not sure what else I need to do.
 