
Anisotropic Filtering and next-gen: Why is it still absent?

In practice this is the performance hit we are talking about:

Max Payne 3:
[benchmark chart]


Borderlands 2:
[benchmark chart]


Deus Ex: Human Revolution:
[benchmark chart]


even Crysis 3:
[benchmark chart]


and Battlefield 3:
[benchmark chart]


It is absolutely negligible.

On which graphics card though?

It really depends on the circumstances. Just did a quick check in the WRC3 Demo on my GTX460. Results:

No AF: [screenshot]

16xAF: [screenshot]

That's 13% faster without AF. Doesn't mean it isn't worth it, of course, but as I said: depending on the situation it can be a noticeable performance hit.

Isn't the trilinear one more realistic?

Ehh.. no. Take a look out of the window. ;)
 

Test between something like 2x and 16x, it's ridiculous to test with no AF! Why would you do that?!
 

JNT

Member
I played Crysis 1 on a GT220, 720p and everything on low, but AF didn't make me lose more than a frame.

Playing on low would free up a lot of memory bandwidth for AF to use. If you had cranked up the detail while toggling AF on and off, you would have seen quite a big difference (in percentage terms).
 

Raist

Banned
I don't know; if that thing is essentially free (according to the benchmarks), then there's not really a reason why they shouldn't be using it. Unless console games, or console hardware, work completely differently in this regard. Or maybe they just don't care because of the emphasis on motion blur / DOF.

Someone page Tempy or something.
 
Test between something like 2x and 16x, it's ridiculous to test with no AF! Why would you do that?!

To show the maximum difference texture filtering can make in that game in that situation.

Actually though, the game doesn't have an AF slider and activated some unknown AF level on its own.

Redone benchmarks entirely with driver-forced AF:

2xAF: 60 fps
4xAF: 57-58 fps
8xAF: 55 fps
16xAF: 52 fps
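Those numbers work out to a modest, roughly linear cost per step. A quick back-of-envelope check, taking 57.5 fps as the midpoint of the reported 57-58 (purely illustrative):

```c
/* Back-of-envelope check on the driver-forced numbers above,
 * relative to the 2xAF baseline of 60 fps. */
#include <stdio.h>

int main(void) {
    const char  *level[] = { "4xAF", "8xAF", "16xAF" };
    const double fps[]   = { 57.5, 55.0, 52.0 };
    const double base    = 60.0; /* 2xAF reading */

    for (int i = 0; i < 3; i++)
        printf("%-5s costs %4.1f%% vs 2xAF\n",
               level[i], (base - fps[i]) / base * 100.0);
    return 0;
}
/* Prints roughly 4.2%, 8.3% and 13.3% -- in line with the ~13%
 * no-AF-vs-16xAF figure reported earlier for the same card. */
```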
 

JNT

Member
I've been using AF for as long as I can remember. There is no performance hit. It's practically free.

If that were really true in the case of console games, wouldn't you expect a group of, arguably, intelligent programmers to just enable it and try it out? Or perhaps they already have, and found that once you've already maxed out your memory bandwidth, adding a memory-bandwidth-intensive algorithm to the mix doesn't really play well with the framerate anymore.
 
To show the maximum difference texture filtering can make in that game in that situation.

Actually though, the game doesn't have an AF slider and activated some unknown AF level on its own.

Redone benchmarks entirely with driver-forced AF:

2xAF: 60 fps
4xAF: 57-58 fps
8xAF: 55 fps
16xAF: 52 fps

Driver-forced AF could potentially use more resources than an in-game option.
 
Console development is not PC development. Console development allows for much more optimization, but you don't get it out of the box. On these systems devs will prefer a filtering method that deals with the textures analytically and individually. Some of them will receive 4:1 AF (or even higher), but applying the maximum possible level of AF to everything on screen is a complete waste of resources.

If you want to brute-force 16:1 AF and 8xSGSSAA in every game then you're not part of the target audience of a games console anyway.
Horrible false equivalence here. 16X AF is not at all comparable in terms of performance to something like 8XSGSSAA. AF does not exist as some PC gamer $2000 rig luxury.

Many current-gen console games even go up to 16X AF (Crysis 2 and 3 on the PS3, for example). 8-16x AF is relatively cheap, and it's strange not to see it in 1080p games where it matters.

Edit: Maybe 8X.. but the point still stands that trilinear or 4x AF is not at all acceptable, considering last-gen games had better AF and that higher resolution makes the lack of AF more noticeable.
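For illustration, the "analytically and individually" approach quoted above might amount to something like the sketch below. The surface categories and ratios here are invented assumptions, not taken from any actual engine:

```c
/* Hypothetical per-surface AF budget: spend anisotropy where oblique
 * viewing angles make it visible, fall back to trilinear elsewhere.
 * All categories and ratios are illustrative assumptions. */
typedef enum { SURF_GROUND, SURF_GENERIC, SURF_WALL, SURF_UI } SurfaceKind;

static float max_aniso_for(SurfaceKind kind) {
    switch (kind) {
    case SURF_GROUND:  return 8.0f;  /* roads/floors: AF pays off most */
    case SURF_GENERIC: return 4.0f;  /* the "4:1 or even higher" case  */
    case SURF_WALL:    return 2.0f;  /* mostly screen-facing surfaces  */
    case SURF_UI:      return 1.0f;  /* 1.0 = plain trilinear          */
    }
    return 1.0f;
}
```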
 
Please don't compare console image quality to downsampled PC shots. You need ultra-high-end hardware to achieve this image quality with enjoyable framerates. This comparison is first-class bullshit because PC games don't have this image quality for 99.9% of players.



Dude, this gen hasn't even started yet.

Console development is not PC development. Console development allows for much more optimization, but you don't get it out of the box. On these systems devs will prefer a filtering method that deals with the textures analytically and individually. Some of them will receive 4:1 AF (or even higher), but applying the maximum possible level of AF to everything on screen is a complete waste of resources.

If you want to brute-force 16:1 AF and 8xSGSSAA in every game then you're not part of the target audience of a games console anyway.

Can't PC games apply variable filtering too, if the devs want?
 

TronLight

Everybody is Mikkelsexual
Horrible false equivalence here. 16X AF is not at all comparable in terms of performance to something like 8XSGSSAA. AF does not exist as some PC gamer $2000 rig luxury.

Many current-gen console games even go up to 16X AF (Crysis 2 and 3 on the PS3, for example). 8-16x AF is relatively cheap, and it's strange not to see it in 1080p games where it matters.
What? I'm having a hard time believing this.
 
If that were really true in the case of console games, wouldn't you expect a group of, arguably, intelligent programmers to just enable it and try it out? Or perhaps they already have, and found that once you've already maxed out your memory bandwidth, adding a memory-bandwidth-intensive algorithm to the mix doesn't really play well with the framerate anymore.

Yup. I mean, there might be some black sheep that really don't put much thought into that, but most developers won't limit AF without reason. Which doesn't mean we have to like it or can't criticize it. It's always a trade-off, a matter of preferences.

Driver-forced AF could potentially use more resources than an in-game option.

The application can select, for each texture, the maximum ratio of anisotropy. There's potential for optimization if the dev thinks that on some textures the lack of AF won't be noticed by players.
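That per-texture cap is exposed directly on PC, too. A minimal sketch using OpenGL's EXT_texture_filter_anisotropic extension; the texture names in the usage comments are hypothetical:

```c
/* Per-texture anisotropy via EXT_texture_filter_anisotropic. */
#include <GL/gl.h>
#include <GL/glext.h>

static void set_texture_aniso(GLuint tex, GLfloat ratio) {
    GLfloat max_supported = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_supported);
    if (ratio > max_supported)
        ratio = max_supported;

    glBindTexture(GL_TEXTURE_2D, tex);
    /* Trilinear base filtering; anisotropy refines it along the axis
     * of greatest texture compression. A ratio of 1.0 means plain
     * trilinear. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_LINEAR_MIPMAP_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, ratio);
}

/* Usage: spend the bandwidth where the viewing angle makes it visible.
 *   set_texture_aniso(road_tex, 8.0f);  // near-parallel to the view
 *   set_texture_aniso(wall_tex, 1.0f);  // screen-facing, AF buys little
 */
```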
 

JNT

Member
The code for Crysis 2 and 3 on the PS3 allows them to adjust the AF if there is enough performance headroom in the frame. There were instances where the PS3 version had much better AF in a frame than the fixed 4X AF of the 360 version.



Oooo. It may have been 8X and not 16X. Whoops. Need to check.

Says up to 16x here. Can't speak for truthfulness.
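If that claim is accurate, the mechanism would presumably resemble this kind of per-frame throttle. The budget, thresholds, and names below are invented for illustration, not taken from CryEngine:

```c
/* Hypothetical per-frame AF throttle, illustrating the "adjust AF when
 * there is headroom in the frame" idea attributed to Crysis on PS3. */
#include <stdint.h>

#define FRAME_BUDGET_US 33333  /* 30 fps target */

static int g_aniso = 4;        /* current cap: 1, 2, 4, 8 or 16 */

void update_af_cap(uint32_t gpu_frame_us) {
    if (gpu_frame_us < FRAME_BUDGET_US * 9 / 10 && g_aniso < 16)
        g_aniso *= 2;          /* comfortable headroom: raise the cap */
    else if (gpu_frame_us > FRAME_BUDGET_US && g_aniso > 2)
        g_aniso /= 2;          /* over budget: texture filtering is an
                                  easy place to claw time back */
}
```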
 

Stronty

Member
You must have some serious eye problems if the floor in front of you becomes blurry after 2 meters.

It is what will happen to your vision in your 40s. I think he meant that things hundreds of feet away IRL get blurry. IMO, dropping the limitations of the human eye and cameras is better for games; you can always change an image for artistic purposes after the best possible image is rendered.
 

Arulan

Member
This is what has become passable because it's "console standards"? Completely disregarding image quality in the hope that the excessive motion blur, depth of field, sub-1080p resolutions, and other effects hide the loss of fidelity?

I understand consoles have to make compromises; they're usually forced into using post-processing AA solutions, for example. However, given the abundance of effects we've seen in some of these next-gen titles (particles, blurring, etc.), it's clear they could tone something down to make room for AF, which has an extremely minimal performance cost in comparison.

Perhaps this complaint simply doesn't matter to their target audience, however, which is a little depressing.
 

-SD-

Banned
Instead of improving image quality, far too many devs choose to worsen it by post-processing the crap out of their games with all sorts of superfluous and annoying effects, thus covering up the lack of AF and such.
 

Jedi2016

Member
Instead of improving image quality, far too many devs choose to worsen it by post-processing the crap out of their games with all sorts of superfluous and annoying effects, thus covering up the lack of AF and such.
Whatever do you mean?

[image]
 

Izayoi

Banned
Instead of improving image quality, far too many devs choose to worsen it by post-processing the crap out of their games with all sorts of superfluous and annoying effects, thus covering up the lack of AF and such.
Really annoying trend. I hope that, with the PS4 at least, 1080p remains a high priority - even without AA and AF it does a lot for the IQ.
 
I don't understand it.

I like to fiddle and test to see what affects my frames the most, and going from 2x to 16x AF hits my PC by about half a frame per second if the game can be uncapped. Most of the time it never leaves the 60 fps lock, though.
 

Jedi2016

Member
I don't understand it.

I like to fiddle and test to see what affects my frames the most, and going from 2x to 16x AF hits my PC by about half a frame per second if the game can be uncapped. Most of the time it never leaves the 60 fps lock, though.
Same here. Even for the titles that run lower than 60fps, AF has never had a noticeable effect on performance. For the games that run in the 40s and 50s, I get more of a change just from looking in a different direction than I ever will from AF. It's frankly astounding that they can't seem to make it work all the time on these new consoles.
 

StevieP

Banned
I don't understand it, really. I don't notice any performance decrease with AF set to max.
Many modern PC games don't even have AF settings. They just have it all the way up by default. But you're less likely to run into hardware limits on a modern gaming PC than on a console.
 

TheExodu5

Banned
I'm not really surprised. At low resolutions, AF can lead to a lot of texture aliasing. A bit puzzling that 1080p games are forgoing it, though.
 

Mosati

Banned
That's still not the point of the picture, or this thread.

It demonstrates the value, though. Especially in racing games, where the huge payoff makes even a moderate drop in frames worth it. I don't think I've ever noticed the lack of AF more than when playing racing games. The viewing angle of the tarmac makes everything turn to soup after a few car lengths in the absence of AF.
 
I'm surprised they didn't add 16x AF to KZ and lock it to 30 in SP, since they seem to have a lot of headroom above 30 from what we have seen of that unlocked framerate.
 

ghst

thanks for the laugh
Instead of improving image quality, far too many devs choose to worsen it by post-processing the crap out of their games with all sorts of superfluous and annoying effects, thus covering up the lack of AF and such.

but it makes for such great #gifs.
 