
Anisotropic Filtering and next-gen: Why is it still absent?

Izayoi

Banned
Does Ryse have AF?
I looked at a lot of images and it seems like it does not, but I couldn't find a great example to illustrate the point.

One thing I'm pretty sure of is that Ryse bullshots DO have AF, but the actual game does not (or at least uses a lower level of it).
 
Yeah, it's especially worrying that even bullshots are showing a distinct lack of AF. I thought it was a relatively inexpensive process, but maybe I'm wrong?
Doesn't that mean they're NOT bullshots (as in supersampled)? Because sufficient sampling would make AF obsolete.
 

KKRT00

Member
I looked at a lot of images and it seems like it does not, but I couldn't find a great example to illustrate the point.

One thing I'm pretty sure of is that Ryse bullshots DO have AF, but the actual game does not (or at least uses a lower level of it).

It does have AF, it's clearly visible in a ton of shots.

Example
image_ryse_son_of_rome-23646-2061_0003.jpg


And framegrab from gameplay video
i0uFZzJrH1guB.png
 

Mman235

Member
One thing this thread has shown is that developers have gone fucking crazy on the depth of field blur stuff. No AF almost makes sense when you apparently can't see shit past a few meters. Apparently we've gone back to 32-bit era fog in a different way.

Edit: Although, to be fair, it does apparently seem to be more of a bullshot thing.
 
One thing this thread has shown is that developers have gone fucking crazy on the depth of field blur stuff. No AF almost makes sense when you apparently can't see shit past a few meters. Apparently we've gone back to 32-bit era fog in a different way.

Edit: Although, to be fair, it does apparently seem to be more of a bullshot thing.

Devs last gen used lens flares to blind us and now they are using DOF and lens flares.
 
image_ryse_son_of_rome-23646-2061_0003.jpg


You can't tell in that pic. The DOF is covering it up.

i0uFZzJrH1guB.png


Nope. Doesn't seem like it has it. Look at the bricks off to the right-middle of the screen. Not enough detail there anymore. Need more evidence.
 

Izayoi

Banned
Looks like it has some.
I was looking through older screens, and it seems like in old builds at least there was none (or very little). At least from that second shot, it appears that it has definitely improved. Still pretty difficult to tell because of all the post-processing, but it's nowhere near as bad as Forza or KI - that much is certain.
 

KKRT00

Member
Not at all. The first shot is a throwaway; you can't tell shit from the DOF.

Second shot clearly shows that the detail in the textures on the ground falls off rapidly with respect to the angle of the camera and the ground as it recedes into the distance.

It clearly has at least 8x AF in both shots. And you can see quite far away in the first shot too.
 

Blackthorn

"hello?" "this is vagina"
Is POM compatible with anisotropic filtering these days? I remember back in Crysis/Warhead you had to give up one for the other.
 

KKRT00

Member
Teach me how to "clearly" and objectively determine that multiplier you came up with.

"at least"
And you can see that in the second shot, where the brick pattern is visible far away in the background. At 4x AF it would break down there.
 

HTupolev

Member
Supersampling doesn't do anything with AF/texture filtering
That's completely wrong.

The reason stuff like mipmapping gets used is that sampling from the full-res texture would result in shimmering from undersampling at long distances. Supersampling is the simplest and most direct fix to undersampling.

Where basic mipmapping breaks down is at oblique angles. If you sample from a low LOD, you get shimmering from undersampling on the short axis. But if you sample from a high LOD, you get blurring because you're not sampling from a low enough LOD for the long axis. The solution? Use a low LOD, but take more samples from it so that you don't wind up with shimmering from undersampling (basically what AF does).
Again, supersampling accomplishes the same thing. When you supersample while using basic mipmapping, the GPU uses lower LODs, because with a higher resolution you can get away with that without undersampling (and shimmering). Hence even at oblique angles, you get sharper textures.

Granted, supersampling doesn't do AF particularly well; 16xAF is a bajillion times more efficient than trying to get the same results through ridiculously high-order supersampling, which is why we turn on AF and do this stuff strictly in the TMUs.

//================

Actually, supersampling has EVERYTHING to do with texture filtering, and also with techniques like MSAA. Texture filtering and MSAA are just efficient ways to target supersampling-esque results for various aspects of an image. Texture filtering attempts to get supersampling-like results for textures, and MSAA attempts to get supersampling-like results for geometric edges.
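To sketch the LOD/probe-count logic described above, here is a simplified model in the spirit of the EXT_texture_filter_anisotropic scheme. This is an illustrative sketch, not actual GPU behavior; the function and variable names are my own:

```python
import math

def aniso_sample_params(dudx, dvdx, dudy, dvdy, max_aniso=16):
    """Given screen-space texture-coordinate gradients (in texels),
    return (num_probes, lod): how many trilinear probes to take along
    the footprint's major axis, and which mip level to take them from."""
    # Pixel footprint extent along each screen axis, measured in texels.
    px = math.hypot(dudx, dvdx)
    py = math.hypot(dudy, dvdy)
    p_max, p_min = max(px, py), min(px, py)
    # Number of probes = anisotropy ratio, clamped by the driver cap.
    n = min(math.ceil(p_max / max(p_min, 1e-9)), max_aniso)
    # Each probe covers p_max / n texels along the major axis, so pick
    # the mip level whose texel spacing matches that (clamped at level 0).
    lod = max(0.0, math.log2(max(p_max / n, 1e-9)))
    return n, lod
```

For an 8:1 oblique footprint, trilinear-only (`max_aniso=1`) is forced up to mip level 3 (blurry), while 8x AF stays at the sharp mip 0 and pays for it with 8 probes instead of 1 — which is exactly the low-LOD-plus-more-samples trade described above.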
 

Izayoi

Banned
"at least"
And You can see that on the second shot when You can see bricks pattern far away in the background. On AF 4x it would brake there.
Sounds like you're talking out your ass, to me.

You have any proof that's where it would definitively cut off?
 

Vizzeh

Banned
Am I correct to assume the PS4 going forward will likely be able to handle improved levels of AF, especially as developers get more familiar with the hardware and engines are adapted/made more efficient? More so because it can use a larger framebuffer in its 8GB of GDDR5: it should be able to produce 1080p with a lot more room to expand AA/AF?

Whereas on the other side of the garden fence, the X1 will likely struggle going forward since it only has 32MB to use for a framebuffer; if it draws the framebuffer from the DDR3 it will struggle with performance. They need to work on getting a 1080p picture, or just accept 720p with some added AF, since a typical 1080p framebuffer with 2xAF costs 54MB?
 

HTupolev

Member
Am I correct to assume the PS4 going forward will likely be able to handle improved levels of AF, especially as developers get more familiar with the hardware and engines are adapted/made more efficient? More so because it can use a larger framebuffer in its 8GB of GDDR5: it should be able to produce 1080p with a lot more room to expand AA/AF?

Whereas on the other side of the garden fence, the X1 will likely struggle going forward since it only has 32MB to use for a framebuffer; if it draws the framebuffer from the DDR3 it will struggle with performance. They need to work on getting a 1080p picture, or just accept 720p with some added AF, since a typical 1080p framebuffer with 2xAF costs 54MB?
It sounds like you're confusing AF with MSAA.

AF does not increase your framebuffer size.
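A back-of-envelope sketch of that point: MSAA shows up in the framebuffer-size formula, AF doesn't. The numbers and layout below are illustrative, not any console's actual render-target format:

```python
def framebuffer_mb(width, height, msaa=1, color_bytes=4, depth_bytes=4):
    """Rough size of a color+depth render target in MiB. The MSAA sample
    count multiplies per-pixel storage; anisotropic filtering appears
    nowhere in the formula, because AF changes how textures are *read*,
    not what gets *written* to the framebuffer. (Sketch only: real
    targets add alignment, compression metadata, extra buffers, etc.)"""
    return width * height * msaa * (color_bytes + depth_bytes) / 2**20
```

At 1080p this comes to roughly 15.8 MiB with no MSAA and four times that at 4x MSAA, while changing the AF level leaves the figure untouched.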
 

Crisium

Member
you have no idea what you're talking about

people have been using it since 2002 on pc with little to no performance hit
the moment games started having the option it was a complete no brainer to use it.

I was using it on a radeon 9800 pro and a geforce 4 ti 4200 back then... 11 years ago

there is no defending this

So true. I remember reading reviews of the revolutionary Radeon 9700 giving us AF for free, and it has been that way ever since. I used it on my GeForce 4 Ti 4200, albeit with a bit of a performance hit. My next card was a Radeon 9800 Pro, and it was free. It has been free, ever since.

For more than a decade.

This is inexcusable. Utterly.
 
Am I correct to assume the PS4 going forward will likely be able to handle improved levels of AF, especially as developers get more familiar with the hardware and engines are adapted/made more efficient? More so because it can use a larger framebuffer in its 8GB of GDDR5: it should be able to produce 1080p with a lot more room to expand AA/AF?

Having more GPU memory or available memory doesn't mean you have room to add more computations to the rendering engine. What good is all that memory when your GPU is the bottleneck? AF takes compute cycles and several texture reads. It's not so much memory intensive as memory bandwidth intensive (fetching several texels).
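To put a rough number on "bandwidth intensive", here is a deliberately pessimistic sketch. The scenario (1080p, every pixel at full anisotropy, zero texture-cache hits) is hypothetical and wildly worst-case; real GPUs absorb the vast majority of these fetches in the texture cache:

```python
def worst_case_af_gbps(width, height, fps, probes, bytes_per_texel=4):
    """Upper bound on texture-read traffic from anisotropic filtering,
    assuming every pixel takes `probes` trilinear probes (8 texels each)
    and no texture-cache hits ever. Actual external bandwidth is far
    lower; the point is only that AF costs fetch work, not framebuffer
    memory."""
    texels = width * height * fps * probes * 8
    return texels * bytes_per_texel / 1e9  # decimal GB/s
```

Even this crude ceiling (around 32 GB/s at 1080p30 with 16 probes per pixel) shows why the cost is felt in bandwidth and texture-unit work rather than in memory footprint.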
 
What's wrong with these consoles? It's been years since I started using heavy AF in my PC games; it doesn't seem taxing at all, and I don't think I've ever seen any significant performance improvement from removing it. I remember being blown away back then; the first time I activated it was with San Andreas, and it made the game look 10 times better lol. Damn, it's been at least 6 years since I started activating it in my PC games.
 

JNT

Member
What's wrong with these consoles? It's been years since I started using heavy AF in my PC games; it doesn't seem taxing at all, and I don't think I've ever seen any significant performance improvement from removing it. I remember being blown away back then; the first time I activated it was with San Andreas, and it made the game look 10 times better lol. Damn, it's been at least 6 years since I started activating it in my PC games.

The problem is not so much with the consoles themselves as with how the developers spend the available memory bandwidth. When they are already pushing the limits, adding another memory-bandwidth-hungry algorithm is going to destroy performance, even if the impact would be only minimal if there were bandwidth to spare.
 

LCGeek

formerly sane
Please don't compare console image quality to downsampled PC shots. You need ultra-high-end hardware to achieve this image quality with enjoyable framerates. This comparison is first-class bullshit because PC games don't have this image quality for 99.9% of players.

Not at 30fps; maybe for 60fps and higher. I was downsampling with my GTX 460/560 and now with my 7950. It's not hard unless what you're downsampling from is insanely high.
 

Jedi2016

Member
The problem is not so much with the consoles themselves as with how the developers spend the available memory bandwidth. When they are already pushing the limits, adding another memory-bandwidth-hungry algorithm is going to destroy performance, even if the impact would be only minimal if there were bandwidth to spare.
They need to prioritize better, then. AF is something that people can see, and they're most likely sacrificing it for an effect that people can't see, or one that wouldn't have a noticeable change if it were reduced just slightly to allow for better AF.

It's a balance thing, really. Reducing something to allow for something else rather than just cutting it out entirely like they seem to be doing with AF.
 

Vizzeh

Banned
Having more GPU memory or available memory doesn't mean you have room to add more computations to the rendering engine. What good is all that memory when your GPU is the bottleneck? AF takes compute cycles and several texture reads. It's not memory intensive more so than memory bandwidth intensive (fetching several texels).

That's a trade-off against FPS then? That's why I highlighted the difference between the two platforms, with one having the "ability" to expand even if it comes at the cost of the GPU or is negated by GPU/game-engine efficiency. I was merely pointing out that one platform has a lot of room for improvement, whereas the other seems to have other problems even without those AF advancements.
 

Mr Swine

Banned
If Xbox One or PS4 games don't have AF, then it's kinda stupid to have really good detailed textures if you can only see them a few feet ahead.
 

Odrion

Banned
Is there an explanation for why AF isn't demanding on 'modern' PC hardware (it wasn't demanding even five years ago), and yet it's missing from next-gen games?
 

-SD-

Banned
Anisotropic texture filtering has been commonplace in PC games for sooooo long. I can't even remember the last time I played a game that had no AF but bi/trilinear filtering option only. Maybe 10 years ago?

Because of that, this thread makes me feel

what_year_is_itmcc2k.jpg
 

Odrion

Banned
Hey Guerrilla, instead of leaving your framerate uncapped and having it judder at 35-40fps, cap the framerate and turn on the AF.
 

RoadHazard

Gold Member
Yeah, I've been wondering the very same thing. On my 2011 laptop with a then-mid-level (maybe slightly above that) mobile GPU I can turn AF up to 8x or 16x with no noticeable performance impact, so it's super weird to me if these consoles somehow have trouble with it.
 

creyas

Member
Anisotropic texture filtering has been commonplace in PC games for sooooo long. I can't even remember the last time I played a game that had no AF but bi/trilinear filtering option only. Maybe 10 years ago?

Tried out Crysis 2 a while back and didn't notice it in the options or on by default.

There appears to be a console command for it though, so at least Crytek seems to be aware it is a thing!
 

KKRT00

Member
Sounds like you're talking out your ass, to me.

You have any proof that's where it would definitively cut off?

Launch any game and make a comparison, it's not hard.

---
You can't tell from looking at a screenshot what level of AF a rendered image has. LOL! What's your secret sauce??

You can't even notice AF on shots with AF, so...

------
Tried out Crysis 2 a while back and didn't notice it in the options or on by default.

There appears to be a console command for it though, so at least Crytek seems to be aware it is a thing!

They default it to 8x in C2, but yeah, there's no option for it in the menu. The command is r_TexMaxAnisotropy. In C3 they added an option, though.
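For anyone wanting to try it, the cvar above can be set from the in-game console or dropped into a config file that the engine reads at startup. A sketch (the filename and the value 16 are illustrative; only the cvar name comes from the post above):

```
r_TexMaxAnisotropy = 16
```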
 

JNT

Member
They need to prioritize better, then. AF is something that people can see, and they're most likely sacrificing it for an effect that people can't see, or one that wouldn't have a noticeable change if it were reduced just slightly to allow for better AF.

It's a balance thing, really. Reducing something to allow for something else rather than just cutting it out entirely like they seem to be doing with AF.
No argument there. However, given the amount of post process blur game developers are adding these days it wouldn't shock me if they actually consider AF to be an effect that you can't see.

Is there an explanation why AF isn't demanding on 'modern' pc hardware (although it wasn't demanding five years ago) and yet it's missing on next-gen games?
When developing a console game you know what amount of resources you have available. Developers are quick to utilize these resources because they are constant, and therefore easy to design a game around.

When developing a PC game, on the other hand, the amount of resources you have available is unknown. Therefore, you have to establish some sort of minimum requirement that covers a reasonable share of the market, and then design the game around that minimum. If you happen to have a rig that is more capable than the target, you can start enabling optional features that were easy for the developers to add on the side without being essential to the game itself (things like AA, AF, higher-quality shadows, various forms of post-processing, etc.).

The only way to get AF in a console game is to convince the developers to design a game with AF in mind from the start. That way, they won't gobble up all of the resources only to realize there are none left for AF.
 

nib95

Banned
Can't tell for sure because of YouTube compression, but the latest video of Killzone Shadow Fall seems to show some degree of Anisotropic Filtering. Enough that I couldn't really tell either way.

[17 Killzone Shadow Fall screenshots attached: KZSF1.jpg–KZSF19.jpg~original]
It has been free, ever since.

AF isn't free, never has been free, and never will be free. In particular it's pretty bandwidth-intensive, because you've got to force more stuff down the pipeline. The consoles, more than anything, are bandwidth-constrained, and realistically both are operating a step above (resolution-wise) where they rightly should be [where they would be operating as a PC], so it should come as no surprise they're not using it to the full extent.

Besides, at 10ft+ away on a big TV, it's not massively noticeable, so from a development point of view, particularly in this stage of the life cycle, it's a way to free up bandwidth with no real noticeable impact except in the online gaming echo chamber.
 