Anisotropic Filtering and next-gen: Why is it still absent?


Salex_

Member
I guess this is what was happening here:
http://killzone.dl.playstation.net/...s/kzsf_fe_2013-10-08_helghast-infantry_04.jpg
http://killzone.dl.playstation.net/...s/kzsf_fe_2013-10-08_helghast-infantry_06.jpg

I remember having a discussion about these pictures in a killzone thread and many pointed out that the only difference between the spotlight model and the in-game model are:
Better AA
Much better AF

The difference is noticeable.
Those pics have nothing to do with AF and the models look the same as they do in-game.
 

sTaTIx

Member
Destiny looks a bit more rough than I remember. Ugh =(.

Looks freaking fantastic to me. One of the best looking next-gen games I've seen so far, especially taking into consideration that it's open-world.

The graininess you're seeing in that screenshot may just be down to the poor compression of the image file.
 
It's so obvious in the 8k reference too. I've always hated that about fences in games. I need a perfect fence :(

Yeah. I didn't mean to say it was fixed in the 8K shot. Just demonstrating that a massssssssive resolution is needed.

Seems worse in BF4 than other games.
 

Reallink

Member
You guys can rest easy knowing the same guys deciding on 0x AF are the same ones "balancing" all the other effects with regards to resolution (720p). Who cares if you can't tell high-precision HBAO+ apart from SSAO, pile it on. Gotta have some real-time ray-traced reflections to reflect these 4k x 4k soft shadows. Drop resolution to 1024x600.
 

pottuvoi

Banned
Don't feel bad. It takes an insane resolution to fix that.
Or just use proper transparency instead of alpha test; that needs either proper ordering of samples or order-independent transparency.
In that case you can count on the prefiltering in the mipmaps to fix the problem, just like on normal textures.
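To make that point concrete, here is a minimal CPU-side sketch (not from the thread, all numbers made up) of why a hard alpha test throws away mipmap prefiltering while blending gets to use it:

#include <cstdio>

int main() {
    // Imagine a distant fence: one screen pixel covers many texels, and the
    // mip chain has already averaged their alpha down to a coverage of 0.4.
    float prefilteredAlpha = 0.4f;

    // Alpha test: a hard threshold discards the prefiltered value, so the
    // pixel snaps to fully opaque or fully gone and shimmers in motion.
    bool alphaTestVisible = prefilteredAlpha > 0.5f;   // false: the fence vanishes

    // Alpha blend: the prefiltered coverage is used directly, so the distant
    // fence fades out smoothly instead of breaking into a moire pattern.
    float fence = 0.2f, sky = 0.8f;                    // toy luminances
    float blended = fence * prefilteredAlpha + sky * (1.0f - prefilteredAlpha);

    std::printf("alpha test: %s, blended: %.2f\n",
                alphaTestVisible ? "opaque" : "gone", blended);
    return 0;
}

The catch, as noted above, is that blending needs the geometry sorted back to front (or an order-independent transparency scheme), which is why so many games stick with the cheap alpha test.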
 

Salex_

Member
This is the thread:
http://67.227.255.239/forum/showthread.php?t=693763&page=2

You are saying those two models look the same?

It's extremely hard to tell the difference due to the lighting and the different angles, but they look pretty much the same. The armor has a bunch of little details, and the same aliasing (or armor damage) is in both pictures. Gotta wait until release to see how the direct-feed pics of the models look compared to those character highlights.
 

Chev

Member
I don't know where the people who say there's no performance hit got it from. It directly increases the number of texel fetches, so on shaders that require lots of fetches per pixel already you'll get very immediate benefits from disabling it.
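For a sense of scale behind that fetch-count argument, here are the textbook upper bounds (a rough sketch with made-up loop values; real GPUs pay far less because the extra taps are cache-coherent):

#include <cstdio>

int main() {
    // Trilinear already reads 8 texels per sample (4 bilinear texels from
    // each of 2 mip levels); Nx anisotropic can take up to N such probes
    // along the long axis of the pixel's footprint.
    const int trilinearTexels = 8;
    const int levels[] = {1, 2, 4, 8, 16};
    for (int aniso : levels)
        std::printf("%2dx AF: up to %3d texels per sample\n",
                    aniso, trilinearTexels * aniso);
    return 0;
}

Those are worst-case numbers for heavily slanted surfaces; on screen-facing geometry the hardware drops back to plain trilinear, which is part of why the benchmark charts further down barely move.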
 

HyperionX

Member
Don't feel bad. It takes an insane resolution to fix that.

4K fence:


8K fence:

This is where ray tracing can really come in handy. Because you're naturally going to be sampling every pixel hundreds of times to generate correct global illumination, moire patterns get massively reduced along the way. Compared to 8K it might even be practical.
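A toy illustration of that effect (nothing to do with any real renderer; the pattern and numbers are invented): averaging many jittered samples inside a pixel converges on the true coverage of a thin-bar "fence", where a single centre sample is all-or-nothing:

#include <cstdio>
#include <cstdlib>
#include <cmath>

// 1.0 on a thin bar of the fence, 0.0 in the gap (period 0.1, 20% duty cycle).
double fence(double x) { return std::fmod(x, 0.1) < 0.02 ? 1.0 : 0.0; }

int main() {
    const double pixelWidth = 0.4;                 // pixel far wider than the bars
    double centreSample = fence(pixelWidth * 0.5); // happens to land on a bar: reads 1.0

    std::srand(42);
    const int samples = 256;
    double sum = 0.0;
    for (int i = 0; i < samples; ++i)
        sum += fence(pixelWidth * (std::rand() / (double)RAND_MAX));

    // True coverage over this pixel is 0.20; the many-sample average gets
    // close, while the single sample is simply wrong and flickers in motion.
    std::printf("1 sample: %.2f   %d samples: %.2f\n",
                centreSample, samples, sum / samples);
    return 0;
}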
 

R1CHO

Member
Lack of Aniso is disgusting, it's sad on current gen consoles, it's inexcusable on next gen.

Is it just a thing that only annoys a pair of assholes like us on the internet?

Because I refuse to believe that a developer working on making a beautiful world for a video game is happy when the final product has shitty image quality that makes a fair amount of his work look like shit.

[image: oWXtYrV.jpg]
 
I don't know where the people who say there's no performance hit got it from. It directly increases the number of texel fetches, so on shaders that require lots of fetches per pixel already you'll get very immediate benefits from disabling it.

In practice this is the performance hit we are talking about:

Max Payne 3: [chart: chart-11-anoisotropicxgfqz.png]

Borderlands 2: [chart: borderlands-2-tweak-gxudee.png]

Deus Ex: Human Revolution: [chart: chart-texture-filtergjf6e.png]

even Crysis 3: [chart: 15_5_anisotropic_grap3wevl.png]

and Battlefield 3: [chart: chart-af3ics8.jpg]


It is absolutely negligible.
 

DiscoJer

Member
I don't know where the people who say there's no performance hit got it from. It directly increases the number of texel fetches, so on shaders that require lots of fetches per pixel already you'll get very immediate benefits from disabling it.

Well, if you have a PC, it's one of those things that just doesn't drop the FPS if you turn it on. Maybe on consoles it's different, but it's one of those things that is taken for granted in PC games. And has been for years.

I mean, City of Heroes had it, and that was 10 years old when they pulled the plug on it last year.

In practice this is the performance hit we are talking about:


It is absolutely negligible.

While I agree, none of those charts had it completely turned off. The lowest entry is 1x (and on the Deus Ex chart the lowest is bilinear, and in that case there is a couple of FPS difference: 70 vs 68).
 

peace

Neo Member
Some people need to remember why they play video games. It isn't for the graphics, surely? I still go back and enjoy SNES games and old PC games.

Surely the graphics from 1st generation titles are 'good enough?' They are going to get much better soon enough.

Slagging off a screenshot is ridiculous to me when you won't notice it when the game is moving.
 

Sendou

Member
Some people need to remember why they play video games. It isn't for the graphics, surely? I still go back and enjoy SNES games and old PC games.

Surely it's about graphics. If it's not, then why do we even need the PS4 and XBONE? As far as I can see, the OP isn't asking for anything unreasonable.
 

CTLance

Member
Heh, nextgen.

I remember when even trilinear filtering was awesomesauce. Those were the times.

With modern hardware basically doing anisotropic filtering for free, I honestly wonder why it isn't used. Then again, after an FXAA pass half of the detail is missing anyway, so maybe devs just don't give a flying fuck about IQ.
 

SparkTR

Member
Some people need to remember why they play video games. It isn't for the graphics, surely? I still go back and enjoy SNES games and old PC games.

Surely the graphics from 1st generation titles are 'good enough?' They are going to get much better soon enough.

Slagging off a screenshot is ridiculous to me when you won't notice it when the game is moving.

Well, playing those old PC games with modern enhancements like 16x AF sure makes them a whole lot more palatable. I can learn to love low-res textures and effects, but I can't do the same with poor IQ. I think in the end it's an enhancement that's basically 'free'; there's no reason not to have it.
 
Noticed this too; kind of assumed it would be in when the games are out.

I can't even fathom why it wouldn't be. The last time I played PC games without 16x AF was in 2002, before Medal of Honor: Allied Assault released.
The performance impact has been <1-2 fps in games since 2004 or so; even in MoH:AA it had very little performance impact.

Such a no-brainer setting. I was mad/surprised when it was missing in PS360 games.


Priorities for developers are fucked beyond belief if PS4/Xbox One games don't all have 16x AF when they're out...

I remember Valve boasting about trilinear filtering back in '99 or something with Half-Life; many PS360 games didn't even have that.
 
While I agree, none of those charts had it completely turned off. The lowest entry is 1x (and on the Deus Ex chart the lowest is bilinear, and in that case there is a couple of FPS difference: 70 vs 68).

Completely(!) turning it off(!!) though would be super stupid(!!!) in 2006(!!!!), let alone 2013!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
 

-PXG-

Member
I frequently used 16x AF on my Radeon 9700 in 2002 with little performance hit. There is no excuse for a 2013 console to not use at least 8x.

Seriously. The performance hit.... there is none. It's virtually unnoticeable. There is like a one frame difference between no AF and x16. Not one game of mine on PC has had performance issues because of AF.

Not having it is just stupid.
 

R1CHO

Member
Slagging off a screenshot is ridiculous to me when you won't notice it when the game is moving.

Yes, it would be ridiculous, but it's not, because the truth is you will notice it with the game running.

Let's see an extreme example:

[image: ib1P7NDkeUsnm1.png]

[image: ibhsHSKvmRv4h0.png]


It's not Photoshop, sadly. Saboteur looks that bad on PC if you don't activate aniso in your graphics card panel, because there is no such option in the game.

It doesn't look that bad in every game, but it always affects image quality in a pretty noticeable way.
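For what it's worth, exposing the option is tiny on the engine side. Here is a minimal sketch of the per-texture knob, assuming desktop OpenGL with the EXT_texture_filter_anisotropic extension and a context plus bound texture already set up (forcing it in the driver panel effectively does the same thing for every texture):

#include <GL/gl.h>

// These come from EXT_texture_filter_anisotropic; older headers may not define them.
#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
#define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
#endif

// Call with the texture bound; clamps the request to whatever the GPU supports.
void enableAniso(float requested)
{
    GLfloat maxSupported = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxSupported);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                    requested < maxSupported ? requested : maxSupported);
}

Direct3D 11 is the same story: an anisotropic filter mode and a MaxAnisotropy field in the sampler description, set once when the sampler state is created.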
 

TronLight

Everybody is Mikkelsexual
I find the lack of AF to be a terrible thing. Especially when developers waste so much effort and resources on things like Chromatic Aberration. I will take aliasing over muddy textures any day of the week.

Isn't the trilinear one more realistic?

You must have some serious eye problems if the floor in front of you becomes blurry after 2 meters.
 

dark10x

Digital Foundry pixel pusher
I'm stunned that AF isn't used across the board at launch. How the fuck are developers ignoring such a feature? It doesn't make sense.

That's kind of a bullshit comparison.

The Forza shot is taken from a noticeably compressed video and, more importantly, motion blur is present. There is no motion blur in the PCars shot. Motion blur makes the XB1 side seem completely lacking in detail in a still shot.
 

Chev

Member
Well, if you have a PC, it's one of those things that just doesn't drop the FPS if you turn it on. Maybe on consoles it's different, but it's one of those things that is taken for granted in PC games. And has been for years.
Only taken for granted if you have a high-end card in the first place. It's nothing inherent to the PC in general, it's just that PC gamers who care about those things are the kind of people who have GPUs that are far above average and wouldn't be taxed by the lookup. Don't take the case of a technological upper crust as the general one.
 

SparkTR

Member
Only taken for granted if you have a high-end card in the first place. It's nothing inherent to the PC in general, it's just that PC gamers who care about those things are the kind of people who have GPUs that are far above average and wouldn't be taxed by the lookup. Don't take the case of a technological upper crust as the general one.

I had a low-end card until not long ago and there was no performance impact there either. It's just something that you should automatically do on PC; in all my years of gaming I've never known AF to cause any performance impact, low end or high end.

I'd love for a developer to chime in here, since even when last generation was in full swing I was wondering why there was no AF in console games, it really made some of them look like shit in certain scenes.
 

JNT

Member
AF only has a minimal impact on performance. However, the impact on memory bandwidth is... not so negligible.
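A rough back-of-envelope for that bandwidth point (my own numbers, worst case, assuming one 16x anisotropic fetch per pixel at 1080p60, uncompressed 32-bit texels and no texture cache at all):

#include <cstdio>

int main() {
    const double pixels        = 1920.0 * 1080.0; // one texture fetch per pixel
    const double fps           = 60.0;
    const double texelsPerHit  = 16 * 8;          // 16 trilinear probes x 8 texels each
    const double bytesPerTexel = 4.0;             // uncompressed RGBA8

    double gbPerSec = pixels * fps * texelsPerHit * bytesPerTexel / 1e9;
    std::printf("worst case, cache-free: %.1f GB/s\n", gbPerSec); // ~63.7 GB/s
    return 0;
}

In practice the probes march along a short line within the same mip level, so nearly all of them are cache hits and the real cost is a small fraction of that figure, but it is bandwidth rather than ALU that you're spending.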
 
I'm stunned that AF isn't used across the board at launch. How the fuck are developers ignoring such a feature? It doesn't make sense.


That's kind of a bullshit comparison.

The Forza shot is taken from a noticeably compressed video and, more importantly, motion blur is present. There is no motion blur in the PCars shot. Motion blur makes the XB1 side seem completely lacking in detail in a still shot.

Exactly. Pathetic comparison.
 

TronLight

Everybody is Mikkelsexual
Only taken for granted if you have a high-end card in the first place. It's nothing inherent to the PC in general, it's just that PC gamers who care about those things are the kind of people who have GPUs that are far above average and wouldn't be taxed by the lookup. Don't take the case of a technological upper crust as the general one.

I played Crysis 1 on a GT220, 720p and everything on low, but AF didn't make me lose more than a frame.
 

dark10x

Digital Foundry pixel pusher
I played Crysis 1 on a GT220, 720p and everything on low, but AF didn't make me lose more than a frame.
You picked the wrong example, I think.

AF had a particularly large impact on performance around its release time frame and, more importantly, was not originally compatible with parallax occlusion maps (POM). If you used AF you would lose POM and suffer a drop in framerate (this was on an 8800GT in 2007, of course, at 720p). AF was expensive in Crysis.

In most other games, however, it definitely was not.
 
Only taken for granted if you have a high-end card in the first place. It's nothing inherent to the PC in general, it's just that PC gamers who care about those things are the kind of people who have GPUs that are far above average and wouldn't be taxed by the lookup. Don't take the case of a technological upper crust as the general one.

Untrue.
 
Only taken for granted if you have a high-end card in the first place. It's nothing inherent to the PC in general, it's just that PC gamers who care about those things are the kind of people who have GPUs that are far above average and wouldn't be taxed by the lookup. Don't take the case of a technological upper crust as the general one.

You have no idea what you're talking about.

People have been using it since 2002 on PC with little to no performance hit.
The moment games started having the option, it was a complete no-brainer to use it.

I was using it on a Radeon 9800 Pro and a GeForce 4 Ti 4200 back then... 11 years ago.

There is no defending this.
 