
Digital Foundry - Texture filtering on consoles, what's going on?

http://www.eurogamer.net/articles/digitalfoundry-2015-vs-texture-filtering-on-console


There's an elephant in the room. Over the course of this console generation, a certain damper has been placed on the visual impact of many PS4 and Xbox One games. While we live in an era of full 1080p resolution titles, with incredible levels of detail layered into each release, the fact is that one basic graphics setting is often neglected - texture filtering.
But why is this? Can it really be the hardware at fault, given superior filtering is so readily just patched in, as with Devil May Cry? Or could the unified memory setup on PS4 and Xbox One - a unique design which means both CPU and GPU are connected to the same memory resources - be more of a limit here than we might expect?


Either way, the precise reason why PS4 and Xbox One are still often divided like this is an issue that still eludes full explanation. However, certain points are clearer now; we know unified memory has immense advantages for consoles, but also that bandwidth plays a bigger part in texture filtering quality than anticipated. And while PS4 and Xbox One have a lot in common with PC architecture, their designs are still fundamentally different in ways that haven't yet been fully explained. Responses from developers here have been insightful, but the complete answer remains out of grasp - at least for now.

The article is inconclusive, as you can see in the quote above, but it's worth reading.
 

Durante

Member
I've been asking myself this question at least since the PS3 and 360 were released.

It makes such a massive difference to IQ that, even if the relative impact on performance is larger than it is on PC, it should always be worth it.
 
Even if it may be harder on consoles due to shared bandwidth, I still think devs are making the wrong decision by not including at least 8x no matter what.

Other things should go before AF.
 
I've been asking myself this question at least since the PS3 and 360 were released.

It makes such a massive difference to IQ that, even if the relative impact on performance is larger than it is on PC, it should always be worth it.

The issue is supposed to be the fact that CPU and GPU are fighting for memory from the same pool, unlike PCs where the GPU has dedicated memory.
They did a test to emulate that situation with a PC but the frame-rate drops were still as negligible as they usually are when you mess with that setting on any other PC.
Do you think that test PC they used is a good comparison?

Another interesting thing is the difference between both consoles.
One developer blamed it on the fact that the PS4 versions are running at 1080p but like the article says:

It's a theory that holds strong in some games, but not in others - such as Tony Hawk's Pro Skater 5, Strider, PayDay 2: Crimewave Edition or even Dishonored: Definitive Edition. In these four cases, both PS4 and Xbox One are matched with a full native 1920x1080 resolution, and yet Xbox One remains ahead in filtering clarity in each game.

Why is the filtering worse in these cases on PS4? It's really strange.
 
Do you think that test PC they used is a good comparison?

It is the most similar thing available in the PC space, but perhaps they need to saturate bandwidth even more to show the differences better. Likewise, there could be a performance cliff on consoles that is not visible in the testing.
 

c0de

Member
Interesting that in the PS4 AF thread, people argued I was wrong to post the slide showing how bandwidth decreases when the CPU also accesses the RAM. Obviously there is some truth behind it, and the whole “It's free!” claim is just not true. Surprise.
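For anyone who hasn't seen that slide, here's a toy model of the effect it describes. All numbers here are illustrative assumptions, not figures from the article or from Sony; the point is only that on a shared bus, every GB/s the CPU pulls can cost the GPU more than 1 GB/s, because CPU requests break up the GPU's long burst reads.

// Toy model of CPU/GPU contention on a unified memory bus.
// The 176 GB/s pool and the 2.5x penalty factor are assumptions for
// illustration only, not measurements from the DF article or from Sony.
#include <cstdio>

int main() {
    const double total_bw = 176.0; // GB/s, a PS4-class unified GDDR5 pool
    const double penalty  = 2.5;   // assumed: each GB/s of CPU traffic costs the GPU
                                   // more than 1 GB/s (arbitration, DRAM page thrashing)
    for (double cpu_bw = 0.0; cpu_bw <= 20.0; cpu_bw += 5.0) {
        const double gpu_bw = total_bw - penalty * cpu_bw;
        std::printf("CPU using %4.1f GB/s -> GPU left with ~%5.1f GB/s\n", cpu_bw, gpu_bw);
    }
    return 0;
}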
 

Marlenus

Member
If the A10-7800, with 38GB/s of memory bandwidth, can go from no filtering to 16x and suffer barely any drop, there is no excuse for the consoles sucking in this regard. All three have the same base GCN 1.1 architecture, and the PS4 and Xbox One have more bandwidth.
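As a back-of-the-envelope comparison (theoretical peak figures only, assuming 1080p60 and ignoring contention entirely), that 38GB/s still works out to roughly 300 bytes of memory traffic per pixel per frame, while the PS4's pool gives well over a thousand:

// Back-of-the-envelope: theoretical bytes of memory traffic available per pixel
// per frame at 1080p60, using commonly quoted peak bandwidth figures and
// ignoring CPU contention entirely.
#include <cstdio>

int main() {
    const double pixels_per_sec = 1920.0 * 1080.0 * 60.0; // ~124.4 million
    const struct { const char* name; double gbps; } systems[] = {
        {"A10-7800 (dual-channel DDR3-2400)", 38.4},
        {"PS4 (unified GDDR5)",               176.0},
        {"Xbox One (DDR3, excluding ESRAM)",  68.0},
    };
    for (const auto& s : systems) {
        std::printf("%-36s %6.0f bytes/pixel/frame\n",
                    s.name, s.gbps * 1e9 / pixels_per_sec);
    }
    return 0;
}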
 

Skux

Member
I'm not a computer graphics expert but the idea that AF is too 'expensive' just baffles me. The performance hit is minuscule, even on lower-end systems, even on APUs as demonstrated. There's just no reason that's been given to us why it isn't at least 8x in almost every game. But at the same time the experts working on these games continually choose low quality trilinear filtering. It's almost like they're hiding some big secret from us.
 

EGM1966

Member
Interesting. I'd like to see AF given more focus in titles. TBH I feel too many titles on console lack a firm base in the form of 1080p, a solid 30fps minimum, and good AA and AF. Build what you can on top of that, devs, but at least get the foundations right.

TBH, seeing how good, say, MGSV looks at 60fps, or even the Uncharted remaster, I realise I'd prefer fewer scrappy 1080p titles with weak AF and unstable framerates (or worse, 900p titles with similar problems) and more titles that get the basics right first.
 

c0de

Member
I'm not a computer graphics expert but the idea that AF is too 'expensive' just baffles me. The performance hit is minuscule, even on lower-end systems, even on APUs as demonstrated. There's just no reason that's been given to us why it isn't at least 8x in almost every game. But at the same time the experts working on these games continually choose low quality trilinear filtering. It's almost like they're hiding some big secret from us.

It seems it's not about performance directly but about the fight for a shared resource where two computing units want data, and as the graph in the article shows, this doesn't happen in the way you would normally think. The more the CPU wants, the bigger the penalty on the GPU side.
 

Marlenus

Member
It is the most similar thing available in the PC space, but perhaps they need to saturate bandwidth even more to show the differences better. Likewise, there could be a performance cliff on consoles that is not visible in the testing.

There is no logical reason for it though. The AMD APU has crap bandwidth compared to the consoles, and it is down on shaders, ROPs, TMUs, etc., yet it suffers barely any performance degradation. There is no hardware reason for it at all, so the only other possibility is software, and that is on the devs.

It seems it's not about performance directly but about the fight for a shared resource where two computing units want data, and as the graph in the article shows, this doesn't happen in the way you would normally think. The more the CPU wants, the bigger the penalty on the GPU side.

Sorry but if a system running a Kaveri APU does not show such a scenario it really does not exist.

This seems more and more like dev oversight and they are not readily going to admit to it.
 
I have a 7850 and I always turn it up to 16x in any PC game that has the option (which is most) with little to no performance impact. It makes such a noticeable difference and can go a long way in making a game look nicer. It's really strange that it seems to be such a problem on consoles. Some of those images comparing the three are pretty jarring.
 
There is no logical reason for it though. The AMD APU has crap bandwidth compared to the consoles, and it is down on shaders, ROPs, TMUs, etc., yet it suffers barely any performance degradation. There is no hardware reason for it at all, so the only other possibility is software, and that is on the devs.
It definitely does not seem logical based on what we can deduce from the info that is readily available. But devs for some reason think it is significant enough to cut. So either there is something we are not seeing, IMO, in the "equivalent" PC that DF tested, or devs' technical priorities are just plain skewed universally.

Like I said, there could be a performance fall-off that is not visible. Some GPUs, for example, have decidedly non-linear MSAA performance with increasing sample counts.
 

Easy_D

never left the stone age
Yeah, it's kinda weird how devs forget about AF. They have these wonderful textures on display, but you only get an in-game metre ahead of you before it looks like pea soup.

Like, why? At least apply AF on all ground textures, minimum.
 

Javin98

Banned
So there still isn't a definitive answer for this issue. The fact that a very underpowered PC with much less bandwidth can run 16× AF with almost no performance hit when consoles can't is even more baffling. What is really the issue with AF on consoles? Also, I think most of the games cited by DF in the article aren't good examples. Most are just quick cash grabs, with the devs probably not giving a shit about AF on the PS4, which seems to require some additional work or else it defaults to trilinear.
 
"Generally PS4 titles go for 1080p and Xbox One for 900p," he says. "Anisotropic filtering is very expensive, so it's an easy thing for the PS4 devs to drop (due to the higher resolution) to gain back performance, as very few of them can hold 1080p with AF. Xbox One, as it is running at lower res, can use the extra GPU time to do expensive AF and improve the image."

Can someone explain a couple of things from this quote?

What does "expensive" mean in this context?

And is there really only a few games that have both 1080p and AF on the PS4?
 

nib95

Banned
Can someone explain a couple of things from this quote?

What does "expensive" mean in this context?

And is there really only a few games that have both 1080p and AF on the PS4?

It refers to how much it costs in terms of system resources or rendering budget: the more you spend on one expensive graphical feature, the less there is to spend on others. Thing is, traditionally, at least with PC gaming, anisotropic filtering isn't actually very taxing at all.
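To put "expensive" into concrete numbers: a 30fps game has roughly 33.3ms of GPU time per frame and a 60fps game roughly 16.7ms, so the same feature cost eats twice the relative budget at 60fps. The 1ms figure below is a made-up example cost, not a measured AF cost.

// What "expensive" means in practice: a feature's cost as a slice of the frame budget.
// The 1.0 ms feature cost is an assumed example, not a measured cost of AF.
#include <cstdio>

int main() {
    const double feature_cost_ms = 1.0; // hypothetical cost of one graphical feature
    const double budgets_ms[]    = {1000.0 / 30.0, 1000.0 / 60.0}; // 30fps, 60fps
    const char*  labels[]        = {"30fps", "60fps"};
    for (int i = 0; i < 2; ++i) {
        std::printf("%s: %.1f ms budget, feature takes %.1f%% of it\n",
                    labels[i], budgets_ms[i], 100.0 * feature_cost_ms / budgets_ms[i]);
    }
    return 0;
}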
 

Marlenus

Member
It definitely does not seem logical based on what we can deduce from the info that is readily available. But devs for some reason think it is significant enough to cut. So either there is something we are not seeing, IMO, in the "equivalent" PC that DF tested, or devs' technical priorities are just plain skewed universally.

Like I said, there could be a performance fall-off that is not visible. Some GPUs, for example, have decidedly non-linear MSAA performance with increasing sample counts.

But when it does get patched in there is no performance impact, so really all I can think of is dev oversight or API differences. That still does not explain why the AF is low in a lot of games. Would Driveclub suddenly drop frames if they turned AF up to 16x? I doubt it, as they added an entire weather system and that would have been substantially more taxing than increasing AF settings.
 

Danlord

Member
I was playing through Everybody's Gone to the Rapture the other day and that suffers from bad AF, even though there's no AI or weapons and such to take notice of.

The strange thing is, because it's CryEngine it includes a heavy amount of motion blur (unfortunately I panned the camera just as I took the screenshot; PNG capture might lag a bit compared to JPEG):
[screenshot: 921Mu3P.jpg]
and even parallax occlusion mapping;

but suffers from bad AF on some objects;
[screenshot: q5tg6AV.jpg]

which, viewed head-on, looks like this:
[screenshot: hh98AqJ.jpg]

Another small example of low/no AF
[screenshot: y8lHoKM.jpg]

I don't get it. The game features a lot of dense foliage and geometry, so the problem can be sort of hidden most of the time and is only noticeable in the areas with buildings and bricked paths, but it's such a jarring thing when you come across a beautifully rendered town and have the textures blur so badly when you approach from a certain angle. I explored a lot throughout my playthrough, so it was noticeable to me.
 

VGA222

Banned
I think that DF should have toyed with their test PC's memory clock speed a bit to see if that would change the performance impact of AF. If the games that they tested weren't bandwidth-bound then this test wouldn't really be indicative of the relationship between memory bandwidth and AF on consoles.
 
There is no logical reason for it though. The AMD APU has crap bandwidth compared to the consoles, and it is down on shaders, ROPs, TMUs, etc., yet it suffers barely any performance degradation. There is no hardware reason for it at all, so the only other possibility is software, and that is on the devs.



Sorry but if a system running a Kaveri APU does not show such a scenario it really does not exist.

This seems more and more like dev oversight and they are not readily going to admit to it.

I've yet to be convinced the reason is any more complicated than that they haven't enabled it. Occam's razor. I don't really see any need to overthink it.

Unless a dev comes out and says "we disabled it because of X" then there's not much else to say. In reality it's mostly been, "oops, we've enabled it now".
 

c0de

Member
Can someone explain a couple of things from this quote?

What does "expensive" mean in this context?

And is there really only a few games that have both 1080p and AF on the PS4?

Given the context of the article, expensive in terms of bandwidth.
How many games? Well, in the other thread people said it would only affect a few games, but only a few games get a Face-Off, so that argument is crap.
 

Javin98

Banned
Can someone explain a couple of things from this quote?

What does "expensive" mean in this context?

And is there really only a few games that have both 1080p and AF on the PS4?
"Expensive" refers to how much hardware grunt it consumes to have it enabled. A more demanding graphical feature can be called more expensive. When a feature that is used it too "expensive", it can cause frame rate drops and/or stuttering, which is why on consoles, devs have to wisely use the right features, typically those that are less demanding. A good example would be post processing AA being very common on consoles because it is "cheap". Likewise for SSAO whereas more powerful PC's can utilize the more expensive HBAO+.

As for your second question, most games have AF on PS4 at 1080p, but it is usually insufficient, ranging around 4× AF. Some games do have a good level of AF, though, such as Bloodborne, Dark Souls 2, Shadow of Mordor and Uncharted:NDC. Some games even had AF patched in with no performance hits at all, like Dying Light and DMC.
 

c0de

Member
I think that DF should have toyed with their test PC's memory clock speed a bit to see if that would change the performance impact of AF. If the games that they tested weren't bandwidth-bound then this test wouldn't really be indicative of the performance impact that AF has on consoles.

The thing is that the consoles seem to be far more different from PCs than is obvious, so the test wouldn't tell you much.
 

c0de

Member
I've yet to be convinced the reason is any more complicated than that they haven't enabled it. Occam's razor. I don't really see any need to overthink it.

Unless a dev comes out and says "we disabled it because of X" then there's not much else to say. In reality it's mostly been, "oops, we've enabled it now".

Did you read the article? There are quotes from devs in it. I don't believe someone just forgets to enable it.
 

VGA222

Banned
The thing is that the consoles seem to be far more different from PCs than is obvious, so the test wouldn't tell you much.

The quote in the article from the Chief Technology Officer of Bluepoint seems to indicate that the issue stems from the effect that a shared memory architecture has on memory bandwidth, for both the consoles and DF's test PC.
 

Javin98

Banned
Did you read the article? There are quotes from devs in it. I don't believe someone just forgets to enable it.
There is still the theory that the PS4 defaults to trilinear filtering if some additional work isn't done to enable AF, which I think is the reason for the lack of AF in those poorly optimized ports, most being quick cash grab remasters.
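The PS4's own graphics API isn't public, so purely as a PC analogy for what that "additional work" could look like: in Direct3D 11, anisotropic filtering is opt-in per sampler state. If a port keeps shipping a plain linear/trilinear sampler, you get exactly the "no AF" result, and turning it on is a handful of lines.

// PC analogy only (Direct3D 11, not the PS4 API): anisotropic filtering has to be
// requested explicitly per sampler; otherwise you stay on bilinear/trilinear.
#include <d3d11.h>

ID3D11SamplerState* CreateAnisoSampler(ID3D11Device* device)
{
    D3D11_SAMPLER_DESC desc = {};
    desc.Filter         = D3D11_FILTER_ANISOTROPIC;     // vs D3D11_FILTER_MIN_MAG_MIP_LINEAR (trilinear)
    desc.MaxAnisotropy  = 16;                            // 1-16; only used with an anisotropic filter
    desc.AddressU       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressV       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressW       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.ComparisonFunc = D3D11_COMPARISON_NEVER;
    desc.MinLOD         = 0.0f;
    desc.MaxLOD         = D3D11_FLOAT32_MAX;

    ID3D11SamplerState* sampler = nullptr;
    device->CreateSamplerState(&desc, &sampler);         // bind later with PSSetSamplers()
    return sampler;
}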
 

c0de

Member
The quote in the article from the Chief Technology Officer of Bluepoint seems to indicate that the issue stems from the shared memory architecture that the consoles and DF's test PC use.

And they failed to reproduce it with the system so it seems the comparison doesn't work as expected.
 

c0de

Member
There is still the theory that the PS4 defaults to trilinear filtering if some additional work isn't done to enable AF, which I think is the reason for the lack of AF in those poorly optimized ports, most being quick cash grab remasters.

But the CTO of Bluepoint also mentions disadvantages, and I don't think they are incompetent or known for cash grab ports and remasters.
 

Javin98

Banned
But the CTO of Bluepoint also mentions disadvantages, and I don't think they are incompetent or known for cash grab ports and remasters.
I never said Bluepoint was incompetent. In fact, Uncharted:NDC has a good level of AF on most surfaces, with most being 16× AF but some being 4× AF. Clearly, Bluepoint is not having issues with AF.
 

Angel_DvA

Member
It's a developer issue. Some devs have no problem with AF, some do. How many games have been fixed on PS4 with patches? Too damn many...
 

Marlenus

Member
And they failed to reproduce it with the system so it seems the comparison doesn't work as expected.

Or the comparison is perfectly fine and the reason is down to the devs.

But the CTO of Bluepoint also mentions disadvantages, and I don't think they are incompetent or known for cash grab ports and remasters.

Yet those games have good levels of AF, so while the shared memory setup can have disadvantages, it seems obvious that low AF is not one of them.
 

c0de

Member
I never said Bluepoint was incompetent. In fact, Uncharted:NDC has a good level of AF on most surfaces, with most being 16× AF but some being 4× AF. Clearly, Bluepoint is not having issues with AF.

I'm not saying they have a problem with it, but obviously 16x was too much to use everywhere, and didn't they also fall back to trilinear in places? I don't remember.
But the fact that they talk about it shows it's definitely not free.
Also, achieving these results doesn't mean they didn't have problems getting there, only that if there were problems, they still found a way to do it.
 

Javin98

Banned
I'm not saying they have a problem with it, but obviously 16x was too much to use everywhere, and didn't they also fall back to trilinear in places? I don't remember.
But the fact that they talk about it shows it's definitely not free.
Also, achieving these results doesn't mean they didn't have problems getting there, only that if there were problems, they still found a way to do it.
As far as I know, Uncharted: NDC never used trilinear on any surface. Even surfaces with blurry textures in the distance still showed a noticeable but insufficient level of AF, likely 4×. Also, I never said it was free, and your last statement just proves my point that devs doing cash grab remasters didn't put in the extra effort to enable AF on PS4.
 
I've yet to be convinced the reason is any more complicated than that they haven't enabled it. Occam's razor. I don't really see any need to overthink it.

Unless a dev comes out and says "we disabled it because of X" then there's not much else to say. In reality it's mostly been, "oops, we've enabled it now".

Yeah, it seems like some forgotten setting, or it's seen as not important, or sometimes it's a difference of opinion. I haven't followed every situation, though. And is that quote about bandwidth from a developer? The problem is that very low bandwidth on a low-end PC doesn't affect much when AF is enabled.

On PC it's like a driver setting. We have PC games that default to 4x today; I've been forcing it to 16x for like a decade, so it's really an end-user thing. I wonder if sometimes on console it's to save literally 2-3 frames and they think AF is the first thing to go and easy to disable, while most times it's just either forgotten or not seen as important (and again, on PC it's an end-user thing). I like native resolution and high AF. We see this gen many devs adding chromatic aberration, and some patching in AF by end-user request as if it had never crossed their minds, and maybe some API differences aren't flagged because, again, it's seen as not important. I could see the extra clarity that AF brings not being high on the list, with some thinking the blur is a good aesthetic or a non-issue.
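For the "it's like a driver setting" point: the driver override just forces a value over whatever the game asked for. In the game's own code it's about one line per texture; OpenGL is shown here as one example, via the long-standing EXT_texture_filter_anisotropic extension.

// OpenGL example: requesting anisotropic filtering per texture through the
// EXT_texture_filter_anisotropic extension. A driver-level "16x" override simply
// forces a value like this regardless of what the application set.
#include <GL/gl.h>

#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
#define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
#endif

void EnableMaxAniso(GLuint texture)
{
    GLfloat maxAniso = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxAniso); // typically 16.0

    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR); // trilinear base
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, maxAniso);        // up to 16x AF
}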
 

Gurish

Member
I expected DF to come back with a clear answer if they decided to delve into this. How mysterious can it be? Can developers really not provide a clear answer as to why it's so problematic?
 
The article is spot on. Low quality texture filtering sticks out like a sore thumb in some really good looking games. The IQ is an absolute mess despite some high quality lighting, shadows, reflections, etc., and it's a real shame, as games could look even better if AF weren't thrown under the bus.

I mean, I know it's a last-gen game, but the principle still applies IMO: DmC Definitive Edition showed that games can improve the AF with little sacrifice in performance. And yes, I know it does affect bandwidth, but still, I think a little more effort in that respect to get as much quality as possible (or even to make room for it by sacrificing another minor feature) would go a long way.
 

Kezen

Banned
Anisotropic filtering is very expensive, so it's an easy thing for the PS4 devs to drop (due to the higher resolution) to gain back performance, as very few of them can hold 1080p with AF. Xbox One, as it is running at lower res, can use the extra GPU time to do expensive AF and improve the image."
How is 16xAF expensive, exactly? I mean, even in modern games I can effortlessly enable high quality 16xAF via the driver at little to no performance cost.
It's been like this for so long, I don't remember not turning on 16xAF ever since I began gaming on PC in 2004.

It's quite embarrassing for countless console releases not to ship with 16xAF, and sometimes it's even lower than x8.
 

Marlenus

Member
I'm not saying they have a problem with it, but obviously 16x was too much to use everywhere, and didn't they also fall back to trilinear in places? I don't remember.
But the fact that they talk about it shows it's definitely not free.
Also, achieving these results doesn't mean they didn't have problems getting there, only that if there were problems, they still found a way to do it.

They talked about memory contention being a disadvantage of a unified memory setup, but the Kaveri test PC shows that such a disadvantage does not affect AF performance.
 

Shin-Ra

Junior Member
I don't feel any the wiser after reading this.

Even with the CPU competing for memory bandwidth, there's still far more bandwidth available than lower spec PC GPU/VRAM pairings that'd cope perfectly fine with something like Dishonored at full 16xAF across the board.
 

Renekton

Member
I'll go with Dictator's theory that the A10-7800 test may have had different workload distribution and bottlenecks than Jaguar.
 

dr_rus

Member
Oh god they are talking about memory bandwidth and how "expensive" AF is again, are they?

Sometimes I think that most of the devs who talk this way haven't even tried using AF on the new consoles and are just basing their replies on what they learned during the 360/PS3 era. Because no, AF is not memory bandwidth intensive, and no, you don't have less bandwidth on PS4 than on XBO even with full CPU bandwidth saturation.

I can see how "900p+AF vs 1080p-AF" thing may be possible but it's not like this is what's happening. There are plenty of examples of games missing AF on PS4 while running in the same res as on XBO. There are also several examples when AF was patched in after release with no performance drop whatsoever.

Right now the only possible explanation we have left is an oversight on the developers' part.
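A rough sketch of why the bandwidth argument usually doesn't hold up: even a worst-case 16x anisotropic sample is a series of up to 16 trilinear probes marching along the pixel footprint's major axis, and those probes hit adjacent texels that the texture cache mostly absorbs, so very little of it becomes extra DRAM traffic. Simplified worst-case counts below (real hardware varies, and the full probe count only applies to strongly anisotropic footprints):

// Simplified worst-case texel fetches per shaded pixel, ignoring the texture
// cache (which in practice absorbs most of these, since the probes are adjacent
// texels) and ignoring that most pixels don't need the full probe count.
#include <cstdio>

int main() {
    const int bilinear_taps     = 4;                  // texels per bilinear probe
    const int trilinear_fetches = 2 * bilinear_taps;  // two mip levels = 8
    for (int aniso = 1; aniso <= 16; aniso *= 2) {
        std::printf("%2dx AF: up to %3d texel fetches\n", aniso, aniso * trilinear_fetches);
    }
    return 0;
}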
 
I can see how "900p+AF vs 1080p-AF" thing may be possible but it's not like this is what's happening. There are plenty of examples of games missing AF on PS4 while running in the same res as on XBO. There are also several examples when AF was patched in after release with no performance drop whatsoever.

Right now the only possible explanation we have left is an oversight on the developers' part.

Yup pretty much. It just shows some neglect in that aspect.

"Generally PS4 titles go for 1080p and Xbox One for 900p," he says. "Anisotropic filtering is very expensive, so it's an easy thing for the PS4 devs to drop (due to the higher resolution) to gain back performance, as very few of them can hold 1080p with AF. Xbox One, as it is running at lower res, can use the extra GPU time to do expensive AF and improve the image."

It's a theory that holds strong in some games, but not in others - such as Tony Hawk's Pro Skater 5, Strider, PayDay 2: Crimewave Edition or even Dishonored: Definitive Edition. In these four cases, both PS4 and Xbox One are matched with a full native 1920x1080 resolution, and yet Xbox One remains ahead in filtering clarity in each game."

This quote from the article says it right there. It's just sheer incompetence.
 
Well, I guess you can handle cache memory more efficiently when you have local memory to back it up. But on consoles you have to share memory, so you can't let AF eat up all the cache like on PC, since you need it for more crucial processes.

Just guessing.
 
I just hope that developers try to prioritize texture filtering more often, even if it comes at the cost of another minor graphical feature. The lack of it is becoming increasingly noticeable (and jarring) as games get better looking.
 

Ran rp

Member
I've been asking myself this question at least since the PS3 and 360 were released.

It makes such a massive difference to IQ that, even if the relative impact on performance is larger than it is on PC, it should always be worth it.

It's always the first thing I set to high or even max when gaming on my weak laptop.
 