
Why is anisotropic filtering missing in many next-gen titles?

ethomaz

Banned
It is missing (or at least set to a low level) on Watch Dogs PC too, unless you force it at the driver level on Nvidia GPUs.

So I guess it is a dev issue... you need to talk with them ;)
 

LCGeek

formerly sane
Some enthusiasts care but the general audience does not, and if I recall correctly 60fps has been the norm going back to at least the SNES.

And just because console companies use PC hardware or adopted PC norms does not mean that devs or customers really care about technical specsheet wars like 30fps, anisotropic filtering, etc. And how am I "dead wrong on online" when I listed the truth: console customers are okay with monthly subs and sub-1080p multiplayer? I'm not sure how your response replies to that.

60fps was a norm for 2d games.

For 3D games it's not, and plenty of Saturn, PSX, and N64 games are proof of that. 30fps became more of a standard in the second generation of 3D consoles, and even then it was largely still shaky until last generation.

The devs do care; if they didn't, none of them would be welcoming the features they have been using. Each generation they get new tools and toys. Sony has now ditched its own exotic graphics hardware for off-the-shelf GPUs two times. MS has used PC hardware for good reason, as devs like the ease of it compared to what existed before. Epic alone was the sole reason Sony and MS beefed up the RAM to 512MB instead of 256MB last generation. Devs matter in the discussion, unless you want to ignore the various interviews on the subject where devs and manufacturers give credit to the idea. You certainly cannot ignore the influence of Epic, Crytek, Naughty Dog, and a few other big names who have all prodded the big players in the industry to go a certain direction.

As for consumers, most don't care about graphics in general, eye candy or not, going by my experience with casual players versus the enthusiasts I meet at conventions and the like. Most consumers just want a good gaming experience; graphics get far more interest from the hardcore and enthusiast groups.

And to my points before: are you forgetting the dramatic changes in TVs alone that consumers have seen since the 80s? They want their gaming hardware to keep up; if they didn't, we wouldn't have moved beyond 480p and the resolutions below it. Monthly subscriptions are something the PC has been doing for ages, and for good reason: it brings in the money. The entire online infrastructure, massive dedicated servers aside, has been borrowed from ideas the PC pioneered, including MS's own Gaming Zone, which was popular for IPX hacks and dial-up types.

You can argue all you like that they do not care that much, but to say they don't care at all is a stretch of the imagination, history, and sanity.
 
The explanation throughout the whole thread has been "consoles don't have as much bandwidth as today's PC GPUs", and it's been accepted as valid. People have responded with, "Okay, then what about yesterday's GPUs?"

Can you actually explain why bandwidth is such a bigger issue on consoles? So much so that they can't equal 10-year-old PC AF performance?

People keep saying it's bandwidth, with no depth added to the technical aspects of why.
It makes sense to be bandwidth limited, because I believe you need more texture samples to get higher levels of filtering.

I'm not sure it would be a large amount, but if you are already using all available bandwidth, anything that requires more will bring performance down.
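To make that concrete, here's a rough back-of-the-envelope sketch. All numbers are assumptions for illustration (uncompressed RGBA8 texels, one lookup per pixel at 1080p, worst-case sample counts with nothing cached); real games use compressed textures and caches absorb most of the traffic, which is exactly why the true cost is so scene- and architecture-dependent:

```python
# Rough, assumed numbers only: how many texels one shaded sample can pull in,
# and what that could mean per frame if nothing were cached.

BYTES_PER_TEXEL = 4              # uncompressed RGBA8; block compression is much smaller
LOOKUPS_PER_FRAME = 1920 * 1080  # one texture lookup per pixel, ignoring overdraw

# bilinear: 4 texels; trilinear: 2 mip levels x 4; Nx AF: up to N trilinear probes
TEXELS_PER_LOOKUP = {"bilinear": 4, "trilinear": 8, "16x AF": 16 * 8}

for mode, texels in TEXELS_PER_LOOKUP.items():
    worst_case_mib = texels * BYTES_PER_TEXEL * LOOKUPS_PER_FRAME / 2**20
    print(f"{mode:>9}: up to {worst_case_mib:6.0f} MiB of texel reads per frame (uncached worst case)")
```

The point isn't the absolute numbers; it's that higher AF levels multiply the number of texel taps per lookup, so on a machine that is already bandwidth-bound the extra traffic has to come from somewhere.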

Outside of 900p Watch_Dogs, yes.
Can you give some examples of games and their resolutions/framerate?
I'll ask again: What generation are you living in? Where are all these sub-1080p PS4 games? Yeah, there are two (and a half, if you include KZ's multiplayer), but you're making it sound like 1080p is still some rare and mythical thing on consoles. It's not. It's the de facto standard resolution on PS4, with just a few exceptions. XBO is a slightly different matter, but still.



Last gen I was gaming on a laptop with a GT 555M (2GB DDR3), in addition to my PS3. I could generally run console ports at higher-than-console settings, but I couldn't max most games and get good framerates. AF, however, I could always crank up to 8x or 16x without a discernible impact on performance.

Well, on the higher end that's a 50GB/s GPU, which is more than what's available on either the PS3 or 360. And unlike those consoles, all of that is available solely to the GPU; it's not shared in any way with the CPU.

So it makes sense that this card is able to run console ports at higher quality settings, and especially higher resolutions, than the PS360 would, and still be able to crank up AF settings.
 
I honestly don't know. Graphics cards for a few generations now have had virtually free 16xAF. Here's a GTX 460 (released in mid-2010) running Bad Company 2:

[image: GTX 460 Bad Company 2 benchmark chart]

Yup. I have 16xAF forced in my control panel as a default for all applications. Seems like a no brainer
 

LCGeek

formerly sane
Yup. I have 16xAF forced in my control panel as a default for all applications. Seems like a no brainer

Yes, if your hardware is much more powerful than the 360 and has the grunt for it. People have already explained, for various reasons, why you cannot compare what PCs are doing to a console. Even then, JNT has laid it out clearly from a development standpoint why this happens.

Also, try some perspective: a GTX 460 shits on the X1900, which is the closest thing to what the 360 has.
 

Abounder

Banned
60fps was a norm for 2d games.

For 3D games it's not, and plenty of Saturn, PSX, and N64 games are proof of that. 30fps became more of a standard in the second generation of 3D consoles, and even then it was largely still shaky until last generation.

The devs do care; if they didn't, none of them would be welcoming the features they have been using. Each generation they get new tools and toys. Sony has now ditched its own exotic graphics hardware for off-the-shelf GPUs two times. MS has used PC hardware for good reason, as devs like the ease of it compared to what existed before. Epic alone was the sole reason Sony and MS beefed up the RAM to 512MB instead of 256MB last generation. Devs matter in the discussion, unless you want to ignore the various interviews on the subject where devs and manufacturers give credit to the idea. You certainly cannot ignore the influence of Epic, Crytek, Naughty Dog, and a few other big names who have all prodded the big players in the industry to go a certain direction.

As for consumers, most don't care about graphics in general, eye candy or not, going by my experience with casual players versus the enthusiasts I meet at conventions and the like. Most consumers just want a good gaming experience; graphics get far more interest from the hardcore and enthusiast groups.

And to my points before: are you forgetting the dramatic changes in TVs alone that consumers have seen since the 80s? They want their gaming hardware to keep up; if they didn't, we wouldn't have moved beyond 480p and the resolutions below it. Monthly subscriptions are something the PC has been doing for ages, and for good reason: it brings in the money. The entire online infrastructure, massive dedicated servers aside, has been borrowed from ideas the PC pioneered, including MS's own Gaming Zone, which was popular for IPX hacks and dial-up types.

You can argue all you like that they do not care that much, but to say they don't care at all is a stretch of the imagination, history, and sanity.

60fps is the video game standard; we saw it with the PC and the SNES, and today Nintendo and other companies keep that standard alive on contemporary consoles too.

I don't even know how to kindly reply to your post, which seems like a bunch of historical tangents with vague points, but in the end I think we should both agree: technical specsheet items like the level of anisotropic filtering or fps mean very little, which is why we don't see 60fps or such features listed on the box. However, that does not mean that tech and industries remain complacent, as your examples showed.

As for your last sentence: don't put words into other people's messages. I never said they didn't care "at all", just that they didn't care "that much", which is kind of beside the point anyway, since they are selling millions despite what a specsheet says. They don't really care about 60fps or 16x AF; it's about the $$$ and selling an experience.
 

LCGeek

formerly sane
60fps is the video game standard; we saw it with the PC and the SNES, and today Nintendo and other companies keep that standard alive on contemporary consoles too.

I don't even know how to kindly reply to your post, which seems like a bunch of historical tangents with vague points, but in the end I think we should both agree: technical specsheet items like the level of anisotropic filtering or fps mean very little, which is why we don't see 60fps or such features listed on the box. However, that does not mean that tech and industries remain complacent, as your examples showed.

As for your last sentence: don't put words into other people's messages. I never said they didn't care "at all", just that they didn't care "that much", which is kind of beside the point anyway, since they are selling millions despite what a specsheet says. They don't really care about 60fps or 16x AF; it's about the $$$ and selling an experience.

On PC the standard is much higher; 60fps is old news in a lot of enthusiast circles, especially for anyone with a high-refresh-rate monitor, myself included. I never said it wasn't a standard anymore, just one that doesn't apply to the situation. Who cares about the SNES? We are talking about 3D consoles, where that standard is not the majority.

Why should I agree with your faulty premise? Vague histories? These are things you should know, especially in a topic like this, before spewing a useless point that adds nothing to the discussion and is factually not true. Not only that, you claimed devs didn't care, when I pointed out how in recent history they have, and have pretty much cared about specs since the PS2 age; I don't think you're bold and ignorant enough to declare to everyone that it doesn't matter to devs. Also, I do not speak in tangents; I converge on a point you brought up that is dead wrong, with facts I point out, be they recent or old history.

"Not at all" and "not much" are pretty much the same thing, considering your tone already. Want me to be precise? Then stop speaking in basic abstracts.

Be kind or not, you're wrong.
 

Abounder

Banned
On PC the standard is much higher; 60fps is old news in a lot of enthusiast circles, especially for anyone with a high-refresh-rate monitor, myself included. I never said it wasn't a standard anymore, just one that doesn't apply to the situation. Who cares about the SNES? We are talking about 3D consoles, where that standard is not the majority.

Why should I agree with your faulty premise? Vague histories? These are things you should know, especially in a topic like this, before spewing a useless point that adds nothing to the discussion and is factually not true. Not only that, you claimed devs didn't care, when I pointed out how in recent history they have, and have pretty much cared about specs since the PS2 age; I don't think you're bold and ignorant enough to declare to everyone that it doesn't matter to devs. Also, I do not speak in tangents; I converge on a point you brought up that is dead wrong, with facts I point out, be they recent or old history.

"Not at all" and "not much" are pretty much the same thing, considering your tone already. Want me to be precise? Then stop speaking in basic abstracts.

Be kind or not, you're wrong.

Agree to disagree then: I think most console customers, developers, and hardware manufacturers don't really care about the level of anisotropic filtering or even fps. They care more about $$$ and the experience instead of digital foundry specsheet wars.

60fps is still the benchmark on the PC and video games in general, but yea it's old-hat for enthusiasts.

You can argue all you like that they do not care that much, but to say they don't care at all is a stretch of the imagination, history, and sanity.

Be kind or not, you're wrong.

You're also dead wrong

As for tone: it's hard to reply kindly to walls of text that question one's "sanity" (avatar quote?) or how they're "dead wrong" but I digress.

Can you sum up how I'm wrong because it seems like we agree? Here is my premise from the above: I think most console customers, developers, and hardware manufacturers don't really care about the level of anisotropic filtering or even fps. They care more about $$$ and the experience instead of digital foundry specsheet wars. Otherwise specs like 60fps, 16xAF, etc. would be focused on.
 
I think part of the reason you don't see as much AF in console games is that it is discriminatory against people that don't notice the AF.
 

LCGeek

formerly sane
Agree to disagree then: I think most console customers, developers, and hardware manufacturers don't really care about the level of anisotropic filtering or even fps. They care more about $$$ and the experience instead of digital foundry specsheet wars.

Keep moving that goal post.

SNES vs Genesis: eye candy wins
Saturn, PSX, N64 vs all the other crap we got: people chose the better one
PS2 vs DC: noticing a pattern?
GC vs Xbox
Wii vs PS3/360
Wii U vs PS4/X1

Need I say more?

Stop moving your own goal posts


Consumers aren't half as disinterested as you say they are

60fps is still the benchmark on the PC and video games in general, but yea it's old-hat for enthusiasts.

It's a standard, but not the highest. As for video games in general, the history of 3D consoles would say otherwise. You trying to act like it's the standard for consoles, and ignoring the debates here, is actually pretty ballsy.

As for tone: it's hard to reply kindly to walls of text that question one's "sanity" (avatar quote?) or how they're "dead wrong" but I digress.

I love it when gaffers avatar-quote and talk out of their ass. I earned that tag, and believe me, my posts were epic to earn it, including in 2012 well before most were even aware of it, and the conspiracy troll stuff. Go ahead, show more ignorance.

Can you sum up how I'm wrong because it seems like we agree? Here is my premise from the above: I think most console customers, developers, and hardware manufacturers don't really care about the level of anisotropic filtering or even fps. They care more about $$$ and the experience instead of digital foundry specsheet wars. Otherwise specs like 60fps, 16xAF, etc. would be focused on.

Yup. Console customers don't care much for PC or other video game standards: 30fps, sub-1080p (especially in online multiplayer), monthly subs, and even black bars are becoming a thing for "next-gen" boxes.

As for anisotropic filtering and other issues: it's either the quality of the dev talent, hardware limitations, or rushed project deadlines. But it sounds like an artistic choice even though it's been lacking


Can't edit now, so you're sort of stuck. You argue for 60fps despite mentioning 30fps; neither is a long-time standard for console games, at best two generations. 60fps or 30fps are nice options when devs do it right; otherwise fps has been pretty spotty and you're at the mercy of the game. You mention sub-1080p for these systems, yet ignore that until last generation the majority of console games didn't even support 1080p or 720p. Before that, consoles barely enjoyed 480p, just to be clear. Black bars aren't new, or have you missed the various titles doing it even back in the PSX age? Consumers do care; them buying HDTVs to play these titles shows plenty that they care.

I just showed that you're now limiting the scope, despite the fact that I responded to you and another user making a very basic premise that you're now walking away from. I don't care anymore, especially if that's how you like to argue.
 

mrklaw

MrArseFace
I accept the possibility that it requires a small amount of power or bandwidth and developers choose to assign that elsewhere. However, that would suggest that almost all developers came to the same conclusion, which seems odd. You'd think at least some would value the increased quality from adding AF.

Arguably AF is a bigger improvement to image quality than other, more intensive features that could be turned down slightly to compensate.
 

Guess Who

Banned
Keep moving that goal post.

SNES vs Genesis: eye candy wins
Saturn, PSX, N64 vs all the other crap we got: people chose the better one
PS2 vs DC: noticing a pattern?
GC vs Xbox
Wii vs PS3/360
Wii U vs PS4/X1

I like how cherry-picked this is. "PS2 vs DC" and "GC vs Xbox" are very disingenuous when those are all the same generation, and the PS2 pounded the GC and Xbox despite being weaker than either. The Wii also outsold the PS3 and 360 by a substantial margin.
 

LCGeek

formerly sane
I like how cherry-picked this is. "PS2 vs DC" and "GC vs Xbox" are very disingenuous when those are all the same generation, and the PS2 pounded the GC and Xbox despite being weaker than either. The Wii also outsold the PS3 and 360 by a substantial margin.

His point is that consumers don't care about graphics.

The PS2 beats the DC, and the DC came out first.

The PS2 still beats the GC, and trades with it in performance in a variety of titles. Yet out of the four systems that came out that generation, the PS2 and Xbox, which each had their respective competitors, beat the lesser systems that were somehow supposed to win because consumers only care about games.

Also, singling out that generation versus the rest I listed is cherry-picking in itself. You might not want to call a spade a spade and then miss the bigger point, which you didn't refute at all.
 

Dash Kappei

Not actually that important
Anisotropic filtering looks gross at lower resolutions. I wouldn't be surprised if devs opted to not worry about it because most console games are still running at 720p, and even at resolutions like 1080p AF still doesn't really look good at all, especially in comparison to trilinear.

Maybe it will begin to be a priority if/when consoles start outputting larger resolutions like 4k, but until then I can see why devs wouldn't be tremendously interested in going out of their way to incorporate AF when it doesn't really look very good.



This has gotta be one of the most inaccurate posts I've seen all year.
It's clear as day that you don't even know what AF looks like or what it does; there's literally no other explanation, since what you're saying is akin to "high texture quality looks gross at lower resolutions; even at 1080p, high-quality texturing doesn't look [as good as blurrier, smudgy textures]; hi-res textures don't look good at all, especially compared to mid-res textures... Maybe with 4K it'll be different, but for now high-quality/hi-res textures aren't needed and just don't really look very good."
I mean, WTF!
 

Chev

Member
Do you know something he doesn't?
While not a videogame developer I am a software developer who has had to wrestle with GPUs. As I've mentioned earlier in the topic, not only does AF indeed incur a performance hit that tends to be underestimated, there are in fact situations, namely virtual texturing (but anything that'll result in noncontiguous cache access will exhibit similar problems), which make hardware-provided AF completely and utterly useless. Because those approaches don't use hardware mipmapping, instead dividing the textures into a hierarchical tree of equally sized tiles, AF will either not be supported or wreck your texture cache, incurring huge performance losses.

Hardware AF is really specifically tailored to a classic mipmap chain, so more unusual texture schemes won't give the optimal results those bar graphs would suggest. And a lot of new approaches have been unusual in recent years. More recent GPUs account for that (this is what tiled resource support is about in DX11.2 for example) but of course you'll need to tailor your code for the ones that don't.
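To picture the virtual-texturing case, here's a tiny sketch with assumed numbers (128-texel tiles, 16 probes along the axis of anisotropy, a made-up 4096-texel texture). It just counts how many distinct tiles a stretched anisotropic footprint would touch; each of those tiles can sit at an unrelated address in the physical tile cache, which is the non-contiguous access described above:

```python
# Hypothetical sketch: an elongated anisotropic footprint vs. a tiled virtual texture.
# Each distinct tile means another page-table lookup and a scattered read, instead of
# one contiguous walk over a regular hardware mip chain.

TILE_SIZE = 128   # texels per tile side (assumed)
PROBES = 16       # 16x AF: up to 16 probes along the axis of anisotropy

def tiles_touched(u0, v0, u1, v1, tex_size):
    """Set of (tile_x, tile_y) covered by probes placed from (u0, v0) to (u1, v1) in UV space."""
    tiles = set()
    for i in range(PROBES):
        t = i / (PROBES - 1)
        x = int((u0 + (u1 - u0) * t) * tex_size) // TILE_SIZE
        y = int((v0 + (v1 - v0) * t) * tex_size) // TILE_SIZE
        tiles.add((x, y))
    return tiles

# A nearly edge-on surface stretches the footprint far along u:
footprint = tiles_touched(0.10, 0.50, 0.35, 0.52, tex_size=4096)
print(f"{len(footprint)} distinct tiles touched: {sorted(footprint)}")
```

With a conventional mip chain those same probes would land in one small, contiguous neighbourhood of a single texture, which is the access pattern the hardware filtering unit and its cache are built around.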
 
Also, you PC gamers need to calibrate your displays better; on my TV you cannot notice the lack of filtering.
Most PC gamers are simply used to having better graphics since '08 than today's PS4/XBone can offer. Including me.
I hate the fact that it's all about graphics and not the game itself, but I'm sailing in the same boat as all those haters... sadly.
I just can't get myself to play something on PC without everything maxed out and AA+AF, and I hate that fact.

Anisotropic filtering looks gross at lower resolutions. I wouldn't be surprised if devs opted to not worry about it because most console games are still running at 720p, and even at resolutions like 1080p AF still doesn't really look good at all, especially in comparison to trilinear.

Maybe it will begin to be a priority if/when consoles start outputting larger resolutions like 4k, but until then I can see why devs wouldn't be tremendously interested in going out of their way to incorporate AF when it doesn't really look very good.

Son, you just forgot to put on your glasses.
 

Guess Who

Banned
His point is that consumers don't care about graphics.

The PS2 beats the DC, and the DC came out first.

The PS2 still beats the GC, and trades with it in performance in a variety of titles. Yet out of the four systems that came out that generation, the PS2 and Xbox, which each had their respective competitors, beat the lesser systems that were somehow supposed to win because consumers only care about games.

Also, singling out that generation versus the rest I listed is cherry-picking in itself. You might not want to call a spade a spade and then miss the bigger point, which you didn't refute at all.

Your argument is flawed for every generation.

SNES vs Genesis was very close and the SNES won on the strength of its games, not graphical prowess. People bought SNESes because they wanted to play Mario, Zelda, and Donkey Kong. The Genesis was no slouch, either - what it lacked in graphical horsepower it made up for in CPU strength (hence "blast processing").

The PSX didn't have the most powerful hardware, the N64 did. The PSX won because it was easier to develop for than the ridiculous architecture of the Saturn and wasn't constrained by Nintendo's dumb decision to use cartridges, so most developers jumped ship from Nintendo and Sega, which meant that most games people wanted were on PSX.

I already went over the PS2 generation, but your argument is totally nonsensical here. "Ps2 and Xbox which had their respective competitors beat the lesser systems that are somehow suppose to win cause consumers only care about games" - what does this even mean? In any case, the point still stands that, out of the PS2, GameCube, and Xbox, the PS2 was unequivocally the weakest and was the best-selling system by miles and miles, again on the strength of its now-legendary game library. "PS2 stills beats GC and trades with it in performance in variety of titles" is just straight-up bullshit. Where's that PS2/GC RE4 comparison gallery?

And once again, the Wii outsold both the PS3 and 360, with 20 million more sales than either, despite being a whole generation behind in tech.
 

jonezer4

Member
With the excessive amounts of depth of field and motion blur effects at the moment, it probably doesn't even matter

This is what I was thinking. It might be pretty superfluous to crisp up that far away texture if DOF's just gonna wind up blurring it anyway.
 

LCGeek

formerly sane
Your argument is flawed for every generation.

SNES vs Genesis was very close and the SNES won on the strength of its games, not graphical prowess. People bought SNESes because they wanted to play Mario, Zelda, and Donkey Kong. The Genesis was no slouch, either - what it lacked in graphical horsepower it made up for in CPU strength (hence "blast processing").

The PSX didn't have the most powerful hardware, the N64 did. The PSX won because it was easier to develop for than the ridiculous architecture of the Saturn and wasn't constrained by Nintendo's dumb decision to use cartridges, so most developers jumped ship from Nintendo and Sega, which meant that most games people wanted were on PSX.

I already went over the PS2 generation, but your argument is totally nonsensical here. "Ps2 and Xbox which had their respective competitors beat the lesser systems that are somehow suppose to win cause consumers only care about games" - what does this even mean? In any case, the point still stands that, out of the PS2, GameCube, and Xbox, the PS2 was unequivocally the weakest and was the best-selling system by miles and miles, again on the strength of its now-legendary game library. "PS2 stills beats GC and trades with it in performance in variety of titles" is just straight-up bullshit. Where's that PS2/GC RE4 comparison gallery?

And once again, the Wii outsold both the PS3 and 360, with 20 million more sales than either, despite being a whole generation behind in tech.

They only had some of those games because of the architecture. He literally said devs don't care about that, which is another clear case in my favor, not his or yours. No one with half a clue about the SNES's abilities will say its technology didn't have an impact on the games that were designed for it.

Blast Processing didn't win the Genesis the generation, nor did I say it was a slouch in sales.

The PSX and N64 trade off in hardware abilities. Also, if we are talking about the power of that age, for 2D it belongs to the Neo Geo, and 3D consoles were nothing compared to arcades, which back then were still banking money until the end of the DC era. Also, every time you bring up a dev point you put another dagger in his argument that devs don't care about specs, so thanks. You are also forgetting about the 3DO, Jaguar, and tons of other crap, more powerful or even less powerful, that consumers abandoned or didn't care about. Again, just showing consumers aren't as disinterested as one person claimed they are.

The DC was the weakest, and you admitted it was part of the generation. Even with the PS2 supposedly being the weakest, it still has plenty of exclusive and multiplatform titles where it is near equal or, in some cases, better compared to the Xbox or GC. Either way, consumers didn't choose the weakest system, despite it having plenty of good games. Devs, even with the PS2's hard nature to develop for, still chose it over the DC. Why? Because it ended up doing games like Silent Hill 3 and Gran Turismo, to say the least. His argument is that they don't care about specs; history shows otherwise. You must not have been here for the PS2 sparks particle debate, or for how in various multiplatform titles people would choose the PS2 for sound and content, which sales show. Where is my search history when I need it?

Combined sales, buddy, when it comes to the Wii, and yes, that does matter, because the industry doesn't give two fucks about Nintendo last gen or this gen when it comes to third parties, for the most part. I don't see you mentioning the Wii U; I wonder why...
 

Abounder

Banned
Keep moving that goal post.

SNES vs Genesis: eye candy wins
Saturn, PSX, N64 vs all the other crap we got: people chose the better one
PS2 vs DC: noticing a pattern?
GC vs Xbox
Wii vs PS3/360
Wii U vs PS4/X1

Need I say more?

Stop moving your own goal posts



Consumers aren't half as disinterested as you say they are

Actually, in this case you could say more: show me how the goal posts changed. Both posts say that most people don't care about specsheets, specifically when it comes to AF and FPS.


It's a standard, but not the highest. As for video games in general, the history of 3D consoles would say otherwise. You trying to act like it's the standard for consoles, and ignoring the debates here, is actually pretty ballsy.

What on earth? It's "ballsy" to suggest that 60fps has been the benchmark for video games? I must have Duke Nukem's balls of steel.


I love it when gaffers avatar-quote and talk out of their ass. I earned that tag, and believe me, my posts were epic to earn it, including in 2012 well before most were even aware of it, and the conspiracy troll stuff. Go ahead, show more ignorance.

Yikes.

Yup. Console customers don't care much for PC or other video game standards: 30fps, sub-1080p (especially in online multiplayer), monthly subs, and even black bars are becoming a thing for "next-gen" boxes.

As for anisotropic filtering and other issues: it's either the quality of the dev talent, hardware limitations, or rushed project deadlines. But it sounds like an artistic choice even though it's been lacking


Can't edit now, so you're sort of stuck. You argue for 60fps despite mentioning 30fps; neither is a long-time standard for console games, at best two generations. 60fps or 30fps are nice options when devs do it right; otherwise fps has been pretty spotty and you're at the mercy of the game. You mention sub-1080p for these systems, yet ignore that until last generation the majority of console games didn't even support 1080p or 720p. Before that, consoles barely enjoyed 480p, just to be clear. Black bars aren't new, or have you missed the various titles doing it even back in the PSX age? Consumers do care; them buying HDTVs to play these titles shows plenty that they care.

I just showed that you're now limiting the scope, despite the fact that I responded to you and another user making a very basic premise that you're now walking away from. I don't care anymore, especially if that's how you like to argue.

1080p/60fps (and no black bars) has been the standard for quite a few years in the video game industry, which goes well beyond outdated PS3/360 console hardware. 60fps has been around since the SNES, etc. Of course, that doesn't mean everyone reaches or agrees with the benchmarks found on Digital Foundry and other comparison websites.

Most consumers don't really care about specsheet wars because they're buying consoles for the plug-and-play experience. Devs don't always follow industry standards because they sacrifice fps and af for their vision in addition to other limitations. And because they can make millions either way.

His point is that consumers don't care about graphics.

No, there's either miscommunication or you're simply putting words in my posts. I'm saying most people don't care about specsheet tech wars like the amount of AF or FPS. It's more of an enthusiast thing.

He literally said devs don't care about that, which is another clear case in my favor, not his or yours.

Feel free to give a quote; I believe it was something like devs "don't really care", which may seem like a small distinction, but it matters when it comes to fps and AF, which are within the reach of console tech. Devs would rather prioritize the experience and make money instead of focusing on specsheet benchmark wars like how much AF or FPS a game has. That's for enthusiasts.
 

Branduil

Member
Anisotropic filtering looks gross at lower resolutions. I wouldn't be surprised if devs opted to not worry about it because most console games are still running at 720p, and even at resolutions like 1080p AF still doesn't really look good at all, especially in comparison to trilinear.

I can't even comprehend opinions like this. How does resolution make good filtering look gross?
 

dr_rus

Member
Performance. Contrary to what PC gamers are used to saying, high levels of AF aren't "free", although they are relatively "cheap" on modern GPUs. If you're limited by texel fetches in your renderer then the hit can be pretty significant, especially when you're already running at 30 fps.

Same as with resolution, it's a trade-off: no AF, but better effects, shaders, and more objects on screen.
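As a toy illustration of that trade-off (every figure below is invented for the example, not taken from any real game): at a locked 30 fps there are only about 33.3 ms to spend per frame, so whatever the extra texel fetching for AF costs has to come out of something else.

```python
# Toy frame-budget sketch; all allocations are assumptions for illustration.

FRAME_BUDGET_MS = 1000.0 / 30   # ~33.3 ms at a locked 30 fps

budget_ms = {                   # hypothetical breakdown of one frame
    "geometry + shadows": 9.0,
    "lighting + shading": 14.0,
    "post-processing": 7.0,
    "16x AF (extra texel fetches)": 2.0,
}

spent = sum(budget_ms.values())
print(f"spent {spent:.1f} ms of {FRAME_BUDGET_MS:.1f} ms, headroom {FRAME_BUDGET_MS - spent:.1f} ms")
# Dropping the AF line frees its 2.0 ms for more effects, more objects on screen,
# or simply for reliably holding 30 fps - which is the trade-off being described.
```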
 

LCGeek

formerly sane
Actually, in this case you could say more: show me how the goal posts changed. Both posts say that most people don't care about specsheets, specifically when it comes to AF and FPS.




What on earth? It's "ballsy" to suggest that 60fps has been the benchmark for video games? I must have Duke Nukem's balls of steel.




Yikes.



1080p/60fps (and no black bars) has been the standard for quite a few years in the video game industry, which goes well beyond outdated PS3/360 console hardware. Of course, that doesn't mean everyone reaches or agrees with the benchmarks found on Digital Foundry and other comparison websites.

Most consumers don't really care about specsheet wars because they're buying consoles for the plug-and-play experience. Devs don't always follow industry standards because they sacrifice fps and af for their vision in addition to other limitations. And because they can make millions either way.



No, there's either miscommunication or you're simply putting words in my posts. I'm saying most people don't care about specsheet tech wars like the amount of AF or FPS. It's more of an enthusiast thing.



Feel free to give a quote; I believe it was something like devs "don't really care", which may seem like a small distinction, but it matters when it comes to fps and AF, which are within the reach of console tech. Devs would rather prioritize the experience and make money instead of focusing on specsheet benchmark wars like how much AF or FPS a game has. That's for enthusiasts.

Ok....

First of all, your statement says "Console customers don't care much".

Logically speaking, that's pretty absolute to me and most others here, but apparently not to you, who had to change it to "not much" or "not much at all". Even then, quantify "not much" for everyone? Any way you slice it, it's a dismissive attitude that isn't really grounded in anything relevant to the conversation; it's not exact or even all that generalized, it's a blank check to move things when you want. For someone talking about me putting words into other people's mouths, you are literally speaking for millions, whether they are a minority or not, and you see fit to question me. Check yourself.

Even then, I established with certain facets besides AF or FPS, like resolution, technology in consoles, or other things Guess Who has mentioned in response to me, that they do care; you still argued otherwise, that they didn't care all that much. HDTVs alone show they care, because if they didn't, they would've kept their 480i/p CRTs, which are now mostly nonexistent.

That is how you moved the goal post; I can't be any more frank and concise about it. You went from a variety of things they don't care about to just AF or FPS specsheets, and even then the dev talk alone going on here shows you're wrong.

Also, right now AF means jack to most enthusiasts; it's insulting that you're suggesting it, because it shows you really aren't going to most of the sites that do benchmarks for such standards. How a game performs at resolutions at or beyond 1080p, at 60fps or higher, with various forms of AA, HBAO/SSAO, and physics tends to be what they look for, so they know how their card performs.
 

aeolist

Banned
It is most likely that they are just focusing on higher-impact issues. Honestly, I've played very few games where, in motion, AF really makes any appreciable difference (I know some PC people are going to rage at this).

It is one of those things that has much more impact in screenshots than it does in in-motion gameplay.
actually it's a hell of a lot more noticeable in motion since there's a hard line where you can see the texture quality drop that stays a fixed distance from the game camera

basically textures blur and sharpen as you move around, it's really distracting
 

Abounder

Banned
Ok....

First of all, your statement says "Console customers don't care".
You left out a key word or there's miscommunication, it says: Console customers don't care much for PC or other video game standards.

http://www.neogaf.com/forum/showpost.php?p=114061909&postcount=182

Logically speaking, that's pretty absolute to me and most others here, but apparently not to you, who had to change it to "not much" or "not at all". You didn't say this at first; stop acting like you did, when I linked to your exact post showing in complete detail how it says exactly what it says.
I said that initially, and also clarified throughout. Although even if I didn't: logically speaking you should understand that it's a generalization.

Even then, I established with certain facets besides AF or FPS, like resolution, technology in consoles, or other things Guess Who has mentioned in response to me, that they do care; you still argued otherwise, that they didn't care all that much. HDTVs alone show they care, because if they didn't, they would've kept their 480i/p CRTs, which are now mostly nonexistent.

That is how you moved the goal post; I can't be any more frank and concise about it. You went from a variety of things they don't care about to just AF or FPS specsheets.

Also, right now AF means jack to most enthusiasts; it's insulting that you're suggesting it, because it shows you really aren't going to most of the sites that do benchmarks for such standards. How a game performs at resolutions at or beyond 1080p, at 60fps or higher, with various forms of AA, HBAO/SSAO, and physics tends to be what they look for, so they know how their card performs.

Yea, we seem to basically agree. In other words: specsheet wars, specifically FPS and AF, are secondary to the overall experience. Devs would rather focus on other things, like, say, resolution or the amount of pretty mayhem on the screen, instead of scoring high on a Digital Foundry benchmark.
 
Lack of AF is the new common core of console gaming. Seriously, it makes absolutely no sense at all why it isn't enabled for every single game on both the X1 and PS4.
 

LCGeek

formerly sane
You left out a key word, it says: Console customers don't care much for PC or other video game standards.

Still doesn't dismiss the rest of what you said. I can admit I was wrong on that.

I said that initially, and also clarified throughout. Although even if I didn't: logically speaking you should understand that it's a generalization.

It's horribly bad considering you're personally speaking for literally millions of people. A couple hundred or thousand is fine; after that you're just appealing to popularity, and quite frankly, in all honesty, you can't and shouldn't. If publishers and devs who spend millions on sales data and surveys can't get consumers right, what makes you so much better that you can say such a thing?


Yea, we seem to basically agree. In other words: specsheet wars, specifically FPS and AF, are secondary to the overall experience. Devs would rather focus on other things, like, say, resolution or the amount of pretty mayhem on the screen, instead of scoring high on a Digital Foundry benchmark.

Resolution is as much a part of the specsheet wars as FPS and AF. Also, I clearly showed how HD standards are only common to the last two generations; before that they were virtually nonexistent unless you had a PC display and a machine capable of it. Polygons and maximum output were also basically part of the spec wars until the last generation, when devs really had more freedom than they had before. Devs still care, which is the point; you kept harping that they didn't, when the reality is otherwise.

I'm an FPS player through and through; LightBoost at 120Hz for me until there is something better, which basically won't happen unless 4K LightBoost shows up, if it ever does. As for AF, AA, effects, and the like, I tend to gut them till I get a smooth experience. I grew up playing Counter-Strike, Quake, and other very simple-looking games; eye candy is just that. Give me response, clarity, and good mechanics.
 

Pimpbaa

Member
The general public may not know what the hell AF is, but they sure as hell would know what blurry textures are. I expected 16x AF last gen. In PC games, it is just something so simple to turn on (or even force on) and has such a drastic effect on image quality and it's something I have been using in every game with no ill effects (except for instances where AF was killing POM effects) since I had the Radeon 9800. It's almost insulting to the texture artists who worked on the games to see their efforts all blurred to hell.
 

tehPete

Banned
I honestly don't know. Graphics cards for a few generations now have had virtually free 16xAF. Here's a GTX 460 (released in mid-2010) running Bad Company 2:

[image: GTX 460 Bad Company 2 benchmark chart]


Yup. I have 16xAF forced in my control panel as a default for all applications. Seems like a no brainer

It's the one setting I've been able to consistently max with no noticeable performance hit for at least my last three graphics cards. I can see that it might conflict somewhat with some post-processing effects being used these days, but it should still be a standard feature and only disabled if absolutely necessary.

Edit - I upgrade my GPU every 2 - 3 years.
 

Abounder

Banned
Still doesn't dismiss the rest of what you said. I can admit I was wrong on that.

It's horribly bad considering you're personally speaking for literally millions of people. A couple hundred or thousand is fine; after that you're just appealing to popularity, and quite frankly, in all honesty, you can't and shouldn't. If publishers and devs who spend millions on sales data and surveys can't get consumers right, what makes you so much better that you can say such a thing?

Resolution is as much a part of the specsheet wars as FPS and AF. Also, I clearly showed how HD standards are only common to the last two generations; before that they were virtually nonexistent unless you had a PC display and a machine capable of it. Polygons and maximum output were also basically part of the spec wars until the last generation, when devs really had more freedom than they had before. Devs still care, which is the point; you kept harping that they didn't, when the reality is otherwise.

I'm an FPS player through and through; LightBoost at 120Hz for me until there is something better, which basically won't happen unless 4K LightBoost shows up, if it ever does. As for AF, AA, effects, and the like, I tend to gut them till I get a smooth experience. I grew up playing Counter-Strike, Quake, and other very simple-looking games; eye candy is just that. Give me response, clarity, and good mechanics.

Yea, it's all about priorities, and I think devs focus on pretty mayhem above all else, even if that means less FPS, AF, or resolution and fewer other effects. Even graphical showcases like Ryse output at less than 1080p for the sake of pretty mayhem and the devs' vision, in addition to limitations.

I think the video game standard is 60fps, and for the past few years the benchmark in the industry is arguably HD+60FPS but unfortunately it's more uncommon than not
 
Wow this thread seems like a constant see-saw of people posting the same question, a few people touching on the answer and then some more asking it again like it's not been answered...

I'm not into games dev much these days but the answer is likely some combination of this list:

- Because it's not really "free"
- AF is heavily bandwidth intensive
- PCs get away with this because CPU + GPU have two completely separate buses
- Consoles don't because they have a single data bus they have to share between the CPU+GPU, especially in modern shared memory architectures
- I know for a fact the console GPU silicon budget is significantly less than what you'd find in most desktop HW
- If you're cutting out cruft to try to optimise your GPU silicon budget (so that you can get both your CPU+GPU chips onto the same die, hugely saving costs), the first things to trim are things like the density of ROPs and texture units (you aren't going to be doing 1600 x 1400 on PS4 anytime soon), which would have a direct impact on cost of something like AF
- Trying to understand the "cost" of AF from PC benchmark charts, using "1-2 FPS" as your unit of measurement, doesn't work. Frames per second is not a unit of cost, because 1 frame at 300fps is ~3.3ms and 1 frame at 60fps is ~16.7ms - i.e. the real cost is entirely architecture-dependent, and no such measurement on any PC would be able to provide an accurate ballpark figure for comparison (textbook apples to oranges); see the quick numbers sketched below
- There may be other reasons not listed here that I either don't know about/have forgotten, or that are specific to the new HW architectures and gagged under NDA (highly doubtful)
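To put the fps-vs-ms point above into numbers (plain arithmetic, no game-specific data): the same "2 fps drop" on a benchmark chart corresponds to wildly different per-frame time costs depending on the base frame rate, which is why a reading taken from a 200+ fps PC chart says almost nothing about a 30 fps console title.

```python
# The same fps delta is a very different time cost at different base frame rates.

def delta_cost_ms(base_fps: float, fps_drop: float) -> float:
    """Extra milliseconds per frame implied by dropping from base_fps by fps_drop."""
    return 1000.0 / (base_fps - fps_drop) - 1000.0 / base_fps

for base in (300, 120, 60, 30):
    print(f"a 2 fps drop at {base:3d} fps costs {delta_cost_ms(base, 2):.2f} ms per frame")
```

That works out to a few hundredths of a millisecond at 300 fps versus a couple of milliseconds at 30 fps, i.e. a meaningful slice of a 33 ms frame.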
 

LCGeek

formerly sane
I think the video game standard is 60fps, and for the past few years the benchmark in the industry is arguably HD+60FPS but unfortunately it's more uncommon than not

I'm just saying there are many standards; it doesn't help to lock people into limited boxes when there are many, be it on console or PC.

People still care; to what degree varies quite a bit, but saying "not much" or "very little" ignores the reality of the industry largely since the NES. Consumers could easily have chosen lesser products or better products in most of the generations we are talking about, but largely they chose balance, and so do devs.

Rising_Hei, thank you for pointing that out, because the different weather/time makes the colors seem a lot different than they probably are. For me, though, the easiest way to spot the AF, once you take out the effects on the rocks, is the building in the back. Though like you mention, that factor could be at play there due to the fog.

archangelmorph, bookmarking that excellent post for when this topic comes up again.
 
Wow this thread seems like a constant see-saw of people posting the same question, a few people touching on the answer and then some more asking it again like it's not been answered...

I'm not into games dev much these days but the answer is likely some combination of this list:

- Because it's not really "free"
- AF is heavily bandwidth intensive
- PCs get away with this because CPU + GPU have two completely separate buses
- Consoles don't because they have a single data bus they have to share between the CPU+GPU, especially in modern shared memory architectures
- I know for a fact the console GPU silicon budget is significantly less than what you'd find in most desktop HW
- If you're cutting out cruft to try to optimise your GPU silicon budget (so that you can get both your CPU+GPU chips onto the same die, hugely saving costs), the first things to trim are things like the density of ROPs and texture units (you aren't going to be doing 1600 x 1400 on PS4 anytime soon), which would have a direct impact on cost of something like AF
- Trying to understand the "cost" of AF from PC benchmark charts, using "1-2 FPS" as your unit of measurement, doesn't work. Frames per second is not a unit of cost, because 1 frame at 300fps is ~3.3ms and 1 frame at 60fps is ~16.7ms - i.e. the real cost is entirely architecture-dependent, and no such measurement on any PC would be able to provide an accurate ballpark figure for comparison (textbook apples to oranges)
- There may be other reasons not listed here that I either don't know about/have forgotten, or that are specific to the new HW architectures and gagged under NDA (highly doubtful)

Very interesting post, thank you.
 

Branduil

Member
Wow this thread seems like a constant see-saw of people posting the same question, a few people touching on the answer and then some more asking it again like it's not been answered...

I'm not into games dev much these days but the answer is likely some combination of this list:

- Because it's not really "free"
- AF is heavily bandwidth intensive
- PCs get away with this because CPU + GPU have two completely separate buses
- Consoles don't because they have a single data bus they have to share between the CPU+GPU, especially in modern shared memory architectures
- I know for a fact the console GPU silicon budget is significantly less than what you'd find in most desktop HW
- If you're cutting out cruft to try to optimise your GPU silicon budget (so that you can get both your CPU+GPU chips onto the same die, hugely saving costs), the first things to trim are things like the density of ROPs and texture units (you aren't going to be doing 1600 x 1400 on PS4 anytime soon), which would have a direct impact on cost of something like AF
- Trying to understand the "cost" of AF from PC benchmark charts, using "1-2 FPS" as your unit of measurement, doesn't work. Frames per second is not a unit of cost, because 1 frame at 300fps is ~3.3ms and 1 frame at 60fps is ~16.7ms - i.e. the real cost is entirely architecture-dependent, and no such measurement on any PC would be able to provide an accurate ballpark figure for comparison (textbook apples to oranges)
- There may be other reasons not listed here that I either don't know about/have forgotten, or that are specific to the new HW architectures and gagged under NDA (highly doubtful)

So the problem is, as usual, console designers are prioritizing flashy graphics to the detriment of image quality.
 