Some enthusiasts care but the general audience does not, and if I recall correctly 60fps has been the norm going back to at least the SNES.
And just because console companies use PC hardware or adopted PC norms does not mean that devs or customers really care about technical spec-sheet wars like 30fps, anisotropic filtering, etc. And how am I "dead wrong on online" when I listed the truth: console customers are okay with monthly subs and sub-1080p multiplayer. I'm not sure how your response replies to that.
How on earth is this:
Not apparent???
It makes sense that it would be bandwidth limited, because I believe you need more texture samples to get higher levels of filtering.
The explanation throughout the whole thread has been "consoles don't have as much bandwidth as today's PC GPUs", and it's been accepted as valid. People have responded with, "Okay, then what about yesterday's GPUs?"
Can you actually explain why bandwidth is such a bigger issue on consoles? So much so that they can't equal 10-year-old PC AF performance?
People keep saying it's bandwidth, with no depth added to the technical aspects of why.
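To put rough numbers on the "more samples" intuition above, here's a back-of-the-envelope sketch. The per-probe texel count is the textbook worst case, not any particular GPU's measured behavior:

```python
# Rough sketch: why higher AF levels cost more memory bandwidth.
# Each trilinear "probe" reads 8 texels (a 2x2 bilinear tap on each of
# two adjacent mip levels); NxAF takes up to N such probes spread along
# the anisotropic axis of the pixel's footprint.

def texel_fetches(af_level: int) -> int:
    """Worst-case texel reads for a single screen pixel."""
    TEXELS_PER_TRILINEAR_PROBE = 8  # 4 texels on each of 2 mip levels
    probes = max(1, af_level)       # plain trilinear == 1 probe
    return probes * TEXELS_PER_TRILINEAR_PROBE

for af in (1, 2, 4, 8, 16):
    print(f"{af:2d}x AF: up to {texel_fetches(af)} texel fetches per pixel")
```

Texture caches absorb most of that in practice, which is why the real-world hit varies so much between architectures; the worst case is what a bandwidth-starved design has to budget for.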
Can you give some examples of games and their resolutions/framerates?
Outside of 900p Watch_Dogs, yes.
I'll ask again: What generation are you living in? Where are all these sub-1080p PS4 games? Yeah, there are two (and a half, if you include KZ's multiplayer), but you're making it sound like 1080p is still some rare and mythical thing on consoles. It's not. It's the de facto standard resolution on PS4, with just a few exceptions. XBO is a slightly different matter, but still.
Last gen I was gaming on a laptop with a GT 555M (2GB DDR3), in addition to my PS3. I could generally run console ports at higher-than-console settings, but I couldn't max most games and get good framerates. AF, however, I could always crank up to 8x or 16x without a discernible impact on performance.
Unless you are a console dev yourself, it is prudent to assume that explanations which make no sense to you are the Dunning-Kruger effect at work.
I honestly don't know. Graphics cards for a few generations now have had virtually free 16xAF. Here's a GTX 460 (released in mid-2010) running Bad Company 2:
I love the word mip map
I will never forgive Valve for taking away my 'mat_picmip -10'
TF2 looked 5 years out with that setting on.
Yup. I have 16xAF forced in my control panel as a default for all applications. Seems like a no brainer
60fps was a norm for 2d games.
For 3D games it's not, and plenty of Saturn, PSX, and N64 games are proof of that. 30fps became more of a standard in the second generation of 3D consoles, and even then it was largely still shaky until last generation.
The devs do care; if they didn't, none of them would be welcoming the features they have been using. Each generation they get new tools and toys. Sony has now twice ditched their own exotic hardware, going Nvidia-based last generation. MS has used PC hardware for good reason, as devs like the ease of it compared to what existed before. Epic alone was the sole reason Sony and MS beefed up the RAM to 512MB vs 256MB last generation. Devs matter in this discussion, unless you want to ignore the various interviews on the subject where devs and manufacturers give credit to the idea. You certainly cannot ignore the influence of Epic, Crytek, Naughty Dog, and a few other big names who have all prodded the big players in the industry to go a certain direction.
As for consumers, most don't care about graphics in general, eye candy or not, going by my experience with casual players versus the enthusiasts I meet at conventions and the like. Most consumers just want a good gaming experience; graphics get far more interest from the hardcore and enthusiast groups.
And to my points before: are you forgetting the dramatic changes in TVs alone that consumers have seen since the '80s? They want their gaming hardware to keep up; if they didn't, we wouldn't have moved beyond 480p and the resolutions below it. Monthly subscriptions are something the PC has been doing for ages, and for good reason: they bring in the money. The entire online infrastructure, sans massive dedicated servers, has been borrowed from ideas the PC did first, including MS's own Gaming Zone, which was popular for IPX hacks and dial-up types.
You can argue all you like that they do not care that much; to say they don't care at all is a stretch of the imagination, history, and sanity.
60fps is the video game standard; we saw it with the PC and SNES, and today Nintendo and other companies keep that standard alive on contemporary consoles too.
I don't even know how to kindly reply to your post, which seems like a bunch of historical tangents with vague points, but in the end I think we both should agree: technical spec sheets, like the level of anisotropic filtering or fps, mean very little, hence why we don't see 60fps or such features listed on the box. However, that does not mean tech and industries remain complacent, as your examples showed.
As for your last sentence: don't put words into other people's messages. I never said they didn't care "at all", just that they didn't care "that much", which is kind of beside the point anyway since they are selling millions despite what a spec sheet says. They don't really care about 60fps or 16xAF; it's about the $$$ and selling an experience.
On PC the standard is much higher; 60fps is old news in a lot of enthusiast circles, especially for anyone with a high refresh-rate monitor, including myself. I never said it wasn't a standard anymore, just one that doesn't apply to the situation. Who cares about the SNES? We are talking about 3D consoles, where that standard is not the majority.
Why should I agree with your faulty premise? Vague histories? These are things you should know, especially in a topic like this, before spewing a useless point that adds nothing to the discussion and is factually not true. Not only that, you claimed devs didn't care, and I pointed out how in recent history they have, and have pretty much cared about specs since the PS2 age; I don't think you're that bold and ignorant as to declare to everyone it doesn't matter to devs. Also, I do not speak in tangents; I speak in convergence on a point you brought up that is dead wrong, with facts I point out, be they recent or old history.
Not at all and not much are pretty much the same thing considering your tone already. Want me to be precise? Then stop speaking in basic abstracts.
Be kind or not, you're wrong.
You can argue all you like that they do not care that much to say they don't care at all is a stretch of the imagination, history, and sanity.
Be kind or not, you're wrong.
You're also dead wrong
Agree to disagree then: I think most console customers, developers, and hardware manufacturers don't really care about the level of anisotropic filtering or even fps. They care more about $$$ and the experience instead of digital foundry specsheet wars.
60fps is still the benchmark on the PC and video games in general, but yea it's old-hat for enthusiasts.
As for tone: it's hard to reply kindly to walls of text that question one's "sanity" (avatar quote?) or how they're "dead wrong" but I digress.
Can you sum up how I'm wrong because it seems like we agree? Here is my premise from the above: I think most console customers, developers, and hardware manufacturers don't really care about the level of anisotropic filtering or even fps. They care more about $$$ and the experience instead of digital foundry specsheet wars. Otherwise specs like 60fps, 16xAF, etc. would be focused on.
Keep moving that goal post.
SNES vs Genesis: eye candy wins
Saturn, PSX, N64 vs all the other crap we got: people chose the better
PS2 vs DC: noticing a pattern
GC vs Xbox
Wii vs PS3/360
Wii U vs PS4/X1
I like how cherry picked this is. "PS2 vs DC" and "GC vs Xbox" is very disingenuous when those are all the same generation, and the PS2 pounded the GC and Xbox despite being weaker than either. The Wii also outsold the PS3 and 360 by a substantial margin.
Anisotropic filtering looks gross at lower resolutions. I wouldn't be surprised if devs opted to not worry about it because most console games are still running at 720p, and even at resolutions like 1080p AF still doesn't really look good at all, especially in comparison to trilinear.
Maybe it will begin to be a priority if/when consoles start outputting larger resolutions like 4k, but until then I can see why devs wouldn't be tremendously interested in going out of their way to incorporate AF when it doesn't really look very good.
While not a videogame developer I am a software developer who has had to wrestle with GPUs. As I've mentioned earlier in the topic, not only does AF indeed incur a performance hit that tends to be underestimated, there are in fact situations, namely virtual texturing (but anything that'll result in noncontiguous cache access will exhibit similar problems), which make hardware-provided AF completely and utterly useless. Because those approaches don't use hardware mipmapping, instead dividing the textures into a hierarchical tree of equally sized tiles, AF will either not be supported or wreck your texture cache, incurring huge performance losses.
Do you know something he doesn't?
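The virtual-texturing point the quoted dev makes can be illustrated with a toy model (tile size and footprint endpoints here are made-up assumptions, not any real engine's numbers): a compact bilinear tap stays inside one tile, while a long anisotropic footprint streaks across several, turning one sample into multiple noncontiguous tile fetches.

```python
# Toy model of AF vs. a tiled (virtual) texture: count how many distinct
# 128x128 tiles a sampling footprint touches in texel space.
# Tile size and footprints are illustrative assumptions.

TILE = 128  # texels per tile edge (assumed)

def tiles_touched(u0, v0, u1, v1, steps=32):
    """Distinct tiles intersected by a straight-line footprint from (u0,v0) to (u1,v1)."""
    tiles = set()
    for i in range(steps + 1):
        t = i / steps
        u = u0 + (u1 - u0) * t
        v = v0 + (v1 - v0) * t
        tiles.add((int(u) // TILE, int(v) // TILE))
    return len(tiles)

print(tiles_touched(10, 10, 12, 12))   # compact bilinear-style footprint: 1 tile
print(tiles_touched(10, 10, 700, 60))  # long anisotropic streak: 6 tiles
```

Every extra tile is a potential cache miss (or, in a virtual-texturing system, a tile that must be resident at all), which is why such engines often cap or skip hardware AF.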
Most PC gamers are simply used to having better graphics since '08 than today's PS4/XBone can offer, including me.
Also, you PC gamers need to calibrate your displays better; on my TV you cannot notice the lack of filtering.
Anisotropic filtering looks gross at lower resolutions. I wouldn't be surprised if devs opted to not worry about it because most console games are still running at 720p, and even at resolutions like 1080p AF still doesn't really look good at all, especially in comparison to trilinear.
Maybe it will begin to be a priority if/when consoles start outputting larger resolutions like 4k, but until then I can see why devs wouldn't be tremendously interested in going out of their way to incorporate AF when it doesn't really look very good.
His point is that consumers don't care about graphics.
PS2 beats DC and DC came out first.
The PS2 still beats the GC and trades with it in performance across a variety of titles. Yet out of the four systems that came out that generation, the PS2 and Xbox, each with their respective competitors, beat the lesser systems that were somehow supposed to win because consumers only care about games.
Also, singling out that generation versus the rest I showed is cherry-picking itself. Might not want to call a spade a spade and then miss the bigger point, which you didn't refute at all.
With the excessive amounts of depth of field and motion blur effects at the moment, it probably doesn't even matter
Your argument is flawed for every generation.
SNES vs Genesis was very close and the SNES won on the strength of its games, not graphical prowess. People bought SNESes because they wanted to play Mario, Zelda, and Donkey Kong. The Genesis was no slouch, either - what it lacked in graphical horsepower it made up for in CPU strength (hence "blast processing").
The PSX didn't have the most powerful hardware, the N64 did. The PSX won because it was easier to develop for than the ridiculous architecture of the Saturn and wasn't constrained by Nintendo's dumb decision to use cartridges, so most developers jumped ship from Nintendo and Sega, which meant that most games people wanted were on PSX.
I already went over the PS2 generation, but your argument is totally nonsensical here. "Ps2 and Xbox which had their respective competitors beat the lesser systems that are somehow suppose to win cause consumers only care about games" - what does this even mean? In any case, the point still stands that, out of the PS2, GameCube, and Xbox, the PS2 was unequivocally the weakest and was the best-selling system by miles and miles, again on the strength of its now-legendary game library. "PS2 stills beats GC and trades with it in performance in variety of titles" is just straight-up bullshit. Where's that PS2/GC RE4 comparison gallery?
And once again, the Wii outsold both the PS3 and 360, with 20 million more sales than either, despite being a whole generation behind in tech.
Keep moving that goal post.
SNES vs Genesis: eye candy wins
Saturn, PSX, N64 vs all the other crap we got: people chose the better
PS2 vs DC: noticing a pattern
GC vs Xbox
Wii vs PS3/360
Wii U vs PS4/X1
Need I say more?
Stop moving your own goal posts
Consumers aren't half as disinterested as you say they are
It's a standard, but not the highest. As for video games in general, the history of 3D consoles would say otherwise. Trying to act like it is the standard for consoles, while ignoring the debates here, is actually pretty ballsy.
I love it when gaffers avatar-quote and talk out of their ass. I earned that tag, and believe me, my posts were epic to earn it, including a 2012 one well before most were even aware of it, and conspiracy troll. Go ahead, show more ignorance.
Yup. Console customers don't care much for PC or other video game standards: 30fps, sub-1080p (especially in online multiplayer), monthly subs, and even black bars are becoming a thing for "next-gen" boxes.
As for anisotropic filtering and other issues: it's either the quality of the dev talent, hardware limitations, or rushed project deadlines. But it gets presented like an artistic choice even though it's been lacking.
Can't edit now, so you're sort of stuck. You argue for 60fps despite mentioning 30fps; neither has been a long-time standard for console games, at best two generations. 60fps or 30fps are nice options when devs do it right; otherwise fps has been pretty spotty and you're at the mercy of the game. You mention sub-1080p for systems, yet ignore that until basically last generation the majority of console games didn't even support 1080p or 720p as an option. Before that, consoles barely enjoyed 480p, just to be clear. Black bars aren't new, or have you missed various titles even in the PSX age doing it? Consumers do care; them buying HDTVs to play these titles shows plenty that they care.
I just showed you're now limiting the scope, despite the fact that I responded to you and another user making a very basic premise you're now walking away from. I don't care anymore, especially if that's how you like to argue.
His point is that consumers don't care about graphics.
He literally said devs don't care about that, another clear case in my favor, not his or yours.
Anisotropic filtering looks gross at lower resolutions. I wouldn't be surprised if devs opted to not worry about it because most console games are still running at 720p, and even at resolutions like 1080p AF still doesn't really look good at all, especially in comparison to trilinear.
I never said it wasn't apparent. But I think that AA would help that image much more than AF.
Actually, in this case you could say more: show me how the goal posts changed. Both posts say that most people don't care about spec sheets, specifically when it comes to AF and FPS.
What on earth? It's "ballsy" to suggest that 60fps has been the benchmark for video games? I must have Duke Nukem's balls of steel.
Yikes.
1080p/60fps (and no black bars) has been the standard for quite a few years in the video game industry which goes well beyond outdated PS3/360 console hardware. Of course that doesn't mean everyone reaches or agrees with benchmarks found in Digital Foundry and other comparison websites.
Most consumers don't really care about spec-sheet wars because they're buying consoles for the plug-and-play experience. Devs don't always follow industry standards because they sacrifice fps and AF for their vision, in addition to other limitations. And because they can make millions either way.
No there's either miscommunication or you're simply putting words in my posts. I'm saying most people don't care about specsheet tech wars like the amount of AF or FPS. It's more of an enthusiast thing
Feel free to give a quote; I believe it was something like devs "don't really care", which may seem like a small distinction, but it matters when it comes to fps and AF, which are within reach of console tech. Devs would rather prioritize the experience and make money instead of focusing on spec-sheet benchmark wars like how much AF or FPS a game has. That's for enthusiasts.
Aniso is unrealistic. Trilinear is like an Aniso+DOF.
With trilinear and some light DOF, you'll get more realism.
Actually, it's a hell of a lot more noticeable in motion, since there's a hard line where you can see the texture quality drop that stays a fixed distance from the game camera
It is most likely that they are just focusing on higher-impact issues. Honestly, I've played very few games where, in motion, AF really makes any appreciable difference (I know some PC people are going to rage at this).
It is one of those things that has much more impact in screenshots than in in-motion gameplay.
You left out a key word, or there's miscommunication. It says: Console customers don't care much for PC or other video game standards.
Ok....
First of all, your statement says "Console customers don't care".
I said that initially, and also clarified throughout. Although even if I didn't: logically speaking you should understand that it's a generalization.
Logically speaking, that's pretty absolute to me and most others here, but apparently not to you, who had to change it to "not much" or "not at all". You didn't say this at first; stop acting like you did when I linked to your exact post showing in complete detail how it says exactly what it says.
Even then, I established with certain facets besides AF or FPS, like resolution, technology in consoles, or other things Guess Who has mentioned in response to me, that they do care, and you still argued otherwise, that they didn't care all that much. HDTVs alone show they care, because if they didn't they would've kept their 480i/p CRTs, which are now mostly nonexistent.
That is how you moved the goal post; I can't be any more frank and concise about it. You went from a variety of things they don't care about to AF or FPS spec sheets.
Also, right now AF means jack to most enthusiasts; it's insulting that you're suggesting it, because it shows you really aren't going to most sites that do benchmarks for such standards. How a game performs at any resolution at or beyond 1080p, at 60fps or higher, with various forms of AA, HBAO/SSAO, and physics tends to be what they look for, so they know how their card performs.
These were posted in the DF Face off thread.
PC x16AF
PS4
You left out a key word, it says: Console customers don't care much for PC or other video game standards.
I said that initially, and also clarified throughout. Although even if I didn't: logically speaking you should understand that it's a generalization.
Yea, we seem to basically agree. In other words: spec-sheet wars, specifically FPS and AF, are secondary to the overall experience. Devs would rather focus on other things, like resolution or the amount of pretty mayhem on the screen, instead of scoring high on a Digital Foundry benchmark.
More going on there than just texture filtering, but yeah...
These were posted in the DF Face off thread.
PC x16AF
I honestly don't know. Graphics cards for a few generations now have had virtually free 16xAF. Here's a GTX 460 (released in mid-2010) running Bad Company 2:
Yup. I have 16xAF forced in my control panel as a default for all applications. Seems like a no brainer
These were posted in the DF Face off thread.
PC x16AF
PS4
Still doesn't dismiss the rest of what you said. I can admit I was wrong on that.
It's horribly bad considering you're personally speaking for literally millions of people. A couple hundred or thousand is fine; after that you're just appealing to popularity and, quite frankly, can't or shouldn't. If publishers and devs who spend millions on sales data and surveys can't get consumers right, what makes you so much better able to say such a thing?
Resolution is just as much spec-sheet wars as FPS and AF. Also, I clearly showed how HD standards are only common to the last two generations; before that they were virtually nonexistent unless you had a PC display and a machine capable of it. Also, polygon counts and max output were basically part of spec wars until the last generation, when devs really had more freedom than before. Devs still care, which is the point you were harping on, that they didn't, when the reality is not that.
I'm an FPS player through and through; LightBoost 120Hz for me until there is something better, which basically won't happen until 4K LightBoost shows up, if it ever does. As for AF, AA, effects, and the like, I tend to gut them till I get a smooth experience. I grew up playing Counter-Strike, Quake, and other very simple-looking games; eye candy is just that. Give me response, clarity, and good mechanics.
These were posted in the DF Face off thread.
PC x16AF
PS4
I think the video game standard is 60fps, and for the past few years the benchmark in the industry has arguably been HD+60fps, but unfortunately it's more uncommon than not.
Wow this thread seems like a constant see-saw of people posting the same question, a few people touching on the answer and then some more asking it again like it's not been answered...
I'm not into games dev much these days but the answer is likely some combination of this list:
- Because it's not really "free"
- AF is heavily bandwidth intensive
- PCs get away with this because CPU + GPU have two completely separate buses
- Consoles don't because they have a single data bus they have to share between the CPU+GPU, especially in modern shared memory architectures
- I know for a fact the console GPU silicon budget is significantly less than what you'd find in most desktop HW
- If you're cutting out cruft to try to optimise your GPU silicon budget (so that you can get both your CPU+GPU chips onto the same die, hugely saving costs), the first things to trim are things like the density of ROPs and texture units (you aren't going to be doing 1600 x 1400 on PS4 anytime soon), which would have a direct impact on cost of something like AF
- Trying to understand the "cost" of AF from PC benchmark charts, using "1-2 FPS" as your unit of measurement, doesn't work. Frames per second is not a unit of cost, because 1 frame at 300fps is ~3.3ms and 1 frame at 60fps is ~16.7ms - i.e. the real cost is entirely architecture-dependent, and no such measurement on any PC would provide an accurate ballpark figure for comparison (textbook apples to oranges)
- There may be other reasons not listed here that either I don't know about/have forgotten, or that are specific to the new HW architectures and gagged under NDA (highly doubtful)
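The frame-time point in that list is easy to demonstrate with a quick sketch (the framerates are illustrative, nothing measured):

```python
# Converting an "it only costs 1-2 fps" claim into actual per-frame time.
# The same fps drop means wildly different GPU costs at different baselines.

def frame_ms(fps: float) -> float:
    """Frame time in milliseconds at a given framerate."""
    return 1000.0 / fps

def feature_cost_ms(fps_off: float, fps_on: float) -> float:
    """Real per-frame cost (ms) of enabling a feature, from before/after fps."""
    return frame_ms(fps_on) - frame_ms(fps_off)

# The same headline "2 fps drop" at two different baselines:
print(round(feature_cost_ms(300, 298), 3))  # high-fps PC bench: ~0.022 ms/frame
print(round(feature_cost_ms(32, 30), 3))    # console-ish range: ~2.083 ms/frame
```

Nearly a 100x difference in real per-frame cost for the same fps delta, which is why a "free on PC" benchmark result says nothing about a console's 16.7ms (or 33.3ms) frame budget.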
Different weather...
I'm not saying there's no difference, there is... but the weather/time of the day difference does not help.
Yeps. Still not seeing anything that's needed.