
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


krizzx

Junior Member
Who knows, but those are still bullshots. Perfect anti-aliasing isn't going to happen.

I have not seen any major discernible aliasing in any of the trailers released for this. That is a big accusation to make with nothing really to support it.

The anti-aliasing doesn't need to be perfect to take a screenshot of the game with none discernible.

http://www.youtube.com/watch?v=v7nweiUGh-g

I can see "occasional" aliasing in this video in small spots for brief moments, but it goes away the moment he stops moving or gets closer to the object. The game has very little aliasing.

The Wii U GPU is capable of 4xMSAA according to some older docs. This is not impossible at all.
 

LordGouda

Member
How do you know 1080p isn't going to happen? There are many 1080p games on the Wii U already and more are getting announced.

Mario Kart 8 is supposed to be 1080p. Smash Bros. U is supposed to be 1080p. It seems that with the updates to system performance, efficiency and stability they are able to do far more than what they were at launch.

I have no reason to believe that this game cannot be done in 1080p on the Wii U GPU.

Then on top of all of that, you discount the fact that there is no discernible aliasing in the trailer so that you can dismiss the screenshots based on lack of aliasing. I think you may be hoping these are bullshots more than the evidence suggests.

No one knows if it will be 1080p. I'm expecting 720p with 60fps. It would be a dream to see it in 1080p, while keeping the 60fps.
 
Nintendo hardware usually gives terrible captures for whatever reason, so cleaning up press shots isn't the end of the world. It would be nice, though, if they would at least keep screenshots at native resolutions.

I would like to know how the game looks in that two player mode that utilizes both screens. It was impossible to tell in the video if major sacrifices are being made.
I can't access the link.

http://www.gamersyde.com/news_gc_sonic_lost_world_gamepay-14478_en.html
 
How do you know 1080p isn't going to happen? There are many 1080p games on the Wii U already and more are getting announced.

Mario Kart 8 is supposed to be 1080p. Smash Bros. U is supposed to be 1080p. It seems that with the updates to system performance, efficiency and stability they are able to do far more than what they were at launch.

I have no reason to believe that this game cannot be done in 1080p on the Wii U GPU.

Then on top of all of that, you discount the fact that there is no discernible aliasing in the trailer so that you can dismiss the screenshots based on lack of aliasing. I think you may be hoping these are bullshots more than the evidence suggests.
The key word you seem to have overlooked before responding is the fairly important conjunctive "and." He's not saying it can't do 1080p, he's saying it can't do 1080p and perfect anti-aliasing. I don't know I'd declare it in such absolutes, but I tend to agree. Even if the hardware is capable of 4XMSAA, that doesn't mean it can do that rendered at 1080p60. There are big performance costs with anti-aliasing, particularly at that level. I highly doubt 1GB RAM is enough to push 1080p60 with 4XMSAA and high resolution textures, setting aside that I don't think the GPU is capable enough for all that at the same time as well.
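
As a rough back-of-the-envelope illustration (assuming 4 bytes of color and 4 bytes of depth per sample, which is just a common setup rather than anything confirmed for Wii U): a 1920x1080 target with 4xMSAA works out to 1920 * 1080 * 8 * 4 ≈ 63MB for the multisampled color and depth buffers alone, before the resolved buffer, textures, geometry or any other render targets. That is already roughly double the 32MB of eDRAM and a noticeable slice of the 1GB available to games, which is why 1080p60 with 4xMSAA and high-resolution textures all at once looks like a stretch.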
 

fred

Member
Who knows...they may be bullshots, they might not be. What's most important is the video, which looks gorgeous and runs at 60fps. Even at 720p native we can tell that, compared to other Sonic games from last gen, it's technically more impressive (more polys, superior draw distance and what looks to be very decent AA) and running at twice the framerate.
 
The key word you seem to have overlooked before responding is the fairly important conjunctive "and." He's not saying it can't do 1080p, he's saying it can't do 1080p and perfect anti-aliasing. I don't know I'd declare it in such absolutes, but I tend to agree. Even if the hardware is capable of 4XMSAA, that doesn't mean it can do that rendered at 1080p60. There are big performance costs with anti-aliasing, particularly at that level. I highly doubt 1GB RAM is enough to push 1080p60 with 4XMSAA and high resolution textures, setting aside that I don't think the GPU is capable enough for all that at the same time as well.

Those old docs mention 1080p with no MSAA, and I think single pass. They make no mention of framerate though.
 
Those old docs mention 1080p with no MSAA, and I think single pass. They make no mention of framerate though.
That sounds reasonable: 1080p30 with low MSAA or FXAA doesn't seem wholly outside the realm of possibility. But it all depends on the rendering load as well, as that will determine what's left in the budget, so to speak, for post-processing. It's not as easy as flipping switches.
 

LCGeek

formerly sane
The key word you seem to have overlooked before responding is the fairly important conjunctive "and." He's not saying it can't do 1080p, he's saying it can't do 1080p and perfect anti-aliasing. I don't know I'd declare it in such absolutes, but I tend to agree. Even if the hardware is capable of 4XMSAA, that doesn't mean it can do that rendered at 1080p60. There are big performance costs with anti-aliasing, particularly at that level. I highly doubt 1GB RAM is enough to push 1080p60 with 4XMSAA and high resolution textures, setting aside that I don't think the GPU is capable enough for all that at the same time as well.

Very true

1080p with 4xMSAA, be it 60fps or 120fps, can choke out even the top SLI or multi-card systems.

1GB is not enough now; if you use certain forms of transparency with 4xMSAA or higher you outright run out of room.

You get more out of resolution increases IMO, but my problem is that, be it AA or resolution, we don't have the juice even at 30fps as an average or minimum to keep having everything we want with more eye candy.
 

TheD

The Detective
The screenshots look like they were captured from the video in the link. Is the entire video a bullshot too? What evidence do you have to suggest that they are simply bullshots?



Please stop. You have jumped far out of bounds and are comparing grapefruits to cucumbers now.

We have the same model from two games in the same series by the same maker. The only difference here is the system strength. It's an apples-to-apples comparison. Trying to make roundabout arguments like that is just grasping at straws in defense at this point.

The Wii U rendition is clearly a huge leap over the previous one.

The only person that is "out of bounds" is you!

You lack technical knowledge yet keep on posting about things you do not understand.

That model is clearly not the same! LOOK AT THE ARMS FFS!
And even if it was the same, nothing about its shading is impressive!
 
This off screen 60fps Sonic Lost World video looks very very good in motion.

http://www.neogaf.com/forum/showthread.php?t=658013

To me that speaks for itself, amazing how it looks and runs. AFAIK it is 720p60. Most Wii U games are 720p60, that is about what I expected and I am OK with that. If further down the line they surprise with more 1080p games that would be really nice, but given the little known info we have I doubt it.

Bayo 2 is the most impressive, achieving 720p60, and inb4 any comment about the other games being platformers or simple: that actually invalidates those types of comments. Mario Kart 8 also seems to be doing a lot of cool things and is also 720p60, even with 2-player split screen. I know 160 is what we have, but I find it funny that 1-1.5 year games are achieving this already, and the 160 theory could match what we are seeing if you take the shader efficiency (vs Xenos) explanation into account. But what will later games bring? Zelda U, any other big-budget game if there is one? The next games by Shinen? The Shinen guys said they got results with Nano Assault Neo but have way more to squeeze out of the Wii U. I know we cannot know for sure, but if we go by these statements, and we get games that squeeze 30-40% more out of the Wii U, how will the 160 theory explain that? It will certainly fall flat. I know this is just IFs.

That is just me throwing some logic out there, make of it what you like.
 

fred

Member
Just out of interest, how much more Floppage (lol) does it take to run a game at 720p native at 60fps compared to 720p at 30fps..? Would it be twice as much given twice the framerate..? Ignoring the extra poly count, AA, draw distance etc. Would it take twice the processing power to run one of last gen's Sonic games at 60fps..?

And then taking into account the extra polys, AA, improved textures, draw distance what does this tell us about the power of Latte..?
 

JordanN

Banned
And then taking into account the extra polys, AA, improved textures, draw distance what does this tell us about the power of Latte..?
Nothing? All of those are buzzwords.

There were cases of PS3 making improvements to 360 games but the two are still very close. Not saying Wii U is a PS3 btw (although there are interesting parallels for how all 3 consoles top each other).

PS3 = better CPU / weaker GPU; 360 = better GPU + eDRAM / weaker CPU
Wii U = better GPU/RAM, CPU = ???
 

fred

Member
Nothing? All of those are buzzwords.

There were cases of PS3 making improvements to 360 games but the two are still very close. Not saying Wii U is a PS3 btw.

They're not buzzwords, all of those take extra processing power to achieve, particularly the higher-poly models. We've seen higher-poly models than average compared to PS3 and 360 titles in a fair few Wii U titles so far - ZombiU, Ninja Gaiden, Sonic Lost World and Bayonetta 2 spring to mind.
 

JordanN

Banned
They're not buzzwords, all of those take extra processing power to achieve, particularly the higher-poly models. We've seen higher-poly models than average compared to PS3 and 360 titles in a fair few Wii U titles so far - ZombiU, Ninja Gaiden, Sonic Lost World and Bayonetta 2 spring to mind.
It's a buzzword because it's a vague descriptor that can be used in almost all console comparisons.

Unless you're specific, it doesn't actually speak much about Wii U's power.

Again, PS3 had games that outperformed 360, but PS4 has games that outperform both. PS3/PS4 are still not the same/close though.
 
It's a buzzword because it's a vague descriptor that can be used in almost all console comparisons.

Unless you're specific, it doesn't actually speak much about Wii U's power.

Again, PS3 had games that outperformed 360, but PS4 has games that outperform both by a wider margin, because it's been documented to do many things better and with greater effect.

Extra polys, AA, improved textures, and draw distance are objectively measurable and well defined concepts. Buzzwords are not.

Example of Buzzwords: theoretical FLOPs (PS3's 1.7TFs), Cloud Power (Xbox One), Emotion Engine (PS2), Project Reality (N64)... you notice that none of these are actually measurable results of the end product.

As for the bolded, that's why they aren't buzzwords, they are objectively quantifiable results which can be measured and compared.

Otherwise, standards of measure in all mediums would just be buzzwords... "You want me to pay a dollar for my coffee? Currency is just a buzzword, so are milliliters... and numbers in general..."

I'm not saying the Wii U has enough increases in these to make much of a difference in anyone here's opinion, just that the foundation of your argument made my head hurt... Buzzwords is a buzzword, btw...
 

JordanN

Banned
Extra polys, AA, improved textures, and draw distance are objectively measurable and well defined concepts. Buzzwords are not.

Example of Buzzwords: theoretical FLOPs (PS3's 1.7TFs), Cloud Power (Xbox One), Emotion Engine (PS2), Project Reality (N64)... you notice that none of these are actually measurable results of the end product.

As for the bolded, that's why they aren't buzzwords, they are objectively quantifiable results which can be measured and compared.
What are the extra polys? Is Wii U pushing 1 more triangle or 5? That's a buzzword. It leaves too much to the imagination.

Like, the only answer you'll get is Wii U does something better but then why bring up the question "what does this tell you about latte?" Can't you and everyone else figure that out yourself?
 
What are the extra polys? Is Wii U pushing 1 more triangle? That's a buzzword. All flash over substance.

Like, the only answer you'll get is Wii U does something better but then why bring up the question "what does this tell you about latte?" Can't you figure that out yourself?

If you have the same game engine running on two different consoles and one has farther draw distance, that means it has to push extra polygons to draw those extra features in the distance. If that same game then also has higher-resolution textures, it's also having to hold those in memory and apply the textures to the extra objects it's also rendering due to the extra draw distance. You don't have to know the exact count to know there is more.

To over simplify this, you can lift two weights and tell which one is heavier without having to know the exact weights of each or the exact difference between the two...

Your argument would seem to imply that the PS4's ability to show more polys, higher-res textures, better AA, and greater draw distance means nothing, and that it's no more powerful than the Wii U, or the Wii for that matter, considering that the very things you are downplaying as buzzwords are the primary means of measuring advances in graphics outside of shader effects.
 

Donnie

Member
Just out of interest, how much more Floppage (lol) does it take to run a game at 720p native at 60fps compared to 720p at 30fps..? Would it be twice as much given twice the framerate..? Ignoring the extra poly count, AA, draw distance etc. Would it take twice the processing power to run one of last gen's Sonic games at 60fps..?

And then taking into account the extra polys, AA, improved textures, draw distance what does this tell us about the power of Latte..?

Twice the framerate means twice the work, all else being equal. So yeah, if you take a game running at 30fps and run it at 60fps without any modifications it's going to take twice the processing power.
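
Put it in frame-time terms: 30fps gives you roughly 33.3ms per frame, 60fps roughly 16.7ms, so every part of the frame (geometry, shading, post) has to finish in half the time. Doubling throughput or halving per-frame work amounts to the same thing, ignoring CPU-side and vsync overheads.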

Nothing? All of those are buzzwords.

There were cases of PS3 making improvements to 360 games but the two are still very close. Not saying Wii U is a PS3 btw (although there are interesting parallels).

PS3 = better CPU / weaker GPU; 360 = better GPU + eDRAM / weaker CPU
Wii U = better GPU/RAM, CPU = ???

They're a bit nondescript, but I think it's clear enough what he's referring to. No idea whether the game he's talking about actually features those improvements or not though. Also not sure what parallels you're seeing between WiiU and PS3.
 

JordanN

Banned
If you have the same game engine running on two different consoles and one has farther draw distance, that means it has to push extra polygons to draw those extra features in the distance. If that same game then also has higher-resolution textures, it's also having to hold those in memory and apply the textures to the extra objects it's also rendering due to the extra draw distance. You don't have to know the exact count to know there is more.

To over simplify this, you can lift two weights and tell which one is heavier without having to know the exact weights of each or the exact difference between the two...
What happens when Sonic Team announces they're making a PS3/360 game that applies all of those advantages? Does Wii U all of a sudden become weak because the question was "what does this say about latte"?

Just saying extra polys doesn't get you very far because the term covers a wide range of things.
They're a bit nondescript, but I think it's clear enough what he's referring to. No idea whether the game he's talking about actually features those improvements or not though. Also not sure what parallels you're seeing between WiiU and PS3.
Sorry, I meant all 3 consoles, not just PS3. It was about how each console uses their own advantages.
 
What happens when Sonic Team announces they're making a PS3/360 game that applies all of those advantages? Does Wii U all of a sudden become weak because the question was "what does this say about latte"?

Just saying extra polys doesn't get you very far because the term covers a wide range of things.

I was actually thinking of Need For Speed... When you are talking about quantifiable differences, you have to use real comparisons, not what-ifs.

If Sonic isn't any different, it just means that the devs didn't capitalize on the hardware advances.

I wasn't even specifically talking about Wii U at first, to be honest; it's just that your dismissal of the only real gauges by which to compare graphics, other than resolution and shader effects, was very odd.

Goodnight JordanN, I wasn't trying to argue. I was just trying to figure out where you were coming from with that buzzwords claim...
 

fred

Member
What happens when Sonic Team announces they're making a PS3/360 game that applies all of those advantages? Does Wii U all of a sudden become weak because the question was "what does this say about latte"?

Just saying extra polys doesn't get you very far because the term covers a wide range of things.

Sorry, I meant all 3 consoles, not just PS3. It was about how each console uses their own advantages.

I can't see that (assumed) 720p footage at 60fps being able to run on the PS3 or 360 at the same framerate without serious gimping going on myself. Just compare it to Generations footage and you'll see what I mean. The difference is night and day.
 

joesiv

Member
Just 'cause it's massive don't mean it's efficient. When it comes to OC'ing these days you need a better form of cooling than air alone, be it CPU or GPU. You can still do a lot with air, but there is a clear wall of performance that it offers.
Really? Sure, to go to 5+GHz you need phase change, nitrogen, etc. and sub-zero cooling. But on the stock cooler on newer Intel processors you can get quite a bit of OCing. Then if you put a bigger HSF on it, you can get even higher on air. And that's what we're talking about, air cooling.
 
I have not seen any major discernible aliasing in any of the trailers released for this. That is a big accusation to make with nothing really to support it.

The anti-aliasing doesn't need to be perfect to take a screenshot of the game with none discernible.

http://www.youtube.com/watch?v=v7nweiUGh-g

I can see "occasional" aliasing in this video in small spots for brief moments, but it goes away the moment he stops moving or gets closer to the object. The game has very little aliasing.

The Wii U GPU is capable of 4xMSAA according to some older docs. This is not impossible at all.
Uhhhhh....

The burden of proof is on you. That level of AA is unprecedented even on PCs.

You comment that there is aliasing during motion but then none when the action slows, yet you still point to the 4xMSAA.

MSAA is always applied. Even on moving things.

4xMSAA doesn't look remotely close to that level of AA.
 
Matsushita: We waited because we believed in you! (laughs) The development environment for Wii U wasn't even done then, so I think that was part of what was making things difficult.

Iwata: The development environment for Wii U wasn't exactly ideal early on, so I'm sure that caused quite a hardship on you.

Guess this is already known, but it was in the Iwata Asks for W101. They mention how the devkit/tools from, at least, E3 2012 to the end of that year, weren't fully completed.

http://iwataasks.nintendo.com/interviews/#/wiiu/thewonderful101/1/2
 
How do you know 1080p isn't going to happen? There are many 1080p games on the Wii U already and more are getting announced.
To me that speaks for itself, amazing how it looks and runs. AFAIK it is 720p60. Most Wii U games are 720p60, that is about what I expected and I am OK with that. If further down the line they surprise with more 1080p games that would be really nice, but given the little known info we have I doubt it.

Reading the last couple of pages, you guys are trying way too hard to use the 'Wii U games have a higher IQ than current gen consoles' line as proof that the console is significantly more powerful than said consoles, because they don't.

Regarding the resolution:
Inform yourself better. Go check the B3D rendering resolution pixel counter list instead of just cherry-picking the 1080p games to force a point. 95% of all multiplatform games are running at the exact same resolution as the PS360 versions. This even means sub-HD (880x720) for games like CoD. The exclusive Wii U games you tout as 'an increasing amount of 1080p games!' are just 1 game (SSBM) and a bunch of Gamecube, 3DS and even PS2 ports. It doesn't prove anything, as the PS3 also received a few less demanding exclusive releases (even at launch!), a handful of PS2 HD classics and PSN games that were running in native 1080p.
http://beyond3d.com/showthread.php?p=1113344

Regarding the framerate:
Again, inform yourself better. DF framerate analyses show that the framerate of multiplatform games hovers between the Xbox 360 (usually highest) and the PS3 (usually lowest) versions. I see no changes in upcoming releases like Splinter Cell. Nintendo and some other companies targeting 60fps for exclusive Wii U games is great, but again ... this isn't something unique that differentiates Wii U games from current gen. Especially not when these are the same developers that were targeting 60fps as well on PS360 hardware. It's as dumb as claiming that the Xbone is more powerful than the PS4 simply because Microsoft has announced more exclusive 60fps games.

I'm sure Wii U games will look better over time. I expect the average Wii U game to have better lighting and textures than the average PS360 game. These expectations come from the footage I see and my understanding that the Wii U GPU is more modern (full DX10 feature set) and the console has twice the amount of RAM available for games. However, claiming there is, or expecting there will be, an IQ difference is based on literally nothing. The games don't show it, and from what I understand it greatly depends on bandwidth, in which the Wii U doesn't have a real advantage over PS360 hardware. Since PS360 image quality didn't increase over the years when developers got more experienced with the hardware, it's safe to say that we won't see a positive evolution regarding Wii U IQ either. People should expect 720p/30fps to remain the sweet spot IQ for the majority of Wii U games, since the Wii U hardware simply doesn't have the specs to double the framerate or the rendering resolution. The Wii U is designed to be a 720p/30fps machine. If it wasn't, you would already have noticed.

Also, Krizzx, you claim you post in this thread to understand the Wii U GPU. I don't think that's true. The only reason you are here is because you want to tell a story. A story that the Wii U GPU is much more powerful than it really is. It's dogmatic and thus has nothing to do with learning anything new.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
I don't think you'll ever see 4xMSAA in 720p+ Wii U games; it uses far too much eDRAM and we don't know if it supports tiling.
A 1280 * 720 * 8 (color + depth) * 4 = 28.125MB back buffer does not need tiling to fit in the eDRAM. Question is, would that be a good use for the eDRAM.
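
For anyone who wants to play with those numbers, here's a quick sketch of the same arithmetic (same assumption as above: 4 bytes color + 4 bytes depth per sample, which real games may well deviate from):

# Back-of-the-envelope framebuffer sizes, following the 8-bytes-per-sample assumption above.
def framebuffer_mb(width, height, msaa_samples, bytes_per_sample=8):
    return width * height * bytes_per_sample * msaa_samples / (1024 * 1024)

EDRAM_MB = 32  # Latte's eDRAM pool

for (w, h), samples in [((1280, 720), 1), ((1280, 720), 4), ((1920, 1080), 4)]:
    size = framebuffer_mb(w, h, samples)
    fits = "fits" if size <= EDRAM_MB else "does not fit"
    print(f"{w}x{h} {samples}xMSAA: {size:.3f}MB ({fits} in {EDRAM_MB}MB eDRAM)")

That gives 7.031MB for plain 720p, the 28.125MB quoted above for 720p 4xMSAA (tight but inside the 32MB), and 63.281MB for 1080p 4xMSAA, which clearly doesn't fit without tiling or spilling to main memory.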
 

NBtoaster

Member
A 1280 * 720 * 8 (color + depth) * 4 = 28.125MB back buffer does not need tiling to fit in the eDRAM. Question is, would that be a good use for the eDRAM.

That's what I mean, at 720p it consumes most of it and at higher resolutions it won't fit at all. Given that the eDRAM isn't solely intended for the framebuffer and 2xMSAA or FXAA is a small quality downgrade for most people, it doesn't seem efficient.
 
That's what I mean, at 720p it consumes most of it and at higher resolutions it won't fit at all. Given that the eDRAM isn't solely intended for the framebuffer and 2xMSAA or FXAA is a small quality downgrade for most people, it doesn't seem efficient.

Why would you not put your framebuffer in MEM2? Unless the required bandwidth exceeds what's available from MEM2 or eats up too much of it.
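
For a rough sense of scale (using the commonly quoted ~12.8GB/s for MEM2 and the same 8 bytes per pixel as above, with no overdraw or compression, which is a big simplification): writing a 1280x720 color + depth target once per frame at 60fps is only about 1280 * 720 * 8 * 60 ≈ 0.44GB/s, but real scenes read and write those buffers many times per frame through overdraw, blending and post passes, which is exactly the traffic the eDRAM is there to absorb.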
 

krizzx

Junior Member
M°°nblade said:
Reading the last couple of pages, you guys are trying way too hard to use the 'Wii U games have a higher IQ than current gen consoles' line as proof that the console is significantly more powerful than said consoles, because they don't.

Regarding the resolution:
Inform yourself better. Go check the B3D rendering resolution pixel counter list instead of just cherry-picking the 1080p games to force a point. 95% of all multiplatform games are running at the exact same resolution as the PS360 versions. This even means sub-HD (880x720) for games like CoD. The exclusive Wii U games you tout as 'an increasing amount of 1080p games!' are just 1 game (SSBM) and a bunch of Gamecube, 3DS and even PS2 ports. It doesn't prove anything, as the PS3 also received a few less demanding exclusive releases (even at launch!), a handful of PS2 HD classics and PSN games that were running in native 1080p.
http://beyond3d.com/showthread.php?p=1113344

Regarding the framerate:
Again, inform yourself better. DF framerate analyses show that the framerate of multiplatform games hovers between the Xbox 360 (usually highest) and the PS3 (usually lowest) versions. I see no changes in upcoming releases like Splinter Cell. Nintendo and some other companies targeting 60fps for exclusive Wii U games is great, but again ... this isn't something unique that differentiates Wii U games from current gen. Especially not when these are the same developers that were targeting 60fps as well on PS360 hardware. It's as dumb as claiming that the Xbone is more powerful than the PS4 simply because Microsoft has announced more exclusive 60fps games.

I'm sure Wii U games will look better over time. I expect the average Wii U game to have better lighting and textures than the average PS360 game. These expectations come from the footage I see and my understanding that the Wii U GPU is more modern (full DX10 feature set) and the console has twice the amount of RAM available for games. However, claiming there is, or expecting there will be, an IQ difference is based on literally nothing. The games don't show it, and from what I understand it greatly depends on bandwidth, in which the Wii U doesn't have a real advantage over PS360 hardware. Since PS360 image quality didn't increase over the years when developers got more experienced with the hardware, it's safe to say that we won't see a positive evolution regarding Wii U IQ either. People should expect 720p/30fps to remain the sweet spot IQ for the majority of Wii U games, since the Wii U hardware simply doesn't have the specs to double the framerate or the rendering resolution. The Wii U is designed to be a 720p/30fps machine. If it wasn't, you would already have noticed.

Also, Krizzx, you claim you post in this thread to understand the Wii U GPU. I don't think that's true. The only reason you are here is because you want to tell a story. A story that the Wii U GPU is much more powerful than it really is. It's dogmatic and thus has nothing to do with learning anything new.

That is one extremely fact-distorting and condescending post.

Don't inflict your ideology on me. I've said it a dozen times. I'm not here for console war/fanboy garbage like that. Stop trying to twist my words. You are accusing me of doing the opposite of what you and a few others are doing (as in trying your hardest to downplay the GPU). The general consensus from all of the analysts is that the hardware is stronger. To say otherwise at this point requires some specially tinted glasses. I use "facts" and details to help find out how much. I'm only proposing possibilities that the details point to. Unlike you, I haven't come in here and slammed people with what I think is absolute fact (as I think none of this is) and raged when someone said otherwise.

Don't insult me with that crap.

Part of what I do is graphic design and I have a good eye for detail. I'm using that to help further along the analysis by contrasting the scale and fidelity demonstrated in games, which I consider at least a minor helpful contribution, given that a large chunk of people who come in here only seem to care about what the games on the console look like while attempting to limit the hardware's capability as much as they can to the limit of said games, such as you just did in that post. It is ridiculous. I'm tired of responding to people who come in this thread with bitter, poorly correlated responses and work to dismiss all proposed gains every time anyone in the thread proposes that something on the Wii U is even remotely better than something on the PS3/360, with absolutely no directly correlated facts to back what they say.

They get outraged at even the smallest gains suggested about the hardware and make rebukes that make no sense. And for 720p to "remain" the sweet spot? It never was.

All Wii U games that aren't launch games or ports run at either 720p60 or 1080p. You are proposing that the Wii U's "sweet spot" is lower than what the average full Wii U game has already been demonstrating. Many have "already noticed". I do not see how you have not.
 

Luigiv

Member
That's what I mean, at 720p it consumes most of it and at higher resolutions it won't fit at all. Given that the eDRAM isn't solely intended for the framebuffer and 2xMSAA or FXAA is a small quality downgrade for most people, it doesn't seem efficient.

FXAA is absolutely awful and can die in a fire. I hope Wii U games never use it. If devs can't spare the frame buffer for real AA, I'd rather just have Jaggies.
 

krizzx

Junior Member
FXAA is absolutely awful and can die in a fire. I hope Wii U games never use it. If devs can't spare the frame buffer for real AA, I'd rather just have Jaggies.

Why is that? FXAA can do more than just AA. You can use it to add effects at a much lower cost as well (or at least it could in Skyrim on the PC). Even with FXAA, the textures are still generally clean and high quality. The degradation would be superficial.
 

krizzx

Junior Member
Because it looks worse than no AA, plain and simple. The point of AA is to make 3D graphics look better not worse.

Perhaps, though I cannot fully agree with that.

[Attached comparison images: MSAAvsFXAA.png, FXAA.JPG, BACFXAAComparison.png]
 
Because it looks worse than no AA, plain and simple. The point of AA is to make 3D graphics look better not worse.

That's a bit of hyperbole. I understand that people don't like how it blurs the final image a bit, but it's a good compromise. You get rid of jaggies at very little cost.
 
The image you picked works well with FXAA because the background is out of focus. FXAA is cheap but blurry. There are countless examples of this demonstrated in practice. For example:

[Attached comparison screenshot: 52a9f023d856c5d92d865c5f6e444358.jpg]


It smoothed the edges better because it smoothed everything. The blurring is very noticeable. (The game in question is, I believe, BF3. Where is your screenshot from?)
 

A More Normal Bird

Unconfirmed Member
Why is that? FXAA can do more than just AA. You can use it to add effects at a much lower cost as well (or at least it could in Skyrim on the PC). Even with FXAA, the textures are still generally clean and high quality. The degradation would be superficial.

No it can't, it literally is "Fast Approximate Anti-Aliasing". What you're talking about is that the dll injector on PC can be used to run other post-process shaders, which are independent of FXAA.
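
To make that concrete, here's a very stripped-down sketch of what a post-process AA pass in the FXAA family does conceptually: look at the luminance contrast around each pixel and, where it's high (a likely edge), blend that pixel toward a smoothed version of its neighborhood. This is not the real FXAA algorithm (which also estimates edge direction and filters along it), just an illustration that the whole technique is a single full-screen pass over the finished image and nothing more:

import numpy as np

def fxaa_like_pass(rgb, contrast_threshold=0.05, blend=0.5):
    """Toy post-process AA: blur pixels only where local luma contrast is high."""
    # Perceptual luminance of each pixel (rgb is an HxWx3 float array in 0..1).
    luma = rgb @ np.array([0.299, 0.587, 0.114])
    # North/south/west/east neighbor lumas via shifted, edge-clamped copies.
    pad = np.pad(luma, 1, mode="edge")
    n, s = pad[:-2, 1:-1], pad[2:, 1:-1]
    w, e = pad[1:-1, :-2], pad[1:-1, 2:]
    contrast = np.maximum.reduce([luma, n, s, w, e]) - np.minimum.reduce([luma, n, s, w, e])
    # 3x3 box blur of the color buffer (a crude stand-in for FXAA's directional filter).
    pad_rgb = np.pad(rgb, ((1, 1), (1, 1), (0, 0)), mode="edge")
    blurred = sum(pad_rgb[dy:dy + rgb.shape[0], dx:dx + rgb.shape[1]]
                  for dy in range(3) for dx in range(3)) / 9.0
    # Blend toward the blur only on detected edges; leave everything else sharp.
    edge = (contrast > contrast_threshold)[..., None]
    return np.where(edge, (1 - blend) * rgb + blend * blurred, rgb)

Since it never sees sub-pixel coverage the way MSAA does, all it can do is soften edges it detects in the final colors, which is also why it tends to blur texture detail along those edges.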
 
I've said it a dozen time. I'm not here for console war/fanboy garbage like that.
Aren't most of your threads about WiiU? Not that that's a perfect indicator, but you also had one of the most misleading and disingenuous thread titles I've seen. I want to give you the benefit of the doubt, but it seems hard to discuss the negatives of something you always post about positively; how can healthy discussion be had?

No it can't, it literally is "Fast Approximate Anti-Aliasing". What you're talking about is that the dll injector on PC can be used to run other post-process shaders, which are independent of FXAA.
And which devs very likely won't use since, by my guess, it's inefficient for production quality code to use injection to achieve post processing. There are probably more efficient ways of doing it; people do it on PC because it's not officially supported.
 

Luigiv

Member
Perhaps, though I cannot fully agree with that.

[Attached comparison images: MSAAvsFXAA.png, FXAA.JPG, BACFXAAComparison.png]

This is a very, very cherry picked example. It crops and zooms in on the one weak spot of low level MSAA (very high contrast spots) and highlights an object that has no texture detail so that the downside of FXAA isn't apparent. The zooming also makes the Jaggies look worse than they would otherwise.
 

krizzx

Junior Member
Aren't most of your threads about WiiU? Not that that's a perfect indicator, but you also had one of the most misleading and disingenuous thread titles I've seen. I want to give you the benefit of the doubt, but it seems hard to discuss the negatives of something you always post about positively; how can healthy discussion be had?

That is irrelevant to this thread or its discussion.

My other posts and threads are made simply to contrast the overbearing onslaught of negative news and the general negative opinion directed toward Nintendo and their products that I see on the internet. It has reached ridiculous levels, with journalists taking news that should have been positive and spinning it to look as negative as possible, or cherry-picking facts for negative purposes. If there were a more balanced opinion of the hardware, or others posting more positively about it, then I would not bother.

As I've said at least 100 times now, I'm a PC gamer. I rarely ever touch consoles. I still find it unjust the way Nintendo gets so much acute hatred for things that Sony gets praised for. I would do the same if it were SEGA or some other company in the crosshairs of journalists trying to capitalize on click-bait headlines that distort fact. I like for things to be accurate, and I generally like to root for the underdog as well.

Though I must ask again: what is with all of the personal attacks? It's like my attempts to clear up the misconceptions about the Wii U hardware are causing people physical harm, going by some of the posts I've been seeing.

This is a very, very cherry picked example. It crops and zooms in on the one weak spot of low level MSAA (very high contrast spots) and highlights an object that has no texture detail so that the downside of FXAA isn't apparent. The zooming also makes the Jaggies look worse than they would otherwise.

Well, of course they are cherry-picked. They were selected to make a point. The point is that FXAA is not without its benefits, even over other AA. You spoke of it like it was the hell spawn of the devil and should be snuffed out of existence.
 