can this thread go back to technical discussion with people who know what they're talking about, instead of competing fanboys talking about their opinions and feelings?
As I said, I've really enjoyed keeping up with this thread and learning more about hardware and architecture along the way. I agree it'd be great if we could table the side conversations about third-party support and business dynamics. And while I don't think I'm the best person to do that, I figured I'd give it a shot.
I was curious about BGAssassin's theory about dual graphics engines, and z0m3Ie's contributions to it as well. Has that theory been abandoned? I might be wrong on this, but I was thinking that the presence of a dual graphics engine would suggest more than 160 ALUs? And conversely, if we assume the Wii U is 160 ALUs based on the visuals we are seeing, that would make a dual graphics engine unlikely.
Two things I'd like to point out, more to the thread than to you specifically.
1. Tessellation simply draws more polygons by subdividing the geometry that's already there. I'm not sure what last-gen games pushed in polygons per second, but a limit of 550 million polygons per second could make tessellation unusable if a game is already pushing those numbers with simpler models (around 9 million polygons per frame if the game runs at 60 FPS). That's probably plenty of detail for a modern game anyway, but tessellation certainly adds a ton more polygons on top of it, so a dual graphics engine might be needed.
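To make that per-frame arithmetic concrete, here's a quick sketch. The 550 million polygons/second setup rate is the figure being discussed above; the frame rates are just illustrative:

```python
# Rough polygon-budget arithmetic for a fixed triangle setup rate.
# 550M polys/sec is the assumed GPU7 setup limit from the discussion above.
SETUP_RATE = 550_000_000  # polygons per second (assumption)

for fps in (30, 60):
    per_frame = SETUP_RATE / fps
    print(f"{fps} FPS -> ~{per_frame / 1e6:.1f}M polygons per frame")
```

At 60 FPS that works out to roughly 9.2M polygons per frame, which is where the "around 9 million" number above comes from; dropping to 30 FPS doubles the per-frame budget.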
2. Even though dual graphics engines first appeared in the 6000 series, the chance of Wii U using one would still point to Wii U having custom parts. The GCN tessellator is something that might have been easy to add to Wii U's GPU7, though I think the real problem there is that GCN's tessellators are designed for 28 nm, and obviously that might pose a problem for GPU7. My point, though, is that if something is obviously customized, it likely pulls together the best parts for a goal, so we shouldn't assume one way or the other whether any of GCN's components, designed ~4 years ago, made it into Wii U's GPU.
Basically, if games were pushing ~300 million polygons per second last gen, then Wii U's tessellator would be limited in its use, since it couldn't even double the polygon count. And even with adaptive tessellation being used on Wii U, there would obviously be a benefit from moving to a dual graphics engine, no matter which generation of tessellator it uses.
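The "couldn't even double" point can be checked with the same arithmetic. Both numbers here are the assumptions from above (550M setup cap, ~300M base throughput), not confirmed specs:

```python
# How far a tessellator could amplify geometry before hitting the setup cap.
SETUP_LIMIT = 550_000_000  # assumed GPU7 polygon/sec cap
BASE_RATE = 300_000_000    # hypothetical last-gen base polygon throughput

headroom = SETUP_LIMIT / BASE_RATE
print(f"max tessellation factor: ~{headroom:.2f}x")
```

The headroom comes out to roughly 1.83x, which is less than the 2x you'd want just to double the polygon count, let alone the much higher amplification factors tessellation is normally used for.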
Please feel free to pick this apart, throw mud, etc.
160 ALUs In Favor:
160 ALUs of a more efficient, modern design could approximately match Xenos, which would explain the similar visuals and performance.
At a count of 160 there is a size discrepancy: Wii U's ALUs appear significantly larger than we'd expect them to be. This could make sense given what we now know about a 45 nm manufacturing process, as opposed to the 40 nm process we had assumed.
Furthermore, the size discrepancy is not large enough to suggest that a larger variant (up to 320 ALUs) would fit.
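As a very rough sanity check on the 45 nm point, here's the naive node-scaling arithmetic. This assumes every structure shrinks linearly with feature size, which real processes don't do, so treat it as an upper-bound-ish estimate rather than anything rigorous:

```python
# Naive area scaling between process nodes: area goes with the square of
# feature size, assuming (unrealistically) uniform linear shrink.
NODE_ASSUMED = 40.0  # nm, the process we originally assumed
NODE_ACTUAL = 45.0   # nm, the process now believed

area_factor = (NODE_ACTUAL / NODE_ASSUMED) ** 2
print(f"same design would be ~{(area_factor - 1) * 100:.0f}% larger at 45 nm")
```

That comes out to roughly 27% larger, so a 45 nm process could plausibly account for a chunk of the ALU size discrepancy, but whether it accounts for all of it depends on how big the discrepancy actually is.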
160 ALUs Against (>160):
Wii U's architecture is vastly different from PS3/360's, so it is reasonable to expect a learning curve.
IF the curve is as steep as it was for PS3/360,
THEN launch games for Wii U that can match or exceed late-gen titles would imply more powerful hardware (especially if the dev was rushed, or had a small team, small budget, incomplete tools, etc.).
IF a dual graphics engine is present, THEN it would suggest >160? (I ask; I don't know if that is how it works.)
IF there were a game whose visuals could only be explained by a >160 ALU part.
I guess I have this in both categories: the ALU size is bigger than we would expect for 160, and if it can't be explained by 45 nm, we don't know why.
Also, it seems as though some of the heavy hitters on this thread have lost interest. I've seen a couple of recent posts from Fourth Storm, BGAssassin, and Mihael Mello Keehl, but haven't seen z0m3Ie, blu, Thraktor, or krizzx. I was wondering why the loss of interest? There's been so much in this thread, and I was hoping to hear "closing statements" or some kind of finale! haha. Certainly not expected, just sayin'.