Nice to see you back bgassassin.
If I may ask, do you regret mentioning a 600+ GFLOPS GPU?
Potentially dumb question, but I was wondering and saw it posed on B3D as well: Is it possible to "mix" nodes for a component?
Nope. I stand by my opinions although that was based on a more traditional design. The problem is silicon dedicated to other tasks makes it harder to classify from a FLOPs angle, though I believe the performance would be similar. So no regrets at all as I was trying to make what wasn't tangible as tangible as possible.
I think I was more right than bgassassin in terms of predicting Wii U's power.
the zelda demo in particular had a large emphasis on lighting. the 'day/night' switch seemed designed entirely to highlight the lighting change, and that was a big part of the demo. i can't recall the bird demo right now. from what i've noticed lately, lighting is one of the easiest effects to utilize that truly makes a game feel modern. i can see them pushing good lighting in just about every mainline nintendo game.
Man, it's great to see the reason I joined GAF (and pretty much abandoned the GT forums) back. Also, this has been said before: we can't fully judge FLOPs due to parts of it being (for lack of a better term) fixed. In fact, didn't we guess this in the old WUST? That the lighting would be fixed? I remember commenting on that during E3 2011.
Wii U being deceptively powerful confirmed?
I would not count on it.
If you get a chance, have a look at the extended 'Bird' tech demo again (it's filmed off-screen but the quality is decent in HD); the lighting is perhaps its most impressive feature imo -
http://www.youtube.com/watch?v=i2Nsa06KRLo
Even more impressive is that the console is rendering the game twice, once on the TV and once on the Gamepad.
I have never really been worried that Nintendo could create fantastic looking games on WiiU hardware because of those two demos (which were created in a short time on the very first version of the devkits).
What they are going to create on final devkits with a decent amount of time and budget is going to be incredible imo.
You were not.
This is where things get tricky from the foundry perspective. A foundry might be willing to give a Tezzaron the information. But consider this: "Nintendo's going to build their next-generation box," said Patti. "They get their graphics processor from TSMC and their game processor from IBM. They are going to stack them together. Do you think IBM's going to be real excited about sending their process manifest, their backup and materials, to TSMC? Or maybe TSMC will send it to IBM. Neither of those is ever going to happen. Historically they do share this with OSATs, or at least some material information. And they've shared it typically with us because I'm only a back-end fab. I can keep a secret. I'm not going to tell their competition. There's at least a level of comfort in dealing with a third party that isn't a competitor."
Post history proves it.
Congratulations. Would you like a medal? Or would you prefer the rest of us to bow down before you?
Medal plz.
Take the bow....lol
That's all well and good but aren't you putting a little too much stock in that case in currently unquantifiable metrics?
No because I've never said that view was guaranteed nor said it in a manner that it should be taken that way. Plus considering it was almost a year ago when I made that post about a reduced amount of shaders and dedicated silicon, I'm not all of a sudden trying to make some kind of excuse. I said it then and I'll say now that I don't know what the silicon could be for. My personal interpretation is that it's for performance-based tasks. People are free to believe that or not as I can't control that, but I make no claims that is what the logic is for.
I just have to ask, what are you basing your opinion on? Educated guesses, or developer/insider feedback you have been privy to but can't share? To be clear: what you are stating is that, performance-wise, what Wii U is able to put out graphically is 600 GFLOPS' "worth" of processing power & effects, right?
According to the dubiously reliable Wikipedia, ASTC is an official extension for OpenGL now. That's where the thought crossed my mind. A tile mode would be interesting. Are we talking PowerVR type tile rendering, or something entirely different?

ASTC is an official extension to OpenGL, but there is no hardware supporting it yet.
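For context on why ASTC is interesting as a texture format: every ASTC block occupies a fixed 128 bits, and block footprints range from 4x4 up to 12x12 texels, so the effective bitrate is just 128 divided by the block area. A minimal sketch (block-size figures are from the ASTC spec; the function name is my own):

```python
# ASTC stores each compressed block in a fixed 128 bits; the block
# footprint (4x4 up to 12x12 texels) sets the effective bitrate.
def astc_bits_per_texel(block_w: int, block_h: int) -> float:
    return 128 / (block_w * block_h)

print(astc_bits_per_texel(4, 4))              # 8.0 bits/texel
print(astc_bits_per_texel(8, 8))              # 2.0 bits/texel
print(round(astc_bits_per_texel(12, 12), 2))  # 0.89 bits/texel
```

That flexible bitrate range is what sets ASTC apart from older fixed-rate formats like S3TC.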
Heh, I was going to contact you to see what you thought about the latest info, but I was waiting to see if we were able to get more info about the customizations first. I didn't expect you to pop back up on your own.
The Wii U GPU won't be built around or based on Southern Islands. From everything we know, it's based on R700 but customized beyond recognition. It's unlike any off-the-shelf AMD GPU.
I talked to bgassassin a few days ago, and I believe we concluded that the chip is probably pretty slow on paper, maybe 300-400 GFLOPS or something, but extended with a couple of shortcuts to accelerate certain common, taxing operations.
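To put those paper numbers in perspective, a minimal sketch of how theoretical peak throughput is usually quoted for R700-class parts: shader ALUs x 2 ops per cycle (multiply-add) x clock. The ALU counts and clocks below are illustrative guesses, not confirmed Wii U specs:

```python
def peak_gflops(alu_count: int, clock_mhz: float) -> float:
    """Theoretical single-precision peak: ALUs x 2 ops/cycle (MAD) x clock."""
    return alu_count * 2 * clock_mhz / 1000.0

# Illustrative figures only, not confirmed specs:
print(peak_gflops(320, 550))  # 352.0 GFLOPS -- in the 300-400 range above
print(peak_gflops(640, 500))  # 640.0 GFLOPS -- closer to a 600+ estimate
```

The point being that paper FLOPS swing wildly with ALU count and clock, which is why fixed-function additions make the chip hard to classify that way.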
If anyone deserves a medal, it's Wsippel:
bg too since wsippel said "we" in your quote; meaning both him and bg reached the same conclusion.
No, because BG said, in response to Wsippel's quote, that his conclusion was not exactly the same, and BG overshot, lol. Or at least Wsippel gets a bigger medal.
Eh, I'm the one that said it was basically a 360 for months and months before release. And that the CPU was weak.
I win.
So where is Iherre during all of this?
Now it makes sense what BG was telling me over 6 months ago: that the GPU performance would be similar to an AMD E6760's, though not based on or made just like it, apart from being very close in power consumption. In that sense the Wii U GPU does share some things in common with that GPU, but it's still completely custom and not like any GPU for a PC or the coming systems from MS and Sony.
However, Nintendo hopefully made the GPU "easy enough" to develop for, so porting down-scaled modern engines would not be too much of a hassle and even its custom "not ordinary" parts can still be used.
Bravo BG
“Today we discovered a new hardware feature of the Wii U that shaves off 100 megabytes of texture memory usage in Toki Tori 2! [This] means we spend less time loading and have more memory available when the game is running.”
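For a sense of scale on that 100 MB figure, a rough sketch of texture memory math. The resolutions, the formats, and the assumption that the savings come from a compressed texture format are mine, not Two Tribes':

```python
def texture_bytes(width: int, height: int, bytes_per_texel: float,
                  mipmaps: bool = True) -> int:
    """Approximate memory for one texture; a full mip chain adds ~1/3."""
    base = width * height * bytes_per_texel
    return int(base * 4 / 3) if mipmaps else int(base)

# Uncompressed RGBA8 (4 bytes/texel) vs a 4-bits/texel compressed format:
uncompressed = texture_bytes(2048, 2048, 4)    # ~22.4 MB with mips
compressed = texture_bytes(2048, 2048, 0.5)    # ~2.8 MB with mips
saved_mb = (uncompressed - compressed) / (1024 * 1024)
print(round(saved_mb, 1))  # ~18.7 MB saved per texture at this size
```

At savings like that, a handful of large textures would account for the 100 MB Two Tribes mention.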
That's actually pretty ridiculous if true.

The problem is they most likely had to discover it on their own.
If Nintendo isn't even fully documenting the hardware or what the API supports the chances of [good] next gen engine downports are slim.
In fact, the whole situation wherein GAF has had to crowdsource images (ChipWorks has been exceedingly generous in providing dieshots) presumably means the documentation doesn't detail any GPU specifications?
VGLeaks' MO has essentially been copying and pasting documentation - which is how we have so many details of the PS4 and XBOX 3. Yet, nothing for the Wii U.
I wouldn't be surprised if third parties can't be assed if that's the case.
How long are you willing to keep this up?
We all know who you are. No need to remind us. Which is exactly why you don't win.
Would this thread (and by that I mean the x-ray) be of any help to any devs? That would be hilarious.
Hilarious and sad at the same time.
Could anybody provide a short summary of what's been found? I'm not techy in the least, so I have trouble following even the OP.