WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Status
Not open for further replies.
I don't see how the freezes during the conference have any relevance at all to my point. The odds of them being present in the full version are extremely low. We're six months out from launch.
Because it was an early alpha version of AC IV that froze, it's irrelevant. It doesn't belong in the discussion at all and proves nothing. We should wait for release and see how competent (or not) the port is.

But it has been explained to death that the fact that Wii U is not significantly above current gen is what is making the ports not exhibit much improvement. More so than any "familiarity", "immature toolset" or architectural problems/excuses.

The fact that you've admitted you've got to restrain yourself from being offensive in a reply to that post tells me you're really emotionally invested in this. Sorry for making you feel that way (seriously).
Out of the two of you, he is the more "emotionally" invested but still approachable in a discussion. He's nowhere near another gentleman in the thread, who takes it to the extreme. Now that's passion and fanaticism.
Just as people are free to share their opinions, here is mine.

From E3 the best graphically were: Destiny, The Division, Infamous SS, BF4 and MGSV.

The Division was running on a mighty PC, so lower your expectations a bit. BF4 was also running on PC. Everything for XB1 was running on PC, so MGSV too. The ones I am not sure about are Infamous SS and Destiny. I saw a clip of Infamous 2 vs. SS, and Infamous 2 actually holds up pretty well. That is what we are saying: not that there is no difference, but that last-gen games look really good, and Wii U is an improvement on that, so they will hold up better indeed.
Not according to DF. Forza 5, one of the most impressive games of the show, was running on a dev kit.
 
After going back and looking at the Mario Kart 8, 3D World, 101, and Bayonetta 2 60 fps videos again, the games really do look good in motion; you can't just look at still pics. It's not like the Wii vs. PS360 situation anymore, where the gap was obvious in still pics or otherwise.
 
Well this went downhill fast.

On Topic.
Given the relative lack of any major jump in Infinity Ward's engine, do you think the Wii U version of Ghosts will be comparable to the PS4/720 version?

From what I've seen it looks like it should hold up nicely; I have yet to see a single effect that the Wii U should not be able to pull off with relative ease.
 
Well this went downhill fast.

On Topic.
Given the relative lack of any major jump in Infinity Ward's engine, do you think the Wii U version of Ghosts will be comparable to the PS4/720 version?

From what I've seen it looks like it should hold up nicely; I have yet to see a single effect that the Wii U should not be able to pull off with relative ease.
Yes, I think it can, based on what we saw.

Now imagine the dog cam on the Wii U GamePad! Sold!
 
Well this went downhill fast.

On Topic.
Given the relative lack of any major jump in Infinity Ward's engine, do you think the Wii U version of Ghosts will be comparable to the PS4/720 version?

From what I've seen it looks like it should hold up nicely; I have yet to see a single effect that the Wii U should not be able to pull off with relative ease.
Eh, highly unlikely. There's no reason for them to do much more than dump the 360 code onto a Wii U disc with swapped-out button icons and call it a day. I forget who (might have been Shocking Alberto), but someone relatively notable hinted that the real reason it hasn't been talked about is that Activision is still negotiating some kind of perk package, just as they usually do with MS and Sony.
 
Also, a very interesting comment from Nomura: Wii U not having the capacity to support equivalent features from the DX11 API and how that impacts game development, plus obviously the huge gap with the PS4/X1.

This is false. What is true, on the other hand, is that the API used by Sony for development is not DX11, which is a different matter. The hardware is fully DX11 compliant, which seems not to be the case with the U.

Holy shit. My reply to you AGAIN from just a few pages ago.

I could have sworn I've specifically replied to you about this very same thing before. DX11 added three main features to the DX standard over DX10.1. The big two were tessellation and compute shaders; the third addition was better support for multi-core processors. The only things we need to be looking at here are tessellation and compute shaders. DirectCompute can be run on AMD 10.1 hardware. DX11's flavor of tessellation couldn't directly be run on the tessellators found in AMD's chips due to their not supporting certain function calls of DX11. That said, other APIs could absolutely make use of the tessellators. It is a given that the tessellators that followed were much more efficient than the ones found in AMD's 7xx line of chips.

We know from the leaked specs, confirmed by multiple insiders, that the Wii U GPU supports compute shaders, and we know from Shin'en that it supports a tessellator.

So the two main features added by DX11 over DX10.1 are there. Yes, the Wii U hardware is not as powerful as the XO/PS4, but we're talking feature set, not power. And yes, DX11-spec parts are going to be more efficient and also support Shader Model 5.0, but the main feature set is there.
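To make the feature-set point concrete: roughly speaking, the fixed-function tessellator stage just subdivides patch edges by a tessellation factor, and a later shader stage repositions the new vertices (e.g. for displacement mapping). A toy Python sketch of that subdivision step, not real shader code; `tessellate_edge` is made up purely for illustration:

```python
def tessellate_edge(p0, p1, factor):
    """Split one edge into `factor` segments, returning factor + 1 points.

    This mimics what a hardware tessellator does per edge; a domain
    shader would then displace the generated vertices.
    """
    return [
        tuple(a + (b - a) * i / factor for a, b in zip(p0, p1))
        for i in range(factor + 1)
    ]

# A tessellation factor of 4 turns one edge into 4 segments (5 vertices).
points = tessellate_edge((0.0, 0.0), (1.0, 0.0), 4)
```

The real stage works on triangle or quad patches with inner and outer factors, but the principle is the same: more factor, more generated geometry, at no cost to the artist-authored mesh.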
 
Didn't tessellation shaders come into existence with DX10.1?

I'm pretty sure the .1 was almost, if not entirely, for performance and tessellation, though it was never really used in games thanks to Nvidia.

http://www.youtube.com/watch?v=0K80ZzgR37o
http://www.youtube.com/watch?v=a7-_Uj0o2aI
Tessellation hardware has been around for a while on the AMD side; the 360 has a very early piece of tessellation hardware in it. It wasn't officially a part of the DX API till 11. That said, those early tessellators couldn't be used under the DX11 API because they did things slightly differently from the official DX11 way. They could, however, be used under other graphics APIs.
 
Holy shit. My reply to you AGAIN from just a few pages ago.
Lol, I've noticed this in a discussion with him before; some arguments he doesn't want to deal with. Must not be emotionally invested enough, lol.

But it has been explained to death that the fact that Wii U is not significantly above current gen is what is making the ports not exhibit much improvement. More so than any "familiarity", "immature toolset" or architectural problems/excuses.
Then please explain why early PS360 games looked like turd compared to games available now. The gap between PS360 launch games and games 6 to 7 years later on those platforms is arguably just as wide as the gap between late PS360 games and launch PS4/XBone games. I'm not claiming WiiU is immensely more powerful, but you seem to be underestimating the issue of dev experience on a mature toolset with actual documentation.
 
This was in the third party reel at E3.

http://www.youtube.com/watch?v=xHMJXy6lSHo&t=2m8s

Don't know if that's confirmed Wii U footage, or PC though.
I think that sizzle reel contained PC footage. The clips apparently are the same as some footage shown a few months ago, but I haven't verified it myself.

That said, I'm not worried. I've preordered the Wii U version sight unseen. AC3 held up okay at launch with shitty dev tools (I played it on both Wii U and PC) so I'm not too concerned about Watch_Dogs.
 
I think that sizzle reel contained PC footage. The clips apparently are the same as some footage shown a few months ago, but I haven't verified it myself.

That said, I'm not worried. I've preordered the Wii U version sight unseen. AC3 held up okay at launch with shitty dev tools (I played it on both Wii U and PC) so I'm not too concerned about Watch_Dogs.
I want to say it doesn't because it's Nintendo but at the same time it's a third party reel.
 
Tessellation hardware has been around for a while on the AMD side; the 360 has a very early piece of tessellation hardware in it. It wasn't officially a part of the DX API till 11. That said, those early tessellators couldn't be used under the DX11 API because they did things slightly differently from the official DX11 way. They could, however, be used under other graphics APIs.
Now, what about OpenGL? The Wii U uses some version of that, and the feature set is not completely the same as its DirectX equivalent.
 
When I was reading through the "All this talk about our earnings is silly" thread I found this passage really interesting.

The other is a little bit coincidental in that the hardware jump from DS to 3DS was quite big in terms of the difference between those two [platforms] and it just so happens that that same scale of jump happened from Wii to Wii U, consecutively with those two pieces of hardware. And any time you have a big jump in the hardware technology it certainly takes the teams time to learn that and adjust their development environment in order to adapt to those big changes. So I think gradually as we’re adding more staff and we’re increasing our capabilities… and in the future as the hardware generation change doesn’t result in significant change in the hardware environment or capabilities of the hardware, then what ends up happening is you have a smoother transition, as you saw from the Gamecube to Wii.
Can we expect the successors of the 3DS and Wii U to both use the same basic architecture as one of these two (probably the Wii U's)?
 
Forza 5, The Division, Shadow Fall, the Dark Sorcerer demo, Ryse, The Order, The Witcher 3, Quantum Break, Battlefield 4, Need for Speed Rivals. None of them impressed you in the slightest?
These games did look amazing, I did not think "wow ugly" at all for any of those titles.

But, they impressed me the same way The Last of Us or Mario Kart 8 has impressed me; not in the polygonal pushing sense but in a general "these games look fantastic visually" sense.

What I was personally expecting visually from the PS4/XBone overall was not matched however, and that was mostly disappointing for me, outside of essentially MGSV.

That said, it does not mean I was repulsed or anything of that nature. It just means my expectations were not met, and I did not end up ultimately seeing a huge difference. It was just prettier, but essentially prettier versions of everything I've seen before.

Objectively, the Wii U games themselves may not be anything "special" in that sense; but they are somewhat more impressive for many, myself included, since it's the first time we've seen HD Nintendo games, so there is more of a "wow" factor after 10+ years of Gamecube/Wii SD graphics. Going from Xenoblade to X was the "generational leap" that didn't exist with the Gamecube-to-Wii transition, and that leaves much more of an impression on me than Forza 4 -> 5 (or something like GT6 -> Forza 5), or current-gen FIFA/Madden -> "next gen" FIFA/Madden, or Witcher 2 -> 3. That leap was not big for me, in that sense.
 
I think Nintendo should make sure the Wii 3 has the same architecture as the PS5/Xbox9thGen; it won't mean all third-party support, but it can help.
I guess this depends on when the Wii U successor will be released (likely 2016). I think the step up from the PS4/XBone won't be bigger than the Wii U's from the PS360 (it doesn't have to be).

So probably 8 cores, a 2TF GPU and 8 GB of RAM; I am more interested in the architecture used, though.

Would a die shrink of Espresso be a good option, or are there better offerings on the market?
I think Miyamoto's statement suggests that they would go the extra mile to keep compatibility.

As I understand it, AMD's "Cat" CPUs will switch from x86/x64 to ARMv8 in the next few years, so a Cortex-A57 could be another possibility (with a GCN-based GX3 GPU on the die).
 
I guess this depends on when the Wii U successor will be released (likely 2016). I think the step up from the PS4/XBone won't be bigger than the Wii U's from the PS360 (it doesn't have to be).

So probably 8 cores, a 2TF GPU and 8 GB of RAM; I am more interested in the architecture used, though.

Would a die shrink of Espresso be a good option, or are there better offerings on the market?
I think Miyamoto's statement suggests that they would go the extra mile to keep compatibility.

As I understand it, AMD's "Cat" CPUs will switch from x86/x64 to ARMv8 in the next few years, so a Cortex-A57 could be another possibility (with a GCN-based GX3 GPU on the die).
If the Wii 3 released in 2016, that would be a four-year life for the Wii U... That's not gonna happen; the Wii 3 will release in 2017 or 2018 (my prediction is 2018).

I'm thinking 2.3TF, with 10GB of RAM, with 9-10 cores.

I'm thinking third parties might pressure Nintendo into going the same route as the PS5 and Xbox 9th Gen (I really don't know what to call the Xbox One successor).
 
If the Wii 3 released in 2016, that would be a four-year life for the Wii U... That's not gonna happen; the Wii 3 will release in 2017 or 2018 (my prediction is 2018).

I'm thinking 2.3TF, with 10GB of RAM, with 9-10 cores.

I'm thinking third parties might pressure Nintendo into going the same route as the PS5 and Xbox 9th Gen (I really don't know what to call the Xbox One successor).
You may be right; I shouldn't have deleted "the earliest" :)

And wouldn't it be Gen 5 (by EA's standard *g*)?

If they continue to use PPC in 2016, then they are drunk as fuck.
Why?
 
If the Wii 3 released in 2016, that would be a four-year life for the Wii U... That's not gonna happen; the Wii 3 will release in 2017 or 2018 (my prediction is 2018).

I'm thinking 2.3TF, with 10GB of RAM, with 9-10 cores.

I'm thinking third parties might pressure Nintendo into going the same route as the PS5 and Xbox 9th Gen (I really don't know what to call the Xbox One successor).
How about "Xbox 6" since Xbox will be 6-feet under by 2016?
 
Does it hold up against the likes of The Division? Not at all. Titanfall? Yes, absolutely. I agree that it still doesn't look distinctly "next gen", but I wouldn't fault anyone for thinking it looks beyond PS360. It's all in the eye of the beholder; that's the main problem here.
Based on the gifs people made? Yes, then it does seem to hold up against Titanfall.

But that's just the magic of low res gifs.


Low res gifs make people fill in the details that may or may not be there in full resolution.


If you look at the trailers of both games in their native resolution, full screen, Titanfall destroys X.
http://www.youtube.com/watch?v=goe6IB1DLZU
http://www.youtube.com/watch?v=atTvHk9CxOM

Even with the eye of the beholder etc ... I can't see how or why people even try to make a contest of this. There is no contest.
 
If the Wii 3 released in 2016, that would be a four-year life for the Wii U... That's not gonna happen; the Wii 3 will release in 2017 or 2018 (my prediction is 2018).

I'm thinking 2.3TF, with 10GB of RAM, with 9-10 cores.

I'm thinking third parties might pressure Nintendo into going the same route as the PS5 and Xbox 9th Gen (I really don't know what to call the Xbox One successor).
6 core CPU, 1 TF GPU, 8 GB GDDR3 RAM, 64 GB FLASH.
 
M°°nblade;64716911 said:
Even with the eye of the beholder etc ... I can't see how or why people even try to make a contest of this. There is no contest.
Titanfall is cross-gen, so that may be its weakness, assuming the Xbox 360 is holding it back. Still an impressive effort if it manages to [greatly] surpass X after you factor that in.

However, on the Wii U's part this looks less impressive because of said handicap.
 
That means dropping BC, and I don't think that is a choice Nintendo would make lightly.
If they had gone with an x86-based CPU for the Wii U (think Llano/Trinity with double the SP count), they could've software-emulated all their past systems and had Xbone performance levels for upcoming games.

Now that they have a tri-core PPC and an integrated hot mess of a GPU, this premise becomes less likely in the future.
 
If they had gone with an x86-based CPU for the Wii U (think Llano/Trinity with double the SP count), they could've software-emulated all their past systems and had Xbone performance levels for upcoming games.

Now that they have a tri-core PPC and an integrated hot mess of a GPU, this premise becomes less likely in the future.
Llano/Trinity wouldn't have fit the power envelope. Maybe 2x Bobcat would have been a possibility, but even the 2-core / 80-shader standard variant has a TDP of 18W, which is probably only slightly less than the Wii U's. So there weren't really that many attractive alternatives.
 
Llano/Trinity wouldn't have fit the power envelope. Maybe 2x Bobcat would have been a possibility, but even the 2-core / 80-shader standard variant has a TDP of 18W, which is probably only slightly less than the Wii U's. So there weren't really that many attractive alternatives.
When I made that suggestion, it implied a substantially more demanding power/thermal envelope (~120W TDP for the APU alone). It would require Nintendo to alter their approach and try to compete with the other two. Given the relatively conservative specs of the other next-gen consoles, they had a unique opportunity to enter the race at an even level while providing a unique capability: a game library that spans decades in a single package.

Of course, hindsight is 20/20; Nintendo themselves probably weren't aware of the architectures and performance levels of the competition. They aimed to provide a "second console", assuming their own franchises are alluring enough to attract customers.

Let's see how it pans out for them this time.
 
Titanfall is cross-gen, so that may be its weakness, assuming the Xbox 360 is holding it back. Still an impressive effort if it manages to [greatly] surpass X after you factor that in.

However, on the Wii U's part this looks less impressive because of said handicap.
I don't see much holding back in the footage we've seen.
The only thing that bothers me is the limited mech detail; the use of textures gives it a cartoon look.

http://i405.photobucket.com/albums/pp134/Cowboy_bebop/2013-06-19_124050_zpsd17f3e8c.jpg

I expect the X360 version of Titanfall to look significantly worse than the footage we saw at E3; worse than X.
 
M°°nblade;64716911 said:
Based on the gifs people made? Yes, then it does seem to hold up against Titanfall.

But that's just the magic of low res gifs.


Low res gifs make people fill in the details that may or may not be there in full resolution.


If you look at the trailers of both games in their native resolution, full screen, Titanfall destroys X.
http://www.youtube.com/watch?v=goe6IB1DLZU
http://www.youtube.com/watch?v=atTvHk9CxOM

Even with the eye of the beholder etc ... I can't see how or why people even try to make a contest of this. There is no contest.
It... maybe my eyes are broken, or maybe it's the stylized nature of Titanfall, but I don't see *that* much difference, tbh. I've never made the comparison before, but watching both trailers, the most obvious thing to me is that X has jerkier animations, and probably worse AA.
 
If they had gone with an x86-based CPU for the Wii U (think Llano/Trinity with double the SP count), they could've software-emulated all their past systems and had Xbone performance levels for upcoming games.

Now that they have a tri-core PPC and an integrated hot mess of a GPU, this premise becomes less likely in the future.
Do you really think that? Software emulation with graphical glitches is not something a professional enterprise like Nintendo can offer.
And by the way, the Bobcat design (Jaguar wasn't available a year ago) is MUCH less efficient than the one in the Wii U.
This means more power draw for the same performance or even less, with Nintendo having to build an optimized compiler from scratch and redo all their tools.

It wasn't an option, really.
 
Do you really think that? Software emulation with graphical glitches is not something a professional enterprise like Nintendo can offer.
They would develop the emulators internally, with intimate knowledge of their past hardware. Performance and fidelity could be much better than what you see in enthusiast-developed emulators for other platforms. Worst case, they'd have to optimize on a game-by-game basis and focus on the best sellers first.
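For a sense of what "developing the emulators internally" involves: at its core, every software emulator is a fetch-decode-execute loop over the guest machine's instructions (real ones also JIT-compile hot code blocks to the host ISA for speed, which is where the CPU-performance question above bites). A toy Python sketch with a made-up three-instruction ISA, purely illustrative:

```python
def run(program, regs=None):
    """Interpret a list of (opcode, a, b) tuples over 4 registers.

    A real Broadway/Espresso emulator would decode PowerPC opcodes the
    same way, just with hundreds of instructions and memory/MMIO state.
    """
    regs = regs or [0] * 4
    pc = 0
    while pc < len(program):
        op, a, b = program[pc]
        if op == "li":      # load immediate: regs[a] = b
            regs[a] = b
        elif op == "add":   # regs[a] += regs[b]
            regs[a] += regs[b]
        elif op == "halt":
            break
        pc += 1
    return regs

# li r0, 2; li r1, 40; add r0, r1; halt  ->  r0 ends up as 42
regs = run([("li", 0, 2), ("li", 1, 40), ("add", 0, 1), ("halt", 0, 0)])
```

The per-instruction overhead of a loop like this is exactly why emulating a PPC console on a modest x86 CPU needs either a large clock-speed advantage or a JIT.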

And by the way, the Bobcat design (Jaguar wasn't available a year ago) is MUCH less efficient than the one in the Wii U.
Llano APUs feature the Husky core, based on an optimized K10.5 microarchitecture with much higher per-core performance than any next-gen console (three-issue wide, big fast L2, high clocks).

The A8-3820 crams a 4-core CPU clocked at 2.5 GHz and a 400-SP GPU at 600 MHz into a 65W thermal envelope. The ~120W design I suggested before would facilitate a higher-clocked GPU featuring 768 SPs and a wider 256-bit memory interface (or 128-bit GDDR5). If emulation required it, the CPU could turbo two cores to ~3 GHz, or they could implement translation hardware on the graphics command processor to aid Flipper/Broadway emulation.
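A back-of-the-envelope check on those numbers, assuming the usual 2 FLOPs per SP per clock (one fused multiply-add) for these VLIW-style parts; note the 800 MHz clock for the hypothetical 768-SP part is my own assumption, not from the post:

```python
def gflops(shader_count, clock_mhz, flops_per_sp_per_clock=2):
    """Theoretical peak: SPs x clock x FLOPs per SP per clock (MADD = 2)."""
    return shader_count * clock_mhz * flops_per_sp_per_clock / 1000.0

a8_3820 = gflops(400, 600)   # the stock Llano part mentioned above
proposed = gflops(768, 800)  # hypothetical 768-SP part; 800 MHz is a guess
```

That puts the stock A8-3820 GPU at 480 GFLOPS and the hypothetical part north of 1.2 TFLOPS, which is roughly the Xbone ballpark the post is arguing for. Peak FLOPS of course ignores memory bandwidth, which is why the 256-bit (or GDDR5) interface matters just as much.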

This means more power draw for the same performance or even less, with Nintendo having to build an optimized compiler from scratch and redo all their tools.

It wasn't an option, really.
The first part is irrelevant in the pursuit of performance; of course the upfront cost would be higher across the board. They already had to rework their tools for multi-core support and that Frankenstein of a GPU.

Whether it was an option in the eyes of decision makers at Nintendo, frankly, nobody here can claim. It definitely was an option in a wider sense, and an opportune one, which would have allowed them to close the performance gap and revitalize 3rd-party support.

In my previous post I forgot to mention that they'd have been bringing Xbone-level performance to market a year earlier than the competition. It would've been priced higher, I'm guessing between $399 and $449 to break even, if they insisted on including the tablet controller. By the time the PS4 launched, the production cost would have dropped to profitability levels with the adoption of recent technology like 20 nm DRAM and the freeing up of 32 nm production slots at GloFo as AMD moved to 28 nm.

Meh... just a nice little fantasy that, had it been realized, would have driven me back to being their fan.
 
The A8-3820 crams a 4-core CPU clocked at 2.5 GHz and a 400-SP GPU at 600 MHz into a 65W thermal envelope. The ~120W design I suggested before would facilitate a higher-clocked GPU featuring 768 SPs and a wider 256-bit memory interface (or 128-bit GDDR5). If emulation required it, the CPU could turbo two cores to ~3 GHz, or they could implement translation hardware on the graphics command processor to aid Flipper/Broadway emulation.
It is indeed nice to dream. But I do agree; seeing how they haven't really shown a compelling use for the GamePad yet, I feel it's $100 wasted in costs that could otherwise have been used to bump up the specs. Not up to PS4 level, because I still think it's nice to aim for a smaller, more power-efficient console, but a Llano/Trinity-based chipset would have helped a lot.

An A8-4555 with 2GB of RAM for games and 1GB for whatever the OS needs. Up the flash storage to 64GB, drop the basic model, and add USB 3.0 support. And dual-band WiFi; it doesn't have to be 802.11ac.

That said, I will still buy a Wii U when Smash and Mario Kart arrive. And whenever Metroid decides to show up. Can't miss those.
 
2016? It will still be the same generation, unless you're expecting them to drop Wii U.
I was working off the discussion above me; someone said they expect a "Wii 3" of sorts in 2016 (their personal estimate).

I think using a PPC is blindingly stupid. Their costs are going to be much higher than Sony's and MS's for comparable power. Not only that, but they'll STILL be the only one with a completely different architecture environment.

If they had gone with an x86-based CPU for the Wii U (think Llano/Trinity with double the SP count), they could've software-emulated all their past systems and had Xbone performance levels for upcoming games.

Now that they have a tri-core PPC and an integrated hot mess of a GPU, this premise becomes less likely in the future.
Yep!

Llano/Trinity wouldn't have fit the power envelope. Maybe 2x Bobcat would have been a possibility, but even the 2-core / 80-shader standard variant has a TDP of 18W, which is probably only slightly less than the Wii U's. So there weren't really that many attractive alternatives.
It's either bite the bullet now or deal with the cluster fuck of development later.

I guess they picked their poison.
 
It is indeed nice to dream. But I do agree; seeing how they haven't really shown a compelling use for the GamePad yet, I feel it's $100 wasted in costs that could otherwise have been used to bump up the specs. Not up to PS4 level, because I still think it's nice to aim for a smaller, more power-efficient console, but a Llano/Trinity-based chipset would have helped a lot.

An A8-4555 with 2GB of RAM for games and 1GB for whatever the OS needs. Up the flash storage to 64GB, drop the basic model, and add USB 3.0 support. And dual-band WiFi; it doesn't have to be 802.11ac.

That said, I will still buy a Wii U when Smash and Mario Kart arrive. And whenever Metroid decides to show up. Can't miss those.
Off-TV play is honestly a compelling enough reason for me. The specs I listed would be more than enough to run faithful next-gen ports at 720p, the exact resolution of the GamePad screen.

Imagine playing an intense multiplayer game of The Crew, Plants vs. Zombies or the next Call of Duty while lying in bed, chilling on the terrace or relieving yourself on the can. XD

If the above option was presented to me I'd consistently pick the WiiU multiplats above others, maybe even go WiiU exclusive.
 
It is indeed nice to dream. But I do agree; seeing how they haven't really shown a compelling use for the GamePad yet, I feel it's $100 wasted in costs that could otherwise have been used to bump up the specs.
This isn't really on topic, but I'd like to say I think the GamePad would be great for a Rainbow Six-type game, where you could draw how you wanted your team to move, or be strategic about advancing through hallways to corner a target. Though I wouldn't prefer it, they could of course also release a Pokemon game with the GamePad as a Pokedex, and that would work too. In general, though, I think Nintendo saw all the touch-enabled mobile games and realized they too needed a touchscreen "just in case". I mean, no one really knows what the right direction for consoles is going to be.

In general, though, I think it's odd how people justify cost. For example, I paid about $300 for a Wii U Deluxe and have a great system plus, essentially, a tablet. I often use the GamePad to watch Hulu, Netflix, Amazon, YouTube, surf the web, check email, load up VC games, etc. Yet people tell me I was ripped off! The same people that spend $800 on their second iPad with the only justification being "look how much more comfortable I am checking my email on the toilet".
 
People usually make these comparisons in context: 50% faster than the competition is not the same as 50% faster than last-gen consoles. Random number, but you get the point.

Whoever is saying the PS4 is a generation ahead of the Xbox One because it has a few advantages is wrong. Just like you, I wouldn't consider a 50% difference to be "on par", but it's definitely in the same ballpark and the kind of difference I would expect from systems released within the same generation.
Yeah, 50% faster than the competition is not the same as 50% faster than last-gen consoles if you're discussing how impressed you are by that extra 50%. I get what you're saying; I just don't think that was the context of the discussion. Or maybe it was for some and they just didn't communicate it very well.
 
When I was reading through the "All this talk about our earnings is silly" thread I found this passage really interesting.



Can we expect the successors of the 3DS and Wii U to both use the same basic architecture as one of these two (probably the Wii U's)?
What he's referring to here is that from Wii to Wii U they went from a compartmentalised fixed-function GPU to a fully programmable unified GPU. Two very different animals. The next console will no doubt use something newer again, but it's still going to be a unified shader architecture (just like every modern GPU going forward).

The only thing I can possibly see them reusing for the next console is the CPU. They could perhaps improve it again, put it on a 14nm process, clock it to 2.5 GHz, and include 8 cores. But they could just as easily decide to finally move to an entirely new CPU architecture.
 