Alright, the second thing I wanted to put up here before I head back into exile is about the GPU. It's just a bit of deduction/speculation on my part based on what we know.
Wii U's GPU
I've been thinking about the GPU a bit recently, and in particular what we can infer from the decision to use an R700 series (Radeon HD4xxx) chipset in the development kits. Firstly, we know that the Wii U's GPU is, to some extent, a custom chip. It may well be based around an existing chip, but at the very least it has 32MB of eDRAM onboard, and quite possibly some other extra stuff we don't know about. We also know that it began development in 2009. We can expect that in 2009 and early 2010 Nintendo and AMD settled on the basic specifications for the chip, i.e. the number of SPUs, TMUs and ROPs, the choice between the VLIW5, VLIW4 and GCN architectures, and the intended manufacturing node. Now, sometime in late 2010 or early 2011, when Nintendo were putting together the first dev kits to be sent out to third parties, the GPU quite obviously wasn't ready, so they had to go with one of AMD's off-the-shelf cards as a stand-in, and they chose one from the R700 line (I've heard the HD4830, but I don't know if we've got confirmation of this).
Why did they do this?
We can pretty safely say that whatever GPU ends up in the Wii U, it will be manufactured on a 40nm or smaller process. Why then go with an older 55nm card when there were plenty of 40nm HD5xxx and HD6xxx cards available which could provide pretty much identical performance with a lower power draw? What characteristic does the Wii U's GPU share with the HD4xxx series that it doesn't share with any card in the HD5xxx or HD6xxx lines? There's only one aspect that I can think of:
The HD4xxx series were the only 640 SPU cards available at the time the dev-kits were being put together.
This is actually a fairly sensible reason for putting an R700 series card in the dev kit; Nintendo had settled on a core configuration with 640 SPUs (and perhaps 32 TMUs and 16 ROPs), so an HD4830 would naturally have been the best fit for a development kit. I don't think it would be a stretch to say that this is good evidence for the final GPU being a 640 SPU part.
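To illustrate the deduction, here's a quick sketch filtering AMD's desktop parts of that era by stream processor count. The card list and SPU figures are from memory and only roughly representative of what was on the market, but they show why a 640 SPU target points uniquely at the R700 line:

```python
# Approximate SPU counts for AMD cards available around late 2010 / early 2011.
# (Figures from memory; treat as illustrative, not an exhaustive product list.)
cards = {
    "HD4830 (R700, 55nm)":      640,
    "HD4850 (R700, 55nm)":      800,
    "HD5750 (Evergreen, 40nm)": 720,
    "HD5770 (Evergreen, 40nm)": 800,
    "HD6770 (40nm)":            800,
}

# Which stand-in cards match a hypothetical 640 SPU target configuration?
matches = [name for name, spus in cards.items() if spus == 640]
print(matches)
```

Only the R700-era part fits; every 40nm card either over- or undershoots the SPU count.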
Now comes the real speculation. Early this year, we started to get reports that developers were receiving new development kits with a performance boost over previous kits. That's the sort of thing you'd expect to hear if Nintendo replaced the R700 stand-in card with an early production version of the actual Wii U GPU. This lines up exactly with AMD's new 28nm HD7xxx series coming off the production line, and in particular the HD7770 (Cape Verde), their first 640 SPU part since the HD4xxx series. The HD7770, clocked down to about 600-700MHz, would fit pretty much perfectly into Nintendo's requirements as far as performance, size and heat are concerned.
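As a rough sanity check on that claim, we can work out the peak theoretical throughput of a 640 SPU part at those clocks, assuming the usual 2 FLOPs per stream processor per clock (one fused multiply-add). The clock figures are my speculation from above, not confirmed specs:

```python
def peak_gflops(shader_units, clock_mhz, flops_per_unit_per_clock=2):
    """Peak single-precision GFLOPS: units * clock * FLOPs per clock."""
    return shader_units * clock_mhz * 1e6 * flops_per_unit_per_clock / 1e9

# Hypothetical downclocked 640 SPU part vs the stock HD7770 at 1000MHz.
for clock in (600, 700, 1000):
    print(f"640 SPUs @ {clock} MHz -> {peak_gflops(640, clock):.0f} GFLOPS")
```

That puts the speculated part in the 768-896 GFLOPS range, comfortably above the Xbox 360/PS3 generation while staying within a small console's power and heat budget.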
Nintendo approached AMD in 2009 looking for a reasonably powerful but low-wattage GPU to put in their mid-2012 console. It's not unreasonable to speculate that AMD said "we've got a 640 SPU part on a 28nm process planned for late 2011, how about we customise something around that?". It explains why they went with an HD4xxx card in the dev kits, it explains why the dev kit power boost came when it did, and it fits very neatly with what we've heard about performance and power consumption.
And to the inevitable "Nintendo would never do 28nm" responses, keep in mind that Nintendo have always used the smallest available node in manufacturing their hardware, right back to the 350nm chips in the N64. Also, this would have been decided back in 2009/2010, when it would have been reasonable to expect the 28nm node to be ready for a reasonably priced console in 2012. In fact, the push back of the release date from the summer could well be due in part to a desire to wait until yields on 28nm chips improve.
We also have to consider whether NEC (now Renesas), who manufactured the GameCube and Wii GPUs and can be expected to be first in line to manufacture the Wii U GPU, are capable of manufacturing at 28nm. As it happens, NEC announced a deal back in 2009 (when Nintendo would have been making the decision) with none other than IBM, to manufacture 28nm chips at East Fishkill, New York, in the very same facility in which the Wii U's CPU is being manufactured. How's that for a coincidence?