*puts on fanboy hat*
There is still a chance that via a future firmware upgrade, Nintendo could up the CPU/GPU clocks somewhat, similar to what Sony did with the PSP (didn't Nintendo also overclock the 3DS post-launch?). Or maybe there is some silicon that hasn't been exercised in some games yet. Or maybe the disc drive isn't spinning at full speed yet (or ever will, due to noise/durability concerns?). Who knows... just believe!
No, no, no. Supposedly what they've done is free up more RAM from the OS and open access to the second core, previously reserved for OS and networking tasks; as much as 25% of it, that is.
It's unlikely that Nintendo would overclock the console (they could overclock the 3DS quite a bit, but they certainly won't, for battery reasons), but the further a game or piece of software pushes the hardware, the bigger the power draw will be, so the 33W measured with New Super Mario Bros. U can probably be topped.
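To make that utilization point concrete, here's a throwaway sketch (Python; every number is invented for illustration, none of them are measured Wii U figures) of the usual first-order model: at a fixed clock and voltage, total draw is roughly a static floor plus a dynamic part that scales with how much of the silicon a game keeps busy.

```python
# Throwaway illustration: at fixed clock/voltage, draw ~= static floor + activity-scaled
# dynamic part. All numbers are invented placeholders, not measured Wii U figures.

def estimated_draw_watts(static_w, dynamic_max_w, activity):
    """First-order model: activity is the fraction of the chip kept toggling (0.0-1.0)."""
    return static_w + activity * dynamic_max_w

STATIC_W      = 28.0   # assumed floor: console on, disc spinning, light load
DYNAMIC_MAX_W = 12.0   # assumed extra headroom if every functional block is exercised

for activity, label in [(0.4, "low-end launch title"), (0.95, "title hammering every block")]:
    watts = estimated_draw_watts(STATIC_W, DYNAMIC_MAX_W, activity)
    print(f"{label}: ~{watts:.0f} W")
```

With those made-up numbers you happen to land near the 33W reading and a sub-40W ceiling, but the point is only the shape of the relation, not the figures.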
Case in point: past Nvidia Quadro cards that were virtually identical to GeForce cards often had lower clock rates, because they use more of their transistor array, and under full load for long periods no less (and a regular GeForce is humiliated in rendering tasks by them, precisely because those extra transistors are put to use). The clock ceiling had to be lower. In fact, looking at current Nvidia Quadros, that's still true: the GK107GLM, a Quadro part, tops out at 706 MHz, while the regular GK107 is the GeForce GT 640, sold in 797 MHz (DDR3 RAM), 900 MHz and 950 MHz (GDDR5 RAM) configurations. The lowest is due to RAM bandwidth constraints (and punishment for going cheaper, I'm sure) and the others are attainable clocks for gaming; the Quadro version, though...
The clock is just low, but low for reliability's sake.
Also, there's the old talk about Carmack advising against factory-overclocked cards for Doom 3: since the game was cutting edge and using shit no one else did (more transistors, as he said), the card would heat up more, and if it was overclocked too far it could fry or turn into an artifact fest.
So, if this is a really custom part and, say, it has stuff like Pikmin 3's HLSL blur hardwired along with other effects, plus lots of things to exploit at a low level, then power consumption is bound to be affected by whether those paths are being exploited or not, and how many simultaneously. Same with the VLIW5 example: AMD dropped it for VLIW4 not because it wasn't efficient, but because, it being 5-way, the extra slot wasn't really being taken advantage of, so they were wasting transistors better spent on more stream processors with only 4-way capability.
So, just going from that, if the very same VLIW5 GPU sat in a console and in a PC, the optimization and the closed nature of the console could produce a very palpable difference in power draw and heat.
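As a rough sketch of why the slot count matters (Python; the packing average is hypothetical, not from any real shader trace): if the compiler only finds, say, 3.5 independent ops per instruction on average, a 5-wide unit leaves more of its ALUs idle than a 4-wide one, and those idle slots are exactly the transistors AMD decided were better spent elsewhere.

```python
# Hypothetical illustration of VLIW slot utilization. The average number of independent
# ops the compiler can pack per instruction (AVG_PACKED_OPS) is invented, not a real trace.

def alu_utilization(avg_packed_ops, slots_per_unit):
    """Fraction of a VLIW unit's ALU slots doing useful work per issued instruction."""
    return min(avg_packed_ops, slots_per_unit) / slots_per_unit

AVG_PACKED_OPS = 3.5  # assumed average instruction-level parallelism the compiler finds

for slots, name in [(5, "VLIW5"), (4, "VLIW4")]:
    print(f"{name}: {alu_utilization(AVG_PACKED_OPS, slots):.0%} of slots busy")
```

On a closed console, where developers can hand-schedule shaders against one known part, that packing average can be pushed up, which is exactly how the same chip ends up both busier and hotter there than in a PC.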
Don't get me wrong, I'm sure it can be overclocked (perhaps quite a bit), and full draw from the GPU might not amount to much of a difference (pretty sure it won't go past 40W, excluding the power fed to peripherals), but it's not something you can conclude universally from measuring a first-generation, technologically low-end game; and it's not just the GPU, I doubt they had to use more than one CPU core.