ANY new tech is extremely costly to produce. If you're just buying components off the shelf and slotting them in, it's much cheaper.
Isn't that what Nintendo are effectively doing?
The GamePad's streaming tech is Broadcom's take on Miracast. Based on Iwata Asks, it's optimised at the software/firmware level for lower latency and better resistance to interference
The CPU and its architecture are IBM's
AMD customise a GPU from the R700 series
Samsung provide NAND Flash and eMMC memory
Samsung, Micron, and various others provide the DDR3
Wireless networking is provided by a Broadcom 802.11n chip
The NFC chip in the controller is provided by Broadcom
Seems to me the majority of the components in the Wii U are off the shelf. Only the GPU and CPU are not, and they're rather simplistic, dated architectures by modern standards. Neither the CPU nor the GPU is cutting edge in architecture or performance: old architecture, low-end performance. Is it not true that the Wii U's CPU has roughly the transistor count of a single Xenon core? Or that the GPU has, at best, around 400 gigaflops of processing power?
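For what it's worth, that ~400 gigaflop ceiling is just back-of-envelope maths: stream processors × 2 FLOPs per ALU per clock (one multiply-add) × clock speed. A quick sketch using the commonly rumoured figures for the Wii U GPU; the ALU count and clock are assumptions, Nintendo has never confirmed the specs:

```python
# Back-of-envelope peak GFLOPS for an R700-class GPU.
# ALU count and clock are rumoured figures, not confirmed specs.
alus = 320          # assumed stream processors
clock_hz = 550e6    # assumed core clock in Hz
flops_per_alu = 2   # one fused multiply-add = 2 FLOPs per clock

gflops = alus * flops_per_alu * clock_hz / 1e9
print(f"{gflops:.0f} GFLOPS")  # -> 352 GFLOPS, i.e. "around 400" at best
```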
If Nintendo are spending serious coin developing and manufacturing a 400 gigaflop GPU and a multi-core PPC 750 CPU, they're fools. Everything I've seen suggests a budget AMD APU, $80 RRP, could outdo the two of them without effort.
As for the Miracast software, I can't imagine Nintendo spending tens of millions adapting Broadcom's Miracast tech to the Wii U.
but the result would be what you get with most phones: a service that soon slows down and stops working after a couple of years. Don't forget that the Wii U is also running two screens at once, and as with a game like ZombiU it does it pretty well (even for a launch title).
Based on the state of the Wii U's OS, it's already stopped working a few times for me.
As for the OS being less developed than the others at launch, I'd say they're comparable even today. Given that there have been 7 years to work on them, the Xbox 360 and PS3 OSes aren't good at all. The most annoying thing about the Wii U OS is the giant "Please wait" message and the sound it plays; without that, its problems would be much less noticeable. It still needs to be fixed though.
I don't agree with this.
You're heavily underestimating those two factors. Also, anything that increases the size of a chip considerably increases manufacturing costs, as it reduces the yield: fewer dies fit on a wafer, and each individual die has a much higher probability of coming out dead. Removing the eDRAM would probably cut the manufacturing cost in half.
Not sure I understand.
Are you saying Nintendo invested significant money into the fabrication setup for the CPU? That because the Wii U's CPU is unlike any other PPC 750 ever produced (multi-core, large cache, high clock speed), a whole new fab process had to be set up, and that this setup would have cost Nintendo a lot of money? How much money are we talking?
The size of the Wii U's CPU is nothing. Compared to Xenon at 45nm, isn't it roughly a third of the size, and thus a third of the transistor count?
As for the GPU, I understand the eDRAM would cost a bit extra. But surely we're talking dollars per chip at most here?
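One way to sanity-check the yield argument is a crude dies-per-wafer calculation with a simple Poisson yield model. This is a sketch only; the die areas and defect density below are illustrative assumptions, not actual figures for the Wii U GPU or its fab:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Crude estimate: wafer area / die area, ignoring edge loss and scribe lines.
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return wafer_area / die_area_mm2

def poisson_yield(die_area_mm2, defects_per_cm2=0.5):
    # Simple Poisson yield model: Y = exp(-area * defect_density).
    return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

# Hypothetical GPU die with and without the extra eDRAM area.
for area in (100, 150):
    good = dies_per_wafer(area) * poisson_yield(area)
    print(f"{area} mm2: ~{good:.0f} good dies per 300mm wafer")
# -> ~429 good dies at 100 mm2 vs ~223 at 150 mm2
```

Under those assumptions, adding 50% more die area roughly halves the number of good dies per wafer, which is the kind of effect the "cut the cost in half" claim is pointing at. Whether the real numbers land anywhere near that, I can't say.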
Overall though, wisppel, I'm not really buying the argument that the Wii U hardware costs a lot more than first appearances would suggest. The tablet controller is criticised for its battery life; well, Nintendo put a very small battery in it. Why'd they do that? Money is the only reason I can see.

2GB of DDR3 on a 64-bit bus: seriously, it would have been a matter of dollars to go to a 256-bit bus and, say, 4GB. The eMMC flash memory in the system is the same low-end, slow read/write stuff used in iPads and smart devices. Cheapest of the cheap available on the market for the most part; from what I can find in their product listings, Samsung don't manufacture flash memory any slower than it. The majority of the Wii U's chips, CPU and GPU aside, are off the shelf, and all are priced in the single-digit-dollar range.
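On the DDR3 point, peak bandwidth is just bus width × transfer rate, so the gap between the two configurations is easy to put numbers on. A quick sketch, assuming DDR3-1600 (the rate commonly reported for the Wii U; the 256-bit bus is the hypothetical upgrade, not anything Nintendo shipped):

```python
def ddr3_bandwidth_gb_s(bus_width_bits, transfers_mt_s):
    # (bits / 8) bytes per transfer * million transfers/s -> GB/s
    return bus_width_bits / 8 * transfers_mt_s / 1000

print(ddr3_bandwidth_gb_s(64, 1600))   # 12.8 GB/s, the reported Wii U setup
print(ddr3_bandwidth_gb_s(256, 1600))  # 51.2 GB/s on the hypothetical 256-bit bus
```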
The impression I'm getting now is that Nintendo made poor hardware choices and basically shot themselves in the foot. Why have they persisted with the PPC 750 CPU architecture? It's a 15-year-old architecture, give or take, and frankly, no matter how modified it is, it's got nothing on modern x86 or IBM CPU architectures. Where else is the 750 used? Nowhere outside of Nintendo's products; IBM dumped it over a decade ago with the last of the Apple iBooks.
The only reason I can see for Nintendo persisting with the PPC 750 is backwards compatibility and a reluctance to embrace and learn a new architecture. Having used the PPC 750 since the Gamecube, sticking with it means they avoid having to reskill and can reuse a lot of the assets and tools they've developed over the years. Nintendo do seem to go out of their way to avoid learning or embracing new architectures: witness their continued use of the PPC 750, fixed-function GPUs, and the same base architecture concept from the Gamecube through to the Wii U. Seems to me they've spent more money adapting their existing architectures and beefing them up for HD gaming, like dicking around making a multi-core PPC 750, than what that money could have bought had Nintendo invested it in the best architectures AMD and IBM could provide.
Spend $100 beefing up a 750 CPU. Result = still pathetically bad performance
Spend $100 buying the best CPU IBM/AMD have available. Result = very good performance, but we'd have to learn a new architecture, develop new tools and assets, upskill and retrain staff, and we'd also lose backwards compatibility