Ok, I think it's about time we tried to put all known Wii U specification things into its own thread and try to have a civil discussion.
Hard facts (either publicly disclosed, or a non-public leak which can be vouched by somebody trustworthy on this very forum):
- MCM design: GPU+eDRAM die and CPU die on the same substrate.
- 2 GB of gDDR3 memory @800MHz (DDR3-1600), organized in 4x 4Gb (256Mx16) modules, sitting on a 64bit bus (@800MHz). That gives a peak BW of 12800 MB/s (12.5 GiB/s). We can conveniently refer to this pool as 'MEM2'. Currently 1GB of that pool is reserved for the OS.
- 32 MB of eDRAM of unknown organisation and unknown specs, sitting with the GPU. We can conveniently refer to this pool as 'MEM1'.
- Tri-core CPU, binary compatible with Gekko/Broadway, featuring 3MB of cache in asymmetric config: 2x 512KB, 1x 2048KB; so far several things indicate CPU cache is implemented via eDRAM itself. Unknown clock, unknown architecture enhancements (e.g. SIMD, etc).
- AMD R700-originating GPU (R700 is AMD architecture codename 'Wekiva'), evolved into its own architecture (AMD architecture codename 'Mario'), relying on MEM1 for framebuffer purposes, but also for local render targets and scratch-pad purposes.
- Memory access specifics: both MEM1 and MEM2 are read/write accessible by the CPU, both subject to caching. GPU in its turn also has access to both pools, and is likely serving as the north bridge in the system (an educated guess, subject to calling out).
- System is equipped with extra co-processors in the shape of a dual-core ARM (unknown architecture) and a DSP core (again of unknown architecture), primarily for sound workloads.
- Blu-ray-based optical drive, 22.5MB/s, 25GB media.
Immediate logical implications from the above (i.e. implications not requiring large leaps of logic):
- Not all WiiU CPU cores are equal - one of them is meant to do things the other two are not. Whether that is related to BC, OS tasks, or both, is unclear.
- When it comes to non-local GPU assets (read: mainly textures), WiiU's main RAM BW increase over Nintendo's most advanced SD platform (i.e. 5.48GB/s -> 12.5GB/s) hints that WiiU is mainly targeted at a 2x-3x resolution increase over the Wii, or IOW, 480p -> 720p.
- The shared access to the MEM1 pool by the GPU and CPU alike indicates the two units are meant to interact at low latency, something not normally seen in previous console generations. Definitely a subject for interesting debate.
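The bandwidth and resolution figures above can be sanity-checked with some quick arithmetic. This is just my own back-of-the-envelope math on the publicly claimed numbers, not any kind of insider data:

```python
# DDR3-1600 on a 64-bit bus: double data rate means 1600 MT/s
# from the 800 MHz clock, and 64 bits = 8 bytes per transfer.
BUS_WIDTH_BITS = 64
DATA_RATE_MT_S = 1600

bandwidth_mb_s = DATA_RATE_MT_S * (BUS_WIDTH_BITS // 8)
print(f"Peak bandwidth: {bandwidth_mb_s} MB/s "
      f"({bandwidth_mb_s / 1024:.1f} GiB/s)")   # 12800 MB/s (12.5 GiB/s)

# Pixel-count ratio behind the 480p -> 720p argument:
sd = 640 * 480      # standard 480p framebuffer
hd = 1280 * 720     # 720p
print(f"720p has {hd / sd:.2f}x the pixels of 480p")  # 3.00x
```

So 720p is exactly 3x the pixels of 480p, which lines up with the ~2.3x jump in main RAM bandwidth landing in the "2x-3x" ballpark the OP describes.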
What happens if a GameCube disc is inserted in Wii mode? Does it reject it or does it actually recognize it?
what happens when you insert one into the revised Wii without BC? i'd assume it works the same way.
So what are the other 2 for? Do we have 1 main CPU like the PS3 but only with 2 SPUs?
Do the PS3 and 360 have a separate audio chip like the Wii U, or could some of the CPU trouble come from forcing audio processing through the CPU too?
Panasonic MN864718 HDMI Controller
Samsung KLM8G2FE3B eMMC 8 GB NAND Flash/Memory Controller
Micron 2LEI2 D9PXV [part number MT41K256M16HA-125] 4 Gb DDR3L SDRAM (4 x 4 Gb for a total of 16 Gb or 2 GB RAM)
http://www.micron.com/~/media/Documents/Products/Data Sheet/DRAM/4Gb_1_35V_DDR3L.pdf
DRH-WUP 811309G31
Fairchild DC4AY
SMC 1224EE402
Samsung K9K8G08U1D 4 Gb (512 MB) NAND Flash
Whatever happened to bgassasin? His optimism now seems to be mostly unfounded.
AFAIK the Wii U's setup is unique in that it has a dedicated audio DSP and an ARM co-processor.
Regarding the triple core CPU, I assume he meant we can deduce the cores are asymmetrical due to one having significantly more cache than the other two. (2048 KB vs. 512 KB x2)
I see. So it could be possible that later ports will address the sound part. Maybe it already is and it doesn't matter at all, performance wise.
Good to see some civilised posters in here. Hopefully this thread stays this way. It's definitely a breath of fresh air among the pollution the launch day madness has caused.
iFixit has a teardown now as well. Anything new in there? I noticed they have RAM chips from Micron in their WiiU.
I'll ask again. Has it been confirmed that the CPU has direct access to the EDRAM?
The CPU has its own eDRAM (3MB): 2MB of it goes to one of its cores, and 512KB to each of the other two cores.
There were talks that a couple of ports didn't use the DSP for sound, but ran it on the CPU. IIRC, this is a noticeable burden; sound takes up 1/6th of the XB360 CPU.
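To give a feel for why software audio is a measurable CPU burden, here is a rough illustration. The voice count and assumptions are hypothetical, not figures from any actual port; the point is just that even the most basic mixing at 48 kHz costs millions of operations per second before any effects or decoding:

```python
# Hypothetical software-mixing workload: each active voice contributes
# one multiply (volume) and one add (accumulate) per output sample.
SAMPLE_RATE = 48_000   # output samples per second
CHANNELS = 2           # stereo
VOICES = 64            # simultaneous sound sources (assumed, illustrative)

ops_per_second = SAMPLE_RATE * CHANNELS * VOICES * 2
print(f"{ops_per_second / 1e6:.1f}M ops/s just for basic mixing")  # 12.3M ops/s
```

Add sample-rate conversion, per-voice filtering, and compressed-format decoding on top of that, and offloading it all to a dedicated DSP starts to look very sensible.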
Thanks. So the CPU does not have any access to the 32MB eDRAM pool?
Judging from the 720 rumors (more, slower RAM), would it be fair to say WiiU is designed more like their next-gen model than the 360? I understand this is impossible to definitively answer, but right now Nintendo's design decisions seem curious. Perhaps, as others have mentioned, they're hoping for down-ports of next-gen games. Seems unlikely, but I don't know.
edit - I understand WiiU is technically "next gen", but y'all know what I'm trying to say.
If 1GB of RAM is reserved for the OS then why does it take so damn long to open up menus?
Software bugs.
Agreed. I don't want to play backseat moderator, but I think this thread wasn't intended to be the place to speculate about the Wii U's fate on the market.
It doesn't actually use the RAM. That would be my guess, at least. That would explain the horrible load times. And the insane amount of RAM they reserved - they simply don't really know how much they'll actually need, so they reserved a ton.
Honestly, I really think Nintendo designed the Wii U around the Japanese market. The PS3 and 360 never took off there, so in some ways maybe the Wii U tech is still fresh in Japan.
I wish the Iwata Asks was more in-depth about the actual specs and why they went that way.
Could it be telling that, compared to the west, Nintendo's Japanese "lifestyle" videos for Wii U have focused on low-key single-person scenarios, in smaller apartments, etc.?
Aside from power consumption and size, I wonder if a super quiet device is also a big concern for appealing to Japanese customers.
Talking about the eDRAM, can we agree that its use is basically required just to achieve parity with PS3/360? Because if so, then I can't imagine that being a very good thing for developer support. In the end it boils down to manual caching / memory management, which was very unpopular on PS3. Clearly it's easier to deal with a single 32 MB buffer than six 256 kB buffers, but the central idea is still that of a user-managed scratchpad memory.
Now, personally, I love that idea and the associated programming challenges and opportunities (and loved it back in Cell, and when doing GPGPU), but I wonder if the general game developer mindset has changed enough to make it viable as a central pillar of a system's design.
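For anyone who hasn't worked with a scratchpad before, the pattern being discussed is roughly this: instead of hardware caching data transparently, the program explicitly stages a tile of a large buffer into fast local memory, computes on it, and writes it back. This is a minimal sketch of the general idea; the sizes and the doubling "kernel" are purely illustrative, not actual Wii U behaviour:

```python
TILE = 1024  # elements staged into the fast local pool per pass (illustrative)

def process_with_scratchpad(big_buffer):
    """Process big_buffer tile by tile through a small 'MEM1'-style staging area."""
    out = []
    for start in range(0, len(big_buffer), TILE):
        tile = big_buffer[start:start + TILE]   # "DMA in" (simulated by a copy)
        tile = [x * 2 for x in tile]            # compute while data is in fast memory
        out.extend(tile)                        # "DMA out" back to main RAM
    return out

print(process_with_scratchpad([0, 1, 2, 3]))  # [0, 2, 4, 6]
```

The programming burden is exactly what the post describes: the developer, not the hardware, decides what lives in the fast pool and when, which is great for peak performance but adds the kind of manual memory management that made the Cell's SPU local stores unpopular.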
I wish we had the exact bandwidth/latency of the EDRAM to GPU and CPU.
I think his point was that games targeting 1080p on the big consoles could be ported to run at 720p on Wii U.
I think it's not a bad idea in general, and I made posts to that effect a while ago. The issues are that
- it assumes developers will be going for 1080p on those consoles
- it only helps in scaling graphics, which are pretty easy to scale in the first place. General purpose code is much harder to scale, and I doubt the Wii U CPU (and its bandwidth) will help there vis-a-vis PS4/720 (even though the latter will probably also feature disappointing CPUs, at least IMHO)
I think the Wii U will be more successful with this level of power than most people think. I think it is ingenious actually.
Think about it. The next generation is going to come out. Costs will explode, but profits will lag for quite a long period of time. Wii U will be the safe bet. It extends the life of the current generation, so everyone who wants to make games at the current level of power can do so, much more cheaply and at a higher profit, while at the same time not making games for an old, dead system. I can see a lot of smaller Japanese developers who absolutely don't want to absorb next-gen costs to use Wii U as a safe haven.
So the Wii U can't connect to my 5GHz WiFi since it's used for the GamePad?
The greatest strength of the 5 GHz band is the availability of 23 non-overlapping channels; 20 more channels than what is available in the 2.4 GHz band. Since there is no other wireless technology that "fights" for the radio space, the 23 available non-overlapping channels can provide a possibility for easier planning of an interference-free and stable wireless communication. Another advantage of the 5 GHz band is that the greater number of available channels provides for increased density, which means that more wireless devices can be connected in the same radio environment.
Agreed. In fact I think they should let their partners like ATI and IBM design their hardware instead. Basically give them a certain price point, like £199 full retail, and tell them to engineer the best possible hardware for that price point. I feel like Nintendo needs an American division to design hardware and the OS and let Japan handle game software...
Eurogamer said: Bearing in mind the general level of performance we've seen from the £300 Digital Foundry PC - built using off-the-shelf parts - it's a touch disappointing that graphical quality in Wii U shows no generational leap at all over the Xbox 360 or PlayStation 3. The DFPC is in the same ballpark price-point as Wii U, it features superior CPU/GPU power, twice as much RAM and much more storage. The notion that Nintendo could not match or better it in an integrated design bearing in mind the vast buying power it has at its disposal - even factoring in the additional cost of touch-screen GamePad - is disappointing.
In a world where Chinese manufacturers can sell complete Android tablets with capacitive touch-screens for £50, it's safe to say that the Wii U GamePad won't be costing Nintendo too much to construct. That being the case, factoring in the modest processing power on offer, we were firmly of the belief that the platform holder would be targeting a £199/$299 price-point for Wii U. Sadly, it was not to be. From what we've experienced of the hardware and games thus far, the new console definitely feels a bit pricey, bearing in mind the gaming proposition on offer.
Seems like the Hynix memory used is gDDR, like the Samsung memory. This is DDR3 that is specifically tweaked for graphics/desktop.
Also, if the x720 rumours are true, then you will likely see a similar setup to the WiiU's. Considering console makers are desperately trying to keep costs down, using DDR3 is the way to go. It's still far cheaper than GDDR3 and especially GDDR5.
Smaller/indie devs have been producing great-looking games on PS3/360 already this gen, and that will continue next gen. You don't have to spend a fortune to develop a game.