
WiiU technical discussion (serious discussions welcome)

Durante

Member
I don't want to play backseat moderator, but I think this thread wasn't intended to be the place to speculate about the Wii U's fate on the market.
 

deleted

Member
Ok, I think it's about time we tried to put all the known Wii U specification details into their own thread and have a civil discussion.

Hard facts (either publicly disclosed, or a non-public leak which can be vouched for by somebody trustworthy on this very forum):
  • MCM design: GPU+eDRAM die and CPU die on the same substrate.
  • 2 GB of gDDR3 memory @800MHz (DDR3-1600), organized in 4x 4Gb (256Mx16) modules, sitting on a 64bit bus (@800MHz). That gives a net BW of 12800MB/s (12.5GB/s; a quick back-of-envelope check of that figure follows after this list). We can conveniently refer to this pool as 'MEM2'. Currently 1GB of that pool is reserved for the OS.
  • 32 MB of eDRAM of unknown organisation and unknown specs, sitting with the GPU. We can conveniently refer to this pool as 'MEM1'.
  • Tri-core CPU, binary compatible with Gekko/Broadway, featuring 3MB of cache in asymmetric config: 2x 512KB, 1x 2048KB; so far several things indicate CPU cache is implemented via eDRAM itself. Unknown clock, unknown architecture enhancements (e.g. SIMD, etc).
  • AMD R700-originating GPU (R700 is AMD architecture codename 'Wekiva'), evolved into its own architecture (AMD architecture codename 'Mario'), relying on MEM1 for framebuffer purposes, but also for local render targets and scratch-pad purposes.
  • Memory access specifics: both MEM1 and MEM2 are read/write accessible by the CPU, both subject to caching. GPU in its turn also has access to both pools, and is likely serving as the north bridge in the system (an educated guess, subject to calling out).
  • System is equipped with extra co-processors in the shape of a dual-core ARM (unknown architecture) and a DSP core (again of unknown architecture), primarily for sound workloads.
  • BluRay-based optical drive, 22.5MB/s, 25GB media.
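
For anyone who wants to sanity-check the MEM2 bandwidth figure above, here's a minimal back-of-envelope sketch (my own illustration, assuming only the 64-bit bus and DDR3-1600 effective rate from the bullet point):

```c
/* Back-of-envelope check of the MEM2 bandwidth figure quoted above.
 * Assumptions (from the bullet point): 64-bit bus, DDR3-1600 effective rate. */
#include <stdio.h>

int main(void)
{
    const double bus_bytes   = 64.0 / 8.0;            /* 8 bytes per transfer          */
    const double transfers_m = 1600.0;                /* mega-transfers/s (800MHz DDR) */
    const double bw_mb_s     = bus_bytes * transfers_m; /* 12800 MB/s                  */

    printf("MEM2 peak bandwidth: %.0f MB/s (%.1f GB/s)\n",
           bw_mb_s, bw_mb_s / 1024.0);                 /* ~12.5 GB/s                    */
    return 0;
}
```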

Immediate logical implications from the above (i.e. implications not requiring large leaps of logic):
  • Not all WiiU CPU cores are equal - one of them is meant to do things the other two are not. Whether that is related to BC, OS tasks, or both, is unclear.
  • When it comes to non-local GPU assets (read: mainly textures), WiiU's main RAM BW increase over Nintendo's most advanced SD platform (i.e. 5.48GB/s -> 12.5GB/s) hints that WiiU is mainly targeted at a 2x-3x resolution increase over the Wii, or IOW, 480p -> 720p (rough numbers in the sketch after this list).
  • The shared access to the MEM1 pool by the GPU and CPU alike indicates the two units are meant to interact at low latency, something not normally seen in previous console generations. Definitely a subject for interesting debates, this one.
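
To put rough numbers on the resolution argument above, here's a small sketch (my own illustration, assuming typical 640x480 and 1280x720 framebuffers; the bandwidth figures are the ones from the bullet points):

```c
/* Rough numbers behind the "2x-3x resolution increase" argument above.
 * Assumptions: typical 640x480 (Wii) and 1280x720 (Wii U) framebuffers;
 * bandwidth figures taken from the bullet points (5.48 and 12.5 GB/s). */
#include <stdio.h>

int main(void)
{
    const double wii_pixels  = 640.0 * 480.0;   /*   307,200 pixels   */
    const double wiiu_pixels = 1280.0 * 720.0;  /*   921,600 pixels   */
    const double wii_bw      = 5.48;            /* GB/s, Wii main RAM */
    const double wiiu_bw     = 12.5;            /* GB/s, Wii U MEM2   */

    printf("pixel count ratio: %.2fx\n", wiiu_pixels / wii_pixels); /* 3.00x */
    printf("bandwidth ratio:   %.2fx\n", wiiu_bw / wii_bw);         /* 2.28x */
    return 0;
}
```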

So what are the other 2 for? Do we have 1 main CPU like the PS3 but only with 2 SPUs?
Do PS3 and 360 have a separate audio chip like the Wii-U, or could some of the CPU trouble come from forcing the audio processing through the CPU too?

I'm not technically adept at all, so bear with me if these questions are stupid :p
 

efyu_lemonardo

May I have a cookie?
So what are the other 2 for? Do we have 1 main CPU like the PS3 but only with 2 SPUs?
Do PS3 and 360 have a separate audio chip like the Wii-U, or could some of the CPU trouble come from forcing the audio processing through the CPU too?

AFAIK the Wii U's setup is unique in that it has a dedicated audio DSP and an ARM co-processor.

Regarding the triple core CPU, I assume he meant we can deduce the cores are asymmetrical due to one having significantly more cache than the other two. (2048 KB vs. 512 KB x2)

edit: If I had to guess I'd say the dual-core ARM performs a function similar to that of the Wii's ARM co-processor, nicknamed Starlet by the Homebrew community. By this I mean software encryption and authentication, some wireless functionality and other I/O (USB, optical drive, Wii-specific elements needed for BC). It also possibly handles other background tasks such as downloading updates, managing friend requests, Miiverse data, incoming video calls, etc., since these things have been shown to occur while playing games.
 

deleted

Member
AFAIK the Wii U's setup is unique in that it has a dedicated audio DSP and an ARM co-processor.

Regarding the triple core CPU, I assume he meant we can deduce the cores are asymmetrical due to one having significantly more cache than the other two. (2048 KB vs. 512 KB x2)

I see. So it could be possible that later ports will address the sound part. Maybe it already is and it doesn't matter at all, performance wise.
 

ozfunghi

Member
I see. So it could be possible that later ports will address the sound part. Maybe it already is and it doesn't matter at all, performance wise.

There were talks that a couple of ports didn't use the DSP for sound, but ran it over the CPU. IIRC, this is a noticeable burden; sound takes up 1/6th of the XB360 CPU.
 

Mlatador

Banned
Good to see some civilised posters in here. Hopefully this thread stays this way. It's definitely a breath of fresh air among the pollution the launch day madness has caused.
 

ozfunghi

Member

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
I'll ask again. Has it been confirmed that the CPU has direct access to the EDRAM?
 
Judging from the 720 rumors (more, slower RAM), would it be fair to say WiiU is designed more like their next-gen model than the 360? I understand this is impossible to definitively answer, but right now Nintendo's design decisions seem curious. Perhaps, as others have mentioned, they're hoping for down-ports of next-gen games. Seems unlikely, but I don't know.

edit - I understand WiiU is technically "next gen", but y'all know what I'm trying to say.
 

mrklaw

MrArseFace
Does anyone have some specific examples of modern engines, and what the likely texture/vertex size would be per frame, so we can get some context around the lack (or not) of RAM bandwidth?
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
CPU has its own EDRAM (3MB): 2MB of it goes to one of its cores, and 512KB to each of the other two cores.

Thanks. So the CPU does not have any access to the 32MB EDRAM pool?
 

deleted

Member
There were talks that a couple of ports didn't use the DSP for sound, but ran it over the CPU. IIRC, this is a noticeable burden; sound takes up 1/6th of the XB360 CPU.

So maybe some of the choppier games owe their low FPS to brute-forcing the 360 architecture onto the WiiU, which obviously doesn't work.
Sounds promising for the future, when dev teams are more familiar with the hardware and are able to use it the way it was intended.
 
Judging from the 720 rumors (more, slower RAM), would it be fair to say WiiU is designed more like their next-gen model than the 360? I understand this is impossible to definitively answer, but right now Nintendo's design decisions seem curious. Perhaps, as others have mentioned, they're hoping for down-ports of next-gen games. Seems unlikely, but I don't know.

edit - I understand WiiU is technically "next gen", but y'all know what I'm trying to say.

You know what, the "Wii HD" rumor first surfaced in 2006. A little after the PS3 launched, there was buzz about a Wii HD in the same power range as what we're seeing now.

I wonder if this is the Wii HD that we would've seen in 2008 or 2009 had those rumors panned out.
 
Seems to me from a technical standpoint that Nintendo is going to continue with its 5-6 year cycles for consoles. We will probably see the next Nintendo console in 2016-2017.
 
Nice OP, blu.

I was toying around with some of the system features of Wii U last night, and revisited an idea that was surely kicked around in one of the WUSTs. Regarding the asymmetric cache split on the CPU, I tend to agree with Shifty Geezer(?) on Beyond3D, who basically said that, looking at the size of the CPU, it wouldn't make sense for the cores themselves to differ beyond cache. It would be an insult for any of the cores to be anything less than fully functional!

So I'm wondering this. We know with a fair amount of certainty that there are ARM cores on the GPU for OS, I/O, and security. So when I am using Netflix (works pretty damn good btw) or the browser, alone, does that all run on the ARM cores? No, I think not. I may have fallen into the trap of thinking that before, but the actual "OS" involves none of those applications. So now I am thinking perhaps "Espresso" has an energy saver mode for BC and non-game applications, and in this mode only Core 1 would be active. Perhaps that amount of cache just for applications is overkill, but since they are using a whole gig of RAM for the same purpose, maybe that is what's going on there with the asymmetric cache.
 

Doc Holliday

SPOILER: Columbus finds America
Honestly I really think Nintendo designed the Wii U around the Japanese market. The PS3 and 360 never took off there, so in some ways maybe the Wii U tech is still fresh in Japan.

I wish the Iwata Asks was more in-depth about the actual specs and why they went that way.
 

wsippel

Banned
If 1GB of RAM is reserved for the OS, then why does it take so damn long to open up menus?
It doesn't actually use the RAM. That would be my guess, at least. That would explain the horrible load times. And the insane amount of RAM they reserved - they simply don't really know how much they'll actually need, so they reserved a ton.
 

Kai Dracon

Writing a dinosaur space opera symphony
Honestly I really think Nintendo designed the Wii U around the Japanese market. The PS3 and 360 never took off there, so in some ways maybe the Wii U tech is still fresh in Japan.

I wish the Iwata Asks was more in-depth about the actual specs and why they went that way.

Could it be telling that, compared to the west, Nintendo's Japanese "lifestyle" videos for Wii U have focused on low-key, single-person scenarios? In smaller apartments, etc.

Aside from power consumption and size, I wonder if a super quiet device is also a big concern for appealing to Japanese customers.
 
Could it be telling that, compared to the west, Nintendo's Japanese "lifestyle" videos for Wii U have focused on low-key, single-person scenarios? In smaller apartments, etc.

Aside from power consumption and size, I wonder if a super quiet device is also a big concern for appealing to Japanese customers.

A less noisy console means less TV volume necessary to hear over the whoosh, which leads to happier neighbors? Could be...
 
Honestly I really think Nintendo designed the Wii U around the Japanese market. The PS3 and 360 never took off there, so in some ways maybe the Wii U tech is still fresh in Japan.

I wish the Iwata Asks was more in-depth about the actual specs and why they went that way.

Why would you go mainly for the smallest of the 3 markets though? It's not like the US doesn't buy Nintendo products. You do have a point, I'm just wondering why. It has impressive power consumption which is a non-issue with cheap electricity in the US but more of a selling point in Japan.
 

KageMaru

Member
Talking about the eDRAM, can we agree that its use is basically required just to achieve parity with PS3/360? Because if so, then I can't imagine that being a very good thing for developer support. In the end it boils down to manual caching / memory management, which was very unpopular on PS3. Clearly it's easier to deal with a single 32 MB buffer than six 256 kB buffers, but the central idea is still that of a user-managed scratchpad memory.

Now, personally, I love that idea and the associated programming challenges and opportunities (and loved it back in Cell, and when doing GPGPU), but I wonder if the general game developer mindset has changed enough to make it viable as a central pillar of a system's design.

I wish we had the exact bandwidth/latency of the EDRAM to GPU and CPU.

Well IMO devs are going to have to get used to it if the next Xbox also has a small pool of fast memory for a buffer.
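
To make the "user-managed scratchpad" idea from the quoted post a bit more concrete, here's a minimal sketch of the programming model (everything here is a hypothetical illustration, not a real Wii U / GX2 API): you explicitly stage data from the slow main pool into the small fast pool, do the bandwidth-heavy work there, and copy the results back out.

```c
/* Minimal sketch of the "user-managed scratchpad" programming model from the
 * quoted post. Everything here is hypothetical illustration, NOT a real
 * Wii U / GX2 API: data is staged from slow main memory (think MEM2) into a
 * small fast pool (think MEM1), processed there, then written back out. */
#include <stddef.h>
#include <string.h>

#define SCRATCH_SIZE (32u * 1024u * 1024u)       /* pretend 32 MB fast pool   */

static unsigned char scratch[SCRATCH_SIZE];      /* stand-in for the eDRAM    */

/* Process one tile of a larger buffer that lives in slow main memory. */
void process_tile(unsigned char *main_mem_tile, size_t tile_bytes)
{
    if (tile_bytes > SCRATCH_SIZE)
        return;                                   /* tile must fit in the pool */

    memcpy(scratch, main_mem_tile, tile_bytes);   /* stage in  (slow -> fast)  */

    for (size_t i = 0; i < tile_bytes; ++i)       /* bandwidth-heavy work done */
        scratch[i] = (unsigned char)(255 - scratch[i]); /* in the fast pool    */

    memcpy(main_mem_tile, scratch, tile_bytes);   /* stage out (fast -> slow)  */
}
```

The manual staging is the whole point (and the whole burden): the hardware doesn't move data into the fast pool for you the way a cache would.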

I think his point was that games targeting 1080p on the big consoles could be ported to run at 720p on Wii U.

I think it's not a bad idea in general, and I made posts to that effect a while ago. The issues are that
- it assumes developers will be going for 1080p on those consoles
- it only helps in scaling graphics, which are pretty easy to scale in the first place. General purpose code is much harder to scale, and I doubt the Wii U CPU (and its bandwidth) will help there vis-a-vis PS4/720 (Even though the latter will probably also feature disappointing CPUs, at least IMHO)

If you're right, I apologize for the misunderstanding and agree with you entirely. Thing is, I imagine the most graphically intensive games next gen won't be 1080p, but 720p, and that leaves less room for scalability for the Wii-U, on top of the issue with general-purpose code.

I'm taking a wait-and-see approach on the CPUs in the other two systems. I find it hard to believe that they will just stick stock Jaguar cores in their system.

I think the Wii U will be more successful with this level of power than most people think. I think it is ingenious actually.

Think about it. The next generation is going to come out. Costs will explode, but profits will lag for quite a long period of time. Wii U will be the safe bet. It extends the life of the current generation, so everyone who wants to make games at the current level of power can do so, much more cheaply and at a higher profit, while at the same time not making games for an old, dead system. I can see a lot of smaller Japanese developers who absolutely don't want to absorb next-gen costs using Wii U as a safe haven.

The way I see it playing out is that, other than the higher-end next-gen games, most games will be cross-generational, scaling per platform (similar to what we saw with BF3). After a few years, when engines are optimized and the install base of the two bigger platforms has grown, last gen will slowly be left behind. I don't see budgets exploding like they did this gen, for multiple reasons.

This really is no different than the last few generations. A developer told me at the beginning of this generation that the PSone was the most profitable platform a few years into the DC/PS2/Xbox/GC generation, and he expected the same with the PS2 this gen. Only now we don't have one dominant platform, so support will be spread out across multiple current-gen systems, and the Wii-U will benefit because of this IMO.

Judging from the 720 rumors (more, slower RAM), would it be fair to say WiiU is designed more like their next-gen model than the 360? I understand this is impossible to definitively answer, but right now Nintendo's design decisions seem curious. Perhaps, as others have mentioned, they're hoping for down-ports of next-gen games. Seems unlikely, but I don't know.

edit - I understand WiiU is technically "next gen", but y'all know what I'm trying to say.

Even if the rumors are true regarding the memory in the next-gen Xbox, it still won't be nearly as slow as the memory in the Wii-U. Plus it's supposed to have a pool of DRAM (sDRAM?) to help compensate for the slower memory.

I'm not entirely sure what Nintendo was thinking with this memory. They probably wanted to get as much of it as possible at the cheapest price possible.

If 1GB of RAM is reserved for the OS, then why does it take so damn long to open up menus?

I think we're seeing the effect of the low bandwidth here. It needs to access the amount reserved for the OS while the game is also trying to access the other 1GB. Unless it's also slow while you're just in the dashboard, in which case that's pretty sad.
 
So Wii U can't connect to my 5GHz WiFi since it's used for the GamePad?

Not an expert but:

The greatest strength of the 5 GHz band is the availability of 23* non-overlapping channels; 20* more channels than what is available in the 2.4 GHz band. Since there is no other wireless technology that “fights” for the radio space, the 23* available non-overlapping channels can provide a possibility for easier planning of an interference-free and stable wireless communication. Another advantage of the 5 GHz band is that the greater number of available channels provides for increased density, which means that more wireless devices can be connected in the same radio environment.

source
 

pulsemyne

Member
Seems like the Hynix memory used is gDDR, like the Samsung memory. This is DDR3 that is specifically tweaked for graphics/desktop.
Also, if the x720 rumours are true, then you will likely see a similar setup to the WiiU's. Considering console makers are desperately trying to keep costs down, using DDR3 is the way to go. It's still far cheaper than GDDR3 and especially GDDR5.
 

NBtoaster

Member
It doesn't actually use the RAM. That would be my guess, at least. That would explain the horrible load times. And the insane amount of RAM they reserved - they simply don't really know how much they'll actually need, so they reserved a ton.

It also has a separate processor for the OS too, right? Maybe that's not so hot.
 

v1oz

Member
I feel like Nintendo needs an American division to design hardware and the OS and let Japan handle game software...
Agreed. In fact I think they should let their partners like ATI and IBM design their hardware instead. Basically give them a certain price point, like £199 full retail and tell them to engineer the best possible hardware for that price point.

I have to quote the Eurogamer review because I agree with them 100%. Because Nintendo mass produce and buy in huge quantities they can source parts far cheaper than most manufacturers.

Eurogamer said:
Bearing in mind the general level of performance we've seen from the £300 Digital Foundry PC - built using off-the-shelf parts - it's a touch disappointing that graphical quality in Wii U shows no generational leap at all over the Xbox 360 or PlayStation 3. The DFPC is in the same ballpark price-point as Wii U, it features superior CPU/GPU power, twice as much RAM and much more storage. The notion that Nintendo could not match or better it in an integrated design bearing in mind the vast buying power it has at its disposal - even factoring in the additional cost of touch-screen GamePad - is disappointing.

In a world where Chinese manufacturers can sell complete Android tablets with capacitive touch-screens for £50, it's safe to say that the Wii U GamePad won't be costing Nintendo too much to construct. That being the case, factoring in the modest processing power on offer, we were firmly of the belief that the platform holder would be targeting a £199/$299 price-point for Wii U. Sadly, it was not to be. From what we've experienced of the hardware and games thus far, the new console definitely feels a bit pricey, bearing in mind the gaming proposition on offer.
 
Seems like the Hynix memory used is gDDR, like the Samsung memory. This is DDR3 that is specifically tweaked for graphics/desktop.
Also, if the x720 rumours are true, then you will likely see a similar setup to the WiiU's. Considering console makers are desperately trying to keep costs down, using DDR3 is the way to go. It's still far cheaper than GDDR3 and especially GDDR5.

No.

The cost difference between DDR3 and GDDR3 is not so insurmountable as to fuck the performance of the systems to that degree. Chill.
 

Eteric Rice

Member
Smaller/indie devs have been producing great-looking games on PS3/360 already this gen, and that will continue next gen. You don't have to spend a fortune to develop a game.

Maybe Nintendo is hoping to get those smaller, indie games? Might be evidenced by free patching and what not.
 