
WiiU technical discussion (serious discussions welcome)

It also has a separate processor for the OS, right? Maybe that's not so hot.

My guess is that they are using a Cortex-A5. It's the smallest, most energy-efficient processor in its class, and it supports SMP. The only question is whether the chip would need to be binary compatible with the ARM926 core in the Wii, or whether "application compatibility" would be enough. The Cortex-A8 is fully binary compatible with the ARM926 but doesn't support SMP, so I figure it would be disqualified if wsippel's info is accurate.
 

Lonely1

Unconfirmed Member
Thanks. So the CPU does not have any access to the 32MB eDRAM pool?

In the leaked documents, supposedly taken from the developer website, the 32MB eDRAM is referred to as MEM1, while the 2GB of RAM is MEM2. That's all we have at the moment, and it suggests, along with previous Nintendo designs, that the CPU does have access. As far as I know, those CPU memory pools are regular L2 CPU cache. But I could be wrong :)
 

gofreak

GAF's Bob Woodward
My guess is that they are using a Cortex-A5. It's the smallest, most energy-efficient processor in its class, and it supports SMP. The only question is whether the chip would need to be binary compatible with the ARM926 core in the Wii, or whether "application compatibility" would be enough. The Cortex-A8 is fully binary compatible with the ARM926 but doesn't support SMP, so I figure it would be disqualified if wsippel's info is accurate.

Did this supposed ARM chip show up in any teardowns? I glanced over the iFixit one but didn't see any mention of it.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
In the leaked documents, supposedly taken from the developer website, the 32MB eDRAM is referred to as MEM1, while the 2GB of RAM is MEM2. That's all we have at the moment, and it suggests, along with previous Nintendo designs, that the CPU does have access. As far as I know, those CPU memory pools are regular L2 CPU cache. But I could be wrong :)

Okay. I guess we're going to have to wait until more information becomes available.
It's going to be interesting to see how these pools are connected and how that helps developers overcome inherent bottlenecks.
 
I have a serious question for buyers: looking at that AnandTech benchmark, would you have gladly sacrificed backwards compatibility for a better CPU (and fewer bottlenecks)? I can't help but think they would have been better off going with even an ARM-based CPU if power efficiency was their top concern. Do you feel this was a valid tradeoff?

Yes, since I could use my PC for better BC. I would take a better CPU and faster RAM of the same capacity if it meant losing BC.
 

mrklaw

MrArseFace
A separate processor for the OS would only be needed for background activities which need to run in parallel with games - e.g. background downloads, notifications, etc.

Foreground OS - the dashboard, Miiverse, etc. - could and should be using the main CPU, because at that point the game is suspended in the background. The huge RAM reservation for the OS should mean there's no need to reload stuff from flash either, so the OS should be instantaneous.

A couple of things might be slowing it down:
1) They're using the ARM for the main dash too - but that would be ridiculous, and they'd need the GPU anyway to draw stuff to the screen.
2) They're pulling the OS out of flash into main memory every time you switch an app.
 

wsippel

Banned
I feel like Nintendo needs an American division to design hardware and the OS and let Japan handle game software...
This division already exists: Nintendo Technology Development.


It also has a separate processor for the OS, right? Maybe that's not so hot.
The CPU wouldn't explain the insane load times. Once it's up and running, it's pretty responsive. So no, the CPU isn't the culprit. And I think the ARM processor handles background stuff, not the frontends.
 

efyu_lemonardo

May I have a cookie?
Did this supposed ARM chip show up in any teardowns? I glanced over the iFixit one but didn't see any mention of it.

They wouldn't show up in a simple teardown because they're literally built into the GPU, the same way the ARM926 was built into Hollywood.

[Image: Wii_hw_diagram.png]
 

Lonely1

Unconfirmed Member
Insane load times are easily explained by non-optimized code. It has happened many times before and will happen many times again.
 

v1oz

Member
This division already exists: Nintendo Technology Development.
NTD seems to be more of a software outfit. I believe they are working on Nintendo TV. But they helped with the compression tech for the controller. And going by the recent "Iwata Asks", overall hardware design is dictated from Japan.
 

mrklaw

MrArseFace
Didn't the 360 get slightly burned when deferred rendering became popular? Something about the architecture and eDRAM being set up for more traditional rendering.

Are there likely to be new emerging rendering techniques during the next 5 years, or is deferred rendering here to stay? And if it's staying, are there architectural choices you can make in a console design to optimise around that?
 

wsippel

Banned
Insane load times are easily explained by non-optimized code. It has happened many times before and will happen many times again.
Yes and no. A lack of optimization only explains the terrible initial load times. Once an app is in RAM, switching back to it should be a mere task switch. But that isn't what's happening here. Apps completely reload and re-initialize every time you access them.
 

Lonely1

Unconfirmed Member
Yes and no. A lack of optimization only explains the terrible initial load times. Once an app is in RAM, switching back to it should be a mere task switch. But that isn't what's happening here. Apps completely reload and re-initialize every time you access them.

Well, that sounds like unoptimized (or badly coded!?) software to me.
 

wsippel

Banned
NTD seems to be more of a software outfit. I believe they are working on Nintendo TV. But they helped with the compression tech for the controller. And going by the recent "Iwata Asks", overall hardware design is dictated from Japan.
NTD does some software, but mostly hardware. TVii was developed by an external company. NTD coordinates some international operations. NST is Nintendo's main software outfit in the US.


Well, that sounds like unoptimized (or badly coded!?) software to me.
Right. But it also means that the OS doesn't actually use the reserved RAM. Not all of it, not even close.
 

ozfunghi

Member
Didn't the 360 get slightly burned when deferred rendering became popular? Something about the architecture and eDRAM being set up for more traditional rendering.

Are there likely to be new emerging rendering techniques during the next 5 years, or is deferred rendering here to stay? And if it's staying, are there architectural choices you can make in a console design to optimise around that?

I seem to remember a lengthy interview with a developer on Aliens: CM, talking about deferred rendering on the Wii U.
 

v1oz

Member
Insane load times are easily explained by non-optimized code. It has happened many times before and will happen many times again.
Even first-party games like the 3.5GB Mario have long load times, which is concerning because this isn't their first disc-based system. But I'm more worried about the slow OS and system crashes.
 

Durante

Member
I agree with the general sentiment that the long OS load times are probably mostly unrelated to the hardware. Even filling in the worst realistic specs for everything we don't know in the system, it should still be easily capable of pulling up those apps far more quickly. It's a software issue.

Of course faster hardware would mitigate the issue, but it shouldn't be necessary.
 

Dennis

Banned
This is seriously disappointing.

I was well aware that the Wii U was never intended to be Nintendo's attempt at ousting the next-gen PlayStation and Xbox, but goddammit... this is some dream-killing shit.
 

AzaK

Member
He conveniently bailed on forums when more news of Wii U's capabilities started to surface.

I was out walking one night and a black clad figure leapt out of the shadows. As he grabbed me, I was sure this was the end, but instead he slipped me a note and disappeared into the darkness.

"None of the info released has changed my views/opinions on the console's capabilities. My optimism has not changed at all for what Wii U should be capable of"
 

AzaK

Member
Yes and no. A lack of optimization only explains the terrible initial load times. Once an app is in RAM, switching back to it should be a mere task switch. But that isn't what's happening here. Apps completely reload and re-initialize every time you access them.

Yeah, it's like the OS or lower-level filesystem is not caching loaded modules. Even then, though, look at what the main menu does. It does almost nothing, yet it still takes 20 seconds to get to after closing an app. It makes no sense in a sensible system. I am wondering about one of a few things:

1) They have debug builds there, which will be much larger and slower.
2) They are compressing these things and therefore have to decompress them when loading.
3) The flash RAM is so shit it's down to crappy USB keyring speeds.
4) All of the above.

I can't see how a home screen that essentially looks like a 3DS home screen takes that long to load. How fast is booting the machine? The same speed as returning to the home menu?


Relating to this, can someone please send me a link or share information on how modern engines manage their resources? I'm talking about texture loading, normal maps, use of eDRAM, main RAM, shadows, and how they load, cache, throw away, etc. I would love to contribute to this thread or at least understand what others are talking about, but I'm not a 3D engine programmer. My understanding is that whilst eDRAM can help offset the bottleneck of the slow main RAM, it's tiny compared to that main RAM. It can hold a framebuffer (or four for FSAA?) and maybe a few textures, but I can't see how that can help offset the texture requirements for a game with wide open spaces, varied textures, shadows and fancy shit.

Any help in understanding this is appreciated.
 

Teletraan1

Banned
Agreed. In fact, I think they should let partners like ATI and IBM design their hardware instead. Basically, give them a certain price point, like £199 full retail, and tell them to engineer the best possible hardware for that price.

I have to quote the Eurogamer review because I agree with it 100%: because Nintendo mass-produce and buy in huge quantities, they can source parts far more cheaply than most manufacturers.

I agree as well. Using a more PC-like setup, as Durango/Orbis are supposedly doing, probably would have been more popular with third parties and yielded better results in games all around. I thought the days of exotic console hardware and memory configurations were behind us.
 

v1oz

Member
Didn't the 360 get slightly burned when deferred rendering became popular? Something about the architecture and eDRAM being set up for more traditional rendering.

Are there likely to be new emerging rendering techniques during the next 5 years, or is deferred rendering here to stay? And if it's staying, are there architectural choices you can make in a console design to optimise around that?
Yes, chiefly because you only had 10MB of eDRAM on the 360. Deferred rendering is very memory-intensive. But The Witcher 2 has fully deferred rendering and AA and runs amazingly well on the 360. The Wii U's large pool of eDRAM will be great for it.
 

mrklaw

MrArseFace
Relating to this, can someone please send me a link or share information on how modern engines manage their resources? I'm talking about texture loading, normal maps, use of eDRAM, main RAM, shadows, and how they load, cache, throw away, etc. I would love to contribute to this thread or at least understand what others are talking about, but I'm not a 3D engine programmer. My understanding is that whilst eDRAM can help offset the bottleneck of the slow main RAM, it's tiny compared to that main RAM. It can hold a framebuffer (or four for FSAA?) and maybe a few textures, but I can't see how that can help offset the texture requirements for a game with wide open spaces, varied textures, shadows and fancy shit.

Any help in understanding this is appreciated.


There was a neat little article about calculating the desired memory size for a streaming engine based on memory speed etc., but I can't find it at the moment. There must be a bunch of postmortems that go into that kind of detail - I'd like to see them too.
 

Durante

Member
I read elsewhere that the Wii U currently doesn't support full-range RGB output; is that true?
Also, a tangentially related question: does the OS run at 1080p?

Really, where are all the full-res framebuffer grabs?
 

DSix

Banned
I seem to remember a lengthy interview with a developer on Aliens: CM, talking about deferred rendering on the Wii U.

Actually, the bigger eDRAM goes a long way toward facilitating that.
The main bottleneck for deferred rendering on the 360 is the very limited amount of eDRAM, which is very useful for all those separate render targets. 32MB could be pretty good for deferred 720p games.
 

Ein Bear

Member
Why exactly does the Wii U OS take up such a massive amount of memory anyway? Is it the pad streaming stuff?

I'm not very techy, so if someone could give an explanation that would be great. It just seems mental to me that the Wii U devotes a full 1GB to its OS, when the PS3's only uses 50MB. Seems like such a waste of resources that could be better spent on games.
 

Durante

Member
Why exactly does the Wii U OS take up such a massive amount of memory anyway? Is it the pad streaming stuff?
Not really. Streaming generally wouldn't use nearly that much memory, and what it uses is mostly in buffers. If anything, since the Wii U streaming is optimized for latency, it should use even less memory than normal (fewer buffered frames).

As for your overall question, I don't think anyone really knows at this point why/if they need all this OS memory. I guess full, responsive multitasking is one advantage (that is, as of yet, still to be realized).

Menu runs in 1080p with at least some form of AA; the Miis have no jaggies to them.
Good, the lack of AA on the Vita bubbles annoys me tremendously.
 
I read elsewhere that the Wii U currently doesn't support full-range RGB output; is that true?
Also, a tangentially related question: does the OS run at 1080p?

Really, where are all the full-res framebuffer grabs?

Menu runs in 1080p with at least some form of AA; the Miis have no jaggies to them.
 
I read elsewhere that the Wii U currently doesn't support full-range RGB output; is that true?
Also, a tangentially related question: does the OS run at 1080p?

Really, where are all the full-res framebuffer grabs?

The Eurogamer review mentioned the limited-range problem, and I guess they know what they're talking about (Digital Foundry being part of Eurogamer).
 
So how powerful is this thing? How many Xbox 360s duct-taped together?

The question is more "Do you want to play new Nintendo games?" If so, then yes. If you're more concerned about how powerful it is, it's best to wait until the next ones from Sony and Microsoft are announced (and subsequently released). Then make a determination after reading up on 'em.
 

ozfunghi

Member
I'd really like to know how much the DSP is helping out versus having to use general CPU resources.

Are you asking whether games are using the DSP, or if it were used, how much it would help the CPU out?

Like I mentioned here, from what I recall, 1/6th of the Xbox 360 CPU is being used for sound.
 

Biggzy

Member
Actually, the bigger eDRAM goes a long way toward facilitating that.
The main bottleneck for deferred rendering on the 360 is the very limited amount of eDRAM, which is very useful for all those separate render targets. 32MB could be pretty good for deferred 720p games.

Do we know what the bandwidth of the Wii U's eDRAM is? Because the 360's has a ridiculous amount, and that could prove problematic for BC.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Are you asking whether games are using the DSP, or if it were used, how much it would help the CPU out?

Like I mentioned here, from what I recall, 1/6th of the Xbox 360 CPU is being used for sound.

Maybe for some games. But not most of them.

Also, the Xbox 360 has 3 physical/6 logical cores. There is a fair bit of speculation that the Wii U only has 3 physical/logical cores and is underclocked relative to the Xbox 360.
 

NeOak

Member
Why exactly does the Wii U OS take up such a massive amount of memory anyway? Is it the pad streaming stuff?

I'm not very techy, so if someone could give an explanation that would be great. It just seems mental to me that the Wii U devotes a full 1GB to its OS, when the PS3's only uses 50MB. Seems like such a waste of resources that could be better spent on games.

PS3's OS used about 64MB Main RAM and 32MB Graphics RAM in the beginning.

Xbox 360 OS uses 32MB RAM.

The Wii U OS is ridiculous. I mean, not even Windows...
 

ozfunghi

Member
Maybe for some games. But not most of them.

Also, the Xbox 360 has 3 physical/6 logical cores. There is a fair bit of speculation that the Wii U only has 3 physical/logical cores and is underclocked relative to the Xbox 360.

Yes, most of them, IIRC. A couple even as much as 1/3rd.

Whatever the case, even if the Wii U has fewer logical cores at a lower frequency, using the DSP would make even more of a difference for the Wii U.
 

mrklaw

MrArseFace
I read elsewhere that the Wii U currently doesn't support full-range RGB output; is that true?
Also, a tangentially related question: does the OS run at 1080p?

Really, where are all the full-res framebuffer grabs?

What's wrong with limited-range RGB? Assuming you're using an HDTV, that matches the levels used by HD broadcast and Blu-ray: 16-235, I think, rather than 0-255. If you're calibrated for those, then your console should be set to limited anyway.
 
Asked this in the other thread, but I should've asked here.

Stupid question. Not trying to be annoying.

But is the Vita more powerful than the Wii U? Faster RAM than the Wii U?
 

KageMaru

Member
Didn't the 360 get slightly burned when deferred rendering became popular? Something about the architecture and eDRAM being set up for more traditional rendering.

Are there likely to be new emerging rendering techniques during the next 5 years, or is deferred rendering here to stay? And if it's staying, are there architectural choices you can make in a console design to optimise around that?

The issue was the amount of eDRAM and the lack of CPU muscle to handle tiling in many games. IIRC, eDRAM can be beneficial to deferred rendering, saving memory where the G-buffer would typically be stored.

Yes and no. A lack of optimization only explains the terrible initial load times. Once an app is in RAM, switching back to it should be a mere task switch. But that isn't what's happening here. Apps completely reload and re-initialize every time you access them.

That's interesting, thanks. I guess nevermind on my prior post.
 
Asked this in the other thread, but I should've asked here.

Stupid question. Not trying to be annoying.

But is the Vita more powerful than the Wii U? Faster RAM than the Wii U?

Not possible. The Vita is not even close to the 360 or PS3 in power (in some games the gap is minimized by the screen resolution, but in open-world games like NFS you can clearly see it's still far behind the consoles). I don't know if the Wii U is more, the same, or even a little less powerful than those consoles, but it is definitely much more powerful than the Vita.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Yes, most of them, IIRC. A couple even as much as 1/3rd.

Whatever the case, even if the Wii U has fewer logical cores at a lower frequency, using the DSP would make even more of a difference for the Wii U.

This again.

An example: the audio for Killzone 3 took less than 3% of the CPU budget. Audio does not take a large part of a game's CPU budget.
 

Ein Bear

Member
PS3's OS used about 64MB Main RAM and 32MB Graphics RAM in the beginning.

Xbox 360 OS uses 32MB RAM.

The Wii U OS is ridiculous. I mean, not even Windows...

I find this insane. I mean, yeah, the Dashboard/XMB can be a bit slow sometimes, but they still manage to do pretty much everything a console OS could possibly need. 1GB just seems like such a mind-boggling level of overkill.
 