
Rumor: Wii U final specs

jerd

Member
I don't understand why Nintendo just doesn't release specs. OK, it's not about graphics or anything, but releasing specs doesn't negate any message. It's just a description of what's in the damn machine.

My guess is they want to avoid the perception that the console is inferior due to weaker specs than the upcoming competition.
 

rpmurphy

Member
I don't understand why Nintendo just doesn't release specs. OK, it's not about graphics or anything, but releasing specs doesn't negate any message. It's just a description of what's in the damn machine.
I think they've dropped out of the race of publicly trumpeting computing and graphical capabilities. They seem to be more content with highlighting hardware components in their systems that the competition doesn't have, which is an interesting direction and probably fits their marketing and business strategy a lot more naturally.
 

JoeInky

Member
I don't understand why Nintendo just doesn't release specs. OK, it's not about graphics or anything, but releasing specs doesn't negate any message. It's just a description of what's in the damn machine.

I keep hearing people say that they did that with the GameCube and it came back to bite them in the ass when their realistic numbers for what the system could do were dwarfed by the overestimated PS2 and Xbox stats.

I guess that would be why, if it's true. I didn't really keep tabs on that stuff during that gen.

It also gives the wrong impression that you're buying the console for specs; no reasonable person buys a console for specs.
 
My guess is they want to avoid the perception that the console is inferior due to weaker specs than the upcoming competition.

It's silly since they already get that perception either way...

Simply put, Nintendo doesn't talk hardware specs outside of storage... The fact that Iwata mentioned RAM in the nintendo direct is absolutely HUGE. It's the first time they have EVER done that and may be a sign that they'll be more open in the future, but it's doubtful.
 

Matt

Member
It's silly since they already get that perception either way...

Simply put, Nintendo doesn't talk hardware specs outside of storage... The fact that Iwata mentioned RAM in the nintendo direct is absolutely HUGE. It's the first time they have EVER done that and may be a sign that they'll be more open in the future, but it's doubtful.
They only started not releasing technical info with the DS and the Wii. Before that they talked about specs just as much as anyone else.
 

jerd

Member
It's silly since they already get that perception either way...

Simply put, Nintendo doesn't talk hardware specs outside of storage... The fact that Iwata mentioned RAM in the nintendo direct is absolutely HUGE. It's the first time they have EVER done that and may be a sign that they'll be more open in the future, but it's doubtful.

Yeah I think they don't want it to be quantifiable though.
 

ozfunghi

Member
Switch "processes" into "execution units" and you got it.

Gekko/Broadway cores still fit the bill. 2 integer units, 1 fp unit, 1 load/store unit, 1 branch unit makes five. Dispatch rate is three per clock.

It's not unusual to have more execution units than you can start instructions on per clock, and due to pipelining it really isn't much of a limitation in practice.
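
To make the quoted numbers concrete, here's a minimal back-of-envelope sketch in C. The per-unit breakdown comes from the post above; the 729 MHz figure is Broadway's clock in the Wii. It only illustrates why the dispatch width, not the raw execution-unit count, caps per-cycle throughput:

```c
/* Back-of-envelope sketch: dispatch width, not the raw execution-unit
 * count, caps how many instructions a core like Gekko/Broadway can
 * start per cycle. Unit counts are from the post above; 729 MHz is
 * Broadway's clock in the Wii. */
#include <stdio.h>

int main(void) {
    const int execution_units = 5;      /* 2 integer + 1 FP + 1 load/store + 1 branch */
    const int dispatch_width  = 3;      /* instructions dispatched per clock */
    const double clock_ghz    = 0.729;  /* Broadway in the Wii */

    /* Peak issue rate is limited by whichever number is smaller. */
    int per_cycle = dispatch_width < execution_units ? dispatch_width : execution_units;

    printf("Peak: %d instructions/cycle -> ~%.2f billion instructions/s at %.3f GHz\n",
           per_cycle, per_cycle * clock_ghz, clock_ghz);
    return 0;
}
```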

Ok. But how does this mix with the comments being made by IBMWatson all through the year that the CPU is a custom Power7? Can it be both?
 

IdeaMan

My source is my ass!
It's a weird quote, saying 2x but then saying you can guess what kind of frame rate improvement that is... isn't it 2x?

Well, ozfunghi asked for the range of framerate improvement i talked about a few weeks ago, so i said 2x.

The framerate roughly doubled from winter 2012 to now, thanks to the parameters described before. The roughly 2X could mean a jump from 25fps to 49fps, 35 to 68, etc., in exact values. So the "you can guess" refers to that latter point: which precise framerates are involved.

Still, managing to double the framerate of projects in one semester is pretty encouraging and telling about the better grasp third-parties have on the Wii U.
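
For anyone skimming the numbers, a minimal arithmetic sketch of what a roughly 2x framerate jump means in frame-time terms (the starting framerates are example values only; the actual titles and figures are unconfirmed rumor):

```c
/* What "roughly 2x" means in frame-time terms. Example starting
 * framerates only; the real projects and numbers are unconfirmed. */
#include <stdio.h>

int main(void) {
    const double before_fps[] = { 25.0, 30.0, 35.0 };
    const int n = sizeof before_fps / sizeof before_fps[0];

    for (int i = 0; i < n; ++i) {
        double after_fps = before_fps[i] * 2.0;       /* the claimed ~2x */
        double before_ms = 1000.0 / before_fps[i];    /* frame time before */
        double after_ms  = 1000.0 / after_fps;        /* frame time after */
        printf("%2.0f fps (%.1f ms/frame) -> %3.0f fps (%.1f ms/frame)\n",
               before_fps[i], before_ms, after_fps, after_ms);
    }
    return 0;
}
```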

Ok, maybe some context/history should be provided. I've been spamming and bugging Ideaman ever since he made those first claims. He kept promising "soon" and "maybe in a couple of days". In the meantime I had been posing hypothetical cases such as "at what rate would a game running at about 30 fps 6 months ago run now, including extra effects and AA? Less than 40, 40 to 50, or 50 to 60 or more?"

So most likely that's where the "2X" comes from.



It's all over that page. Thread is locked so you can't quote anymore.

http://www.neogaf.com/forum/showpost.php?p=41506081&postcount=1105
http://www.neogaf.com/forum/showpost.php?p=41506487&postcount=1111
http://www.neogaf.com/forum/showpost.php?p=41507162&postcount=1119
http://www.neogaf.com/forum/showpost.php?p=41507562&postcount=1130
http://www.neogaf.com/forum/showpost.php?p=41508338&postcount=1144
http://www.neogaf.com/forum/showpost.php?p=41508639&postcount=1149


This was for games NOT built from the ground up for Wii U, so most likely Ideaman is talking about games we already know/knew existed, or even launch window titles. Let's just say, if it were for a game such as Assassin's Creed III or Mass Effect 3 or Darksiders 2... and they were already matching PS360 performance, then they would surpass it by launch. Let alone if the games were built specifically for Wii U rather than ported from PS360.

Thanks for explaining this story for those who haven't followed it :p
About the projects, two are involved, one is a multi, one is an exclusive, so the nature of the game isn't important for that matter.
 
Well, ozfunghi asked for the range of framerate improvement i talked about a few weeks ago, so i said 2x.

The framerate roughly doubled from winter 2012 to now, thanks to the parameters described before. The roughly 2X could mean a jump from 25fps to 49fps, 35 to 68, etc., in exact values. So the "you can guess" refers to that latter point: which precise framerates are involved.

Still, managing to double the framerate of projects in one semester is pretty encouraging and telling about the better grasp third-parties have on the Wii U.



Thanks for explaining this story for those who haven't followed it :p
About the projects, two are involved, one is a multi, one is an exclusive, so the nature of the game isn't important for that matter.
Great, great news if true.
 

IdeaMan

My source is my ass!
Great, great news if true.

It's relevant for a specific context though, as always with my info (because I can't speak for every studio): the type of game, the use of the GamePad (rather intricate), the companies behind it, what kind of engines and middleware are employed, etc. It won't be a 2x increase in framerate across the board these past months, but it concerns at least two launch window titles, so it's at the very least reassuring on the specs of the system. If it was a pretty solid and balanced console one semester ago, power-wise, it's even more the case now after benefiting from the latest dev kits, SDK, software (engine and middleware) optimizations, familiarization with the dual-screen setup, etc.
 

Phazon

Member
Played some Darksiders 2 today (alongside 18 other Wii U games)

I don't think it's 60 fps. The framerate also has some serious drops or short freezes once in a while, but I think they'll fix this by release. :)
 
It's to be read with a lot of [context] awareness: the type of game, the use of the GamePad (rather intricate), the studios behind it, what kind of engines and middleware are employed, etc. It won't be a 2x increase in framerate for every project these past months, but it concerns at least two launch window titles, so it's at the very least reassuring on the specs of the system. If it was a pretty solid and balanced console one semester ago, power-wise, it's even more the case now after benefiting from the latest dev kits, SDK, software (engine and middleware) optimizations, familiarization with the dual-screen setup, etc.
Sure, I understand. Still, great news :)
 

jaypah

Member
Played some Darksiders 2 today (alongside 18 other Wii U games)

I don't think it's 60 fps. The framerate also has some serious drops or short freezes once in a while, but I think they'll fix this by release. :)

Was it fun? What was the screen used for? Maps or hotkeys?
 

Phazon

Member
Was it fun? What was the screen used for? Maps or hotkeys?

Hotkeys, maps and inventory without pausing the game. :)

Is it worth playing on Wii U instead of PS3?

No, it comes with DLC but it's otherwise just the same. (I don't have experience with the PS3 version, only with the PC version.)

Here's the rest I could play: Nintendo Land (Pikmin, Metroid Blast, Balloon Flight and Chase Me), Darksiders 2, ZombiU, Rayman Legends (Black Betty music stage), Skylanders Giants, Mass Effect 3, NBA, New Super Mario Bros. U, Tekken (mushroom mode and normal mode), Sonic Racing, Ninja Gaiden 3 (Ayane stuff), Assassin's Creed III (naval battle), Toki Tori II, Nano Assault Neo, Trine 2
 

Rolf NB

Member
Ok. But how does this mix with the comments being made by IBMWatson all through the year that the CPU is a custom Power7? Can it be both?
It can't be both.

I'd like to point out that the tweets throughout the year were mostly very vague, and I thought people read way too much into them. The outright "The Wii U is a custom 45nm #power7 chip" tweet only happened this week. But yeah, it did happen after all. And the account is legitimately linked to IBM.

This is basically the list of the problems I have with the idea:
http://en.wikipedia.org/wiki/POWER7#Specifications

Typical Power7 products look like this:
[image: POWER7 multi-chip module]


This is an 800W, 3GHz, 32-core multi-chip module. Each core has 4-way SMT, so each module can run 128 threads in parallel.

Can you scale it down, theoretically, practically? Sure you could. Each individual core is around 25W. Clock it lower, around 2GHz or so, reduce the voltage, and you could feasibly get something you can stick in a console.

Problem is, you can fit 4 Broadway cores in half the die size and half the power budget, clock them the same, and you'll end up outperforming the Power7. 4 real cores vs one core with SMT is not an even contest.
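
As a rough illustration of that argument (a sketch, not a measurement: single-thread performance per clock is normalized to be equal for both cores, which flatters the smaller core, and the SMT uplift factor is an assumption made up for the example):

```c
/* Rough illustration of the "4 real cores vs. 1 SMT core" argument.
 * Assumptions, not measurements: single-thread performance per clock is
 * normalized to 1.0 for both cores (which flatters the smaller core), and
 * 4-way SMT is assumed to add ~40% throughput over one thread. */
#include <stdio.h>

int main(void) {
    const double power7_core_watts = 25.0;  /* per-core figure from the post */
    const double smt_uplift        = 1.4;   /* assumed gain from 4-way SMT */

    double power7_throughput   = 1.0 * smt_uplift;  /* one big core + SMT     */
    double broadway_throughput = 4.0 * 1.0;         /* four small, real cores */

    printf("Power7, 1 core + SMT : %.1f units at ~%.0f W\n",
           power7_throughput, power7_core_watts);
    printf("Broadway x4          : %.1f units at ~%.1f W (\"half the power budget\")\n",
           broadway_throughput, power7_core_watts / 2.0);
    return 0;
}
```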

Doesn't matter that the architecture is newer. What Power7 added in features is utterly useless in a gaming context. Decimal floating point units are nice to have if you want predictable precision and rounding when handling real-world currency, but there's no reason to ever use them in a game. They're just wasted transistors. So is much of the rest.

One thing that bears repeating is that Broadway is not even a bad CPU architecture. There is no insult embedded into the idea that it may be the one again. It's a fine core architecture, performs well at any mix of code you can think of, and does it with low power usage and small die size. Its lineage may be ancient, but that doesn't mean it's lacking anything significant when built on a modern process node.

Xenon CPU cores and the Cell PPU (which are identical twins) are very, very bad performers for their clock speeds in general-purpose code, and they suck up surprising amounts of electricity to boot. 3.2GHz may sound impressive, but it really only helps with hand-tuned SIMD loops. If you just throw random code at them, they are matched and beaten by any reasonably efficient architecture, like the one Broadway used, at half the clocks or even less.

So, back to Power7, there is a technical possibility to scale it down to maybe a couple cores, scale back the SMT, remove some of the execution units, strip most of the massive core-to-core communication logic (and the on-die dual GBit ethernet links!), etc pp ... and end up with a decent console CPU that just barely squeezes into a reasonable power and cost budget. I just don't see how it makes economical sense for either IBM or Nintendo to have gone down that route.

Nintendo would have had to pay extra for the customization, and most likely extra on top due to basing their chip on IBM's current crown-jewel architecture. Then they'd pay extra in manufacturing because the chips would still end up larger.

At the same time, going with Broadway architecture instantly ensures robust Wii BC, because the CPU would be cycle-accurate and there'd be zero glitches, no "emulation" at all on that side. That Broadway++ would be cheaper to license and cheaper to manufacture, and easier to understand is the other corollary set of benefits.

Until someone pops off the heatspreader and demonstrates that there really is something else in the Wii U, CPU-wise, I'll continue to stick to my previous expectations:
*multi-core version of Broadway, clocked 2~3x higher than the Wii
*eDRAM cache instead of SRAM
*45nm SOI
(the last two would satisfy the earlier "some of the same technology [as in Power7]" tweets just nicely)

This is not intended to be a party poop, never was. It just makes the most sense to me given the small size and low power consumption of the overall system, and also economically.
 

Phazon

Member
How was Mass Effect 3 compared to what's already out? Straight port?

Didn't play it myself (the only games I did not play were ME3, Skylanders and NBA), but another editor said that it was just the same. So it's again the maps and inventory that are a plus.


And oh yeah, you can play Darksiders 2 and ME3 fully on the Wii U GamePad. Dat handheld feeling when playing on your controller is something special ^^


But everything looks a lot better than it did a while ago. Now it's really just some final tweaking before they are completely finished. Just don't expect to see anything that looks better than the Xbox 360 or PS3.
 

AzaK

Member
It's relevant for a specific context though, as always with my info (because I can't speak for every studio): the type of game, the use of the GamePad (rather intricate), the companies behind it, what kind of engines and middleware are employed, etc. It won't be a 2x increase in framerate across the board these past months, but it concerns at least two launch window titles, so it's at the very least reassuring on the specs of the system. If it was a pretty solid and balanced console one semester ago, power-wise, it's even more the case now after benefiting from the latest dev kits, SDK, software (engine and middleware) optimizations, familiarization with the dual-screen setup, etc.

Wow nice. IdeaMan, I missed a lot of the discussion, so was this increase in framerate from "shitty" to "solid", e.g. 15 to 30, or more along the lines of "solid" to "great", e.g. 30 to 60? It seems everyone's implying the latter but I haven't seen what the base framerate was.


It's all over that page. Thread is locked so you can't quote anymore.
Thanks. I had seen that mention by IdeaMan of good improvements, I'd just missed the 2x thing. If launch window ports manage to get a good level of framerate increase from say 30-60, that's a pretty nice selling point. Let's hope.
 

IdeaMan

My source is my ass!
Wow nice. IdeaMan, I missed a lot of the discussion, so was this increase in framerate from "shitty" to "solid", e.g. 15 to 30, or more along the lines of "solid" to "great", e.g. 30 to 60? It seems everyone's implying the latter but I haven't seen what the base framerate was.

It wasn't shitty, the increase happened on an already playable framerate.
i won't say more :)
 

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
*thumbs up*

I think IdeaMan deserves a Nintendo Medal of Freedom.
 

IdeaMan

My source is my ass!
*thumbs up*

I think IdeaMan deserves a Nintendo Medal of Freedom.

Thanks, but even if it's positive news, it's still to be understood with tons of context sauce.

To be clear:

In WUST 2, I think I stated that with the following dev kit (I spoke with a V4 dev kit mindset at the time), the newest SDK, and other improvements from different parameters, I would expect the projects of my sources that ran at a playable framerate (between 25 and 35 fps) to gain maybe 10 more fps in the end (I said 40/45 if I remember right), and to allocate those additional resources to implement more effects while returning to an acceptable framerate. But I've been told they managed to roughly double this initial framerate (so between 50 and 70 fps) while polishing the visuals.

That's it. So it can't be a rule for every Wii U project, it's just relevant for the titles involved. But if those studios reached this level of progress, why not some others?
 

ozfunghi

Member
It can't be both.

I'd like to point out that the tweets throughout the year were mostly very vague, and I thought people read way too much into them. The outright "The Wii U is a custom 45nm #power7 chip" tweet only happened this week. But yeah, it did happen after all. And the account is legitimately linked to IBM.

This is basically the list of the problems I have with the idea:
http://en.wikipedia.org/wiki/POWER7#Specifications

Typical Power7 products look like this:
[image: POWER7 multi-chip module]


This is an 800W, 3GHz, 32-core multi-chip module. Each core has 4-way SMT, so each module can run 128 threads in parallel.

Can you scale it down, theoretically, practically? Sure you could. Each individual core is around 25W. Clock it lower, around 2GHz or so, reduce the voltage, and you could feasibly get something you can stick in a console.

Problem is, you can fit 4 Broadway cores in half the die size and half the power budget, clock them the same, and you'll end up outperforming the Power7. 4 real cores vs one core with SMT is not an even contest.

Doesn't matter that the architecture is newer. What Power7 added in features is utterly useless in a gaming context. Decimal floating point units are nice to have if you want predictable precision and rounding when handling real-world currency, but there's no reason to ever use them in a game. They're just wasted transistors. So is much of the rest.

One thing that bears repeating is that Broadway is not even a bad CPU architecture. There is no insult embedded into the idea that it may be the one again. It's a fine core architecture, performs well at any mix of code you can think of, and does it with low power usage and small die size. Its lineage may be ancient, but that doesn't mean it's lacking anything significant when built on a modern process node.

Xenon CPU cores and the Cell PPU (which are identical twins) are very, very bad performers for their clock speeds in general-purpose code, and they suck up surprising amounts of electricity to boot. 3.2GHz may sound impressive, but it really only helps with hand-tuned SIMD loops. If you just throw random code at them, they are matched and beaten by any reasonably efficient architecture, like the one Broadway used, at half the clocks or even less.

So, back to Power7, there is a technical possibility to scale it down to maybe a couple cores, scale back the SMT, remove some of the execution units, strip most of the massive core-to-core communication logic (and the on-die dual GBit ethernet links!), etc pp ... and end up with a decent console CPU that just barely squeezes into a reasonable power and cost budget. I just don't see how it makes economical sense for either IBM or Nintendo to have gone down that route.

Nintendo would have had to pay extra for the customization, and most likely extra on top due to basing their chip on IBM's current crown-jewel architecture. Then they'd pay extra in manufacturing because the chips would still end up larger.

At the same time, going with Broadway architecture instantly ensures robust Wii BC, because the CPU would be cycle-accurate and there'd be zero glitches, no "emulation" at all on that side. That Broadway++ would be cheaper to license and cheaper to manufacture, and easier to understand is the other corollary set of benefits.

Until someone pops off the heatspreader and demonstrates that there really is something else in the Wii U, CPU-wise, I'll continue to stick to my previous expectations:
*multi-core version of Broadway, clocked 2~3x higher than the Wii
*eDRAM cache instead of SRAM
*45nm SOI
(the last two would satisfy the earlier "some of the same technology [as in Power7]" tweets just nicely)

This is not intended to be a party poop, never was. It just makes the most sense to me given the small size and low power consumption of the overall system, and also economically.

Look man, i didn't ask for your life story... just kidding. Thanks for this.

But it feels like there is still a piece of the puzzle missing. I don't understand why IBM would state so blatantly that it's a custom Power7 with the same SOI if it really has hardly anything in common.

If it turns out to be what you are expecting, what kind of performance are we looking at, compared to current gen HD consoles?
 
At the same time, going with Broadway architecture instantly ensures robust Wii BC, because the CPU would be cycle-accurate and there'd be zero glitches, no "emulation" at all on that side. That Broadway++ would be cheaper to license and cheaper to manufacture, and easier to understand is the other corollary set of benefits.

Until someone pops off the heatspreader and demonstrates that there really is something else in the Wii U, CPU-wise, I'll continue to stick to my previous expectations:
*multi-core version of Broadway, clocked 2~3x higher than the Wii
*eDRAM cache instead of SRAM
*45nm SOI
(the last two would satisfy the earlier "some of the same technology [as in Power7]" tweets just nicely)

This is not intended to be a party poop, never was. It just makes the most sense to me given the small size and low power consumption of the overall system, and also economically.
It's still not a Broadway, or three Broadways for that matter; that's dissing it.

For starters, a triple core isn't as simple as making three dies and gluing them together with tape; there are shared components if it's done right. It's a big evolution; otherwise we'd be calling the first Core Duos (and the Pentium Ms in between) Pentium III Tualatins.

Also, the clock rate: the PPC 750 topped out at 1.1 GHz. Sure, it was a 90 nm chip, but die shrinks and the like can't really make it soar without some core changes.


Also, bear in mind that PPC 7xx development was phased out in favour of the PowerPC e500, which, unlike the PPC 7xx, has multi-core support. This, the PowerPC e600 or even the PowerPC 476FP seem more likely to be used, seeing as they're not deprecated and are more up to date; producing custom versions of them could be cheaper seeing as they're in production. (PPC 7xx is also still in production, obviously; but such a different core with a die shrink would require dedicated investment.)


Other thoughts:

Broadway is 90 nm SOI; I don't think shrinking it to 45nm could qualify as "Power7 tech" under any marketing spin. Neither would having eDRAM, unless they adapted some core technology/trade secret in order to either package it or make it interact with the surrounding parts/CPU; even then, though, you couldn't say it was Power7 based, as IBM said.
If it turns out to be what you are expecting, what kind of performance are we looking at, compared to current gen HD consoles?
That's largely unknown territory.

For instance, we have no idea how high this is clocked and how it really performs; anyway, I'm sure it wipes the floor with today's consoles in general processing, executes more per clock and has out-of-order execution, which helps, but I'm pretty sure it won't beat them in FPU (floating point/GFLOPS), which is not its purpose anyway.

That would explain why PS360-optimized code (code optimized for 2-way in-order execution) fares badly. It should be less of an issue when next-gen consoles launch with more similar CPUs (despite having different architectures), if developers are willing to develop for it, that is.
 
Look man, i didn't ask for your life story... just kidding. Thanks for this.

But it feels like there is still a piece of the puzzle missing. I don't understand why IBM would state so blatantly that it's a custom Power7 with the same SOI if it really has hardly anything in common.

If it turns out to be what you are expecting, what kind of performance are we looking at, compared to current gen HD consoles?

One Broadway core clocked at twice the Wii's speed will outperform one Xenon core by 60%-80%. That is plain old vanilla Broadway, just simply clocked higher.

I don't know how much of a performance boost using eDRAM instead of 1T-SRAM and having more L2 cache per core will provide, but overall the CPU should be more than twice the performance. I have no clue by how much, though.
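
Taking the 60-80% claim at face value, a quick sketch of what it would imply per clock (the known clocks are Broadway at 729 MHz in the Wii and Xenon at 3.2 GHz; the 60-80% figure itself is the poster's claim, not a benchmark):

```c
/* What "60-80% faster at twice the Wii's clock" would imply per clock.
 * Known clocks: Broadway 729 MHz in the Wii, Xenon 3.2 GHz.
 * The 60-80% figure is the poster's claim, not a measurement. */
#include <stdio.h>

int main(void) {
    const double broadway_ghz = 0.729 * 2.0;  /* "clocked twice the Wii's speed" */
    const double xenon_ghz    = 3.2;

    const double overall[] = { 1.6, 1.8 };    /* 60% and 80% faster overall */
    for (int i = 0; i < 2; ++i) {
        /* overall speedup = per-clock ratio * (broadway_ghz / xenon_ghz) */
        double per_clock = overall[i] * xenon_ghz / broadway_ghz;
        printf("%.0f%% faster overall -> ~%.1fx the work per clock\n",
               (overall[i] - 1.0) * 100.0, per_clock);
    }
    return 0;
}
```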
 

The_Lump

Banned
I don't understand why Nintendo just doesn't release specs. OK, it's not about graphics or anything, but releasing specs doesn't negate any message. It's just a description of what's in the damn machine.


My guess is, they know the only people who would base their purchase of a console on specs are unfortunately the same people who know very little about what those specs actually mean. They tend to just look at the parts with numbers, which may not look that good to the layperson. Example: "hurhur, 2ghz is less than 3.6ghz, omfg it's CPU is weaker than 7 year old hardware, lolz" and so on.

Whereas the people who might be able to interpret what said specifications actually mean for developers, and for the games they'll be able to produce, will already have surmised the system's capabilities based on what we already know and thus won't care beyond satisfying their own curiosity.
 

RedSwirl

Junior Member
My guess is they want to avoid the perception that the console is inferior due to weaker specs than the upcoming competition.

Talking specs has bitten them in the ass in the past. At least as far back as the GameCube, it's mostly resulted from the way they talk specs compared to Sony or Microsoft. Back then everyone liked giving raw numbers based on the number of triangles they could process. Nintendo, on the other hand, liked to estimate "real world" performance with actual game logic going on, which always made their numbers seem lower.
 

The_Lump

Banned
It can't be both.
At the same time, going with Broadway architecture instantly ensures robust Wii BC, because the CPU would be cycle-accurate and there'd be zero glitches, no "emulation" at all on that side. That Broadway++ would be cheaper to license and cheaper to manufacture, and easier to understand is the other corollary set of benefits.

Until someone pops off the heatspreader and demonstrates that there really is something else in the Wii U, CPU-wise, I'll continue to stick to my previous expectations:
*multi-core version of Broadway, clocked 2~3x higher than the Wii
*eDRAM cache instead of SRAM
*45nm SOI
(the last two would satisfy the earlier "some of the same technology [as in Power7]" tweets just nicely)

This is not intended to be a party poop, never was. It just makes the most sense to me given the small size and low power consumption of the overall system, and also economically.


Nice post. One issue I have with the Broadway theory (hopefully you can correct me) is that it would, as you say, theoretically provide perfect backwards compatibility. But Nintendo have made a point of saying it "will work with most Wii software". Doesn't sound like much at first, but why wouldn't it work with 100% of Wii software if that's the major reason for Nintendo choosing to continue that CPU line? What's stopping it if it isn't the CPU? And if it's something else stopping 100% BC, why bother using that CPU at all beyond creating continuity for devs (which I'm guessing isn't that important given they are trying hard to coax new developers onto the platform anyway)?

My other issue is that the "Enhanced Broadway" thing comes from a source claiming it was taken from warioworld.com (Nintendo's developer portal). To me, this doesn't sound like terminology Nintendo would use. What value would that be to developers requiring info on the CPU? It basically tells them nothing useful. Sounds like guesstimation to me, possibly based on the source's own experience with dev kits or second-hand info from someone else using the kits.

This is just my own speculation and of course is probably way off the mark :)
 
Nice post. One issue I have with the Broadway theory (hopefully you can correct me) is that it would, as you say, theoretically provide perfect backwards compatibility. But Nintendo have made a point of saying it "will work with most Wii software". Doesn't sound like much at first, but why wouldn't it work with 100% of Wii software if that's the major reason for Nintendo choosing to continue that CPU line? What's stopping it if it isn't the CPU? And if it's something else stopping 100% BC, why bother using that CPU at all beyond creating continuity for devs (which I'm guessing isn't that important given they are trying hard to coax new developers onto the platform anyway)?

My other issue is that the "Enhanced Broadway" thing comes from a source claiming it was taken from warioworld.com (Nintendo's developer portal). To me, this doesn't sound like terminology Nintendo would use. What value would that be to developers requiring info on the CPU? It basically tells them nothing useful. Sounds like guesstimation to me, possibly based on the source's own experience with dev kits or second-hand info from someone else using the kits.

This is just my own speculation and of course is probably way off the mark :)

"Most Wii software" simply means it won't work with games that required peripherals that used the GameCube ports; in fact, newer Wiis won't work with those games either.
 

The_Lump

Banned
"Most Wii software" simply means it won't work with games that required peripherals that used the GameCube ports; in fact, newer Wiis won't work with those games either.

I see. What's an example of that, just out of interest? They said the same with 3DS>DS games. Is that for the same reason?

The 3DS has a 'DS' mode (much like the Wii had a GC mode) which basically turns it into a DS. Reggie (I think) said in some interview the Wii U will have a 'Wii Mode'. Do you think we're looking at the same sort of thing?
 
I see. What's an example of that, just out of interest? They said the same with 3DS>DS games. Is that for the same reason?

The 3DS has a 'DS' mode (much like the Wii had a GC mode) which basically turns it into a DS. Reggie (I think) said in some interview the Wii U will have a 'Wii Mode'. Do you think we're looking at the same sort of thing?

There was some Namco fitness game that used some sort of mat that plugged into the GameCube ports, and possibly dance mat games (DDR and such); not sure if they used GC ports or USB.
 
Nice post. One issue I have with the Broadway theory (hopefully you can correct me) is that it would, as you say, theoretically provide perfect backwards compatibility. But Nintendo have made a point of saying it "will work with most Wii software". Doesn't sound like much at first, but why wouldn't it work with 100% of Wii software if that's the major reason for Nintendo choosing to continue that CPU line? What's stopping it if it isn't the CPU? And if it's something else stopping 100% BC, why bother using that CPU at all beyond creating continuity for devs (which I'm guessing isn't that important given they are trying hard to coax new developers onto the platform anyway)?

My other issue is that the "Enhanced Broadway" thing comes from a source claiming it was taken from warioworld.com (Nintendo's developer portal). To me, this doesn't sound like terminology Nintendo would use. What value would that be to developers requiring info on the CPU? It basically tells them nothing useful. Sounds like guesstimation to me, possibly based on the source's own experience with dev kits or second-hand info from someone else using the kits.

This is just my own speculation and of course is probably way off the mark :)

I think it's just to protect themselves. Sony said the same thing with regards to ps2 BC even though that was fully hardware based.

And even if the CPU is the same as the Wii, the GPU is completely different.
 
I think it's just to protect themselves. Sony said the same thing with regards to ps2 BC even though that was fully hardware based.

And even if the CPU is the same as the Wii, the GPU is completely different.

This.

Also, random accessories may not be compatible, especially third-party ones. Remember, the Wii didn't have 100% GameCube compatibility either, and it was just an overclocked GameCube. Games like Phantasy Star Online wouldn't work without the adapter, and certain GameCube microphone games also didn't work.
 

Kai Dracon

Writing a dinosaur space opera symphony
The 3DS has a 'DS' mode (much like the Wii had a GC mode) which basically turns it into a DS. Reggie (I think) said in some interview the Wii U will have a 'Wii Mode'. Do you think we're looking at the same sort of thing?

Interesting point there.

The Wii's Gamecube mode is truly a 'dumb' mode. Gamecube games were never programmed with the provision for an OS layer to interrupt them during execution. So Gamecube mode on Wii locks every single Wii function out, including controllers. You have to turn the power off to get out of Gamecube mode. The home overlay doesn't operate.

By comparison, DS mode on the 3DS actually does allow the OS to interrupt the game to return to the dashboard, DS games evidently have some hook for the OS to latch onto.

Wii games actually do have the hooks for being paused by an OS overlay, with some OS controls available. So, in spite of talk of a Wii mode on Wii U, one wonders how much Wii games can share functions with the Wii U while running. Wii games were also designed to let background services like WiiConnect24 run (even VC games didn't interrupt housekeeping, as far as I know).
 

The_Lump

Banned
There was some Namco fitness game that used some sort of mat that plugged into the GameCube ports, and possibly dance mat games (DDR and such); not sure if they used GC ports or USB.

I think it's just to protect themselves. Sony said the same thing with regards to ps2 BC even though that was fully hardware based.

And even if the CPU is the same as the Wii, the GPU is completely different.

This.

Also, random accessories may not be compatible, especially third-party ones. Remember, the Wii didn't have 100% GameCube compatibility either, and it was just an overclocked GameCube. Games like Phantasy Star Online wouldn't work without the adapter, and certain GameCube microphone games also didn't work.

Ok cool, thanks for clearing that up y'all *suspicion mode deactivated*

Interesting point there.

The Wii's Gamecube mode is truly a 'dumb' mode. Gamecube games were never programmed with the provision for an OS layer to interrupt them during execution. So Gamecube mode on Wii locks every single Wii function out, including controllers. You have to turn the power off to get out of Gamecube mode. The home overlay doesn't operate.

By comparison, DS mode on the 3DS actually does allow the OS to interrupt the game to return to the dashboard, DS games evidently have some hook for the OS to latch onto.

Wii games actually do have the hooks for being paused by an OS overlay, with some OS controls available. So, in spite of talk of a Wii mode on Wii U, one wonders how much Wii games can share functions with the Wii U while running. Wii games were also designed to let background services like WiiConnect24 run (even VC games didn't interrupt housekeeping, as far as I know).

Oh cool, that's a very good point. Maybe it's a lot more neatly integrated than a 'Wii Mode' then.
 
Interesting point there.

The Wii's Gamecube mode is truly a 'dumb' mode. Gamecube games were never programmed with the provision for an OS layer to interrupt them during execution. So Gamecube mode on Wii locks every single Wii function out, including controllers. You have to turn the power off to get out of Gamecube mode. The home overlay doesn't operate.

By comparison, DS mode on the 3DS actually does allow the OS to interrupt the game to return to the dashboard, DS games evidently have some hook for the OS to latch onto.

Wii games actually do have the hooks for being paused by an OS overlay, with some OS controls available. So, in spite of talk of a Wii mode on Wii U, one wonders how much Wii games can share functions with the Wii U while running. Wii games were also designed to let background services like WiiConnect24 run (even VC games didn't interrupt housekeeping, as far as I know).

Fun fact... The Wii OS actually does NOT RUN AT ALL when a Wii game is loaded. That overlay? It's faked. Check the files on every Wii disc, they all have the menu graphics/functions in them... it's not part of the OS.

The Wii menu is actually much like all other Wii software; the only difference is that the Wii menu is told to load on system launch. The drivers/low-level operating stuff are handled by what's called IOS. The Wii can store many IOS versions at a time (different games require different drivers, etc.); on top of the fact that newer games install newer IOS versions, old games retain their older IOS to maintain compatibility.

The "menu" isn't an OS hook, it's simply a part of Nintendo's SDK that Nintendo requires all games to have as part of their code.
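
To make the "many IOS versions side by side" idea concrete, here's a purely hypothetical illustration; the title names and version pairings are invented for the example and none of this is Nintendo SDK code:

```c
/* Purely hypothetical illustration of per-title IOS selection as described
 * above. Title names and pairings are invented; this is not Nintendo SDK
 * code. */
#include <stdio.h>

struct title {
    const char *name;
    int required_ios;   /* the IOS (driver/kernel image) the title was tested against */
};

int main(void) {
    /* The console keeps every installed IOS around, so each title can keep
     * booting the exact version it shipped expecting. */
    const struct title installed[] = {
        { "EarlyWiiGame", 30 },
        { "LateWiiGame",  58 },
        { "SystemMenu",   70 },
    };
    const int n = sizeof installed / sizeof installed[0];

    for (int i = 0; i < n; ++i)
        printf("%s loads IOS%d\n", installed[i].name, installed[i].required_ios);
    return 0;
}
```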
 
would it be sleep mode that lets them do this?

While we don't have access to the 3DS to know for sure, I think the actual reason it works is much more mundane. When the Wii goes into GCN mode it literally shuts down all the Wii components, using all of the Wii's hardware exactly as if it were a GCN.

We know the 3DS doesn't do this, but it's almost certain that the 3DS's DS mode (god that's confusing) doesn't have any access to 3DS hardware either (though newer DS games probably have code to detect that they are running on a 3DS, much like later GBC games could tell when they were run on a GBA).

DS mode on the 3DS is almost assuredly sandboxed in a virtual machine. Much like I can run Linux in a virtual machine on my Windows PC: the Linux system thinks it's a whole system, but the Windows machine controls which bits of hardware and memory it can access.

(edit) Clarification... we don't have access to the 3DS software functions or memory so we can't be certain what it does in DS mode, but chances are it's done this way... virtualizing a DS makes sense because it blocks the multitude of existing DS hacks from being able to access 3DS hardware in any way.
 

The_Lump

Banned
While we don't have access to the 3DS to know for sure, I think the actual reason it works is much more mundane. When the Wii goes into GCN mode it literally shuts down all the Wii components, using all of the Wii's hardware exactly as if it were a GCN.

We know the 3DS doesn't do this, but it's almost certain that the 3DS's DS mode (god that's confusing) doesn't have any access to 3DS hardware either (though newer DS games probably have code to detect that they are running on a 3DS, much like later GBC games could tell when they were run on a GBA).

DS mode on the 3DS is almost assuredly sandboxed in a virtual machine. Much like I can run Linux in a virtual machine on my Windows PC: the Linux system thinks it's a whole system, but the Windows machine controls which bits of hardware and memory it can access.

(edit) Clarification... we don't have access to the 3DS software functions or memory so we can't be certain what it does in DS mode, but chances are it's done this way... virtualizing a DS makes sense because it blocks the multitude of existing DS hacks from being able to access 3DS hardware in any way.

At the risk of sounding dumb: if 'sandboxing in a virtual machine' is that simple, why the need for hardware compatibility? Especially to the extent of keeping older hardware just for this purpose (Broadway in the Wii U)?

Or is this too broad of a question?
 
At the risk of sounding dumb: if 'sandboxing in a virtual machine' is that simple, why the need for hardware compatibility? Especially to the extent of keeping older hardware just for this purpose (Broadway in the Wii U)?

Or is this too broad of a question?

To go back to my Windows/Linux example... Linux and Windows both use the same type of CPU (in this example anyway), which is why they can be sandboxed. It's literally just the first OS (in this case the 3DS) saying "Hey, I'm going to set aside part of my CPU and memory to run the DS." Being the same hardware is necessary for sandboxing; otherwise what you're doing is EMULATING.

Emulating is translating the instructions of one CPU into those of another and then running said instructions. The Wii likely didn't go the sandboxing route for GC compatibility simply because it didn't have enough RAM, or Broadway simply didn't have the multitasking strength to do so... or it was just simpler to do it that way... but as noted, it completely locks you out of the Wii menu/etc., which is why they haven't copied that method for the 3DS or Wii U.
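
A minimal sketch of the distinction: emulation translates another CPU's instructions in software, instruction by instruction, whereas a sandbox runs the guest's code natively and only fences off memory and devices. The toy interpreter below runs a made-up two-instruction ISA; real emulators are vastly more involved.

```c
/* Toy fetch-decode-execute loop for a made-up 2-instruction ISA, to show
 * what "emulating" means at its core. A sandbox/VM would instead run the
 * guest natively and only restrict its view of memory and hardware. */
#include <stdint.h>
#include <stdio.h>

enum { OP_ADDI = 0x01, OP_HALT = 0xFF };

int main(void) {
    uint8_t program[] = { OP_ADDI, 5, OP_ADDI, 7, OP_HALT };  /* "guest" code */
    uint32_t acc = 0, pc = 0;

    for (;;) {                          /* fetch-decode-execute */
        uint8_t op = program[pc++];
        if (op == OP_ADDI)       acc += program[pc++];
        else if (op == OP_HALT)  break;
    }
    printf("guest accumulator = %u\n", (unsigned)acc);   /* prints 12 */
    return 0;
}
```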
 
I mean if it weren't based on Power7, but Broadway, given feasible clock speeds within a realistic power envelope...
It's pretty much the same scenario bar some nuances.

A Broadway giving Cell and Xenon a run for their money per core in general processing tasks isn't a hard thing to do, just clock it higher (and you don't even need to clock it that much higher); the question is how high in clock rate it could go and how that performance tradeoff compares to current architectures.

A 32-bit chip seems like a weird choice though; 64-bit theoretically means you can get through tasks twice as fast (since it can work on wider registers with each pass), and it's pretty much a staple most would consider a given at this point.

64-bit has the disadvantage of eating more RAM/depleting more memory in a regular scenario (one that doesn't need it), which is why Apple has strayed from using it as the default mode for laptops/consumer-range machines; but in games most of the data is not of that nature, with most of the memory used by textures, sound and assets like that, so the "advantage" of keeping it 32-bit is reduced.

Though Leopard uses a 32-bit kernel, Macs running Leopard can contain and use far more RAM than the 4 GB limit the "32-bit" qualifier might seem to imply. But as RAM sizes increase, there's another concern: address space depletion—not for applications, but for the kernel itself.
Source: http://arstechnica.com/apple/2009/08/mac-os-x-10-6/5/

But of course, this doesn't mean much for a console, and the example above can still use 64-bit mode whenever necessary (best of both worlds, this is not); so it's as if I pointed out addressing more than 4 GB of RAM as a 64-bit advantage, which would be a moot point seeing that this console is in the 2 GB range and thus wouldn't benefit from that. This isn't quite a moot point, but it's close, as the kernel for a console OS shouldn't be that heavy or have to deal with much multitasking, which limits the depletion. Lack of 64-bit integer registers, though, would hurt.

And seeing as most code elsewhere could be optimized for 64-bit, a 32-bit CPU might either hold others back from doing so or hamper it.
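
Two of the practical points above can be shown in a few lines of C (a sketch, not a claim about the Wii U's CPU): a 64-bit add on a 32-bit core takes two steps, and pointer/bookkeeping sizes grow with the word size.

```c
/* Sketch of the 32-bit vs. 64-bit points above (illustrative only).
 * A 64-bit add on a 32-bit core is done as two halves: low add, then
 * high add plus the carry. Pointer size depends on the build target. */
#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint32_t a_lo = 0xFFFFFFFFu, a_hi = 0x00000001u;  /* a = 0x1FFFFFFFF */
    uint32_t b_lo = 0x00000002u, b_hi = 0x00000000u;  /* b = 0x2         */

    uint32_t r_lo  = a_lo + b_lo;
    uint32_t carry = (r_lo < a_lo) ? 1u : 0u;         /* did the low half wrap? */
    uint32_t r_hi  = a_hi + b_hi + carry;

    printf("64-bit sum via 32-bit ops: 0x%08X%08X\n", (unsigned)r_hi, (unsigned)r_lo);
    printf("pointer size on this build: %zu bytes\n", sizeof(void *));
    return 0;
}
```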
 

M3d10n

Member
Interesting point there.

The Wii's Gamecube mode is truly a 'dumb' mode. Gamecube games were never programmed with the provision for an OS layer to interrupt them during execution. So Gamecube mode on Wii locks every single Wii function out, including controllers. You have to turn the power off to get out of Gamecube mode. The home overlay doesn't operate.

By comparison, DS mode on the 3DS actually does allow the OS to interrupt the game to return to the dashboard, DS games evidently have some hook for the OS to latch onto.

Wii games actually do have the hooks for being paused by an OS overlay with some OS controls available. So, in spite of talk of a Wii mode on Wii U, one wonders how much Wii games can share functions with the Wii U while running. Wii games were also designed to let background services like Wii Connect24 to run (even VC games didn't interrupt housekeeping, as far as I know).

The Wii home menu was hardcoded into the games, so the Wii U probably won't be able to touch that. It can, however, intercept the "return to the OS" command just like the 3DS intercepts the same command in DSiWare games.

The 3DS cheats while running DS games: the Home button sends the "lid closed" interrupt, causing the DS game to do whatever it would do to enter sleep mode. If you press Home in a DS game while using a WiFi connection, the game will not pause, just like when you close the lid on a real DS. This is why GBA Ambassador games don't pause when the Home button is pressed: the GBA doesn't have a single sleep command that works in every game.
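
A hypothetical sketch of that control flow (not actual 3DS firmware code; every name here is invented): the Home button gets translated into the one signal DS games already know how to handle.

```c
/* Hypothetical sketch of the "Home button -> lid closed" trick described
 * above. Not 3DS firmware code; all names here are invented. */
#include <stdbool.h>
#include <stdio.h>

enum guest_signal { SIG_NONE, SIG_LID_CLOSED };

/* Stand-in for whatever the DS title does with the lid-closed interrupt:
 * most games sleep, but one holding a WiFi connection keeps running. */
static void deliver_to_ds_game(enum guest_signal sig, bool wifi_active) {
    if (sig == SIG_LID_CLOSED && !wifi_active)
        printf("DS game enters its own sleep/pause state\n");
    else
        printf("DS game keeps running (e.g. WiFi connection active)\n");
}

int main(void) {
    bool home_pressed = true;
    bool wifi_active  = false;

    if (home_pressed)                                    /* Home is remapped...  */
        deliver_to_ds_game(SIG_LID_CLOSED, wifi_active); /* ...to "lid closed"   */
    return 0;
}
```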
 
So I've been reading some B3D stuff...

I'm presumably LTTP on this so it may have already been discussed, but regarding the GPU: isn't the mooted ~600 GFLOPS number now impossible given the 45W average power draw figure we now know?

EDIT: Actually different sites seem to have two different numbers for the "typical" power usage: 40W and 45W.
It's less than 45W. 45W is the maximum when all USB ports are in use.
I thought 75W was the max power rating?
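
For a rough sense of the question, a quick calculation using the power figures mentioned in the thread; the share of system power going to the GPU is purely an assumption made for this sketch:

```c
/* Rough GFLOPS-per-watt check on the mooted ~600 GFLOPS figure, using the
 * power numbers mentioned in the thread. The GPU's share of total system
 * power is an assumption made up for this sketch. */
#include <stdio.h>

int main(void) {
    const double system_watts[] = { 40.0, 45.0, 75.0 };  /* figures from the thread */
    const double gpu_share      = 0.5;                   /* assumed GPU fraction */
    const double target_gflops  = 600.0;                 /* the mooted number */

    for (int i = 0; i < 3; ++i) {
        double gpu_watts = system_watts[i] * gpu_share;
        printf("%2.0f W system (~%4.1f W GPU) -> needs ~%4.1f GFLOPS per watt\n",
               system_watts[i], gpu_watts, target_gflops / gpu_watts);
    }
    return 0;
}
```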
 