
Rumor: Wii U final specs

I wasn't on GAF when the Wii came out, but one dude linked a Wii thread, and one of the posts that stuck out to me said that the Wii is a 480p-based system that will be able to run 360/PS3 downports easily, just not at 720p.
 
Who's "we"? You mean you. And you are wrong.


Rubbish. Sony touted the Cell as an amazing processor. The hype was everywhere.

"We" is correct but that dosen´t include you. And no i/we are right. According to your logic Wii U is already maxed out.

Google "Gabe Newell PS3" and see for yourself.
 
"We" is correct but that dosen´t include you. And no i/we are right. According to your logic Wii U is already maxed out.

Google "Gabe Newell PS3" and see for yourself.
I think the Cell issue had more to do with how expensive it was for Sony to invest in it, and the CPU was supposed to make up for the PS3's other parts. The PS3 is sort of the opposite of the Wii U in that it's very CPU-focused. In the end, GPU-focused designs appear to be the future (for now), and the Cell is a dead end... so there was probably some justification for the criticism.
 
I'm curious, how did GAF react to the Wii's specs, reveal and launch? I completely ignored the chaos back then lol

The Wii's lack of power was made up for by the fact that it's a familiar architecture that devs will love, it keeps game costs down so more devs will develop for it, and the games look the same in 480p as Xbox 360 games do in 720p because of the lower horsepower needed.
 

Van Owen

Banned
When it was revealed Wii wouldn't be HD, I had forum posters telling me it didn't matter because Shrek still looked amazing on a 480p DVD. Not shitting you.
 
When it was revealed Wii wouldn't be HD, I had forum posters telling me it didn't matter because Shrek still looked amazing on a 480p DVD. Not shitting you.

That was before it was common knowledge that Wii was literally using overclocked versions of Flipper and Gekko. Even Drinky Crow, of all people, believed at one point that it would at least be able to match 360 launch titles in SD.
 

The_Lump

Banned
When it was revealed Wii wouldn't be HD, I had forum posters telling me it didn't matter because Shrek still looked amazing on a 480p DVD. Not shitting you.


To be fair to them, HD hadn't exactly taken off back then. I'm betting those people hadn't even seen HD pictures before and didn't really know what they were saying.
 
That was before it was common knowledge that Wii was literally using overclocked versions of Flipper and Gekko. Even Drinky Crow, of all people, believed at one point that it would at least be able to match 360 launch titles in SD.

Yeah, originally it was supposed to be using either a 2.5GHz G5 or a dual-core 1.8GHz G5, with, I think, 184 or 256MB of RAM? That would have been enough to render PS360 games in 480p. I wasn't on GAF at the time, but then we found out it was an overclocked GCN with 3 times the RAM.

And of course there were the stories of NURBS and some kind of mapping. There were also stories about 3 GPUs for holographic gaming, with each core required for a different axis. Fun times.
 

Boss Man

Member
To be fair to them, HD hadn't exactly taken off back then. I'm betting those people hadn't even seen HD pictures before and didn't really know what they were saying.
Well, I think we understand that they didn't know what they were saying. It's how most people operate these days anyway.
 
Well, I think we understand that they didn't know what they were saying. It's how most people operate these days anyway.

At the time HDTV uptake wasn't that high, and it is understandable that Nintendo missed the massive explosion in HDTV uptake in the US, since it came shortly after, alongside the housing boom. Nintendo was, I think, expecting normal new-electronics uptake for HDTV, which would mean they'd have until 2010-11 to release a real HD console. That doesn't explain or excuse the fact that, even after the HDTV boom was evident, Nintendo didn't have an HD console until 2012, though.
 
That was before it was common knowledge that Wii was literally using overclocked versions of Flipper and Gekko. Even Drinky Crow, of all people, believed at one point that it would at least be able to match 360 launch titles in SD.

I still think that was one of the funniest things ever.
 
I am just coming up to speed on the Wii U...

Can someone please help me with something?

I don't know if this is the right thread for this, but I am giving it a shot here.

I was at a WalMart where they had a Wii U demo kiosk, and they were running a video demo of Super Mario Brothers Wii U. The demo made a point to show me that there are points where you need to pull out the stylus and control parts of the interaction from the pad with the stylus, basically activating invisible platforms, onto which you jump. I think.

It is the biggest turn-off I have seen, and I am so primed for games like this. I am big into platformers, big into Nintendo, big on new stuff. That just seemed like, holy fuck, so not fun. Styluses in general seem to be not fun. To me.

Can someone talk me off the ledge on this? If they think that pulling out that stylus is going to be fun, I have to say that I disagree. Is this a big part of the Wii U's appeal, or supposed appeal?
 
I am just coming up to speed on the Wii U...

Can someone please help me with something?

I don't know if this is the right thread for this, but I am giving it a shot here.

I was at a WalMart where they had a Wii U demo kiosk, and they were running a video demo of Super Mario Brothers Wii U. The demo made a point to show me that there are points where you need to pull out the stylus and control parts of the interaction from the pad with the stylus, basically activating invisible platforms, onto which you jump. I think.

It is the biggest turn-off I have seen, and I am so primed for games like this. I am big into platformers, big into Nintendo, big on new stuff. That just seemed like, holy fuck, so not fun. Styluses in general seem to be not fun. To me.

Can someone talk me off the ledge on this? If they think that pulling out that stylus is going to be fun, I have to say that I disagree. Is this a big part of the Wii U's appeal, or supposed appeal?


You play this game like every Mario platformer. But a second player can use the GamePad to create platforms you can use to get to hidden areas, or he can make a platform to save you when you're about to fall into a pit, etc. It's not for single-player. In single-player you play it as a normal platformer.

Btw, the player with the pad can do more than just create platforms. He can stun enemies, etc.
 
Anybody have any new theories on what the GPU is based off of this time? Certain websites are saying a "Radeon HD 5670-based GPU" (109mm² die size).
 
You play this game like every Mario platformer. But a second player can use the GamePad to create platforms you can use to get to hidden areas, or he can make a platform to save you when you're about to fall into a pit, etc. It's not for single-player. In single-player you play it as a normal platformer.

Btw, the player with the pad can do more than just create platforms. He can stun enemies, etc.

...not to mention you don't need to pull out the stylus - as far as I can tell, it's entirely possible to do as you would with the DS/3DS and tap with a finger.
 
I'm a bit late here, but it seems like the OP is not updated. What are the final specs? Is there a breakdown somewhere?

We don't have final specs because the hardware is mostly custom. Just being able to physically see the chips won't tell us what the features are. IBM/Nintendo/AMD aren't going to have specs for custom hardware listed on a website for us to check.
 
Anybody have any new theories on what the GPU is based off of this time? Certain websites are saying a "Radeon HD 5670-based GPU" (109mm² die size).

Bogus. Nothing has changed. It's still a custom Nintendo part using an R700 base. Shader count unknown. One good point that I read today is that Nintendo/Renesas may have included some redundant logic (extra SPUs) to account for yields. So if there were 320 usable SPUs, in actuality there might be 400 on the die, just in case some of them didn't make it through the manufacturing process.
 

z0m3le

Banned
Bogus. Nothing has changed. It's still a custom Nintendo part using an R700 base. Shader count unknown. One good point that I read today is that Nintendo/Renesas may have included some redundant logic (extra SPUs) to account for yields. So if there were 320 usable SPUs, in actuality there might be 400 on the die, just in case some of them didn't make it through the manufacturing process.

That seems like a whole lot... 20% is pretty big, especially if it's 40nm, a very well-developed process.
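For illustration, here is that redundancy math written out as a quick sketch; both SPU counts are the thread's speculation, not confirmed specs:

```python
# Hypothetical yield-redundancy math for the rumored figures above.
physical_spus = 400  # rumored total on the die, including spares
usable_spus = 320    # rumored count enabled in shipping consoles

spares = physical_spus - usable_spus
print(f"{spares} spare SPUs = {spares / physical_spus:.0%} of the shader array")
# -> 80 spare SPUs = 20% of the shader array
```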
 
Bogus. Nothing has changed. It's still a custom Nintendo part using an R700 base. Shader count unknown. One good point that I read today is that Nintendo/Renesas may have included some redundant logic (extra SPUs) to account for yields. So if there were 320 usable SPUs, in actuality there might be 400 on the die, just in case some of them didn't make it through the manufacturing process.

Yeah, I thought that the "5670-based" GPU thing was a bit off. I still think that it's based on the Mobility Radeon HD 4830 and clocked at 533MHz.
 

z0m3le

Banned
Yeah, I thought that the "5670-based" GPU thing was a bit off. I still think that it's based on the Mobility Radeon HD 4830 and clocked at 533MHz.

That would be 640 shaders × 533MHz / 1000 × 2... 682 GFLOPS.
I'm sure that, being a mobile part, it's possible to hit that usage, but mobile parts are sort of special. I am unsure what they would use, but this is the best-case scenario, and I wouldn't keep my expectations this high.
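For reference, that figure is just the standard peak-FLOPS back-of-envelope: shaders × clock × 2, since each R700-class shader ALU can retire one multiply-add (2 FLOPs) per cycle. A minimal sketch using the poster's assumed 4830-class numbers:

```python
# Peak-GFLOPS back-of-envelope for an R700-class GPU. The shader count
# and clock are the poster's assumptions, not confirmed Wii U specs.
def gflops(shaders: int, clock_mhz: float) -> float:
    return shaders * clock_mhz * 2 / 1000  # 2 FLOPs (one MADD) per shader per clock

print(gflops(640, 533))  # -> 682.24, the ~682 GFLOPS figure quoted above
```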
 
That would be 640 shaders × 533MHz / 1000 × 2... 682 GFLOPS.
I'm sure that, being a mobile part, it's possible to hit that usage, but mobile parts are sort of special. I am unsure what they would use, but this is the best-case scenario, and I wouldn't keep my expectations this high.

There are apparently insiders who have said that the GPU is "not even close" to 600 GFLOPS. Combine that with how the R700 architecture scales, the 1.5x 360 rumors, and what would be the overall balance given the known components, and it is very safe to say that we are looking at 320-400 SPUs. It sucks that there have been so many BS rumors out there getting people's hopes up (including my own), but it's still a cool little system with a lot of potential for great games and graphics.
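To put rough numbers on that: running the same shaders × clock × 2 estimate over the 320-400 SPU range at a few plausible clocks shows why 600 GFLOPS looks out of reach. The clocks here are round-number assumptions, not leaks:

```python
# Same peak-FLOPS arithmetic as earlier in the thread. SPU counts are the
# 320-400 range argued above; the clocks are hypothetical round numbers.
def gflops(shaders, clock_mhz):
    return shaders * clock_mhz * 2 / 1000

for spus in (320, 400):
    for mhz in (500, 550, 600):
        print(f"{spus} SPUs @ {mhz} MHz -> {gflops(spus, mhz):.0f} GFLOPS")
# Every combination lands between 320 and 480 GFLOPS, well short of 600.
```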
 

DonMigs85

Member
There are apparently insiders who have said that the GPU is "not even close" to 600 GFLOPS. Combine that with how the R700 architecture scales, the 1.5x 360 rumors, and what would be the overall balance given the known components, and it is very safe to say that we are looking at 320-400 SPUs. It sucks that there have been so many BS rumors out there getting people's hopes up (including my own), but it's still a cool little system with a lot of potential for great games and graphics.

So basically a tweaked Radeon 4670/4650 most likely.
 
There are apparently insiders who have said that the GPU is "not even close" to 600 GFLOPS. Combine that with how the R700 architecture scales, the 1.5x 360 rumors, and what would be the overall balance given the known components, and it is very safe to say that we are looking at 320-400 SPUs. It sucks that there have been so many BS rumors out there getting people's hopes up (including my own), but it's still a cool little system with a lot of potential for great games and graphics.

Well, most of the "bullshit rumors" were wholly invented by the Nintendo faithful who wanted their hopes up, and shouted down anyone knowledgeable who came to this thread and tried to explain why they needed to moderate their expectations.
 
You play this game like every Mario platformer. But a second player can use the GamePad to create platforms you can use to get to hidden areas, or he can make a platform to save you when you're about to fall into a pit, etc. It's not for single-player. In single-player you play it as a normal platformer.

Btw, the player with the pad can do more than just create platforms. He can stun enemies, etc.
Hey - Thank you for this reply. I really appreciate it. That is super helpful.

So, to be clear... During Super Mario Galaxy (which I believe is the best video game ever made, in history, just so that you know where I am coming from) I really appreciated the wild variety in gameplay. From grabbing on to bulbs and snapping your way along, to holding the Wiimote upright while rolling on a ball. God damn. I am not opposed to new means of control. I mean... Galaxy gaaallaxx........ ammmmmmmmmm..... gaaaah..... Nintendo EAD team A-level shit right there. Amazing what that team did. Art, in my opinion. I went absolutely fucking ga-ga over that game.

I am just trying to say what a fan I am coming into this as...

But when I saw the video with them whipping out that stylus, I said to myself, you have got to be kidding me. The bit in the Isaacson Steve Jobs biography about his hatred of styluses was never shown more clearly dead-on. I almost immediately put that huge controller down, thinking to myself, wow, an EAD-made game with a super poor decision like that. Then I kind of sulked for a bit, because it had a huge boner-killing effect on me and my desire for a Wii U in general. I am trying to get it going again, but man... that stylus thing looked like a huge fucking pain in the coconuts.

I love my 3DS XL as much as the next dork, but there is a time & place for everything. Stylus appeal ends there for me. It has extremely, extremely limited appeal for me in a touch-screen-everything 2012 world. I have little kids. That stylus is going to wind up in a couch cushion or in one of my kids' eyes within 20 seconds of the unboxing. I think that was a horrible decision on the part of a company that wants to be Apple so badly it can taste it.

I really can't think of why they went for that over touch. I mean, that is literally stopping me in my tracks and making me have to post here and get course-corrected by people more knowledgeable than me.
 
There are apparently insiders who have said that the GPU is "not even close" to 600 GFLOPS. Combine that with how the R700 architecture scales, the 1.5x 360 rumors, and what would be the overall balance given the known components, and it is very safe to say that we are looking at 320-400 SPUs. It sucks that there have been so many BS rumors out there getting people's hopes up (including my own), but it's still a cool little system with a lot of potential for great games and graphics.

Have any insiders said it isn't even close to 600 GFLOPS?
 
Wii had a totally new rendering method using NURBS which would make it blow the other consoles out of the water.

Iwata's famous quote, "When you see Wii graphics, you will say 'wow!'", made the actual reveal pretty funny.
Iwata's "We want you to say 'Wow, that certainly is something.'" quote from that 2012 E3 Nintendo Direct is similarly funny in the Wii U's context.
 
So, even with a weak CPU, will the GPU still generate some games that look a step up from today's home consoles? Or what?

Could something like Portal 2 be possible on Wii U, or would the CPU not be able to cope?
 
So, even with a weak CPU, will the GPU still generate some games that look a step up from today's home consoles? Or what?

Could something like Portal 2 be possible on Wii U, or would the CPU not be able to cope?
Retro's art direction alone will probably elevate their Wii U software above most games this gen. The only PS3/360/PC developers I'd rank around them in terms of truly impressive visuals (ignoring the obvious technical limitations Retro were under developing on the Wii) are Naughty Dog, Valve and 343 Industries (who have ex-Retro people on their art team anyway, so...). Good-Feel and EAD Tokyo aren't slouches in the visuals department either. The Wii U power issue is annoying mostly because it seems like Nintendo truly didn't care about 3rd-party needs again, after insisting otherwise leading up to release.

I can't see why Portal 2 would be impossible on the Wii U after some modifications, though with the weaker CPU some of the prettier flourishes would probably have to be toned down. It's a shame, since the tablet would've come in handy for the level editor.
 
Well, most of the "bullshit rumors" were wholly invented by the Nintendo faithful who wanted their hopes up, and shouted down anyone knowledgeable who came to this thread and tried to explain why they needed to moderate their expectations.
The unfortunate part was that those with moderate expectations were still shooting too high. I'm one of them, as was Fourth Storm (which he just admitted).

Most expectations on the forum at large were "better than PS3/360." Even Durante fell prey to lofty expectations. But at the time, few aside from the specialguys and Van Owens seriously thought it would be in any way appreciably weaker.

When two or three posters are the most notable detractors, especially when they lack any form of tact, they tend to be marginalized. And in some ways they ended up being closer to reality than those who, as we can see, were arguing with pure intent.
 

beril

Member
So, even with a weak CPU, will the GPU still generate some games that look a step up from today's home consoles? Or what?

Could something like Portal 2 be possible on Wii U, or would the CPU not be able to cope?

I can't remember anything that should be remotely CPU-intensive in Portal 2. There's usually just a handful of interactive objects per room. You could probably get it to work just fine on a PS2 with downgraded visuals.
 
I can't remember anything that should be remotely CPU-intensive in Portal 2. There's usually just a handful of interactive objects per room. You could probably get it to work just fine on a PS2 with downgraded visuals.

Wouldn't all the gel puzzles be fairly intensive though? Admittedly I don't have the most advanced rig, but my PC seemed to falter a tiny bit around those sections when I played through last year.
 
So basically a tweaked Radeon 4670/4650 most likely.

I suppose. But while there are a vocal few who will mockingly go "hur dur eDRAM," it truly does differentiate the Wii U's GPU from those cards a great deal. The fact that it's actually on-chip and fully read/write, unlike the 360's eDRAM, is huge.

Well, most of the "bullshit rumors" were wholly invented by the Nintendo faithful who wanted their hopes up, and shouted down anyone knowledgeable who came to this thread and tried to explain why they needed to moderate their expectations.

Let's break it down. Discounting bullshit like Wii U Daily, on GAF alone here's what I remember.

We had the initial 1 TFLOPS target rumor from a Japanese AMD source that seemed legit for a while, until it was revealed to be a mistranslation late in the game. It turned out the figure was sheer speculation.

Shockingly, IGN was fed a crock of shit in the early days which compared the GPU to a 4850, but somehow that still wouldn't be enough to achieve a noticeable difference in (presumably) multiplatform games. (If only...)

Then we had bgassassin's claim (and I think he linked it back to wsippel in one post, who's brought us some good stuff) that the early dev kits supposedly did contain a 4850, but with two disabled SIMD cores, so 640 shaders total. It was thought, with a die shrink certainly in the cards, that this was achievable given a moderate clock. In the end, we're looking at somewhere around half that amount. We were going back and forth as to whether GlobalFoundries or TSMC would manufacture the GPU and at what node (32nm? 28nm?), meanwhile neglecting to stare the obvious in the face: that Iwata had gone with his friends (he is an honorable man, after all) and utilized the insights of those who worked on GameCube and Wii at NEC, now Renesas.

It is possible that at one point Nintendo was targeting more performance. I believe dev kits have exceeded final silicon in certain aspects on past occasions. There was even an outside developer who said that in the early days the specs of the Wii U were in constant flux, and that at one point Nintendo wanted to support 4 controllers. I am sure, given that the case size was seemingly a priority, and given the rumors of overheating, that Nintendo did not settle on the upper boundary of those tests. Perhaps Iwata and company were willing to go with a higher-power GPU if it meant powering more GamePads (the system's defining feature), but when it turned out another custom WiFi module would need to be added, he backed off the whole thing. Who knows, really?

I don't feel like getting into the whole Ideaman debate. I've talked to him in chat and he came off as a nice guy. I believe he was proven to have solid info. Unfortunately, this was when everyone was breaking things down into 2x, 3x, etc. (ah shit, I forgot about the BS rumor about Wii U being like 5x 360 and the Durango like 6x... ugh, too late for that). As we know, those multipliers can be quite misleading. Yeah, it's 2x to 3x in some areas (cache, eDRAM, usable system RAM), but 1/2 in others (system RAM bandwidth, likely CPU clock speed and threads).
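A quick sketch of why a single multiplier misleads; the ratios below are just the post's rough figures restated, not measurements:

```python
# A single "Nx the 360" number hides wildly divergent per-component ratios.
# Illustrative values only, taken from the rough 2x-3x / 1/2x claims above.
ratios = {
    "usable system RAM": 2.0,
    "eDRAM": 3.0,
    "CPU cache": 3.0,
    "system RAM bandwidth": 0.5,
    "CPU clock speed/threads": 0.5,
}

for component, r in ratios.items():
    print(f"{component}: {r}x the 360")
# Which "multiplier" you observe depends entirely on which of these a
# given workload leans on, so no single Nx figure summarizes the machine.
```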

Finally, there was a guy on IRC around E3 who was going around saying 800 GFLOPS or something like that. I was online the first time the guy with that handle let go a few tidbits prior to Nintendo's E3 presentation, and they seemed legit in the end. Thing is, any person can then go on and use that handle in the future, so there's no way of telling if you're talking to the same person as before!

So no, I don't think the Nintendo faithful just fabricated rumors out of whole cloth to keep their own hopes up. Perhaps they chose to believe optimistic sources, in part no doubt because many of the negative sources did seem rather dismissive and snide towards Nintendo (not Arkam, btw; I just reread some of those pages, and he came off to me as very blunt but straightforward, and somewhat disappointed himself). There was certainly a lot of BS going around. Some were misled, but many simply chose to believe certain somewhat-believable sources over other somewhat-believable sources, and that's really all there is to it.
 

v1oz

Member
They completely removed the new physics system from the Wii U version of Madden. I'm thinking CPU concerns might be the real reason they didn't bother with that feature this season. It would probably have taken them too long to optimize the code for the slower CPU.
 

CronoShot

Member
They completely removed the new physics system from the Wii U version of Madden. I'm thinking CPU concerns might be the real reason they didn't bother with that feature this season. It would probably have taken them too long to optimize the code for the slower CPU.

I'm not usually one to go straight for the "developers are just lazy" explanation, and there's probably more to it than that, but I think EA just figured they wouldn't bother this year and would do it next year.
 

guek

Banned
Finally, there was a guy on IRC around E3 who was going around saying 800 GFLOPS or something like that. I was online the first time the guy with that handle let go a few tidbits prior to Nintendo's E3 presentation, and they seemed legit in the end. Thing is, any person can then go on and use that handle in the future, so there's no way of telling if you're talking to the same person as before!

Just to interject, that was Japesy, and he does actually work for NoA (as a low-level employee). I remember when this happened. He later clarified that he wasn't trying to make a specific claim and was merely speculating. He also came into the IRC and pointed out seeing 1GB for games on a spec sheet a week or so before it got leaked elsewhere.

As someone who's kept up with the whole spec debate since the beginning (with the exception of the last 1-2 weeks or so), the RAM/CPU issues have certainly been surprising. At the onset, I held out hope for a GPU close to the 4770, but that quickly faded to a more conservative outlook of approximately doubling the 360 in performance. Now that that's in question, I'm extremely curious to see how this thing is actually put together. I'm surprised we haven't gotten a BoM breakdown yet.
 
This is what I get for lurking to see launch impressions and all heck breaks loose. I was fine not knowing what had been going on, but I've seen myself mentioned too many times. And to those who have done so, it's called trying to focus on getting oneself in order. Console discussion is far from being that important and I didn't "run away" because of actual details coming out. Some of you need to grow up if you really think that.

Nothing about what I've said/felt in the past has changed with recent info and I'm not changing from that view now. My optimism hasn't changed at all. I'll explain why like this.

x(a+b)=c

a = hardware info given to devs
b = hardware info not given to devs
x = Dev familiarity with the hardware
c = Wii U performance

a and b are constants. Ignoring small tweaks, these were not going to see changes. Over the last year and a half I would get info dealing with a and c, and during that time in the WUSTs I was trying to take that info and figure out what b would be. My optimism was based on what I would hear in regards to c. If some were saying this is what we were seeing with c, then they were doing that based on a and b. Just because we're now finding actual info on b doesn't magically change c. Let's say you were blindfolded, had a plate of food placed in front of you, and said it was one of the best things you've ever tasted. However, when you take the blindfold off, you find out it was made with worms. Even once you find that out, it doesn't take away from the fact that it's one of the best things you've ever tasted. Likewise, just because we are finding out details Nintendo didn't give out doesn't degrade what I've been saying, because the performance still came from the specs we didn't know about before. And at the same time, the performance will improve as devs learn fully what they can and can't do with the hardware.

Now I'll add this:

x(a+b)/y=c

y = 3rd party/multi-platform development

In this case the less dedication to tailoring games to Wii U hardware, the higher y would be. And in turn the higher y is, the lower c will be. Exclusives would essentially be y = 1.
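If it helps, here is the toy model written out as code. This is purely illustrative; none of these variables are measurable quantities, and the sample inputs are invented:

```python
# The poster's informal model, taken literally. All inputs are fuzzy,
# qualitative stand-ins, not real measurements.
def perceived_performance(a: float, b: float, x: float, y: float = 1.0) -> float:
    """c = x * (a + b) / y

    a: hardware info given to devs
    b: hardware info not given to devs
    x: dev familiarity with the hardware
    y: multiplatform-port penalty (y = 1 for exclusives)
    """
    return x * (a + b) / y

# Toy comparison on identical hardware (a + b fixed):
print(perceived_performance(a=1.0, b=1.0, x=0.9, y=1.0))  # exclusive   -> 1.8
print(perceived_performance(a=1.0, b=1.0, x=0.5, y=1.5))  # rushed port -> ~0.67
# Learning more about b after the fact changes nothing: a + b was fixed
# all along; only x (familiarity) and y (porting effort) move c over time.
```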

I would estimate 60-70% of the info I got was based on development other than 3rd-party ports. That's why I can be as optimistic as I have been. And since that info is obviously based on the hardware that we're finally learning about, why in the world would I feel any different than before?

Because I know how my brain works, I have a need to satisfy any curiosity I may have, and in turn I will spend hours trying to do so. The same thing happened digging up that latency info. And that's time I definitely need to dedicate elsewhere. I'll hang around for a day in case anyone wants to say/ask something to me directly, but after that, as I told Azak, I'm going back in my hole.


Side note: I was skimming and saw an issue with the latency for XDR. I found this in an old B3D post.

http://www.xbitlabs.com/news/memory/display/20031225163917.html

Toshiba’s XDR memory chips are configured as 4Mb word x 8 banks x 16 bits, are available with 40ns, 50ns and 60ns cycle time and 27ns or 35ns latency and have 1.8V VDD.
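As a sanity check on that quoted configuration, the per-chip capacity works out as follows (my arithmetic, not from the article):

```python
# "4Mb word x 8 banks x 16 bits" -> per-chip capacity
words, banks, width_bits = 4 * 2**20, 8, 16
total_bits = words * banks * width_bits          # 536,870,912 bits = 512 Mbit
print(total_bits // (8 * 2**20), "MB per chip")  # -> 64 MB
```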
 