
Rumor: Wii U final specs

i-Lo

Member
I'll be laughing my ass off if Sony or MS expect me to buy their next consoles for $400-$500.

$299 is my limit. $350 if they bundle all sorts of goodies plus a game.

You aren't the only person they are targeting to sell to. So yea, you can wait.

Anyway, this thread has been spiralling down the drain a bit now. In about a month's time, gaming journalism outlets will get their hands on the Wii U to open it up and post pictures of the chips. I am certain they already possess a few Wii Us but are under NDA not to reveal the tech specs.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
In about a month's time, gaming journalism outlets will get their hands on the Wii U to open it up and post pictures of the chips.
I'm sure that will be very exciting.
 

Meelow

Banned
Just how awesome WiiU is and then not so awesome followed by " no no it's awesum, you know nuttin' " bla bla

It's all about the GPU and its potential etc etc.

Ahh so pretty much people speculating if it uses the rumored GPU or not.
 
You aren't the only person they are targeting to sell to. So yea, you can wait.

Anyway, this thread has been spiralling down the drain a bit now. In about a month's time, gaming journalism outlets will get their hands on the Wii U to open it up and post pictures of the chips. I am certain they already possess a few Wii Us but are under NDA not to reveal the tech specs.
Kind of OT, but in the general train of thought: did we ever find out the full specs of the Hollywood GPU? I'm just wondering if a teardown of the Wii U will be enough to settle the speculation, or if a full technical leak from Nintendo or AMD would be the only way to settle things.
 
First, I'll get out of the way the point I disagreed with: I don't believe it's easy or clear to see what anyone means by "DX11 features but not DX11 capable," or "SM5-like features, but only used sparingly"... How does that make sense to anyone? The changes from DX10.1 to DX11 are very small. It's flat out simply about moving tessellation into the unified shaders, plus GPGPU targeting that same advancement. There is no long list of DX11 features that they might or might not have; it's literally those two things beyond DX10.1, but I'm all ears if you have a list. SM5, well, either the chip has it or it doesn't; it's a specification that Nintendo would likely not even mess with.

I do think our disagreements are quite minor. Let me try to clarify my position, however. Firstly, no, I don't have a list of all the changes they made from DX 10.1 to DX 11. Nor do I care to research it at the moment. Besides, those two advances do seem minor. We both agree (and I take Arkam's word) that the GPU is capable of beyond SM4. In general, I expect that Nintendo's customizations took a different path to the same end than AMD's more recent cards did. So tessellation, for instance, could end up running with lower/equal/higher efficiency than a "proper" DX 11 GPU in any given scenario. We don't know yet. When I said some features might be used more sparingly, that's just common sense when thinking about the upcoming generation. The next Xbox and PS will be able to get away with more effects at once while maintaining a playable framerate. Whether Wii U is SM4 level or SM5, there will be a performance gap. It's just that with the latter, we don't know yet how efficient Nintendo's modifications have been compared to the standard, or what they've deemed important enough to include. I don't think it will be SM5 in everything but name. I expect some things will be missing, but I also doubt anybody will really care in the end outside of a few techies like you and me. ;)

Now to move on, it seems like I'm rocking the boat or whatever, but how can that be when you, BG and I are plainly saying the same thing? BG is right to ask where he and I disagree, because we don't. He said "No" to there being a different GPU than what was in early dev kits, but I think I was mostly misunderstood: I simply meant that final silicon was put into the final dev kits, replacing the R700 card originally used. Beyond that, I said it's possible that that chip now supports DX11 (and for OUR purposes, any changes to improve GPGPU and tessellation would mean this).

I think you're being a bit too liberal w/ your usage of "DX11," but I see what you're saying.

As for saying "we don't know" if Wii U's GPU is 2-3X Xenos, I think it's plain that it falls in that line, BG basically agrees with me and I don't see you disagreeing with him. We know that Wii U can render 360 graphics and beyond that, we also know that it can also render a second scene from the game onto the pad (batman did this during E3) we know that Batman isn't exactly pushing the hardware because it's an unoptimized port.

Now even 28nm is being taken seriously because someone else said that it's possible. The reason I don't post here much anymore is exactly because of this: reasonable discussion is halted, if not by trolls, then by people who just don't read or understand what others are saying.

I'd love to know where either of you disagree with me about Wii U specs. I'd love to have that debate, instead of one about what is "known" and what is "guessed" at... As for the spec leak in the OP, it's clearly second-hand information from Arkam (he has said as much), so let's try not to treat it like a fact sheet.

I agree that if we're talking FLOPS, 2-3x is about what I expect. I made the point because you stated something as fact when that is not clear (other sources say 1.5x). Meanwhile, the specs leak (while not telling the whole story) has been confirmed by multiple parties as being accurate. And that tells us what they were targeting in terms of features and where they started from. And I may as well use this to address your point below, since I want to get off this computer and take advantage of the beautiful weather outside. But those target specs are what led me to believe that perhaps the e6760 is not such a great comparison. I don't expect the ALU count to be the same (although it may be; I'm hoping for 640), I expect the clocks to be lower, and remember: it's all about that eDRAM.
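To put rough numbers on the 1.5x vs 2-3x talk, here's a back-of-the-envelope sketch. Only Xenos is a known quantity; the ALU counts and clocks on the Wii U side are guesses pulled from this thread, nothing confirmed.

Code:
# Rough GFLOPS comparison. Xenos is known; the R700-style configs are speculation.
xenos_gflops = 48 * 500e6 * 10 / 1e9   # 48 ALUs @ 500 MHz, vec4+scalar MADD = 10 flops/cycle -> 240

def r700_gflops(alus, clock_mhz):
    # R700-style counting: each stream processor does a MADD = 2 flops per cycle
    return alus * 2 * clock_mhz * 1e6 / 1e9

for alus, clock_mhz in [(320, 550), (400, 500), (640, 550)]:
    g = r700_gflops(alus, clock_mhz)
    print(f"{alus} ALUs @ {clock_mhz} MHz: {g:.0f} GFLOPS ({g / xenos_gflops:.1f}x Xenos)")

So 320 ALUs at those clocks is the 1.5x figure and 640 is pushing 3x; the two camps aren't even that far apart, it mostly comes down to ALU count and clock.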

BTW: I completely disagree with people trying to crush the E6760 rumor. Most people know it's a custom part, so of course it's not literally an E6760, but it seems we all agree that it's fairly close to the target performance, and that's all people really care about. Until someone can tell me a DX11 feature they don't expect in GPU7 (out of the two choices there really are to pick from), it makes no difference whether people look at an E6760 or an imaginary custom card with the same performance. GPU7 won't use DX11, so I hardly see the point in telling people it's not an E6760, because it would have all the same characteristics of whatever GPU7 turns out to be, EXCEPT Nintendo's customizations of course.
 

ohlawd

Member
"words of the prophet"

CaptNfantasy... you're my favorite poster in GAF right now. I don't know how you manage to pull those words together with such clarity. I am convinced that you are a wizard and that you have a magic string that pulls us all together, to face the darkness and the heathens of the underworld.
 
CaptNfantasy... you're my favorite poster in GAF right now. I don't know how you manage to pull those words together with such clarity. I am convinced that you are a wizard and that you have a magic string that pulls us all together, to face the darkness and the heathens of the underworld.

Ok dude, let me stop you right there, I'm the only real Sorcerer around here.
 
Yeah. I hate all of you because Nintendo chose not to make Wii U with two 7970s in Crossfire.

Pffff.....forget that. I was expecting Tri-SLI GTX 680s with liquid cooling.

I think we can see that designing, producing and selling a console in this market is really becoming extremely difficult. IMO Nintendo is making a wise choice in having the Wii U be cost effective yet extremely capable with graphical effects and features.

I sure as hell don't want to have the next generation going the way of streaming games to your TV like some kind of Netflix Instant Streaming gaming subscription. Depending on bandwidth just to see a clear image of your game at all times would really be lame.....
 

wsippel

Banned
Fair enough, and I want to make it clear I'm not saying it's impossible, just that there are caveats.

What concerns me more is the eDRAM because volume production typically lags the latest node.
I would think the eDRAM is actually either part of the CPU die or a separate die. Either way, it's most likely IBM-made 45nm SOI. I mean, there was this interesting statement by Bob Patti, CTO of Tezzaron Semiconductors, a few months ago:

“Nintendo’s going to build their next-generation box,” said Patti. “They get their graphics processor from TSMC and their game processor from IBM. They are going to stack them together. Do you think IBM’s going to be real excited about sending their process manifest—their backup and materials—to TSMC? Or maybe TSMC will send it IBM. Neither of those is ever going to happen. Historically they do share this with OSATs, or at least some material information. And they’ve shared it typically with us because I’m only a back-end fab. I can keep a secret. I’m not going to tell their competition. There’s at least a level of comfort in dealing with a third party that isn’t a competitor.”
http://semimd.com/blog/2012/07/31/the-changing-role-of-the-osat/
 

Mlatador

Banned
The Wii CPU is basically a modified Gamecube CPU, and the Wii U CPU a modified Wii CPU with three cores.

Jesus, I just realized that would have been more than enough for Factor 5 (if they were still alive) to make PS4-equivalent-looking games on Wii U...

°_°!
 

PetrCobra

Member
The specs talk, while not entirely uninteresting, sometimes goes a bit too far. Current-gen-level graphics are tolerable in my eyes, so I don't see why it is a big issue for so many people that the Wii U is there (or somewhere slightly above). PS360 have been a joke in the graphical department for years, with people calling them the "HD" twins making the joke even funnier, and still their owners didn't sell them or stop using them; they played games on them and didn't care about all the progress that the GPU industry might have made in the meantime. Point is, even today the PS360 visuals don't limit the actual experience too much with lack of detail or anything like that (you know, when not enough information fits on the screen or you can't tell if it's a rock or an enemy on the horizon). I play mostly on Wii, so I know that it can really hurt to have those limitations accented by not having enough power. Wii U could probably cost 200 USD if it weren't for the new controller, or it could have some PC-rivaling tech inside for the same price, but I'm happy that it didn't turn out that way.
 

OryoN

Member
I asked a couple of questions about 15 pages ago (see how patient I am?)...

But got no answer/response. I know this thread can move pretty fast at times, but I'm getting the impression that these roller-coaster debates are consuming you guys. So much so that sincere questions are being overlooked/disregarded. I'd love to be proven wrong though...

Anyway, if anyone cares to help me... my questions are in the link below:

http://www.neogaf.com/forum/showthread.php?p=42365465&highlight=#post42365465

*fingers crossed*
 

wsippel

Banned
I asked a couple of questions about 15 pages ago (see how patient I am?)...

But got no answer/response. I know this thread can move pretty fast at times, but I'm getting the impression that these roller-coaster debates are consuming you guys. So much so that sincere questions are being overlooked/disregarded. I'd love to be proven wrong though...

Anyway, if anyone cares to help me... my questions are in the link below:

http://www.neogaf.com/forum/showthread.php?p=42365465&highlight=#post42365465

*fingers crossed*
We don't know what the eDRAM is or what purpose it has. It's impossible to answer your question with the info we have. Could be anywhere from a few dozen to several hundred GB/s. To put things into perspective: POWER7 has a regular memory bandwidth of 180GB/s per chip - and that's just the DDR3 RAM, not the eDRAM.
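For a feel of why the range is so wide: on-die bandwidth is basically bus width times clock, and an embedded pool can use an absurdly wide bus. The configurations below are purely illustrative; the 4096-bit / 500 MHz case just happens to land on the 256 GB/s figure usually quoted for the 360's internal eDRAM path.

Code:
def bandwidth_gb_s(bus_width_bits, clock_mhz, transfers_per_clock=1):
    # bandwidth = bytes per transfer * transfers per second
    return bus_width_bits / 8 * clock_mhz * 1e6 * transfers_per_clock / 1e9

print(bandwidth_gb_s(1024, 500))   # 64.0  GB/s -- already several times a typical DDR3 main pool
print(bandwidth_gb_s(4096, 500))   # 256.0 GB/s -- the figure usually quoted for the 360's eDRAM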
 
I asked a couple of questions about 15 pages ago (see how patient I am?)...

But got no answer/response. I know this thread can move pretty fast at times, but I'm getting the impression that these roller-coaster debates are consuming you guys. So much so that sincere questions are being overlooked/disregarded. I'd love to be proven wrong though...

Anyway, if anyone cares to help me... my questions are in the link below:

http://www.neogaf.com/forum/showthread.php?p=42365465&highlight=#post42365465

*fingers crossed*
I know nothing of edram bandwidths.. but bump quoting just for you <3
 

SSM25

Member
How does backward compatibility work, given what I think is a very different graphics architecture than the Wii's?
 

OryoN

Member
We don't know what the eDRAM is or what purpose it has. It's impossible to answer your question with the info we have. Could be anywhere from a few dozen to several hundred GB/s. To put things into perspective: POWER7 has a regular memory bandwidth of 180GB/s per chip - and that's just the DDR3 RAM, not the eDRAM.

Ok, gotcha! I'm hoping they use it in a similar way GameCube did... not just for a framebuffer? Hmmm, I wonder if they could also use it in a way that benefits general purpose processing significantly (similar to CPU cache?).

I know nothing of edram bandwidths.. but bump quoting just for you <3

Cheers mate! :D
 

ozfunghi

Member
How does backward compatibility work, given what I think is a very different graphics architecture than the Wii's?

BC is done entirely in hardware, so that means that all the WiiU components have to be able to "do" the same stuff the Wii components did.

Random Question: What is eDRAM and what will the Wii U most likely use it for?

Embedded DRAM; it's embedded on the CPU/GPU die. It depends. It can be accessed much faster and can provide impressive bandwidth. It could be used for data that has to be loaded very frequently, to make up for slower main RAM. It could also be used for AA, but I'm not a techie, so people feel free to add or correct.
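To put some numbers on the framebuffer/AA angle (just arithmetic, nothing Wii U-specific): a 720p target with 32-bit colour plus 32-bit depth/stencil is roughly 7 MB, and MSAA multiplies that, which is exactly why the 360's 10 MB eDRAM needs tiling once AA is on.

Code:
width, height = 1280, 720
bytes_per_pixel = 4 + 4   # RGBA8 colour + 24/8 depth-stencil
for msaa in (1, 2, 4):
    mb = width * height * bytes_per_pixel * msaa / (1024 ** 2)
    print(f"{msaa}x MSAA: {mb:.1f} MB")   # ~7.0, ~14.1, ~28.1 MB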
 

wsippel

Banned
Ok, gotcha! I'm hoping they use it in a similar way GameCube did... not just for a framebuffer? Hmmm, I wonder if they could also use it in a way that benefits general purpose processing significantly (similar to CPU cache?).
I think it's neither a CPU cache nor a framebuffer. I have a theory, but it's probably all kinds of wrong (and stupid), as that stuff isn't really my area of expertise. We have quite a few people on GAF who know a whole lot more about this than I do, and they'll probably laugh at me, but here it goes: I think the eDRAM is meant to be a fast, low latency scratchpad for both the CPU and the GPU, (mostly) meant to make GPGPU operations more efficient. A buffer for high speed data exchange between CPU and GPU to prevent stalls, basically.
 
I'll be laughing my ass off if Sony or MS expect me to buy their next consoles for $400-$500.

$299 is my limit. $350 if they bundle all sorts of goodies plus a game.

I agree, if either Sony or Microsoft have their next gen efforts retailing for more than $400 they're going to struggle. Especially when you consider that Nintendo will have system selling first party titles such as Mario Kart, a 3D Mario, Retro's new game and Monolith Soft's new games waiting in the wings in an attempt to scupper their launches.

There's always a slim chance that this time next year the economy worldwide might be in a better state, but I certainly wouldn't expect it myself. If they release their consoles at a price point higher than $400, I can't see them flying off shelves no matter what sort of software they have available at launch.
 

i-Lo

Member
I agree, if either Sony or Microsoft have their next gen efforts retailing for more than $400 they're going to struggle. Especially when you consider that Nintendo will have system selling first party titles such as Mario Kart, a 3D Mario, Retro's new game and Monolith Soft's new games waiting in the wings in an attempt to scupper their launches.

There's always a slim chance that this time next year the economy worldwide might be in a better state, but I certainly wouldn't expect it myself. If they release their consoles at a price point higher than $400, I can't see them flying off shelves no matter what sort of software they have available at launch.


Yeah, that worked really well for Sony, right, when they expected people to get 2nd jobs to fund a PS3... those people can't sustain a console's software sales, which is what Sony hinges its hopes on when it sells a console at a loss. ;)

Fact is, there exists a price barrier for consoles... that barrier seems to be $350 at the top end, which is begrudgingly acceptable. The real expectation and hope of the consumer, though, is $299.

Stick to WiiU
 

F#A#Oo

Banned
You aren't the only person they are targeting to sell to. So yea, you can wait.

Yeah, that worked really well for Sony, right, when they expected people to get 2nd jobs to fund a PS3... those people can't sustain a console's software sales, which is what Sony hinges its hopes on when it sells a console at a loss. ;)

Fact is, there exists a price barrier for consoles... that barrier seems to be $350 at the top end, which is begrudgingly acceptable. The real expectation and hope of the consumer, though, is $299.
 

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
You're correct of course, but I would think Nintendo and TSMC signed the contracts at least a year ago, probably closer to two years. And TSMC expected much higher volumes and better yields for 28nm back then if I remember correctly. So it's certainly possible that Nintendo planned to go with 28nm and designed the TDP around that, and couldn't really change that anymore - which would explain the rumored manufacturing issues.
What and where are those rumors? I missed that.
 

ikioi

Banned
Hey bgassasin, appreciate if you could answer this one question.

In regards to the CPU, have your sources commented on its capabilities?

Previous insiders have commented that the Wii U's CPU is weak in comparison to the HD twins, and a few developers have also publicly made comments suggesting this is indeed the case. We recently had the Ninja Gaiden devs saying the Wii U version has fewer on-screen enemies than the PS3/360 versions due to its processor.

What is the likely cause of these sorts of comments:

Developers running unoptimised code on the Wii U

Developers failing to take advantage of GPGPU and other hardware that could perform these tasks or lighten CPU load

The Wii U's processor is indeed inferior in most regards to the Xenon and Cell processors.

Middleware and development tools not yet optimised enough for developers to take advantage of the CPU's power
 

OryoN

Member
I think it's neither a CPU cache nor a framebuffer. I have a theory, but it's probably all kinds of wrong (and stupid), as that stuff isn't really my area of expertise. We have quite a few people on GAF who know a whole lot more about this than I do, and they'll probably laugh at me, but here it goes: I think the eDRAM is meant to be a fast, low latency scratchpad for both the CPU and the GPU, (mostly) meant to make GPGPU operations more efficient. A buffer for high speed data exchange between CPU and GPU to prevent stalls, basically.

That's what I meant when I said:

"I wonder if they could also use it in a way that benefits general purpose processing significantly(similar to CPU cache?)."

I was referring to general purpose processing (on the GPU...similar to how a CPU uses cache). I know, I didn't make that extremely clear. Sorry.

It'll be really interesting if they did use it that way though. From what I understand, not even current high-end PC GPUs would have that ability, right?
 
Yeah, that worked really well for Sony, right, when they expected people to get 2nd jobs to fund a PS3... those people can't sustain a console's software sales, which is what Sony hinges its hopes on when it sells a console at a loss. ;)

Fact is, there exists a price barrier for consoles... that barrier seems to be $350 at the top end, which is begrudgingly acceptable. The real expectation and hope of the consumer, though, is $299.

That may be your personal price barrier, but that doesn't mean millions of others won't disagree with you. I personally think anyone is batshit to pay $350 for a Wii U, but in context that doesn't mean anything.
 

Daschysta

Member
Hey bgassasin, appreciate if you could answer this one question.

In regards to the CPU, have your sources commented on its capabilities?

Previous insiders have commented that the Wii U's CPU is weak in comparison to the HD twins, and a few developers have also publicly made comments suggesting this is indeed the case. We recently had the Ninja Gaiden devs saying the Wii U version has fewer on-screen enemies than the PS3/360 versions due to its processor.

What is the likely cause of these sorts of comments:

Developers running unoptimised code on the Wii U

Developers failing to take advantage of GPGPU and other hardware that could perform these tasks or lighten CPU load

The Wii U's processor is indeed inferior in most regards to the Xenon and Cell processors.

Middleware and development tools not yet optimised enough for developers to take advantage of the CPU's power

The Ninja Gaiden devs didn't say that; I believe it was one of the Koei teams doing Warriors Orochi. Those same devs also said they didn't understand the hardware and that the machine could produce much more beautiful graphics than the other consoles.
 
The Ninja Gaiden devs didn't say that; I believe it was one of the Koei teams doing Warriors Orochi. Those same devs also said they didn't understand the hardware and that the machine could produce much more beautiful graphics than the other consoles.

It should also be noted that both Namco and Koei, the two companies that publicly stated they were having trouble with the CPU, are Japanese developers and more than likely built their game engines for the PS3 first. The PS3 is very CPU-oriented, which is apparently the opposite of the Wii U.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
I think it's neither a CPU cache nor a framebuffer. I have a theory, but it's probably all kinds of wrong (and stupid), as that stuff isn't really my area of expertise. We have quite a few people on GAF who know a whole lot more about this than I do, and they'll probably laugh at me, but here it goes: I think the eDRAM is meant to be a fast, low latency scratchpad for both the CPU and the GPU, (mostly) meant to make GPGPU operations more efficient. A buffer for high speed data exchange between CPU and GPU to prevent stalls, basically.
Nothing to laugh at here. But you're missing one detail: if we agree that the main pool is DDR3, then it makes a lot of sense for the GPU to keep its fb (not necessarily in its entirety) in the edram pool. Historically, platforms which did not use any form of local fb but instead stored their fb directly in UMA space suffered from some serious memory contention issues (case in point: xbox).

Regardless, I think the edram will be used both as a GPU scratchpad as well as act as a low latency conduit between the CPU and GPU. That's why one of the architecture details I'm most curious about is how AMD solved any potential CPU/GPU contention issues in that pool.
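To put some crude numbers on the contention point: assuming 720p60, blended colour and depth read-modify-write (16 bytes per fragment) and an average overdraw of 3 (both pure assumptions), the framebuffer alone generates a couple of GB/s of traffic before a single texel is fetched. In a shared DDR3 pool, every one of those accesses fights the CPU for the bus; in edram, none of them do.

Code:
pixels = 1280 * 720
overdraw = 3              # assumed average fragments shaded per pixel
bytes_per_fragment = 16   # colour read+write + depth read+write, 4 bytes each
fps = 60
fb_gb_s = pixels * overdraw * bytes_per_fragment * fps / 1e9
print(f"~{fb_gb_s:.1f} GB/s of raw framebuffer traffic")   # ~2.7 GB/s before texturing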
 

ozfunghi

Member
Knowing that it is at least a modified R700, can we expect at least R800-level tessellator performance?

The leaked devkit spec sheet specifically mentioned a tessellator. Given the fact that Nintendo A/ is building a custom chip which was finished early 2012, B/ always provides very efficient hardware, and C/ is not likely to spend money on useless features... I would think the tessellator should actually be usable compared to the one in the 360.
 

F#A#Oo

Banned
That may be your personal price barrier, but that doesn't mean millions of others won't disagree with you. I personally think anyone is batshit to pay $350 for a Wii U, but in context that doesn't mean anything.

It is my personal barrier, yes... nor am I alone though... look at the history of console pricing...

360 $299
Wii $249
Xbox $299
PS2 $299
GCN $199
Dreamcast $199
N64 $199
PS1 $299
SNES $199
Megadrive $189
NES $199

The anomalies in history are PS3 at $599, Saturn $399, Neo Geo $649 and 3DO at $699...and they all bombed at launch.

There is a reason you don't go over $299... it's historical; a precedent has been set. To break the $299 barrier will take something special, and Sony tried by converging different technology, but it only worked for the PS2. The perceived value of a console has not shifted from generation to generation... and we'll continue to want our consoles at $299, even if it's the basic and gimped version.
 

User Tron

Member
Nothing to laugh at here. But you're missing one detail: if we agree that the main pool is DDR3, then it makes a lot of sense for the GPU to keep its fb (not necessarily in its entirety) in the edram pool. Historically, platforms which did not use any form of local fb but instead stored their fb directly in UMA space suffered from some serious memory contention issues (case in point: xbox).

Regardless, I think the edram will be used both as a GPU scratchpad as well as act as a low latency conduit between the CPU and GPU. That's why one of the architecture details I'm most curious about is how AMD solved any potential CPU/GPU contention issues in that pool.

TBH, anything else would be surprising. The main problem with PC GPGPU is that you have to allocate and copy between main memory and VRAM a lot. Sharing the edram would only be logical imho.
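To make that concrete with rough numbers (the buffer size and PCIe throughput below are just assumptions): shuffling even a modest working set across PCIe every frame eats a huge chunk of a 60 fps budget, which is exactly the stall a shared low-latency pool avoids.

Code:
frame_budget_ms = 1000 / 60   # ~16.7 ms per frame at 60 fps
pcie_gb_s = 6                 # assumed achievable PCIe 2.0 x16 throughput
buffer_mb = 64                # assumed CPU<->GPU working set copied each frame
one_way_ms = buffer_mb / 1024 / pcie_gb_s * 1000
print(f"one-way copy: {one_way_ms:.1f} ms of a {frame_budget_ms:.1f} ms frame")   # ~10.4 ms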
 

Septimius

Junior Member
In other words, it's not worth $50 for the new controller. I'm sure the console would've been $250/$300 if it weren't for the new controller.
 