
Rumor: Wii U final specs

Do 50GB discs matter? How many PS3 games used them... MGS4 is about the only one I can think of, and that's because it had hours of HD video and audio stored on it.
 
Sure... We know what's inside the Wii after all.

Well... it depends. With off-the-shelf parts (like RAM), it's easy to look up part numbers and get information on them. Custom parts, however, you can't tell much about just by looking at them unless that information is public for some reason.

For example, we know the 3DS processor is a custom ARM with a custom PICA200 soldered onto it, but we have no idea of its actual individual power draw or clock speed in MHz. The RAM on the 3DS, however, is a known and widely available 128MB chip whose part number made it very easy to find info on.

The wattage we'll mostly be able to figure out once the system is out and we start hooking it up to a wattage meter... but as far as individual parts go, we might run into the same problem on the GPU/CPU side.
 

Kenka

Member
Yeah, I wanted to point that out: the Wikipedia entry of the Hollywood GPU in the Wii makes no mention of DirectX support, number of SPUs, and such.



Since the Wii U has a GPGPU made by AMD, and based on what the OP has said, the following product (on the right) looks like a perfect candidate, right?

[image: 10b.jpg]


But if Nintendo uses this off-the-shelf product, then what did they spend their R&D budgets on these last few years?

576 GFLOPS would make it about a third as powerful as what the PS4 is supposed to be. That doesn't sound too bad, frankly.
 

ozfunghi

Member
Well... it depends. With off-the-shelf parts (like RAM), it's easy to look up part numbers and get information on them. Custom parts, however, you can't tell much about just by looking at them unless that information is public for some reason.

For example, we know the 3DS processor is a custom ARM with a custom PICA200 soldered onto it, but we have no idea of its actual individual power draw or clock speed in MHz. The RAM on the 3DS, however, is a known and widely available 128MB chip whose part number made it very easy to find info on.

The wattage we'll mostly be able to figure out once the system is out and we start hooking it up to a wattage meter... but as far as individual parts go, we might run into the same problem on the GPU/CPU side.

Not necessarily by cracking it open; the info on the Wii hardware was a leak from a dev, IIRC.


Yeah, I wanted to point that out: the Wikipedia entry of the Hollywood GPU in the Wii makes no mention of DirectX support, number of SPUs, and such.

The Wii doesn't have programmable shaders; it uses TEV. It has no DX support listed because it doesn't use DX at all. What you really want to know is its DX-equivalent feature level.


Since the Wii U has a GPGPU made by AMD, and based on what the OP has said, the following product (on the right) looks like a perfect candidate, right?

This is the GPU bgassassin used as a point of reference. But the Wii U's GPU would presumably not be based on that exact chip.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Yeah, I'm confused. I thought 75W was its max draw at the time, with 45W being typical. What "typical" means is anyone's guess. Is that while sitting at the menu, watching a movie, Miiversing, or full-on gaming?
Typical draw for a unit means the unit is "well" loaded, but not doing anything abnormal (like a power-on sequence, or a power virus, or any other pathological case). Typical for a device would be the sum of the typical draws of the units that get used during the typical use of the device. For something like the WiiU I'd guess 'device typical' is the sum of typical draws of the CPU + GPU + buses + edram + RAM + pad streaming (that includes pad radio).
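To put that in concrete numbers, here's a minimal sketch of the idea; every per-unit wattage below is a made-up placeholder, not an actual Wii U figure:

```python
# Sketch of "device typical = sum of the typical draws of the units in use".
# All wattages are illustrative guesses, NOT real Wii U numbers.
typical_draw_w = {
    "cpu": 8.0,
    "gpu": 15.0,
    "edram_and_buses": 3.0,
    "ram": 2.0,
    "pad_streaming": 3.0,   # includes the pad radio
    "misc_io": 4.0,         # disc drive, USB, WiFi, etc.
}

device_typical = sum(typical_draw_w.values())
print(f"estimated device-typical draw: {device_typical:.1f} W")  # 35.0 W with these guesses
```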

I thought some dude took a photo of the PSU a while ago and it was 90W? I'm possibly remembering wrong though.
No, it was first reported as being 75W. That's how we've known the figure for a long time now (well, since WUST4 or something).
 
Yeah, I wanted to point that out: the Wikipedia entry of the Hollywood GPU in the Wii makes no mention of DirectX support, number of SPUs, and such.



Since the Wii U has a GPGPU made by AMD, and based on what the OP has said, the following product (on the right) looks like a perfect candidate, right?

[image: 10b.jpg]


But if Nintendo uses this off-the-shelf product, then what did they spend their R&D budgets on these last few years?

576 GFLOPS would make it about a third as powerful as what the PS4 is supposed to be. That doesn't sound too bad, frankly.

You still have to test and compare off-the-shelf parts; hand-made prototypes probably still cost tens of thousands each... and even if the hardware is BASED on an existing design, we at least have hearsay from developers that they were constantly asking Nintendo for additional features and the like.
 

IdeaMan

My source is my ass!
Could someone put this recent power draw discussion into layman's terms? It seems like this could be a decent indicator of what we're looking at for overall performance.

The best analogy would be food, I think.

Consider a stomach with the average capacity to digest 10kg (40/45W) of delicacies making up a meal.

Thanks to the ingredient lists and the information on the packaging, you know that this meal contains roughly 2kg of snacks (CPU), 1kg of chocolate (RAM), 1.5kg of French pastries, 2kg of pizza/quiche/pie, and between 0.5 and 2kg of candies (USB ports in use; in that case your stomach stretches a bit, to a capacity of 15kg, i.e. 70/75W). But there is one last, essential element of this meal, and its weight isn't documented on the box: the bagels (GPU). With a mix of assumptions, speculation and research into the average quantity of the other stuff you ate, you can conclude that there is room for 3kg of bagels, a rather reasonable quantity, which pushes you to think that, unlike with the shaders, there are "enough bagels".

But this kind of thinking has its flaws, because 3kg of bagels combined with their brand (from the nearby baker, i.e. derived from the RV700) doesn't tell you everything about the ingredients used in their preparation or the technique used to bake them (all the customization Nintendo did on the RV700 over a long period), so you can't know for sure whether those bagels are of poor quality (lower-end RV700 performance) or good quality (medium to high, but not too high, because the 3kg limit plus the brand make such bagels nearly impossible).
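Translating the bagel logic into plain numbers, the reasoning looks something like this sketch; every figure in it is an assumption picked purely to show the arithmetic, not a leaked spec:

```python
# Back out the GPU's share of the power budget by subtraction.
# All figures are assumptions for illustration only.
typical_total_w = 45.0            # the reported "typical" draw
component_guesses_w = {
    "cpu": 8.0,
    "ram": 2.0,
    "edram_and_buses": 3.0,
    "pad_streaming": 3.0,
    "disc_usb_misc": 5.0,
}

gpu_budget_w = typical_total_w - sum(component_guesses_w.values())
print(f"leftover for the GPU: ~{gpu_budget_w:.0f} W")  # ~24 W with these guesses

# Same caveat as the bagel brand: a watt figure alone says nothing about
# how efficiently the silicon uses it.
```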
 

AzaK

Member
Typical draw for a unit means the unit is "well" loaded, but not doing anything abnormal (like a power-on sequence, or a power virus, or any other pathological case). Typical for a device would be the sum of the typical draws of the units that get used during the typical use of the device. For something like the WiiU I'd guess 'device typical' is the sum of typical draws of the CPU + GPU + buses + edram + RAM + pad streaming (that includes pad radio).


No, it was first reported as being 75W. That's how we've known the figure for a long time now (well, since WUST4 or something).

And by "well", that would generally mean "playing a game"? Even then though, the power levels will fluctuate right? NSMBU won't suck as much juice as CoD.

If so, then we really just don't know what it's capable of when pushed to the limits.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
And by "well", that would generally mean "playing a game"? Even then though, the power levels will fluctuate right? NSMBU won't suck as much juice as CoD.
Right. If you look up the term 'power virus' you'll see it describes a condition where a device draws much more power than normal, and it's normally associated with intentional, perhaps malicious, programming that deliberately puts the device in such a state. It's not impossible for perfectly good, 'useful' code to have the characteristics of a power virus, though. Actually, in some modern architectures, where the 'typical' power draw relies on a bunch of snooze-like features for the idling blocks in the device, the manual might tell you what not to do so your code doesn't unintentionally turn into a power virus. Crazy fun times we're living in.
 

AzaK

Member
Right. If you look up the term 'power virus' you'll see it describes a condition where a device draws much more power than normal, and it's normally associated with intentional, perhaps malicious, programming that deliberately puts the device in such a state. It's not impossible for perfectly good, 'useful' code to have the characteristics of a power virus, though. Actually, in some modern architectures, where the 'typical' power draw relies on a bunch of snooze-like features for the idling blocks in the device, the manual might tell you what not to do so your code doesn't unintentionally turn into a power virus. Crazy fun times we're living in.

So I don't want to get this wrong, but when 45W was mentioned as the typical draw, that could have been while playing NSMBU, and the Wii U could quite possibly draw much more for CoD. Therefore, when we're subtracting wattages to get to the GPU wattage, it could very well be a GPU that can draw 30W, and therefore be more powerful than we're thinking.

Or am I missing something fundamental here?
 

Ryoku

Member
Yeah, I wanted to point that out: the Wikipedia entry of the Hollywood GPU in the Wii makes no mention of DirectX support, number of SPUs, and such.



Since the Wii U has a GPGPU made by AMD, and based on what the OP has said, the following product (on the right) looks like a perfect candidate, right?

[image: 10b.jpg]


But if Nintendo uses this off-the-shelf product, then what did they spend their R&D budgets on these last few years?

576 GFLOPS would make it about a third as powerful as what the PS4 is supposed to be. That doesn't sound too bad, frankly.

Quite a few people here, including me, believe that the e6760 might be the most similar to Wii U's custom GPU, and we've thought this for a while now. Regarding your GFLOP rating, keep in mind that although it's rated at 576 GFLOPs, the e6760 is a slightly more powerful GPU than the Radeon HD 4850, which is a 1000 GFLOP (or 1 TFLOP) GPU. This is why I don't like FLOP comparisons between GPUs unless they're made between two GPUs from the same architecture.
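For anyone wondering where those GFLOP numbers come from, peak throughput is just stream processors x clock x 2 ops per cycle (a multiply-add). A quick sketch using the commonly cited specs of both chips:

```python
# Peak single-precision throughput: shader ALUs x clock x 2 FLOPs per cycle (multiply-add).
def peak_gflops(stream_processors: int, clock_mhz: float) -> float:
    return stream_processors * clock_mhz * 2 / 1000.0

print(peak_gflops(480, 600))   # Radeon E6760:   576.0 GFLOPS
print(peak_gflops(800, 625))   # Radeon HD 4850: 1000.0 GFLOPS

# Same formula for both chips, yet (per the post above) the lower-FLOP part can
# still come out ahead in practice, which is why peak FLOPs only compare cleanly
# between GPUs of the same architecture.
```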
 

Kenka

Member
Quite a few people here, including me, believe that the e6760 might be the most similar to Wii U's custom GPU, and we've thought this for a while now. Regarding your GFLOP rating, keep in mind that although it's rated at 576 GFLOPs, the e6760 is a slightly more powerful GPU than the Radeon HD 4850, which is a 1000 GFLOP (or 1 TFLOP) GPU. This is why I don't like FLOP comparisons between GPUs unless they're made between two GPUs from the same architecture.
So that 1 TFLOP talk about Samaritan was tied to one specific architecture? If so, then we don't know whether the 1.8 TFLOPS of the PS4 will suffice either. But thanks for pointing this out. When you think of what the Wii U could have ended up being, based on some of the negative developer talk, having an E6760 in the Wii U's insides is definitely good news.

HDMI 1.4a would also mean Dolby TrueHD. That would be a great confirmation (read: 5.1 or 7.1 is supported).
 

beril

Member
But if Nintendo uses this off-the-shelf product, then what did they spend their R&D budgets on these last few years?

I've said this before, but I think there's a big misconception about Nintendo's R&D budget. Whenever someone posts the R&D number from their financial reports, people assume it's just for hardware and possibly some strange research projects, when that number almost certainly includes all their game development (as in R&D). There is no other number in their reports that could really cover game budgets. Another fairly big clue is that their software teams actually used to be called R&D before changing to EAD, which means pretty much the same thing.

That being said, I don't think they used an off-the-shelf part, but nor do I think Nintendo is financing the development of the GPU. Isn't the Hollywood pretty much AMD's best-selling GPU ever? They'll put significant resources into creating its successor and meeting any specifications Nintendo gives them.
 

IdeaMan

My source is my ass!
I really think people should be cautious with those rather "simplistic" GPU comparisons based mostly on FLOPS, etc.

The way they designed the 3DS Pica, and their habit of heavily customizing components and adapting them to their needs, could produce a 300 GFLOP GPU with added features, like specialized fixed functions for lighting or other areas, that, if taken advantage of, could simulate/fake the on-screen results of a 500 or 600 GFLOP (or even more, I don't know) traditional GPU.

We talked about that a lot in Wii U speculation threads.
 

Ryoku

Member
So that 1 TFLOP talk about Samaritan was tied to one specific architecture? If so, then we don't know whether the 1.8 TFLOPS of the PS4 will suffice either. But thanks for pointing this out. When you think of what the Wii U could have ended up being, based on some of the negative developer talk, having an E6760 in the Wii U's insides is definitely good news.

HDMI 1.4a would also mean Dolby TrueHD. That would be a great confirmation (read: 5.1 or 7.1 is supported).

Keep in mind that even if the e6760 ends up being the most similar off-the-shelf GPU to Wii U's custom GPU, the Wii U won't have an e6760 inside it. It'll be a custom-made GPU, developed side-by-side with AMD to fit their (Nintendo's) needs.

Regarding the PS4's GPU, I think the consensus is that it's going to be (or at least points to) an underclocked Pitcairn architecture (an underclocked 7870 or 7850) at 1.8 TFLOPs. In a console, this is great. Definitely better than what the Wii U has, but not alien technology at all.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
So I don't want to get this wrong, but when 45W was mentioned as the typical draw, that could have been while playing NSMBU, and the Wii U could quite possibly draw much more for CoD. Therefore, when we're subtracting wattages to get to the GPU wattage, it could very well be a GPU that can draw 30W, and therefore be more powerful than we're thinking.

Or am I missing something fundamental here?
Basically you're asking me if a GPU could draw 30W in the quoted power-draw frame? Yes, it could.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Do we know anything about the ARM processors that are most likely in the controller and possibly in the system itself?
Chances are you could squash them with your thumb.
 

AzaK

Member
Basically you're asking me if a GPU could draw 30W in the quoted power-draw frame? Yes, it could.

Haha yes. Thanks.

Basically, I see people taking 45W, subtracting watts left, right and centre for various components, then going, "Well, that leaves 10W for the GPU, so it will do 5 polys/second."

I was trying to highlight that that's not necessarily the way to look at it, and the GPU could be a lot more powerful than that 10W leftover suggests.
 

Knux

Neo Member
Why is everybody so sure that the GPU in the Wii U is E6760-based? That chip is capable of "DirectX 11" Shader Model 5/OpenGL 4.0, and we KNOW that the Wii U chip can't do that (based on the OP's rumor).

The OP's rumor states that the GPU "supports Shader Model 4.0 (DirectX 10.1 and OpenGL 3.3 equivalent functionality)", which actually looks exactly like what the E4690 from the spec sheet brings to the table... Combined with the first claims that the Wii U GPU is not even 2x the 360, this paints a very clear picture for me...

The Wii U is much more likely running on an underclocked E4690 rather than an E6760. I want to be optimistic too, but even the most cautiously optimistic predictions have been smashed to bits in the past (see the Espresso fiasco).
 

Meelow

Banned
Why is everybody so sure that the GPU in the Wii U is E6760-based? That chip is capable of "DirectX 11" Shader Model 5/OpenGL 4.0, and we KNOW that the Wii U chip can't do that.

The OP's rumor states that the GPU "supports Shader Model 4.0 (DirectX 10.1 and OpenGL 3.3 equivalent functionality)", which actually looks exactly like what the E4690 from the spec sheet brings to the table... Combined with the first claims that the Wii U GPU is not even 2x the 360, this paints a very clear picture for me...

The Wii U is much more likely running on an underclocked E4690 rather than an E6760. I want to be optimistic too, but even the most cautiously optimistic predictions have been smashed to bits in the past (see the Espresso fiasco).

When was there a confirmation that it couldn't?
 
Why is everybody so sure that the GPU in the Wii U is E6760-based? That chip is capable of "DirectX 11" Shader Model 5/OpenGL 4.0, and we KNOW that the Wii U chip can't do that.

The OP's rumor states that the GPU "supports Shader Model 4.0 (DirectX 10.1 and OpenGL 3.3 equivalent functionality)", which actually looks exactly like what the E4690 from the spec sheet brings to the table... Combined with the first claims that the Wii U GPU is not even 2x the 360, this paints a very clear picture for me...

The Wii U is much more likely running on an underclocked E4690 rather than an E6760. I want to be optimistic too, but even the most cautiously optimistic predictions have been smashed to bits in the past (see the Espresso fiasco).

Except that was never stated anywhere.
 

Knux

Neo Member
It was only stated in the OP's topic (for instance), which also includes the Espresso rumor, which is basically such a weird thing that it must be true.

Other rumors have also heavily suggested that the Wii U GPU is not in the DX11 generation.
 

antonz

Member
It was only stated in the OP's topic (for instance), which also includes the Espresso rumor, which is basically such a weird thing that it must be true.

Other rumors have also heavily suggested that the Wii U GPU is not in the DX11 generation.

The original post was lifted from a GAF member who has been in this very thread telling people that the data presented is not complete, and who specifically pointed out that the GPU is actually beyond the baselines listed.

This entire thread should have been nuked a long time ago, considering the thread's source came in and called out the thread.
 

Meelow

Banned
The original post was lifted from a GAF member who has been in this very thread telling people that the data presented is not complete, and who specifically pointed out that the GPU is actually beyond the baselines listed.

This entire thread should have been nuked a long time ago, considering the thread's source came in and called out the thread.

Now that you mention it, this topic's rumor states the Wii U only has 1GB of RAM, but it's confirmed to have 2GB.

Unless this rumor is from past dev kits and not the final one, which is what I'm guessing.
 

antonz

Member
Now that you mention it, this topic's rumor states the Wii U only has 1GB of RAM, but it's confirmed to have 2GB.

Unless this rumor is from past dev kits and not the final one, which is what I'm guessing.

Well, the issue there is that the documentation specifically mentions the RAM available for use by developers at this time, so it's accurate in that regard.
 

Knux

Neo Member
Good to know... Then there might be hope. I just can't seem to set my expectations low enough after being burned again and again by believing Nintendo might bring adequate specs to the table. On the other hand, Iwata specifically stated that it's a GPGPU, which actually points more to the OpenCL-capable E6760... oh well.
 
I'm going to assume this was already posted, but this is not good:

http://forum.beyond3d.com/showpost.php?p=1666876&postcount=3313

I looked at some new Wii U footage.
Ninja Gaiden seems sub-720p with poor AA (wait for the retail version to confirm, because it's strange; it's below the PS3/X360 version)
http://www.youtube.com/watch?v=FAreMlwF-3A&feature=plcp

and Tank Tank is 1280x360 with no AA (plus probably 850x480 on the GamePad; it's surely multiplayer mode, with approximately the same pixel count on the TV and the GamePad)
http://www.youtube.com/watch?v=Eo7wPaz4LDQ&feature=plcp
http://imageshack.us/a/img99/1466/tanktank.jpg

welcome to the next-gen sub-720p world

EDIT:
I checked the NG video again; it's probably dynamic resolution to stabilize the framerate.
Sometimes 576p, sometimes 630p, but I grabbed one 720p frame too.

In the end it's not as bad as it looks; some Tank Tank gimmick game is SD for some controller/multiplayer-related mode, and NG is a port.

But still.
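For a sense of scale on those claims (taking the poster's estimated resolutions at face value; the GamePad's panel is 854x480):

```python
# Pixel counts behind the resolution claims quoted above.
# The per-game resolutions are the poster's estimates, not confirmed figures.
def pixels(width: int, height: int) -> int:
    return width * height

tank_tank_tv = pixels(1280, 360)     # 460,800
gamepad_native = pixels(854, 480)    # 409,920
full_720p = pixels(1280, 720)        # 921,600

print(tank_tank_tv / full_720p)      # 0.5   -> exactly half the pixels of 720p
print(tank_tank_tv / gamepad_native) # ~1.12 -> roughly the same load on TV and pad
```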
 

Meelow

Banned
Well, the issue there is that the documentation specifically mentions the RAM available for use by developers at this time, so it's accurate in that regard.

Which means...?

(Sorry, I'm not the best at specs)

I'm going to assume this was already posted, but this is not good:

http://forum.beyond3d.com/showpost.php?p=1666876&postcount=3313



In the end it's not as bad as it looks; some Tank Tank gimmick game is SD for some controller/multiplayer-related mode, and NG is a port.

But still.

Didn't Team Ninja say the Wii U version of Ninja Gaiden will look the best?

The system does have 2GB of RAM, but developers only get to play with 1GB right now, pending any future changes to the 1GB reserved for system use.

Ahh, thanks.
 

F#A#Oo

Banned
How are you guys still discussing specs with no specs?

Also, does it even matter now? I mean, so far it seems the technology is more than competent and should be okay for the next 5 years. Obviously it won't stand up to whatever MS and Sony have planned, but I think they can carve out their own slice of the gaming pie.
 
I guess he's trying to say the res is lower than the PS3/360 versions.

But since he edited in that he later found one frame was 720p, I'm going to guess all three versions sport dynamic res, so I'd disregard that comment for now. Without detailed analysis there's no point in comparing dynamic-resolution games pixel for pixel.

Hell, I'm not even totally convinced that video is running on Wii U. At a glance, I don't see any reason Nintendo couldn't have used the 360 version for some anonymous sizzle reel (but I didn't look into the matter at all).
 

SmokyDave

Member
How are you guys still discussing specs with no specs?

Also, does it even matter now? I mean, so far it seems the technology is more than competent and should be okay for the next 5 years. Obviously it won't stand up to whatever MS and Sony have planned, but I think they can carve out their own slice of the gaming pie.
You know how me 'n' thee disagree quite a lot?

This is one of those times.

If the pixel counting above is accurate, we're in for some giggles.
 

antonz

Member
What does he mean by "under the PS3/360 version"?

Ninja Gaiden 3 uses a dynamic resolution system that allows it to adjust as needed to maintain performance. The guy initially assumed this was somehow unique to the Wii U version, but the PS3 and 360 versions' resolution jumped all over the place as well.

Considering the state of the original game, even with the extra time given to the Wii U version I expect a disaster, as they're trying to add to the game and fix broken content more than worry about performance.

Nintendo is cozy with the studio and saw a chance to get a bloody game for launch, regardless of quality.
 
You know how me 'n' thee disagree quite a lot?

This is one of those times.

If the pixel counting above is accurate, we're in for some giggles.

Far be it from me to defend the Wii U, but we can still fall back on the bad SDK/early software/ports/learning the new CPU/whatever excuse.

That said of course it's not positive.

At this point, to make a definitive determination I really, really need some GPU functional unit/clock or GFLOPS numbers. Or a game that actually looks significantly better than anything on PS360.
 

SmokyDave

Member
Far be it from me to defend the Wii U, but we can still fall back on the bad SDK/early software/ports/learning the new CPU/whatever excuse.

That said of course it's not positive.

At this point, to make a definitive determination I really, really need some GPU functional unit/clock or GFLOPS numbers.
I can't buy those excuses when during any other generational transition we'd already have:
a game that actually looks significantly better than anything on PS360.
There just isn't one though.
 

AzaK

Member
Far be it from me to defend the Wii U, but we can still fall back on the bad SDK/early software/ports/learning the new CPU/whatever excuse.

That said of course it's not positive.

At this point, to make a definitive determination I really, really need some GPU functional unit/clock or GFLOPS numbers. Or a game that actually looks significantly better than anything on PS360.


I imagine we'll have to wait for a teardown for anything specific.
 

majik13

Member
HDMI 1.4a would also mean Dolby TrueHD. That would be a great confirmation (read: 5.1 or 7.1 is supported).

Sorry, not a tech head, but I thought the audio only supports linear PCM 5.1, with no Dolby support, which I believe sucks support-wise, but it should sound great because it's uncompressed, if you have a compatible HDMI receiver. Correct me if I'm wrong.
 