
Rumor: Wii U final specs

FyreWulff

Member
Wonder what changes Nintendo made to dodge licenses this time.

Not very hard. Disney pays as little as possible themselves. If you notice, their movies are "Disney DVD" and "Disney Blu-ray"; they don't actually pay to use the DVD/BD trademarks.

You can make DVD/BD-compliant discs, but if you want that delicious trademarked logo on them or your player, you have to pay the licensing fee for that trademark and meet all the requirements that come with it, i.e. supporting Java and all that stuff.
 

BlazinAm

Junior Member
Can anybody explain, technically, why an overclocked three-core Broadway derivative is a bad thing for the Wii U CPU?

It might not support modern parallelism techniques that the PS3 and Xbox 360 support. That is probably the biggest thing.

Edit: When next gen rolls around it will be even more behind in that department.
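
To make "parallelism techniques" a bit more concrete, here is a minimal, generic C++ sketch of the kind of data-parallel work split that multi-core console CPUs are built around. This is just my own illustration, not Wii U code:

#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

// Scale one slice of the array; each worker thread owns its own slice,
// so there is no data sharing and no locking needed.
void scale_range(std::vector<float>& data, std::size_t begin, std::size_t end, float k) {
    for (std::size_t i = begin; i < end; ++i)
        data[i] *= k;
}

int main() {
    std::vector<float> data(1 << 20, 1.0f);
    unsigned n = std::thread::hardware_concurrency();
    if (n == 0) n = 3;                              // fall back to e.g. three cores
    std::vector<std::thread> workers;
    const std::size_t chunk = data.size() / n;
    for (unsigned t = 0; t < n; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = (t + 1 == n) ? data.size() : begin + chunk;
        workers.emplace_back(scale_range, std::ref(data), begin, end, 0.5f);
    }
    for (auto& w : workers) w.join();               // wait for every slice to finish
}

The 360 and PS3 lean heavily on this sort of splitting (plus SIMD), so the worry is about how well Espresso's cores and toolchain handle it, not about the core count itself.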
 

Shaheed79

dabbled in the jelly
No, it doesn't get the RAM wrong.

1 gig is currently all games can use. Might be 2 gigs of RAM in the thing, but that doesn't mean much if only 1 gig is usable for games.

And honestly, the GPGPU talk is kind of a red herring. That functionality has existed since 2004, and the 360 has the capability. On Wii U it's likely expanded, but just having GPGPU functionality isn't much more of a tell about the power of the system than using a unified shader model GPU is.

We need to know what the general bandwidth and fillrate are, how many stream processors there are, triangle counts, etc.
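
For context on why those numbers matter, the headline figures fall straight out of them: for an AMD part of that generation, peak shader throughput is roughly stream processors × 2 (multiply-add) × clock, and pixel fillrate is ROPs × clock. An HD 4850, for example, works out to 800 × 2 × 0.625 GHz = 1000 GFLOPS and 16 × 625 MHz = 10 Gpixels/s. That's published-spec math on known cards, not anything confirmed about the Wii U GPU.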

Honestly, if they did indeed know the amount of RAM the final system actually had, then not mentioning that the total system RAM was 2 GB was and is disingenuous. That is like leaking the 360 and PS3 memory and only mentioning the amount of RAM that was specifically usable by games at the time of launch. No one does that, because that number can change over time. Why they chose to do that in this "leak" gave me more than a short pause.

Has no one else noticed that these sorts of Wii U "leaks" always seem to happen right before Nintendo has a big conference revealing more info about the hardware? I don't believe that is a coincidence, nor do I buy the way in which they were supposedly able to receive, early, the information that Nintendo was about to reveal. I do not believe that anything in the OP that is supposed to be new information about the Wii U hardware has been confirmed by Nintendo or its partners. The fact that there were glaring omissions, which were confirmed by Iwata two days later, gives it even less credibility.

CPU: “Espresso” CPU on the Wii U has three enhanced Broadway cores
Whatever this is supposed to mean regarding the CPU specs, it has not been confirmed.

GPU: “GPU7” AMD Radeon™-based High Definition GPU. Unique API = GX2, which supports Shader Model 4.0 (DirectX 10.1 and OpenGL 3.3 equivalent functionality)
No new information on this line has been confirmed. GPGPU was omitted.

Memory: Mem1 = 32MB Mem2 = 1GB (that applications can use)
I don't believe the eDRAM amount is confirmed yet. Was "applications" referring to games or OS software? Either way, a whole extra gig of RAM was omitted, and that cannot be overlooked.

Storage: Internal 8 GB with support for SD Cards (SD Cards up to 2GB/ SDHC Cards up to 32GB) and External USB Connected Hard Drives
8 GB internal flash storage, USB-connected HDDs and SD card support were already known. The 32 GB internal storage model was omitted.

Networking: 802.11 b/g/n Wifi
Already known.

Video Output: Supports 1080p, 1080i, 720p, 480p and 480i
Already known.

Video Cables Supported:

Compatible cables include HDMI, Wii D-Terminal, Wii Component Video, Wii RGB, Wii S-Video Stereo AV and Wii AV.
Already known.

USB: Four USB 2.0 Ports
Already known.

Whoever the source is for the leak in the OP, either they don't know much about the final Wii U hardware, or they do, and were being purposely disingenuous in describing it. In either case, if I were vgleaks I wouldn't use that person as a trusted source for accurate information in the future.
 

ADANIEL1960

Neo Member
It might not support modern parallelism techniques that the PS3 and Xbox 360 support. That is probably the biggest thing.

Edit: When next gen rolls around it will be even more behind in that department.

Being multicore would kind of suggest support for parallelism techniques.
 
Is it just me, or have Wii U threads calmed down a lot since the conference? The power of Bayonetta 2? Anyway, even though the info in this thread is technically outdated, I love reading all this stuff, so keep it going.
 
It's weird. The comparisons show that the 6570 performs on par with the 4850, sometimes noticeably better, sometimes noticeably worse, whereas the spec differences suggest that the 4850 should just outperform it very noticeably.

http://www.hwcompare.com/10468/radeon-hd-4850-1gb-vs-radeon-hd-6570-oem-1gb/

Oh well. I've always trusted direct FPS comparisons over what the specs may suggest, so yeah...

Thanks for the clarification, again.

Well, some of them the 4850 couldn't run for obvious reasons, and some dealt with wattage, where the 6570 had a clear advantage. The interesting thing to me was how the gap closed, or even slightly reversed, at higher settings. Though I should add that the 4850 in this comparison had 512MB while the 6570 had 1GB; there wasn't a 1GB 4850 available for selection. But yeah, it goes back to not relying solely on FLOPs as an indicator, and to the idea that the Wii U GPU, if it is like I think it is, could match and possibly surpass a 4850.
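
To put rough numbers on that, by the usual SPs × 2 × clock math a stock HD 6570 (480 SPs at 650 MHz) sits around 624 GFLOPS against the 4850's 1000 GFLOPS, yet the benchmarks above have them trading blows; the OEM 6570 can be clocked a bit differently, so treat those as ballpark published-spec figures. Peak FLOPS clearly isn't the whole story.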

Honestly, if they did indeed know the amount of RAM the final system actually had, then not mentioning that the total system RAM was 2 GB was and is disingenuous. That is like leaking the 360 and PS3 memory and only mentioning the amount of RAM that was specifically usable by games at the time of launch. No one does that, because that number can change over time. Why they chose to do that in this "leak" gave me more than a short pause.

Has no one else noticed that these sorts of Wii U "leaks" always seem to happen right before Nintendo has a big conference revealing more info about the hardware? I don't believe that is a coincidence, nor do I buy the way in which they were supposedly able to receive, early, the information that Nintendo was about to reveal. I do not believe that anything in the OP that is supposed to be new information about the Wii U hardware has been confirmed by Nintendo or its partners. The fact that there were glaring omissions, which were confirmed by Iwata two days later, gives it even less credibility.



Whoever the source is for the leak in the OP, either they don't know much about the final Wii U hardware, or they do, and were being purposely disingenuous in describing it. In either case, if I were vgleaks I wouldn't use that person as a trusted source for accurate information in the future.

I'm making sure this time to say this in a clear way that doesn't cause confusion due to a previous exchange. Devs were given the features, but not the "specs". They weren't given things like clock speeds or, for the GPU, the number of ALUs.
 

LCGeek

formerly sane
Is it just me, or have Wii U threads calmed down a lot since the conference? The power of Bayonetta 2? Anyway, even though the info in this thread is technically outdated, I love reading all this stuff, so keep it going.

Because the conference did more to shut up certain detractors than give them ammo to keep flaming and trolling this Nintendo platform. I doubt anyone would've seen Bayonetta 2 coming as an exclusive, especially for a system that supposedly won't have enough power for devs next gen.

ADANIEL1960, what you say is true, and it applies to the PS3 as well with respect to its predecessors.
 

NBtoaster

Member
Honestly, if they did indeed know the amount of RAM the final system actually had, then not mentioning that the total system RAM was 2 GB was and is disingenuous. That is like leaking the 360 and PS3 memory and only mentioning the amount of RAM that was specifically usable by games at the time of launch. No one does that, because that number can change over time. Why they chose to do that in this "leak" gave me more than a short pause.

Has no one else noticed that these sorts of Wii U "leaks" always seem to happen right before Nintendo has a big conference revealing more info about the hardware? I don't believe that is a coincidence, nor do I buy the way in which they were supposedly able to receive, early, the information that Nintendo was about to reveal. I do not believe that anything in the OP that is supposed to be new information about the Wii U hardware has been confirmed by Nintendo or its partners. The fact that there were glaring omissions, which were confirmed by Iwata two days later, gives it even less credibility.



Whoever the source is for the leak in the OP, either they don't know much about the final Wii U hardware, or they do, and were being purposely disingenuous in describing it. In either case, if I were vgleaks I wouldn't use that person as a trusted source for accurate information in the future.

It's labelled rumor for a reason. Nintendo are never going to confirm any of this.

And the source of this rumor has been in this thread and another "insider" member has corroborated it.
 
Is it just me, or have Wii U threads calmed down a lot since the conference? The power of Bayonetta 2? Anyway, even though the info in this thread is technically outdated, I love reading all this stuff, so keep it going.


People love to say that the Wii U is outdated before they've even seen what it can really do with games actually made for it, not ports. This will continue until....well......it will probably never stop when I think about it.




The Wii U CPU is not outdated; it's custom built, has been in design/production since 2009, and is a descendant of the Broadway CPU (Wii U CPU based on PowerPC 476?), but it's not just "a Broadway with 3 cores overclocked."

The Wii U GPU has features that are beyond DX10.1 and on par with DX11 effects (compute shaders). The Wii U GPU will show 3-4x the performance of the Xbox 360 GPU. It's really not that much when you think about it since the GPU in the 360 is 7 years old now.

So basically, when the Wii U's engine is running on all cylinders (and not trying to run ports built for old tech), it will produce games with graphics a good deal better than the current HD consoles.

Cost of production is a big factor too (research why Bayonetta 2 is a Wii U exclusive). All that extra power that "everyone" wants will not do one bit of good if game developers need to kill themselves making every game a blockbuster, and if those don't sell, it's adios muchachos for them.

I can see the Wii U being a happy medium for everyone.
 
Because the conference did more to shut up certain detractors than give them ammo to keep flaming and trolling this Nintendo platform.
It didn't really provide anything new about hardware that was particularly concrete beyond the use of the term "GPGPU."

I could be wrong but I think even the power draw number was old.

With regard to software Bayonetta 2 caused a bit of a stir, but otherwise it was a handful of titles and wasn't a blow-out as was suggested it would be.

All this is ultimately moot, because in a month or so's time someone's bound to get their hands on one, if purely for review purposes, tear it down, and tell us what's inside. And nothing's going to change what's inside anyway.
I'm making sure this time to say this in a clear way that doesn't cause confusion due to a previous exchange. Devs were given the features, but not the "specs". They weren't given things like clock speeds or, for the GPU, the number of ALUs.
That doesn't seem particularly conducive to development.
 

Kenka

Member
bgassassin, if the GPGU's performance in the WiiU translates in a HD 4850 performance on PC, then the overall experience should not be far from what a majority of gaffers have when they currently play their PC (HD 4850 is 20% inferior to a GTX 560ti which is very popular lately).

If this seems correct to you, then 50% of all my issues with WiiU would be solved.
 

NBtoaster

Member
I'm making sure this time to say this in a clear way that doesn't cause confusion due to a previous exchange. Devs were given the features, but not the "specs". They weren't given things like clock speeds or, for the GPU, the number of ALUs.

Do Nintendo just expect devs to try what works instead of giving useful performance information? This can't be true.
 

Shaheed79

dabbled in the jelly
Let me make something very clear. The majority of GPUs/CPUs are what can be defined as "enhanced" versions of the GPU/CPU from their previous iterations.

The only reason I can think of for someone to use that specific wording when describing the Wii U CPU is to take advantage of the fact that most people are unaware that almost all of our PC components are enhanced versions of older PC components.

It is only every once in a while that a company will fully develop a new architecture, from the ground up, that absolutely could not be described as an "enhanced" version of a previous product it had already developed.

So even when the actual specs of the Wii U CPU are revealed, anyone who has read this rumor and does not understand how CPU/GPU/RAM development really works will still believe that it is just an "enhanced" Wii Broadway CPU, and therefore underpowered or not very capable. In other words, the damage is done, so mission accomplished.

This rumor topic has already received far more attention than any future topic confirming the CPU is more modern and capable than it is implied to be here ever will.
 
bgassassin, if the GPGU's performance in the WiiU translates in a HD 4850 performance on PC, then the overall experience should not be far from what a majority of gaffers have when they currently play their PC (HD 4850 is 20% inferior to a GTX 560ti which is very popular lately).

If this seems correct to you, then 50% of all my issues with WiiU would be solved.

Are you trying to say GPU or GPGPU? Also I'm not sure where you got the 20% from, but I don't believe those two are anywhere near that close in performance.

Do Nintendo just expect devs to try what works instead of giving useful performance information? This can't be true.

Apparently yes. So far it sounds like MS is doing the same.
 

Matt

Member
Apparently yes. So far it sounds like MS is doing the same.

...No...obviously the Wii U documentation has all the technical information developers need.

Though it is true that Nintendo did keep most of the info very close to its chest before actual final dev kits were sent out.
 
...No...obviously the Wii U documentation has all the technical information developers need.

Though it is true that Nintendo did keep most of the info very close to its chest before actual final dev kits were sent out.

So they did finally release it then? Because as I understood it they still didn't get the things I was referring to.
 

Shaheed79

dabbled in the jelly
I'm making sure this time to say this in a clear way that doesn't cause confusion due to a previous exchange. Devs were given the features, but not the "specs". They weren't given things like clock speeds or, for the GPU, the number of ALUs.
I understand that. But for a rumor topic labeled "Wii U final specs", this implies that the information comes from an insider who either has access to final Wii U dev kits or was told what the final consumer hardware would be. Unless you are the source of this leak, I don't see why it would matter to you that I claim the source may have been purposely disingenuous, if he did have access to final hardware specs.

EDIT: Well, thanks Matt, now I can say what I really wanted to say. It is impossible to efficiently develop for a final dev kit without having exact specifications. I'm sure the early stuff was general, but this topic is supposed to be about "final Wii U specs", so that excuse doesn't really fly anymore.
It's labelled rumor for a reason. Nintendo are never going to confirm any of this.

And the source of this rumor has been in this thread and another "insider" member has corroborated it.
Yet people are discussing this rumor as if it were confirmed. The fact that it is not confirmed has nothing to do with whether or not Nintendo will confirm full specs in the future. Eventually, just like the Wii, someone will crack it open and give us far more accurate general specs.

Who is the source of this rumor and who do they work for? I must have missed it.

I'm not saying this disrespectfully, but I honestly do not care how many "insiders" cosign this rumor. For me that does not validate or invalidate this rumor because I know how gaf and, more broadly, message board communities work.

In my eyes, the only people qualified to confirm Wii U specs pre-release are Nintendo, one of its partners working directly on the Wii U, or a confirmed Wii U developer who comes out and says "These are the Wii U specs" rather than some vague generalization. Everything else I take with a grain of salt. That doesn't mean that I do not have my own speculation about what the Wii U specs most likely are, but I know the difference between rumor, speculation and confirmation.
 
I understand that. But for a rumor topic labeled "Wii U final specs", this implies that the information comes from an insider who either has access to final Wii U dev kits or was told what the final consumer hardware would be. Unless you are the source of this leak, I don't see why it would matter to you that I claim the source may have been purposely disingenuous, if he did have access to final hardware specs.

Not me. And I may be confused here and don't take this the wrong way as I'm trying to understand, but if it doesn't matter then why put forth the effort to say they may have been purposely disingenuous?


Yet people are discussing this rumor as if it were confirmed. The fact that it is not confirmed has nothing to do with whether or not Nintendo will confirm full specs in the future. Eventually, just like the Wii, someone will crack it open and give us far more accurate general specs.

Who is the source of this rumor and who do they work for? I must have missed it.

I'm not saying this disrespectfully, but I honestly do not care how many "insiders" cosign this rumor. For me that does not validate or invalidate this rumor because I know how gaf and, more broadly, message board communities work.

In my eyes, the only people qualified to confirm Wii U specs pre-release are Nintendo, one of its partners working directly on the Wii U, or a confirmed Wii U developer who comes out and says "These are the Wii U specs" rather than some vague generalization. Everything else I take with a grain of salt. That doesn't mean that I do not have my own speculation about what the Wii U specs most likely are, but I know the difference between rumor, speculation and confirmation.

Two already have though.
 

mrklaw

MrArseFace
!!!BACK-OF-THE-ENVELOPE ALERT!!!

Assuming R7xx architecture (because of "DX 10.1" reference; originally 55nm) shrunk to 40nm at 30% power savings, 25W puts us at 35W of 55nm chip equivalent, which is roughly the halfway point between Radeon HD 4550 and Radeon HD 4650. IOW 240 stream processors, 24 texture units, 600MHz.

This means nothing. It's a computation based on a speculative consumption figure and preexisting products. Don't blame me if it's not pleasant to you somehow.


It's good to see lots of caveats, even in an informative post.

I think, as usual, we're confused by Nintendo's ambiguity. They haven't said '75W maximum PSU draw at the wall', nor have they defined what 'normal' use is. We have very little context to make any approximations.

Personally, my guess would be that the 45/75 numbers are console draw. It doesn't make sense to quote PSU figures for one and console figures for the other. Either they are both PSU figures, and efficiency means the actual console is maybe 25-50W, or they're two example figures, e.g. 45W based on web browsing, Miiverse etc., and 75W when playing a game that is pushing the CPU and GPU and streaming to the GamePad.

Still based on ambiguous information, but it just seems the simplest explanation
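
Putting rough numbers on the quoted estimate and the 45/75 question (my assumptions, not anything official): the shrink math is just 25 W ÷ (1 − 0.30) ≈ 35-36 W of 55nm-equivalent budget, and if 75 W really is the PSU rating, then at a typical ~80-85% adapter efficiency the console itself would top out somewhere around 60-64 W, with the 45 W "normal use" figure sitting comfortably below that.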
 

Shaheed79

dabbled in the jelly
Not me. And I may be confused here and don't take this the wrong way as I'm trying to understand, but if it doesn't matter then why put forth the effort to say they may have been purposely disingenuous?
I am saying it shouldn't matter to you that I call that person disingenuous for reasons stated in my first reply. I assumed that you were taking issue with me making that claim.



Two already have though.

If you do not mind, who are the two, what company do they work for, and what did they say?
 

Kenka

Member
Are you trying to say GPU or GPGPU? Also I'm not sure where you got the 20% from, but I don't believe those two are anywhere near that close in performance.
Pardon me, I am playing out of my league and I may use words alien to me to express my thoughts; I was referring to graphics computing only. I don't know if you can still split CPU and GPU functions in a GPGU, that's why I used the word.
 

mrklaw

MrArseFace
Aren't modern GPUs more power efficient than previous ones? So assuming Nintendo will customise heavily, wouldn't you just go in with an idea of die size (for yield/cost) and power consumption, and your preferred architecture (e.g. 128-bit bus, DDR3/GDDR5, etc.)? Then just fill the silicon like at a pick-and-mix counter, depending on whether you want more or fewer shaders or SIMDs? Isn't that basically what AMD/Nvidia do now already? Their portfolios are simply variations on the same architectural theme, just faster/slower, different bus widths and more/fewer shaders.

Would you go as modern as you could and trade off cost for performance? E.g. although AMD perhaps don't do an equivalent of a 35W 7xxx-series part, you could probably spec one with lower clocks and fewer shader units. It's just that the licensing cost might be a bit high initially.

How much power do you save by stripping out things like driving multiple displays etc?
 

Kenka

Member
How much power do you save by stripping out things like driving multiple displays etc?

Well, you have to display information on two (or three) different screens with the WiiU. Having a GPU (or GPGU in our case) with Eyefinity is definitely a must.
 

Oblivion

Fetishing muscular manly men in skintight hosery
For example, this?

[Image: nintendo+land+mario+chase+screen+3.jpg]


I haven't seen one PS3/360 game that looks just as good lighting wise, and I think the textures and detail are great.

Excuse me? What's so impressive about this pic? Looks awful.
 

Ryoku

Member
Well, you have to display information on two (or three) different screens with the WiiU. Having a GPU (or GPGU in our case) with Eyefinity is definitely a must.

Just trying to simply correct a mistake. The term is GPGPU (general purpose graphics processing unit), not GPGU :)

bgassassin, if the GPGU's performance in the WiiU translates in a HD 4850 performance on PC, then the overall experience should not be far from what a majority of gaffers have when they currently play their PC (HD 4850 is 20% inferior to a GTX 560ti which is very popular lately).

If this seems correct to you, then 50% of all my issues with WiiU would be solved.

The GTX560ti is much more than just 20% faster than a Radeon4850. Even a GTX460 is more than just 20% faster than a 4850. Do expect worse image quality on Wii U than on PCs because the power will go to extra graphical eye-candy rather than anti-aliasing, resolution, anisotropic filtering, and even higher-res textures, which PCs have the extra power to spend on. Not to say games will look bad; they'll look better than Uncharted, for example, or even Last of Us (assuming the game is developed ground-up on Wii U as a graphical showcase, etc. etc.). You may even get games that look better (image quality aside) than current PC games. This will be even more so with PS4/720.

Excuse me? What's so impressive about this pic? Looks awful.

Read the last couple of pages.
 

The_Lump

Banned
You can tell whether or not the shadows are real-time by moving the light source or tinkering with the radiosity.

Edit: I wonder if the Wii U can do any form of ambient occlusion. That would go a long way toward building a great lighting and shadow system.

I'm pretty sure it can be seen in ZombiU. Check out the nursery demo; it seems to be in effect most notably on the toys on the shelves. Looks rather tasty when he shines the flashlight on them.
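
Since the question came up, here's a very rough CPU-side sketch of what screen-space ambient occlusion (about the cheapest form of AO) boils down to. Real games do this as a GPU post-process on the depth buffer; none of this is Wii U or ZombiU code, just an illustration of the idea:

#include <algorithm>
#include <vector>

// Occlusion estimate for one pixel: count nearby depth-buffer samples that
// sit closer to the camera than this pixel (geometry likely blocking ambient
// light) and darken the pixel proportionally.
float ssao_at(const std::vector<float>& depth, int w, int h,
              int x, int y, int radius = 4, float bias = 0.01f) {
    float occluded = 0.0f, samples = 0.0f;
    for (int dy = -radius; dy <= radius; ++dy) {
        for (int dx = -radius; dx <= radius; ++dx) {
            int sx = std::clamp(x + dx, 0, w - 1);
            int sy = std::clamp(y + dy, 0, h - 1);
            samples += 1.0f;
            if (depth[sy * w + sx] + bias < depth[y * w + x])
                occluded += 1.0f;
        }
    }
    return 1.0f - occluded / samples;   // 1.0 = fully lit, 0.0 = fully occluded
}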
 

Kenka

Member
Just trying to simply correct a mistake. The term is GPGPU (general purpose graphics processing unit), not GPGU :)
Erf. Thanks for correcting me.

The GTX560ti is much more than just 20% faster than a Radeon4850. Even a GTX460 is more than just 20% faster than a 4850. Do expect worse image quality on Wii U than on PCs because the power will go to extra graphical eye-candy rather than anti-aliasing, resolution, anisotropic filtering, and even higher-res textures, which PCs have the extra power to spend on. Not to say games will look bad; they'll look better than Uncharted, for example, or even Last of Us (assuming the game is developed ground-up on Wii U as a graphical showcase, etc. etc.). You may even get games that look better (image quality aside) than current PC games. This will be even more so with PS4/720.
My bad! Wikipedia listed 1263.4 GFLOPS for the GTX 560 Ti and AMD's website gave 1000 GFLOPS for the HD 4850. But I guess comparing FLOPS output by two different architectures, made by two different manufacturers, really is a dumb thing to do, as mentioned earlier today by a gaffer.
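
For what it's worth, that is where the 20% came from: 1000 / 1263.4 ≈ 0.79, so the 4850 has roughly 21% fewer peak GFLOPS on paper. The catch, as noted above, is that peak FLOPS across two different architectures says very little about actual frame rates, which is why the real-world gap is much bigger.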
 

Oblivion

Fetishing muscular manly men in skintight hosery
Read the last couple of pages.

I did, and honestly don't see why the lighting is supposed to be so impressive. Looks about as good as the lighting on the average 360/PS3 game.


edit: Also, what's this new info on the GPU that's been released?
 
The Wii U CPU is not outdated; it's custom built, has been in design/production since 2009, and is a descendant of the Broadway CPU (Wii U CPU based on PowerPC 476?), but it's not just "a Broadway with 3 cores overclocked."

The Wii U GPU has features that are beyond DX10.1 and on par with DX11 effects (compute shaders). The Wii U GPU will show 3-4x the performance of the Xbox 360 GPU. It's really not that much when you think about it since the GPU in the 360 is 7 years old now.

We don't know any of the above to be fact. It's basically all pure speculation (and unlikely speculation IMO, based on all the evidence we have so far) on your part.
 
I did, and honestly don't see why the lighting is supposed to be so impressive. Looks about as good as the lighting on the average 360/PS3 game.


edit: Also, what's this new info on the GPU that's been released?


Looks exactly like Banjo on the 360 if you ask me. If someone told me it was the same engine, I'd believe them.
 
I think, as usual, we're confused by Nintendo's ambiguity. They haven't said '75W maximum PSU draw at the wall', nor have they defined what 'normal' use is. We have very little context to make any approximations.

Personally, my guess would be that the 45/75 numbers are console draw. It doesn't make sense to quote PSU figures for one and console figures for the other. Either they are both PSU figures, and efficiency means the actual console is maybe 25-50W, or they're two example figures, e.g. 45W based on web browsing, Miiverse etc., and 75W when playing a game that is pushing the CPU and GPU and streaming to the GamePad.

Somebody at a Wii U event back in June got a look at the Wii U PSU specs and posted them here (I've even linked the post several times in the last few days). They were a 75 watt spec exactly (5A, 15V). Not a coincidence Nintendo throws out a 75 watt number 3 months later.

What we are looking at is a max PSU spec of 75 watts, and no CE device ever approaches its PSU rating, so more than likely the 45 watts is essentially the correct power envelope to be looking at for the most demanding games.

Sadly, it's this stat that is really going to be the differentiator, more than any other, between Wii U and PS4/720, which I'm confident will be at LEAST ~100 watts, if not ~200 watts TDP, as PS3/360 were as launch machines.
 
It might not support modern parallelism techniques that the PS3 and Xbox 360 support. That is probably the biggest thing.

Edit: When next gen rolls around it will be even more behind in that department.


That's what the GPGPU is for, and it's a hell of a lot faster at doing parallelism than any CPU will ever be.
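
Roughly speaking, yes: GPGPU work is just a tiny per-element "kernel" with no dependencies between elements, which is why it spreads across hundreds of stream processors instead of a handful of CPU cores. A generic C++ sketch of the shape of that work, not actual GX2/Wii U code:

#include <cstddef>
#include <vector>

// The per-element "kernel": it reads and writes only its own element, so
// every invocation is independent -- exactly the property a GPU exploits.
inline float kernel(float x) { return x * x + 1.0f; }

int main() {
    std::vector<float> data(1 << 20, 2.0f);
    // On a CPU this loop runs on a few cores at best; a GPU dispatch would
    // run the same kernel across all of its ALUs at once.
    for (std::size_t i = 0; i < data.size(); ++i)
        data[i] = kernel(data[i]);
}

The flip side is that anything with heavy branching or serial dependencies stays CPU work, so a GPGPU doesn't simply replace CPU-side parallelism.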
 