
Rumor: Wii U final specs

I've been told the reasons since E3 and have tried to reassure people who asked in several threads why there wasn't AA in a lot of games, by repeating the words of my sources: "there will be AA at launch". Truth be told, I haven't understood everything, so I needed some insight from people like alstrong or blu on AA before revealing more, and just forgot to do it, as I'm busier now than I was pre-E3.
Okay, thanks. There is hope.
 

IdeaMan

My source is my ass!
My biggest concern is that most of that 1GB of OS RAM will be sitting there doing nothing most of the time. I could go on to discuss this in detail, but it's a bit too early; I don't want to speculate too wildly and need more info first.

I hope they will put in place some kind of dynamic allocation of memory resources, so as not to waste, say, 0.5GB of RAM from this 1GB pool "in case this COD player would need to switch to the browser and store/cache the game". Well, that's true IF there is indeed a huge "ghost" area within that 1GB of RAM reserved for multi-tasking.

We can even imagine a "hardware hardcore mode", where a dev could choose to force a real pause/quit of their game rather than multi-tasking, to make full use of this reserved space.

All of this is relevant only if Nintendo does indeed use this 1GB of RAM for the OS in this way.
 

SapientWolf

Trucker Sexologist
My biggest concern is that most of that 1GB of OS RAM will be sitting there doing nothing most of the time. I could go on to discuss this in detail, but it's a bit too early; I don't want to speculate too wildly and need more info first.
If it's slower memory then it may not be much use for gaming. It would be pretty unwise to carve out 1GB of GDDR3 just for an OS.
 

Stewox

Banned
I hope they will put in place some kind of dynamic allocation of memory resources, so as not to waste, say, 0.5GB of RAM from this 1GB pool "in case this COD player would need to switch to the browser and store/cache the game".

Just pray it's the same kind of memory. I'm positive.


EDIT:
If it's slower memory then it may not be much use for gaming. It would be pretty unwise to carve out 1GB of GDDR3 just for an OS.

Yeah, *beaten*.

Also, it's their multi-tasking/suspend functionality that we were talking about; that's why it's such a big chunk, but they need to find solutions.
 
If it's slower memory then it may not be much use for gaming. It would be pretty unwise to carve out 1GB of GDDR3 just for an OS.

Indeed. That so much is reserved definitely points to DDR3. The other thing is (and this is where my knowledge is fuzzier, but I've read some Beyond3D posts on the subject :p ) that increasing RAM size without significantly increasing bandwidth will only net you so much. There comes a point where the benefits become more limited in application (you could load up the next level with the extra RAM, say, but you're not going to get more frames per second), and a 2 GB to 25-30 GB/s ratio seems way off for gaming applications. 1 GB with that bandwidth ballpark makes more sense.
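The capacity-to-bandwidth point can be made concrete with a quick back-of-envelope calculation. A minimal sketch, assuming the ~25.6 GB/s bandwidth ballpark from the thread and a 60fps target (neither is a confirmed spec):

```python
# How much memory can the system actually touch in a single frame?
BANDWIDTH_GBPS = 25.6  # assumed main-RAM bandwidth, GB/s (thread ballpark)
FPS = 60               # assumed target frame rate

mb_per_frame = BANDWIDTH_GBPS / FPS * 1024
print(f"Memory readable per 60fps frame: ~{mb_per_frame:.0f} MB")
```

At ~25.6 GB/s only about 437 MB can move through the bus per 60fps frame, so capacity beyond the per-frame working set mostly helps with caching and preloading rather than frame rate.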
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Indeed. That so much is reserved definitely points to DDR3. The other thing is (and this is where my knowledge is fuzzier, but I've read some Beyond3D posts on the subject :p ) that increasing RAM size without significantly increasing bandwidth will only net you so much. There comes a point where the benefits become more limited in application (you could load up the next level with the extra RAM, say, but you're not going to get more frames per second), and a 2 GB to 25-30 GB/s ratio seems way off for gaming applications. 1 GB with that bandwidth ballpark makes more sense.
More memory will not give you more frames per second anyway, unless we're talking scenarios where your scene assets don't fit in RAM, so you'd have to do mid-frame fetches of assets from storage, in which case storage speed becomes the bottleneck. But how is 2GB not better than 1GB at the same BW? Yes, there might be diminishing returns when increasing the amount from, say, 16GB to 32GB, as you have to fill that memory with something. With 25GB of disc storage you could try to preload most of your content off the disc, but at 22.5MB/s that's 18.5 minutes to read the entire disc, so you'd need to stream it anyway. But we're not talking such amounts here - we're talking 1GB vs 2GB. Trust me, the latter is definitely better than the former.
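The 18.5-minute figure above checks out with simple arithmetic (assuming decimal gigabytes and a sustained 22.5MB/s read speed):

```python
# Time to read an entire 25GB disc at a sustained 22.5MB/s
DISC_GB = 25      # assumed disc capacity (decimal GB)
READ_MBPS = 22.5  # assumed sustained read speed, MB/s

minutes = DISC_GB * 1000 / READ_MBPS / 60
print(f"Full disc read: ~{minutes:.1f} minutes")
```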
 

Xun

Member
I know the GPU should hopefully be quite meaty, but any idea of how the CPU will stack up against the PS3/360?
 
More memory will not give you more frames per second anyway, unless we're talking scenarios where your scene assets don't fit in RAM, so you'd have to do mid-frame fetches of assets from storage, in which case storage speed becomes the bottleneck. But how is 2GB not better than 1GB at the same BW? Yes, there might be diminishing returns when increasing the amount from, say, 16GB to 32GB, as you have to fill that memory with something. With 25GB of disc storage you could try to preload most of your content off the disc, but at 22.5MB/s that's 18.5 minutes to read the entire disc, so you'd need to stream it anyway. But we're not talking such amounts here - we're talking 1GB vs 2GB. Trust me, the latter is definitely better than the former.

Point well taken. I agree more is better even at the same bandwidth, but don't you think there are diminishing returns beyond 1 or 1.5 GB of RAM with low bandwidth? From what I have read, the golden ratio of capacity:bandwidth could not be agreed upon, but having higher-bandwidth memory definitely seemed to impact performance. Forgetting about the reserved amount for system applications for a moment, wouldn't you rather have 1 GB of high-bandwidth GDDR5 than 2 GB of DDR3 at half the throughput? Obviously, performance is affected by the GPU and the entire system as well (plus the benefits of the eDRAM, which should go beyond Xenos, as you so kindly explained), but we don't know at this point what the main graphical bottleneck of the system is.
 

FyreWulff

Member
... ff..finally, sort of. FCC IDs not revealed here, obviously.

related post: http://www.neogaf.com/forum/showpost.php?p=41490721&postcount=818

Definitely not Bluetooth :)

This makes sense, as someone else on GAF crunched the numbers and figured out that the bandwidth needed to push the images out to the gamepads was about the same as Wireless-N's upper limit, so 5GHz networking makes sense.

Also means GamePads shouldn't be murdered by microwaves being run.
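A rough sketch of the calculation being referenced, assuming the GamePad's 854x480 screen, an uncompressed 24-bit 60fps stream, and 802.11n's ~600 Mbps theoretical ceiling (the real stream is compressed, so these numbers are illustrative):

```python
# Raw bandwidth needed to stream the GamePad's screen uncompressed
WIDTH, HEIGHT = 854, 480  # GamePad display resolution
BITS_PER_PIXEL = 24       # assumed uncompressed 24-bit colour
FPS = 60

mbps = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS / 1e6
print(f"Uncompressed stream: ~{mbps:.0f} Mbps (802.11n ceiling: ~600 Mbps)")
```

The raw figure lands right around Wireless-N's upper limit, which is why a dedicated 5GHz link is far more plausible than Bluetooth for the video.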
 
This makes sense, as someone else on GAF crunched the numbers and figured out that the bandwidth needed to push the images out to the gamepads was about the same as Wireless-N's upper limit, so 5GHz networking makes sense.

Also means GamePads shouldn't be murdered by microwaves being run.

Yes, but that's the A/V signal. What is sending the actual input data to and from the Gamepad? I always assumed Bluetooth for that.
 

Daschysta

Member
If it's slower memory then it may not be much use for gaming. It would be pretty unwise to carve out 1GB of GDDR3 just for an OS.

Unless they were being conservative and always intended to allocate more for gaming once they had a true idea of the footprint the OS would require in everyday use. They did precisely that with the 3DS; it's hardly unprecedented.
 
Interesting that they're going with WUP for the product codes. In the past they've almost always used the console's codename, so I was expecting something like CFE.

What does the P stand for? Remember, Wii U had several supposed codenames (Stream, Cafe, etc.)... The P still sounds to me like a leftover from some sort of codename.
 

Proc

Member
Pre-ordered the pro model today at EB Games... figured I'd reserve one just in case I end up wanting one. I'm on the fence right now.
 
Unless they were being conservative and always intended to allocate more for gaming once they had a true idea of the footprint the OS would require in everyday use. They did precisely that with the 3DS; it's hardly unprecedented.


I keep hearing this about the 3DS, but I thought it was the second CPU they opened up for devs? Wasn't the available RAM 96MB from the start, or is it more now?
 

Van Owen

Banned
Iwata said at an investors' meeting that NOT EVERY GAME GETS AN AAA BUDGET.

There's your explanation.

Well I'm glad they're taking Wii U's launch seriously with cheaply made games? lol

When Retro's game looks like a good PS3/360 title it will be "IT'S THEIR FIRST GAME ON NEW HARDWARE".
 
What does the P stand for? Remember, Wii U had several supposed codenames (Stream, Cafe, etc.)... The P still sounds to me like a leftover from some sort of codename.

Well, since the "Deluxe Set" is called the "Premium Set" in Japan and Europe, and given that you can see a black GamePad here, we could assume it means "Wii U Premium".

White hardware may be labeled "WUB" for Wii U Basic.
 
Well I'm glad they're taking Wii U's launch seriously with cheaply made games? lol

When Retro's game looks like a good PS3/360 title it will be "IT'S THEIR FIRST GAME ON NEW HARDWARE".

Because the mainstream audience doesn't care about "LOOK AT DEM GRAFIX" like the few % of hardcorezzz here on NeoGAF do.
 
Well, since the "Deluxe Set" is called the "Premium Set" in Japan and Europe, and given that you can see a black GamePad here, we could assume it means "Wii U Premium".

White hardware may be labeled "WUB" for Wii U Basic.

Wub wub.

No, I don't think the premium and basic batteries have different product codes :p (WUP-012)

WUPad, unless the main unit is also WUP, then I dunno.
 

Thraktor

Member
If it's slower memory then it may not be much use for gaming. It would be pretty unwise to carve out 1GB of GDDR3 just for an OS.

They wouldn't be using GDDR3 in any case, but I'd put a good amount of money on it being a single pool of DDR3 with a 128 bit bus. The only questions are really the latency (my guess is this is what Nintendo's primarily concerned with) and clock speed (which is likely to be a clean multiple/divisor of GPU and CPU clocks).
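For what it's worth, a 128-bit DDR3 bus maps to bandwidth like this; the speed grades below are illustrative, not confirmed specs:

```python
# Peak bandwidth = transfer rate (MT/s) x bus width (bytes per transfer)
BUS_BYTES = 128 // 8  # 128-bit bus = 16 bytes per transfer

for mt_per_s in (1066, 1333, 1600):  # common DDR3 speed grades
    gb_per_s = mt_per_s * 1e6 * BUS_BYTES / 1e9
    print(f"DDR3-{mt_per_s}: {gb_per_s:.1f} GB/s")
```

DDR3-1600 on a 128-bit bus lands at exactly 25.6 GB/s, which matches the 25-30 GB/s ballpark discussed earlier in the thread.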
 
Well, since the "Deluxe Set" is called the "Premium Set" in Japan and Europe, and given that you can see a black GamePad here, we could assume it means "Wii U Premium".

White hardware may be labeled "WUB" for Wii U Basic.

*goes to look at the back of the white controller image* In the meantime, a flaw in your logic: how will GAMES be listed? WUG? That would be the first time in Nintendo's history that a system, its accessories, and its games have had different three-letter identifiers.

(edit) The white controller also has WUP on the back.
 
Are there any shots of the back of any Wii U boxes yet? I'd love to see if they too use the WUP three-letter code. I'm going to assume they do, though, since Nintendo has always used the three-letter code.

The only difference between the black and white controllers' serials is the initial three letters (LTA for white, LTB for black).
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Point well taken. I agree more is better even at the same bandwidth, but don't you think there are diminishing returns beyond 1 or 1.5 GB of RAM with low bandwidth? From what I have read, the golden ratio of capacity:bandwidth could not be agreed upon, but having higher-bandwidth memory definitely seemed to impact performance.
Of course. I was discussing the 'same BW' scenario though.

Forgetting about the reserved amount for system applications for a moment, wouldn't you rather have 1 GB of high-bandwidth GDDR5 than 2 GB of DDR3 at half the throughput?
It really depends on various use-case factors, such as the type/genre of game/app, art direction, etc. Basically, it varies from title to title. If you asked me to choose disregarding all such use-case factors, though, I'd probably go with the 2GB of DDR3 for a system like the Wii U, where that RAM would not be the sole BW provider. If that condition did not hold and that RAM was the sole BW provider, then I'd go with the GDDR5.

Obviously, performance is affected by the GPU and the entire system as well (plus the benefits of the eDRAM, which should go beyond Xenos, as you so kindly explained), but we don't know at this point what the main graphical bottleneck of the system is.
We surely don't. That said, even if the system were the most well-rounded console imaginable, the use-case factorisation applies here as well: some games will still be RAM BW limited, others GPU ALU limited, others CPU limited, and yet others storage access speed limited (various combinations of those factors at various points during runtime are also valid). The job of the engineers responsible for the hardware design is not to make a console nobody could ever hit a bottleneck on; it's to avoid making a system where, no matter what people do, they always hit the same few bottlenecks time and time again.
 
Are there any shots of the back of any Wii U boxes yet? I'd love to see if they too use the WUP three-letter code. I'm going to assume they do, though, since Nintendo has always used the three-letter code.

The only difference between the black and white controllers' serials is the initial three letters (LTA for white, LTB for black).
This link mentions a 250-volt adapter with the product code WUP-002. Both adapters have WUP, so the Wii U box (guessing WUP-001) will too.
 

Jonm1010

Banned
Because the mainstream audience doesn't care about "LOOK AT DEM GRAFIX" like the few % of hardcorezzz here on NeoGAF do.
I like Nintendo damage controllers.

"OMG look how beautiful Mario Galaxy looks. Looks better than some 360 games."

guy*: so Wii U is gonna be stomped by PS4/720

"no one cares about graphics!"
 
Except those saying the games look better would make much more sense, considering the stronger hardware.

Let him dream

In his world, a ~600 GFLOP GPU with a far more modern feature set produces shittier graphics than a 241 GFLOP GPU which lacks most modern features.
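For anyone wondering where GFLOPS figures like these come from: it's just ALU lanes x FLOPs per lane per clock x clock speed. A sketch using the widely cited Xbox 360 Xenos numbers (the ~600 GFLOP Wii U figure is thread speculation, so it's left out):

```python
def gflops(alu_lanes: int, flops_per_clock: int, clock_ghz: float) -> float:
    """Theoretical peak shader throughput in GFLOPS."""
    return alu_lanes * flops_per_clock * clock_ghz

# Xenos: 48 ALUs, each 5 lanes wide (vec4 + scalar), multiply-add
# counted as 2 FLOPs, at 500 MHz
print(gflops(48 * 5, 2, 0.5))  # -> 240.0
```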
 
Well I'm glad they're taking Wii U's launch seriously with cheaply made games? lol

When Retro's game looks like a good PS3/360 title it will be "IT'S THEIR FIRST GAME ON NEW HARDWARE".
I have no doubt it will look like a top-tier PS3/360 title at the very least.

Though, as EatChildren has said, it may not have a tech-focused art style.

But make no mistake, my friend: Retro and EAD Tokyo are not studios to be trifled with. When it comes to art design and technical competence, these two studios are Nintendo's best, bar none.

The games probably won't look a generation removed from the PS3's and 360's best, but they will be gorgeous games all the same.
 
I like Nintendo damage controllers.

"OMG look how beautiful Mario Galaxy looks. Looks better than some 360 games."

guy*: so Wii U is gonna be stomped by PS4/720

"no one cares about graphics!"

Quote me where I said that!

And beautiful graphics are a nice addition, but if I had to accept shitty graphics to get superior gameplay, I would choose the gameplay...

E3 2013 can't come soon enough, but not for the reason you might think.
 

Jonm1010

Banned
Quote me where I said that!

And beautiful graphics are a nice addition, but if I had to accept shitty graphics to get superior gameplay, I would choose the gameplay...

E3 2013 can't come soon enough, but not for the reason you might think.
Graphics and gameplay aren't mutually exclusive.

And it was just a harmless jab, not necessarily directed at you.
 
This link mentions a 250-volt adapter with the product code WUP-002. Both adapters have WUP, so the Wii U box (guessing WUP-001) will too.

Thank you, sir. So that more or less confirms the three-letter code, but it still doesn't tell us what the P stands for... I'm probably far more interested in this than I should be... but I've always been fond of these codes ever since I read an article on what the DOL stood for on the back of GCN game cases (and later confirmed for myself that every other system Nintendo made did the same).
 
By a small margin.
One might even say a marginal margin.

Look, there will be some things the Wii U undoubtedly does better. Texture res can be higher, world size can be larger. But we are still talking last-gen-plus visuals.

I happen to think that if you have trouble seeing your vision through to fruition on hardware that capable... you're doing something really wrong.
 