
Wii U Community Thread

How broken is Ninja Gaiden 3 now, in relation to the supposed fixes that are being made in Razor's Edge?
I was actually kinda looking forward to buying it last year... :[
 
512 MB for an OS is INSANE. Nintendo will optimize it. No way I see the OS using more than 256 MB of RAM, and even that is too much. 128 MB would be more like it. Wii U should have around 1.8 GB of RAM available for games.

I've been less concerned about the amount of RAM than I have been about the type and bandwidth. After mulling it over and examining all the rumors/likely scenarios, I'm currently expecting them to throw in some cheap DDR3-1600. Samsung's 30nm chips seem to fit the bill. Nintendo contracted Samsung to provide the GDDR3 in Wii, and as we've seen, Nintendo seem to like sticking with their historical partners unless there's some major drawback.

My reasoning behind this assessment is as follows: For one, there's that Espresso dude over at Beyond3D, who more and more appears to be legit. This is despite the fact that I doubt his claims that the WiiU CPU is "just" Broadway overclocked and shrunk down. It likely appears to be so outwardly, since the 476fp is an evolution of Broadway and Wii U's CPU looks to retain some legacy features which were lost in that evolution. Over the last year or so, he has not been the only person to outwardly say that the dev kits contain DDR RAM (as opposed to GDDR). This seems to make sense of what we've heard from other devs - they are always quick to point out that Wii U has "more RAM" than PS360, but never that it's faster or more advanced, which you'd think they would if that were true (quite the contrary, as Arkam claimed it was slow).

It also fits with what we've seen of Wii U graphically. Textures, which are heavily dependent on RAM bandwidth, have been nothing to write home about. That speed of DDR3 would net a bandwidth of 25.6 GB/s on a 128-bit bus, which is barely over the 360's 21.6 GB/s (and perhaps enough for an additional SD image streaming to the Gamepad). Also take into account that the OS is possibly going to hog up some of that bandwidth. My guess is that the lack of AA in some games we've seen may be due to developers choosing to use some of the GPU's eDRAM as texture cache instead of as a framebuffer.
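Just to show where that 25.6 GB/s figure comes from, here's the back-of-the-envelope math (the DDR3-1600 and 128-bit bus figures are my own assumptions from above, nothing confirmed):

Code:
# peak bandwidth = transfers per second x bus width in bytes
ddr3_1600_mts = 1600e6                 # DDR3-1600: 1600 mega-transfers/s (assumed)
bus_width_bytes = 128 / 8              # assumed 128-bit bus
print(ddr3_1600_mts * bus_width_bytes / 1e9)   # -> 25.6 (GB/s)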

Finally, it would coincide with the rumored use of a PowerPC 476fp (or if we're getting technical, the synthesizable 470S, since Nintendo will be heavily customizing the core to be even more Broadway-like). That core runs its L2 cache at one half the speed of the CPU core (as do many other IBM designs, as I've recently discovered). The Processor Local Bus 6 (PLB6), which is commonly bundled with that CPU core, maxes out at 800 MHz (alternatively, I've read that a Cell-like ring bus may be employed instead, but that also runs at half speed). This would imply a 1.6 GHz core clock rate, which fits the rumors of the CPU being clocked closer to Broadway than Xenon and is also the most used example frequency for the 476fp. Coincidentally, it is also one half Xenon's clock, but likely made up for somewhat by its larger L2 cache, OoOE, much shorter pipeline, and capability of issuing 4 instructions per cycle. Knowing this, it makes sense to me for both the L2 and system RAM to run at 800 MHz.
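Put as simple napkin math (again, this is pure speculation from the 476fp/PLB6 documentation, not insider info):

Code:
plb6_max_mhz = 800              # PLB6 / L2 assumed to run at half the core clock
core_mhz = plb6_max_mhz * 2     # -> 1600 MHz core under that assumption
xenon_mhz = 3200
print(core_mhz, core_mhz / xenon_mhz)   # 1600, 0.5 -> exactly half of Xenon's clock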

What does this mean for games? I stated previously that the GPU's 32 MB eDRAM might frequently be used as a texture cache rather than framebuffer, leaving AA to software techniques such as FXAA or MLAA. To me, it does seem like a somewhat disappointing bottleneck, but price is clearly on the mind of Iwata, as his comments regarding the CPU reflect. With smart texture caching and perhaps some RAGE-style disc streaming, talented and determined developers should still be able to achieve gorgeous results, but sucking the full power out of the supposed ~600 GFLOPS GPU may prove more difficult than some of us had hoped.
 

Stewox

Banned
This might be a dumb question or answered before, but with the Wii U supporting HDMI, would it improve Wii games' audio/visuals at all versus the AV cables the Wii uses? I know the Wii U won't upscale Wii games to 720p or 1080p, but would it offer any slight improvements whatsoever?

My worry is that the Multi AV cable won't support 6-channel output for external speakers. That would be kind of a waste of the experience; I think Nintendo really does sometimes go too far with their money-saving strategy, where it doesn't bring any particular benefit and just cuts the thing that would make a big difference for the overall system.

Unless you find some kind of adapter that extracts audio from HDMI without slowing down the video signal.

But if you have an HDTV with optical out, you can use that, though of course then you'll need an adapter for that if you have PC surround speakers, which are usually analog.
 

JordanN

Banned
Fourth Storm said:
This seems to make sense of what we've heard from other devs - they are always quick to point out that Wii U has "more RAM" than PS360, but never that it's faster or more advanced
Just for the record, how many devs actually touch upon RAM speed?

I remember High Voltage and Traveller's Tales only said the 3DS has more RAM than the Wii. And the actual 3DS RAM outspeeds the Wii's internal 24 MB of RAM.

The only time I've heard a direct reference was in the N64/GameCube era, and that was mostly Nintendo's doing.
 

PhantomR

Banned
Definite Buy

Rayman Legends
Pikmin 3
Mass Effect 3
Arkham City
Scribblenauts Unlimited
New Super Mario Bros U
P-100

Might Buy

Ninja Gaiden 3
ZombiU
Nintendoland
Game & Wario
Tekken Tag Tournament 2
Assassin's Creed 3
Darksiders II


Hoping that EA announces FIFA 13 for Wii U. If so that goes right on the "definite buy" list.
 

Drago

Member
Definites:

NSMBU
ZombiU
Pikmin 3
Scribblenauts
Sonic & All Stars Transformed
Lego City
ACIII
Rayman Legends
P-100
NintendoLand

Among DL titles and others I'm forgetting
Wallet may Cry :(
 

DrLyme

Neo Member
So guys, after closer inspection of (some of) the Wii U games, which ones are you getting?

Will be buying:

Little Inferno (More excited for this than anything else, and I don't even know what the gameplay is.)
Pikmin 3
NSMBU
Batman Arkham City: AE

Probably buying:

Nintendo Land
Rayman Legends
Darksiders II

On the radar but not yet sold:

P-100
Scribblenauts Unlimited
Trine 2
 

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
I've been less concerned about the amount of RAM than I have been about the type and bandwidth. After mulling it over and examining all the rumors/likely scenarios...

...

I will choose not to believe your theory. The alternative is too painful.
 

AzaK

Member
All the Wii U games here (although the GTA:V source is unreliable),
http://en.wikipedia.org/wiki/List_of_Wii_U_games

Scroll down for the digital list.

There's not anything that I currently think is a must-have at launch/window, but I'm interested in NintendoLand if it's a pack-in or extremely cheap, because it won't be a commonly played game for me but would be nice for a blast with friends every now and then.

I haven't played any Mass Effects so I'm interested in that a bit. I hated AC1, and III seems to have the same moronic "kill someone in plain view and no one notices" mechanic, so I'll probably give that a miss even though I bet it looks nice.

P-100 is interesting so long as they make it more than just a mini-boss fest, as it looks very shallow so far.

Zombi U is probably highest on my list so I'm likely to pick that up. Aliens is too, but I'm a wuss when it comes to games like that and I often end up not finishing them.
 

JordanN

Banned
Mythbusted (again)?

Unfortunately, people will still find a way to twist poor Jeremiah's words into something they're not. That, or join the next "anonymous developer" train.
 

Shikamaru Ninja

任天堂 の 忍者
Nintendo Land and Super Mario Bros. U are the only games I am really excited about. Both quite frankly seem like the premier Nintendo experiences this fall. Pikmin 3 I am not so sure about.

The other games, like Wii Fit U and Sing, I have no intention of buying.
 

EuroMIX

Member
I'm actually wondering what's going to happen to the PAL VC games on the Wii that never supported component cable interlaced mode if they are transferred to the Wii U.
 
I've been less concerned about the amount of RAM than I have been about the type and bandwidth. After mulling it over and examining all the rumors/likely scenarios, I'm currently expecting them to throw in some cheap DDR3-1600. Samsung's 30nm chips seem to fit the bill. Nintendo contracted Samsung to provide the GDDR3 in Wii, and as we've seen, Nintendo seem to like sticking with their historical partners unless there's some major drawback.

My reasoning behind this assessment is as follows: For one, there's that Espresso dude over at Beyond3D, who more and more appears to be legit. This is despite the fact that I doubt his claims that the WiiU CPU is "just" Broadway overclocked and shrunk down. It likely appears to be so outwardly, since the 476fp is an evolution of Broadway and Wii U's CPU looks to retain some legacy features which were lost in that evolution. Over the last year or so, he has not been the only person to outwardly say that the dev kits contain DDR RAM (as opposed to GDDR). This seems to make sense of what we've heard from other devs - they are always quick to point out that Wii U has "more RAM" than PS360, but never that it's faster or more advanced, which you'd think they would if that were true (quite the contrary, as Arkam claimed it was slow).

It also fits with what we've seen of Wii U graphically. Textures, which are heavily dependent on RAM bandwidth, have been nothing to write home about. That speed of DDR3 would net a bandwidth of 25.6 GB/s on a 128-bit bus, which is barely over the 360's 21.6 GB/s (and perhaps enough for an additional SD image streaming to the Gamepad). Also take into account that the OS is possibly going to hog up some of that bandwidth. My guess is that the lack of AA in some games we've seen may be due to developers choosing to use some of the GPU's eDRAM as texture cache instead of as a framebuffer.

Finally, it would coincide with the rumored use of a PowerPC 476fp (or if we're getting technical, the synthesizable 470S, since Nintendo will be heavily customizing the core to be even more Broadway-like). That core runs its L2 cache at one half the speed of the CPU core (as do many other IBM designs, as I've recently discovered). The Processor Local Bus 6 (PLB6), which is commonly bundled with that CPU core maxes out at 800 Mhz (alternatively, I've read that a Cell-like ring bus may be employed instead, but that also runs at half speed). This would imply a 1.6 Ghz core clock rate, which fits the rumors of the CPU being clocked closer to Broadway than Xenon and is also the most used example frequency for the 476fp. Coincidentally, it is also one half Xenon's clock, but likely made up for somewhat by its larger l2 cache, OoOE, much shorter pipeline, and capability of issuing 4 instructions per cycle. Knowing this, it makes sense to me for both the L2 and system RAM to run at 800 Mhz.

What does this mean for games? I stated previously that the GPU's 32 MB eDRAM might frequently be used as a texture cache rather than framebuffer, leaving AA to software techniques such as FXAA or MLAA. To me, it does seem like a somewhat disappointing bottleneck, but price is clearly on the mind of Iwata, as his comments regarding the CPU reflect. With smart texture caching and perhaps some RAGE-style disc streaming, talented and determined developers should still be able to achieve gorgeous results, but sucking the full power out of the supposed ~600 GFLOPS GPU may prove more difficult than some of us had hoped.
Too bottlenecked for a Nintendo console.

So I doubt it.
 

Stewox

Banned
512 mb for an OS is INSANE. Nintendo will optimize it. No way I see the OS using more than 256mb of RAM and even that is too much. 128mb would be more like it. Wii U should have around 1.8GB of RAM available for games.

See, this is one of the examples of the term "optimize" getting miscontextualized that I was talking about 10+ pages back.

The OS won't use that much memory, that would be ridiculous. We don't know what else the Wii U is reserving that memory for - suspend mode maybe, but I've got a solution to that (writing a temporary hiberfil.sys-style file to storage and then reloading it; I think for the sake of speed this wasn't done, and commercially I think the casuals would complain about the waiting, which is kind of a petty, childish non-issue to weigh against a much more important technical limitation).

I really hope devs point that out and it gets unlocked sooner rather than later. All that memory could be immediately important for games like Rage if they port it; Rage is quite a big game and it got extremely compressed down for PS360.

Maybe the next Xbox will be called Xbox 490 or something so NeoGAF can continue using a similar "PS490" - imagine contacting Microsoft and asking about that.
 

schuelma

Wastes hours checking old Famitsu software data, but that's why we love him.
Katsuhiro Harada is anonymous?

I just want to get this straight- someone saying that the clock speed on the CPU is currently running lower than on other consoles really means he is saying that the console is weaker than the other ones?

Just take a minute and make sure that is what you are trying to say.
 
DEFINITE
Assassin's creed 3
Nintendo land
Game and wario
Just dance 4
project p-100
Pikmin 3

MAYBE
scribblenauts unlimited
Mass effect 3
Ninja gaiden 3
Darksiders 2
 

schuelma

Wastes hours checking old Famitsu software data, but that's why we love him.
If I get the console at launch, I would for sure pick up P-100, Assassins Creed, and Pikmin 3.

If reviews were solid I would also get ZombiU day one.
 
Just for the record, how many devs actually touch upon RAM speed?

I remember High Voltage and Traveler's Tale only said 3DS has more RAM than Wii. And the actual 3DS RAM outspeeds the Wii's internal 24mb ram.

Only time I've heard direct reference was the N64/Gamecube era and that was mostly Nintendo's doing.

I think it would be something worth noting, and also something which would be easy to notice in games (especially since higher resolution assets exist for those games with PC versions). This one point in and of itself isn't enough to make a case, but it is something I took note of every time devs mention Wii U's increased RAM. If it were GDDR5, that would represent a clear leap over current gen.

I will choose not to believe your theory. The alternative is too painful.

I'm here for you, bro...and I could be wrong.

Too bottlenecked for a Nintendo console.

So I doubt it.

Eh, they might not see it as a bottleneck. Nintendo seem to be targeting a specific performance/cost ratio. There are cards out there with a similar config. Take a look at the Mobility Radeon HD 4830. As I stated, there could be ways around it.

Besides, I take the notion that Nintendo never bottlenecks their systems as a myth - almost every one of their home consoles has been held back by one or 2 aspects.
 

LCGeek

formerly sane
Besides, I take the notion that Nintendo never bottlenecks their systems as a myth - almost every one of their home consoles has been held back by one or 2 aspects.

You're making that notion worse.

There's a difference between how data storage can bottleneck power and just making a bad architecture.

Bottlenecks, be it console or PC, are quite common, and the only people without them are the people building their stuff right.
 

donny2112

Member
Don't think I'll be getting the console at launch. Of the launch window titles, it would probably work out like this to purchase eventually, though.

Definite
Nintendoland
NSMB U
Pikmin 3
Assassin's Creed III
Wii Fit U

Maybe
P-100
Zombi U
Game and Wario
 
You're making that notion worse.

There's a difference between how data storage can bottleneck power and just making a bad architecture.

bottlenecks be it console or pc are quite common and the only people without them are people building their stuff right.

Sorry, I'm a bit confused by your wording that I'm making that notion worse. For the record, I wasn't talking only about storage media, but the SNES' slow CPU, the small texture cache of N64, and nearly useless ARAM of Gamecube. I agree bottlenecks are common, but it probably comes down to price. Who knows, perhaps Nintendo wouldn't have saved much by going for a shittier GPU, or maybe they truly believe that what they have is sufficient. But we're also hearing about a pretty modest CPU that would probably end up being overfed w/ more than 25.6 GB/s. If anything, a 600 GFLOPS GPU is the outlier in the whole equation. But the eDRAM should be fast enough to pick up some slack, and the system operations in the background shouldn't hog much bandwidth either. I expect them to be minimal and not actually kick in until you hit the home button, at which point some type of save state feature could be employed as has been described by Stewox.
 
Yes, I agree. Everything looks quite a bit smoother than the E3 build. Looks like another game I have to add to the list of games I will buy at launch.

I can't believe I'm considering this after buying Red Steel at the Wii launch lol.

Think I'm probably the only person on the planet that actually liked the first Red Steel. For its time it was pretty good and had some nice touches along the way, such as the Rabbids level, being able to roll or throw grenades, and an original multiplayer mode, Killer, which gave instructions to each player through the remote speaker. It had its faults, but as a launch title I personally found it entertaining.

Those two videos of ZombiU look great, I thought the lighting at E3 looked great but it looks like Ubisoft have stepped the quality up a couple of notches. Really looking nice.

This game is going to sell truckloads as long as Ubisoft don't find a way to fuck it all up. I've only seen one person that's complained about it after playing it, and I suspect he was enjoying a bit of trollage.

I can see this being a real system seller and am expecting it to be one of the third party titles that Nintendo are going to match marketing costs to promote.
 
Eh, they might not see it as a bottleneck. Nintendo seem to be targeting a specific performance/cost ratio. There are cards out there with a similar config. Take a look at the Mobility HD Radeon 4830. As I stated, there could be ways around it.
Nintendo are the guys that will tone down performance of a platform if that reduces bottlenecks; they're not so likely to go with less than "optimal" RAM of all things.

Their philosophy is not so much about high end as it is about a wysiwyg kind of architecture. Since they produce software for it, they'll be worried first and foremost about it being easy to pull stuff out of it (and reducing R&D later on too). Since they're going for third parties too, they'll want to have a straightforward platform rather than a broken-up development experience.

And a 32 MB eDRAM bank doesn't really change that, it's too small; would require lots of caching when you have HD textures taking several times more than that amount.
Besides, I take the notion that Nintendo never bottlenecks their systems as a myth - almost every one of their home consoles has been held back by one or 2 aspects.
There's always bottlenecks, even Nintendo platforms that were regarded as balanced had them, for instance GC lacked in RAM amount; but they went for less RAM precisely because of all things they'd rather have top grade RAM in their system as that reduced bottlenecks big time.

They did the same with 3DS recently. They seem to have a fetish to go with RAM with a twist.
 
Think I'm probably the only person on the planet that actually liked the first Red Steel. For its time it was pretty good and had some nice touches along the way such as the Rabbids level, being able to roll or throw grenades and had an original multiplayer mode, Killer, which gave instructions to each player through the remote speaker. It had its faults but as a launch title I personally found it entertaining.

I enjoyed it too. However, I was pretty blinded by any kind of motion control at the very beginning of the Wii's life, even if it was bad, including Wii Play and Twilight Princess. They really had me hyped back then like crazy. Pretty much the complete opposite of now.
I'll keep it for nostalgia of the magical Wii launch time and whatnot, but even back then I remember certain parts pissing me off. Some actually neat ideas here and there, but it certainly was half-baked. The music was great though, as far as I remember.

ZombiU really doesn't appeal to me at all though. Red Steel at least went for a not so ordinary setting/scenario, which kind of made up for me that I really don't like almost every FPS out there.
 

JordanN

Banned
And a 32 MB eDRAM bank doesn't really change that, it's too small; would require lots of caching when you have HD textures taking several times more than that amount.

There's always bottlenecks, even Nintendo platforms that were regarded as balanced had them, for instance GC lacked in RAM amount; but they went for less RAM precisely because of all things they'd rather have top grade RAM in their system as that reduced bottlenecks big time.

They did the same with 3DS recently. They seem to have a fetish to go with RAM with a twist.
Compared to what? The Dreamcast had 26 MB, the PS2 had 36 MB, the GCN had 43 MB and the Xbox had 64 MB. It would only be lacking compared to the Xbox, and that launched later, so it's not a fair comparison.
 
Speaking of zombies, does anyone else think RE6 will end up being awful? I've not been impressed with what I've seen of it.


------

Q about Zombie U. Will they change the height of the first person view depending on the survivor you are playing?

From what I've seen so far ZombiU is going to be more Resident Evil than Resident Evils 4, 5 and 6. And I never thought I'd say that about a Ubisoft title of all publishers! :Oo
 
I've noticed something interesting about the new trailer for Zombi U.

The zombie has a red jacket in the middle area:

[screenshot: Desktop_2012_07_13_20_38_15_461.bmp]

From the same trailer, the zombie now doesn't have the red jacket or the same pants. Was it eaten by crows, or is it a different model altogether?

[screenshot: Desktop_2012_07_13_22_07_04_639.bmp]

- Graphical improvements over E3

Depth of Field & Anti-Aliasing added:

[screenshot: Desktop_2012_07_13_22_07_11_936.bmp]

E3 build without either DOF or AA:

[screenshot: Desktop_2012_07_13_22_07_44_613.bmp]
 
Nintendo are the guys that will tone down performance of a platform if that reduces bottlenecks, they're not so likely to go with less than "optimal" ram of all things.

Their philosophy is not so much about high end as it is about a wysiwyg kind of architecture. Since they produce software for it they'll be worried first and foremost about being easy to pull stuff out of it (and reducing R&D later on too). Since they're going for third party's too they'll want to have a straightforward platform rather than a broken up development experience.

And a 32 MB eDRAM bank doesn't really change that, it's too small; would require lots of caching when you have HD textures taking several times more than that amount.

There's always bottlenecks, even Nintendo platforms that were regarded as balanced had them, for instance GC lacked in RAM amount; but they went for less RAM precisely because of all things they'd rather have top grade RAM in their system as that reduced bottlenecks big time.

They did the same with 3DS recently. They seem to have a fetish to go with RAM with a twist.

I think the "twist" this time is the IBM eDRAM. 2 GB does stick out to me as alot to have for such a low bandwidth, but it was probably easier for them to stick in 2 GB than the 1.5 GB max they indicated in those early dev kit specs. And DDR3 is cheap, so why not?

Anyway, I'd rather not believe it either, so hopefully someone can come in and prove me wrong. It's just that we've seen absolutely nothing to indicate that there's GDDR5 in that little box. There's somewhat faster DDR3 on the market as well, but with the way the CPU is shaping up to be clocked...
 
25.6 GB/s @ 1.6 GHz DDR3 is not a big enough improvement over the X360's 21.6 GB/s @ 1.4 GHz (700x2) GDDR3, especially considering this is a Nintendo platform (and it has to last for 5 or more years), and considering this generation was bandwidth starved. It would be reimplementing a bottleneck that plagued this gen. And it's not just about the CPU being overfed; this is the RAM serving GPU bandwidth.

You've said 1.6 GHz DDR3 (and there's DDR3 @ 2.5 GHz so they could even go higher, as eccentric as Nintendo is about RAM), so why not the same solution that was used for X360? 2x1.6=3.2 GHz would effectively double it to 51.2 GB/s and wouldn't cost much more to implement.
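To spell out the doubling I mean (a rough sketch, assuming the same 128-bit width Fourth Storm used above):

Code:
bus_width_bytes = 128 / 8
print(1600e6 * bus_width_bytes / 1e9)   # 25.6 GB/s at an effective 1.6 GHz
print(3200e6 * bus_width_bytes / 1e9)   # 51.2 GB/s if the effective rate (or total width) were doubled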
I think the "twist" this time is the IBM eDRAM. 2 GB does stick out to me as alot to have for such a low bandwidth, but it was probably easier for them to stick in 2 GB than the 1.5 GB max they indicated in those early dev kit specs. And DDR3 is cheap, so why not?
It's cheap, but this is a games console so going with 1066 MHz DDR3 or similar low end solutions would make no sense (even if it technically already beats X360's 700 MHz solution).

1600 MHz could be fine if it was dual channel 2x1 GB=3.2 GHz. 1600 MHz total RAM speed on the other hand... not so much when last gen was sitting at 1.4 GHz and it suffered bottlenecking for it. It's a bad tradeoff.

Now we're depending on the CPU speed. I believe Nintendo wouldn't let the total RAM speed outpace the CPU, so if it's 2.8 GHz then I bet the RAM will be at most totaling 2.8 GHz itself, hampering bandwidth a little.
Anyway, I'd rather not believe it either, so hopefully someone can come in and prove me wrong. It's just that we've seen absolutely nothing to indicate that there's GDDR5 in that little box. There's somewhat faster DDR3 on the market as well, but with the way the CPU is shaping up to be clocked...
Oh, I doubt it's GDDR5, but I also doubt it's 25.6 GB/s bandwidth; it's too low.
 

Meelow

Banned
I've noticed something interesting about the new trailer for Zombi U.

The zombie has a red jacket in the middle area:

[screenshot: Desktop_2012_07_13_20_38_15_461.bmp]

From the same trailer, the zombie now doesn't have the red jacket or the same pants. Was it eaten by crows, or is it a different model altogether?

[screenshot: Desktop_2012_07_13_22_07_04_639.bmp]

- Graphical improvements over E3

Depth of Field & Anti-Aliasing added:

[screenshot: Desktop_2012_07_13_22_07_11_936.bmp]

E3 build without either DOF or AA:

[screenshot: Desktop_2012_07_13_22_07_44_613.bmp]

Nice find!

they have some nice titles under their arm. Too bad those are eclipsed by their own shovelware

Lol, they did say though they want to make a new F-Zero so that's good.
 
From what I've seen so far ZombiU is going to be more Resident Evil than Resident Evils 4, 5 and 6. And I never thought I'd say that about a Ubisoft title of all publishers! :Oo

eh, what? I love old school RE, but ZombiU doesn't look anything like it. The games were never really that much about the zombies anyway, but being trapped in crazy-ass structures with monsters in it.
ZombiU seems to be going for the most typical zombie setting (except... in Britain) in the vein of DayZ and such, regarding gameplay.
 

Roo

Member
Lol, they did say though they want to make a new F-Zero so that's good.

That's sad, actually.
They have potential, but let's be honest: F-Zero is too damn high for them. Unreachable, I'd say.
Nice to see they're so confident about their work, though.
 
25.6 GB/s @ 1.6 GHz DDR3 is not a big enough improvement over the X360's 21.6 GB/s @ 1.4 GHz (700x2) GDDR3, especially considering this is a Nintendo platform (and it has to last for 5 or more years), and considering this generation was bandwidth starved. It would be reimplementing a bottleneck that plagued this gen. And it's not just about the CPU being overfed; this is the RAM serving GPU bandwidth.

You've said 1.6 GHz DDR3 (and there's DDR3 @ 2.5 GHz so they could even go higher, as eccentric as Nintendo is about RAM), so why not the same solution that was used for X360? 2x1.6=3.2 GHz would effectively double it to 51.2 GB/s and wouldn't cost much more to implement.

It's cheap, but this is a games console so going with 1066 MHz DDR3 or similar low end solutions would make no sense (even if it technically already beats X360's 700 MHz solution).

1600 MHz could be fine if it was dual channel 2x1 GB=3.2 GHz. 1600 MHz total RAM speed on the other hand... not so much when last gen was sitting at 1.4 GHz and it suffered bottlenecking for it. It's a bad tradeoff.

Oh, I doubt it's GDDR5, but I also doubt it's 25.6 GB/s bandwidth; it's too low.

I don't think it's a big enough improvement either, but hasn't that been the running theme with this console? I'm not trying to be negative - just relaying things as I see them after digging around a bit. And don't compare it to the 360 if that helps, because next to the Wii it's still pretty much the biggest generational jump ever.

Correct me if I'm wrong, but I believe 2.5 GHz DDR3 is just overclocked DDR3-2133 and is pretty rare (mainly for PC gaming enthusiasts). On the other hand, when I calculated 25.6 GB/s, that is already taking multiple channels into account. A 128-bit bus would get you 12.8 GB/s x 2. Nintendo would likely use eight 2-gigabit chips (each with a 16-bit interface) to achieve 2 gigabytes total (4-gigabit chips exist, but are likely too new/expensive, and not all come w/ a 32-bit interface, so bandwidth might actually end up being worse).
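Spelled out, the hypothetical chip configuration I'm assuming looks like this (purely illustrative, nothing confirmed):

Code:
chips = 8
chip_density_gbit = 2                               # 2-gigabit DDR3 parts (assumed)
chip_interface_bits = 16                            # x16 interface per chip (assumed)
capacity_gb = chips * chip_density_gbit / 8         # -> 2.0 GB total
bus_bits = chips * chip_interface_bits              # -> 128-bit bus
bandwidth_gbs = 1600e6 * bus_bits / 8 / 1e9         # -> 25.6 GB/s at DDR3-1600
print(capacity_gb, bus_bits, bandwidth_gbs)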

Edit: I believe it highly unlikely at this point that the CPU is clocked above 2 GHz. I see Harada's comments about it being just a little bit slower as sheepishly downplaying his previous comment after he realized that he'd already said too much.
 

nordique

Member
I've been less concerned about the amount of RAM than I have been about the type and bandwidth. After mulling it over and examining all the rumors/likely scenarios, I'm currently expecting them to throw in some cheap DDR3-1600. Samsung's 30nm chips seem to fit the bill. Nintendo contracted Samsung to provide the GDDR3 in Wii, and as we've seen, Nintendo seem to like sticking with their historical partners unless there's some major drawback.

My reasoning behind this assessment is as follows: For one, there's that Espresso dude over at Beyond3D, who more and more appears to be legit. This is despite the fact that I doubt his claims that the WiiU CPU is "just" Broadway overclocked and shrunk down. It likely appears to be so outwardly, since the 476fp is an evolution of Broadway and Wii U's CPU looks to retain some legacy features which were lost in that evolution. Over the last year or so, he has not been the only person to outwardly say that the dev kits contain DDR RAM (as opposed to GDDR). This seems to make sense of what we've heard from other devs - they are always quick to point out that Wii U has "more RAM" than PS360, but never that it's faster or more advanced, which you'd think they would if that were true (quite the contrary, as Arkam claimed it was slow).

It also fits with what we've seen of Wii U graphically. Textures, which are heavily dependent on RAM bandwidth, have been nothing to write home about. That speed of DDR3 would net a bandwidth of 25.6 GB/s on a 128-bit bus, which is barely over the 360's 21.6 GB/s (and perhaps enough for an additional SD image streaming to the Gamepad). Also take into account that the OS is possibly going to hog up some of that bandwidth. My guess is that the lack of AA in some games we've seen may be due to developers choosing to use some of the GPU's eDRAM as texture cache instead of as a framebuffer.

Finally, it would coincide with the rumored use of a PowerPC 476fp (or if we're getting technical, the synthesizable 470S, since Nintendo will be heavily customizing the core to be even more Broadway-like). That core runs its L2 cache at one half the speed of the CPU core (as do many other IBM designs, as I've recently discovered). The Processor Local Bus 6 (PLB6), which is commonly bundled with that CPU core maxes out at 800 Mhz (alternatively, I've read that a Cell-like ring bus may be employed instead, but that also runs at half speed). This would imply a 1.6 Ghz core clock rate, which fits the rumors of the CPU being clocked closer to Broadway than Xenon and is also the most used example frequency for the 476fp. Coincidentally, it is also one half Xenon's clock, but likely made up for somewhat by its larger l2 cache, OoOE, much shorter pipeline, and capability of issuing 4 instructions per cycle. Knowing this, it makes sense to me for both the L2 and system RAM to run at 800 Mhz.

What does this mean for games? I stated previously that the GPU's 32 MB eDRAM might frequently be used as a texture cache rather than framebuffer, leaving AA to software techniques such as FXAA or MLAA. To me, it does seem like a somewhat disappointing bottleneck, but price is clearly on the mind of Iwata, as his comments regarding the CPU reflect. With smart texture caching and perhaps some RAGE-style disc streaming, talented and determined developers should still be able to achieve gorgeous results, but sucking the full power out of the supposed ~600 GFLOPS GPU may prove more difficult than some of us had hoped.

Excellent, and very informative post, Fourth Storm

It seems strange to read about Nintendo gimping out on the RAM in such a fashion (it's not bad per se, but it's not the somewhat-typical "the best & crazy RAM" Nintendo) while ironically they leave 32 MB of eDRAM, which is quite the opposite and more than enough eDRAM for an HD video game system.

Any word as to whether the Wii U will actually be supporting 1.5 GB or 2 GB of RAM in the final box? I know 1.5 is what I have been expecting, so I am somewhat surprised many people have jumped on the 2 GB bandwagon. That's better for getting a "next gen" type of texture bump from the Wii U over PS360, even if it does take a few years of developers learning how best to utilize the slow RAM.
 

sphagnum

Banned
Not getting at launch, but of the games we know so far, I'd be interested in:

Definite
Aliens: Colonial Marines
Assassin's Creed III
Batman: Arkham City
Darksiders II
New Super Mario Bros. U
Nintendo Land
Pikmin 3
Project P-100
Rayman Legends
Tank! Tank! Tank!
Zombi U
Zelda
Smash Bros
Retro's game
Monolithsoft's game

Maybe
Game & Wario
Mass Effect 3
Ninja Gaiden 3: Razor's Edge
Scribblenauts: Unlimited
 

nordique

Member
Nintendo are the guys that will tone down performance of a platform if that reduces bottlenecks, they're not so likely to go with less than "optimal" ram of all things.

Their philosophy is not so much about high end as it is about a wysiwyg kind of architecture. Since they produce software for it they'll be worried first and foremost about being easy to pull stuff out of it (and reducing R&D later on too). Since they're going for third party's too they'll want to have a straightforward platform rather than a broken up development experience.

And a 32 MB eDRAM bank doesn't really change that, it's too small; would require lots of caching when you have HD textures taking several times more than that amount.

There's always bottlenecks, even Nintendo platforms that were regarded as balanced had them, for instance GC lacked in RAM amount; but they went for less RAM precisely because of all things they'd rather have top grade RAM in their system as that reduced bottlenecks big time.

They did the same with 3DS recently. They seem to have a fetish to go with RAM with a twist.


Though, I do agree with Fourth Storm that everything seems to be pointing in a "Wii U will have DDR3" ram direction

I would love for Nintendo to pop some GDDR5 in there instead (heck I'd love them to power the entire system up a little bit) but it doesn't seem like Nintendo will go down that route with the Wii U

RAM could change last minute though, but those days are numbered.


It could become a legitimate cost thing, and Nintendo could simply be banking on competent developers figuring out a way to exploit the Wii U's larger amount of ram and utilizing tech wizardry to get the most out of the graphics.
 
Excellent, and very informative post, Fourth Storm

It seems strange to read about Nintendo gimping out on the RAM in such a fashion (it's not bad per se, but it's not the somewhat-typical "the best & crazy RAM" Nintendo) while ironically they leave 32 MB of eDRAM, which is quite the opposite and more than enough eDRAM for an HD video game system.

Any word as to whether the Wii U will actually be supporting 1.5 GB or 2 GB of RAM in the final box? I know 1.5 is what I have been expecting, so I am somewhat surprised many people have jumped on the 2 GB bandwagon. That's better for getting a "next gen" type of texture bump from the Wii U over PS360, even if it does take a few years of developers learning how best to utilize the slow RAM.

Thanks, I try. :D Really, I'd love to be proven wrong on this one. The lack of info on the topic has driven me to extreme amounts of reading online (which makes my gf not too happy).

I think 2 GB is probably more likely than 1.5 GB right now just because if they were to achieve that same bandwidth with 1.5 GB, it would require the same amount of chips. 4 would be 2 gigabit chips and the other 4 would be 1 gigabit. Any less, and you'd be getting even slower than the aforementioned 25.6 GB/s and then it starts looking real scary.
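In other words, something like these two hypothetical chip mixes (illustrative only, not a confirmed configuration):

Code:
two_gb       = [2] * 8             # eight 2-gigabit chips
one_and_half = [2] * 4 + [1] * 4   # four 2-gigabit + four 1-gigabit chips
for cfg in (two_gb, one_and_half):
    print(sum(cfg) / 8, "GB over", len(cfg) * 16, "bit bus")
# both keep the 128-bit bus (and the same 25.6 GB/s peak); fewer chips would narrow the bus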
 
I don't think it's a big enough improvement either, but hasn't that been the running theme with this console? I'm not trying to be negative - just relaying things as I see them after digging around a bit. And don't compare it to the 360 if that helps, because next to the Wii it's still pretty much the biggest generational jump ever.
The running theme of this console is being incremental, not really a dig at going bankrupt but a balancing act of the best performance $300 (minus the price of a high tech controller) can buy. That's probably their briefing anyway.

It's not an attempt at making an X360, and bottlenecking stuff is really not in Nintendo's tradition. If they were to bottleneck then I bet they could go for 4 GB of RAM instead, considering how cheap DDR3 is now even for us (I bought 8 GB last year for 70 euro tops).

The rule of thumb seems to be making it at least 2x or better (I'd actually say 2.5/3x; 2x is worst case scenario), as the GPU seems to be 2.5x the GFLOPS of X360 (3 times the PS3 RSX GFLOPS), the RAM amount is 3x X360/PS3 (if it's 1.5 GB; if they unlock more memory it'll be even higher), and the framebuffer is also 3.2x larger than the X360 one. The CPU's real performance is inconclusive so far, so flunking on RAM would be bad.
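These are the multipliers I'm working from, using the rumoured figures floating around this thread rather than anything official:

Code:
print(600 / 240)     # ~2.5x Xenos, taking the rumoured ~600 GFLOPS vs ~240 GFLOPS
print(600 / 200)     # ~3x RSX at ~200 GFLOPS
print(1536 / 512)    # 3x the 512 MB of X360/PS3, if 1.5 GB ends up usable
print(32 / 10)       # 3.2x the X360's 10 MB eDRAM framebuffer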
Correct me if I'm wrong, but I believe 2.5 Ghz DDR3 is just overclocked DDR-2133 and is pretty rare (mainly for PC gaming enthusiasts).
No; they're designed and certified to go at that speed and thus it's not the RAM that's being overclocked but the system itself because the DDR3 speeds in question are not standard for the motherboard and cpu.

Doesn't really apply to a console as they control multiplier and fsb. They can go for whatever RAM they want although 2.5 GHz is unlikely, those types get hot (and come with heat dissipation measures)
On the other hand, when I calculated 25.6 GB/s, that is already taking multiple channels into account. A 128-bit bus would get you 12.8 GB/s x 2. Nintendo would likely use 8 2-Gigabit chips (each with a 16-bit interface) to achieve 2 gigabytes total (4 gigabit chips exist, but are likely too new/expensive, and not all come w/ a 32-bit interface, so bandwidth might actually end up being worse).
X360 is using 4 RAM chips right now (other side is blank) with 8 chips you could surely more than double the bandwidth of a 7 year old platform using DDR3.


32 MB of eDRAM is too small for a twist when it's being used as a framebuffer. The more you have the better it'll be. I did the math a few pages back. It's not the only factor, as that might not max the bandwidth, but you really don't want to stress the framebuffer into doing a lot of extra stuff in between two things that are constant (driving two screens, that is), so you're left with the leftover RAM, and even most devs won't mess much with that, preferring to add extra effects and passes rather than streaming textures.


1280x720p with 2xMSAA would sit at 12.3 MB and the controller's 853x480 with 2xMSAA would cost 5.5 MB; that's 17.8 MB, leaving a whopping 14.2 MB for something else.

4xMSAA? Then it's 17.6 MB plus 7.8 MB, 25.4 MB total, leaving 6.6 MB. There isn't a budget for texture streaming in there.

Of course you could do lots of gymnastics with it (like tiling), but even if you had the whole 32 MB bank for texture streaming it's not that big (there are bigger textures than that, especially if you go over 2048x2048), and if the RAM is slow you'd be stuck with them in the framebuffer taking cycles before the RAM can finish streaming them.

Not that useful.
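Here are the leftover figures above spelled out (the per-buffer sizes are the ones I quoted, not derived from any confirmed spec):

Code:
EDRAM_MB = 32
# (TV framebuffer MB, GamePad framebuffer MB) for the two AA levels above
for label, tv_mb, pad_mb in (("2xMSAA", 12.3, 5.5), ("4xMSAA", 17.6, 7.8)):
    used = tv_mb + pad_mb
    print(label, round(used, 1), "MB used,", round(EDRAM_MB - used, 1), "MB left")
# 2xMSAA: 17.8 used / 14.2 left; 4xMSAA: 25.4 used / 6.6 left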
 

USC-fan

Banned
Maybe be more specific about usable RAM not reserved for the OS and apps. I think the PS3 has 256 MB usable for the GPU and 206 MB for the CPU, as the OS takes up 50 MB, so 462 MB usable for games.

Wii U has 512 MB reserved for the OS for some reason and 1536 MB for games, so it's around 3 1/3 times more usable RAM for games vs the Xbox 360/PS3. Might be more depending on that OS footprint reserve.
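Rough math with those figures (the 512 MB OS reservation is the rumour, not anything confirmed):

Code:
ps3_usable_mb = 256 + 206       # GPU pool + CPU pool after a ~50 MB OS reservation
wiiu_game_mb = 2048 - 512       # 2 GB minus the rumoured 512 MB OS reservation
print(ps3_usable_mb)                              # 462 MB
print(round(wiiu_game_mb / ps3_usable_mb, 2))     # ~3.32x, i.e. roughly "3 1/3 times"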

I think people got this wrong; they took the 512 MB of flash for OS RAM. That's where the extra 512 MB came from. Based on what's been leaked:

Wii U has 1.5 GB of RAM
512 MB of flash for the OS and another 8 GB for games/user data
 
Interesting post FS.

And a 32 MB eDRAM bank doesn't really change that, it's too small; would require lots of caching when you have HD textures taking several times more than that amount.

I was going to chime in on that as well as I don't see it being used as a texture cache. IMO if anything it will be a pseudo-L3 cache if devs choose not to use it exclusively as a framebuffer.

Also, FS's post made me think back to when I brought up S3TC and its usage, but I didn't get much info to satisfy what I was trying to figure out, so maybe you can chime in on this. Do you see it as a viable implementation in the GPU for Wii U, and if so, how would it benefit a modern hardware environment? Nintendo renewed their license a couple of years ago.
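For reference, the basic S3TC numbers (this is just the standard DXT1 block math, nothing Wii U specific):

Code:
# DXT1, the core S3TC format: every 4x4 texel block compresses to 8 bytes
bytes_per_texel = 8 / 16                          # 0.5 bytes per texel
print(4 / bytes_per_texel)                        # 8:1 vs uncompressed 32-bit RGBA
print(2048 * 2048 * 4 / 2**20)                    # a 2048x2048 RGBA8 texture: 16.0 MB raw
print(2048 * 2048 * bytes_per_texel / 2**20)      # ~2.0 MB once DXT1-compressed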

I think people got this wrong; they took the 512 MB of flash for OS RAM. That's where the extra 512 MB came from. Based on what's been leaked:

Wii U has 1.5 GB of RAM
512 MB of flash for the OS and another 8 GB for games/user data

Looks like you missed my B3D post. That rumor came from me based on a couple of people I asked.
 
The running theme of this console is being incremental, not really a dig at going bankrupt but a balancing act of the best performance $300 (minus the price of a high tech controller) can buy. That's probably their briefing anyway.

It's not an attempt at making an X360, and bottlenecking stuff is really not in Nintendo's tradition. If they were to bottleneck then I bet they could go for 4 GB of RAM instead, considering how cheap DDR3 is now even for us (I bought 8 GB last year for 70 euro tops).

The rule of thumb seems to be at making it at least 2x or better (I'd actually say 2.5/3x; 2x is worst case scenario) as the GPU seems to be 2.5x the Gflops of X360 (3 times the PS3 RSX Gflops), the RAM amount is 3x X360/PS3 (if it's 1.5 GB, if they unlock more memory it'll be even higher), framebuffer is also 3.2x larger than the X360 one; and the CPU real performance is inconclusive so far, flunking on RAM would be bad.

No; they're designed and certified to go at that speed and thus it's not the RAM that's being overclocked but the system itself because the DDR3 speeds in question are not standard for the motherboard and cpu.

Doesn't really apply to a console as they control multiplier and fsb. They can go for whatever RAM they want although 2.5 GHz is unlikely, those types get hot (and come with heat dissipation measures)

X360 is using 4 RAM chips right now (other side is blank) with 8 chips you could surely more than double the bandwidth of a 7 year old platform using DDR3.


32 MB of eDRAM is too small for a twist when it's being used as a framebuffer. The more you have the better it'll be. I did the math a few pages back. It's not the only factor, as that might not max the bandwidth, but you really don't want to stress the framebuffer into doing a lot of extra stuff in between two things that are constant (driving two screens, that is), so you're left with the leftover RAM, and even most devs won't mess much with that, preferring to add extra effects and passes rather than streaming textures.


1280x720p with 2xMSAA would sit at 12.3 MB and the controller's 853x480 with 2xMSAA would cost 5.5 MB; that's 17.8 MB, leaving a whopping 14.2 MB for something else.

4xMSAA? Then it's 17.6 MB plus 7.8 MB, 25.4 MB total, leaving 6.6 MB. There isn't a budget for texture streaming in there.

Of course you could do lots of gymnastics with it (like tiling), but even if you had the whole 32 MB bank for texture streaming it's not that big (there are bigger textures than that, especially if you go over 2048x2048), and if the RAM is slow you'd be stuck with them in the framebuffer taking cycles before the RAM can finish streaming them.

Not that useful.

Fair enough on the 1 gigabit GDDR3 chips (was it you who I debated this with before?). However, even if they did go w/ GDDR3, that's still 12 chips for 1.5 GB or 16 chips for 2 GB. It's completely unfeasible.

I appreciated your framebuffer analysis. I'm not knowledgeable enough on how texture caching works to say any more about it. All I know is that Gamecube and Wii used it to store commonly used textures (ground textures, for example) and that Wii U will need some form of it for backwards compatibility. I do know that modern GPUs have replaced dedicated texture caches w/ a more general L2 cache, which can be used for textures among other things, and that Gamecube's 1 MB texture cache is still an impressive amount compared to what GPUs offer today.
 