
Wii U has 2GB of DDR3 RAM [Update: RAM 43% slower than 360/PS3 RAM]
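
(For context, the 43% in the title falls out of a straight peak-bandwidth comparison. A minimal sketch, assuming the commonly cited figures of 12.8 GB/s for the Wii U's 64-bit DDR3-1600 pool and 22.4 GB/s for the 360's 128-bit GDDR3:)

```python
# Assumed peak main-memory bandwidths in GB/s (commonly cited figures,
# not official specs).
wiiu_ddr3 = 12.8     # 64-bit bus * 1600 MT/s / 8 bits per byte
x360_gddr3 = 22.4    # 128-bit bus * 1400 MT/s / 8 bits per byte
print(f"{(1 - wiiu_ddr3 / x360_gddr3) * 100:.0f}% slower")  # -> 43% slower
```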

It's funny how people are freaking out about port performance when this was discussed at length and expected in the old Wii U speculation threads. Where's BG when you need him? lol


Ports of current-gen games need "work" to run decently on the Wii U, just as they do on PC.

This was truer when the Cell was introduced, but I'm not convinced the R700 and CPU are doing anything amazingly new. The simple DSPs everyone keeps lauding should even simplify things. The bottleneck comes in where devs have to shoehorn more stuff into the eDRAM, to a greater extent than they did on 360.
 

AzaK

Member
So this is basically just Nintendo magic:

http://www.youtube.com/watch?feature=player_embedded&v=y9uYCU8jFiU

http://www.youtube.com/watch?v=6OHUwDShrD4&playnext=1&list=PL7982B2177A8DA0D2&feature=results_video


These were running on pre-alpha dev kits. Why is it a big surprise that ports of current-gen games, made for older hardware, have some performance issues on Wii U? Every time a performance hiccup occurs on Wii U, it seems the hardware is to blame and not the developers. Wasn't it just confirmed that Wii U can run Unreal Engine 4 and Unity 4?

Slower RAM and lower-clocked CPU be damned, those demos are more than convincing enough for me, and it was confirmed that the Wii U dev kits received several enhancements since that time, not the opposite. If Gearbox and Epic are impressed with the Wii U, have already confirmed that it's more powerful than current-gen hardware, and even consider it "next gen" because it's modern technology, why did everyone suddenly jump off the cliff over these ports, which in no way indicate what the Wii U is actually capable of when a game is developed for the system itself?

Well, it's possible they ended up with a worse machine than they were showing 18 months ago. Maybe they realised it was going to be too expensive, so they cut the specs. We don't know, but they haven't made any attempt to show us anything graphically impressive, or even current-gen.
 

ymmv

Banned
If Gearbox and Epic are impressed with the Wii U, have already confirmed that it's more powerful than current-gen hardware, and even consider it "next gen" because it's modern technology, why did everyone suddenly jump off the cliff over these ports, which in no way indicate what the Wii U is actually capable of when a game is developed for the system itself?

Do you really think developers with Nintendo dev kits are going to publicly criticize an as-yet-unreleased console? Just like Nintendo, they have a vested interest in making the Wii U a success. Everyone who divulged actual information about the Wii U hardware or criticized it did so anonymously.
 

F#A#Oo

Banned
So Nintendo cheaped out on RAM? That surprises me; I thought the Wii U would be a powerhouse in the mold of the GCN. Ah well.
 

SmokyDave

Member
Slower RAM and lower-clocked CPU be damned, those demos are more than convincing enough for me, and it was confirmed that the Wii U dev kits received several enhancements since that time, not the opposite. If Gearbox and Epic are impressed with the Wii U, have already confirmed that it's more powerful than current-gen hardware, and even consider it "next gen" because it's modern technology, why did everyone suddenly jump off the cliff over these ports, which in no way indicate what the Wii U is actually capable of when a game is developed for the system itself?

Because there has never been a 'next-gen' console that struggled to achieve parity with the previous generation of hardware, even at launch. That is the top & bottom of it.
 
^ It did. 90% of games on Wii look worse than Xbox games, imo.

Do you really think developers with Nintendo dev kits are going to publicly criticize an as-yet-unreleased console? Just like Nintendo, they have a vested interest in making the Wii U a success. Everyone who divulged actual information about the Wii U hardware or criticized it did so anonymously.

To turn that around: do you really think Randy Pitchford would have said generally positive and supportive things, over and over, that weren't true if he didn't have to? You might lie by omission, perhaps, but you don't go out of your way to praise something.
 

Cuth

Member
What if Nintendo downgraded the specs in the final hardware?
This is difficult to believe, but after people pointed to the Arkam post in the speculation thread, I read several pages of it, and some posts just seem strange.

Going by the posts in the speculation thread, it seems there is information about 3 different Wii U dev kits (maybe even more):

1. the earliest one, for obvious reasons the most different of the three from the final hardware
2. the middle one, the dev kit Arkam commented on, saying it was a little less powerful than an Xbox 360; similar to the final hardware, but not the same
3. the final dev kit, similar to the previous one but with better (and final) hardware; we now know it's at the very least better than an Xbox 360 in several respects

What is strange is that several people, people who don't usually go around telling lies on this forum (and, going by what they've posted in the past, some of them may well have connections with developers), reacted to the Arkam post saying that, according to their sources, the early dev kit was several times more powerful than an Xbox 360. They never claimed to have direct info, so maybe that's the problem, but I find it difficult to believe their sources lied to all of them (unless it's the same source for everyone, but that doesn't seem believable either).

wsippel (I think; I read the thread yesterday and I'm going by memory) mentioned that the early dev kit used GDDR3. He doesn't mention a source, so I'm not sure whether that info is reliable, but it got me thinking... Is it possible that Nintendo changed the memory configuration from GDDR3 in the early dev kit to (a bigger quantity of) DDR3 in the middle and final versions?
Given that an early dev kit is usually a "normal" PC, I suppose that's possible; it can't be all that similar to the final hardware.
In the end that change was probably a good choice, but the thought fueled the idea that somewhere in the dev kit progression the balance of the hardware changed enough to require a different approach from an engine point of view (though, frankly, if the "several times more powerful than an Xbox 360" info from the early dev kit era was true, the only explanation is a significant downgrade of some parts).

Now that I've written all this text, what do other people think? Crazy talk? Or is there something believable here?
 
This is difficult to believe, but after people pointed to the Arkam post in the speculation thread, I read several pages of it, and some posts just seem strange.

Going by the posts in the speculation thread, it seems there is information about 3 different Wii U dev kits (maybe even more):

1. the earliest one, for obvious reasons the most different of the three from the final hardware
2. the middle one, the dev kit Arkam commented on, saying it was a little less powerful than an Xbox 360; similar to the final hardware, but not the same
3. the final dev kit, similar to the previous one but with better (and final) hardware; we now know it's at the very least better than an Xbox 360 in several respects

What is strange is that several people, people who don't usually go around telling lies on this forum (and, going by what they've posted in the past, some of them may well have connections with developers), reacted to the Arkam post saying that, according to their sources, the early dev kit was several times more powerful than an Xbox 360. They never claimed to have direct info, so maybe that's the problem, but I find it difficult to believe their sources lied to all of them (unless it's the same source for everyone, but that doesn't seem believable either).

wsippel (I think; I read the thread yesterday and I'm going by memory) mentioned that the early dev kit used GDDR3. He doesn't mention a source, so I'm not sure whether that info is reliable, but it got me thinking... Is it possible that Nintendo changed the memory configuration from GDDR3 in the early dev kit to (a bigger quantity of) DDR3 in the middle and final versions?
Given that an early dev kit is usually a "normal" PC, I suppose that's possible; it can't be all that similar to the final hardware.
In the end that change was probably a good choice, but the thought fueled the idea that somewhere in the dev kit progression the balance of the hardware changed enough to require a different approach from an engine point of view (though, frankly, if the "several times more powerful than an Xbox 360" info from the early dev kit era was true, the only explanation is a significant downgrade of some parts).

Now that I've written all this text, what do other people think? Crazy talk? Or is there something believable here?

Why do we need such a complex explanation to account for all the fake rumors and "sources" people have put on the internet? Maybe the information was just wrong.
 

Vagabundo

Member
To spin this about face, do you really think Randy Pitchford would have said generally positive and supportive things over and over that weren't true if he didn't have to? You lie by omission perhaps, but you don't go out of your way to praise something

I believe Randy worked on Samba de Amigo on the Wii, so previous experience with the Wii will probably help. If you're coming in from 7 years of PS360 programming, you might not be able to get decent performance out of the Wii U.

Treyarch has some decent Wii experience, and Blops 2 multiplayer is reportedly very smooth (although there are issues with the campaign).

In my opinion this is a launch/dev issue. Only time will tell, though.

SmokyDave said:
Because there has never been a 'next-gen' console that struggled to achieve parity with the previous generation of hardware, even at launch. That is the top & bottom of it.

Do we really need three powerhouse consoles on the market?

The reasons I will probably buy a Wii U next year:

- Small form factor: 30W is a big deal for me.
- WiiPad/Wiimote controls. I was leery of the WiiPad, but I've finally been sold on it.
- Good-enough visuals. I'm pretty confident the machine is capable of 1.5x-2x the 360. That is good enough for me.
 

Cuth

Member
Why do we need such a complex explanation to account for all the fake rumors and "sources" people have put on the internet? Maybe the information was just wrong.
Maybe the information was just wrong, of course.
I think I covered why I considered other explanations in my message, but maybe I wasn't clear. Some of the people who commented on the Arkam post, saying they had completely different information, aren't in the habit of lying on this forum, and I think some of them may well have connections in the game industry for gathering info.

Because of this, the possibility that they were simply wrong is certainly an explanation, the most obvious one, but I think other explanations could be worth considering... and of course I have some time to waste :p
 

lherre

Accurate
There are more than 3 dev kit models (or upgrades) for the Wii U.

The first ones were really "gimped" in performance, but I think V3/4/5 are nearly "the same"; no big changes, if I remember correctly.
 

IdeaMan

My source is my ass!
Like I said on IRC, I don't think the problem we see with Wii U ports originates from a lack of "total power" (the performance the system can deliver under optimal use, which could be measured somewhat). It really seems to be more a matter of an architecture that isn't friendly to quick work by small teams, an SDK that wasn't designed well enough for titles like these, poor documentation, something like that.

I think I've already revealed that, in a v4 dev kit context (at least a semester ago), my sources struggled to keep the framerate of their projects (exclusives) at a steady 30fps with intricate 3D use of the second screen; but since then, the final dev kit + new SDK + other factors got them to roughly 60fps. So since E3, I've heard nothing about performance issues or any problem with main RAM speed that could have hindered the games, etc.

There is something going on with ports. Even if it can be partially explained by a lack of optimization and tailoring of these titles to the Wii U's specifics, I'm pretty disappointed by the near-total absence of benefit these ports gained on the system: with 2x more memory, a more modern GPU architecture, more cache, a bigger framebuffer, etc., we should at the very least have expected a bit more anti-aliasing, better texture filtering, slightly higher-quality textures, longer viewing distances, whatever. Nintendo should have monitored those ports and ensured that their system, their SDK, the documentation, all the tools for developing on Wii U are built in a way that even small teams with limited resources can take advantage of the platform's specificities. I hope this whole chain will improve with time.
 

IdeaMan

My source is my ass!
There are more than 3 dev kit models (or upgrades) for the Wii U.

The first ones were really "gimped" in performance, but I think V3/4/5 are nearly "the same"; no big changes, if I remember correctly.

The changes were rather noticeable between V3 and V4, more limited between V4 and V5. But within the V5 framework, several teams managed to improve the performance, the framerate, of their projects by a lot (so a combination of this newest dev kit + SDK + software optimization). Some even found, rather late in the development cycle, a problem in their use of the CPU that hindered its total power; it was quickly fixed and added a nice boost.
 

Kenka

Member
There are more than 3 dev kit models (or upgrades) for the Wii U.

The first ones were really "gimped" in performance, but I think V3/4/5 are nearly "the same"; no big changes, if I remember correctly.
lherre, can you explain in layman's terms how information coming from the disc drive is processed by the RAM for the OS, the RAM for the games, the eDRAM, and finally the GPU? I really want to understand how a system can be "unbalanced" (or balanced, for that matter).
 
I'm dying here, people.

1. PCs have VRAM on the graphics card, which is the main pool; normal system RAM is nice to have fast, but it isn't like VRAM.
2. Fast RAM only matters when a game is more advanced, with very big assets that need to be loaded extremely fast; add a bazillion effects like in Crysis 2 and that's where good high-speed RAM matters.

It's good you had good times.

My point is THIS is the main RAM on the Wii U. Intensive stuff will run in the eDRAM.

My point is Crysis runs fine on DDR2-800 as long as you have a decent GPU, and it streams A LOT of hi-res assets.

Oh, by the way, it would be funny to see those who were mocking me, now that Anandtech has corrected and DOUBLED their estimated bandwidth.
 

Easy_D

never left the stone age
It's good you had good times.

My point is THIS is the main RAM on the Wii U. Intensive stuff will run in the eDRAM.

My point is Crysis runs fine on DDR2-800 as long as you have a decent GPU, and it streams A LOT of hi-res assets.

Oh, by the way, it would be funny to see those who were mocking me, now that Anandtech has corrected and DOUBLED their estimated bandwidth.

But that decent GPU in your PC has dedicated VRAM; the Wii U has 2GB of RAM total (+ the eDRAM).
 

Phazon

Member
I think we'll see better results when Nintendo shows their next 'batch' of Wii U stuff. They've always done a sort of conference at the beginning of the year, so I guess a Nintendo Direct in February will show us some nice stuff. (The December Direct will probably have some new third-party stuff for the launch window or spring.)

But yeah, a lot of third-party games are a disappointment right now, but a lot of them were made by devs other than the original ones (Darksiders, Batman, Epic Mickey, Mass Effect, ...). I'm really curious what the original devs can do once they figure out the hardware and have more time. (Sometimes I feel a lot of devs started their ports very late.)
 

Cuth

Member
It exist more than 3 devkits models (or upgrades) on wii U.

The first ones were really "gimped" in performance, but I think V3-4-5 ... are nearly "the same", not big changes if I remember correctly.

The changes were rather noticeable between V3 and V4, more limited between V4 and V5. But in a V5 framework, several teams managed to improve the performance, the framerate, of their projects, by a lot (so a combination of this newest dev kit + sdk + software optimization). Some even found rather late in the development cycle a problem in their use of the CPU that hindered its total power, it was quickly fixed and added a nice boost.
Thanks for the information.
 
But a PC has no unified pool of RAM. GPUs are more like an isolated subsystem attached via the PCIe bus. That means a GPU running over PCIe 2.1 @ x16 has a peak 8 GB/s transfer link. If you run out of VRAM, your performance will die, since you can't stream data efficiently from main RAM at that speed. Consoles do not work that way.

DDR3-1600 is the standard memory for high-end PCs right now, btw. Anything over that is an overclock.
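
To put rough numbers on why spilling out of VRAM hurts: a sketch where the ~500 MB/s per lane is the usual PCIe 2.0 figure, and the 100 GB/s VRAM number is just an assumed ballpark for a decent discrete card, not a measurement:

```python
# PCIe 2.x: 5 GT/s per lane with 8b/10b encoding -> ~500 MB/s usable
# per lane per direction, so an x16 slot tops out around 8 GB/s.
lanes = 16
pcie_gbs = lanes * 0.5     # ~8 GB/s between system RAM and the GPU
vram_gbs = 100.0           # assumed local VRAM bandwidth of a decent card
print(f"PCIe x16: {pcie_gbs:.0f} GB/s vs local VRAM: {vram_gbs:.0f} GB/s "
      f"({vram_gbs / pcie_gbs:.0f}x faster)")
```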
 

Vagabundo

Member
Nintendo should have monitored those ports and ensured that their system, their SDK, the documentation, all the tools for developing on Wii U are built in a way that even small teams with limited resources can take advantage of the platform's specificities. I hope this whole chain will improve with time.

I'd imagine this should have been happening so we'd have a good launch. However, from the looks of it, Nintendo needed to get this out the door, and their engineers/coders were busy right up to the last minute.
 

SmokedMeat

Gamer™
Ah well, as long as it can run Nintendo's first-party stuff, what do they care? Xbox 720 and PS4 can have the third parties to themselves again in another year.
 
I checked Anandtech's page, and they now think the RAM has a peak BW of 12.8 GB/s.

Anandtech 11/19 said:
There are four 4Gb (512MB) Hynix DDR3-800 devices surrounding the Wii U's MCM (Multi Chip Module). Memory is shared between the CPU and GPU, and if I'm decoding the DRAM part numbers correctly it looks like these are 16-bit devices giving the Wii U a total of 6.4GB/s of peak memory bandwidth. That doesn't sound like a lot, but the Wii U is supposed to have a good amount of eDRAM for both the CPU and GPU to use.

That is the fix in the numbers I'm talking about, mister. They probably forgot to multiply by the effective DDR transfer rate.
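
For anyone following along, here's the arithmetic behind both figures: a sketch assuming four 16-bit chips on a shared 64-bit bus, and that the "800" marking is the 800 MHz I/O clock (i.e. these are DDR3-1600 parts, with DDR moving data on both clock edges):

```python
bus_bytes = 64 // 8    # four 16-bit DDR3 devices side by side = 64-bit bus
clock_mhz = 800        # assumed: the "800" on the parts is the I/O clock

# Counting one transfer per clock gives the original (pre-fix) figure:
print(clock_mhz * 1e6 * bus_bytes / 1e9)        # -> 6.4 GB/s
# Double data rate: two transfers per clock, i.e. 1600 MT/s:
print(2 * clock_mhz * 1e6 * bus_bytes / 1e9)    # -> 12.8 GB/s
```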
 

mrklaw

MrArseFace
But a PC has no unified pool of RAM. GPUs are more like an isolated subsystem attached via the PCIe bus. That means a GPU running over PCIe 2.1 @ x16 has a peak 8 GB/s transfer link. If you run out of VRAM, your performance will die, since you can't stream data efficiently from main RAM at that speed. Consoles do not work that way.

DDR3-1600 is the standard memory for high-end PCs right now, btw. Anything over that is an overclock.

which is why you buy a GPU with enough memory to support the resolution you plan to run at. You don't want it to run out of local memory and have to fetch from system RAM, because that's so slow.

Your comparison wasn't valid, end of.
 
Well there is a third tiny die on the package which might be the eDRAM, unless it has already been identified as something else...

[Image: the Wii U MCM, showing the third small die]

Unlikely to be anything else. The fact it's a separate package suggests the CPU may well have (limited) access to it.
 
There are more than 3 dev kit models (or upgrades) for the Wii U.

The first ones were really "gimped" in performance, but I think V3/4/5 are nearly "the same"; no big changes, if I remember correctly.
Thanks for your info. Hope you can give us a little more in light of this teardown, but we'll take what we can get. :)

Like I said on IRC, I don't think the problem we see with Wii U ports originates from a lack of "total power" (the performance the system can deliver under optimal use, which could be measured somewhat). It really seems to be more a matter of an architecture that isn't friendly to quick work by small teams, an SDK that wasn't designed well enough for titles like these, poor documentation, something like that.

I think I've already revealed that, in a v4 dev kit context (at least a semester ago), my sources struggled to keep the framerate of their projects (exclusives) at a steady 30fps with intricate 3D use of the second screen; but since then, the final dev kit + new SDK + other factors got them to roughly 60fps. So since E3, I've heard nothing about performance issues or any problem with main RAM speed that could have hindered the games, etc.

There is something going on with ports. Even if it can be partially explained by a lack of optimization and tailoring of these titles to the Wii U's specifics, I'm pretty disappointed by the near-total absence of benefit these ports gained on the system: with 2x more memory, a more modern GPU architecture, more cache, a bigger framebuffer, etc., we should at the very least have expected a bit more anti-aliasing, better texture filtering, slightly higher-quality textures, longer viewing distances, whatever. Nintendo should have monitored those ports and ensured that their system, their SDK, the documentation, all the tools for developing on Wii U are built in a way that even small teams with limited resources can take advantage of the platform's specificities. I hope this whole chain will improve with time.
It is launch time, and a lot of these games finished development some time ago. Even in your example, the developers discovered something quite late in the development cycle, and it would take time for them to scan through the entire game to make sure the extra processor speed is being used effectively. Some of the problems, like the slowdowns, happen more randomly, and the developers probably didn't have time to go through all the issues.

These titles are generally more ambitious and larger than at any console launch before, and a lot of them were ported from systems that aren't a traditional generation weaker, but are very different in architecture. The games coming after launch will have bigger advantages, including access to Miiverse and more time with the final dev kits.
 

gofreak

GAF's Bob Woodward
Nvm, the site says 2.65mm².

I'm not sure that can be eDRAM. Based on density estimates for Power7 eDRAM at 32nm, for example, only 3-4MB of eDRAM would fit in that kind of space. This likely isn't 32nm either (?)

Are we sure there's 32MB of eDRAM on here? It would take up a hefty amount of that larger die, I think.
 
Well there is a third tiny die on the package which might be the eDRAM, unless it has already been identified as something else...



That third tiny die is most certainly not the eDRAM. It's TINY (2.65mm² according to Anandtech), which is far, far too small to be the eDRAM. The eDRAM should easily be 10+ times that size.
 

THE:MILKMAN

Member
Do we know what the area of that third die is?

I think I read somewhere in the Anandtech report that it is 6.3mm²? Way too small for the eDRAM according to the guesstimates (~37mm² for 32MB).

I wonder if it is an ARM core for security, or something for Wii BC?

Edit: 2.65mm² for the 3rd die.
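
A quick sanity check on the die-size argument, taking the thread's own ~37mm²-for-32MB guesstimate as the density assumption:

```python
# Implied eDRAM density from the ~37 mm^2 for 32 MB guesstimate above.
density_mb_per_mm2 = 32 / 37.0    # ~0.86 MB per mm^2
third_die_mm2 = 2.65              # Anandtech's figure for the tiny die
print(f"{third_die_mm2 * density_mb_per_mm2:.1f} MB")  # ~2.3 MB, nowhere near 32 MB
```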
 
which is why you buy a GPU with enough memory to support the resolution you plan to run at. You don't want it to run out of local memory and have to fetch from system RAM, because that's so slow.

Your comparison wasn't valid, end of.

That's why you have 32MB of eDRAM. On PC you have a bottleneck of 8 GB/s that doesn't exist here. Even so, low-end GPUs with shared memory (and no eDRAM!) are more than enough to run 720p @ 60fps.

[Image: memory-speed benchmark chart]


Guys, you are trying to be smarter than Nintendo engineers here.
 

IdeaMan

My source is my ass!
Thanks for your info. Hope you can give us a little more in light of this teardown, but we'll take what we can get. :)


It is launch time, and a lot of these games finished development some time ago. Even in your example, the developers discovered something quite late in the development cycle, and it would take time for them to scan through the entire game to make sure the extra processor speed is being used effectively. Some of the problems, like the slowdowns, happen more randomly, and the developers probably didn't have time to go through all the issues. These titles are generally more ambitious and larger than at any console launch before, and a lot of them were ported from systems that aren't a traditional generation weaker, but are very different in architecture. The games coming after launch will have bigger advantages, including access to Miiverse and more time with the final dev kits.

Well, I'm strictly talking about ports, as explained in my message. I'm confident in the overall power of the system when it's properly developed for, but marketing- and third-party-wise, they should have found a way to ensure those ports at the very least reached slightly-superior-version status thanks to the advantages of the Wii U hardware. I know all the issues you described too well; I'm criticizing the apparent lack of monitoring from Nintendo on those projects, or whatever hardware/SDK/documentation oversights prevent such "port teams" from reaching a satisfactory level without too much effort. Securing the "AC3 superior console version" with higher-quality textures and a smoother framerate should have happened.

But it will surely change, yes, with further improvements.
 