
Wii U has 2GB of DDR3 RAM [Update: RAM 43% slower than 360/PS3 RAM]
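The 43% figure in the title falls out of simple peak-bandwidth arithmetic. A quick sketch under the thread's working assumptions (four x16 DDR3-1600 chips on a 64-bit bus for the Wii U, the 360's 128-bit GDDR3 at 1400 MT/s — none of this is officially confirmed):

```python
# Peak bandwidth (GB/s) = bus width in bytes x transfer rate.
# Figures below are the thread's working assumptions, not official specs.
def peak_bandwidth_gbps(bus_bits, mega_transfers_per_s):
    return (bus_bits / 8) * mega_transfers_per_s / 1000

wiiu = peak_bandwidth_gbps(64, 1600)   # 12.8 GB/s (four x16 DDR3-1600 chips)
x360 = peak_bandwidth_gbps(128, 1400)  # 22.4 GB/s (128-bit GDDR3)
print(1 - wiiu / x360)                 # ~0.43, i.e. "43% slower"
```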

Ok... for all the posters continually stating how much of a problem it is, why have we not heard any devs working on Wii U games complaining about it? We definitely heard about the CPU, for example.
 

codhand

Member
Ok... for all the posters continually stating how much of a problem it is, why have we not heard any devs working on Wii U games complaining about it? We definitely heard about the CPU, for example.

I recall lots of anon devs saying it was worse than 360/ps3.
 
Ok... for all the posters continually stating how much of a problem it is, why have we not heard any devs working on Wii U games complaining about it? We definitely heard about the CPU, for example.

Strict NDAs? We didn't even know how much RAM was in the box until about six weeks ago. And you don't want to slag a console right before you drop a few flawed launch games on new customers at $60.
 

Erethian

Member
New exotic architectures that people lack familiarity with are one thing. But whatever the PS4 turns out as, I don't think anyone will ever question it for design choices that seemingly bottleneck the whole system. Sony's hardware designs have actually been getting better recently.

Whereas ever since Nintendo released the Gamecube, a system with the perfect Yin & Yang balance, where every design choice complemented the others and no obvious bottlenecks throttled the system, their new hardware seems to show a lot of design oversights. Even the Gamecube could run perfect and often superior ports of third-party games with minimal fuss. I remember Activision saying they got Tony Hawk's running in just three days on the GC and it made launch without issues.

Aren't you contradicting yourself here by talking about the great design of the GC when this thread is about a single part of the Wii U, and we have no definitive understanding of what steps they've taken to balance out their system design aside from leaks and rumours?

It could well be a badly balanced design, but those sorts of definitive proclamations should be left for when the full specs are uncovered.
 

SmokeMaxX

Member
Don't we have several game designers (some for large companies) here? Can any of them coherently explain all this jazz accurately? I feel like I'm at the Blackjack table with all the "veteran players" telling me I shouldn't hit 12 against a dealer showing 3 in a shoe game.
 

ghst

thanks for the laugh
New exotic architectures that people lack familiarity with are one thing. But whatever the PS4 turns out as, I don't think anyone will ever question it for design choices that seemingly bottleneck the whole system. Sony's hardware designs have actually been getting better recently.

Whereas ever since Nintendo released the Gamecube, a system with the perfect Yin & Yang balance, where every design choice complemented the others and no obvious bottlenecks throttled the system, their new hardware seems to show a lot of design oversights. Even the Gamecube could run perfect and often superior ports of third-party games with minimal fuss. I remember Activision saying they got Tony Hawk's running in just three days on the GC and it made launch without issues.

the gamecube was essentially the swan song for the raw material to performance wizardry that was custom home console hardware design. the ps3 was its funeral.

it's all off the shelf parts and APIs from here on out.
 

v1oz

Member
The GC had three 1T-SRAM pools. eFB, eTC and MEM1. That's the official terminology, straight from the technical manual. MEM1 is the main memory. If the Wii U eDRAM was a framebuffer, Nintendo would have called it "eFB", not "MEM1". And the amount is actually a pretty good hint, as the 360 proves that 10MB are sufficient for 720p.

In all honesty, what we have learned this gen is that 10MB proved insufficient for the Xbox 360, because you'd have to resort to tiling to fit things into memory, and that severely impacted performance, especially any deferred rendering. For modern fully deferred rendering engines, 32MB of eDRAM makes a lot more sense.

Anyway, the way the Wii U will work is that the 2GB pool will be used as main system memory, and assets will get swapped into the eDRAM for rendering on screen. The eDRAM is inadequate to store all your assets.
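The 10MB-vs-32MB argument above comes down to framebuffer arithmetic. A rough sketch (assuming 32-bit colour plus 32-bit depth at 720p; MSAA and extra G-buffer render targets multiply the cost):

```python
# Framebuffer footprint in MB for a given resolution:
# 4 bytes/pixel colour per render target, plus 4 bytes/pixel depth.
def fb_size_mb(width, height, msaa=1, targets=1):
    colour = width * height * 4 * targets * msaa
    depth = width * height * 4 * msaa
    return (colour + depth) / (1024 * 1024)

print(fb_size_mb(1280, 720))             # 7.03125   -> fits the 360's 10MB
print(fb_size_mb(1280, 720, msaa=4))     # 28.125    -> forces tiling on the 360
print(fb_size_mb(1280, 720, targets=4))  # 17.578125 -> a deferred G-buffer fits in 32MB
```

Which is roughly why a fully deferred engine is cramped in 10MB but comfortable in 32MB at 720p.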
 
I recall lots of anon devs saying it was worse than 360/ps3.
Only a few of them, and the ones that complained about the system often pointed at the CPU. None of them, IIRC, said anything negative about the RAM. In fact, we have heard praise about it instead. As wsippel said, however, the ones that praised it were talking about the whole memory system rather than just the main RAM. Greater than the sum of its parts.
 
New exotic architectures that people lack familiarity with are one thing. But whatever the PS4 turns out as, I don't think anyone will ever question it for design choices that seemingly bottleneck the whole system. Sony's hardware designs have actually been getting better recently.

Whereas ever since Nintendo released the Gamecube, a system with the perfect Yin & Yang balance, where every design choice complemented the others and no obvious bottlenecks throttled the system, their new hardware seems to show a lot of design oversights. Even the Gamecube could run perfect and often superior ports of third-party games with minimal fuss. I remember Activision saying they got Tony Hawk's running in just three days on the GC and it made launch without issues.

Wii U avoids RAM bottleneck, says Nano Assault dev

the team was “amazed” by “how much code” the hardware could handle without slowdowns, even before optimisation.

“The performance problem of hardware nowadays is not clock speed but ram latency. Fortunately Nintendo took great efforts to ensure developers can really work around that typical bottleneck on Wii U,” he said.

The developer said bottlenecks apply to any hardware, but Nintendo's decisions regarding cache layout, RAM latency and RAM size prove an effective solution.

I smell b-b-b-b-bullshit.

I said good day Sir.

I SAID GOOD DAY!

Alright Bill, didn't see you there. How's the wife.. I SAY... HOW'S THE WIFE??
 
Yes, the game engines will have to be significantly re-engineered. That's why all the Wii U games have issues with low-res shadows - the RAM is too slow. To think Iwata even mentioned the Wii U was designed with easy portability for 3rd parties in mind.

Lol, no.

I just can't believe some of you really think 3rd-party games would be more difficult to port to a modern architecture with several times the RAM pool. This is just crazy.

You also don't understand that a better architecture is more efficient, so it can make better use of bandwidth. Also, you don't even know the real bus size of the thing. Counting the number of chips is just a guess. It could be 128 bits, or it could be 256 bits.

Guys, you lack a lot of essential information here.
 
Enjoy your ban?

Why are you so aggressive?

Ok... for all the posters continually stating how much of a problem it is, why have we not heard any devs working on Wii U games complaining about it? We definitely heard about the CPU, for example.

I may be wrong about this, but I'm not sure RAM speed actually makes that much of a difference for gaming. On PC, all the advice I seem to get is that it doesn't really matter what speed your RAM is.
 

Durante

Member
You also don't understand that a better architecture is more efficient, so it can make better use of bandwidth. Also, you don't even know the real bus size of the thing. Counting the number of chips is just a guess. It could be 128 bits, or it could be 256 bits.
Totally. Those serial numbers on the chips are just rough guidelines after all, you can connect as many bits as you want to them!
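For anyone wondering what "chip counting" buys you: each DRAM chip's data width (x4, x8 or x16) is encoded in the part number printed on the package, so the chips on the board put a hard ceiling on bus width. A sketch, using the x16 parts reported in Wii U teardowns (an assumption from those reports, not an official spec):

```python
# The memory bus can't be wider than (number of chips) x (width per chip).
def max_bus_bits(num_chips, bits_per_chip):
    return num_chips * bits_per_chip

print(max_bus_bits(4, 16))  # 64 -- a 128- or 256-bit bus would need more or wider chips
```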
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
Totally. Those serial numbers on the chips are just rough guidelines after all, you can connect as many bits as you want to them!


I heard Nintendo were drawing extra lanes on the Hynix chips with a graphite pencil!
 

mrklaw

MrArseFace
Lol, no.

I just can't believe some of you really think 3rd-party games would be more difficult to port to a modern architecture with several times the RAM pool. This is just crazy.

You also don't understand that a better architecture is more efficient, so it can make better use of bandwidth. Also, you don't even know the real bus size of the thing. Counting the number of chips is just a guess. It could be 128 bits, or it could be 256 bits.

Guys, you lack a lot of essential information here.

Modern architecture plus more RAM means nothing if it's slower. Look at PC GPUs with DDR3 memory (in budget laptops) compared to GDDR5 versions of the same chip to see the impact that slow memory can have.

The eDRAM is the unknown quantity here, but then you're back to potentially needing to optimise specifically for the Wii U, and that's not something that all developers can or will do. The idea of a 'modern architecture' (at least in part) was to allow developers to bring ports across easily.

I'm sure Nintendo's first parties will make the Wii U sing, leveraging its quirks well. But many won't.
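The DDR3-vs-GDDR5 comparison above in numbers — a sketch with illustrative transfer rates for a low-end card sold in both memory configurations (actual products vary):

```python
# Same GPU, same 64-bit bus; only the memory type differs.
def gbps(bus_bits, mega_transfers_per_s):
    return bus_bits / 8 * mega_transfers_per_s / 1000

ddr3_card = gbps(64, 1800)     # 14.4 GB/s
gddr5_card = gbps(64, 4000)    # 32.0 GB/s
print(gddr5_card / ddr3_card)  # the GDDR5 variant has over 2x the bandwidth
```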
 

Erethian

Member
I may be wrong about this, but I'm not sure RAM speed actually makes that much of a difference for gaming. On PC, all the advice I seem to get is that it doesn't really matter what speed your RAM is.

The issue would be about the RAM available to the GPU more than anything, since it's a unified pool.
 

vocab

Member
I may be wrong about this, but I'm not sure RAM speed actually makes that much of a difference for gaming. On PC, all the advice I seem to get is that it doesn't really matter what speed your RAM is.

If the RAM is shared by the GPU, slow RAM does indeed matter. CPUs can rely on cache for their calculations. GPUs need faster RAM because you're rendering with much more complex floating-point calculations. There's definitely a bottleneck in there somewhere.
 
If the RAM is shared by the GPU, slow RAM does indeed matter. CPUs can rely on cache for their calculations. GPUs need faster RAM because you're rendering with much more complex floating-point calculations. There's definitely a bottleneck in there somewhere.

Most bandwidth-intensive stuff will run on eDRAM.

You DON'T need crazy numbers for system RAM, as PCs prove:

hl2.png

To those chip counters: go count low-end GPUs with crazy low-density chip numbers and a 128-bit bus. Counting chips is only a guess. And you don't have any clue about latencies either.
 

wsippel

Banned
In all honesty, what we have learned this gen is that 10MB proved insufficient for the Xbox 360, because you'd have to resort to tiling to fit things into memory, and that severely impacted performance, especially any deferred rendering. For modern fully deferred rendering engines, 32MB of eDRAM makes a lot more sense.

Anyway, the way the Wii U will work is that the 2GB pool will be used as main system memory, and assets will get swapped into the eDRAM for rendering on screen. The eDRAM is inadequate to store all your assets.
It doesn't need to store all the assets. Performance critical stuff like render targets, display lists and the like go in MEM1, big assets like textures or audio are stored in MEM2.
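The split wsippel describes can be pictured as a two-pool placement policy. A toy sketch (the pool names and sizes are from the thread; the policy and the byte figures are purely illustrative):

```python
MB = 1024 * 1024
POOLS = {"MEM1": 32 * MB, "MEM2": 2048 * MB}  # eDRAM / DDR3
LATENCY_CRITICAL = {"render_target", "depth_buffer", "display_list"}

def place(kind, size, used):
    """Prefer MEM1 for latency-critical buffers, MEM2 for bulk assets."""
    order = ("MEM1", "MEM2") if kind in LATENCY_CRITICAL else ("MEM2", "MEM1")
    for pool in order:
        if used.get(pool, 0) + size <= POOLS[pool]:
            used[pool] = used.get(pool, 0) + size
            return pool
    raise MemoryError(kind)

used = {}
print(place("render_target", 14 * MB, used))  # MEM1
print(place("texture", 256 * MB, used))       # MEM2
```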
 

mrklaw

MrArseFace
Most bandwidth-intensive stuff will run on eDRAM.

You DON'T need crazy numbers for system RAM, as PCs prove:

hl2.png

To those chip counters: go count low-end GPUs with crazy low-density chip numbers and a 128-bit bus. Counting chips is only a guess. And you don't have any clue about latencies either.

Unless those are software rendering benchmarks, that's kind of irrelevant, because PCs would use GPU RAM as the critical element; system memory would have less of an impact there.
 

Soul_Pie

Member
Can someone explain some of these terms in layman's language? I've looked up some of this stuff on Google, but I still don't get it. I'm really interested in this stuff, but words like bandwidth in regard to RAM, and DDR3 and GDDR5, are like a foreign language to me.

Anyone want to give us tech noobs a bit of an overview?
 

mujun

Member
Ok... for all the posters continually stating how much of a problem it is, why have we not heard any devs working on Wii U games complaining about it? We definitely heard about the CPU, for example.

Maybe they don't want to say the equivalent of, "Our Wii U versions will be inferior!"
 

dark10x

Digital Foundry pixel pusher
Whereas ever since Nintendo released the Gamecube, a system with the perfect Yin & Yang balance, where every design choice complemented the others and no obvious bottlenecks throttled the system, their new hardware seems to show a lot of design oversights. Even the Gamecube could run perfect and often superior ports of third-party games with minimal fuss. I remember Activision saying they got Tony Hawk's running in just three days on the GC and it made launch without issues.
Yeah, the Gamecube was a brilliant piece of hardware. Perfectly balanced and very powerful for its time. I still think it was a much more elegant design than the Xbox.
 

Raist

Banned
Most bandwidth-intensive stuff will run on eDRAM.

You DON'T need crazy numbers for system RAM, as PCs prove:

hl2.png

To those chip counters: go count low-end GPUs with crazy low-density chip numbers and a 128-bit bus. Counting chips is only a guess. And you don't have any clue about latencies either.

That's a 2004 game, already maxed out with "shitty" RAM. Of course more recent RAM won't make a huge difference there.
 
Most bandwidth-intensive stuff will run on eDRAM.

You DON'T need crazy numbers for system RAM, as PCs prove:

hl2.png

To those chip counters: go count low-end GPUs with crazy low-density chip numbers and a 128-bit bus. Counting chips is only a guess. And you don't have any clue about latencies either.

What's next? Quake 1 benchmarks?
 
Most bandwidth-intensive stuff will run on eDRAM.

You DON'T need crazy numbers for system RAM, as PCs prove:

hl2.png

To those chip counters: go count low-end GPUs with crazy low-density chip numbers and a 128-bit bus. Counting chips is only a guess. And you don't have any clue about latencies either.

What? You know that the Wii U also uses this slow RAM for the GPU... why don't you post a benchmark of different RAM types on GPUs?
 

Vol5

Member
GAF getting all upset over slow(er) memory > Global downturn, companies cutting back, inefficiencies being streamlined.
 

kuroshiki

Member
Most bandwidth-intensive stuff will run on eDRAM.

You DON'T need crazy numbers for system RAM, as PCs prove:

hl2.png

To those chip counters: go count low-end GPUs with crazy low-density chip numbers and a 128-bit bus. Counting chips is only a guess. And you don't have any clue about latencies either.

Dude. I have a feeling that you will eat massive crow.

just stop now and watch for awhile.
 

Nirolak

Mrgrgr
Most bandwidth-intensive stuff will run on eDRAM.

You DON'T need crazy numbers for system RAM, as PCs prove:

hl2.png

To those chip counters: go count low-end GPUs with crazy low-density chip numbers and a 128-bit bus. Counting chips is only a guess. And you don't have any clue about latencies either.

Video cards have a lot of VRAM on PC.

They don't use system RAM for the GPU.
 

KageMaru

Member
Do we have a detailed breakdown over the memory architecture in the system and how it's used?

New exotic architectures that people lack familiarity with are one thing. But whatever the PS4 turns out as, I don't think anyone will ever question it for design choices that seemingly bottleneck the whole system. Sony's hardware designs have actually been getting better recently.

Whereas ever since Nintendo released the Gamecube, a system with the perfect Yin & Yang balance, where every design choice complemented the others and no obvious bottlenecks throttled the system, their new hardware seems to show a lot of design oversights. Even the Gamecube could run perfect and often superior ports of third-party games with minimal fuss. I remember Activision saying they got Tony Hawk's running in just three days on the GC and it made launch without issues.

GC had bottlenecks. Its poly performance was behind the PS2 and Xbox, for example.

Ok... for all the posters continually stating how much of a problem it is, why have we not heard any devs working on Wii U games complaining about it? We definitely heard about the CPU, for example.

NDAs could be a reason. Both MS and Sony have heavy NDAs covering parts of their existing systems (the CPU and GPU, respectively).

Most bandwidth-intensive stuff will run on eDRAM.

You DON'T need crazy numbers for system RAM, as PCs prove:

hl2.png

To those chip counters: go count low-end GPUs with crazy low-density chip numbers and a 128-bit bus. Counting chips is only a guess. And you don't have any clue about latencies either.

This doesn't prove anything. You can't compare a PC memory setup to what we're seeing in the Wii U. What type of memory did the GPU use for those benchmarks?
 

Xanonano

Member
Wouldn't the bandwidth be shared equally between the game and the operating system, since each can only access half the RAM? If games can only address two of the four chips, they might only get 6.4GB/s of bandwidth.
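If the premise above holds (a big if — it assumes bandwidth scales with how many of the four chips a client can address in parallel, and the usual rumored x16 DDR3-1600 parts), the 6.4GB/s figure checks out:

```python
# Hypothetical split: games see only 2 of the 4 x16 DDR3-1600 chips.
BITS_PER_CHIP, MTPS = 16, 1600

def pool_bandwidth_gbps(chips):
    return chips * BITS_PER_CHIP / 8 * MTPS / 1000

print(pool_bandwidth_gbps(4))  # 12.8 GB/s with all four chips
print(pool_bandwidth_gbps(2))  # 6.4 GB/s for a two-chip partition
```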
 

onQ123

Member
Gemüsepizza said:
What? You know that the Wii U also uses this slow RAM for the GPU... why don't you post a benchmark of different RAM types on GPUs?

Maybe the Wii U is like the PS2 and uses the embedded RAM as VRAM, but also uses the slower RAM for things that are not on screen at the moment.
 

DonMigs85

Member
I wonder just how much bandwidth the eDRAM has... At the very least it has to be over 20GB/sec to match the Wii's 3MB + 24MB 1T-SRAM aggregate bandwidth, but that's still slow for a modern HD system.

I wish a dev would just leak the specs somewhere they're unlikely to be traced.
 

McHuj

Member
Maybe the Wii U is like the PS2 and uses the embedded RAM as VRAM, but also uses the slower RAM for things that are not on screen at the moment.

Probably, but I'm not sure 32MB is enough. It may be enough for 360/PS3 ports and Wii U-only games designed specifically for it, but going into next gen I don't think it will be enough.

Best case for the Wii U is that the 720/PS4 will only be 4x faster in terms of main memory bandwidth (and that's really conservative) and 3.5-4x the size for main memory. That's a huge discrepancy and will make porting difficult.
 
Someone put it in layman's terms for me. What the hell does this mean? Because right now I'm reading it as 1GB of PS360 RAM ~ 2GB of Wii U RAM in terms of performance. So the 512MB the PS360 has is really only half the RAM performance of the Wii U instead of 1/4.
 

DonMigs85

Member
Someone put it in layman's terms for me. What the hell does this mean? Because right now I'm reading it as 1GB of PS360 RAM ~ 2GB of Wii U RAM in terms of performance. So the 512MB the PS360 has is really only half the RAM performance of the Wii U instead of 1/4.

What? No no, the Wii U RAM just has limited bandwidth, barely more than half the speed of the 360's 512MB RAM pool or the PS3's 256MB XDR pool.
 