lwilliams3
Member
OK... for all the posters continually stating how much of a problem it is, why have we not heard any devs working on Wii U games complaining about it? We definitely heard about the CPU, for example.
You know saying this makes you a stupid Nintendo fanboy in a state of denial? You can't possibly be reasonable and unbiased. How dare you... HOW DARE YOU!!!
New exotic architectures that people lack familiarity with are one thing. But whatever the PS4 turns out as, I don't think anyone will ever question it for design choices that seemingly bottleneck the whole system. Sony's hardware designs have actually been getting better recently.
The Gamecube was a system with the perfect Yin & Yang balance: every design choice complemented the others, with no obvious bottlenecks throttling the system. Ever since then, Nintendo's new hardware seems to show a lot of design oversights. The Gamecube could run perfect and often superior ports of third-party games with minimal fuss. I remember Activision saying they got Tony Hawk running on the GC in just three days, and it made launch without issues.
How many Gamecubes does this equal out to?
Enjoy your ban?
The GC had three 1T-SRAM pools. eFB, eTC and MEM1. That's the official terminology, straight from the technical manual. MEM1 is the main memory. If the Wii U eDRAM was a framebuffer, Nintendo would have called it "eFB", not "MEM1". And the amount is actually a pretty good hint, as the 360 proves that 10MB are sufficient for 720p.
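The "10MB is sufficient for 720p" claim checks out with simple arithmetic. A minimal sketch, assuming a single 32-bit color target plus a 32-bit depth/stencil buffer with no MSAA (illustrative figures, not from any official doc):

```python
# Back-of-the-envelope check: does a plain 720p framebuffer fit in 10 MB
# of eDRAM? Assumes one 4-byte color target plus one 4-byte depth buffer.

def framebuffer_bytes(width, height, color_bytes=4, depth_bytes=4):
    """Total bytes for one color target plus one depth buffer."""
    return width * height * (color_bytes + depth_bytes)

size_720p = framebuffer_bytes(1280, 720)
print(size_720p / 2**20)  # ~7.03 MiB, comfortably under 10 MB
```

With MSAA or multiple render targets the picture changes quickly, which is where the tiling arguments later in the thread come from.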
Only a few of them, and the ones that complained about the system often pointed at the CPU. None of them, IIRC, said anything negative about the RAM. In fact, we have heard praise about it instead. As Wsippel said, however, the ones that praised it were talking about the whole memory system instead of just the main RAM. Greater than the sum of its parts.

I recall lots of anon devs saying it was worse than 360/PS3.
Wii U avoids RAM bottleneck, says Nano Assault dev
The team was "amazed" by "how much code" the hardware could handle without slowdowns, even before optimisation.
“The performance problem of hardware nowadays is not clock speed but ram latency. Fortunately Nintendo took great efforts to ensure developers can really work around that typical bottleneck on Wii U,” he said.
The developer said bottlenecks apply to any hardware, but Nintendo's decisions regarding cache layout, RAM latency and RAM size prove an effective solution.
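The latency-over-clock-speed point in that quote is about access patterns: a dependent chain of reads (each address computed from the previous read) serializes on memory latency, while independent streaming reads can be prefetched. A rough sketch of the two patterns; Python hides the real hardware effects, so this only models the shape of the accesses, not actual timings:

```python
# Two access patterns over the same data. The dependent chain is the one
# that memory latency punishes; the streaming pass is the one bandwidth
# numbers describe. Purely illustrative -- no real timing is measured.
import random

N = 1 << 12
perm = list(range(N))
random.shuffle(perm)

# Dependent chain: each load's address comes from the previous load,
# so the reads cannot be overlapped or prefetched.
i, hops = 0, 0
for _ in range(N):
    i = perm[i]
    hops += 1

# Independent streaming reads: all addresses known up front,
# so hardware prefetchers can hide most of the latency.
total = sum(perm)
```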
I said good day Sir.
I SAID GOOD DAY!
Yes, the game engines will have to be significantly re-engineered. That's why all the Wii U games have issues with low-res shadows: the RAM is too slow. To think Iwata even mentioned the Wii U was designed with easy portability for third parties in mind.
Enjoy your ban?
Totally. Those serial numbers on the chips are just rough guidelines after all, you can connect as many bits as you want to them!
Lol, no.
I just can't believe some of you really think third-party games would be more difficult to port to a modern architecture with several times the RAM pool. This is just crazy.
You also don't understand that a better architecture is more efficient, so it can make better use of bandwidth. Also, you don't even know the real bus size of the thing. Counting chips is just a guess. It could be 128 bits as easily as 256 bits.
Guys, you lack a lot of essential information here.
I may be wrong about this but I'm not sure if RAM speed actually makes that much of a big deal for gaming. On PC, all the advice I seemed to get is that it doesn't really matter what speed your ram is.
If the RAM is shared with the GPU, slow RAM does indeed matter. CPUs can rely on cache for their calculations; GPUs need faster RAM because they are rendering much more complex calculations with floating point. There's definitely a bottleneck in there somewhere.
It doesn't need to store all the assets. Performance-critical stuff like render targets, display lists and the like go in MEM1; big assets like textures or audio are stored in MEM2.

In all honesty, what we have learned this gen is that 10MB proved insufficient for the Xbox 360, because you'd have to resort to tiling to fit things into memory, and that severely impacted performance, especially any deferred rendering. For modern fully deferred rendering engines, 32MB of eDRAM makes a lot more sense.
Anyway, the way the Wii U will work is that the 2GB pool will be used as main system memory, and assets will get swapped into the eDRAM for rendering on screen. The eDRAM is inadequate to store all your assets.
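The 10MB-vs-32MB deferred rendering point above can be put in rough numbers. A sketch assuming a generic G-buffer of four 32-bit render targets plus a 32-bit depth buffer at 720p (no specific engine's layout):

```python
# Rough numbers behind the tiling argument: a deferred G-buffer at 720p
# overflows the 360's 10 MB of eDRAM (forcing tiling) but fits easily
# in 32 MB. The layout here is a generic example, not a real engine's.

def gbuffer_mib(width, height, num_targets, bytes_per_target=4, depth_bytes=4):
    """G-buffer size in MiB: N color targets plus one depth buffer."""
    return width * height * (num_targets * bytes_per_target + depth_bytes) / 2**20

g720 = gbuffer_mib(1280, 720, num_targets=4)
print(round(g720, 1))  # ~17.6 MiB: more than 10 MB, well under 32 MB
```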
Most bandwidth-intensive stuff will run on the eDRAM.

You DON'T need crazy numbers for system RAM, as PCs prove.

To the chip counters: go count low-end GPUs with crazy-low-density chip counts and 128-bit buses. Counting chips is only a guess, and you don't have any clue about latencies either.
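For what it's worth, here is what "counting chips" actually gives you: if you know (or guess) each chip's interface width and the effective data rate, peak theoretical bandwidth follows directly. The Wii U figures below are the widely reported guesses (four x16 DDR3-1600 chips), not confirmed specs, and say nothing about latency:

```python
# Peak theoretical bandwidth from a guessed memory configuration.
# Bandwidth = (bus width in bytes) x (effective transfer rate).
# Chip count and widths are assumptions, not confirmed hardware specs.

def peak_bandwidth_gbps(num_chips, bits_per_chip, mtransfers_per_sec):
    bus_bits = num_chips * bits_per_chip
    return (bus_bits / 8) * mtransfers_per_sec / 1000  # GB/s

wiiu = peak_bandwidth_gbps(4, 16, 1600)  # 64-bit bus of DDR3-1600
print(wiiu)  # 12.8 GB/s peak -- latency is a separate question entirely
```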
The eDRAM is the unknown quantity here...
Unless those are software rendering benchmarks, that's kind of irrelevant, because the Wii U would use the eDRAM as the critical element; system memory would have less of an impact there.
Yeah, the Gamecube was a brilliant piece of hardware. Perfectly balanced and very powerful for its time. I still think it was much more elegant than the XBOX.
43%? Yikes
Seriously considering just not getting this system and canceling my Monster Hunter preorder. There's been no good news since it released.
Gemüsepizza said: What? You know that the Wii U also uses this slow RAM for the GPU... why don't you post a benchmark of different RAM types on GPUs?
Maybe the Wii U is like the PS2 and uses the embedded RAM as VRAM, but uses the slower RAM for things that are not on the screen at the moment.
But can it run Half Life 2?
And that's wrong. The eDRAM is MEM1. It's not a framebuffer.
Someone put this in layman's terms for me. What the hell does this mean? Because right now I'm reading it as: 1GB of PS360 RAM is roughly equivalent to 2GB of Wii U RAM in terms of performance, so the 512MB the PS360 has is really only half the RAM performance of the Wii U instead of a quarter.