After one or two experiences like that, I've come to understand that Nintendo build hardware by estimating the very least they think they need, for their own titles. And nothing more.
> After one or two experiences like that, I've come to understand that Nintendo build hardware by estimating the very least they think they need, for their own titles. And nothing more.

Agreed, and that's why there will be no third party support.
> Have we had developers complaining about ram speed? I thought I remember some devs complimenting it.

They were complimenting the amount of RAM based on the target specs; I don't think they realized how slow it would be.
Jesus. What the fuck, Nintendo? Even the Wii was a much better system for its time than this. I can only blame the Yen-Dollar conversion rate for driving Iwata insane.
Have we had developers complaining about ram speed? I thought I remember some devs complimenting it.
> Jesus. What the fuck, Nintendo? Even the Wii was a much better system for its time than this. I can only blame the Yen-Dollar conversion rate for driving Iwata insane.

Nah, that's really not true. The Wii was 5 years out-of-date performance levels coupled with a 5 years out-of-date feature set. The Wii U seems to be 6 years out-of-date performance levels with a 2 years out-of-date feature set.
> What? No. The Wii U from Wii is a better jump up than the Wii from GameCube.

His point was that the Wii was better than the original Xbox, and the Wii U might be worse than the 360.
> What? No. The Wii U from Wii is a better jump up than the Wii from GameCube.

Yeah, the Wii U actually seems to be better in some regards (more RAM total, a stronger GPU, more eDRAM), whereas the Wii seemed content to match the Xbox in numbers and nothing more. Still, it does feel like a frustrating case where just a few relatively small steps could've really pushed the system forward, even if it would still be trounced by whatever Sony/MS put out. At the least, we'd probably be seeing parity in multiplatform titles rather than downgrades.
They were complimenting the amount of RAM based on the target specs; I don't think they realized how slow it would be.
I'm really surprised by this news. Nintendo usually has been really good on the RAM front in many of their consoles. I figured the Wii U would be the same.
I remember plenty of RAM amount compliments, not any on its speed.
> What? No. The Wii U from Wii is a better jump up than the Wii from GameCube.

I'm talking about the Wii U from the HD twins. It's possible it's less of a jump than the Wii from PS2/Xbox/GC. Perhaps not a good argument from me; I'm tired, and perhaps I might not be thinking soundly.
> They were complimenting the amount of RAM based on the target specs; I don't think they realized how slow it would be.

Shin'en explicitly complimented the performance, actually:
http://www.notenoughshaders.com/2012/11/03/shinen-mega-interview-harnessing-the-wii-u-power/

"todays hardware has bottlenecks with memory throughput when you dont care about your coding style and data layout. This is true for any hardware and cant be only cured by throwing more megahertz and cores on it. Fortunately Nintendo made very wise choices for cache layout, ram latency and ram size to work against these pitfalls."
Not quite. No analog triggers. As for 3rd party support, well, I'll wait for the announcements.
No, it is pretty obvious, unfortunately, that Nintendo went cheap, pathologically cheap. No optical out, no Ethernet, no HDD, cheapest RAM on the market, cheap screen, cheap battery, cheap flash... everything about this console is cheap and dirty. And that isn't even looking at the half-baked OS.
This is a really poor console for 2012. And yes, you can still have fun with it and play Mario, but don't delude yourself into thinking 3rd party support is going to be any better than the Wii's. In fact, it will most likely be worse, because there's no way this sells like the Wii.
> I think this is simply a case of 'the game-pad is the centre piece, at all costs'.

Could very well be.
I seriously think they built it for 1st party devs and the things they asked for
Showed 3rd parties how to make the WiiU purr, had sessions to show what can be done
Most publishers/devs just said fuck it, port it, why try to go the extra mile for shit already released
Just have it ready to ship on launch day, disaster commences
No one QA'd Epic Mickey 2? Are you fucking kidding me? That shit made Skyrim on the PS3 look like it was running at 120 FPS
Try as I might, I simply cannot fathom Nintendo's logic here. Not for pricing, not for future proofing, not for performance, not for anything. On paper it just seems bafflingly illogical and unnecessarily crippling, to the point where I instinctively assume I'm missing something critical because of how silly this is.
I just don't understand this hardware or what Nintendo expects *shrug*.
It saves a full half of that for the OS. Not that the 360 and PS3 didn't set some aside for the OS, but they kept the footprints VERY small. If Nintendo had a similarly small footprint I don't think this would matter as much, I imagine a lot of developers could make up for it just by loading MUCH more "just in case".
Though I wouldn't be surprised if future updates optimized the OS and shrank the footprint, allowing more of the ram to be used by games.
> Another $30-50 might have done enough to fix most of those problems. I would have paid it.

The annoying thing is that if the exchange rate had stayed about the same as when the Wii came out, they could've really ramped things up while keeping the current price point (in NA, anyway), but as is, they're at a huge disadvantage for exporting. I'm not sure "insane" is the right word, but it definitely is true that the exchange rate is making some Japanese corporations make calls they wouldn't have made in the prior decade.
> Sounds terrible.
> I won't be getting a Nintendo system this gen.

I wouldn't go THAT far, but then I'm in it for exclusives or games that take good advantage of the Wii U's setup, versus bleeding-edge hardware (which PC is always best for anyway). Still, it's going to nag me whenever I play games that going JUST a bit further could've made a huge difference, kinda like 480p being a stretched 4:3 image rather than native 16:9 on the Wii (though I wonder if most TVs even accept an 848x480 image?).
Sounds terrible.
I won't be getting a Nintendo system this gen.
what makes you think any of this is true?
Another $30-50 might have done enough to fix most of those problems. I would have paid it.
So no matter what the quality of the games, the memory speed is a deal breaker? lol
> This thread is getting depressing. Let's play ~~devil's~~ angel's advocate.

Not bad, under idealized assumptions, if you want to stream last-gen assets at last-gen image quality (e.g. anisotropic filtering really messes with your calculation).
The Wii was woefully underpowered and printed money. The 3DS is woefully underpowered and stomped the Vita into a hole.
My worry is that if the Wii-U 'wins' next time around, Sony & MS will give up on the power race too. Why bother making killer hardware if nobody cares?
> I mean Epic Mickey was OK as an exclusive, the hell happened with 2?

Port from Wii to PS3/360, then port "back" to Wii U. Not that it's unprecedented for HD up-ports to run worse even if you think they shouldn't, but I doubt Epic Mickey 2 has the "well, the hardware didn't scale up in a way that matched the predecessor" excuse that Zone of the Enders 2 had... and going by most accounts, it sounds like Epic Mickey 2 makes ZoE2 look like a shining example of how to turn a game HD. Hell, possibly Silent Hill HD Collection too, if the frame rate literally dives into single digits rather than being 15 FPS or whatever.
So I asked the same question in the specs speculation thread but I'll ask it here to hedge my chances:
How does Anand know the clock of the DDR3? Because nothing in those chip markings that I've seen indicates that. For reference, here's the manufacturer's page: https://www.skhynix.com/products/computing/view.jsp?info.ramKind=19&info.serialNo=H5TQ4G63MFR
http://www.skhynix.com/inc/pdfDownload.jsp?path=/upload/products/gl/products/dram/down/GDDR.pdf
Decoding guide. 12C means it's 800 MHz.
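If that decode is right, it also squares with the 12.8 GB/s bandwidth figure floating around this thread. A quick back-of-envelope check, assuming DDR3-1600 behavior (two transfers per 800 MHz I/O clock) and a 64-bit combined bus; the bus width is an assumption here, not something the chip markings tell you:

```python
# Hedged sanity check: DDR3 with an 800 MHz I/O clock (per the "12C"
# speed-grade decode) moves data on both clock edges, i.e. 1600 MT/s.
# The 64-bit total bus width is assumed, not read from the markings.
clock_hz = 800e6           # I/O clock from the speed grade
transfers_per_clock = 2    # double data rate
bus_width_bytes = 64 // 8  # assumed 64-bit bus

bandwidth_gbs = clock_hz * transfers_per_clock * bus_width_bytes / 1e9
print(bandwidth_gbs)  # 12.8
```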
That Giant Bomb 7-hour Wii U live stream escapade:
None of the 3rd party games were optimized or looked good.
Go see Epic Mickey 2...
I mean Epic Mickey was OK as an exclusive, the hell happened with 2?
If you are happy buying a Wii U to play Nintendo franchises and current generation third party offerings then great. Just don't expect it to hold any relevance in a few years from now.
> What if it is a dual channel interface like in the PC world? Wouldn't this double the bandwidth?

It's using all the bus width available from the chips.
> I guess this makes PS4/720 ports a little less likely too.

Those were never in the cards.
> Shin'en explicitly complimented the performance, actually:
> http://www.notenoughshaders.com/2012/11/03/shinen-mega-interview-harnessing-the-wii-u-power/

I wonder how big of an issue this really is. As anyone who does a lot of overclocking and benchmarking on their PC knows, latencies are often more important than raw bandwidth. I'm going to wait until we get more developer impressions before I join in on all the hyperbole.

Notice how he doesn't compliment bandwidth itself, but how the cache and low latency alleviate common throughput issues.
> Exactly. NONE of them. If the common point between all these games is the Wii U, you need to ask yourself: is it maybe the console?

Well, a bit of both isn't out of the question. Black Ops 2 sounds like it came out mostly fine and at least has some legitimate advantages that the rest can't claim: if you want Wii Remote controls with Epic Mickey 2, go with Wii, or for similar in HD go to PS3, while the others don't even really seem to gain an actual improvement from having the GamePad there. I think we'll see how things go with games like Aliens that aren't being rushed out for launch, though I do kind of expect them to be the same at best.
> The X360/PS3 will continue to be supported for the next 3-4 years.

Why do you think that?
> This thread is getting depressing. Let's play ~~devil's~~ angel's advocate.

You're largely off (below) on the texture side (1:1 texture mapping is a really ideal case; introduce an angle on that and it gets much worse), and similarly largely off (but this time above) on the geometry side (such vertex/pixel ratios are not something you'd aim for without tessellation). Other than that, what really helps with texture filtering is the caches: modern GPUs normally do their texture lookups in tiles of multiple texels (for instance, 32x32). Last but not least, some of those render targets originating from eDRAM will need to get spilled to main RAM.
This thread is getting depressing. Let's play ~~devil's~~ angel's advocate.

Assuming everything else (framebuffers, shader constants, lookup tables, etc.) is allocated in the eDRAM, 12.8 GB/s is probably more than enough bandwidth for streaming textures and geometry to the GPU.
Because of the way mip-mapping works, your worst case scenario is going to be around one unique texel per pixel in a modern engine that does some sort of Z pre-pass. So for 1080p that gives you 1920*1080 texels * 4 bytes (RGBA) each, for ~8.3 MB/frame. Let's triple that to take into account extra stuff like normal maps, specular maps and so on. That's ~25 MB/frame for textures. 25 MB * 60 fps = 1.5 GB/s for texture transfer. Some people may object that I'm not taking into account that each texel requires multiple samples, but I'm not taking into account texture compression either, and it should roughly balance out.
We can do a similar calculation for geometry. You don't actually want to dice your geo down to the 1-triangle-per-pixel range in a modern engine, because your shaders typically work in quads and you'll waste ~75% of your shader power doing that. Let's aim for a unique vertex every 10th pixel. 1920 * 1080 / 10 = 207360 vertices. Each vertex takes around 32 bytes (I can break this down if need be), so that gives us ~6.64 MB/frame. We haven't taken into account things like overdraw or the fact that we don't have perfect frustum culling, however. Let's say that increases our geometry by an order of magnitude, giving us ~66.4 MB/frame. 66.4 MB * 60 fps = 4 GB/s for geometry transfer. (This number is probably a huge overestimate.)
So 1.5 + 4 = 5.5 GB/s needed. We're not even using half of the 12.8 GB/s we have!

I have the flu, and doing this amused me more than it should have.
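For anyone who wants to poke at the numbers, the whole estimate above boils down to a few lines. This just replays the poster's own assumptions (1080p at 60 fps, 4-byte texels tripled for normal/specular maps, one 32-byte vertex per 10 pixels with a 10x fudge factor for overdraw and culling), so it inherits all their caveats:

```python
# Replaying the thread's back-of-envelope bandwidth estimate under the
# poster's stated assumptions; none of these constants are measured.
width, height, fps = 1920, 1080, 60

# Textures: ~1 unique texel per pixel, 4 bytes (RGBA), tripled for
# normal maps, specular maps, etc.
texture_bytes_per_frame = width * height * 4 * 3             # ~25 MB
texture_gbs = texture_bytes_per_frame * fps / 1e9            # ~1.5 GB/s

# Geometry: one 32-byte vertex per 10 pixels, times 10 for overdraw
# and imperfect frustum culling.
geometry_bytes_per_frame = (width * height // 10) * 32 * 10  # ~66 MB
geometry_gbs = geometry_bytes_per_frame * fps / 1e9          # ~4 GB/s

total_gbs = texture_gbs + geometry_gbs
print(round(total_gbs, 1))  # 5.5, against 12.8 GB/s of DDR3 bandwidth
```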
> You're largely off (below) on the texture side (1:1 texture mapping is a really ideal case; introduce an angle on that and it gets much worse), and similarly largely off (but this time above) on the geometry side (such vertex/pixel ratios are not something you'd aim for without tessellation). Other than that, what really helps with texture filtering is the caches: modern GPUs normally do their texture lookups in tiles of multiple texels (for instance, 32x32). Last but not least, some of those render targets originating from eDRAM will need to get spilled to main RAM.

Just telling me I'm off is no fun. Let's see your math!
In other words, give it another try ; )