> Come on? Your head is in the sand. The box has 1 GB of usable RAM and a terrible CPU compared to the X1 and PS4. Is the GPU suddenly going to buck this trend? It's horribly underpowered (again, compared to the PS4/X1 - I'm not talking about PC here). It doesn't take 7000 posts to figure that out.

My head isn't in the sand; you don't understand what this thread is about. Also, all three next-gen systems have "terrible" CPUs - the other two (which, again, aren't really relevant to the topic at hand in the first place) simply have more shitty cores clocked at a slightly higher frequency. It's still netbook-level tech.
> I would ask for someone like blu and wsippel to check out some of these and give their analysis. Some things I noticed:
>
> - GX2 is mentioned, which further supports the Wii U spec document that leaked some time ago.
> - CPU core 1 is definitely designated as the main core. It looks like they are still working on offloading tasks from it.
> - The devs are using the eDRAM.
> - It seems as if the Wii U does support some DX11 features, though maybe not all of them.

There's not much to analyze; we already knew most of those things. For example, core 1 is the one with the 2MB cache, so it was always pretty obvious that it would be the "main" core. And while multithreaded rendering is a "DX11" feature, it's just a software thing.
What I find a bit strange is that the game has been in development for so long, yet they apparently only started moving render targets to MEM1 a couple of weeks ago.