So as far as I can tell, up until earlier this year there were two different dev kit configs: one had 1 GB of RAM and the other 1.5 GB. Problem solved.
And the GPU doesn't line up with an underclocked 4830 at all, per se. It obviously had to be underclocked: at 55 nm it would run too hot for a case roughly the Wii U's size, even one somewhat bulkier.
In any case, the final dev kits were described as more powerful, I believe. I can imagine the final custom silicon, if at 32 nm (which seems unavoidable), being clocked a bit higher.
Do multipliers even matter anymore? I recall someone mentioning that they don't. Could some customization sidestep them entirely, just running the chip in a downclocked mode set to match the Wii?
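To make the downclock idea concrete, here's a rough sketch of the arithmetic involved. The Wii's Broadway CPU runs at 729 MHz; every other figure below (the candidate base clock, the divider range) is a made-up assumption purely for illustration, not a claim about the actual hardware:

```python
# Hypothetical illustration: which integer divider of an assumed base clock
# lands closest to the Wii's 729 MHz Broadway clock?
# Only the 729 MHz figure is a known spec; the rest is invented for the example.

WII_CLOCK_MHZ = 729  # Broadway CPU clock (publicly known)

def best_integer_divider(base_mhz, target_mhz, max_div=8):
    """Return the integer divider of base_mhz that gets closest to target_mhz."""
    return min(range(1, max_div + 1), key=lambda d: abs(base_mhz / d - target_mhz))

# Suppose (hypothetically) a final clock of 1458 MHz: 1458 / 2 = 729 exactly,
# so a simple /2 compatibility mode would hit the Wii clock with no
# fractional multiplier needed at all.
div = best_integer_divider(1458, WII_CLOCK_MHZ)
print(div, 1458 / div)  # → 2 729.0
```

The point being: if the final clock were chosen as a clean multiple of 729 MHz, a backward-compatibility mode could be a trivial divider rather than anything exotic.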
I've had enough of this shit. Let's try to figure it out.