Wii U clock speeds found by marcan

#1
https://twitter.com/marcan42

marcan is a well known Wii hacker, in case you haven't heard of him.

Wii U codenames worth knowing: system Cafe, CPU Espresso, GPU/SoC/etc. Latte, ARM secure processor Starbuck (we made that one up).

1.243125 GHz, exactly. Three PowerPC 750-type cores (similar to the Wii's Broadway, but with more cache).

GPU core at 549.999755 MHz.

we're calling the Wii U security processor the Starbuck (vs. Starlet on Wii). And it seems to be about equally vulnerable, too.

sorry, I'd rather not talk about how I got that yet. It doesn't involve leaks, it involves Wii U hacks ;)
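For context, a quick sanity check of how those figures compare to the original Wii's clocks (Broadway at 729 MHz, Hollywood at 243 MHz; those two are well-known published numbers, not from marcan's tweets):

```python
# Wii U clock figures from marcan's tweets (MHz)
espresso = 1243.125    # Wii U CPU ("Espresso")
latte = 549.999755     # Wii U GPU ("Latte")

# Original Wii reference clocks (MHz) -- widely published figures
broadway = 729.0       # Wii CPU
hollywood = 243.0      # Wii GPU

print(f"CPU speedup vs Broadway:  {espresso / broadway:.3f}x")   # 1.705x
print(f"GPU speedup vs Hollywood: {latte / hollywood:.3f}x")     # 2.263x
```

So per-clock the new chips run roughly 1.7x and 2.3x their Wii counterparts, before accounting for the extra cores, cache, and the more modern GPU architecture.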


 
#3
Why do I have the feeling that Nintendo's insistence on backwards compatibility is going to bite them in the ass in regards to homebrew? Oh marcan...
 
#5
I guess that explains why some developers were having trouble getting ports to run on it. I hope he, or someone else, manages to hack it soon. Region locks are the worst.
 
#10
itbegins.gif

Also, the irony of the possibility of Wii U being hacked before 3DS. Either Wii U is that easy or 3DS is that hard. I don't really know which right now.
 
#21
Holy balls @ that CPU clock
It's like the Pi version of processing speed: instead of starting at 3.14, which would have been close to the 360's or PS3's clock, it's only about a third of that, like a slice of Pi instead.

Although, this would explain the low power consumption.
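The "slice of Pi" quip roughly checks out. Against the 3.2 GHz clocks of the 360's Xenon and the PS3's Cell PPE (publicly documented figures), Espresso's clock comes in a bit under 40%; this compares raw clock only and says nothing about per-cycle work:

```python
espresso_ghz = 1.243125   # Wii U CPU clock, per marcan
xenon_ghz = 3.2           # Xbox 360 Xenon / PS3 Cell PPE clock (known spec)

ratio = espresso_ghz / xenon_ghz
print(f"Espresso runs at {ratio:.1%} of Xenon's clock")  # about 38.8%
```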
 
#25
Are you fucking kidding me? 1.25 GHz?

I change my statement.

Devs were not lazy with Wii U ports; they were fucking wizards to achieve what they did on that hardware.
 
#27
How much does a laptop with a higher clocked quad-core and a higher clocked integrated GPU cost?

Edit: Don't mistake this as "Nintendo should just have bought off-the-shelf components from Newegg and we'd have a better system!".
 
#42
So, obviously, 1.25 GHz here doesn't have the same significance as a 1.25 GHz CPU on a PC, as you could never play AC3 and the like with that. So what's the difference that allows it to run faster, and what is its nearest PC equivalent?
 
#43
ITT: people who don't understand that clock speed isn't as important a factor in modern CPU/GPU architectures.

I expected speeds right around this ballpark. With out-of-order execution and the chunk of eDRAM, it should still serve up some great-looking software down the line.

Just cool to have the numbers so we don't have to speculate anymore.
 
#44
So erm... Have reasons behind the choice of CPU been discussed? Is it for first party familiarity?
Backwards compatibility is a non-issue if you reuse parts of your former hardware.
But I don't understand why Nintendo lowered the clock speed, and hence the wattage, of their CPU so much. Something is wrong.



GPU clock speed seems in line with the hints given in the various WiiU specs threads.
 
#46
So erm... Have reasons behind the choice of CPU been discussed? Is it for first party familiarity?
I think cost and energy use come before ease of use with decisions of this nature. Talented developers can learn to make games on anything.
So, obviously, 1.25 GHz here doesn't have the same significance as a 1.25 GHz CPU on a PC, as you could never play AC3 and the like with that. So what's the difference that allows it to run faster, and what is its nearest PC equivalent?
Like a PowerPC Power Mac from the mid-2000s? Total guess on my part, but those had PowerPC processors.
 
#50
So, time for DBZ power levels for the Wii U CPU vs Cell vs Xenon?
In terms of pure clock speed, it's a bit like comparing a Fiat with a Ferrari. I know that CPU efficiency has increased so you can do more with 1 GHz than you could seven years ago, but a 3 GHz CPU still trumps a 1 GHz CPU any day of the week. It's pretty much the same strategy as the Wii: spend all the R&D money on the controller and release a cheap, underpowered console alongside it.

That, plus the suggestion that the hackers are confident they can blow the console open, is not great news for anyone involved.