Wii U clock speeds found by marcan

#1
https://twitter.com/marcan42

marcan is a well known Wii hacker, in case you haven't heard of him.

Wii U codenames worth knowing: system Cafe, CPU Espresso, GPU/SoC/etc. Latte, ARM secure processor Starbuck (we made that one up).

1.243125GHz, exactly. 3 PowerPC 750 type cores (similar to Wii's Broadway, but more cache).

GPU core at 549.999755MHz.

we're calling the WiiU security processor the Starbuck (vs. Starlet on Wii). And it seems to be about equally vulnerable, too

sorry, I'd rather not talk about how I got that yet. It doesn't involve leaks, it involves Wii U hacks ;)


 
#3
Why do I have the feeling that Nintendo's insistence on backwards compatibility is going to bite them in the ass in regards to homebrew? Oh marcan...
 
#10
itbegins.gif

Also, the irony of the possibility of Wii U being hacked before 3DS. Either Wii U is that easy or 3DS is that hard. I don't really know which right now.
 
#21
Holy balls @ that CPU clock
It's like the Pi version of processing speed: instead of starting at 3.14, which would have been close to the 360 or PS3 processor's clock, it sits at roughly a third of that, like a slice of Pi instead.

Although, this would explain the low power consumption.
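For what it's worth, the arithmetic behind the "slice of Pi" comparison checks out (Xenon and the Cell PPE both run at 3.2GHz; the Espresso figure is from marcan's tweet):

```python
# Quick sanity check on the clock comparison above.
espresso_ghz = 1.243125  # Wii U CPU clock, per marcan's tweet
xenon_ghz = 3.2          # Xbox 360 Xenon / PS3 Cell PPE clock

ratio = espresso_ghz / xenon_ghz
print(f"Espresso runs at {ratio:.0%} of a 3.2GHz clock")  # ~39%, roughly a third
```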
 
#27
How much does a laptop with a higher-clocked quad-core and a higher-clocked integrated GPU cost?

Edit: Don't mistake this as "Nintendo should just have bought off-the-shelf components from Newegg and we'd have a better system!".
 
#42
So, obviously, 1.25GHz doesn't have the same significance as a 1.25GHz CPU on a PC, as you could never play AC3 and the like on that. So what's the difference that allows it to run faster, and what is its nearest PC equivalent?
 
#43
ITT: people who don't understand that clock speed isn't as important a factor in modern CPU/GPU architectures

I expected speeds right in this ballpark; with out-of-order execution and the chunk of eDRAM it should still serve up some great-looking software down the line.

Just cool to have the number so we don't have to speculate anymore.
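To illustrate that point, a toy throughput model. The IPC figures below are hypothetical, picked only to show how an out-of-order core at a lower clock can keep pace with a higher-clocked in-order one:

```python
def throughput(clock_ghz, ipc):
    """Rough performance proxy: instructions retired per nanosecond."""
    return clock_ghz * ipc

# Hypothetical IPC figures for illustration only, not measurements.
espresso = throughput(1.243125, ipc=2.0)  # OoO PPC 750 descendant: few stalls
ppe_like = throughput(3.2, ipc=0.8)       # in-order, long-pipeline core: many stalls

print(f"{espresso:.2f} vs {ppe_like:.2f} instructions/ns")  # 2.49 vs 2.56
```

Under these made-up assumptions the two come out nearly even, which is the whole point: clock alone tells you very little.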
 
#44
So erm... Have reasons behind the choice of CPU been discussed? Is it for first party familiarity?
Backwards compatibility is a non-issue if you reuse parts of your former hardware.
But I don't understand why Nintendo lowered the clock speed, and hence the wattage, of their CPU so much. Something is wrong.



GPU clock speed seems in line with the hints given in the various WiiU specs threads.
 
#46
So erm... Have reasons behind the choice of CPU been discussed? Is it for first party familiarity?
I think cost and energy use come before ease of use with decisions of this nature. Talented developers can learn to make games on anything.
So, obviously, 1.25GHz doesn't have the same significance as a 1.25GHz CPU on a PC, as you could never play AC3 and the like on that. So what's the difference that allows it to run faster, and what is its nearest PC equivalent?
Like a PowerPC Power Mac from the mid-2000s? Total guess on my part, but those had PowerPC processors.
 
#50
So, time for DBZ power levels for the Wii U CPU vs Cell vs Xenon?
In terms of pure clock speed, it's a bit like comparing a Fiat with a Ferrari. I know CPU efficiency has increased, so you can do more with 1GHz than you could seven years ago, but a 3GHz CPU still trumps a 1GHz CPU any day of the week. It's pretty much the same strategy as the Wii: spend all the R&D money on the controller and release a cheap, underpowered console alongside it.

That, and the suggestion that the hackers are confident they can blow the console open, is not great news for anyone involved.