
Wii U clock speeds are found by marcan

Vagabundo

Member
Is cool.

The Espresso leaker at B3D must have some good green if he thinks it would in any way have similar performance at its 1.2 GHz vs the 3.2 GHz Xenon, though.

It makes me question the reliability of his information. He either knows what he is talking about or he doesn't.
 
Is cool.

The Espresso leaker at B3D must have some good green if he thinks it would in any way have similar performance at its 1.2 GHz vs the 3.2 GHz Xenon, though.
Actually, lherre or arkam recently told us that the CPU clockspeed increased by 25% between third and fourth version devkits, so the CPU he was talking about was only about 1GHz.

It has been calculated before that Broadway performs surprisingly well against Xenon in at least some tasks despite the clockspeed difference (except in SIMD, of course). There was one person who claimed that in real-world performance, Broadway was only 20% weaker than a Xenon core. If we use those calculations, that would line up with the Espresso leaker's statements.
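To make the arithmetic explicit, here's a back-of-envelope sketch of that claim. Every input is a number from this thread (the "20% weaker" figure is one poster's claim and the 1.24GHz clock is marcan's measurement), not a benchmark:

```python
# Back-of-envelope check of the "Broadway vs Xenon" claim.
# All figures are claims from this thread, not measurements.
broadway_clock = 0.729   # GHz, Wii CPU
xenon_clock    = 3.2     # GHz, one Xenon core
espresso_clock = 1.24    # GHz, marcan's measured Wii U clock

# Claim: at stock clocks, one Broadway core managed ~80% of the
# real-world (non-SIMD) throughput of one Xenon core.
broadway_vs_xenon = 0.8

# Implied per-clock advantage of the 750 design over Xenon:
per_clock = broadway_vs_xenon * (xenon_clock / broadway_clock)

# Scale a Broadway-like core up to Espresso's clock:
espresso_vs_xenon = per_clock * (espresso_clock / xenon_clock)

print(f"per-clock advantage: {per_clock:.2f}x")                   # ~3.51x
print(f"Espresso core vs Xenon core: {espresso_vs_xenon:.2f}x")   # ~1.36x
```

Taken at face value, three such cores land right in Xenon's neighbourhood, which is why the 20% figure matters so much; if that claim is wrong, the whole comparison collapses.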
 

DrNeroCF

Member
It doesn't make any sense at all to me - this idea that magically this core tech is dramatically faster per core.

PowerPC was ditched by major players years ago.

Over nearly a decade, Intel invested tens of billions to make their current CPU core tech 3x faster than late-P3/early-P4 cores. Just imagine if Intel abandoned x86, and someone picked up the Pentium 4 design and ran it at 1.25 GHz... why are we to believe it's running faster than a P4 at 3.2 GHz?

Everything we know suggests that it's weak.

Isn't that what the core duos were? Intel ditching the Pentium 4 design and running with the Pentium M design?

I remember reading about how much developers hated the 360 and PS3 CPUs when they first came out. Considering there were no G5 PowerPC chips clocked that high back then, or that had that many cores (like you said, major players ditched PowerPC: no G5 Powerbook, and the Power Mac only went up to two dual cores at 2.5GHz), the console design must have been really lobotomized in order to reach those clocks.
 
Isn't that what the core duos were? Intel ditching the Pentium 4 design and running with the Pentium M design?

I remember reading about how much developers hated the 360 and PS3 CPUs when they first came out. Considering there were no G5 PowerPC chips clocked that high back then, or that had that many cores (like you said, major players ditched PowerPC: no G5 Powerbook, and the Power Mac only went up to two dual cores at 2.5GHz), the console design must have been really lobotomized in order to reach those clocks.
Yes, and it should be noted that the 750 series (including Gekko and Broadway) maxed out below 1GHz due to design limitations. The fact that Wii U's CPU is clocked over 1GHz means that those cores are not "real" 750 CPUs, but are designed to behave the same as them.
 

Vagabundo

Member
Yes, and it should be noted that the 750 series (including Gekko and Broadway) maxed out below 1GHz due to design limitations. The fact that Wii U's CPU is clocked over 1GHz means that those cores are not "real" 750 CPUs, but are designed to behave the same as them.

Was that limitation a design or process size issue?

I believe that the clockspeed bump came after the die shrink, but maybe that die shrink also included some new design. That was around the time 3rd party sources reported a jump in performance; maybe it wasn't all related to the clockspeed increase.

It is possible the B3D source above had access to the previous chip and it was just 3 broadway cores, but that then changed. It still wouldn't explain why he believed it to have the processing power of a Xenon when the calculations show it shouldn't even be close.
 

A More Normal Bird

Unconfirmed Member
It could be that the calculations you're employing are next to useless for determining what you think they do. A tri-core Wii CPU at ~1.2GHz is in fact exceptionally close to a Xenon, as both estimates (by informed folks such as the hacker whose exploits are featured in the OP and the B3D source you refer to) and real world results show. Neither Xenon nor Cell are the monster CPUs their floppage suggests.

Edit: This post comes across as a bit grouchy. I don't know how much technical knowledge you have, but many in these threads are very confident in spouting hyperbole without any reason to be. The Wii & Wii U CPUs were both faster clock for clock than the Xenon, except maybe in SIMD (lack of VMX or similar). Nintendo fans seem to have latched onto the notion that rewriting code for the Wii U will solve all of its problems, and to some degree they're correct. However, it may involve making some sacrifices alongside any improvements and will not involve somehow writing for OoOE. As far as CPU tech goes, the unit seems right in line with the rest of the system. It's not Nintendo being cheap (or at least not excessively more than usual) but their obsession with power draw seems to have played quite a large role in making it what it is. Even an extra 15-20 watts could have placed the Wii U more clearly ahead of the 360/PS3.
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
Honestly, that GPU core is clocked higher than I cynically expected.
 

Vagabundo

Member
It could be that the calculations you're employing are next to useless for determining what you think they do. A tri-core Wii CPU at ~1.2GHz is in fact exceptionally close to a Xenon, as both estimates (by informed folks such as the hacker whose exploits are featured in the OP and the B3D source you refer to) and real world results show. Neither Xenon nor Cell are the monster CPUs their floppage suggests.

This would be my guess. My own gut tells me this little machine, in total, is about 1.5 360's, but I'm not sure if 3rd parties will spend the time/money on cross platform games unless the Wii U gets a large player base.

Edit: This post comes across as a bit grouchy. I don't know how much technical knowledge you have, but many in these threads are very confident in spouting hyperbole without any reason to be.

Not the least little bit. Fun fact: I was once offered a job as a chip tester in an IBM fab. That was a long time ago ('98) and my career went in a very different direction. Had things been different, I could be leaking you guys information! My electronic engineering skills are very out of date.

The Wii & Wii U CPUs were both faster clock for clock than the Xenon, except maybe in SIMD (lack of VMX or similar). Nintendo fans seem to have latched onto the notion that rewriting code for the Wii U will solve all of its problems, and to some degree they're correct. However, it may involve making some sacrifices alongside any improvements and will not involve somehow writing for OoOE. As far as CPU tech goes, the unit seems right in line with the rest of the system. It's not Nintendo being cheap (or at least not excessively more than usual) but their obsession with power draw seems to have played quite a large role in making it what it is. Even an extra 15-20 watts could have placed the Wii U more clearly ahead of the 360/PS3.

Yeah, each system has its highlights and bottlenecks.

And I'm sold on the Wii U, but I've always appreciated power efficient designs :D.
 

QaaQer

Member
It's not Nintendo being cheap (or at least not excessively more than usual) but their obsession with power draw seems to have played quite a large role in making it what it is. Even an extra 15-20 watts could have placed the Wii U more clearly ahead of the 360/PS3.

I really don't think we can know the motivations of Mr. Nintendo. But a more likely reason than power draw is cheapness. That extra 15 or 20 watts would have added substantially to their BOM. You know, like having a 7 or 8 hour battery.
 

Vagabundo

Member
I really don't think we can know the motivations of Mr. Nintendo. But a more likely reason than power draw is cheapness. That extra 15 or 20 watts would have added substantially to their BOM.

But it is always a trade off between heat/noise/reliability and more power. Supposedly the Japanese market likes small things. I'm not sure we can just put it down to cheapness, but obviously finding the right price point was important to them, and they are not likely to ever go in with a heavy loss leader. It's just not their style and considering that they are still around with a mountain of cash it is hard to argue with that mindset.

You know, like having a 7 or 8 hour battery.

Wasn't that a weight issue? They needed a controller that kids/old people could comfortably hold.
 

Oblivion

Fetishing muscular manly men in skintight hosery
Computer engineering is so weird. The fact that a tri-core 1.2GHz modified Broadway CPU can compete on ANY level with a 3.2GHz Xenon is (in my opinion) utterly startling. This would mean that the 2.4GHz numbers being thrown around all the way up until a few weeks ago would have utterly devastated Xenon.
 
It could be that the calculations you're employing are next to useless for determining what you think they do. A tri-core Wii CPU at ~1.2GHz is in fact exceptionally close to a Xenon, as both estimates (by informed folks such as the hacker whose exploits are featured in the OP and the B3D source you refer to) and real world results show. Neither Xenon nor Cell are the monster CPUs their floppage suggests.

Edit: This post comes across as a bit grouchy. I don't know how much technical knowledge you have, but many in these threads are very confident in spouting hyperbole without any reason to be. The Wii & Wii U CPUs were both faster clock for clock than the Xenon, except maybe in SIMD (lack of VMX or similar). Nintendo fans seem to have latched onto the notion that rewriting code for the Wii U will solve all of its problems, and to some degree they're correct. However, it may involve making some sacrifices alongside any improvements and will not involve somehow writing for OoOE. As far as CPU tech goes, the unit seems right in line with the rest of the system. It's not Nintendo being cheap (or at least not excessively more than usual) but their obsession with power draw seems to have played quite a large role in making it what it is. Even an extra 15-20 watts could have placed the Wii U more clearly ahead of the 360/PS3.
It's not "obsession", it's called "marketing". Btw, I don't use this to disqualify your post, A More Normal Bird, because it might well be the case.
Computer engineering is so weird. The fact that a tri-core 1.2GHz modified Broadway CPU can compete on ANY level with a 3.2GHz Xenon is (in my opinion) utterly startling. This would mean that the 2.4GHz numbers being thrown around all the way up until a few weeks ago would have utterly devastated Xenon.
Maybe I'm not remembering correctly, but MS didn't have many alternatives when settling on Xenon. The processor is more focused on general code, so maybe Nintendo did a better job optimising Espresso for game code, hence the claims they are close even with the great clock difference. IIRC, the same thing happened to MS CPU-wise as happened to Sony GPU-wise: they had a schedule to satisfy and had to go with a quick stopgap solution.
 
If Wii U's GPU is clocked at 550MHz, I can't see it having more than 320SP.
That's really bad. If Nintendo wasn't obsessed with low power draw... They could have done something better than PS360, and really easily. Worst thing is that even 60W would have been considered low power consumption.
This is sad.
 
Computer engineering is so weird. The fact that a tri-core 1.2GHz modified Broadway CPU can compete on ANY level with a 3.2GHz Xenon is (in my opinion) utterly startling. This would mean that the 2.4GHz numbers being thrown around all the way up until a few weeks ago would have utterly devastated Xenon.

If you read up a bit on pipelining, branch prediction, superscalar execution and out-of-order execution you won't be that surprised about such differences anymore.
Though I wouldn't be too optimistic about Espresso's efficiency. There is a reason that some developers complain about the CPU.
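Those effects are easy to sketch with a toy cycle-count model. Every number below (misprediction rates, pipeline depths, miss rates, overlap factors) is invented purely for illustration; none are real Espresso or Xenon figures:

```python
# Toy model of why clock speed alone is misleading: branch
# mispredictions flush the pipeline, and an in-order core stalls
# for the full memory-miss latency while an out-of-order core
# hides most of it. All parameters are made up for illustration.

MEM_LATENCY_NS = 100  # main-memory latency, same for both systems

def run_time(clock_ghz, pipeline_depth, mispredict_rate, miss_overlap):
    instr     = 1_000_000   # instructions in the workload
    branches  = 200_000     # branchy game logic
    loads     = 300_000
    miss_rate = 0.03        # cache miss rate on loads

    # each mispredicted branch flushes roughly the whole pipeline
    flush_cycles = branches * mispredict_rate * pipeline_depth
    # a fixed memory latency costs more cycles at a higher clock
    miss_penalty = MEM_LATENCY_NS * clock_ghz
    stall_cycles = loads * miss_rate * miss_penalty * (1 - miss_overlap)
    return (instr + flush_cycles + stall_cycles) / (clock_ghz * 1e9)

# short pipeline, out-of-order: hides most of the miss latency
espresso = run_time(1.24, 4, 0.05, 0.6)
# long pipeline, in-order: eats the full flush and stall cost
xenon = run_time(3.2, 23, 0.10, 0.0)

print(f"espresso-like/xenon-like time ratio: {espresso / xenon:.2f}")
```

With these made-up numbers the slower-clocked core actually finishes first despite a 2.6x clock deficit; lower the miss rate and the long-pipeline core wins again, which is roughly why the gap depends so heavily on the workload.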

Maybe I'm not remembering correctly, but MS didn't have many alternatives when settling on Xenon. The processor is more focused on general code, so maybe Nintendo did a better job optimising Espresso for game code, hence the claims they are close even with the great clock difference.

Espresso should have advantages at branchy integer code, which is still important in games. On the other hand, Xenon excels at floating point performance and has quite powerful vector units. How big the performance gap between the two is depends heavily on the situation.
 

Jeffa

Neo Member
If Wii U's GPU is clocked at 550MHz, I can't see it having more than 320SP.
That's really bad. If Nintendo wasn't obsessed with low power draw... They could have done something better than PS360, and really easily.

320SP @ 550MHz would be better than PS360, plus the newer design/features adding more performance.

BTW, die size doesn't suggest 320SP.
 
320SP @ 550MHz would be better than PS360, plus the newer design/features adding more performance.

BTW, die size doesn't suggest 320SP.



Power consumption doesn't suggest otherwise.
Also, that would be, as some people said, 1.5x the Xbox 360 GPU.
Actually, Wii U seems to be on par with PS360, with newer design/features. And that's it, nothing more, nothing less.
It's good considering it's drawing 30W. And that's it, don't expect magic when drawing 30W. If it was something like 60W, sure, Wii U would be easily capable of outperforming PS360. But for now ?
 

Mr Swine

Banned
Power consumption doesn't suggest otherwise.
Also, that would be, as some people said, 1.5x the Xbox 360 GPU.
Actually, Wii U seems to be on par with PS360, with newer design/features. And that's it, nothing more, nothing less.
It's good considering it's drawing 30W. And that's it, don't expect magic when drawing 30W. If it was something like 60W, sure, Wii U would be easily capable of outperforming PS360. But for now ?

You can't compare a 7-year-old GPU against a GPU built/derived from a design 4 years ago. Even if the Wii U GPU had only as many shaders as the 360 GPU, it would still outclass it by a clear margin.
 
This has been said before, but to whom is a low power draw a strong selling point? It's not like the power consumption of gaming consoles is a big cost or has any substantial environmental impact. It's not like Nintendo is touting it in advertising either. So why make huge compromises for something that you don't even use as a selling point? It's just beyond me.
 
If you read up a bit on pipelining, branch prediction, superscalar execution and out-of-order execution you won't be that surprised about such differences anymore.
Though I wouldn't be too optimistic about Espresso's efficiency. There is a reason that some developers complain about the CPU.

Espresso should have advantages at branchy integer code, which is still important in games. On the other hand, Xenon exceeds at floating point performance and has quite powerful vector units. It heavily depends on the situation how big the performance gap between the two is.
Yes, this is pretty much what leaks and sources suggest.

Regarding power draw, it's not really the main focus. Cheap hardware is, but once they settle on the relatively low power spec they do try to optimize as much as possible. If Nintendo was so much into power savings, there are alternatives with a better performance/consumption ratio than that PowerPC derivative.
 

Mr Swine

Banned
This has been said before, but to whom is a low power draw a strong selling point? It's not like the power consumption of gaming consoles is a big cost or has any substantial environmental impact. It's not like Nintendo is touting it in advertising either. So why make huge compromises for something that you don't even use as a selling point? It's just beyond me.

I'd rather have a console that is quiet and runs cool than one that sounds like a plane because of how hot it runs.
 

Xun

Member
I follow the ZBrush community (zbrushcentral) pretty closely and I've not seen it used for creating environments. It's not really meant for that. Set pieces, yes. But not entire rooms or maps or anything.

Here's the closest I've seen to it being used for environments, where it was used for creating cave features in Halo 4.

http://www.zbrushcentral.com/showthread.php?67348-Some-Zbrush-and-Max-Work&p=985984#post985984
I could've sworn I saw it used more, weird.

The point still stands that most games are already made at a higher poly count than what we're already seeing, so I doubt much will change from a technical standpoint to the budget.
 

wsippel

Banned
Yes, this is pretty much what leaks and sources suggest.

Regarding power draw, it's not really the main focus. Cheap hardware is, but once they settle on the relatively low power spec they do try to optimize as much as possible. If Nintendo was so much into power savings, there are alternatives with a better performance/consumption ratio than that PowerPC derivative.
If it was about cheap hardware, Nintendo would have used 476FPs and called it a day.
 
I still think it's more than a little sad that we're all lounging around here arguing about whether or not the Wii U can keep up with a seven year old console. That's the pathetic part of Nintendo's offering.
 
I'd rather have a console that is quiet and runs cool than one that sounds like a plane because of how hot it runs.
You are in the minority. There's a sort of "threshold" after which things like size and power consumption don't matter.

The Wii could have been the size of a PS2, for example, and it would have been the success it was anyways.
If it was about cheap hardware, Nintendo would have used 476FPs and called it a day.
Doesn't change what I'm trying to convey here, because they do balance the system and have minimum performance goals. And it is about cheap (I don't get why the "cheap" adjective ruffles feathers); cheap also includes lowering R&D costs and probably fewer tools revisions. I.e., isn't the CPU architecture closely related to what they have been using for 3 generations now?
 

The Boat

Member
This has been said before, but to whom is a low power draw a strong selling point? It's not like the power consumption of gaming consoles is a big cost or has any substantial environmental impact. It's not like Nintendo is touting it in advertising either. So why make huge compromises for something that you don't even use as a selling point? It's just beyond me.
I've said this before, but I don't understand why this is so hard to see.
With Wii, Nintendo started a strategy of making consoles that are inconspicuous, blend in and don't "scare" non-gamers. This was a fundamental part of their strategy. It's why the Wii was small, cheap, silent (except for the drive at times, but still more silent than the competition), didn't consume much, didn't have a lot of power and had a simple, remote-like controller. This worked big time. People on forums may like to whine that Nintendo abandoned them or whatever, but it's a fact: they hit gold.

Now, they might be going less aggressively after the non-gaming crowd this time, but completely abandoning a successful strategy doesn't strike me as intelligent. They want a system that's cheap (people say it's expensive, yet it's cheaper than what 360 and PS3 were, most likely cheaper than what Orbis and Durango will be, and only 50 bucks more expensive than Wii while being sold at a loss), silent, small, reliable (the jury's still out on that one) and inconspicuous. Low wattage in itself isn't the goal; it's not that many people care about that (it's a good marketing bullet-point though, and the Japanese might care). What matters here is that a small, durable console can't generate much heat, and if it can't generate heat it has to be low powered, especially if you don't want it to be noisy due to fans.

Now, you might not care about the console being small, and I don't either (I like it though), but it's incredibly obvious what the logic behind it is. While the Wii's size wasn't what made it a huge success, it might have helped, and since it's clear power doesn't dictate sales, Nintendo isn't throwing away cards that could win them the game. Whether or not they're right, that's yet to be seen.
 

Jeffa

Neo Member
Power consumption doesn't suggest otherwise.
Also, that would be, as some people said, 1.5x the Xbox 360 GPU.
Actually, Wii U seems to be on par with PS360, with newer design/features. And that's it, nothing more, nothing less.
It's good considering it's drawing 30W. And that's it, don't expect magic when drawing 30W. If it was something like 60W, sure, Wii U would be easily capable of outperforming PS360. But for now ?

Power consumption doesn't suggest 320SP. The AMD e4690 is 320SP @ 600MHz on 55nm with 128-bit 1.4GHz GDDR3, and it's 25W. Drop down to 40nm, downclock the core 8% and swap the RAM to 64-bit DDR3 and you'll be closer to 15W than 20W. The Wii U GPU + RAM should be over 25W.

WiiU will use more than 33W at full load; a 2D game is using 33W, but WiiU is very GPU-centric. Test a better-looking game and power usage will increase.
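That scaling argument can be written down as rough arithmetic. This treats dynamic power as linear in clock and in process node, a crude approximation that ignores voltage scaling and leakage, and it leaves the RAM swap out entirely:

```python
# Crude power scaling from the e4690 reference point quoted above
# (320SP @ 600MHz, 55nm, 25W with 128-bit GDDR3). Linear-in-clock
# and linear-in-node scaling is a rough illustration, nothing more.
e4690_watts = 25.0
clock_scale = 550 / 600   # ~8% downclock to the Wii U GPU clock
node_scale  = 40 / 55     # 55nm -> 40nm shrink

estimate = e4690_watts * clock_scale * node_scale
print(f"~{estimate:.1f} W before the 64-bit DDR3 saving")  # ~16.7 W
```

That lands in the same "closer to 15W than 20W" ballpark once the narrower memory bus is factored in.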
 

wsippel

Banned
Doesn't change what I'm trying to convey here, because they do balance the system and have minimum performance goals. And it is about cheap (I don't get why the "cheap" adjective ruffles feathers); cheap also includes lowering R&D costs and probably fewer tools revisions. I.e., isn't the CPU architecture closely related to what they have been using for 3 generations now?
476FP is done. It's an off-the-shelf 45nm core, and designed for SMP. No R&D costs at all. 750 on the other hand was never available in 45nm and simply doesn't support SMP. Not to mention no 750 ever used eDRAM. The chip Nintendo uses might be cheap to manufacture as it's small (though still a lot bigger than three 476FPs), but it came with significant R&D expenses as it's very different from any off-the-shelf design.
 

Oblivion

Fetishing muscular manly men in skintight hosery
This has been said before, but to whom is a low power draw a strong selling point? It's not like the power consumption of gaming consoles is a big cost or has any substantial environmental impact. It's not like Nintendo is touting it in advertising either. So why make huge compromises for something that you don't even use as a selling point? It's just beyond me.

People have claimed that Nintendo was catering to the Japanese who seem to like small, low power devices. This could have been a reasonable excuse if it wasn't for the fact that Sony was a Japanese company too.
 

2San

Member
People have claimed that Nintendo was catering to the Japanese who seem to like small, low power devices. This could have been a reasonable excuse if it wasn't for the fact that Sony was a Japanese company too.
Makes no sense even if you ignore Sony. Nintendo is a world player focused on worldwide success, not to mention Japan is focused on handheld devices, in which Nintendo is already active.
 

Boerseun

Banned
I still think it's more than a little sad that we're all lounging around here arguing about whether or not the Wii U can keep up with a seven year old console. That's the pathetic part of Nintendo's offering.

The perception of graphics is completely subjective. And, as far as I can tell, very few people have (this subjective perception of) graphics as a deciding factor when buying software, or indeed hardware.

I've said this before, but I don't understand why this is so hard to see.

With Wii, Nintendo started a strategy of making consoles that are inconspicuous, blend in and don't "scare" non-gamers. This was a fundamental part of their strategy. It's why the Wii was small, cheap, silent (except for the drive at times, but still more silent than the competition), didn't consume much, didn't have a lot of power and had a simple, remote-like controller. This worked big time. People on forums may like to whine that Nintendo abandoned them or whatever, but it's a fact: they hit gold.

Now, they might be going less aggressively after the non-gaming crowd this time, but completely abandoning a successful strategy doesn't strike me as intelligent. They want a system that's cheap (people say it's expensive, yet it's cheaper than what 360 and PS3 were, most likely cheaper than what Orbis and Durango will be, and only 50 bucks more expensive than Wii while being sold at a loss), silent, small, reliable (the jury's still out on that one) and inconspicuous. Low wattage in itself isn't the goal; it's not that many people care about that (it's a good marketing bullet-point though, and the Japanese might care). What matters here is that a small, durable console can't generate much heat, and if it can't generate heat it has to be low powered, especially if you don't want it to be noisy due to fans.

Now, you might not care about the console being small, and I don't either (I like it though), but it's incredibly obvious what the logic behind it is. While the Wii's size wasn't what made it a huge success, it might have helped, and since it's clear power doesn't dictate sales, Nintendo isn't throwing away cards that could win them the game. Whether or not they're right, that's yet to be seen.

Great post! It says exactly what I'm thinking.
 

test_account

XP-39C²
No, that's not really what I mean. I mean that Nintendo's focus as a publisher first, hardware maker second sometimes makes them a little adversarial with 3rd parties. I imagine that EA wanted too much control over Nintendo properties or perhaps too much of a slice of digital sales for Nintendo's liking which made their online plans fall apart. The CD add-on situation was definitely a case of Sony wanting too much control over Nintendo's properties but that still could have been handled better by both sides.

I'm not really sure what Nintendo could have done really in either situation but I think being such a successful publisher places them in an odd situation when dealing with other 3rd party publishers.

I do think that working with smaller independent studios like Platinum is a good idea for them. If the games they work on together are successful it could start to build a foundation that leads other independent dev houses to see Nintendo as a viable partner.

Edit: Just to make things clearer, I'm not talking about a situation where hypothetically EA wants to make Battlefield: Mushroom Kingdom; I'm talking more about a situation where, let's say, EA wants to co-publish 3D Mario with Nintendo because of their theoretical online integration with WiiU.
Ah ok, like that, I understand :) Yeah, if a 3rd party were to work that closely with Nintendo (or Microsoft and Sony for that matter), I guess they would want their share of the control as well, indeed.
 

TAS

Member
WiiU will use more than 33W at full load; a 2D game is using 33W, but WiiU is very GPU-centric. Test a better-looking game and power usage will increase.

Iwata stated at the Nintendo Direct that the Wii U consumes 75W of power under full load. So why are certain individuals stuck on the 33W number from NSMBU as the maximum? Are we ignoring facts here or am I missing something?
 

wsippel

Banned
Iwata stated at the Nintendo Direct that the Wii U consumes 75W of power under full load. So why are certain individuals stuck on the 33W number from NSMBU as the maximum? Are we ignoring facts here or am I missing something?
I wonder if NSMB has the chips running at 1GHz/400MHz as was initially planned for the system...
 

Log4Girlz

Member
I still think it's more than a little sad that we're all lounging around here arguing about whether or not the Wii U can keep up with a seven year old console. That's the pathetic part of Nintendo's offering.

And in the tech world, 2 years is an eon. We are discussing whether the Wii U can keep up with a centenarian in a marathon.
 
Iwata stated at the Nintendo Direct that the Wii U consumes 75W of power under full load. So why are certain individuals stuck on the 33W number from NSMBU as the maximum? Are we ignoring facts here or am I missing something?

75W is the maximum the power supply can deliver. Actual consumption is usually quite a bit lower than that. IIRC, Nintendo stated at some point that it will be around 40-45W.
 
People have claimed that Nintendo was catering to the Japanese who seem to like small, low power devices. This could have been a reasonable excuse if it wasn't for the fact that Sony was a Japanese company too.
The local development community doesn't consider SCE Japanese. They've decentralised their game unit so much, and fixated so much on western development, support and partnerships, that the head of Square Enix actually said Nintendo was the last Japanese hardware maker.
 

Log4Girlz

Member
The local development community doesn't consider SCE Japanese. They've decentralised their game unit so much, and fixated so much on western development, support and partnerships, that the head of Square Enix actually said Nintendo was the last Japanese hardware maker.

I for one am glad Nintendo is so focused on the Japanese console market. Perhaps they feel they have the power to resurrect that which has died, been cremated, and whose ashes have been shot into space to orbit the sun somewhere between Saturn and Uranus.
 
Makes no sense even if you ignore Sony. Nintendo is a world player focused on worldwide success, not to mention Japan is focused on handheld devices, in which Nintendo is already active.

Yep, this is what I've been saying. NA & EU are bigger console markets than Japan. This whole strategy makes little sense other than as a way to cheap out on the hardware and then use this low power draw / size thing as an excuse for PR reasons. It makes no sense to cater to the Japanese market.
 