
Wii U clock speeds found by marcan

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
So w-we...we won't get this?? :(

[image: screen_zeldawiiutechdemo.jpg]

Which explains why there are only 2 characters in the demo.
 
That exists because people can't take advantage of what's changed since last generation. We know for a fact that i7s are much better than i5s because of how they perform at things like encoding, but game tech doesn't make use of the CPU the same way.

Games themselves need to grow up. We waste far too much for the results we get.

I wasn't disagreeing with this. In reality, there isn't nearly as much the CPU needs to do in games. Most of what needs to be done in a game is handled via the GPU, and handled better at that. Remember that x86 chips were designed as general-purpose processors, while GPUs have always been designed specifically to cover what 3D gaming needs.
 

Thraktor

Member
Also, as an aside, I think it's somewhat amusing how the Xbox 360's CPU clock speed is taken as some sort of gold standard. The Xbox 360 was released right at the very height of clock-speed chasing by CPUs. And I mean literally: the Xbox 360 was released in NA on the 16th of November 2005. Just two days before that, on the 14th of November, Intel released the Pentium 4 HT 672, with a clock speed of 3.8GHz. They haven't released a CPU clocked that high in the 7 years since.
 

AColdDay

Member
After seeing this, coupled with all of the other Wii U issues and Nintendo's screw-ups in general (poor decisions regarding their online network, localization decisions, marketing decisions, product decisions like the 3D emphasis on the 3DS and the Wii Mini), I am really beginning to lose faith in Iwata's leadership.

It's easy to blame Reggie for all of this as he is constantly in the spotlight, but Reggie is a figurehead and doesn't hold any real power over these decisions. He basically has to clean up the mess Japan hands him. Iwata has a background in development; he endured programming on the N64. He should KNOW the kind of burden this underpowered hardware places on third parties trying to port. He should KNOW that lack of third-party support handicapped his previous system. He approved all of this, and he should be taken to task for not learning his lessons from the 3DS. Games sell the system.
 

ohlawd

Member
Well, I'm just gonna go with the wave and say that's really slow.

But I'm waiting for Bayonetta 2. If that shit ends up being as slow as the pre-patch PS3 version of Bayonetta 1, I don't know what I'd do to myself lol. Can Platinum Games and Nintendo evoke the Nintendo magic?! Tune in next time...
 

TunaLover

Member
New Tweets from marcan

Hector Martin @marcan42
The Espresso is an out of order design with a much shorter pipeline. It should win big on IPC on most code, but it has weak SIMD.

Hector Martin @marcan42
It's worth noting that Espresso is *not* comparable clock per clock to a Xenon or a Cell. Think P4 vs. P3-derived Core series.

Hector Martin @marcan42
No hardware threads. One per core. No new SIMD, just paired singles. But it's a saner core than the P4esque stuff in 360/PS3.
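
To put marcan's IPC point in rough numbers, here's a back-of-envelope sketch in C. The IPC figures are purely assumed for illustration; only the clock speeds come from the thread, so treat the output as a thought experiment rather than a benchmark.

```c
/* Effective scalar throughput ~= IPC x clock.
 * The IPC values below are assumptions for illustration, not measurements. */
#include <stdio.h>

int main(void) {
    double espresso_ipc = 1.5;    /* assumed: short-pipeline, out-of-order core */
    double xenon_ipc    = 0.5;    /* assumed: long-pipeline, in-order core */
    double espresso_clk = 1.24e9; /* ~1.24 GHz, as reported */
    double xenon_clk    = 3.2e9;  /* Xenon's 3.2 GHz */

    printf("Espresso: ~%.2f G instructions/s per core\n",
           espresso_ipc * espresso_clk / 1e9);
    printf("Xenon:    ~%.2f G instructions/s per core\n",
           xenon_ipc * xenon_clk / 1e9);
    return 0;
}
```

Under those assumptions the lower-clocked core actually comes out ahead on general code, which is exactly the P4-vs-Core comparison marcan is making; on SIMD-heavy code the picture reverses.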
 
Also, as an aside, I think it's somewhat amusing how the Xbox 360's CPU clock speed is taken as some sort of gold standard. The Xbox 360 was released right at the very height of clock-speed chasing by CPUs. And I mean literally: the Xbox 360 was released in NA on the 16th of November 2005. Just two days before that, on the 14th of November, Intel released the Pentium 4 HT 672, with a clock speed of 3.8GHz. They haven't released a CPU clocked that high in the 7 years since.

I remember back when clock speeds were how most people compared CPUs; AMD actually put out an ad explaining that clock speeds aren't everything.
 

Foffy

Banned
After seeing this, coupled with all of the other Wii U issues and Nintendo's screw-ups in general (poor decisions regarding their online network, localization decisions, marketing decisions, product decisions like the 3D emphasis on the 3DS and the Wii Mini), I am really beginning to lose faith in Iwata's leadership.

It's easy to blame Reggie for all of this as he is constantly in the spotlight, but Reggie is a figurehead and doesn't hold any real power over these decisions. He basically has to clean up the mess Japan hands him. Iwata has a background in development; he endured programming on the N64. He should KNOW the kind of burden this underpowered hardware places on third parties trying to port. He should KNOW that lack of third-party support handicapped his previous system. He approved all of this, and he should be taken to task for not learning his lessons from the 3DS. Games sell the system.

Nintendo has games to sell the system. The problem is only Nintendo has games that could sell the system.
 

Erasus

Member
Also, as an aside, I think it's somewhat amusing how the Xbox 360's CPU clock speed is taken as some sort of gold standard. The Xbox 360 was released right at the very height of clock-speed chasing by CPUs. And I mean literally: the Xbox 360 was released in NA on the 16th of November 2005. Just two days before that, on the 14th of November, Intel released the Pentium 4 HT 672, with a clock speed of 3.8GHz. They haven't released a CPU clocked that high in the 7 years since.

Isn't it 3.2GHz because it's 3 PPEs from Cell, and those are also 3.2GHz? Those architectures needed the high clock to perform. Similar to AMD vs Intel now: AMD processors generally have higher clocks, but Intel still outperforms them at lower ones.
 

Kenka

Member
Kenka mentioned me a few pages back, so I might as well give my two cents.

First, it's worth keeping in mind that the general expectation until very recently was a CPU around 2GHz (many estimates around the 1.8GHz mark) and a GPU 500MHz or under (my guess was 480MHz).

The main take-home from the real clock speeds (higher clocked GPU than expected, lower clocked CPU than expected) is that the console is even more GPU-centric than expected. And, from the sheer die size difference between the CPU and GPU, we already knew it was going to be seriously GPU centric.

Basically, Nintendo's philosophy with the Wii U hardware is to have all Gflop-limited code (ie code which consists largely of raw computational grunt work, like physics) offloaded to the GPU, and keep the CPU dedicated to latency-limited code like AI. The reason for this is simply that GPUs offer much better Gflop per watt and Gflop per mm² characteristics, and when you've got a finite budget and thermal envelope, these things are important (even to MS and Sony, although their budgets and thermal envelopes may be much higher). With out-of-order execution, a short pipeline and a large cache the CPU should be well-suited to handling latency-limited code, and I wouldn't be surprised if it could actually handle pathfinding routines significantly better than Xenon or Cell (even with the much lower clock speed). Of course, if you were to try to run physics code on Wii U's CPU it would likely get trounced, but that's not how the console's designed to operate.

The thing is that, by all indications, MS and Sony's next consoles will operate on the same principle. The same factors of GPUs being better than CPUs at many tasks these days applies to them, and it looks like they'll combine Jaguar CPUs (which would be very similar to Wii U's CPU in performance, although clocked higher) with big beefy GPUs (obviously much more powerful than Wii U's).
Thank you for these explanations. Reading you suggests that Nintendo had five goals when they designed the console:

  1. launch before the competition
  2. be able to play current HD-gen games
  3. be backward compatible with the Wii
  4. keep power consumption low
  5. make porting from the future consoles easy through similarities in architecture, and maybe power


If they can fulfil the fifth point, then I'd agree that they were smart. If not, then I'd question my eventual purchase.
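
To make Thraktor's Gflop-limited vs. latency-limited distinction concrete, here's a minimal C sketch of the two workload classes; the type and function names are hypothetical.

```c
/* Two workload classes, illustrative only. */
#include <stddef.h>

/* Latency-limited: each load depends on the previous one (pointer chasing),
 * as in graph/pathfinding code. High clocks and wide SIMD barely help here;
 * out-of-order execution and a large cache help a lot. */
typedef struct Node { struct Node *next; int cost; } Node;

int walk(const Node *n) {
    int total = 0;
    while (n) {              /* serial dependency chain */
        total += n->cost;
        n = n->next;
    }
    return total;
}

/* Gflop-limited: independent arithmetic over large arrays, as in physics.
 * Every iteration is independent, so it maps naturally onto a GPU's ALUs. */
void integrate(float *pos, const float *vel, size_t count, float dt) {
    for (size_t i = 0; i < count; i++)
        pos[i] += vel[i] * dt;
}

int main(void) {
    Node c = { NULL, 3 }, b = { &c, 2 }, a = { &b, 1 };
    float pos[2] = { 0.0f, 0.0f }, vel[2] = { 1.0f, 2.0f };
    integrate(pos, vel, 2, 0.016f);
    return walk(&a) == 6 ? 0 : 1;
}
```

The design bet Thraktor describes is that code like `walk` stays on Espresso, where the short pipeline and big cache pay off, while code like `integrate` moves to the GPU.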
 

prag16

Banned
Kenka mentioned me a few pages back, so I might as well give my two cents.

First, it's worth keeping in mind that the general expectation until very recently was a CPU around 2GHz (many estimates around the 1.8GHz mark) and a GPU 500MHz or under (my guess was 480MHz).

The main take-home from the real clock speeds (higher clocked GPU than expected, lower clocked CPU than expected) is that the console is even more GPU-centric than expected. And, from the sheer die size difference between the CPU and GPU, we already knew it was going to be seriously GPU centric.

Basically, Nintendo's philosophy with the Wii U hardware is to have all Gflop-limited code (ie code which consists largely of raw computational grunt work, like physics) offloaded to the GPU, and keep the CPU dedicated to latency-limited code like AI. The reason for this is simply that GPUs offer much better Gflop per watt and Gflop per mm² characteristics, and when you've got a finite budget and thermal envelope, these things are important (even to MS and Sony, although their budgets and thermal envelopes may be much higher). With out-of-order execution, a short pipeline and a large cache the CPU should be well-suited to handling latency-limited code, and I wouldn't be surprised if it could actually handle pathfinding routines significantly better than Xenon or Cell (even with the much lower clock speed). Of course, if you were to try to run physics code on Wii U's CPU it would likely get trounced, but that's not how the console's designed to operate.

The thing is that, by all indications, MS and Sony's next consoles will operate on the same principle. The same factors of GPUs being better than CPUs at many tasks these days applies to them, and it looks like they'll combine Jaguar CPUs (which would be very similar to Wii U's CPU in performance, although clocked higher) with big beefy GPUs (obviously much more powerful than Wii U's).

Great post. The question here regarding the launch ports is: How much offloading to GPU has already been done by devs? And how much more can be done to be truly optimized?

As it stands, if the CPU (and the system overall) were really as weak as some in here seem to think, these ports would not run nearly as well as they do.
 

F#A#Oo

Banned
Underwhelming is what it is.

That being said, I still think it will be OK. I wasn't expecting a beast console, but I will say the specs don't look as strong as insiders hinted.

In any case, I'm not really sure what's what anyway, as you cannot compare these specs to PC parts.
 

Durante

Member
I didn't expect that. My low-end expectation was 1.6 GHz, and my high-end 2.4 GHz.

What's hilarious is thinking back to when people were enraged that I dared to suggest that the Wii U CPU would be clocked significantly lower than the PS360's. I just meant "500 MHz+" by "significantly", and that made people angry. I wonder how they feel about a 2 GHz difference.

Well, it has no vector units for starters. And the clock speed is pretty low. So yeah, it's probably pretty slow for certain workloads like physics or crowd AI. Unless developers manage to move those workloads to the GPU.
It's a 1.2 GHz processor. And while it's out-of-order, it's not an ILP monster like an i7 either. It's fair to say it's slow for all workloads.
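
For a rough sense of what weak SIMD means for physics-style code, here's a peak-FLOPs sketch. The SIMD widths and the one-FMA-per-cycle rates are simplified assumptions, not documented hardware figures.

```c
/* Per-core peak single-precision FLOPs ~= SIMD width x 2 (FMA) x clock.
 * Widths and issue rates are simplified assumptions for illustration. */
#include <stdio.h>

int main(void) {
    double espresso = 2 * 2 * 1.24e9; /* paired singles: assumed 2-wide FMA/cycle */
    double xenon    = 4 * 2 * 3.2e9;  /* VMX: assumed 4-wide FMA/cycle */
    printf("Espresso: ~%.1f Gflops per core\n", espresso / 1e9);
    printf("Xenon:    ~%.1f Gflops per core\n", xenon / 1e9);
    return 0;
}
```

Even under these simplifications the per-core gap on vector math is several-fold, which is why moving physics-style workloads to the GPU matters so much here.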
 

JJD

Member
Man, the possibility of the WiiU being hacked so early in its life cycle scares me.

I mean, on one side I'm happy that we can avoid Nintendo's region-locking bullshit.

But on the other, if piracy finds its way to the console so quickly and easily, there's no way this won't harm the console in the long run.

Being from a country where piracy is practically socially acceptable, I can count on the fingers of my hands the number of people I know who buy original Wii and DS games.

Nintendo seems to have done a good job with the 3DS security; hopefully it will be the same with the WiiU.

In a perfect world, hackers would find a way to allow region-free gaming and homebrew on the WiiU without opening the door for criminals to exploit their work.
 

Meelow

Banned
Kenka mentioned me a few pages back, so I might as well give my two cents.

First, it's worth keeping in mind that the general expectation until very recently was a CPU around 2GHz (many estimates around the 1.8GHz mark) and a GPU 500MHz or under (my guess was 480MHz).

The main take-home from the real clock speeds (higher clocked GPU than expected, lower clocked CPU than expected) is that the console is even more GPU-centric than expected. And, from the sheer die size difference between the CPU and GPU, we already knew it was going to be seriously GPU centric.

Basically, Nintendo's philosophy with the Wii U hardware is to have all Gflop-limited code (ie code which consists largely of raw computational grunt work, like physics) offloaded to the GPU, and keep the CPU dedicated to latency-limited code like AI. The reason for this is simply that GPUs offer much better Gflop per watt and Gflop per mm² characteristics, and when you've got a finite budget and thermal envelope, these things are important (even to MS and Sony, although their budgets and thermal envelopes may be much higher). With out-of-order execution, a short pipeline and a large cache the CPU should be well-suited to handling latency-limited code, and I wouldn't be surprised if it could actually handle pathfinding routines significantly better than Xenon or Cell (even with the much lower clock speed). Of course, if you were to try to run physics code on Wii U's CPU it would likely get trounced, but that's not how the console's designed to operate.

The thing is that, by all indications, MS and Sony's next consoles will operate on the same principle. The same factors of GPUs being better than CPUs at many tasks these days applies to them, and it looks like they'll combine Jaguar CPUs (which would be very similar to Wii U's CPU in performance, although clocked higher) with big beefy GPUs (obviously much more powerful than Wii U's).

So what you're saying is... Nintendo made the right move with the Wii U hardware? Or am I reading this wrong?

So w-we...we won't get this?? :(

[image: screen_zeldawiiutechdemo.jpg]

We will most likely get this, or possibly better.

Didn't the Wii U get more powerful compared to the E3 2011 dev kit?
 
I remember back when clock speeds were how most people compared CPUs; AMD actually put out an ad explaining that clock speeds aren't everything.

Though they also muddied the waters by releasing a "speed rating" for their CPUs comparing them (internally) to Intel chips. Nothing like trying to explain to a non-techie why an AMD XP 2500 ran at 1800MHz.
 
Thank you for these explanations. Reading you suggests that Nintendo had five goals when they designed the console:

  1. launch before the competition
  2. be able to play current HD-gen games
  3. be backward compatible with the Wii
  4. keep power consumption low
  5. make porting from the future consoles easy through similarities in architecture, and maybe power


If they can fulfil the fifth point, then I'd agree that they were smart. If not, then I'd question my eventual purchase.

They won't be similar in power. The funny thing is that it might be simpler to work with next-gen ports than current-gen ports. There will be big compromises either way, though.
 

Doc Holliday

SPOILER: Columbus finds America
Wait what? Wii I understand, but the DS wasn't gimped in hardware.

Compared to hardware that launched at the same time, the PSP, yes it was. Of course, looking back, Nintendo made great decisions on the DS. My point is they are not interested in an arms race with Sony/MS.

I actually find it amazing that Nintendo can do this. Iwata is a very shrewd businessman. I think, however, that Nintendo is blowing a huge chance to be truly dominant for years by not looking forward just a little bit more on the tech side.

Kinda surprised Nintendo didn't just go the ARM route, considering how energy-efficient those chips are.
 

wsippel

Banned
It's a 1.2 GHz processor. And while it's out-of-order, it's not an ILP monster like an i7 either. It's fair to say it's slow for all workloads.
Impossible to tell. We don't know how many execution units there are, for example. And even Marcan seems to think that it's actually faster than Cell and Xenon for pretty much anything that isn't SIMD, thanks to significantly better IPC.
 
Also, as an aside, I think it's somewhat amusing how the Xbox 360's CPU clock speed is taken as some sort of gold standard. The Xbox 360 was released right at the very height of clock-speed chasing by CPUs. And I mean literally: the Xbox 360 was released in NA on the 16th of November 2005. Just two days before that, on the 14th of November, Intel released the Pentium 4 HT 672, with a clock speed of 3.8GHz. They haven't released a CPU clocked that high in the 7 years since.
Which tells us a lot about how CPU development has changed since then.
 

LCGeek

formerly sane
I wasn't disagreeing with this. In reality, there isn't nearly as much the CPU needs to do in games. Most of what needs to be done in a game is handled via the GPU, and handled better at that. Remember that x86 chips were designed as general-purpose processors, while GPUs have always been designed specifically to cover what 3D gaming needs.

General-purpose code cannot be leveraged that well in games, I agree. We have known for at least a decade that GPUs should be the focus, with a good overall balance. I'm more miffed that anyone who takes an interest in Nintendo platforms glosses over this. Taking a design meant for another platform's distinct way of achieving results and not in any way reworking it for Nintendo is just dumb. As a consumer I reserve the right to buy products from companies that take this line of thinking.
 

LeleSocho

Banned
I think the point is Shin'en are one of the only developers who actually knew what they were doing on the Wii and actually put effort into making graphical showcases. So we should probably listen when they talk about technical stuff, because they understand that type of architecture best, while the other dev that said the Wii U is horrible had a game already built for a different type of architecture and tried to port it.

Again, given the situation they're in, they will only point to the good sides of the hardware and never the bad. They also don't make complex games whatsoever, so it's easier for them to give eye candy to people.


Another tweet from marcan brings more bad news: it confirms that there is no "hyperthreading", so it's one thread per core...
 

lherre

Accurate
Two questions for some people:

- Is having a CPU with performance comparable to Xenon an achievement in 2012?

- Is there anything special about the Zelda demo? Maybe I'm blind (it's an honest question). I don't see anything special in it (except that it's Zelda in HD).
 

Berg

Member
All I can say is what I've been playing on the wiiU the last couple weeks looks awesome....

For 5 minutes, then I'm completely into the game and forget about how detailed everything is.
 

Nirolak

Mrgrgr
Impossible to tell. We don't know how many execution units there are, for example. And even Marcan seems to think that it's actually faster than Cell and Xenon for pretty much anything that isn't SIMD, thanks to significantly better IPC.

That is not what he said:

Marcan said:
Hector Martin @marcan42

@ZuelaBR I don't know how it compares at the actual clock speeds, but at the same clock the 750 wins hands down except on pure SIMD.
Source: https://twitter.com/marcan42/status/274181216054423552

He said that at the same clock speed it would win. It is definitely not at the same clock speed.
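
A quick way to frame that last point: for Espresso (~1.24 GHz) to match Xenon (3.2 GHz) on scalar code, its per-clock advantage would need to cover the entire clock gap. The one-liner below just does the division; it makes no microarchitectural claims.

```c
/* The clock gap Espresso's per-clock advantage would have to overcome. */
#include <stdio.h>

int main(void) {
    printf("break-even IPC ratio: ~%.2fx\n", 3.2e9 / 1.24e9);
    return 0;
}
```

That prints roughly 2.58x, so "wins hands down per clock" does not automatically translate into winning at actual clocks.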
 

JordanN

Banned
Told you all you're getting Wii'd again.
Having an HD console with next-gen processing power and 4x the RAM of current gen in 2012 is a far better trade-off than the Wii with its 2001 API, SD output and 1.5-2x increase in RAM in 2006.
 
Compared to hardware that launched at the same time, the PSP, yes it was. Of course, looking back, Nintendo made great decisions on the DS. My point is they are not interested in an arms race with Sony/MS.

I actually find it amazing that Nintendo can do this. Iwata is a very shrewd businessman. I think, however, that Nintendo is blowing a huge chance to be truly dominant for years by not looking forward just a little bit more on the tech side.

Well, just remember Nintendo is always developing the "next big console"... If Nintendo truly screws the pooch on this one, I wouldn't be surprised if they had a plan B ready in 2 years. Even if they don't, they may go with a 5-6 year cycle (versus the 7-10 year cycle the HD twins have/had), which would let them launch a new "next gen" just 2-3 years into the PS4's life to try again.
 

EVH

Member
All I can say is what I've been playing on the wiiU the last couple weeks looks awesome....

For 5 minutes, then I'm completely into the game and forget about how detailed everything is.

This is absolutely right until you see the price.
 

Thraktor

Member
Right. Crowd AI/pathfinding isn't. And that's something GPUs are really damn good at.

You're sort of right here. Pathfinding, as usually implemented by hierarchical versions of the A* algorithm, runs much, much better on CPUs than GPUs, as it's entirely latency-limited, and the Wii U's CPU should be very good at this kind of code, as it's a short-pipeline, out-of-order CPU with lots of cache.

Crowd AI, on the other hand, doesn't actually have much in common with regular pathfinding at all. As far as I'm aware, most games where there are large crowds of people simulate their movement in a way much closer to physics; i.e. each person moves in a general direction until they bump into someone and then starts moving in a different direction. This is much cheaper, computationally, than running proper pathfinding for each individual, and because the player is looking at a large crowd, they don't notice when individuals in that crowd are just walking around in circles. It's also something that can be done well on a GPU, as it's not latency-limited.
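
As a minimal sketch of that crowd-movement style: each agent below steers by purely local "push apart" rules, with no per-agent pathfinding. Every update is arithmetic over the agent array, which is what makes it GPU-friendly (a real GPU version would double-buffer positions rather than update in place). All names and constants are illustrative.

```c
/* Crowd movement via local avoidance only; no pathfinding. Illustrative. */
#include <math.h>
#include <stddef.h>

typedef struct { float x, y, vx, vy; } Agent;

void step_crowd(Agent *a, size_t n, float dt) {
    for (size_t i = 0; i < n; i++) {
        for (size_t j = 0; j < n; j++) {   /* naive O(n^2) neighbour check */
            if (i == j) continue;
            float dx = a[i].x - a[j].x, dy = a[i].y - a[j].y;
            float d2 = dx * dx + dy * dy;
            if (d2 > 0.0f && d2 < 1.0f) {  /* too close: push apart */
                float d = sqrtf(d2);
                a[i].vx += dx / d * dt;
                a[i].vy += dy / d * dt;
            }
        }
        a[i].x += a[i].vx * dt;            /* per-agent updates are independent */
        a[i].y += a[i].vy * dt;
    }
}

int main(void) {
    Agent crowd[3] = {
        { 0.0f, 0.0f, 0.1f, 0.0f },
        { 0.5f, 0.0f, -0.1f, 0.0f },
        { 5.0f, 5.0f, 0.0f, 0.0f },
    };
    for (int frame = 0; frame < 60; frame++)
        step_crowd(crowd, 3, 1.0f / 60.0f);
    return 0;
}
```

Proper A* pathfinding, by contrast, is dominated by dependent memory accesses into open lists and node graphs, which is exactly the latency-limited pattern the CPU is kept around for.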
 
All I can say is what I've been playing on the wiiU the last couple weeks looks awesome....

For 5 minutes, then I'm completely into the game and forget about how detailed everything is.

Not to call you out specifically, but it's getting almost as annoying to see an "I DON'T CARE ABOUT GRAPHICS" post in a graphics thread as an "I DON'T CARE ABOUT SALES" post in a sales thread.
 

beril

Member
Not only that, but it's not like Shin'en works on huge, ambitious, CPU-demanding games like Skyrim. I'm sure Shin'en is more than fine with what the Wii U offers in terms of performance.

OK, this isn't really relevant to the thread, but I've seen it mentioned a few times in relation to CPU power. Why the hell would Skyrim be particularly CPU-demanding? I haven't played it, but it seems close enough to Oblivion or Morrowind: a slow-paced RPG with at most a handful of enemies/NPCs active at once, few other dynamic objects, and awful animations (OK, not really relevant, but I just can't mention the series without commenting on the animations). Other than a massive streaming world, there doesn't seem to be anything particularly impressive about it from a technical point of view.
 

Erasus

Member
Impossible to tell. We don't know how many execution units there are, for example. And even Marcan seems to think that it's actually faster than Cell and Xenon for pretty much anything that isn't SIMD, thanks to significantly better IPC.

But all games are focused on multi-core now...
 