
Wii U clock speeds found by marcan

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
At least it's HD now.

I'm interested to see how rising development costs for next gen impact most devs. Not everyone can afford to dump even more money into even better looking games on more powerful hardware.

The Wii U may be weaker than anyone wanted, but maybe it doesn't have to be more than it is. Graphics in current-gen games can stay good enough for years yet.

And this was clearly what they were shooting for.

It's a huge gamble. It's not a popular choice on this board, but that doesn't really matter.

Only time will tell if it is a popular choice with Average Joe Consumer.
 
Any game that can do three-or-more-player split screen will comfortably run two GamePads... I don't think the CPU really comes into split-screen support much.
 

wsippel

Banned
True, but there have been reports that it's terribly slow, which makes people jump to the worst possible conclusion: that it's a three-way SMP-enabled, higher-clocked Broadway on a 45 nm process. Considering the jump from Gekko to Broadway was basically a 50% clock boost with a die shrink... it's not hard to imagine this being the case.
Well, it has no vector units for starters. And the clock speed is pretty low. So yeah, it's probably pretty slow for certain workloads like physics or crowd AI. Unless developers manage to move those workloads to the GPU.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
Shin'en strikes again.

"The CPU and GPU are a good match. As said before, today’s hardware has bottlenecks with memory throughput when you don’t care about your coding style and data layout. This is true for any hardware and can’t be only cured by throwing more megahertz and cores on it. Fortunately Nintendo made very wise choices for cache layout, ram latency and ram size to work against these pitfalls. Also Nintendo took care that other components like the Wii U GamePad screen streaming, or the built-in camera don’t put a burden on the CPU or GPU."

You hear that? Stop focusing on MHz. Focus on the architecture.
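For what it's worth, the data-layout point in that quote is concrete and not vendor-specific. Here's a minimal sketch of what it means, assuming nothing about the Wii U SDK (generic C++, purely illustrative): the same particle update can be memory-bound or cache-friendly depending entirely on how the data is laid out.

```cpp
// Illustration only -- generic C++, not Wii U SDK code.
#include <cstddef>
#include <vector>

// Array-of-structs: updating positions also drags every particle's unrelated
// fields (colour, lifetime, ...) through the cache.
struct ParticleAoS {
    float x, y, z;
    float vx, vy, vz;
    float r, g, b, a;
    float lifetime;
};

void update_aos(std::vector<ParticleAoS>& ps, float dt) {
    for (ParticleAoS& p : ps) {
        p.x += p.vx * dt;
        p.y += p.vy * dt;
        p.z += p.vz * dt;
    }
}

// Struct-of-arrays: the same update touches only the arrays it needs, so each
// cache line fetched is full of useful data and memory throughput goes further.
struct ParticlesSoA {
    std::vector<float> x, y, z;
    std::vector<float> vx, vy, vz;
    // colour, lifetime, etc. live in their own arrays and stay untouched here
};

void update_soa(ParticlesSoA& ps, float dt) {
    const std::size_t n = ps.x.size();
    for (std::size_t i = 0; i < n; ++i) {
        ps.x[i] += ps.vx[i] * dt;
        ps.y[i] += ps.vy[i] * dt;
        ps.z[i] += ps.vz[i] * dt;
    }
}
```

That's the kind of thing "cache layout, RAM latency and RAM size" decisions interact with, and it's independent of clock speed.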

Yeah, I'm not really going to listen to the words of a dev who has never worked on anything other than Nintendo hardware. Compared to a 3DS or a Wii, a Wii U must seem like arcane magic to them.
 

Erasus

Member
Why are people comparing clock speeds without any underlying knowledge of the architecture? Clock speed doesn't mean much unless we're given a reference performance-per-clock metric from a similar chip on the market.

e.g. a 1.8GHz Core 2 Duo is a lot faster than a dual-core 3.4GHz Pentium 4.

I'm not suggesting this is the case with the Wii U, but making judgements based on clock speed alone makes no sense.

But it's (probably) not a whole new architecture like the Core arch! If it was, then no problem; however, it is backwards compatible, and not like the PS3, where backwards compatibility came from an extra chip.

It has to be based on three Broadway cores with more clock speed, more cache and some better instruction sets. Still, at its base it's the 750 arch! It can't be a whole new radical arch like Core vs. NetBurst.

Then, as some have said, 1.24GHz might be the OS/idle speed. It should clock down when it's doing nothing. We don't know if he was stressing the CPU when he measured it.
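Either way, the quoted point about clock speed versus per-clock work is easy to put rough numbers on. The IPC figures below are invented purely for illustration; only the shape of the comparison matters:

effective throughput ≈ clock speed × instructions per cycle (IPC)
1.8 GHz × ~2.0 IPC ≈ 3.6 billion instructions/s (Core 2-class core, illustrative)
3.4 GHz × ~0.9 IPC ≈ 3.1 billion instructions/s (NetBurst-class core, illustrative)

The lower-clocked chip comes out ahead; the same caveat applies to any 1.24GHz-versus-3.2GHz comparison made on clocks alone.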
 

beril

Member
So, all in all, the Wii U CPU should be roughly on par with the 360's CPU for general-purpose use (due to out-of-order execution and other benefits), but it will fall behind on physics/AI? That seems to be a summary of everything said. The GPU is obviously better than the 360's, but will they have to offload physics/AI to it to take strain off the CPU, thus negating a lot of the GPU's advantage?

Gaming AI is about as general-purpose as you get.
 

v1oz

Member
Which is why it's not what was in a G3 MacBook. Re-read what I posted.
You said the architecture was used in old Apple laptops. And no, it's not very refined; developers who are used to coding for embedded systems like consoles have already come out and said the CPU is weak.
 

JordanN

Banned
Yeah, I'm not really going to listen to the words of a dev who has never worked on anything other than Nintendo hardware. Compared to a 3DS or a Wii, a Wii U must seem like arcane magic to them.
That's a shame. Working on Nintendo hardware doesn't mean you're clueless about the outside world.
 

Doc Holliday

SPOILER: Columbus finds America
I can't believe people are surprised by this. Never be surprised by Nintendo's ability to gimp or cheap out on hardware. They just don't care what MS/Sony are doing hardware-wise. They do what works for them, unfortunately.

It's going to take a massive failure for them to change their hardware choices. Wii and DS proved they could get away with crappy hardware.

One question I have: does producing an old design actually cost more at this point? How does a chip factory work? I mean, do the guys at TSMC or whatever go "shit, we don't even have the parts to make that shit anymore" ;)
 
The empirical evidence shows that the Wii U can run late, maxed-out PS360 games, at least as well as the PS3, at launch. This wouldn't be possible with such an underpowered CPU. We've had rumours that the CPU was on par with the 360's or marginally weaker. Somehow they managed to circumvent the slow clock speed with a different architecture, dedicated silicon, eDRAM or whatever. For some reason that is still beyond me, they wanted extremely low power draw, and thus a very low clock speed makes sense. I guess they spent a lot of R&D money on finding/developing something that could get decent performance at such a low clock speed, in order to give more juice to the GPU.

What I've always thought/said is that Nintendo had to sacrifice a lot of performance and money for a small box and low power consumption. I for one would have preferred a slightly bigger, slightly more power-hungry box, and I'm a freakin' environmental science student!
 

Lonely1

Unconfirmed Member
Yeah, that CPU isn't a powerhouse. Solid arguments can be made that it isn't good. However, saying that it's ~2.5x slower than Xenon is a gross oversimplification and far from true.

Yeah, I'm not really going to listen to the words of a dev who has never worked on anything other than Nintendo hardware. Compared to a 3DS or a Wii, a Wii U must seem like arcane magic to them.

I chose them over you.

One question I have: does producing an old design actually cost more at this point? How does a chip factory work? I mean, do the guys at TSMC or whatever go "shit, we don't even have the parts to make that shit anymore" ;)

It's a new design. Just not a powerful one.
 

Oddduck

Member
The console might have, but I thought it was stated that the controller was expensive. The tech it uses is relatively new according to the article released the week before launch.

They're not making a profit yet.

Article makes sense when you consider all of the other costs outside of just buying parts.

I bet you if we could get numbers from a very concrete/solid source, we'd find out that the parts/components inside of an Xbox 360 (in 2012) cost around $100 or less.
 

LeleSocho

Banned
Shin'en strikes again.

Good lord, stop quoting Shin'en. You totally can't trust them on Nintendo hardware; they're basically an unofficial second party. They've only made games on Nintendo consoles, and they have a good relationship with Nintendo that they obviously don't want to cut... they will never, ever say anything bad about Nintendo, because it's the reason they exist.
It's like asking Rare if they think the 360 is the best hardware; they will never say no.
 

defferoo

Member
Well, it has no vector units for starters. And the clock speed is pretty low. So yeah, it's probably pretty slow for certain workloads like physics or crowd AI. Unless developers manage to move those workloads to the GPU.

Hopefully Nintendo's SDK makes it easy for devs to offload vector-heavy CPU computations to the GPU.
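We don't know what Nintendo's SDK actually exposes for this, but the kind of work being discussed is easy to characterise: wide, independent, math-dense loops with uniform control flow. A hedged sketch in generic C++ (not console SDK code); on a GPU, each loop iteration would become one thread of a compute kernel:

```cpp
// Hypothetical example of a "vector-heavy" workload: every element is
// independent and the control flow is uniform, which is what maps well to
// GPU compute (or to SIMD units, which this CPU reportedly lacks).
#include <cstddef>

struct Body { float x, y, z, vx, vy, vz; };

void integrate_bodies(Body* bodies, std::size_t n,
                      float gx, float gy, float gz, float dt) {
    for (std::size_t i = 0; i < n; ++i) {   // on a GPU: one thread per body
        bodies[i].vx += gx * dt;
        bodies[i].vy += gy * dt;
        bodies[i].vz += gz * dt;
        bodies[i].x  += bodies[i].vx * dt;
        bodies[i].y  += bodies[i].vy * dt;
        bodies[i].z  += bodies[i].vz * dt;
    }
}
```

Whether the Wii U tools make that migration easy is exactly the open question.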
 

DonMigs85

Member
My old eMac had a 1.25 GHz G4 processor. It has AltiVec, so it's probably more powerful than one of these cores, right?
 

JordanN

Banned
Good lord, stop quoting Shin'en. You totally can't trust them on Nintendo hardware; they're basically an unofficial second party. They've only made games on Nintendo consoles, and they have a good relationship with Nintendo that they obviously don't want to cut... they will never, ever say anything bad about Nintendo, because it's the reason they exist.
It's like asking Rare if they think the 360 is the best hardware; they will never say no.
This is ridiculous hyperbole full of unfounded assumptions.

Yeah, 'cause Iwata is going to pull out an Uzi on anyone who doesn't worship his console. @_@
 
Yeah, I'm not really going to listen to the words of a dev who has never worked on anything other than Nintendo hardware. Compared to a 3DS or a Wii, a Wii U must seem like arcane magic to them.
So by that logic, devs like Naughty Dog don't know what they're doing or talking about when it comes to the PS3.
 

Busty

Banned
It's like Nintendo tried to make the system underpowered. On the other hand, they wouldn't get next-gen ports anyway, so why bother?

This was my first thought as well.

I'm sure it wouldn't make a difference to Nintendo's first-party teams, and third-party support will become an afterthought once the 'next gen' systems are released.

Same old, same old.
 

kinggroin

Banned
Kenka mentioned me a few pages back, so I might as well give my two cents.

First, it's worth keeping in mind that the general expectation until very recently was a CPU around 2GHz (many estimates around the 1.8GHz mark) and a GPU 500MHz or under (my guess was 480MHz).

The main take-home from the real clock speeds (higher clocked GPU than expected, lower clocked CPU than expected) is that the console is even more GPU-centric than expected. And, from the sheer die size difference between the CPU and GPU, we already knew it was going to be seriously GPU centric.

Basically, Nintendo's philosophy with the Wii U hardware is to have all Gflop-limited code (i.e. code that consists largely of raw computational grunt work, like physics) offloaded to the GPU, and keep the CPU dedicated to latency-limited code like AI. The reason for this is simply that GPUs offer much better Gflop-per-watt and Gflop-per-mm² characteristics, and when you've got a finite budget and thermal envelope, these things are important (even to MS and Sony, although their budgets and thermal envelopes may be much higher). With out-of-order execution, a short pipeline and a large cache, the CPU should be well-suited to handling latency-limited code, and I wouldn't be surprised if it could actually handle pathfinding routines significantly better than Xenon or Cell (even with the much lower clock speed). Of course, if you were to try to run physics code on the Wii U's CPU it would likely get trounced, but that's not how the console's designed to operate.

The thing is that, by all indications, MS and Sony's next consoles will operate on the same principle. The same factors of GPUs being better than CPUs at many tasks these days applies to them, and it looks like they'll combine Jaguar CPUs (which would be very similar to Wii U's CPU in performance, although clocked higher) with big beefy GPUs (obviously much more powerful than Wii U's).
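To make the Gflop-limited versus latency-limited split concrete, here's an illustrative sketch (generic C++, nothing Wii U-specific). A pathfinding-style graph walk does almost no floating-point work; its speed is set by cache misses on scattered nodes and by data-dependent branches, which is where out-of-order execution, a short pipeline and a large cache pay off far more than clock speed or FLOPs:

```cpp
// Illustration only: a "latency-limited" workload in the sense used above.
#include <cstdint>
#include <queue>
#include <unordered_set>
#include <vector>

struct NavNode {
    std::uint32_t id;
    bool walkable;
    std::vector<NavNode*> neighbours;   // scattered in memory -> cache misses
};

// Breadth-first reachability check, the skeleton of many pathfinding queries.
bool reachable(NavNode* start, NavNode* goal) {
    std::queue<NavNode*> frontier;
    std::unordered_set<std::uint32_t> visited;
    frontier.push(start);
    visited.insert(start->id);

    while (!frontier.empty()) {
        NavNode* node = frontier.front();
        frontier.pop();
        if (node == goal) return true;
        for (NavNode* next : node->neighbours) {            // pointer chase
            if (next->walkable && visited.insert(next->id).second) {
                frontier.push(next);                         // data-dependent branch
            }
        }
    }
    return false;
}
```

Physics integration is the opposite shape (dense, predictable, arithmetic-bound), which is why it's the natural candidate for the GPU in this design.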


Thanks for taking the time to add some level-headed analysis to the discussion.

With the direction all three next gen platforms seem to be moving in, do you think this will have an effect on PC ports? Will the performance difference between AMD and Intel CPUs be mitigated somewhat by the move to more GPU centric workloads and engines?
 
Oh, I know, but why would costs "skyrocket" then? AI? Story? Physics? Animation? It's still a big part.

To take advantage of greater hardware, more has to be done than simply putting a shiny coat of paint on current-gen games (though I have no doubt there is a large portion that would be content with this). Yes, AI, physics, animation, etc. can all be "toned down" price-wise with good middleware, but middleware can't help with level design. Levels/zones/etc. that could be done in a few months may take additional weeks or months to accommodate and test those new features.


For example, let's say you're playing an FPS. The scene calls for a nearby building to be hit by a rocket and collapse. This gen, that building collapse would be 100% pre-determined. That takes a lot of work to do, and next-gen middleware will exist to handle building collapses in real time. Great! That saves time and effort! But there is a drawback... now it isn't predetermined where those building fragments will fall, so you have to spend extra time making rules to ensure the building doesn't come down in a way that negatively impacts the person playing. That is, nothing blocks the player's path or lands on them when it shouldn't.

This could be handled by level design, or by altering the physics just enough that nothing should impact the player. Overall you may save time, and thus money, going this route... but there is a danger. When you make an effect "easier", there is a tendency to use it more, especially in a genre like FPS where spectacle is king! In the current gen you may only have had the time to spare to make one really good close-up building explode, but with next gen you can do it dozens of times! In fact, you hear that a competitor is making an FPS that does it 50 or more times!

You can't let them have the upper hand, so you, of course, design 51 buildings to explode and thus have to design your levels even more intricately.

This is why dev costs will skyrocket... not because the individual effects will be more costly than current gen, but because you have to do MORE to compete, and now that there isn't such a rigid limitation, devs will be more or less forced to.
 
Isn't the point that it's based on the PPC 750 architecture? That doesn't mean it has to be identical to past releases.

The implication people are making is that it IS identical to past releases, when in fact it obviously isn't. These DURRR 10 YEAR OLD ARCHITECTURE statements are disingenuous.

Gekko could only process 32-bit integers; before Gekko, plain PPC 7xx had no SIMD extensions for game-relevant math... Broadway's SIMD solution was no AltiVec/VMX. We don't know what this chip does in that area at all, but we do know it's OoOE. We don't really know what these cores provision, or what evolutions have taken place. We know a likely clock speed now, and we know that it is fabricated on a smaller process than Broadway, that there are three cores, and that there's L2 eDRAM. We know it's helped out by an ARM coprocessor, possibly acting as security and/or a DSP or something. In any case, there are significant deviations from previous PPC 7xx relatives that we know of. (A rough sketch of that SIMD width difference is at the end of this post.)

Beyond woeful. I can't even begin to fathom how it's possible to create a console with such vastly inferior hardware to one from 2005.

Statements like this are wilfully ignorant and stupid. I don't think anyone will be surprised by that particular post though.

Thank god we have posters like Thraktor too.
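On the SIMD point a few paragraphs up: AltiVec/VMX (as on Xenon) operates on four floats per instruction, while Gekko/Broadway-style paired singles handle two, and we don't know what the Wii U CPU provides. A rough sketch of that width difference; the AltiVec half assumes a PowerPC toolchain with AltiVec enabled, and since paired singles have no portable intrinsics, that half is written as plain scalar pairs:

```cpp
// Rough illustration of SIMD width only -- not actual Wii/Wii U code.
#include <altivec.h>   // requires a PowerPC compiler with AltiVec support

void add_altivec(const float* a, const float* b, float* out, int n) {
    for (int i = 0; i + 4 <= n; i += 4) {      // 4 lanes per vec_add
        vector float va = vec_ld(0, a + i);    // assumes 16-byte alignment
        vector float vb = vec_ld(0, b + i);
        vec_st(vec_add(va, vb), 0, out + i);
    }
}

void add_paired_single_style(const float* a, const float* b, float* out, int n) {
    for (int i = 0; i + 2 <= n; i += 2) {      // 2 lanes per "ps" operation
        out[i]     = a[i]     + b[i];
        out[i + 1] = a[i + 1] + b[i + 1];
    }
}
```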
 

ASIS

Member
I can't believe people are surprised by this. Never be surprised by Nintendo's ability to gimp or cheap out on hardware. They just don't care what MS/Sony are doing hardware-wise. They do what works for them, unfortunately.

It's going to take a massive failure for them to change their hardware choices. Wii and DS proved they could get away with crappy hardware.

One question I have: does producing an old design actually cost more at this point? How does a chip factory work? I mean, do the guys at TSMC or whatever go "shit, we don't even have the parts to make that shit anymore" ;)

Wait, what? The Wii I understand, but the DS wasn't gimped in hardware.
 

Akkad

Banned
I can't believe people are surprised by this. Never be surprised by Nintendo's ability to gimp or cheap out on hardware. They just don't care what MS/Sony are doing hardware-wise. They do what works for them, unfortunately.

Pretty much. I actually was a little interested in the Wii U, but then I remembered who Nintendo are.
 
So the cat's now (mostly) out of the bag. Hopefully now we can have some real discussion from devs about how these decisions will impact game development.

This type of design is a fairly radical departure from the HD twins' CPU-centric design, and if the other next-gen consoles are not GPU-centric, that will cause many of the same issues we saw this gen with multiplatform games.
 
Thanks for taking the time to add some level-headed analysis to the discussion.

With the direction all three next gen platforms seem to be moving in, do you think this will have an effect on PC ports? Will the performance difference between AMD and Intel CPUs be mitigated somewhat by the move to more GPU centric workloads and engines?

The performance difference between CPUs in PC games is generally less than 5%. Offloading more work to GPUs is unlikely to change that.
 
Nintendo have had crap third-party support since the mid '90s (and treated third parties like dirt prior to that); multiplatform games are a distant second or third priority as far as they're concerned - always have been, always will be. And yet they still produce the most profitable and critically acclaimed products in the industry. Why on Earth would anyone expect this system to be any different? It's a Nintendo system for Nintendo games; ports are fillers for the launch lineup - it's the exclusives which will define the system. Anyone with any experience of a Nintendo system knows what they're about.

Wake up.

This is naive in hindsight, but it might have something to do with over a year of statements from Iwata, Reggie et al. to the effect that they regret that Wii was primarily seen as a casual system, that they intend to do a better job of cultivating third-party support, that they wanted it to be a system equally for core and casual players, etc.

I was never even close to convinced, mind you, that Nintendo would actually succeed at winning their own significant slice of the HD core pie - that would require excelling in too many areas that are well outside Nintendo's historical competencies - but until E3 this year, I believed they were at least making something vaguely resembling a real effort. Well, I learned my lesson!
 

Gueras

Banned
I don't care... I just wanted Nintendo games in HD on a console, and now I have that.

That's enough; for multiplats I will have an X720.
 
The clock speed on its own means nothing when you compare different types of chips. The speed in this instance does not directly equate to x86 processor clock speeds.

Case in point: the Acorn Archimedes had an 8MHz clock speed. However, it totally smoked the Amiga in many ways and was the far superior machine - it was just using a different architecture (ARM).
 
Thanks for taking the time to add some level-headed analysis to the discussion.

With the direction all three next gen platforms seem to be moving in, do you think this will have an effect on PC ports? Will the performance difference between AMD and Intel CPUs be mitigated somewhat by the move to more GPU centric workloads and engines?
PC development has been going GPU-centric for a while, so I can only see this getting better for PCs down the road.
 

LCGeek

formerly sane
The performance difference between CPUs in PC games is generally less than 5%. Offloading more work to GPUs is unlikely to change that.

That exists because people can't take advantage of things that have happened since the last generations. We know for a fact that i7s are much better than i5s because of how they perform with stuff like encoding, but game tech doesn't make use of the CPU the same way.

Games need to grow up themselves. We waste far too much for the results we want.
 

Shion

Member
Good lord, stop quoting Shin'en. You totally can't trust them on Nintendo hardware; they're basically an unofficial second party. They've only made games on Nintendo consoles, and they have a good relationship with Nintendo that they obviously don't want to cut... they will never, ever say anything bad about Nintendo, because it's the reason they exist.
It's like asking Rare if they think the 360 is the best hardware; they will never say no.
Not only that, but it's not like Shin'en works on huge, ambitious, CPU-demanding games like Skyrim. I'm sure Shin'en is more than fine with what the Wii U offers in terms of performance.
 
This is naive in hindsight, but it might have something to do with over a year of statements from Iwata, Reggie et al. to the effect that they regret that Wii was primarily seen as a casual system, that they intend to do a better job of cultivating third-party support, that they wanted it to be a system equally for core and casual players, etc.

I was never even close to convinced, mind you, that Nintendo would actually succeed at winning their own significant slice of the HD core pie - that would require excelling in too many areas that are well outside Nintendo's historical competencies - but until E3 this year, I believed they were at least making something vaguely resembling a real effort. Well, I learned my lesson!

I thought the same. Gotta say I was fooled, so well done, Nintendo.
 

LCGeek

formerly sane
Not only that, but it's not like Shin'en works on huge, ambitious, CPU-demanding games like Skyrim. I'm sure Shin'en is more than fine with what the Wii U offers in terms of performance.

We could talk about Skyrim's own performance issues relative to other games that look just as good, if not better, and give more FPS. Bethesda is the last company to hold up as an example, especially when those modding their products often make their games better than they would be if they stayed vanilla.
 

superbank

The definition of front-butt.
Good lord, stop quoting Shin'en. You totally can't trust them on Nintendo hardware; they're basically an unofficial second party. They've only made games on Nintendo consoles, and they have a good relationship with Nintendo that they obviously don't want to cut... they will never, ever say anything bad about Nintendo, because it's the reason they exist.
It's like asking Rare if they think the 360 is the best hardware; they will never say no.
I think the point is Shin'en are one of the only developers who actually knew what they were doing on the Wii and actually put effort into making graphical showcases. So we should probably listen when they talk about technical stuff, because they understand that type of architecture best, while the other dev that said the Wii U is horrible had a game already built for a different type of architecture and tried to port it.
 
And this was clearly what they were shooting for.

It's a huge gamble. It's not a popular choice on this board, but that doesn't really matter.

Only time will tell if it is a popular choice with Average Joe Consumer.

Maybe all they want is iPad ports, Call of Duty, and games like that, and nothing more.
 

Akkad

Banned
Yeah, it's not like there is a primary reason we are buying these systems. The hyperbole in this thread is worse than the shitty Wii U CPU.

I don't like Nintendo franchises or characters. The only reason I was gonna get it was Bayonetta, but I don't buy consoles for one game, and knowing that third-party games won't be superior on the Wii U made my decision easier.
 
So w-we...we won't get this?? :(

[image: Zelda Wii U tech demo screenshot]
 
This is naive in hindsight, but it might have something to do with over a year of statements from Iwata, Reggie et al. to the effect that they regret that Wii was primarily seen as a casual system, that they intend to do a better job of cultivating third-party support, that they wanted it to be a system equally for core and casual players, etc.

I was never even close to convinced, mind you, that Nintendo would actually succeed at winning their own significant slice of the HD core pie - that would require excelling in too many areas that are well outside Nintendo's historical competencies - but until E3 this year, I believed they were at least making something vaguely resembling a real effort. Well, I learned my lesson!


The scary thing is what happens if the Wii U's lack of third-party announcements really is just a complete lack of games coming.

[image: Zelda Wii U tech demo screenshot]


It's not like the Wii U's CPU will stop it from running anything.
 

LeleSocho

Banned
This is ridiculous hyperbole full of unfounded assumptions.

Yeah, 'cause Iwata is going to pull out an Uzi on anyone who doesn't worship his console. @_@

Please don't act stupid. In all their existence, Shin'en have only released games on Nintendo consoles, and obviously it's their only source of money; the last thing they want is to make Nintendo unhappy or to have Nintendo think badly of them. It is obvious that on Nintendo-related matters they are not objective and so cannot be trusted.
 