
Wii U clock speeds are found by marcan

Orayn

Member
Can we settle the generation argument by agreeing that game systems come out in batches?

The Dreamcast, GBA, PS2, Gamecube, and Xbox were the 1998-2001 batch.

The Nintendo DS, PSP, Xbox 360, Wii, and PS3 are the 2004-2006 batch.

The 3DS, Vita, Wii U, Durango, and Orbis will all be part of the 2011-2013 batch.
 

Panajev2001a

GAF's Pleasant Genius
Yes it is! It was also the GameCube CPU, it seems...
Really, I can't see how they ended up here... how they could lose money on this, or how they couldn't find cheaper and better solutions.
This GPU had better be something powerful.

How are they losing money on this? I think they have trouble securing low prices from their suppliers, so it is probably a supply chain issue too. The opposite of what Apple manages to achieve by spending billions of dollars each year on manufacturer agreements, perhaps.
 

Drek

Member
I normally don't participate in these technical discussions, so excuse my lack of knowledge on this, but what do we know about the (GP)GPU? From what I've seen/heard, it's the most advanced thing about the Wii U. Could it be possible that it more than makes up for the relatively slow processor, which would explain why current gen games can still run fine on the Wii U?

Edit: Also, could the bad ports be attributed to the fact that devs weren't fully utilizing the GPU, instead using the CPU that they're used to using on PS360?

I'd have a hard time imagining any professional programmer wouldn't think to gobble up a few extra GPU cycles to pad CPU inefficiencies, even on a quick port.

Fact is, there is only so much you can shove off to the GPU. CPU-intensive processes can't universally be handed off to the GPU in a 1:1 ratio. The failings of some ports, most notably Batman: Arkham City, are likely due to how CPU-dependent the game engine is.
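A rough way to picture why offloading only goes so far is a back-of-the-envelope Amdahl's-law sketch in Python; the fractions and speedups below are made-up illustrations, not profiles of any real engine:

def offload_speedup(offloadable_fraction, gpu_speedup):
    # Overall frame speedup if only part of the CPU work can move to the GPU.
    cpu_part = 1.0 - offloadable_fraction           # work that must stay on the CPU
    gpu_part = offloadable_fraction / gpu_speedup   # offloaded work, now running faster
    return 1.0 / (cpu_part + gpu_part)

# Even with a GPU that runs the offloaded work 10x faster, moving 30% of the
# frame's CPU work over only buys about a 1.4x overall gain.
for frac in (0.1, 0.3, 0.5):
    print(f"offload {frac:.0%}: {offload_speedup(frac, 10.0):.2f}x overall")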

What this all really boils down to: unless 3rd parties go above and beyond even a 360-to-PS3 port for Wii U releases, multi-plat titles started on the 360 will suffer on Wii U at least as badly as they do on the PS3. Generally that won't be a deal breaker, but you aren't getting the latest and greatest version of ports on the Wii U.

It also makes clear that the delusion that PS4/Xbox 3 games can and will be ported to the Wii U should just stop. It isn't going to happen. It will be very similar to the Wii vs. PS3/360/PC "ports": watered-down, different-engine releases.

The real selling point of the Wii U - Nintendo first party releases and select 3rd party exclusives - will continue to be the same elite software offerings we've come to expect. I'd bet that Bayo2 will blow anything from the PS3/360 generation out of the water. Devs who commit to building their game around the Wii U's architecture will have a very strong environment to work in relative to the PS3/360 era.

But then this isn't really news, just confirmation of what we could see developing for the vast majority of the Wii U's pre-release period. It is another iteration in the Nintendo walled garden hardware series, and it will likely work out quite well for Nintendo specifically. It is a hardcore Nintendo fan's first purchase, the 'family friendly' alternative for casuals, and the first "2nd system" most gamers will be tempted to buy. That strategy did just fine for the Wii. Hardware limitations won't hamper the Wii U's marketability. The appeal of the tablet, the higher MSRP than the Wii, and what the early software lineup offers will determine its future.
 

Kenka

Member
In the MHz myth, instructions go down the pipe at the same speed? But the P4 is supposed to execute instructions more than twice as fast. Is there a mistake in the presentation, or have I missed a point?
 

batbeg

Member
Just watched Epic Mickey 2, and the difference between the X360 and the Wii U is night and day!

The Wii U version felt so lifeless...

Is Epic Mickey 2 really going to be your measurement of performance between those two consoles...?

Edit: silly typo
 
I love my Wii U, but can we now honestly say it's a current gen system? A somewhat weaker current gen system, at that. That won't stop me from playing it though. HD Mario, Zeldas, Metroids, etc. make me a happy, happy camper. More so if more games like ZombiU keep coming out for it too.

It's producing ports that are better than the PS3 counterparts with a much lower clock speed on the CPU. It's current gen, but certainly not next gen like some were hoping for.
 
Can we settle the generation argument by agreeing that they come out in batches?

I propose that we just stop engaging in the argument altogether. When Person A calls it a next-gen console based on the traditional usage of what a generation means, we know what they're saying. It may not have what's expected of next-gen hardware, but that doesn't need to be called out. Likewise, when Person B asserts that it's a last-gen machine because of the tech inside, one understands what they're getting at -- they're criticizing the hardware. Correcting the assertion by clarifying that a generation refers to a timeframe as opposed to the underlying tech merely obfuscates the issue by sidestepping a perhaps fair critique of the hardware.
 

tsab

Member
Marcan probably measured it while in idle mode.
Or the machine has 2 power profiles.

Come on, there are fanless ultra-slim laptops out there with better specs/performance.

I refuse to believe this
 

SmokyDave

Member
But... most of the top selling 360 games don't even have HD graphics... :p

I wish I could be arsed to check the top 10 sellers and their resolutions because I reckon you're probably wrong.

Having said that, you could replace 'HD' with 'flashy' in the original post and it'd still make sense.
 

dwu8991

Banned
We also spoke to several developers, all of whom asked to remain anonymous. They all mentioned the very decent GPU and the comparatively generous 2GB of video RAM, which is four times the amount available in the original Xbox 360 (but likely to be half of what we get in the next-gen PlayStation and Xbox). However, they also had concerns about the under-powered CPU, some even questioning whether it will have the capacity to drive two GamePads simultaneously.

http://www.guardian.co.uk/technology/gamesblog/2012/nov/29/wii-u-essential-guide
 

FACE

Banned
[Image: Oles Shishkovtsov]


"Told ya"
 

-KRS-

Member
I must say, even as someone who mostly sticks to Nintendo systems, it's pretty funny reading people here trying to justify this.
 

McHuj

Member
In the MHz myth, instructions go down the pipe at the same speed? But the P4 is supposed to execute instructions more than twice as fast. Is there a mistake in the presentation, or have I missed a point?

Depends on the architecture.

In some architectures, some instructions have dedicated pipelines. In some very simple architectures, all instructions go through the same number of pipeline stages.
 

LeleSocho

Banned
In the MHz myth, instructions go down the pipe at the same speed? But the P4 is supposed to execute instructions more than twice as fast. Is there a mistake in the presentation, or have I missed a point?

It was presented like that to make the concept clear, but yes, the P4 should've gone twice as fast.
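To put rough numbers on that: effective throughput is roughly clock × instructions-per-cycle, which is why a deep-pipelined P4 doesn't end up twice as fast despite twice the clock. A minimal sketch, with IPC figures that are made-up assumptions purely for illustration:

def bips(clock_ghz, ipc):
    # Billions of instructions retired per second, roughly clock * IPC.
    return clock_ghz * ipc

shallow = bips(clock_ghz=1.4, ipc=1.5)   # short pipeline, higher IPC (assumed)
deep    = bips(clock_ghz=2.8, ipc=0.9)   # deep pipeline, lower IPC (assumed)

print(f"shallow-pipeline chip: {shallow:.2f} BIPS")
print(f"deep-pipeline chip:    {deep:.2f} BIPS")   # double the clock, only ~20% faster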
 
I haven't been able to read this thread's posts, but my first impressions were:

1) Matt and Lherre weren't kidding when they said the in-sync clocks shouldn't be looked at too closely. They look pretty random, at least at first sight.

2) Nintendo truly did play with the CPU clock to get a number like that. If I had to guess, they wanted to downclock the CPU as much as they could get away with so they could use a relatively powerful GPU.
 

Orayn

Member
I must say, even as someone who mostly sticks to Nintendo systems, it's pretty funny reading people here trying to justify this.

Most aren't saying it'll compete with Durango/Orbis, just clarifying what the clockspeed does and doesn't mean.
 
Much lower than I thought.

But the news I'm most excited for is the Wii U being vulnerable. The original Wii is the greatest homebrew machine ever, and the Wii U can be so much more with that GamePad and additional processing power.
 

v1oz

Member
This makes sense though, the PPC7xx series was great for bang-for-buck wattage even at 90nm and in a single core design - hence Apple using them in old Powerbooks etc. A more refined, modern take on it might have fitted right into their plans to make a small 30-70w box.

Anyone know if IBM have any other PPC descended chips that this might be a close relative of?
It doesn't. The G3 PowerBooks are ten years old.
 

beril

Member
Found the source for what I was talking about: http://gbatemp.net/threads/retroarch-a-new-multi-system-emulator.333126/page-7
I believe if you program only against one main CPU (like we do for pretty much most emus), you would find that the PS3/Xenon CPUs in practice are only about 20% faster than the Wii CPU.

I've ported the same code over to enough platforms by now to state this with confidence - the PS3 and 360 at 3.2GHz are only (at best - I would stress) 20% faster than the 729MHz out-of-order Wii CPU without multithreading (and multithreading isn't a be-all end-all solution and isn't a 'one size fits all' magic wand either). That's pretty pathetic considering the vast difference in clock speed, the increase in L2/L1 cache and other things considered - even for in-order CPUs, they shouldn't be this abysmally slow and should be totally leaving the Wii in the dust by at least a 50-70% difference - but they don't.

Can't say if it's true or not; I've mostly done high-level gameplay code on PS3 & 360 and only a tiny bit of homebrew coding on the Wii. If it is, the Wii U's clock should put it a notch above Xenon's cores. However, there's still only one thread per core.
 
We also spoke to several developers, all of whom asked to remain anonymous. They all mentioned the very decent GPU and the comparatively generous 2GB of video RAM, which is four times the amount available in the original Xbox 360 (but likely to be half of what we get in the next-gen PlayStation and Xbox). However, they also had concerns about the under-powered CPU, some even questioning whether it will have the capacity to drive two GamePads simultaneously.

http://www.guardian.co.uk/technology/gamesblog/2012/nov/29/wii-u-essential-guide

It won't run two at once without major changes.
Even if it could, Nintendo is not in a position to sell the pad at an affordable rate.

If I were a retailer, I would just not stock the things.
We all need to forget about multiple gamepads; from internal specs to simple market conditions, it's not happening.

Wii U 2 can add 4-player functionality or something. Wii U just can't, and we've frankly known that since NoA spouted the bollocks at E3.
 

nib95

Banned
So... is the processor in my Galaxy S3 (international) faster than that thing, or is that pushing it too far?
 

Erasus

Member
Everyone on Neogaf needs to watch it.

You don't measure CPU power just by clock speeds.

I (and many others) know this. You can't compare the 3.2GHz PPC tri-core Xenon to the Wii U CPU!

HOWEVER, the Wii U cores are still based on PPC 750/Broadway cores due to backwards compatibility. The 750 arch came out in 1997, Broadway in 2005?
Can the Wii U CPU be seen as a whole new architecture? How many improvements have been made?

CPU-heavy stuff like AC3 already has trouble running...
Early port, yes, but I don't think it's a case like the early PS3 ports where devs would shove it all onto the PPE. I think they are using all 3 cores, as there can't be that much of a difference between those and the Wii CPU, or even the 360 CPU.

1.24GHz is still a very low clock; architecture improvements can do a lot, sure, I agree. And if it were a whole new PPC CPU arch, like the 9xx, I would not be worried. But they have to ensure backwards compat...
Maybe they took a lot of improvements from a newer PPC series, but at its base it's still 3 Broadway cores at a low clock.

GPU and RAM size seem good though; hope they have a good GPU compute toolchain so devs can use GPGPU for CPU calcs.
 

wsippel

Banned
So, it's still based on PPC 750 cores???

http://en.wikipedia.org/wiki/PowerPC_7xx#PowerPC_750CL

Why not go 9xx? Not backwards compatible?

And is 1.24GHz the idle speed, or does it run that low in games too? Then it's pretty amazing how stuff like AC3 even runs.

The GPU is good though, as the architecture is newer, but a CPU architecture from 2001-2002...

Still, power consumption is impressive, but man, would it have killed them to up the GHz a bit and have it draw 50 watts instead of 33?
No off-the-shelf PowerPC is 750CXe/CL compatible. And Espresso is not necessarily a PPC750 either. But whatever it is, it uses the same ISA, or a superset of the same ISA. That's it. It can't be a PPC750, as there never was an SMP-capable PPC750, there never was a 45nm PPC750, and there never was a PPC750 with eDRAM L2.
 

PaNaMa

Banned
ITT: people who don't understand that clock speed isn't as important a factor in modern CPU/GPU architecture

I expected speeds right around this ballpark; with OoOE and the chunk of eDRAM, it should still serve up some great-looking software down the line.

Just cool to have the number so we don't have to speculate anymore

Tell that to Planetside 2! They have this thing implemented in-game where it shows your framerate, but also displays the letters "GPU" or "CPU" alongside the fps number.
Scenery-type stuff is GPU, but once you reach big battles it's all CPU.

CPU is almost always the limiter on my screen, and I'm running an i7-2600K @ 4.2 GHz.
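For what it's worth, that kind of overlay is basically just reporting which side of the frame took longer. A hedged sketch of the idea (the timings here are fabricated for illustration; a real engine would read the GPU side from timer queries, not guess):

def bound_by(cpu_ms, gpu_ms):
    # Whichever side took longer to produce the frame is the limiter.
    return "CPU" if cpu_ms >= gpu_ms else "GPU"

# (cpu_ms, gpu_ms) per frame - made-up numbers.
frames = [(14.0, 9.5), (8.2, 16.7), (22.1, 12.3)]
for cpu_ms, gpu_ms in frames:
    fps = 1000.0 / max(cpu_ms, gpu_ms)
    print(f"{fps:5.1f} fps, {bound_by(cpu_ms, gpu_ms)}-bound")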
 

Shion

Member
Most aren't saying it'll compete with Durango/Orbis, just clarifying what the clockspeed does and doesn't mean.
Clock speed isn't the only thing that's disappointing here, though.
The CPU in Wii U is ancient and underpowered regardless of its clock speed.
 

Coolwhip

Banned
I must say, even as someone who mostly sticks to Nintendo systems, it's pretty funny reading people here trying to justify this.

Uninformed posts like yours are even funnier, though. The only good source we have claims the Wii U CPU is about the same in performance as Xenon. Something we already knew. Ha ha?

Weird. How much more would it really have added to the final price if they had bumped up the CPU a bit to at least current-gen levels? This company chooses profits over performance once again.

This thread is hopeless. Every page new people barge in and base their judgement on their own inability to understand technology. At least read the thread.
 

Cousteau

Member
Weird. How much more would it really have added to the final price if they had bumped up the CPU a bit to at least current-gen levels? This company chooses profits over performance once again.
 
It won't run two at once without major changes.
Even if it could, Nintendo is not in a position to sell the pad at an affordable rate.

Eh, I think the dual-gamepad feature will be used mainly for party/casual games and other titles that don't push the CPU. Probably not so much FPS titles.
 
The worst part for me is the inability to have multiple gamepads. When I first heard about the controller I had so many exciting ideas, but so many of them involved having multiple gamepads.

The graphics I can live with, as I think Nintendo are going to produce some amazing-looking games, and that's mostly what I'm in it for.
 

DieH@rd

Banned
Wikipedia lists the Wii's Broadway as a 2.9 GFLOPS part. So... less than double the CPU clock, larger caches, and small optimizations... the Wii U tri-core CPU is somewhere in the ~20 GFLOPS range.

Not good... Not good.

Cell was 25.6 GFLOPS at 3.2 GHz with its main core alone [info from wiki], and around 100 GFLOPS when the power of the monstrous SPE satellite processors is added.

edit - newer info by Elios83
Xenon is 115 gigaflops (12 single-precision flops per cycle per core at 3.2GHz).
Cell with the SPEs is 218 gigaflops (12 flops per cycle for the PPE at 3.2GHz + 8 flops per cycle for each of the 7 SPEs at 3.2GHz).
Wii U CPU is... just LOL.
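Those figures all come from the same back-of-the-envelope formula: peak = clock × cores × FLOPs per cycle per core. A small sketch that reproduces the arithmetic; the Espresso line assumes Broadway-style paired singles (4 FLOPs/cycle, which is what the 2.9 GFLOPS Broadway figure implies) and is a guess, not a confirmed spec:

def peak_gflops(clock_ghz, cores, flops_per_cycle_per_core):
    # Theoretical single-precision peak, not sustained real-world throughput.
    return clock_ghz * cores * flops_per_cycle_per_core

print("Xenon:      ", peak_gflops(3.2, 3, 12), "GFLOPS")   # ~115
print("Cell PPE:   ", peak_gflops(3.2, 1, 12), "GFLOPS")   # ~38
print("Cell SPEs:  ", peak_gflops(3.2, 7, 8),  "GFLOPS")   # ~179
print("Espresso(?):", peak_gflops(1.24, 3, 4), "GFLOPS")   # ~15, assumes 4 FLOPs/cycle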
 

BillyBats

Banned
I'm trying to bring balance to the Force and remind people that only games matter?
Who is going to think about clock speed while playing Wonderful 101 or Bayonetta 2?
You should be interested in the Wii U, or not, based only on the upcoming games. The rest is just bullshit.
Third parties have been irrelevant on Nintendo consoles since the N64.

Then why ever come out with a new console? Nintendo could have saved millions by still releasing NES games because, you know, only games matter.
 

Stewox

Banned
What if that CPU speed is just the low-activity one? They could be running at that in the OS, but ramp it up to max in games.

Also, people are making judgements from this info too quickly; this is just the start of the dig.
 

defferoo

Member
Why would you stick with a 10+ year old architecture just to maintain backwards compatibility, and then do such a crappy job of integrating compatibility with the old system? If they were so focused on providing backwards compatibility, they could have at least figured out how to give it to us without making us go into "Wii mode".

I love Nintendo's games, but the decisions they make are just stupid. The GC CPU architecture is dead, ancient. It was time to move on 6 years ago (Apple did)... Seriously, they didn't consider any of IBM's other CPUs? Instead they put tons of R&D into getting a marginal improvement on a CPU originally designed over 10 years ago?
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
Found the source for what I was talking about: http://gbatemp.net/threads/retroarch-a-new-multi-system-emulator.333126/page-7


Can't say if it's true or not; I've mostly done high-level gameplay code on PS3 & 360 and only a tiny bit of homebrew coding on the Wii. If it is, the Wii U's clock should put it a notch above Xenon's cores. However, there's still only one thread per core.

The bolded is what will damn the Wii U as far as Orbis/Durango ports go, IMO, if it's confirmed - not the clock speed.

Single-threaded, triple-core vs. 4-8 core multithreaded beasts? Shit, it's gonna be a bloodbath.
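As a crude way of putting numbers on that bloodbath: total CPU throughput scales roughly with cores × hardware threads × per-thread speed. The values below are placeholders (Orbis/Durango specs aren't public, and the SMT gain is an assumption), just to show how quickly the gap opens up:

def relative_throughput(cores, threads_per_core, per_thread_speed, smt_gain=0.3):
    # Extra hardware threads on a core are assumed to add ~30% each, not 100%.
    return cores * per_thread_speed * (1 + smt_gain * (threads_per_core - 1))

espresso_ish = relative_throughput(cores=3, threads_per_core=1, per_thread_speed=1.0)
xenon_ish    = relative_throughput(cores=3, threads_per_core=2, per_thread_speed=1.0)
nextgen_ish  = relative_throughput(cores=8, threads_per_core=1, per_thread_speed=1.2)

print(f"3 cores, 1 thread each:  {espresso_ish:.1f}")
print(f"3 cores, 2 threads each: {xenon_ish:.1f}")
print(f"8 cores (guess):         {nextgen_ish:.1f}")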
 