
Wii U clock speeds found by marcan

stuminus3

Member
I wish I could be arsed to check the top 10 sellers and their resolutions because I reckon you're probably wrong.

Having said that, you could replace 'HD' with 'flashy' in the original post and it'd still make sense.
4 or 5 of the top-selling titles are all Call of Duty (couple of Halos in there too), well-known sub-HD games. Post-MW CoD, too. I think it would be a hell of a stretch to say the massive success of CoD from MW onwards has been because it's 'flashy'...

There are a number of things Microsoft nailed with the 360 that they'd have been dead in the water without. HD graphics alone wouldn't have done much for them for long.

I say the same thing as a PC gamer (who has poured far more money into PC hardware than any console could match) - if the only reason you're doing it is for the shiny graphics, you're doing it wrong. It's not 2004 anymore. Actually... you're big on iOS gaming, Dave - you know how this works. There's probably no better example.
 

Wiz

Member
I'd have a hard time imagining any professional programmer wouldn't think to gobble up a few extra GPU cycles to pad CPU inefficiencies, even on a quick port.

Fact is, there is only so much you can shove off to the GPU. CPU-intensive processes can't universally be handed off to the GPU in a 1:1 ratio. The failings of some ports, most notably Batman: Arkham City, are likely due to how CPU-dependent the game engine is.

What this all really boils down to: unless 3rd parties go above and beyond even a 360>PS3 port for Wii U releases, multi-plat titles started on 360 will suffer on Wii U at least as badly as they do on PS3. Generally won't be deal breakers, but you aren't getting the latest and greatest version of ports with the Wii U.

It also clearly outlines that the delusion that PS4/Xbox 3 games can and will be ported to the Wii U should just stop. It isn't going to happen. It will be very similar to the Wii vs. PS3/360/PC "ports": watered-down, different-engine releases.

The real selling point of the Wii U - Nintendo first party releases and select 3rd party exclusives - will continue to be the same elite software offerings we've come to expect. I'd bet that Bayo2 will blow anything from the PS3/360 generation out of the water. Devs who commit to building their game around the Wii U's architecture will have a very strong environment to work in relative to the PS3/360 era.

But then this isn't really news, just confirmation of what we could see developing for the vast majority of the Wii U's pre-release period. It is another iteration in the Nintendo walled garden hardware series, and it will likely work out quite well for Nintendo specifically. It is a hardcore Nintendo fan's first purchase, the 'family friendly' alternative for casuals, and the first "2nd system" most gamers will be tempted to buy. That strategy did just fine for the Wii. Hardware limitations won't hamper the Wii U's marketability. The appeal of the tablet controller, the higher MSRP than the Wii, and what the early software lineup offers will determine its future.

I agree with you on 3rd parties. The only way I see it - Nintendo sponsored/published 3rd party games will usually be the best non-1st party offerings on the platform. We've heard of every other port having problems, and if devs don't either a) put more resources into making a better port or b) create an engine from the ground up just for Wii U, then it's hard to say the console will gain much from the upcoming generation of hardware in terms of non-Nintendo-published 3rd parties.

Bottom line is that if you're going to develop for the Wii U hardware, an easy port won't be enough. And I don't know, at this time, a lot of devs that are willing to stick their necks out to do more than that.
 

Stewox

Banned
The worst part for me is the inability to have multiple GamePads. When I first heard about the controller I had so many exciting ideas, but so many of them involved having multiple GamePads.

The graphics I can live with, as I think Nintendo are going to produce some amazing looking games and that's mostly what I'm in it for.

It's not about the CPU or the GPU, it's about wireless bandwidth. They would need to put in more antennas and additional hardware to increase bandwidth, which would drive costs up. Basically, what they did for dual GamePads was to chop each one's framerate in half, so the total bandwidth stays the same. And bandwidth really matters if you don't want to be bottlenecked in an operation where latency is so important.
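
A quick back-of-the-envelope sketch of that trade-off. The GamePad stream's real bitrate and codec aren't public, so the 20 Mbps figure here is purely an illustrative assumption; the point is only that halving each pad's framerate keeps the total radio budget flat:

```python
# Hypothetical numbers: assume one GamePad stream at 60fps costs ~20 Mbps.
SINGLE_PAD_MBPS = 20.0
SINGLE_PAD_FPS = 60

def total_bandwidth_mbps(num_pads, fps_each):
    # For a fixed per-frame quality, bitrate scales roughly linearly
    # with framerate, so half the fps is roughly half the bitrate.
    per_pad = SINGLE_PAD_MBPS * (fps_each / SINGLE_PAD_FPS)
    return num_pads * per_pad

print(total_bandwidth_mbps(1, 60))  # 20.0 -> one pad at 60fps
print(total_bandwidth_mbps(2, 30))  # 20.0 -> two pads at 30fps each
```

Same total over the air either way, which is consistent with halving the framerate instead of adding radio hardware.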
 
On the topic of the specs themselves: basically, this provides further evidence that you shouldn't buy a Wii U for Western multiplatform titles. Which we've known for a while now.

To be clear, I'm not saying that a slow CPU doesn't matter otherwise. I'm saying that there's not much else we can extrapolate from this without knowing more about what's coming in 2013.
 

Fox Mulder

Member
Because the Wii's massive success convinced them that Average Joe Consumer really doesn't care that much about graphics/RAM/clock speed.

At least it's HD now.

I'm interested to see how rising development costs for next gen impact most devs. Not everyone can afford to dump even more money into even better looking games on more powerful hardware.

The Wii U may be weaker than anyone wanted, but maybe it doesn't have to be more than it is. Graphics in current gen games can be good enough for years still.
 
It's not about the CPU or the GPU, it's about wireless bandwidth. They would need to put in more antennas and additional hardware to increase bandwidth, which would drive costs up. Basically, what they did for dual GamePads was to chop each one's framerate in half, so the total bandwidth stays the same. And bandwidth really matters if you don't want to be bottlenecked in an operation where latency is so important.

I don't know the reasons why it doesn't work, I'm just saying it's disappointing.
 
Beyond woeful. I can't even begin to fathom how it's possible to create a console with such vastly inferior hardware to one from 2005.

Backward compatibility can't be the reason. Why sacrifice future games just to play old ones?
 
I (and many others) know this. You can't compare the 3.2GHz PPC tri-core Xenon to the Wii U CPU!

HOWEVER, the Wii U cores are still based on PPC 750/Broadway cores due to backwards compatibility. The 750 arch came out in 1997, Broadway in 2005?
Can the Wii U CPU be seen as a whole new architecture? How many improvements have been made?

CPU-heavy stuff like AC3 already has trouble running...
Early port, yes, but I don't think it's a case of early PS3 ports where devs would shove it all into the PPE. I think they are using all 3 cores, as there can't be that much of a difference between those and the Wii CPU, or even the 360 CPU.

1.24GHz is still a very low clock; architecture improvements can do a lot, sure, I agree. And if it was a whole new PPC CPU arch, like the 9xx, I would not be worried. But they have to ensure backwards compat...
Maybe they took a lot of improvements from a newer PPC series, but at its base it's still 3 Broadway cores at a low clock.

GPU and RAM size seem good though; hope they have a good GPU compute toolchain so devs can use GPGPU for CPU calcs.

The CPU has been modified, so you can't do a direct comparison to a PPC 750. You don't know what added tech is in there.

I own AC3 and it runs the game pretty damn well. It looks like the small things that are missing are due to deadlines rather than hardware.

I agree that in today's tech world 1.25GHz is kinda low. Perhaps there's a turbo mode in there being held off till the PS4/720 are released.

GPU and RAM seem very nice. In the end, I'm enjoying my Wii U a lot and I'm looking forward to what's gonna happen in the next year.
 
I don't know what any of those specs mean. I know CPU is the main processor and GPU is the graphics chip, and memory is memory, but how does that translate to a game? What plays what part?

Also how do all those categories compare to the PS3/360? Maybe I just need a nice bar graph, lol.
 
Schattenjäger said:
What's so hard to grasp?
When the Wii came out, they said it would be marginally better than the Xbox and it still sold fine

In turn,

the Wii U is marginally better than PS360, maybe even a smaller jump than the Wii was over the Xbox, or maybe even the same as the PS360 - so what
We get Nintendo in 1080p - that's enough to make people buy it

Nintendo hasn't been about processing power/graphics for a long time now

Really? What Nintendo games are in 1080p for the Wii U?
 

Pie and Beans

Look for me on the local news, I'll be the guy arrested for trying to burn down a Nintendo exec's house.
At least it's HD now.

I'm interested to see how rising development costs for next gen impact most devs. Not everyone can afford to dump even more money into even better looking games on more powerful hardware.

The Wii U may be weaker than anyone wanted, but maybe it doesn't have to be more than it is. Graphics in current gen games can be good enough for years still.

God, I'm sick of this fear-mongering. 60fps games and higher-poly models (how they're initially created and then pared down) don't magically cost 2x budgets, nor does higher IQ and the like, or rendering natively at 1080p.

Insanely larger game worlds, complex AI, a more robust MP alongside the campaign and so forth do require more manpower and thus cost, but it's all about clever management. Having more power to work with just doesn't suddenly cause a parallel cost jump in the way each and every studio does things. Is Platinum's budget for making 101 or Bayonetta 2 1.25x that of their X360 or PS3 budgets because the Wii U might have a little more wiggle room? No.
 
Pretty much.
Tbf you can see why. Not like the Wii was that hard to develop for at the start of its gen.
No one bothered.

Iwata changed the release setup to take third party support into account; I think they were surprised, tbh.

I suppose the hope can be that over the next year or two there's some movement to develop with the Wii U in mind to some degree. People in the industry learning to use it, etc.


Will this happen? Meh.
Third party support will be lower than if it were a more conventional architecture, but then first party content might not push the system as much.

We'll see how the gen goes. How quickly the PS4/720 sell could have a big effect (and yes, I'd expect their sales to take off faster considering the long generation).

I think Nintendo can still be successful with a system that is "on par" with PS360.

Rely heavily on first/second party games.
Publish key core titles when possible. (Example: Bayonetta 2)
Be as indie friendly as possible.
Occasional 3rd party exclusives.
PS360 should receive support even a few years into Sony/MS next gen systems. If so, Wii U should be able to get ports of those titles.

There are a lot of ifs in the above, but they can be successful with a system that is "on par". Like you say, we'll just have to see how this gen plays out.

Edit: I should add, I wouldn't expect ports of PS4/720. While it may happen, expecting it seems like a set up for disappointment. If you plan to buy a single console/PC and want to play AAA multi-plat titles, Wii U is likely not the best option. Still, I've enjoyed Wii U so far and will continue to enjoy it even after I purchase my Sony or MS next gen system.
 

SmokyDave

Member
4 or 5 of the top-selling titles are all Call of Duty (couple of Halos in there too), well-known sub-HD games. Post-MW CoD, too. I think it would be a hell of a stretch to say the massive success of CoD from MW onwards has been because it's 'flashy'...

There are a number of things Microsoft nailed with the 360 that they'd have been dead in the water without. HD graphics alone wouldn't have done much for them for long.
Oh, no doubt. I wasn't really weighing in on the overall importance to the mass-market, more the perception that many of the 360's best / biggest sellers weren't HD.

I say the same thing as a PC gamer (who has poured far more money into PC hardware than any console could match) - if the only reason you're doing it is for the shiny graphics, you're doing it wrong. It's not 2004 anymore. Actually... you're big on iOS gaming, Dave - you know how this works. There's probably no better example.
I dunno man. I love my Vita because it has great graphics for a handheld. I started playing phone games once phones were powerful enough to handle 'big' games with (relatively) great graphics. I love my PC because it can produce, you guessed it, great graphics. I love the PS3 & 360 because they were the first consoles capable of 'HD' graphics. I'm a big graphics / tech whore, basically.
 

Ashes

Banned
At least it's HD now.

I'm interested to see how rising development costs for next gen impact most devs. Not everyone can afford to dump even more money into even better looking games on more powerful hardware.

The Wii U may be weaker than anyone wanted, but maybe it doesn't have to be more than it is. Graphics in current gen games can be good enough for years still.

But will it remain so at new gen prices? Some people are saying that Nintendo is not making a profit on this console. A three hundred quid console with a game. :/

I was in line to get this, if not this year then next, but I may actually skip it... Sorry, Nintendo.
 

Erasus

Member
I'm interested to see how rising development costs for next gen impact most devs.

Textures and models are already made in insanely high res, then scaled down to fit PC/console hardware.

Even running stock Uncharted 3 at 1080p brings out amazing detail in textures that I, at least, didn't notice in 720p.

I can't find those shots posted here... Someone dig 'em up
 
Probably to avoid overheating, I'd guess.



I think it is, yes. I guess that kinda contributes to the half-baked launch ports, as the PS3 and 360 (at least the PS3) are very CPU-centric.

Interesting.

Hopefully devs know how to work around this better. I guess it won't be a Wii situation, which would have required them to either start from scratch or do nothing, but they should be encouraged to have a plan to work around this for a Wii U version.

That is, should we get one in the first place. :(

Hopefully more exclusives will come; I wonder how Bayonetta 2 will look.
 

TheExodu5

Banned
Why are people comparing clock speeds without having any underlying knowledge of the architecture? Clock speed doesn't mean much unless we're given a reference performance-per-clock metric from a similar chip on the market.

e.g. a 1.8GHz Core 2 Duo is a lot faster than a dual-core 3.4GHz Pentium 4.

I'm not suggesting this is the case with the Wii U, but making judgements based on clock speed alone makes no sense.
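
A toy model of that point: effective throughput is roughly IPC × clock × cores, not clock alone. The IPC values below are invented purely for illustration, not measured figures for either chip:

```python
# Effective throughput ~ instructions-per-clock x clock (GHz) x cores.
# IPC values here are illustrative assumptions, not benchmarks.
def throughput(ipc, ghz, cores):
    return ipc * ghz * cores

core2_duo = throughput(ipc=2.0, ghz=1.8, cores=2)   # assumed high IPC
pentium4  = throughput(ipc=0.8, ghz=3.4, cores=2)   # assumed low IPC

print(core2_duo, pentium4)  # 7.2 vs 5.44: the lower-clocked chip wins
```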
 

Drek

Member
It's a modified PPC 750 and it's different from the Xenon, so the clock numbers don't have a 1:1 ratio in terms of power.

So?

Devs working on the hardware have said the CPU is comparatively weak. We now see an incredibly weak clock speed. It's confirmation that the complaints had merit and the CPU truly is deficient with respect to the PS3/360 systems.

It doesn't mean the system is even truly bottlenecked for someone developing exclusively on the Wii U. It has enough CPU power to meet the minimum requirements for believable physics and A.I. for most consumers. It isn't likely to run a bunch of 64 player online shooters or have the latest and greatest physics model rolled out in full glory on it, but that stuff is only secondarily related to gameplay.

It is a system built first and foremost for Nintendo's 1st party library, where the major benefits of a beefy multi-core CPU aren't going to be fully realized. Nintendo's games play in their own special world in terms of physics. Multiplayer is not a top priority for them as a company. Intricate A.I. logic trees take a back seat to what is actually fun to play. Nintendo's games will play great, as always, and will now look even better. That is the point of the Wii U.
 

dwu8991

Banned
I don't know what any of those specs mean. I know CPU is the main processor and GPU is the graphics chip, and memory is memory, but how does that translate to a game? What plays what part?

Also how do all those categories compare to the PS3/360? Maybe I just need a nice bar graph, lol.

Games have to be built from the ground up to look good; otherwise ports will look bad in comparison to other consoles.
 

Jacobi

Banned
Textures and models are already made in insanely high res, then scaled down to fit PC/console hardware.

Even running stock Uncharted 3 at 1080p brings out amazing detail in textures that I, at least, didn't notice in 720p.

I can't find those shots posted here... Someone dig 'em up

Working in video, I can say 720p just isn't that huge of a resolution anymore... 1080p is where it's at, and the difference definitely is visible.
 

wsippel

Banned
Isn't the point that it's based off of the PPC 750 architecture? That doesn't mean it has to be identical to past releases.
No developer can tell you what it's based on. All they see is that it behaves similarly and supports the same unique features. It has to, to be compatible. But that doesn't tell us much about the chip itself.
 

Kenka

Member
Why are people comparing clock speeds without having any underlying knowledge of the architecture? Clock speed doesn't mean much unless we're given a reference performance-per-clock metric from a similar chip on the market.

Bruce Lee's death: 1973
Wii U's CPU launch: 1999
Wii U launch: 2012



It's old. It's old. It's old.
 
Textures and models are already made in insanely high res, then scaled down to fit PC/console hardware.

Even running stock Uncharted 3 at 1080p brings out amazing detail in textures that I, at least, didn't notice in 720p.

I can't find those shots posted here... Someone dig 'em up

Textures and models aren't 90% of the work in making games.
 
I still expect that Durango/PS4 ports will be more common than PS3/360 ports were on Wii, since not all games will be equally CPU-intensive and Wii U at least has a modern GPU architecture.

"More common" isn't a very high bar to clear, though.

(and before it comes up: I'm not talking about spinoffs like Dead Rising: CTYD. Not a port in any meaningful sense of the word, more a demake.)
 

BGBW

Maturity, bitches.
We know. Jesus. We know GHz isn't everything, but this is horribly slow and the CPU family is ancient.

I just see a lot of people comparing two numbers and saying the larger one is better, despite that video saying that's not always the case. Surely if we all know that's the case, then why are people still just comparing the clock speeds?
 

acm2000

Member
So, all in all, the Wii U CPU should be around on par with the 360 CPU for general purpose use (due to out-of-order execution and other benefits), but it will fall behind regarding physics/AI? That seems to be a summary of everything said. The GPU is obviously better than the 360's, but will they have to offload physics/AI to it to take strain off the CPU, thus negating a lot of the GPU's advantage?
 

mokeyjoe

Member
I don't see why being based on 1997 tech is a bad thing. I mean, aren't modern PC processors based off the 1995 Pentium Pro architecture (P6)?

Given a slightly slower CPU, a slightly better GPU and much more RAM I think it's pretty safe to say that the Wii U, by and large, is about equal to PS360. Which seems to be pretty much what people thought in the first place. I certainly expect exclusive stuff like Bayonetta 2 to look as good as it would have done on the 360.

There seems to be a lot of FUD in this thread. People (as always) lurking about waiting for a moment to claim Nintendo is doomed and duct-taping GameCubes together, etc. I've seen the same shit over and over since the Mega Drive vs SNES days. It's like every time Nintendo release something, everyone on the internet develops a bad case of collective amnesia.

Nintendo have had crap 3rd party support since the mid 90s (and treated them like dirt prior to that), multiplatform games are a distant 2nd or 3rd priority as far as they're concerned - always has been, always will be. And yet they still produce the most profitable and critically acclaimed products in the industry. Why on Earth would anyone expect this system to be any different? It's a Nintendo system for Nintendo games, ports are fillers for the launch line up - it's the exclusives which will define the system. Anyone with any experience of a Nintendo system knows what they're about.

Wake up.
 
Why are people comparing clock speeds without having any underlying knowledge of the architecture? Clock speed doesn't mean much unless we're given a reference performance-per-clock metric from a similar chip on the market.

e.g. a 1.8GHz Core 2 Duo is a lot faster than a dual-core 3.4GHz Pentium 4.

I'm not suggesting this is the case with the Wii U, but making judgements based on clock speed alone makes no sense.

The Core architecture was completely different from NetBurst, in almost every way. What's in the OP says the Wii U's CPU is based on what the GC and Wii had, so that rules out the same kind of drastic architectural overhaul (imho).
 
So, all in all, the Wii U CPU should be around on par with the 360 CPU for general purpose use (due to out-of-order execution and other benefits), but it will fall behind regarding physics/AI? That seems to be a summary of everything said. The GPU is obviously better than the 360's, but will they have to offload physics/AI to it to take strain off the CPU, thus negating a lot of the GPU's advantage?

Depending on how the developers take advantage (or not) of this, that is one likely scenario.
 

Elios83

Member
Wikipedia lists the Wii's Broadway as a 2.9 GFLOPS part. So... a bit less than double the CPU clock, larger caches, and little optimizations... the Wii U tri-core CPU is somewhere in the ~20 GFLOPS range.

Not good... Not good.

CELL was 25.6 GFLOPS at 3.2 GHz with its main core alone [info from wiki], and around 100 GFLOPS when the power of the monstrous SPE satellite processors is added.

Xenon is 115 GFLOPS (12 flops single precision per cycle per core at 3.2GHz).
Cell with the SPEs is 218 GFLOPS (12 flops for the PPE at 3.2GHz + 8 flops each for the 7 SPEs at 3.2GHz).
Wii U CPU is... just LOL.
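
For anyone who wants to check the arithmetic: theoretical peak is just flops-per-cycle × cores × clock. The Xenon and Cell lines below reproduce the figures above; the Wii U line simply scales Broadway's reported 2.9 GFLOPS by clock ratio and core count, which is an assumption about the architecture, not a confirmed number (it lands nearer ~15 than ~20, so the estimate above presumably credits some per-clock gains):

```python
def peak_gflops(flops_per_cycle, cores, ghz):
    # Theoretical single-precision peak, nothing more.
    return flops_per_cycle * cores * ghz

xenon = peak_gflops(12, 3, 3.2)                            # 115.2
cell  = peak_gflops(12, 1, 3.2) + peak_gflops(8, 7, 3.2)   # 38.4 + 179.2 = 217.6

# Assumed: Wii U as three Broadway-class cores scaled by clock alone.
wii_u = 2.9 * (1.243 / 0.729) * 3                          # ~14.8

print(xenon, cell, round(wii_u, 1))
```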
 
I just see a lot of people comparing two numbers and saying the larger one is better, despite that video saying that's not always the case. Surely if we all know that's the case, then why are people still just comparing the clock speeds?

To be fair, I think most people are taking other things into account along with the clock speeds. We've had multiple rumors. In my experience, when the same, or similar, rumor continues to pop up from different sources, it has some sort of basis in reality. We also know the CPU is on the smallish side, the console overall is small, it doesn't use a lot of power, etc.
 

wsippel

Banned
The Core architecture was completely different from NetBurst, in almost every way. What's in the OP says the Wii U's CPU is based on what the GC and Wii had, so that rules out the same kind of drastic architectural overhaul (imho).
No, he actually doesn't say that. He can't tell.
 

stuminus3

Member
I dunno man. I love my Vita because it has great graphics for a handheld. I started playing phone games once phones were powerful enough to handle 'big' games with (relatively) great graphics. I love my PC because it can produce, you guessed it, great graphics. I love the PS3 & 360 because they were the first consoles capable of 'HD' graphics. I'm a big graphics / tech whore, basically.
Oh, don't get me wrong - me too. The more glorious the graphics the better IMO. I'm just making the point that in today's market, the old "a generation is defined by graphics" thing hasn't really been true for a long time now. There's just so much variety in what's out there now.

Though I'm just remembering I'm discussing this in a topic about Wii U specs which is probably a silly thing to do.
 

JordanN

Banned
Shin'en strikes again.

"The CPU and GPU are a good match. As said before, today’s hardware has bottlenecks with memory throughput when you don’t care about your coding style and data layout. This is true for any hardware and can’t be only cured by throwing more megahertz and cores on it. Fortunately Nintendo made very wise choices for cache layout, ram latency and ram size to work against these pitfalls. Also Nintendo took care that other components like the Wii U GamePad screen streaming, or the built-in camera don’t put a burden on the CPU or GPU."

You hear that? Stop focusing on MHz. Focus on the architecture.
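
What Shin'en is describing is the classic data-layout argument: keep the fields you actually stream over contiguous, so you aren't dragging dead bytes through the cache. A minimal sketch of the idea (field names and sizes are just for illustration):

```python
import numpy as np

N = 100_000

# Array-of-structures: each entity's fields are interleaved in memory,
# so a pass that only reads positions still pulls velocities through
# the cache, line by line.
aos = np.zeros(N, dtype=[("pos", np.float32, 3), ("vel", np.float32, 3)])

# Structure-of-arrays: each field is contiguous, so a position-only
# pass is a clean sequential stream with no wasted cache traffic.
pos = np.zeros((N, 3), dtype=np.float32)
vel = np.zeros((N, 3), dtype=np.float32)

# Same logical query either way; the SoA read touches half the memory.
bbox_aos = aos["pos"].max(axis=0)
bbox_soa = pos.max(axis=0)
```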
 

defferoo

Member
No developer can tell you what it's based on. All they see is that it behaves similarly and supports the same unique features. It has to, to be compatible. But that doesn't tell us much about the chip itself.

True, but there have been reports that it's terribly slow, which makes people jump to the worst possible conclusion: that it's a three-way SMP-enabled, higher-clocked Broadway at a 45nm process. Considering the jump from Gekko to Broadway was basically a 50% clock boost with a die shrink... it's not hard to imagine this being the case.
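
The clock numbers do line up with that guess, for what it's worth:

```python
# Known clocks; the inference drawn from them is speculation, not fact.
gekko    = 486    # MHz, GameCube
broadway = 729    # MHz, Wii -- exactly a 1.5x bump over Gekko
wii_u    = 1243   # MHz, per marcan's figures (1.24GHz)

print(broadway / gekko)   # 1.5
print(wii_u / broadway)   # ~1.71 -- a similar-scale bump, which is why
                          # "three clocked-up Broadways" is easy to imagine
```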
 

Skiesofwonder

Walruses, camels, bears, rabbits, tigers and badgers.
Weren't all the Wii U speculation prophets screaming that the CPU was very similar to the 360/PS3's? Doesn't sound like it now.
 

Drek

Member
Found the source for what I was talking about: http://gbatemp.net/threads/retroarch-a-new-multi-system-emulator.333126/page-7


Can't say if it's true or not; I've mostly done high-level gameplay code on PS3 & 360 and only a tiny bit of homebrew coding on the Wii. If it is, the Wii U's clock should put it a notch above Xenon's cores. However, there's still only one thread per core.

It's semi-true. The 360 and PS3 do rely very heavily on multi-threading to be truly effective and if you program for just a single core you will see weaker results.

But NO ONE in the games industry has done that for years. The industry has spent over half a decade now forcing everyone onto multi-core architectures. Sony has been pushing for that for over a decade, in fact. Video games have more than enough simultaneous discrete calculations happening behind the scenes to be ideal software for multi-core development.

So even if his claim was 100% factual it is still irrelevant when it comes to the vast majority of games industry applications. He's talking about coding an emulator. So great, the Wii U should run MAME like a sumbitch once it's cracked. But it won't suddenly run big 3rd party releases exceptionally well.
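
The pattern being described is plain task parallelism: a frame has plenty of independent subsystems to fan out across cores. A minimal sketch (the subsystem functions are hypothetical stand-ins; a real engine would use a native job scheduler, not Python):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-frame subsystems, independent of one another.
def update_physics(dt): pass
def update_ai(dt): pass
def update_audio(dt): pass
def update_particles(dt): pass

def run_frame(pool, dt=1 / 60):
    # Fan the independent work out across cores, then join before
    # handing the frame over to the renderer.
    jobs = [pool.submit(fn, dt) for fn in
            (update_physics, update_ai, update_audio, update_particles)]
    for job in jobs:
        job.result()

with ThreadPoolExecutor(max_workers=4) as pool:
    run_frame(pool)
```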
 
Shin'en strikes again.

"The CPU and GPU are a good match. As said before, today’s hardware has bottlenecks with memory throughput when you don’t care about your coding style and data layout. This is true for any hardware and can’t be only cured by throwing more megahertz and cores on it. Fortunately Nintendo made very wise choices for cache layout, ram latency and ram size to work against these pitfalls. Also Nintendo took care that other components like the Wii U GamePad screen streaming, or the built-in camera don’t put a burden on the CPU or GPU."

You hear that? Stop focusing on MHz. Focus on the architecture.

Again with this quote?

Can I post the "horrible and slow" quote again, then?
 

Thraktor

Member
Kenka mentioned me a few pages back, so I might as well give my two cents.

First, it's worth keeping in mind that the general expectation until very recently was a CPU around 2GHz (many estimates around the 1.8GHz mark) and a GPU at 500MHz or under (my guess was 480MHz).

The main take-home from the real clock speeds (higher clocked GPU than expected, lower clocked CPU than expected) is that the console is even more GPU-centric than expected. And, from the sheer die size difference between the CPU and GPU, we already knew it was going to be seriously GPU centric.

Basically, Nintendo's philosophy with the Wii U hardware is to have all Gflop-limited code (i.e. code which consists largely of raw computational grunt work, like physics) offloaded to the GPU, and keep the CPU dedicated to latency-limited code like AI. The reason for this is simply that GPUs offer much better Gflop per watt and Gflop per mm² characteristics, and when you've got a finite budget and thermal envelope, these things are important (even to MS and Sony, although their budgets and thermal envelopes may be much higher). With out-of-order execution, a short pipeline and a large cache the CPU should be well-suited to handling latency-limited code, and I wouldn't be surprised if it could actually handle pathfinding routines significantly better than Xenon or Cell (even with the much lower clock speed). Of course, if you were to try to run physics code on Wii U's CPU it would likely get trounced, but that's not how the console's designed to operate.

The thing is that, by all indications, MS and Sony's next consoles will operate on the same principle. The same factors of GPUs being better than CPUs at many tasks these days applies to them, and it looks like they'll combine Jaguar CPUs (which would be very similar to Wii U's CPU in performance, although clocked higher) with big beefy GPUs (obviously much more powerful than Wii U's).
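
To make the Gflop-limited vs latency-limited distinction concrete, here's a toy contrast, with numpy standing in for wide GPU-style execution (everything below is illustrative, not anyone's actual engine code):

```python
import heapq
import numpy as np

# Gflop-limited work: one identical, branch-free update applied across
# thousands of independent elements -- the shape of code a GPU eats up.
pos = np.random.rand(100_000, 3).astype(np.float32)
vel = np.random.rand(100_000, 3).astype(np.float32)
pos += vel * np.float32(1 / 60)   # one wide operation, no branches

# Latency-limited work: every step depends on the previous one and
# branches unpredictably. Raw Gflops don't help here; short pipelines,
# out-of-order execution and a big cache do.
def dijkstra(adj, start):
    dist = {start: 0}
    frontier = [(0, start)]
    while frontier:
        d, node = heapq.heappop(frontier)
        if d > dist.get(node, float("inf")):
            continue
        for nbr, w in adj.get(node, []):
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(frontier, (nd, nbr))
    return dist

print(dijkstra({0: [(1, 4), (2, 1)], 2: [(1, 1)]}, 0))  # {0: 0, 2: 1, 1: 2}
```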
 